All 58 contributions to the Online Safety Act 2023

Tue 19th Apr 2022: Online Safety Bill, Commons Chamber; 2nd reading
Tue 24th May 2022
Tue 24th May 2022
Thu 26th May 2022: Online Safety Bill (Third sitting), Public Bill Committees; Committee stage: 3rd sitting
Thu 26th May 2022: Online Safety Bill (Fourth sitting), Public Bill Committees; Committee stage: 4th sitting
Tue 7th Jun 2022
Tue 7th Jun 2022
Thu 9th Jun 2022
Thu 9th Jun 2022
Tue 14th Jun 2022
Tue 14th Jun 2022
Thu 16th Jun 2022
Thu 16th Jun 2022
Tue 21st Jun 2022: Online Safety Bill (Thirteenth sitting), Public Bill Committees; Committee stage: 13th sitting
Tue 21st Jun 2022
Thu 23rd Jun 2022
Tue 28th Jun 2022
Tue 28th Jun 2022
Tue 12th Jul 2022: Online Safety Bill, Commons Chamber; Report stage (day 1)
Mon 5th Dec 2022
Mon 5th Dec 2022
Tue 13th Dec 2022: Online Safety Bill (First sitting), Public Bill Committees; Committee stage (re-committed clauses and schedules): 1st sitting
Tue 13th Dec 2022: Online Safety Bill (Second sitting), Public Bill Committees; Committee stage (re-committed clauses and schedules): 2nd sitting
Thu 15th Dec 2022: Online Safety Bill (Third sitting), Public Bill Committees; Committee stage (re-committed clauses and schedules): 3rd sitting
Tue 17th Jan 2023
Wed 18th Jan 2023
Wed 1st Feb 2023
Wed 19th Apr 2023: Online Safety Bill, Lords Chamber; Committee stage
Tue 25th Apr 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 1
Tue 25th Apr 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 2
Thu 27th Apr 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 1
Thu 27th Apr 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 2
Tue 2nd May 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 1
Tue 2nd May 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 2
Tue 9th May 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 1
Tue 9th May 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 2
Thu 11th May 2023
Tue 16th May 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 1
Tue 16th May 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 2
Tue 23rd May 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 1
Tue 23rd May 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 2
Thu 25th May 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 1
Thu 25th May 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 2
Thu 22nd Jun 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 1
Thu 22nd Jun 2023: Online Safety Bill, Lords Chamber; Committee stage: Part 2
Thu 6th Jul 2023: Online Safety Bill, Lords Chamber; Report stage: Part 1
Thu 6th Jul 2023: Online Safety Bill, Lords Chamber; Report stage: Part 2
Thu 6th Jul 2023: Online Safety Bill, Lords Chamber; Report stage: Part 3
Mon 10th Jul 2023: Online Safety Bill, Lords Chamber; Report stage: Part 1
Mon 10th Jul 2023: Online Safety Bill, Lords Chamber; Report stage: Part 2
Wed 12th Jul 2023
Mon 17th Jul 2023
Wed 19th Jul 2023
Wed 6th Sep 2023
Tue 12th Sep 2023
Tue 12th Sep 2023: Online Safety Bill, Commons Chamber; Consideration of Lords amendments
Tue 19th Sep 2023: Online Safety Bill, Lords Chamber; Consideration of Commons amendments
Thu 26th Oct 2023: Royal Assent, Lords Chamber

Online Safety Bill

2nd reading
Tuesday 19th April 2022

Commons Chamber
[Relevant Documents: Report of the Joint Committee on the Draft Online Safety Bill, Session 2021-22: Draft Online Safety Bill, HC 609, and the Government Response, CM 640; Eighth Report of the Digital, Culture, Media and Sport Committee, The Draft Online Safety Bill and the legal but harmful debate, HC 1039, and the Government response HC 1221; Second Report of the Digital, Culture, Media and Sport Committee, Session 2019-21, Misinformation in the COVID-19 Infodemic, HC 234, and the Government response, HC 894; Second Report of the Petitions Committee, Tackling Online Abuse, HC 766, and the Government response, HC 1224; Eleventh Report of the Treasury Committee, Economic Crime, HC 145; e-petition 272087, Hold online trolls accountable for their online abuse via their IP address; e-petition 332315, Ban anonymous accounts on social media; e-petition 575833, Make verified ID a requirement for opening a social media account; e-petition 582423, Repeal Section 127 of the Communications Act 2003 and expunge all convictions; e-petition 601932, Do not restrict our right to freedom of expression online.]
Second Reading
19:36
The Secretary of State for Digital, Culture, Media and Sport (Ms Nadine Dorries)

I beg to move, That the Bill be now read a Second time.

Given the time and the number of people indicating that they wish to speak, and given that we will have my speech, the shadow Minister’s speech and the two winding-up speeches, there might be 10 minutes left for people to speak. I will therefore take only a couple of interventions and speak very fast in the way I can, being northern.

Almost every aspect of our lives is now conducted via the internet, from work and shopping to keeping up with our friends, family and worldwide real-time news. Via our smartphones and tablets, we increasingly spend more of our lives online than in the real world.

In the past 20 years or so, it is fair to say that the internet has overwhelmingly been a force for good, for prosperity and for progress, but Members on both sides of the House will agree that, as technology advances at warp speed, so have the new dangers this progress presents to children and young people.

Mr Mark Francois (Rayleigh and Wickford) (Con)

My right hon. Friend will know that, last Wednesday, the man who murdered our great friend Sir David Amess was sentenced to a whole-life term. David felt very strongly that we need legislation to protect MPs, particularly female MPs, from vile misogynistic abuse. In his memory, will she assure me that her Bill will honour the spirit of that request?

Ms Dorries

Sir David was a friend to all of us, and he was very much at the forefront of my mind during the redrafting of this Bill over the last few months. I give my right hon. Friend my absolute assurance on that.

Jim Shannon (Strangford) (DUP)

A number of constituents have contacted me over the last few months about eating disorders, particularly anorexia and bulimia, and about bullying in schools. Will the Secretary of State assure me and this House that those concerns will be addressed by this Bill so that my constituents are protected?

Ms Dorries

They will. Inciting people to take their own life or encouraging eating disorders in anorexia chatrooms—all these issues are covered by the Bill.

Several hon. Members rose—

Ms Dorries

I will take one more intervention.

Jonathan Gullis (Stoke-on-Trent North) (Con)

I am grateful to my right hon. Friend, and I thank her for her written communications regarding Angela Stevens, the mother of Brett, who tragically took his own life having been coerced by some of these vile online sites. The Law Commission considered harmful online communications as part of the Bill’s preparation, and one of its recommendations is to introduce a new offence of encouraging or assisting self-harm. I strongly urge my right hon. Friend to adopt that recommendation. Can she say more on that?

Ms Dorries

Yes. Exactly those issues will be listed in secondary legislation, under “legal but harmful”. I will talk about that further in my speech, but “legal but harmful” focuses on some of the worst harmful behaviours. We are talking not about an arbitrary list, but about incitement to encourage people to take their own life and encouraging people into suicide chatrooms—behaviour that is not illegal but which is indeed harmful.

Several hon. Members rose—

Ms Dorries

I am going to whizz through my speech now in order to allow people who have stayed and want to speak to do so.

As the Minister for mental health for two years, too often, I heard stories such as the one just highlighted by my hon. Friend the Member for Stoke-on-Trent North (Jonathan Gullis). We have all sat down with constituents and listened as the worst stories any parents could recount were retold: stories of how 14-year-old girls take their own life after being directed via harmful algorithms into a suicide chatroom; and of how a child has been bombarded with pro-anorexia content, or posts encouraging self-harm or cyber-bullying.

School bullying used to stop at the school gate. Today, it accompanies a child home, on their mobile phone, and is lurking in the bedroom waiting when they switch on their computer. It is the last thing a bullied child reads at night before they sleep and the first thing they see when they wake in the morning. A bullied child is no longer bullied in the playground on school days; they are bullied 24 hours a day, seven days a week. Childhood innocence is being stolen at the click of a button. One extremely worrying figure from 2020 showed that 80% of 12 to 15-year-olds had at least one potentially harmful online experience in the previous year.

We also see this every time a footballer steps on to the pitch, only to be subjected to horrific racism online, including banana and monkey emojis. As any female MP in this House will tell you, a woman on social media—I say this from experience—faces a daily barrage of toxic abuse. It is not criticism—criticism is fair game—but horrific harassment and serious threats of violence. Trolls post that they hope we get raped or killed, urge us to put a rope around our neck, or want to watch us burn alive in a car—my own particular experience.

All this behaviour is either illegal or, almost without exception, explicitly banned in a platform’s terms and conditions. Commercially, it has to be. If a platform stated openly that it allowed such content on its sites, which advertisers, its financial lifeblood, would knowingly endorse and advertise on it? Which advertisers would do that? Who would openly use or allow their children to use sites that state that they allow illegal and harmful activity? None, I would suggest, and platforms know that. Yet we have almost come to accept this kind of toxic behaviour and abuse as part and parcel of online life. We have factored online abuse and harm into our daily way of life, but it should not and does not have to be this way.

This Government promised in their manifesto to pass legislation to tackle these problems and to make the UK the

“safest place in the world to be online”

especially for children. We promised legislation that would hold social media platforms to the promises they have made to their own users—their own stated terms and conditions—promises that too often are broken with no repercussions. We promised legislation that would bring some fundamental accountability to the online world. That legislation is here in the form of the groundbreaking Online Safety Bill. We are leading the way and free democracies across the globe are watching carefully to see how we progress this legislation.

The Bill has our children’s future, their unhindered development and their wellbeing at its heart, while at the same time providing enhanced protections for freedom of speech. At this point, I wish to pay tribute to my predecessors, who have each trodden the difficult path of balancing freedom of speech and addressing widespread harms, including my immediate predecessor and, in particular, my hon. Friend the Member for Gosport (Dame Caroline Dinenage), who worked so hard, prior to my arrival in the Department for Digital, Culture, Media and Sport, with stakeholders and platforms, digging in to identify the scope of the problem.

Let me summarise the scope of the Bill. We have reserved our strongest measures in this legislation for children. For the first time, platforms will be required under law to protect children and young people from all sorts of harm, from the most abhorrent child abuse to cyber-bullying and pornography. Tech companies will be expected to use every possible tool to do so, including introducing age-assurance technologies, and they will face severe consequences if they fail in the most fundamental of requirements to protect children. The bottom line is that, by our passing this legislation, our youngest members of society will be far safer when logging on. I am so glad to see James Okulaja and Alex Holmes from The Diana Award here today, watching from the Gallery as we debate this groundbreaking legislation. We have worked closely with them as we have developed the legislation, as they have dedicated a huge amount of their time to protecting children from online harms. This Bill is for them and those children.

The second part of the Bill makes sure that platforms design their services to prevent them from being abused by criminals. When illegal content does slip through the net, such as child sex abuse and terrorist content, they will need to have effective systems and processes in place to quickly identify it and remove it from their sites. We will not allow the web to be a hiding place or a safe space for criminals. The third part seeks to force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all sorts of other unacceptable behaviour that they claim not to allow but that ruins lives in practice. In other words, we are just asking the largest platforms to simply do what they say they will do, as we do in all good consumer protection measures in any other industry. If platforms fail in any of those basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down the full weight of the law upon them.

Several hon. Members rose—

Ms Dorries

I will take just two more interventions and that will be it, otherwise people will not have a chance to speak.

Sir John Hayes (South Holland and The Deepings) (Con)

I am very grateful to my right hon. Friend for giving way. The internet giants that run the kind of awful practices that she has described have for too long been unaccountable, uncaring and unconscionable in the way they have fuelled every kind of spite and fed every kind of bigotry. Will she go further in this Bill and ensure that, rather like any other publisher, if those companies are prepared to allow anonymous posts, they are held accountable for those posts and subject to the legal constraints that a broadcaster or newspaper would face?

Ms Dorries

These online giants will be held accountable to their own terms and conditions. They will no longer be able to allow illegal content to be published, and we will also be listing in secondary legislation the harms that are legal but harmful. We will be holding those tech giants to account.

Munira Wilson (Twickenham) (LD)

I thank the Secretary of State for giving way. She talked about how this Bill is going to protect children much more, and it is a welcome step forward. However, does she accept that there are major gaps in this Bill? For instance, gaming is not covered. It is not clear whether things such as virtual reality and the metaverse are going to be covered. [Interruption.] It is not clear and all the experts will tell us that. The codes of practice in the Bill are only recommended guidance; they are not mandatory and binding on companies. That will encourage a race to the bottom.

Ms Dorries

The duties are mandatory; it is the Online Safety Bill and the metaverse is included in the Bill. Not only is it included, but, moving forward, the provisions in the Bill will allow us to move swiftly with the metaverse and other things. We did not even know that TikTok existed when this Bill started its journey. These provisions will allow us to move quickly to respond.

Several hon. Members rose—

Ms Dorries

I will take one more intervention, but that is it.

Damian Green (Ashford) (Con)

I am grateful to my right hon. Friend for giving way. One of the most important national assets that needs protecting in this Bill and elsewhere is our reputation for serious journalism. Will she therefore confirm that, as she has said outside this House, she intends to table amendments during the passage of the Bill that will ensure that platforms and search engines that have strategic market status protect access to journalism and content from recognised news publishers, ensuring that it is not moderated, restricted or removed without notice or right of appeal, and that those news websites will be outside the scope of the Bill?

Ms Dorries

We have already done that—it is already in the Bill.

Daniel Kawczynski (Shrewsbury and Atcham) (Con)

Will my right hon. Friend give way?

Ms Dorries

No, I have to continue.

Not only will the Bill protect journalistic content, democratic content and democratic free speech, but if one of the tech companies wanted to take down journalistic content, the Bill includes a right of appeal for journalists, which currently does not exist. We are doing further work on that to ensure that content remains online while the appeal takes place. The appeal process has to be robust and consistent across the board for all the appeals that take place. We have already done more work on that issue in this version of the Bill and we are looking to do more as we move forward.

As I have said, we will not allow the web to be a hiding place or safe space for criminals and when illegal content does slip through the net—such as child sex abuse and terrorist content— online platforms will need to have in place effective systems and processes to quickly identify that illegal content and remove it from their sites.

The third measure will force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all the other unacceptable behaviours. In other words, we are asking the largest platforms to do what they say they will do, just as happens with all good consumer-protection measures in any other industry. Should platforms fail in any of their basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down upon them the full weight of the law. Such action includes searching platforms’ premises and confiscating their equipment; imposing huge fines of up to 10% of their global turnover; pursuing criminal sanctions against senior managers who fail to co-operate; and, if necessary, blocking their sites in the UK.

We know that tech companies can act very quickly when they want to. Last year, when an investigation revealed that Pornhub allowed child sexual exploitation and abuse imagery to be uploaded to its platform, Mastercard and Visa blocked the use of their cards on the site. Lo and behold, threatened with the prospect of losing a huge chunk of its profit, Pornhub suddenly removed nearly 10 million child sexual exploitation videos from its site overnight. These companies have the tools but, unfortunately, as they have shown time and again, they need to be forced to use them. That is exactly what the Bill will do.

Before I move on, let me point out something very important: this is not the same Bill as the one published in draft form last year. I know that Members throughout the House are as passionate as I am about getting this legislation right, and I had lots of constructive feedback on the draft version of the Bill. I have listened carefully to all that Members have had to say throughout the Bill’s process, including by taking into account the detailed feedback from the Joint Committee, the Digital, Culture, Media and Sport Committee and the Petitions Committee. They have spent many hours considering every part of the Bill, and I am extremely grateful for their dedication and thorough recommendations on how the legislation could be improved.

As a result of that feedback process, over the past three months or so I have strengthened the legislation in a number of important ways. There were calls for cyber-flashing to be included; cyber-flashing is now in the Bill. There were calls to ensure that the legislation covered all commercial pornography sites; in fact, we have expanded the Bill’s scope to include every kind of provider of pornography. There were concerns about anonymity, so we have strengthened the Bill so that it now requires the biggest tech platforms to offer verification and empowerment tools for adult users, allowing people to block anonymous trolls from the beginning.

I know that countless MPs are deeply concerned about how online fraud—particularly scam ads—has proliferated over the past few years. Under the new version of the Bill, the largest and highest-risk companies—those that stand to make the most profit—must tackle scam ads that appear on their services.

We have expanded the list of priority offences named on the face of the legislation to include not just terrorism and child abuse imagery but revenge porn, fraud, hate crime, encouraging and assisting suicide, and organised immigration crime, among other offences.

If anyone doubted our appetite to go after Silicon Valley executives who do not co-operate with Ofcom, they will see that we have strengthened the Bill so that the criminal sanctions for senior managers will now come into effect as soon as possible after Royal Assent— I am talking weeks, not years. We have expanded the things for which those senior managers will be criminally liable to cover falsifying data, destroying data and obstructing Ofcom’s access to their premises.

In addition to the regulatory framework in the Bill that I have described, we are creating three new criminal offences. While the regulatory framework is focused on holding companies to account, the criminal offences will be focused on individuals and the way people use and abuse online communications. Recommended by the Law Commission, the offences will address coercive and controlling behaviour by domestic abusers; threats to rape, kill or inflict other physical violence; and the sharing of dangerous disinformation deliberately to inflict harm.

This is a new, stronger Online Safety Bill. It is the most important piece of legislation that I have ever worked on and it has been a huge team effort to get here. I am confident that we have produced something that will protect children and the most vulnerable members of society while being flexible and adaptable enough to meet the challenges of the future.

Let me make something clear in relation to freedom of speech. Anyone who has actually read the Bill will recognise that its defining focus is the tackling of serious harm, not the curtailing of free speech or the prevention of adults from being upset or offended by something they have seen online. In fact, along with countless others throughout the House, I am seriously concerned about the power that big tech has amassed over the past two decades and the huge influence that Silicon Valley now wields over public debate.

We in this place are not the arbiters of free speech. We have left it to unelected tech executives on the west coast to police themselves. They decide who is and who is not allowed on the internet. They decide whose voice should be heard and whose should be silenced—whose content is allowed up and what should be taken down. Too often, their decisions are arbitrary and inconsistent. We are left, then, with a situation in which the president of the United States can be banned by Twitter while the Taliban is not; in which talkRADIO can be banned by YouTube for 12 hours; in which an Oxford academic, Carl Heneghan, can be banned by Twitter; or in which an article in The Mail on Sunday can be plastered with a “fake news” label—all because they dared to challenge the west coast consensus or to express opinions that Silicon Valley does not like.

It is, then, vital that the Bill contains strong protections for free speech and for journalistic content. For the first time, under this legislation all users will have an official right to appeal if they feel their content has been unfairly removed. Platforms will have to explain themselves properly if they remove content and will have special new duties to protect journalistic content and democratically important content. They will have to keep those new duties in mind whenever they set their terms and conditions or moderate any content on their sites. I emphasise that the protections are new. The new criminal offences update section 1 of the Malicious Communications Act 1988 and section 127 of the Communications Act 2003, which were so broad that they interfered with free speech while failing to address seriously harmful consequences.

Without the Bill, social media companies would be free to continue to arbitrarily silence or cancel those with whom they do not agree, without any need for explanation or justification. That situation should be intolerable for anyone who values free speech. For those who quite obviously have not read the Bill and say that it concedes power to big tech companies, I have this to say: those big tech companies have all the power in the world that they could possibly want, right now. How much more power could we possibly concede?

That brings me to my final point. We now face two clear options. We could choose not to act and leave big tech to continue to regulate itself and mark its own homework, as it has been doing for years with predictable results. We have already seen that too often, without the right incentives, tech companies will not do what is needed to protect their users. Too often, their claims about taking steps to fix things are not backed up by genuine actions.

I can give countless examples from the past two months alone of tech not taking online harm and abuse seriously, wilfully promoting harmful algorithms or putting profit before people. A recent BBC investigation showed that women’s intimate pictures were being shared across the platform Telegram to harass, shame and blackmail women. The BBC reported 100 images to Telegram as pornography, but 96 were still accessible a month later. Tech did not act.

Twitter took six days to suspend the account of rapper Wiley after his disgusting two-day antisemitic rant. Just last week, the Centre for Countering Digital Hate said that it had reported 253 accounts to Instagram as part of an investigation into misogynistic abuse on the platform, but almost 90% remained active a month later. Again, tech did not act.

Remember: we have been debating these issues for years. They were the subject of one of my first meetings in this place in 2005. During that time, things have got worse, not better. If we choose the path of inaction, it will be on us to explain to our constituents why we did nothing to protect their children from preventable risks, such as grooming, pornography, suicide content or cyber-bullying. To those who say protecting children is the responsibility of parents, not the job of the state, I would quote the 19th-century philosopher John Stuart Mill, one of the staunchest defenders of individual freedom. He wrote in “On Liberty” that the role of the state was to fulfil the responsibility of the parent in order to protect a child where a parent could not. If we choose not to act, in the years to come we will no doubt ask ourselves why we did not act to impose fundamental online protections.

However, we have another option. We can pass this Bill and take huge steps towards tackling some of the most serious forms of online harm: child abuse, terrorism, harassment, death threats, and content that is harming children across the UK today. We could do what John Stuart Mill wrote was the core duty of Government. The right to self-determination is not unlimited. An action that results in doing harm to another is not only wrong, but wrong enough that the state can intervene to prevent that harm from occurring. We do that in every other part of our life. We erect streetlamps to make our cities and towns safer. We put speed limits on our roads and make seatbelts compulsory. We make small but necessary changes to protect people from grievous harm. Now it is time to bring in some fundamental protections online.

We have the legislation ready right now in the form of the Online Safety Bill. All we have to do is pass it. I am proud to commend the Bill to the House.

Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. Before I call the shadow Secretary of State, it will be obvious to the House that we have approximately one hour for Back-Bench contributions and that a great many people want to speak. I warn colleagues that not everybody will have the opportunity and that there will certainly be a time limit, which will probably begin at five minutes.

20:02
Lucy Powell (Manchester Central) (Lab/Co-op)

Thank you, Madam Deputy Speaker. It has been a busy day, and I will try to keep my remarks short. It is a real shame that the discussion of an important landmark Bill, with so many Members wanting to contribute, has been squeezed into such a tiny amount of time.

Labour supports the principles of the Online Safety Bill. There has been a wild west online for too long. Huge platforms such as Facebook and Google began as start-ups but now have huge influence over almost every aspect of our lives: how we socialise and shop, where we get our news and views, and even the outcomes of elections and propaganda wars. There have been undoubted benefits, but the lack of regulation has let harms and abuses proliferate. From record reports of child abuse to soaring fraud and scams, from racist tweets to Russia’s disinformation campaigns, there are too many harms that, as a society, we have been unable or unwilling to address.

There is currently no regulator. However, neither the Government nor Silicon Valley should have control over what we can say and do online. We need strong, independent regulation.

Dan Carden (Liverpool, Walton) (Lab)

Will my hon. Friend give way?

Lucy Powell

I will give way once on this point.

Dan Carden

I am grateful. The Secretary of State talked about getting the tech giants to follow their own rules, but we know from Frances Haugen, the Facebook whistleblower, that companies were driving children and adults to harmful content, because it increased engagement. Does that not show that we must go even further than asking them to follow their own rules?

Lucy Powell

I very much agree with my hon. Friend, and I will come on to talk about that shortly.

The Online Safety Bill is an important step towards strong, independent regulation. We welcome the Bill’s overall aim: the duty of care framework based on the work of the Carnegie Trust. I agree with the Secretary of State that the safety of children should be at the heart of this regulation. The Government have rightly now included fraud, online pornography and cyber-flashing in the new draft of the Bill, although they should have been in scope all along.

Wera Hobhouse (Bath) (LD)

Will the hon. Lady give way?

Lucy Powell

I am not going to give way, sorry.

Before I get onto the specifics, I will address the main area of contention: the balance between free speech and regulation, most notably expressed via the “legal but harmful” clauses.

Christian Wakeford (Bury South) (Lab)

Will my hon. Friend give way?

Lucy Powell

I will give way one last time.

Christian Wakeford

I thank my hon. Friend. The Government have set out the priority offences in schedule 7 to the Bill, but legal harms have clearly not been specified. Given the torrent of racist, antisemitic and misogynistic abuse that grows every single day, does my hon. Friend know why the Bill has not been made more cohesive with a list of core legal harms, allowing for emerging threats to be dealt with in secondary legislation?

Lucy Powell

I will come on to some of those issues. My hon. Friend makes a valid point.

I fear the Government’s current solution to the balance between free speech and regulation will please no one and takes us down an unhelpful rabbit hole. Some believe the Bill will stifle free speech, with platforms over-zealously taking down legitimate political and other views. In response, the Government have put in what they consider to be protections for freedom of speech and have committed to setting out an exhaustive list of “legal but harmful” content, thus relying almost entirely on a “take down content” approach, which many will still see as Government overreach.

On the other hand, those who want harmful outcomes addressed through stronger regulation are left arguing over a yet-to-be-published list of Government-determined harmful content. This content-driven approach moves us in the wrong direction away from the “duty of care” principles the Bill is supposed to enshrine. The real solution is a systems approach based on outcomes, which would not only solve the free speech question, but make the Bill overall much stronger.

What does that mean in practice? Essentially, rather than going after individual content, go after the business models, systems and policies that drive the impact of such harms—[Interruption.] The Minister for Security and Borders, the right hon. Member for East Hampshire (Damian Hinds), says from a sedentary position that that is what the Bill does, but none of the leading experts in the field think the same. He should talk to some of them before shouting at me.

The business models of most social media companies are currently based on engagement, as my hon. Friend the Member for Liverpool, Walton (Dan Carden) outlined. The more engagement, the more money they make, which rewards controversy, sensationalism and fake news. A post containing a racist slur or anti-vax comment that nobody notices, shares or reads is significantly less harmful than a post that is quickly able to go viral. A collective pile-on can have a profoundly harmful effect on the young person on the receiving end, even though most of the individual posts would not meet the threshold of harmful.

Matt Rodda (Reading East) (Lab)

Will my hon. Friend give way on that point?

Lucy Powell

I will not, sorry. Facebook whistleblower Frances Haugen, who I had the privilege of meeting, cited many examples to the Joint Committee on the draft Online Safety Bill of Facebook’s models and algorithms making things much worse. Had the Government chosen to follow the Joint Committee recommendations for a systems-based approach rather than a content-driven one, the Bill would be stronger and concerns about free speech would be reduced.

Lucy Powell

I am sorry, but too many people want to speak. Members should talk to their business managers, who have cut—[Interruption.] I know the hon. Gentleman was Chair of the Committee—[Interruption.]

Madam Deputy Speaker (Dame Eleanor Laing)

Order. The hon. Lady is not giving way. Let us get on with the debate.

Lucy Powell

The business managers have failed everybody on both sides given the time available.

A systems-based approach also has the benefit of tackling the things that platforms can control, such as how content spreads, rather than what they cannot control, such as what people post. We would avoid the cul-de-sac of arguing over the definitions of what content is or is not harmful, and instead go straight to the impact. I urge the Government to adopt the recommendations that have been made consistently to focus the Bill on systems and models, not simply on content.

Turning to other aspects of the Bill, key issues with its effectiveness remain. The first relates to protecting children. As any parent will know, children face significant risks online, from poor body image, bullying and sexist trolling to the most extreme grooming and child abuse, which is, tragically, on the rise. This Bill is an important opportunity to make the internet a safe place for children. It sets out duties on platforms to prevent children from encountering illegal, harmful or pornographic content. That is all very welcome.

However, despite some of the Government’s ambitious claims, the Bill still falls short of fully protecting children. As the National Society for the Prevention of Cruelty to Children argues, the Government have failed to grasp the dynamics of online child abuse and grooming—[Interruption.] Again, I am being heckled from the Front Bench, but if Ministers engage with the children’s charities they will find a different response. For example—[Interruption.] Yes, but they are not coming out in support of the Bill, are they? For example, it is well evidenced that abusers will often first interact with children on open sites and then move to more encrypted platforms. The Government should require platforms to collaborate to reduce harm to children, prevent abuse from being displaced and close loopholes that let abusers advertise to each other in plain sight.

The second issue is illegal activity. We can all agree that what is illegal offline should be illegal online, and all platforms will be required to remove illegal content such as terrorism, child sex abuse and a range of other serious offences. It is welcome that the Government have set out an expanded list, but they can and must go further. Fraud was the single biggest crime in the UK last year, yet the Business Secretary dismissed it as not affecting people’s everyday lives.

The approach to fraud in this Bill has been a bit like the hokey-cokey: the White Paper said it was out, then it was in, then it was out again in the draft Bill and finally it is in again, but not for the smaller sites or the search services. The Government should be using every opportunity to make it harder for scammers to exploit people online, backed up by tough laws and enforcement. What is more, the scope of this Bill still leaves out too many of the Law Commission’s recommendations on online crimes.

The third issue is disinformation. The war in Ukraine has unleashed Putin’s propaganda machine once again. That comes after the co-ordinated campaign by Russia to discredit the truth about the Sergei Skripal poisonings. Many other groups have watched and learned: from covid anti-vaxxers to climate change deniers, the internet is rife with dangerous disinformation. The Government have set up a number of units to tackle disinformation and claim to be working with social media companies to take it down. However, that is opaque and far from optimal. The only mention of disinformation in the Bill is that a committee should publish a report. That is far from enough.

Returning to my earlier point, it is the business models and systems of social media companies that create a powerful tool for disinformation and false propaganda to flourish. Being a covid vaccine sceptic is one thing, but being able to quickly share false evidence dressed up as science to millions of people within hours is a completely different thing. It is the power of the platform that facilitates that, and it is the business models that encourage it. This Bill hardly begins to tackle those societal and democratic harms.

The fourth issue is online abuse. From racism to incels, social media has become a hotbed for hate. I agree with the Secretary of State that that has poisoned public life. I welcome steps to tackle anonymous abuse. However, we still do not know what the Government will designate as legal but harmful, which makes it very difficult to assess whether the Bill goes far enough, or indeed too far. I worry that those definitions are left entirely to the Secretary of State to determine. A particularly prevalent and pernicious form of online hate is misogyny, but violence against women and girls is not mentioned at all in the Bill—a serious oversight.

The decision on which platforms will be regulated by the Bill is also arbitrary and flawed. Only the largest platforms will be required to tackle harmful content, yet smaller platforms, which can still have a significant, highly motivated, well-organised and particularly harmful user base, will not. Ofcom should regulate based on risk, not just on size.

The fifth issue is that the regulator and the public need the teeth to take on the big tech companies, with all the lawyers they can afford. It is a David and Goliath situation. The Bill gives Ofcom powers to investigate companies and fine them up to 10% of their turnover, and there are some measures to help individual users. However, if bosses in Silicon Valley are to sit up and take notice of this Bill, it must go further. It should include stronger criminal liability, protections for whistleblowers, a meaningful ombudsman for individuals, and a route to sue companies through the courts.

The final issue is future-proofing, which we have heard something about already. This Bill is a step forward in dealing with the likes of Twitter, Facebook and Instagram—although it must be said that many companies have already begun to get their house in order ahead of any legislation—but it will have taken nearly six years for the Bill to appear on the statute book.

Since the Bill was first announced, TikTok has emerged on the scene, and Facebook has renamed itself Meta. The metaverse is already posing dangers to children, with virtual reality chat rooms allowing them to mix freely with predatory adults. Social media platforms are also adapting their business models to avoid regulation; Twitter, for example, says that it will decentralise and outsource moderation. There is a real danger that when the Bill finally comes into effect, it will already be out of date. A duty of care approach, focused on outcomes rather than content, would create a much more dynamic system of regulation, able to adapt to new technologies and platforms.

In conclusion, social media companies are now so powerful and pervasive that regulating them is long overdue. Everyone agrees that the Bill should reduce harm to children and prevent illegal activity online, yet there are serious loopholes, as I have laid out. Most of all, the focus on individual content rather than business models, outcomes and algorithms will leave too many grey areas and black spots, and will not satisfy either side in the free speech debate.

Despite full prelegislative scrutiny, the Government have been disappointingly reluctant to accept those bigger recommendations. In fact, they are going further in the wrong direction. As the Bill progresses through the House, we will work closely with Ministers to improve and strengthen it, to ensure that it truly becomes a piece of world-leading legislation.

Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

We will begin with a time limit of five minutes, but that is likely to reduce.

20:16
Julian Knight (Solihull) (Con)

Some colleagues have been in touch with me to ask my view on one overriding matter relating to this Bill: does it impinge on our civil liberties and our freedom of speech? I say to colleagues that it does neither, and I will explain how I have come to that conclusion.

In the mid-1990s, when social media and the internet were in their infancy, the forerunners of the likes of Google scored a major win in the United States. Effectively, they got the US Congress to agree to the greatest “get out of jail free” card in history: namely, to agree that social media platforms are not publishers and are not responsible for the content they carry. That has led to a huge flowering of debate, knowledge sharing and connections between people, the likes of which humanity has never seen before. We should never lose sight of that in our drive to fairly regulate this space. However, those platforms have also been used to cause great harm in our society, and because of their “get out of jail free” card, the platforms have not been accountable to society for the wrongs that are committed through them.

That is quite simplistic. I emphasise that as time has gone by, social media platforms have to some degree recognised that they have responsibilities, and that the content they carry is not without impact on society—the very society that they make their profits from, and that nurtured them into existence. Content moderation has sprung up, but it has been a slow process. It is only a few years ago that Google, a company whose turnover is higher than the entire economy of the Netherlands, was spending more on free staff lunches than on content moderation.

Content moderation is decided by algorithms, based on terms and conditions drawn up by the social media companies without any real public input. That is an inadequate state of affairs. Furthermore, where platforms have decided to act, there has been little accountability, and there can be unnecessary takedowns, as well as harmful content being carried. Is that democratic? Is it transparent? Is it right?

These masters of the online universe have a huge amount of power—more than any industrialist in our history—without facing any form of public scrutiny, legal framework or, in the case of unwarranted takedowns, appeal. I am pleased that the Government have listened in part to the recommendations published by the Digital, Culture, Media and Sport Committee, in particular on Parliament’s being given control through secondary legislation over legal but harmful content and its definition—an important safeguard for this legislation. However, the Committee and I still have queries about some of the Bill’s content. Specifically, we are concerned about the risks of cross-platform grooming and breadcrumbing—perpetrators using seemingly innocuous content to trap a child into a sequence of abuse. We also think that it is a mistake to focus on category 1 platforms, rather than extending the provisions to other platforms such as Telegram, which is a major carrier of disinformation. We need to recalibrate to a more risk-based approach, rather than just going by the numbers. These concerns are shared by charities such as the National Society for the Prevention of Cruelty to Children, as the hon. Member for Manchester Central (Lucy Powell) said.

On a systemic level, consideration should be given to allowing organisations such as the Internet Watch Foundation to identify where companies are failing to meet their duty of care, in order to prevent Ofcom from being influenced and captured by the heavy lobbying of the tech industry. There has been reference to the lawyers that the tech industry will deploy. If we look at any newspaper or LinkedIn, we see that right now, companies are recruiting, at speed, individuals who can potentially outgun regulation. It would therefore be sensible to bring in outside elements to provide scrutiny, and to review matters as we go forward.

On the culture of Ofcom, there needs to be greater flexibility. Simply reacting to a large number of complaints will not suffice. There needs to be direction and purpose, particularly with regard to the protection of children. We should allow for some forms of user advocacy at a systemic level, and potentially at an individual level, where there is extreme online harm.

On holding the tech companies to account, I welcome the sanctions regime and having named individuals at companies who are responsible. However, this Bill gives us an opportunity to bring about real culture change, as has happened in financial services over the past two decades. During Committee, the Government should actively consider the suggestion put forward by my Committee—namely, the introduction of compliance officers to drive safety by design in these companies.

Finally, I have concerns about the definition of “news publishers”. We do not want Ofcom to be effectively a regulator or a licensing body for the free press. However, I do not want in any way to do down this important and improved Bill. I will support it. It is essential. We must have this regulation in place.

John Nicolson (Ochil and South Perthshire) (SNP)

Thank you, Madam Deputy Speaker, but I was under the impression that I was to wind up for my party, rather than speaking at this juncture.

Madam Deputy Speaker

If the hon. Gentleman would prefer to save his slot until later—

John Nicolson

I would, Madam Deputy Speaker, if that is all right with you.

Madam Deputy Speaker

Then we shall come to that arrangement. I call Dame Margaret Hodge.

20:22
Dame Margaret Hodge (Barking) (Lab)

Thank you, Madam Deputy Speaker. I hope that I will take only three minutes.

The human cost of abuse on the internet is unquantifiable—from self-harm to suicide, grooming to child abuse, and racism to misogyny. A space we thought gave the unheard a legitimate voice has become a space where too many feel forced to stay offline. As a Jewish female politician online, I have seen my identities perversely tied together to discredit my character and therefore silence my voice. I am regularly accused of being a “Zionist hag”, a “paedophile” and a “Nazi”. But this is not just about politicians. We all remember the tsunami of racism following the Euros, and we know women are targeted more online than men. Social media firms will not tackle this because their business model encourages harmful content. Nasty content attracts more traffic; more traffic brings more advertising revenue; and more revenue means bigger profits. Legislation is necessary to make the social media firms act. However, this Bill will simply gather dust if Ofcom and the police remain underfunded. The “polluter pays” principle—that is, securing funding through a levy on the platforms—would be much fairer than taxpayers picking up the bill for corporate failures.

I cherish anonymity for whistleblowers and domestic violence victims—it is vital—but when it is used as a cloak to harm others, it should be challenged. The Government’s halfway measure allows users to choose to block anonymous posts by verifying their own identity. That ignores police advice not to block abusive accounts, as those accounts help to identify genuine threats to individuals, and it ignores the danger of giving platforms the power to verify identities. We should think about the Cambridge Analytica scandal. Surely a third party with experience in unique identification should carry out checks on users. Then we all remain anonymous to platforms, but can be traced by law enforcement if found guilty of harmful abuse. We can then name and shame offenders.

On director liability, fines against platforms become a business cost and will not change behaviour, so personal liability is a powerful deterrent. However, enforcing this liability only when a platform fails to supply information to Ofcom is feeble. Directors must be made liable for breaching safety duties.

Finally, as others have said, most regulations apply only to category 1 platforms. Search engines fall through the cracks; BitChute, Gab, 4chan—all escape, but as we saw in the attacks on Pittsburgh’s synagogue and Christchurch’s mosque, all these platforms helped to foster those events. Regulation must be based on risk, not size. Safety should be embedded in any innovative products, so concern about over-regulating innovation is misplaced. This is the beginning of a generational change. I am grateful to Ministers, because I do think they have listened. If they continue to listen, we can make Britain the safest place online.

20:26
Mr John Whittingdale (Maldon) (Con)

This Bill is a groundbreaking piece of legislation, and we are one of the first countries to attempt to bring in controls over content online. I therefore share the view of the hon. Member for Manchester Central (Lucy Powell) that it is a great pity that its Second Reading was scheduled for a day when there is so much other business.

The Bill has been a long time in the preparation. I can remember chairing an inquiry of the Culture, Media and Sport Committee in 2008 on the subject of harmful content online. Since then, we have had a Green Paper, a White Paper, a consultation, a draft Bill, a Joint Committee, and several more Select Committee inquiries. It is important that we get this right, and the Bill has grown steadily, as the Secretary of State outlined. I do not need to add to the reasons why it is important that we control content and protect vulnerable people from online content that is harmful to them.

There are two areas where I want to express a word of caution. First, as the Under-Secretary, my hon. Friend the Member for Croydon South (Chris Philp), is very much aware, the Government have an ambition to make the United Kingdom the tech capital of the world. We have been incredibly successful in attracting investment. He will know better than I that the tech industry in Britain is now worth over $1 trillion, and that we have over 100 unicorns, but the Bill creates uncertainty, mainly because so much is subject to secondary legislation and not spelled out in detail in the Bill. This will stifle innovation and growth.

It is fairly obvious which are the main companies that will fall into the category 1 definition. We are told that there may be some 15 to 20. Some of them are certainly obvious. However, I share the view that this needs to be determined more by risk than by reach. A company does not necessarily pose a significant risk simply because it is large. Companies such as Tripadvisor, eBay and Airbnb, which, on the size criteria, might fall within scope of category 1, should not do so. I hope that the Secretary of State and the Minister can say more about the precise definitions that will determine categories. This is more serious for the category 2 companies; it is estimated that some 25,000 may fall within scope. It is not clear precisely what the obligations on them will be, and that too is causing a degree of uncertainty. It is also unclear whether some parts of a large company with several businesses, such as Amazon, would be in category 1 or category 2, or what would happen if companies grow. Could they, for instance, be re-categorised from 1 to 2? These concerns are being raised by the tech industry, and I hope that my hon. Friend the Minister will continue to talk to techUK, to allay those fears.

The second issue, as has been rightly identified, is the effect on freedom of speech. As has been described, tech platforms already exercise censorship. At the moment, they exercise their own judgment as to what is permissible and what is not, and we have had examples such as YouTube taking down the talkRadio channel. I spent a great deal of time talking to the press and media about the special protections that journalism needs, and I welcome the progress that has been made in the Bill. It is excellent that journalistic content will be put in a special category. I repeat the question asked by my right hon. Friend the Member for Ashford (Damian Green). The Secretary of State made some very welcome comments on, I think, “This Morning” about the introduction of an additional protection so that, if a journalist’s shared content were removed from an online platform, they would need to be informed and able to appeal. That may require additional amendments to the Bill, so perhaps the Minister could say when we are likely to see those.

There is also the concern raised by the periodical publishers that specialist magazines appear to be outside the protection of journalistic content. I hope that that can be addressed, because there are publications that deserve the same level of protection.

There is a wider concern about freedom of speech. The definition of “legal but harmful” raises real concerns, particularly given that it is left to subsequent secondary legislation to set out exactly what the categories will be. There is also a widespread concern that we need to avoid, at all costs, setting a precedent that may be used by others who are more keen to censor discussion online. In particular, clause 103(2)(b) relates to messaging services and allows Ofcom to require the use of accredited technology to identify child sexual exploitation and abuse (CSEA) material. The Minister will be aware that that matter is also causing concern.

20:31
Darren Jones (Bristol North West) (Lab)

In the interest of time, I will just pose a number of questions, which I hope the Minister might address in summing up. The first is about the scope of the Bill. The Joint Committee of which I was a member recommended that the age-appropriate design code, which is very effectively used by the Information Commissioner, be used as a benchmark in the Bill, so that any services accessed or likely to be accessed by children are regulated for safety. I do not understand why the Government rejected that suggestion, and I would be pleased to hear from the Minister why they did so.

Secondly, the Bill delegates lots of detail to statutory instruments, codes of practice from the regulator, or later decisions by the Secretary of State. Parliament must see that detail before the Bill becomes an Act. Will the Minister commit to those delegated decisions being published before the Bill becomes an Act? Could he explain why the codes of practice are not being set as mandatory? I do not understand why codes of practice, much of the detail of which the regulator is being asked to set, will not be made mandatory for businesses. How can minimum standards for age or identity verification be imposed if those codes of practice are not made mandatory? Perhaps the Minister could explain.

Many users across the country will want to ensure that their complaints are dealt with effectively. We recommended an ombudsman service to deal with complaints that had been exhausted through the regulated companies’ own complaints systems, but the Government rejected it. Please could the Minister explain why?

I was pleased that the Government accepted the concept of a super-complaint being brought on behalf of groups of users, but the decision as to who will be able to bring a super-complaint has been deferred, subject to a decision by the Secretary of State. Why, and when will that decision be taken? If the Minister could allude to who they might be, I am sure that would be welcome.

Lastly, there are a number of exemptions and more work to be done, which leaves significant holes in the legislation. There is much more work to be done on clauses 5, 6 and 50—on democratic importance, on journalism and the definition of journalism, on the exemptions for news publishers, and on disinformation, which is mentioned only once in the entire Bill. I and many others recognise that these are not easy issues, but they should be considered fully before we pass legislation that has gaping holes for people who want to get around it, and for those who wish to test the parameters of this law in the courts, probably for many years. All of us, on a cross-party basis in this House, support the Government’s endeavours to make it safe for children and others to be online. We want the legislation to be implemented as quickly as possible and to be as effective as possible, but there are significant concerns that it will be jammed up in the judicial system, with this House unacceptably giving judges the job of fleshing out what many of the important exemptions will mean in practice.

The idea that the Secretary of State has the power to intervene with the independent regulator and tell it what it should or should not do obviously undermines the idea of an independent regulator. While Ministers might give assurances to this House that the power will not be abused, I believe that other countries, whether China, Russia, Turkey or anywhere else, will say, “Look at Great Britain. It thinks this is an appropriate thing to do. We’re going to follow the golden precedent set by the UK in legislating on these issues and give our Ministers the ability to decide what online content should be taken down.” That seems a dangerous precedent.

Darren Jones

The Minister is shaking his head, but I can tell him that the legislation does do that, because we looked at this and took evidence on it. The Secretary of State would be able to tell the regulator that content should be treated as “legal but harmful” and therefore should be removed as part of platforms’ systems design online. We also heard that the ability to do that at speed is very restricted, so the power is ineffective in the first place. The Government should therefore change their position. I do not understand why, in the face of evidence from pretty much every stakeholder, the Government believe that that is an appropriate use of power, or why Parliament would vote it through.

I look forward to the Minister giving his answers to those questions, in the hope that, as the Bill proceeds through the House, it can be tidied up and made tighter and more effective, to protect children and adults online in this country.

20:35
Damian Collins (Folkestone and Hythe) (Con)

This is an incredibly important Bill. It has huge cross-party support and was subject to scrutiny by the Joint Committee, which produced a unanimous report, which shows the widespread feeling in both Houses and on both sides of this Chamber that we should legislate. I do feel, though, that I should respond to some of the remarks of the shadow Secretary of State, the hon. Member for Manchester Central (Lucy Powell), on the Joint Committee report.

I agree with the hon. Member that, unless this legislation covers the systems of social media companies as well as the content hosted, it will not be effective, but it is my belief that it does that. Throughout the evidence that the Committee took, including from Ofcom and not just the Government, it was stated to us very clearly that the systems of social media companies are within scope and that, in preparing the risk registers for the companies, Ofcom can look at risks. For Facebook, that could include the fact that the news feed recommends content to users, while for someone on TikTok using For You, it could be the fact that the company is selecting—algorithmically ranking—content that someone might like. That could include, for a teenage girl, content that promoted self-harm that was being actively recommended by the company’s systems, or, as Frances Haugen set out, extremist content and hate speech being actively promoted and recommended by the systems.

That would be in scope. The algorithms are within scope, and part of Parliament’s job will be to ensure on an ongoing basis that Ofcom is using its powers to audit the companies in that way, to gain access to information in that way, and to say that the active promotion of regulated content by a social media company is an offence. In passing this Bill, we expect that that will be fully in scope. If the legislation placed no obligation on a company to proactively identify any copies of content that it had judged should not be there and had taken down, we would have a very ineffective system. In effect, we would have what Facebook does to assess content today. If that were effective, we would not need this legislation, but it is woefully ineffective, so the algorithms and the systems are in scope. The Bill gives Ofcom the power to regulate on that basis, and we have to ensure that it does so in preparing the risk registers.

Following what my Joint Committee colleague, the hon. Member for Bristol North West (Darren Jones), said, the point about the codes of practice is really important. The regulator sets the codes of practice for companies to follow. The Government set out in their response to the Joint Committee report that the regulator can tell companies if their response is not adequate. If an area of risk has been identified where the company has to create policies to address that risk and the response is not good enough, the regulator can still find the company in breach. I would welcome it if the Minister wished to say more about that, either today or as the Bill goes through the House, because it is really important. The response of a company to a request from the regulator, having identified a risk on its platforms, cannot be: “Oh, sorry, we don’t have a policy on that.” It has to be able to set those policies. We have to go beyond just enforcing the terms of service that companies have created for themselves. Making sure they do what they say they are going to do is really important, as the Secretary of State said, but we should be able to push them to go further.

I agree, though, with the hon. Member for Manchester Central and other hon. Members about regulation being based on risk and not just size. In reality, Ofcom will have to make judgment calls on smaller sites that are posing a huge risk or a new risk that has been identified.

The regulator will have the power to regulate metaverse and VR platforms. Anything that is a user-to-user service is already in scope of the legislation. The challenge for the regulator will be in moderating conversations between two people in a virtual room, which is much harder than when people are posting text-based content. The technology will have to adapt to do that, but we should start that journey on the basis that such services are already in scope.

Finally, on the much-used expression “legal but harmful”, I am pleased the Government took up one of our big recommendations, which is to write more offences clearly into the Bill, so it is clear what is actually being regulated—so promotion of self-harm is regulated content and hate speech is part of the regulated content. The job of the regulator then is to set the threshold at which intervention should come, and I think that should be based on case law. On many of these issues, such as the abuse of the England footballers after the final of the European championships, people have been sentenced in court for what they did. That creates good guidance and a good baseline for what hate speech is in that context and where we would expect intervention. I think it would be much easier for the Bill, the services that are regulated and the people who post content to know what the offences are and where the regulatory standard is. Rather than describing those things as “legal but harmful”, we should describe them as what they are, which is regulated offences based on existing offences in law.

The Government took an important step in their response by saying that any amendment to the codes of practice that brings new offences within the scope of these priority areas of harm should have to go through an affirmative procedure in both Houses. That is really important. Ultimately, the regulation should be based on our laws, and changes should be based on decisions taken in this House.

Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. After the next speaker, the time limit will be reduced to four minutes.

20:40
Kirsty Blackman (Aberdeen North) (SNP)

Thank you, Madam Deputy Speaker.

I want to focus on how people actually use the internet, particularly how young people use it. I feel, as was suggested in one of the comments in questions earlier, that this Bill and some of the discussion around it miss some of the point and some of the actual ways in which young people in particular use the internet.

We have not mentioned, or I have not heard anyone mention, Discord. I have not heard anyone mention Twitch. I have not heard people talking about how people interact on Fortnite. A significant number of young people use Fortnite to interact with their friends. That is the way they speak to their friends. I do not know if the Minister is aware of this, but you can only change the parental controls on Fortnite to stop your children speaking to everybody; you cannot stop them speaking to everybody but their friends. There are no parental controls on a lot of these sites that parents can adequately utilise. They only have this heavy-handed business where they can ban their child entirely from doing something, or they are allowed to do everything. I think some bits are missed in this because it does not actually reflect the way young people use the internet.

In the Girls’ Attitudes Survey produced by Girlguiding, 71% of the 2,000 girls who were surveyed said that they had experienced harmful content while online. But one of the important things I also want to stress is that a quarter of LGBQ and disabled girls found online forums and spaces an important source of support. So we need to make sure that children and young people have the ability to access those sources of support. Whether that is on forums, or on Fortnite, Minecraft, Animal Crossing or whatever it is they happen to be speaking to their friends on, it is important and key that young people can continue to communicate. That has been especially important during the pandemic.

There is at this moment a major parenting knowledge gap. There is a generation of parents who have not grown up using the internet. I was one of the first people to grow up using the internet and have kids; they are at the top end of primary school now. Once this generation of kids are adults, they will know how their children are behaving online and what the online world is like, because they will have lived through it themselves. The current generation of parents has not, and it has this knowledge gap.

I am finding that a lot of my kids’ friends have rules that I consider totally—totally—unacceptable and inappropriate because they do not match how kids actually use the internet and the interactions they are likely to have on there. I asked my kids what they thought was the most important thing, and they said the ability to choose what they see and what they do not see, and who they hear from and who they do not hear from. That was the most important thing to them.

That has been talked about in a lot of the information we have received—the requirement to look at algorithms and to opt in to being served with those algorithms, rather than having an opt-out, as we do with Facebook. Facebook says, “Are you sure you don’t want to see this content any more?” Well, yes, I have clicked that I do not want to see it—of course I do not want to see it any more. Of course I would like to see the things my hon. Friend the Member for Ochil and South Perthshire (John Nicolson) posts and all of the replies he sends to people—I want that to pop up with my notifications—but I should have to choose to do that.

Kids feel like that as well—my kids, and kids up and down the country—because, as has been talked about, once you get into these cycles of seeing inappropriate, harmful, damaging content, you are more likely to be served with more and more of that content. At the very first moment, people should be able to say, “Hang on, I don’t want to see any of this”, and when they sign up to a site they should immediately be able to say, “No, I don’t want to see any of this. All I want to do is speak to the people I know or have sent a friend request to and accepted a friend request from.” We need to ensure that there are enough safeguards like that in place for children and young people and their parents to be able to make those choices in the knowledge and understanding of how these services will actually be used, rather than MPs who do not necessarily use these services making these decisions. We need to have that flexibility.

My final point is that the internet is moving and changing. Twenty years ago I was going to LAN parties and meeting people I knew from online games. That is still happening today, and only now is the legislation catching up. It has taken that long for us to get here, so this legislation must be fit for the future. It must be flexible enough to work with the new technologies, social media and gaming platforms that are coming through.

20:45
Andrew Percy (Brigg and Goole) (Con)

I, too, regret the short time we have to debate this important Bill this evening. This is much-needed legislation and I agree with many of the comments already made.

These platforms have been warned over the years to take action yet have failed to do so. They have remained a safe space for racism, Holocaust denial, homophobia, conspiracy theories and general bullying. One of the best things I ever did for my mental health was to leave Twitter, but for many young people that is not an option, as it cuts them off from their friends and much of their social world. So I am proud that the Government are taking action on this but, as the Minister knows from my meetings with him alongside the Antisemitism Policy Trust, there are ways in which I think the Bill can be improved.

First, on small, high-harm platforms, I pay tribute to the Antisemitism Policy Trust, which has been leading the charge. As the hon. Member for Aberdeen North (Kirsty Blackman) said, everybody knows Facebook, Twitter and YouTube but few people are aware of a lot of the smaller platforms such as BitChute, 8kun—previously 8chan—or Minds. These small platforms are a haven for white supremacists, incels, conspiracy theorists and antisemites; it is where they gather, converse and share and spew their hate.

An example of that is a post from the so-called anti-Jewish meme repository on the platform Gab which showed a picture of goblins, in this instance the usual grotesque representation of those age-old Jewish physical stereotypes, alongside the phrase, “Are you ready to die in another Jewish war, Goyim?” That is the sort of stuff that is on these small platforms, and it is not rare; we see it all over. Indeed, many of these small platforms exist purely to spew such hate, but at present, despite the many measures in the Bill that I support, these sites will be sifted by Ofcom into two major categories based on their size and functionality. I met the Minister to discuss this point recently.

The Government have not so far been enthusiastic about risk being a determinant factor, for fear that too many of the small platforms would be drawn into scope. That is why I hope that, as this Bill progresses, the Minister will consider a small amendment to give Ofcom the power to draw the small but high-harm platforms into category 1 status, based on its assessments, the so-called super-complaints that we have heard about, or other means. That would add regulatory oversight and a burden on those platforms. This is all about putting pressure on them—requiring them to go through more hurdles to frustrate their business model of hate, and making it as uncomfortable as possible for them. I hope the Minister will look at that as the Bill progresses.

I am very short of time, but I also want to raise the issue of search, which the Minister knows I have raised previously. We in the all-party group against antisemitism found examples on Alexa and other voice-activated search platforms of responses that are deeply offensive and racist. I understand that the relationship a user enters into when making a search is different from having an account with a particular social media platform, but these search engines are providing access to all sorts of grotesque racist and misogynistic content, and I hope we can look at that as the Bill progresses.

20:49
Luke Pollard (Plymouth, Sutton and Devonport) (Lab/Co-op)

I welcome the Bill. It is an important step forward, and it is because I welcome it that I want to see it strengthened. It is an opportunity for us to get this right and, in particular, to learn lessons from where we have got it wrong in the past. I want to raise two different types of culture. The first is incel culture, which I would like to relate to the experience that we had in Keyham, with the mass shooting in Plymouth last year; the second is the consequences of being Instafamous.

It is just over six months since the tragic shooting in Keyham in which we lost five members of our community. The community feels incredibly strongly that we want to learn the lessons, no matter how painful or difficult they are, to ensure that something like this never happens again. We are making progress, working with the Home Office on gun law changes, in particular on linking medical records and gun certificates. One part is incredibly difficult, and that is addressing incel culture, which has been mentioned from the Front Bench by my hon. Friend the Member for Pontypridd (Alex Davies-Jones) and by the hon. Member for Brigg and Goole (Andrew Percy). It sits in the toxic underbelly of our internet and, in many cases, it sits on those smaller platforms to which this Bill will not extend the full obligations. I mention that because it results in real-world consequences.

I cannot allocate responsibility for what happened in the Keyham shooting, because the inquest is still under way and the police investigations are ongoing, but it is clear that online radicalisation contributed to it, and many of the smaller sites that will not be covered by the legislation perhaps contributed in part to that radicalisation.

When incel culture leads to violence, it is not treated as domestic terrorism; it falls between the stools. It must not fall between the stools of this legislation, so I would be grateful if the Minister agreed to meet me and members of the Keyham community to understand how his proposals relate to the lessons that we are learning in Keyham, to make sure that nothing like this can ever happen again. With the online radicalisation of our young men in particular, it is really important that we understand where the rescue routes are. This is not just about the legislation; it needs to be about how we rescue people from the routes that they are going down. I would like to understand from the Minister how we can ensure that there are rescue routes, and that schools, social services and mental health providers understand how to rescue people from incel culture and its online radicalisation, as well as from US gun culture—the glorification of guns—and the misogynistic culture that exists in this space.

The second point about culture is an important one about how we learn from young people. Plymouth is a brilliant place. It is home both to GOD TV—a global evangelical broadcaster—and to many porn production companies. It is quite an eclectic, creative setting. We need to look at how we can learn from the culture of being Instafamous. Being Instafamous is something that many of our young people look to from an early age. They look at perfect bodies and perfect smiles—an existence that is out of reach for many people. In many cases they view the creation of online pornography, via sites such as OnlyFans, as a natural and logical extension of being Instafamous. It is something that, sadly, can attract a huge amount of income, so young people taking their kit off at an early age, especially in their teenage years, can produce high earnings. I want to see the big companies challenged not to serve links to OnlyFans content on the Instagram profiles of under-18s. That sits in a grey area of the Bill. I would be grateful if the Minister looked at how we can take that seriously, so that we can challenge that culture and help build an understanding that being Instafamous must mean consent and protection.

20:53
Adam Afriyie (Windsor) (Con)

Overall, I very much welcome the Bill. It has been a long time coming, but none of us here would disagree that we need to protect our children, certainly from pornography and all sorts of harassment and awful things that are on the internet and online communications platforms. There is no argument or pushback there at all. I welcome the age verification side of things. We all welcome that.

The repeal of the Malicious Communications Act 1988 is a good move. The adjustment of a couple of sections of the Communications Act 2003 is also a really good, positive step, and I am glad that the Bill is before us now. I think pretty much everyone here would agree with the principles of the Bill, and I thank the Government for getting there eventually and introducing it. However, as chair of the freedom of speech all-party parliamentary group I need to say a few words and express a few concerns about some of the detail and some of the areas where the Bill could perhaps be improved still further.

The first point relates to the requirement that social media companies have regard to freedom of speech. It is very easy, with all the concerns we have—I have them too—to push too hard and say that social media companies should clamp down immediately on anything that could be even slightly harmful, even if it is uncertain what “harmful” actually means. We must not give them the powers or the incentive, through financial penalties, to shut down freedom of speech just in case something is seen to be harmful by somebody. As the Bill progresses, therefore, it would be interesting to look at whether there is an area where we can tighten up rights and powers on freedom of speech.

Secondly, there is the huge issue—one or two other Members have raised it—of definitions. Clearly, if we say that something that is illegal should not be there and should disappear, of course we would all agree with that. If we say that something that is harmful should not be there, should not be transmitted and should not be amplified, we start to get into difficult territory, because what is harmful for one person may not be harmful for another. So, again, we need to take a little more of a look at what we are talking about there. I am often called “Tory scum” online. I am thick-skinned; I can handle it. It sometimes happens in the Chamber here—[Laughter.]—but I am thick-skinned and I can handle it. So, what if there was an option online for me to say, “You know what? I am relaxed about seeing some content that might be a bit distasteful for others. I am okay seeing it and hearing it.”? In academic discourse in particular, it is really important to hear the other side of the argument, the other side of a discussion, the other side of a debate. Out of context, one phrase or argument might be seen to be really harmful to a certain group within society. I will just flag the trans debate. Even the mention of the word trans or the words male and female can really ignite, hurt and harm. We could even argue that it is severe harm. Therefore, we need to be very careful about the definitions we are working towards.

Finally, the key principle is that we should ensure that adults who have agency can make decisions for themselves. I hope social media companies can choose not to remove or amplify content, but to flag it, so that grown-ups with agency, like us and like a lot of the population, can choose to opt in or to opt out.

20:57
Carla Lockhart (Upper Bann) (DUP)

While long overdue, I welcome the Bill and welcome the fact that it goes some way to addressing some of the concerns previously raised in this House. I thank the Minister for his engagement and the manner in which the Government have listened, particularly on the issue of anonymity. While it is not perfect, we will continue to press for the cloak of anonymity, which allows faceless trolls to abuse and cause harm, to be removed.

In building the Bill, a logical cornerstone would be that what is illegal offline—on the street, in the workplace and in the schoolyard—is also illegal online. The level of abuse I have received at times on social media would certainly be a matter for the police if it happened in person. It is wrong that people can get away with it online. However, there are dangers to our right to free speech around regulating content that is legal but deemed harmful to adults. The Bill allows what is legal but harmful to adults to be decided by the Secretary of State. Whatever is included in that category now could be easily expanded in future by regulations, which we all know means limited parliamentary scrutiny. As responsible legislators, we must reflect on how that power could be misused in the future. It could be a tool for repressive censorship and that is surely something neither the Government nor this House would wish to see in a land where freedom of speech is such a fundamental part of what and who we are. Without robust free speech protections, all the weight of the duties on content that is legal but harmful to adults will be pushing in one direction, and sadly, that is censorship. I urge the Government to address that in the Bill.

We also need to look at the weakness of the Bill in relation to the protection, particularly for children and young people, from pornography. It is welcome that since the publication of the draft Bill, the Government have listened to concerns by introducing part 5. In eight days, it will be the fifth anniversary of the Digital Economy Act 2017 receiving Royal Assent. This Government took the decision not to implement part 3 of that Act. Those of us in the House who support age verification restrictions being placed on pornographic content are justifiably hesitant, wondering whether the Government will let children down again.

It could be 2025 before children are protected through age verification. Even if the Bill becomes law, there is still no certainty that the Government will commence the provisions. It simply cannot be left to the Secretary of State in 2025 to move secondary legislation to give effect to age verification. A commencement clause needs to be placed in the Bill. Children deserve the right to know that this Government will act for them this time.

Furthermore, the Bill needs to be consistent in how it deals with pornography across parts 3 and 5. Age verification is a simple concept: if a website, part of a website or a social media platform hosts or provides pornographic content, a person’s age should be verified before access. If a child went into a newsagent’s to attempt to buy a pornographic magazine, they would be challenged by the shopkeeper. This goes back to the cornerstone of this issue: illegal offline should mean illegal online. The concept may be simple, but the Bill, as drafted, adds unnecessary complexities. I ask the Minister to act and make parts 3 and 5 consistent. We should also give Ofcom more power in implementing the Bill.

21:01
Dean Russell (Watford) (Con)

I had the great privilege of sitting on the Joint Committee on the draft Bill before Christmas and working with the Chair, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), fantastic Members from across both Houses and amazing witnesses.

We heard repeated stories of platforms profiting from pain and prejudice. One story that really affected me was that of Zach Eagling, a heroic young boy who has cerebral palsy and epilepsy and who was targeted with flashing images by cruel trolls to trigger seizures. Those seizures have been triggered for other people with epilepsy, affecting their lives and risking not just harm, but potentially death, depending on their situation. That is why I and my hon. Friend the Member for Stourbridge (Suzanne Webb)—and all members of the Joint Committee, actually, because this was in our report—backed Zach’s law.

Kim Leadbeater (Batley and Spen) (Lab)

Ten-year-old Zach is a child in my constituency who has, as the hon. Member said, cerebral palsy and epilepsy, and he has been subjected to horrendous online abuse. I hope that the Minister can provide clarity tonight and confirm that Zach’s law—which recognises that online abuse and trolling can cause not just psychological harm and distress, but physical harm—will be covered in the Bill.

Dean Russell

My understanding—hopefully this will be confirmed from the Dispatch Box—is that Zach’s law will be covered by clause 150 in part 10, on communications offences, but I urge the Ministry of Justice to firm that up further.

One thing that really came through for me was the role of algorithms. The only analogy that I can find in the real world for the danger of algorithms is narcotics. These are organisations that focused on targeting harmful content at people to get them more addicted to harm and to harmful content. By doing that, they numbed the senses of people who were using technology and social media, so that they engaged in practices that did them harm, turning them against not only others, but themselves. We heard awful stories about people doing such things as barcoding—about young girls cutting themselves—which was the most vile thing to hear, especially as a parent myself. There was also the idea that it was okay to be abusive to other people, and the fact that it became normalised to hurt oneself, including in ways that cannot be undone in future.

That leads on to a point about numbing the senses. I am really pleased that in debating the Bill today we have talked about the metaverse, because the metaverse is not just some random technology that we might talk about; it is about numbing the senses. It is about people putting on virtual reality headsets and living in a world that is not reality, even if it is for a matter of minutes or hours. As we look at these technologies and at virtual reality, my concern is that children and young people will be encouraged to spend more time in worlds that are not real and that could include more harmful content. Such worlds are becoming increasingly realistic, increasingly powerful in the impact that they can have, and increasingly capable of user-to-user engagement.

I therefore think that although at the moment the Bill includes Meta and the metaverse, we need to look at it almost as a tech platform in its own right. We will not get everything right at first; I fully support the Bill as it stands, but as we move forward we will need to continue to improve it, test it and adapt it as new technologies come out. That is why I very much support the idea of a continuing Joint Committee specifically on online safety, so that as time goes by the issues can be scrutinised and we can look at whether Ofcom is delivering in its role. Ultimately, we need to use the Bill as a starting point to prevent harm now and for decades to come.

21:05
Liz Twist (Blaydon) (Lab)

I welcome the Bill, which is necessary and overdue, but I would like to raise two issues: how the Bill can tackle suicide and self-harm prevention, and mental health around body image for young people.

First, all suicide and self-harm content should be addressed across all platforms, regardless of size: it is not just the larger platforms that should be considered. The requirement imposed on category 1 platforms relating to legal but harmful suicide and self-harm content should be extended to all platforms, as many colleagues have said. There is a real concern that users will turn from the larger to the smaller platforms, so the issue needs to be addressed. Will the Minister confirm that even smaller platforms will be asked at the start to do an assessment of the risk they pose?

Secondly, the Secretary of State referred to secondary legislation, which will be necessary to identify legal but harmful suicide and self-harm content as a real priority for action. It would be really helpful if we could see that before the legislation is finally passed: it is a key issue and must be an urgent area of work.

Thirdly, I wonder whether the Government will look again at the Law Commission’s proposal that a new offence of encouraging or assisting serious self-harm be created, and that the Bill should make assisting self-harm a priority issue with respect to illegal content. Will the Minister look again at that proposal as the Bill progresses?

I also want to speak about damage to body image, particularly in relation to young people. All of us want to look our best on social media. Young people in particular face a real barrage of digitally enhanced and in many cases unrealistic images that can have a negative effect on body image. Research by the Mental Health Foundation shows that harmful material that damages body image can have a real negative effect on young people’s mental health. As other hon. Members have said, and as most of us know from our own experience, many of the images that we see on social media are driven by algorithms that can amplify the harm to young people. That is particularly concerning as an issue associated with the possible development of eating disorders and mental health conditions.

The Bill does include some provision on algorithms, but more needs to be done to protect our young people from that damage. I encourage the Government to consider amendments that would give more control over new algorithmic content and ensure that the safest settings are the default settings. Users should be given more control over the kind of advertising that they see and receive, to avoid excessive advertising showing perfect bodies. The Government should commit themselves to recognising material that damages body image as a serious form of harm.

There are many more detailed issues that I would have liked to raise tonight, but let me end by saying that we need to give serious consideration to ways of reducing the incidence of suicides and self-harm.

Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. I am reluctant to reduce the time limit, but I am receiving appeals for me to try to get more people in, so I will reduce it to three minutes. However, not everyone will have a chance to speak this evening.

21:09
Dame Caroline Dinenage (Gosport) (Con)

I congratulate the ministerial team and the army of fantastic officials who have brought this enormous and groundbreaking Bill to its current stage. It is one of the most important pieces of legislation that we will be dealing with. No country has attempted to regulate the internet so comprehensively as we have, and I welcome all the improvements that have been made to bring the Bill to this point. Those people have been extremely brave, and they have listened. There are widely competing interests at stake here, and the navigation of the Bill to a position where it has already achieved a degree of consensus is quite remarkable.

The pressure is on now, not least because we have all got into the habit of describing the Bill as the cavalry coming over the hill to solve all the ills of the online world. It is worth acknowledging from the outset that it will not be the silver bullet or the panacea for all the challenges that we face online. The point is, however, that it needs to be the best possible starting point, the groundwork to face down both the current threats and, more importantly, the likely challenges of the future. We all have a huge responsibility to work collaboratively, and not to let this process be derailed by side issues or clouded by party politics. Never has the phrase “not letting the perfect be the enemy of the good” been more appropriate. So much will be at risk if we do not seize the opportunity to make progress.

As the Secretary of State pointed out, the irony is that this vast and complex legislation is completely unnecessary. Search engines and social media platforms already have the ability to reduce the risks of the online world if they want to, and we have seen examples of that. However, while the bottom line remains their priority—while these precious algorithms remain so protected—the harms that are caused will never be tackled. With that in mind, I am more convinced than ever of the need for platforms to be held to account and for Ofcom to be given the powers to ensure that they are.

Inevitably, we will need to spend the next few weeks and months debating the various facets of this issue, but today I want to underline the bigger picture. It has always been an overarching theme that protecting children must be a top priority. One of the toughest meetings that I had as Digital Minister was with Ian Russell, whose 14-year-old daughter Molly took her own life after reading material promoting suicide and self-harm on Instagram. That is a conversation that brings a chill to the heart of any parent. Children are so often the victims of online harms. During lockdown, 47% of children said they had seen content that they wished they had not seen. Over a month-long period, the Internet Watch Foundation blocked at least 8.8 million attempts by UK internet users to access videos and images of children suffering sexual abuse.

There is so much at stake here, and we need to work together to ensure that the Bill is the very best that it can possibly be.

21:13
Jamie Stone (Caithness, Sutherland and Easter Ross) (LD)

Obviously I, and my party, support the thrust of the Bill. The Government have been talking about this since 2018, so clearly time is of the essence.

Members have referred repeatedly to the slight vagueness of the definitions currently in the Bill—words such as “harms”, for instance—so I wanted to examine this from a “first principles” point of view. In another place, and almost in another life, for four long years—perhaps as a punishment brief—I was made the Chairman of the Subordinate Legislation Committee in the Scottish Parliament, so without bragging terribly much, I can say that there is nothing I do not know about affirmative and negative resolutions and everything to do with statutory instruments. You could call me a statutory instrument wonk. What I do know, and I do not think it is very different from discussion here, is that instruments come and go; they are not on the face of a Bill, because they are secondary legislation; and, by and large, ordinary, run-of-the-mill Members of Parliament do not take a huge amount of interest in them. The fact is, however, that the powers that will be granted to the Secretary of State to deliver definitions by means of subordinate legislation—statutory instruments—concern me slightly.

Reference has been made to how unfortunate it would be if the Secretary of State could tell the regulator what the regulator was or was not to do, and to the fact that other countries will look at what we do and, hopefully, see it as an example of how things should be done on a worldwide basis. Rightly or wrongly, we give ourselves the name of the mother of Parliaments. The concept of freedom of speech is incredibly important to the way we do things in this place and as a country. When it comes to the definition of what is bad, what is good, what should be online and what should not, I would feel happier if I could see that all 650 Members of Parliament actually understood and owned those definitions, because that is fundamental to the concept of freedom of speech. I look forward to seeing what comes back, and I have no reason to think that the Government are unsympathetic to the points that I am making. This is about getting the balance right.

Finally, in the short time available, I want to make two last points. My party is very keen on end-to-end encryption, and I need reassurance that that remains a possibility. Secondly, on the rules governing what is right and what is wrong for the press, the seven criteria would, as I read them, still allow a channel that I am not keen on, the Russian propaganda channel Russia Today, to broadcast, and allow my former colleague, the former First Minister of Scotland—this is no reflection on the Scottish National party—to broadcast his nonsense. That has now been banned, but the rules, as I see them, would allow Russia Today to broadcast.

21:15
Saqib Bhatti (Meriden) (Con)

I am a great believer in the good that social media has done over the last few decades. It has transformed the way we interact, share ideas and stay connected. Social media has allowed a global conversation about global challenges such as climate change, poverty and even the conflict that we are witnessing in Ukraine. However, there is a dark side to social media, and I would be surprised if there were any Member of this House who had not experienced some form of it. The online world has become like the wild west: anything goes. Indeed, it was just last year when the whole country was gripped by the success of our football team in the Euros, and as I sadly watched us lose another penalty shoot-out, I turned to my wife and said, “You know what’s going to happen now, don’t you?” And it did. The three players who missed penalties, all young black men, were subjected to disgusting racist abuse. Monkey emojis were used to taunt them, and were not taken down because the Instagram algorithm did not deem that to be racism. Abuse on Twitter was rife, and the scale of it was so large that it restarted a national conversation, which I am sad to say we have had many times before.

On the back of that, I, along with 50 of my colleagues, wrote to the major social media companies: Reddit, Facebook, Twitter, Snapchat and TikTok. We asked for three things: that all accounts be verified; that the algorithm be adjusted with human interaction to account for differences in languages; and that there be a “three strikes and you’re out” policy for serial offenders, so that they knew that they would not be allowed to get away with abuse. Unfortunately, not all the companies responded, which shows how much respect they have for our democratic processes and for the moral duty to do the right thing. Those that did respond took long enough to do so, and took the view that they were already doing enough. Clearly, anyone can go on social media today and see that that is not true. It is not that the companies are burying their heads in the sand; it is just not very profitable for them to make a change. If they had the will to do so, they certainly have the skill, innovative ability and resources to make it happen.

I fully accept that, in this legislation, the Government have taken a different approach, and there are clearly different ways to skin this cat. Fines of 10% of turnover, clarity on what is allowed in companies’ terms and conditions, and effective enforcement may well draw a clear line in the sand. I call on the social media companies to heed the message sent by 50 of my colleagues, and once again to recognise their moral duty to be positive and good players in society. We have an opportunity today to set a standard, so that when an aspiring young boy or girl wants to be in the public eye, whether as an athlete, a media star or a politician, they will no longer think that being abused online is an inevitable consequence of that choice.

21:18
Mrs Sharon Hodgson (Washington and Sunderland West) (Lab)

I speak in this debate as chair of the all-party parliamentary group on ticket abuse, which I set up over 10 years ago. The APPG shines a light on ticket abuse and campaigns to protect fans who are purchasing event tickets from being scammed and ripped off, often by the large-scale ticket touts that dominate resale sites such as Viagogo and StubHub. The APPG works with experts in the field such as FanFair Alliance, a music industry campaign, and the Iridium Consultancy to tackle industrial-scale ticket touting. I hope that when this legislation is reviewed in Committee, those organisations will be called on to share their expertise in this area.

Sadly, online ticket fraud is absolutely rife. Despite some regulatory and legislative improvements, not least in the Consumer Rights Act 2015, too many fans are still being scammed on a regular basis. The Bill, as it stands, includes a major loophole that means people will not be properly protected from online fraud. Search engines such as Google are not currently covered by the requirements on fraudulent advertising. A key issue in the ticketing market is how websites that allow fraudulent tickets to be sold often take out paid ads with Google that appear at the top of the search results. This gives the false impression to consumers that these sites are official ticket outlets. People mistakenly believe that only authorised ticket outlets can advertise on Google—people trust Google—and they are scammed as a result.

The Times reported last year that Google was taking advertising money from scam websites selling Premier League football tickets, even though the matches were taking place behind closed doors during lockdown—you could not make it up. The Online Safety Bill needs to ensure that consumers are provided with much greater protection and that Google is forced to take greater responsibility for whom it allows to advertise. If the Bill took action on this, online ticket fraud would be drastically reduced. With £2.3 billion lost to online fraud in the UK last year, that is very much needed.

It is also important to remember the human side of online fraud. Victims go through intense stress, as they are not only scammed out of their money but feel duped, stupid and humiliated. There cannot be a Member of this House who has not had to support a constituent devastated by online fraud. I have come across many stories, including one of an elderly couple who bought two tickets to see their favourite artist to celebrate their 70th wedding anniversary. When they arrived at the venue, they were turned away and told that they had been sold fake tickets. I have a lot more to say, Madam Deputy Speaker, but I think you get the drift.

21:21
Mrs Maria Miller (Basingstoke) (Con)

For too long, the tech giants have been able to dismiss the harms they create for the people we represent because they do not take seriously their responsibility for how their products are designed and used, which is why this legislation is vital.

The Bill will start to change the destructive culture in the tech industry. We live simultaneously in online and offline worlds, and we expect the rules and the culture to be the same in both, but at the moment, they are not. When I visited the big tech companies in Silicon Valley as Secretary of State in 2014 to talk about online moderation, which was almost completely absent at that stage, and child abuse images, which were not regularly removed, I rapidly concluded that the only way to solve the problem and the cultural deficit I encountered would be to regulate. I think this Bill has its roots in those meetings, so I welcome it and the Government’s approach.

I am pleased to see that measures on many of the issues on which I have been campaigning in the years since 2014 have come to fruition in this Bill, but there is still room for improvement. I welcome the criminalisation of cyber-flashing, and I pay tribute to Grazia, Clare McGlynn and Bumble for all their work with me and many colleagues in this place.

Wera Hobhouse

Scotland banned cyber-flashing in 2010, but that ban includes a motivation test, rather than just a consent test, so a staggering 95% of cyber-flashing goes unpunished. Does the right hon. Lady agree that we should not make the same mistake?

Mrs Miller

I will come on to that shortly, and the hon. Lady knows I agree with her. This is something the Government need to take seriously.

The second thing I support in this Bill is limiting anonymous online abuse. Again, I pay tribute to the Football Association, with which I have worked closely, Glitch, the Centenary Action Group, Compassion in Politics, Hope not Hate and Kick It Out. They have all done a tremendous job, working with many of us in this place, to get to this point.

Finally, I support preventing children from accessing pornography, although I echo what we heard earlier about it being three years too late. It is shameful that this measure was not enacted earlier.

The Minister knows that three demands are coming his way from me. First, we need to future-proof our approach to the law in this area. Tech moves quickly—quicker than the Government’s approach to legislation, which leaves us playing whack-a-mole. The devious methods of causing harm change rapidly, as do the motivations of perpetrators, to answer the point raised by the hon. Member for Bath (Wera Hobhouse). What stays the same is the lack of consent from victims, so will the Government please look at that as a way of future-proofing our law? A worrying example is deepfake technology that creates pornographic images of women, which is currently totally lawful. Nudification software is commercially available and uses images—only of women—to create nude images. I have already stated publicly that it should be banned. It has been banned in South Korea and Taiwan, yet our law is playing catch-up.

The second issue that the Government need to address is the fact that they are creating many more victims as a result of this Bill. We need to make sure that victim support is in place to augment the amazing work of organisations such as the Revenge Porn Helpline. Finally, to echo the point made by my hon. Friend the Member for Watford (Dean Russell), let me say that this is a complex area, as we are proving with every speech in this debate. I pay tribute to the Select Committee Chair, who is no longer in his place, and the Joint Committee Chair, but I believe that we need a joint standing committee to scrutinise the implementation of this Bill when it is enacted. This is a world-class piece of legislation to change culture, but we also need other countries to adopt a similar approach. A global approach is needed if this is to work to end the wild west.

21:25
Gavin Robinson (Belfast East) (DUP)

It is a pleasure to follow the right hon. Member for Basingstoke (Mrs Miller), and a number of contributions this evening chime with my view. My hon. Friend the Member for Upper Bann (Carla Lockhart) outlined our party’s broad support for the Bill; however, she and the hon. Members for Windsor (Adam Afriyie) and for Bristol North West (Darren Jones) all raised concerns that can be ironed out and worked upon as the Bill progresses, but that are worthy of reflection, from a principle perspective, at this stage. My hon. Friend rightly said that we should not ban online that which is legal offline. That issue is causing consternation and concern, and it needs to be reflected on and thought through.

There was a chink of light in the exchange between the Minister and the Chair of the Joint Committee, the hon. Member for Folkestone and Hythe (Damian Collins), who said that we want to, and should, be talking about regulating in the online domain those things that are offences offline. That is what we should be doing, not engaging in discussions about ill-defined or non-defined “legal but harmful” content. We do not know what that is. In this Bill, we are conferring significant power on the Secretary of State, not to decide that, but to bring that proposal forward through a mechanism that does not afford the greatest level of parliamentary scrutiny, as we know. This debate has been curtailed to two and a half hours, and a debate on a statutory instrument on what is legal but harmful will be 90 minutes long, and there will be no ability to amend that instrument.

There has been discussion about journalists. It is right that there should be protections for them, for democratic content and for politicians. However, article 10 of the Human Rights Act does not distinguish between the average Joe and somebody who is providing academic or journalistic content, so should we? Is that the right step? It is right that we provide protection for those individuals, but what about anyone else who wishes to enjoy freedom of expression in the online domain? It has been said that there is a right of appeal, and yes, there is—to an offshored company that marks its own homework and is satisfied with the action it has taken. But it will have removed the journalist or individual’s content, and they will have suffered the consequence, with no recourse. They cannot take a judicial review against such a company, and an individual will not be able to go to Ofcom either; it will not be interested unless a super entity or a super-class complaint is involved. There is no recourse here. Those are the sorts of issues we will have to grapple with. There are fines for the companies here, but what about recourse for the individual?

21:29
Robert Jenrick (Newark) (Con)

In the one minute you have given me to speak in this debate, let me make three brief points, Madam Deputy Speaker. First, I come to this Bill with concerns about its impact on freedom of speech. I am grateful for the reassurances I have received already, and will be following how we manage journalistic content, in particular, in order to protect that in the Bill.

Secondly, I am concerned about the Bill’s impact on the ability of us all to tackle the abuse of the power that social media companies have more broadly. The Bill does not contain measures to increase competition, to enable small businesses in this country to prosper and to ensure that the social media platforms do not crowd out existing businesses. I have been assured that a second Bill will follow this one and will tackle that issue, but in recent days I have heard reports in the press that that Bill will not go forward because of a lack of parliamentary time. I would be grateful if the Minister could say when he responds to the debate that that Bill will proceed, because it is an extremely important issue.

21:30
John Nicolson (Ochil and South Perthshire) (SNP)

Everyone wants to be safe online and everyone wants to keep their children safe online but, from grooming to religious radicalisation and from disinformation to cruel attacks on the vulnerable, the online world is far from safe. That is why we all agree that we need better controls while we preserve all that is good about the online world, including free speech.

This Bill is an example of how legislation can benefit from a collegiate, cross-party approach. I know because I have served on the Select Committee and the Joint Committee, both of which produced reports on the Bill. The Bill is ambitious and much of it is good, but there are some holes in the legislation and we must make important improvements before it is passed.

Debbie Abrahams (Oldham East and Saddleworth) (Lab)

Does the hon. Gentleman, with whom I served on the Joint Committee on the draft Bill, agree, having listened to the evidence of the whistleblower Frances Haugen about how disinformation was used in the US Capitol insurrection, that it is completely inadequate that there is only one clause on the subject in the Bill?

John Nicolson

Yes, and I shall return to that point later in my speech.

The Secretary of State’s powers in the Bill need to be addressed. From interested charities to the chief executive of Ofcom, there is consensus that the powers of the Secretary of State in the legislation are too wide. Child safety campaigners, human rights groups, women and girls’ charities, sports groups and democracy reform campaigners all agree that the Secretary of State’s powers threaten the independence of the regulator. That is why both the Joint Committee and the Select Committee have, unanimously and across party lines, recommended reducing the proposed powers.

We should be clear about what exactly the proposed powers will do. Under clause 40, the Secretary of State will be able to modify the draft codes of practice, thus allowing the UK Government a huge amount of power over the independent communications regulator, Ofcom. The Government have attempted to play down the powers, saying that they would be used only in “exceptional circumstances”, but the word “exceptional” is nebulous. How frequent is exceptional? All we are told is that the exceptional circumstances could reflect changing Government “public policy”. That is far too vague, so perhaps the Secretary of State will clarify the difference between public policy and Government policy and give us some further definition of “exceptional”.

While of course I am sure Members feel certain that the current Secretary of State would exercise her powers in a calm and level-headed way, imagine if somebody intemperate held her post or—heaven forfend—a woke, left-wing snowflake from the Labour Benches did. The Secretary of State should listen to her own MPs and reduce her powers in the Bill.

Let me turn to misinformation and disinformation. The Bill aims not only to reduce abuse online but to reduce harm more generally. That cannot be done without including in the Bill stronger provisions on disinformation. As a gay man, I have been on the receiving end of abuse for my sexuality, and I have seen the devastating effect that misinformation and disinformation have had on my community. Disinformation has always been weaponised to spread hate; however, the pervasive reach of social media makes disinformation even more dangerous.

The latest battle ground for LGBT rights has seen an onslaught against trans people. Lies about them and their demand for enhanced civil rights have swirled uncontrollably. Indeed, a correspondent of mine recently lamented “trans funding” in the north-east of Scotland, misreading and misunderstanding it, and believing it to involve the compulsory regendering of retiring oil workers in receipt of transitional funding from the Scottish Government. That is absurd, of course, but it says something about the frenzied atmosphere stirred up by online transphobes.

The brutal Russian invasion of Ukraine, with lies spewed by the Russian Government and their media apologists, has, like the covid pandemic, illustrated some of the other real-world harms arising from disinformation. It is now a weapon of war, with serious national security implications, yet the UK Government still do not seem to be taking it seriously enough. Full Fact, the independent fact-checking service, said that there is currently no credible plan to tackle disinformation. The Government may well argue that disinformation will fall under the false communications provision in clause 151, but in practice it sets what will likely be an unmeetable bar for services. As such, most disinformation will be dealt with as harmful content.

We welcome the Government’s inclusion of functionality in the risk assessments, which will look not just at content but how it spreads. Evidence from the two Committees shows that the dissemination of harm is as important as the content itself, but the Government should be more explicit in favouring content-neutral modes for reducing disinformation, as this will have less of an impact on freedom of speech. That was recommended by the Facebook whistleblowers Sophie Zhang and Frances Haugen.

John Nicolson

No, I will make some progress, if I may.

A vital tool in countering disinformation is education, and Estonia—an early and frequent victim of Russian disinformation—is a remarkable case study. That is why the Government’s decision to drop Ofcom’s clause 104 media literacy duties is perplexing. Media literacy should be a shared responsibility for schools, Government, and wider society. Spreading and enhancing media literacy should be up to not just Ofcom, but the larger platforms too. Ofcom should also be allowed to break platform terms and conditions for the purposes of investigation. For example, it would currently be unable to create fake profiles to analyse various companies’ behaviour, such as their response to abuse. Allowing that would empower the regulator.

Various issues arise when trying to legislate for harm that is not currently illegal. This is challenging for us as legislators since we do not know exactly what priority harms will be covered by secondary legislation, but we would like assurances from the Government that Zach’s law, as it has come to be known, will become a standalone offence. Vicious cowards who send seizure-inducing flashing images to people with epilepsy to trigger seizures must face criminal consequences. The Minister told me in a previous debate that this wicked behaviour will now be covered by the harmful communications offence under clause 150, but until a specific law is on the statute book, he will, I imagine, understand families’ desire for certainty.

Finally, I turn to cross-platform abuse. There has been a terrifying increase in online child abuse over the past three years. Grooming offences have increased by 70% in that period. The Select Committee and the Joint Committee received a host of recommendations which, disappointingly, seem to have been somewhat ignored by the Government. On both Committees, we have been anxious to reduce “digital breadcrumbing”, which is where paedophiles post images of children which may look benign and will not, therefore, be picked up by scanners. However, the aim is to induce children, or to encourage other paedophiles, to leave the regulated site and move to unregulated sites where they can be abused with impunity. I urge the Secretary of State to heed the advice of the National Society for the Prevention of Cruelty to Children. Unless the measures it recommends are enacted, children will be at ever greater risk of harm.

The House will have noted that those on the SNP Benches have engaged with the Government throughout this process. Indeed, I am the only Member to have sat on both the Joint Committee and the Select Committee as this Bill has been considered and our reports written. It has been a privilege to hear from an incredible range of witnesses, some of whom have displayed enormous bravery in giving their testimony.

We want to see this legislation succeed. That there is a need for it is recognised across the House—but across the House, including on the Tory Benches, there is also recognition that the legislation can and must be improved. It is our intention to help to improve the legislation without seeking party advantage. I hope the Secretary of State will engage in the same constructive manner.

21:39
Alex Davies-Jones (Pontypridd) (Lab)

It is an honour to close this debate on behalf of the Opposition. Sadly, there is so little time for the debate that there is much that we will not even get to probe, including any mention of the Government’s underfunded and ill-thought-through online media literacy strategy.

However, we all know that change and regulation of the online space are much needed, so Labour welcomes this legislation even in its delayed form. The current model, which sees social media platforms and tech giants making decisions about what content is hosted and shared online, is simply failing. It is about time that that model of self-regulation, which gives too much control to Silicon Valley, was challenged.

Therefore, as my hon. Friend the Member for Manchester Central (Lucy Powell) said, Labour broadly supports the principles of the Bill and welcomes some aspects of the Government’s approach, including the duty of care frameworks and the introduction of an independent regulator, Ofcom. It cannot and should not be a matter for the Government of the time to control what people across the UK are able to access online. Labour will continue to work hard to ensure that Ofcom remains truly independent of political influence.

We must also acknowledge, however, that after significant delays this Bill is no longer world leading. The Government first announced their intention to regulate online spaces all the way back in 2018. Since then, the online space has remained unregulated and, in many cases, has perpetuated dangerous and harmful misinformation with real-world consequences. Colleagues will be aware of the sheer amount of coronavirus vaccine disinformation so easily accessed by millions online at the height of the pandemic. Indeed, in many respects, it was hard to avoid.

More recently, the devastating impact of state disinformation at the hands of Putin’s regime has been clearer than ever, almost two years after Parliament’s own Intelligence and Security Committee called Russian influence in the UK “the new normal”.

Deidre Brock

Does the hon. Lady share my disappointment and concern that the Bill does nothing to address misinformation and disinformation in political advertising? A rash of very aggressive campaign groups emerged before the last Scottish Parliament elections, for example; they spent heavily on online political advertising, but were not required to reveal their political ties or funding sources. That is surely not right.

Alex Davies-Jones

I share the hon. Lady’s concern. There is so much more that is simply missing from this Bill, which is why it is just not good enough. We have heard in this debate about a range of omissions from the Bill and the loopholes that, despite the years of delay, have still not been addressed by the Government. I thank hon. Members on both sides of the House for pointing those out. It is a shame that we are not able to address them individually here, but we will probe those valued contributions further in the Bill Committee.

Despite huge public interest and a lengthy prelegislative scrutiny process, the Government continue to ignore many key recommendations, particularly around defining and regulating both illegal and legal but harmful content online. The very nature of the Bill and its heavy reliance on secondary legislation to truly flesh out the detail leaves much to be desired. We need to see action now if we are truly to keep people safe online.

Most importantly, this Bill is an opportunity, and an important one at that, to decide the kind of online world our children grow up in. I know that, for many across the House, growing up online as children do now is completely unimaginable. When I was young, we played Snake on a Nokia 3310, and had to wait for the dial-up and for people to get off the phone in order to go online and access MSN, but for people today access to the internet, social media and everything that it brings is a fundamental part of their lives.

Once again, however, far too much detail, and the specifics of how this legislation will fundamentally change the user experience, are simply missing from the Bill. When it comes to harmful content that is not illegal, the Government have provided no detail. Despite the Bill’s being years in the making, we are no closer to understanding the impact it will have on users.

The Bill in its current draft has a huge focus on the tools for removing and moderating harmful content, rather than ensuring that design features are in place to make services systematically safer for all of us. The Government are thus at real risk of excluding children from being able to participate in the digital world freely and safely. The Bill must not lock children out of services they are entitled to use; instead, it must focus on making those services safe by design.

I will push the Minister on this particular point. We are all eager to hear what exact harms platforms will have to take steps to address and mitigate. Will it be self-harm? Will it perhaps be content promoting eating disorders, racism, homophobia, antisemitism and misogyny? One of the key problems with the Bill is the failure to make sure that the definitions of “legal but harmful” content are laid out within it. Will the Minister therefore commit to amending the Bill to address this and to allow for proper scrutiny? As we have heard, the Government have also completely failed to address what stakeholders term the problem of breadcrumbing. I would be grateful if the Minister outlined what steps the Government will be taking to address this issue, as there is clearly a loophole in the Bill that would allow this harmful practice to continue.

As we have heard, the gaps in the Bill, sadly, do not end there. Women and girls are disproportionately likely to be affected by online abuse and harassment. Online violence against women and girls is defined as including but not limited to

“intimate image abuse, online harassment, the sending of unsolicited explicit images, coercive ‘sexting’, and the creation and sharing of ‘deepfake’ pornography.”

This Bill is an important step forward but it will need significant strengthening to make online spaces safe for women and girls. While we welcome the steps by the Government to include cyber-flashing in the Bill, it must go further in other areas. Misogyny should be included as a harm to adults that online platforms have a duty to prevent from appearing on them. As colleagues will be aware, Instagram has been completely failing to tackle misogynistic abuse sent via direct message. The Centre for Countering Digital Hate has exposed what it terms an “epidemic of misogynistic abuse”, 90% of which has been completely and utterly ignored by Instagram, even when it has been reported to moderators. The Government must see sense and put violence against women and girls into the Bill, and it must also form a central pillar of regulation around legal but harmful content. Will the Minister therefore commit to at least outlining the definitions of “legal but harmful” content, both for adults and children, in the Bill?

Another major omission from the Bill as currently drafted is its rather arbitrary categorisation of platforms based on size versus harm. As mentioned by many hon. Members, the categorisation system as it currently stands will completely fail to address some of the most extreme harms on the internet. Thanks to the fantastic work of organisations such as Hope not Hate and the Antisemitism Policy Trust, we know that smaller platforms such as 4chan and BitChute have significant numbers of users who are highly motivated to promote extremely dangerous content. The Minister must accept that his Department has been completely tone-deaf on this particular point, and—he must listen to what hon. Members have said today—its decision making is utterly inexplicable. Rather than an arbitrary size cut-off, the regulator must instead use risk levels to determine which category a platform should fall into so that harmful and dangerous content does not slip through the net. Exactly when will the Minister’s Department publish more information on the detail around this categorisation system? Exactly what does he have to say to those people, including many Members here today, who have found themselves the victim of abusive content that has originated on these hate-driven smaller platforms? How will this Bill change their experience of being online? I will save him the energy, because we all know the real answer: it will do little to change the situation.

This Bill was once considered a once-in-a-generation opportunity to improve internet safety for good, and Labour wants to work with the Government to get this right. Part of our frustration is due to the way in which the Government have failed to factor technological change and advancement—which, as we all know, and as we have heard today, can be extremely rapid—into the workings of this Bill. While the Minister and I disagree on many things, I am sure that we are united in saying that no one can predict the future, and that is not where my frustrations lie. Instead, I feel that the Bill has failed to address issues that are developing right now—from developments in online gaming to the expansion of the metaverse. These are complicated concepts but they are also a reality that we as legislators must not shy away from.

The Government have repeatedly said that the Bill’s main objective is to protect children online, and of course it goes without saying that Labour supports that. Yet with the Bill being so restricted to user-to-user services, there are simply too many missed opportunities to deal with areas where children, and often adults, are likely to be at risk of harm. Online gaming is a space that is rightly innovative and fast-changing, but the rigid nature of how services have been categorised will soon mean that the Bill is outdated long before it has had a chance to have a positive impact. The same goes for the metaverse.

While of course Labour welcomes the Government’s commitment to prevent under-18s from accessing pornography online, the Minister must be realistic. A regime that seeks to ban rather than prevent is unlikely to ever be able to keep up with the creative, advanced nature of the tech industry. For that reason, I must press the Minister on exactly how this Bill will be sufficiently flexible and future-proofed to avoid a situation whereby it is outdated by the time it finally receives Royal Assent. We must make sure that we get this right, and the Government know that they could and can do more. I therefore look forward to the challenge and to working with colleagues across the House to strengthen this Bill throughout its passage.

21:49
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

The piece of legislation before the House this evening is truly groundbreaking, because no other jurisdiction anywhere in the world has attempted to legislate as comprehensively as we are beginning to legislate here. For too long, big tech companies have exposed children to risk and harm, as evidenced by the tragic suicide of Molly Russell, who was exposed to appalling content on Instagram, which encouraged her, tragically, to take her own life. For too long, large social media firms have allowed illegal content to go unchecked online.

Richard Burgon (Leeds East) (Lab)

I have spoken before about dangerous suicide-related content online. The Minister mentions larger platforms. Will the Government go away and bring back two amendments based on points made by the Samaritans? One would bring smaller platforms within the scope of sanctions, and the second would make the protective aspects of the Bill cover people who are over 18, not just those who are under 18. If the Government do that, I am sure that it will be cause for celebration and that Members on both sides of the House will give their support.

Chris Philp

It is very important to emphasise that, regardless of size, all platforms in the scope of the Bill are covered if there are risks to children.

A number of Members, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy), have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered. However, all platforms, regardless of size, are in scope with regard to content that is illegal and to content that is harmful to children.

For too long, social media firms have also arbitrarily censored content just because they do not like it. With the passage of this Bill, all those things will be no more, because it creates parliamentary sovereignty over how the internet operates, and I am glad that the principles in the Bill command widespread cross-party support.

The pre-legislative scrutiny that we have gone through has been incredibly intensive. I thank and pay tribute to the DCMS Committee and the Joint Committee for their work. We have adopted 66 of the Joint Committee’s recommendations. The Bill has been a long time in preparation. We have been thoughtful, and the Government have listened and responded. That is why the Bill is in good condition.

Debbie Abrahams

Will the Minister give way?

Chris Philp

I must make some progress, because I am almost out of time and there are lots of things to reply to.

I particularly thank previous Ministers, who have done so much fantastic work on the Bill. With us this evening are my hon. Friend the Member for Gosport (Dame Caroline Dinenage) and my right hon. Friends the Members for Maldon (Mr Whittingdale) and for Basingstoke (Mrs Miller), but not with us this evening are my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright), who I think is in America, and my right hon. Friends the Members for Hertsmere (Oliver Dowden) and for Staffordshire Moorlands (Karen Bradley), all of whom showed fantastic leadership in getting the Bill to where it is today. It is a Bill that will stop illegal content circulating online, protect children from harm and make social media firms be consistent in the way they handle legal but harmful content, instead of being arbitrary and inconsistent, as they are at the moment.

Chris Philp

I have so many points to reply to that I have to make some progress.

The Bill also enshrines, for the first time, free speech—something that we all feel very strongly about—but it goes beyond that. As well as enshrining free speech in clause 19, it gives special protection, in clauses 15 and 16, for content of journalistic and democratic importance. As my right hon. Friend the Secretary of State indicated in opening the debate, we intend to table a Government amendment—a point that my right hon. Friends the Members for Maldon and for Ashford (Damian Green) asked me to confirm—to make sure that journalistic content cannot be removed until a proper right of appeal has taken place. I am pleased to confirm that now.

We have made many changes to the Bill. Online fraudulent advertisers are now banned. Senior manager liability will commence immediately. Online porn of all kinds, including commercial porn, is now in scope. The Law Commission communication offences are in the Bill. The offence of cyber-flashing is in the Bill. The priority offences are on the face of the Bill, in schedule 7. Control over anonymity and user choice, which was proposed by my hon. Friend the Member for Stroud (Siobhan Baillie) in her ten-minute rule Bill, is in the Bill. All those changes have been made because this Government have listened.

Let me turn to some of the points made from the Opposition Front Bench. I am grateful for the in-principle support that the Opposition have given. I have enjoyed working with the shadow Minister and the shadow Secretary of State, and I look forward to continuing to do so during the many weeks in Committee ahead of us, but there were one or two points made in the opening speech that were not quite right. This Bill does deal with systems and processes, not simply with content. There are risk assessment duties. There are safety duties. There are duties to prevent harm. All those speak to systems and processes, not simply content. I am grateful to the Chairman of the Joint Committee, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), for confirming that in his excellent speech.

If anyone in this House wants confirmation of where we are on protecting children, the Children’s Commissioner wrote a joint article with the Secretary of State in the Telegraph—I think it was this morning—confirming her support for the measures in the Bill.

When it comes to disinformation, I would make three quick points. First, we have a counter-disinformation unit, which is battling Russian disinformation night and day. Secondly, any disinformation that is illegal, that poses harm to children or that comes under the definition of “legal but harmful” in the Bill will be covered. And if that is not enough, the Minister for Security and Borders, who is sitting here next to me, intends to bring forward legislation at the earliest opportunity to counter hostile state threats more generally. This matter will be addressed in the Bill that he will prepare and bring forward.

I have only four minutes left and there are so many points to reply to. If I do not cover them all, I am very happy to speak to Members individually, because so many important points were made. The right hon. Member for Barking asked who was going to pay for all the Ofcom enforcement. The taxpayer will pay for the first two years while we get ready—£88 million over two years—but after that Ofcom will levy fees on these social media firms, so they will pay for regulating their activities. I have already replied to the point she rightly raised about smaller but very harmful platforms.

My hon. Friend the Member for Meriden (Saqib Bhatti) has been campaigning tirelessly on the question of combating racism. This Bill will deliver what he is asking for.

The hon. Member for Batley and Spen (Kim Leadbeater) and my hon. Friend the Member for Watford (Dean Russell) asked about Zach’s law. Let me take this opportunity to confirm explicitly that clause 150—the harmful communication clause, for where a communication is intended to cause psychological distress—will cover epilepsy trolling. What happened to Zach will be prevented by this Bill. In addition, the Ministry of Justice and the Law Commission are looking at whether we can also have a standalone provision, but let me assure them that clause 150 will protect Zach.

My right hon. Friend the Member for Maldon asked a number of questions about definitions. Companies can move between category 1 and category 2, and different parts of a large conglomerate can be regulated differently depending on their activities. Let me make one point very clear—the hon. Member for Bristol North West (Darren Jones) also raised this point. When it comes to the provisions on “legal but harmful”, neither the Government nor Parliament are saying that those things have to be taken down. We are not censoring in that sense. We are not compelling social media firms to remove content. All we are saying is that they must do a risk assessment, have transparent terms and conditions, and apply those terms and conditions consistently. We are not compelling, we are not censoring; we are just asking for transparency and accountability, which is sorely missing at the moment. No longer will those in Silicon Valley be able to behave in an arbitrary, censorious way, as they do at the moment—something that Members of this House have suffered from, but from which they will no longer suffer once this Bill passes.

The hon. Member for Bristol North West, who I see is not here, asked a number of questions, one of which was about—[Interruption.] He is here; I do apologise. He has moved—I see he has popped up at the back of the Chamber. He asked about codes of practice not being mandatory. That is because the safety duties are mandatory. The codes of practice simply illustrate ways in which those duties can be met. Social media firms can meet them in other ways, but if they fail to meet those duties, Ofcom will enforce. There is no loophole here.

When it comes to the ombudsman, we are creating an internal right of appeal for the first time, so that people can appeal to the social media firms themselves. There will have to be a proper right of appeal, and if there is not, they will be enforced against. We do not think it appropriate for Ofcom to consider every individual complaint, because it will simply be overwhelmed, by probably tens of thousands of complaints, but Ofcom will be able to enforce where there are systemic failures. We feel that is the right approach.

I say to the hon. Member for Plymouth, Sutton and Devonport (Luke Pollard) that my right hon. Friend the Minister for Security and Borders will meet him about the terrible Keyham shooting.

The hon. Member for Washington and Sunderland West (Mrs Hodgson) raised a question about online fraud in the context of search. That is addressed by clause 35, but we do intend to make drafting improvements to the Bill, and I am happy to work with her on those drafting improvements.

I have been speaking as quickly as I can, which is quite fast, but I think time has got away from me. This Bill is groundbreaking. It will protect our citizens, it will protect our children—[Hon. Members: “Sit down!”]—and I commend it to the House.

Question put and agreed to.

Bill accordingly read a Second time.

Madam Deputy Speaker (Dame Eleanor Laing)

The Minister just made it. I have rarely seen a Minister come so close to talking out his own Bill.

Online Safety Bill (Programme)

Motion made, and Question put forthwith (Standing Order No. 83A(7)),

That the following provisions shall apply to the Online Safety Bill:

Committal

(1) The Bill shall be committed to a Public Bill Committee.

Proceedings in Public Bill Committee

(2) Proceedings in the Public Bill Committee shall (so far as not previously concluded) be brought to a conclusion on Thursday 30 June 2022.

(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.

Consideration and Third Reading

(4) Proceedings on Consideration shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.

(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.

(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration and Third Reading.

Other proceedings

(7) Any other proceedings on the Bill may be programmed.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Money)

Queen’s recommendation signified.

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise the payment out of money provided by Parliament of:

(1) any expenditure incurred under or by virtue of the Act by the Secretary of State, and

(2) any increase attributable to the Act in the sums payable under any other Act out of money so provided.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Ways and Means)

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise:

(1) the charging of fees under the Act, and

(2) the payment of sums into the Consolidated Fund.—(Michael Tomlinson.)

Question agreed to.

Deferred Divisions

Motion made, and Question put forthwith (Standing Order No. 41A(3)),

That at this day’s sitting, Standing Order 41A (Deferred divisions) shall not apply to the Motion in the name of Secretary Nadine Dorries relating to Online Safety Bill: Carry-over.—(Michael Tomlinson.)

Question agreed to.

Madam Deputy Speaker (Dame Eleanor Laing)

Order. Really, people just ought to have more courtesy than to get up and, when there is still business going on in this House, to behave as if it is not sitting because it is after 10 o’clock. We really have to observe courtesy at all times in here.

Online Safety Bill (Carry-Over)

Motion made, and Question put forthwith (Standing Order No. 80A(1)(a)),

That if, at the conclusion of this Session of Parliament, proceedings on the Online Safety Bill have not been completed, they shall be resumed in the next Session.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (First sitting)

Committee stage
Tuesday 24th May 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 24 May 2022
The Committee consisted of the following Members:
Chairs: Sir Roger Gale, † Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Mrs Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Witnesses
Kevin Bakhurst, Group Director, Broadcasting and Online Content, Ofcom
Richard Wronka, Director for Online Harms, Ofcom
Dame Rachel de Souza, Children’s Commissioner, Office of the Children’s Commissioner for England
Andy Burrows, Head of Child Safety, NSPCC
Lynn Perry MBE, CEO, Barnardo’s
Ben Bradley, Government Relations and Public Policy Manager, TikTok
Katy Minshall, Head of UK Public Policy, Twitter
Public Bill Committee
Tuesday 24 May 2022
(Morning)
[Christina Rees in the Chair]
Online Safety Bill
09:25
The Chair

We are now sitting in public and the proceedings are being broadcast. Before we begin, I have a few announcements. Hansard colleagues would be grateful if Members could email their speaking notes to hansardnotes@parliament.uk. Please switch electronic devices to silent. Tea, coffee, and other drinks, apart from the water provided, are not allowed during sittings.

Today, we will first consider the programme motion on the amendment paper. We will then consider a motion to enable the reporting of written evidence for publication, and a motion to allow us to deliberate in private about our questions before the oral evidence session. In view of the timetable available, I hope that we can take these matters formally without debate. I first call the Minister to move the programme motion standing in his name, which was discussed on Thursday by the Programming Sub-Committee for this Bill.

Ordered,

That—

(1) the Committee shall (in addition to its first meeting at 9.25 am on Tuesday 24 May) meet—

(a) at 2.00 pm on Tuesday 24 May;

(b) at 11.30 am and 2.00 pm on Thursday 26 May;

(c) at 9.25 am and 2.00 pm on Tuesday 7 June;

(d) at 11.30 am and 2.00 pm on Thursday 9 June;

(e) at 9.25 am and 2.00 pm on Tuesday 14 June;

(f) at 11.30 am and 2.00 pm on Thursday 16 June;

(g) at 9.25 am and 2.00 pm on Tuesday 21 June;

(h) at 11.30 am and 2.00 pm on Thursday 23 June;

(i) at 9.25 am and 2.00 pm on Tuesday 28 June;

(j) at 11.30 am and 2.00 pm on Thursday 30 June;

(2) the Committee shall hear oral evidence in accordance with the following Table:

Date | Time | Witness
Tuesday 24 May | Until no later than 10.05 am | Ofcom
Tuesday 24 May | Until no later than 10.50 am | Dame Rachel de Souza, Children’s Commissioner for England; Barnardo’s; National Society for the Prevention of Cruelty to Children (NSPCC)
Tuesday 24 May | Until no later than 11.25 am | TikTok; Twitter
Tuesday 24 May | Until no later than 2.45 pm | Meta; Microsoft; Google
Tuesday 24 May | Until no later than 3.30 pm | Professor Clare McGlynn, Professor of Law, Durham University; Refuge; End Violence Against Women
Tuesday 24 May | Until no later than 4.15 pm | techUK; Online Safety Tech Industry Association (OSTIA); Crisp
Tuesday 24 May | Until no later than 5.00 pm | Match Group; Bumble; TrustElevate
Tuesday 24 May | Until no later than 5.30 pm | Marie Collins Foundation; Internet Watch Foundation (IWF)
Tuesday 24 May | Until no later than 6.00 pm | Demos; FairVote
Thursday 26 May | Until no later than 12.15 pm | Catch22; Full Fact; Carnegie UK Trust
Thursday 26 May | Until no later than 1.00 pm | Antisemitism Policy Trust; Clean up the Internet; HOPE not hate
Thursday 26 May | Until no later than 2.25 pm | Information Commissioner’s Office
Thursday 26 May | Until no later than 2.55 pm | Kick It Out; The Football Association
Thursday 26 May | Until no later than 3.25 pm | Center for Countering Digital Hate; Reset
Thursday 26 May | Until no later than 3.55 pm | News Media Association; Guardian Media Group
Thursday 26 May | Until no later than 4.40 pm | Personal Investment Management & Financial Advice Association (PIMFA); Which?; Money Saving Expert
Thursday 26 May | Until no later than 5.05 pm | Frances Haugen



(3) proceedings on consideration of the Bill in Committee shall be taken in the following order: Clauses 1 to 3; Schedules 1 and 2; Clauses 4 to 32; Schedule 3; Clauses 33 to 38; Schedule 4; Clauses 39 to 52; Schedules 5 to 7; Clauses 53 to 64; Schedule 8; Clauses 65 to 67; Schedule 9; Clauses 68 to 80; Schedule 10; Clauses 81 to 91; Schedule 11; Clauses 92 to 122; Schedule 12; Clauses 123 to 158; Schedule 13; Clauses 159 to 161; Schedule 14; Clauses 162 to 194; new Clauses; new Schedules; remaining proceedings on the Bill;

(4) the proceedings shall (so far as not previously concluded) be brought to a conclusion at 5.00 pm on Thursday 30 June.—(Chris Philp.)

The Chair

The Committee will therefore proceed to line-by-line consideration of the Bill on Tuesday 7 June at 9.25 am. I call the Minister to move the motion about written evidence.

Resolved,

That, subject to the discretion of the Chair, any written evidence received by the Committee shall be reported to the House for publication.—(Chris Philp.)

The Chair

Copies of written evidence to the Committee will be made available in the Committee room each day and will be circulated to Members by email. I call the Minister to move the motion about deliberating in private.

Resolved,

That, at this and any subsequent meeting at which oral evidence is to be heard, the Committee shall sit in private until the witnesses are admitted.—(Chris Philp.)

09:27
The Committee deliberated in private.
Examination of Witnesses
Kevin Bakhurst and Richard Wronka gave evidence.
09:30
The Chair

We are now sitting in public again, and the proceedings are being broadcast. Before we start hearing from the witnesses, do any Members wish to make declarations of interest in connection with the Bill?

Alex Davies-Jones (Pontypridd) (Lab)

One of the witnesses at Thursday’s sitting, Danny Stone from the Antisemitism Policy Trust, acts as an informal secretariat in a personal capacity to the all-party parliamentary group on wrestling, which I co-chair.

The Chair

That is noted.

Dean Russell (Watford) (Con)

I refer Members to my entry in the Register of Members’ Financial Interests regarding work I did six months ago for a business called DMA.

The Chair

We will now hear oral evidence from Kevin Bakhurst, group director of broadcasting and online content at Ofcom, and Richard Wronka, director of Ofcom’s online harms policy. Before calling the first Member to ask a question, I remind all Members that questions should be limited to matters within the scope of the Bill, and we must stick to the timings in the programme motion that the Committee has agreed. For this witness panel, we have until 10.05 am. Could the witnesses please introduce themselves for the record?

Kevin Bakhurst: Good morning. I am Kevin Bakhurst, group director at Ofcom for broadcasting and online content.

Richard Wronka: I am Richard Wronka, a director in Ofcom’s online safety policy team.

None Portrait The Chair
- Hansard -

I will open up to the floor for questions now. I call Alex Davies-Jones.

Alex Davies-Jones

Q Good morning, both, and welcome to the Committee. The Bill as it stands places responsibility on Ofcom to regulate the 25,000 tech companies and the tens—if not hundreds—of thousands of websites within the UK. How does that look in practice? What technical and administrative capacity do you have to carry that function out, realistically?

Kevin Bakhurst: We should say that we feel the Bill has given us a very good framework to regulate online safety. We have been working closely with the Department for Digital, Culture, Media and Sport to make sure that the Bill gives us a practical, deliverable framework. There is no doubt that it is a challenge. As you rightly say, there will be potentially 25,000 platforms in scope, but we feel that the Bill sets out a series of priorities really clearly in terms of categories.

It is also for us to set out—we will be saying more about this in the next couple of months—how we will approach this, and how we will prioritise certain platforms and types of risk. It is important to say that the only way of achieving online safety is through what the Bill sets out, which is to look at the systems in place at the platforms, and not the individual pieces of content on them, which would be unmanageable.

Alex Davies-Jones

Q Thank you, Kevin. You mentioned the categorisation of platforms. A number of stakeholders, including the platforms themselves and charities, have quite rightly raised some serious concerns around the categorisation of platforms. Would you, the regulator, prefer a risk-based approach, or the categorisation as it stands within the Bill?

Richard Wronka: We completely recognise the concerns that have been raised by stakeholders, and we have been speaking to many of them ourselves, so we have first-hand experience. I think my starting point is that the Bill captures those high-risk services, which is a really important feature of it. In particular, responsibilities around illegal content apply across all services in scope. That means that, in practice, when we are regulating, we will take a risk-based approach to whom we choose to engage with, and to where we focus our effort and attention.

We recognise that some of the debate has been about the categorisation process, which is intended to pick up high-risk and high-reach services. We understand the logic behind that. Indeed, I think we would have some concerns about the workability of an approach that was purely risk-based in its categorisation. We need an approach that we can put into operation. Currently, the Bill focuses on the reach of services and their functionality. We would have some concerns about a purely risk-based approach in terms of whether it was something that we could put into practice, given the number of services in scope.

Alex Davies-Jones

Q May I bring you back to putting this into practice, and to the recategorisation of platform and practice? If a category 2B platform as it stands in the Bill grows exponentially in size, and is spreading disinformation and incredibly harmful content quite quickly, how quickly would you be able to react as a regulator to recategorise that platform and bring it into scope as a category 1 platform? How long would that process take, and what would happen in the interim?

Richard Wronka: At the moment, the category 2B service would have transparency reporting requirements. That would be helpful, because it would be one way that the nature of harmful content on that platform could be brought to our attention, and to the public’s attention. We would also be looking at approaches that we could use to monitor the whole scope of the services, to ensure that we had a good grip of who was growing quickest and where the areas of risk were. Some of that is through engaging with the platforms themselves and a whole range of stakeholders, and some of it is through more advanced data and analytical techniques—“supervision technology”, as it is known in the regulatory jargon.

On the specifics of your question, if a company was growing very quickly, the Bill gives us the ability to look at that company again, to ask it for information to support a categorisation decision, and to recategorise it if that is the right approach and if it has met the thresholds set out by the Secretary of State. One of the thresholds regards the number of users, so if a company has moved over that threshold, we look to act as quickly as possible while running a robust regulatory process.

Alex Davies-Jones

Q So while that process is under way, there is no mechanism for you to take action against the platform.

Kevin Bakhurst: May I answer this? We have some experience of this already in the video-sharing platform regime, which is much more limited in scope, and we are already regulating a number of platforms, ranging from some very big ones such as Twitch, TikTok and Snap, down to some much smaller platforms that have caused us some concerns. We think we have the tools, but part of our approach will also be to focus on high-risk and high-impact content, even if it comes through small platforms. That is what we have already done with the video-sharing platform regime. We have to be agile enough to capture that and to move resources to it. We are doing that already with the video-sharing platform regime, even though we have only been regulating it for less than a year.

The Chair

Maria Miller has indicated that she would like to ask a question, so if I may, I will bring her in.

Mrs Maria Miller (Basingstoke) (Con)

Not immediately —go on please.

Alex Davies-Jones

Q Thank you, Chair, and thank you, Maria.

I am just trying to get to the intricacies of this, and of what would happen during the time that it would take for you to recategorise. This platform, which is disseminating harm to both children and adults, would be allowed to carry on while the recategorisation process is under way. There is no mechanism in the Bill to stop that from happening.

Richard Wronka: A really important point here is that we will be regulating that platform from the outset for illegal content and, potentially, for how it protects children on its platform, irrespective of the categorisation approach. That is really important. We will be able to take action, and take action quickly, irrespective of how the platform is categorised. Categorisation really determines whether the adult “legal but harmful” provisions apply. That is the bit that really matters in this context.

It is worth reminding ourselves what those provisions mean: they are more a transparency and accountability measure. Platforms categorised as category 1 will need to have clear terms and conditions applied to adult “legal but harmful” content, and they will need to implement those consistently. We would expect the really serious and egregious concerns to be picked up by the “illegal” part of the regime, and the protection-of-children part of the regime. The categorisation process may go on. It may take a little time, but we will have tools to act in those situations.

Alex Davies-Jones

Q May I bring you on to the powers of the Secretary of State and the question of the regulator’s independence? The Bill will see the Secretary of State, whoever that may be, have a huge amount of personal direction over Ofcom. Do you have any other experience of being directed by a Secretary of State in this way, and what are the consequences of such an approach?

Kevin Bakhurst: We do have some experience across the various sectors that we regulate, but being directed by the Secretary of State does not happen very often. Specifically on the Bill, our strong feeling is that we think it entirely appropriate, and that the Secretary of State should be able to direct us on matters of national security and terrorist content. However, we have some concerns about the wider direction powers of the Secretary of State, and particularly the grounds on which the Secretary of State can direct public policy, and we have expressed those concerns previously.

We feel it is important that the independence of a regulator can be seen to be there and is there in practice. Legally, we feel it important that there is accountability. We have some experience of being taken to judicial review, and there must be accountability for the codes of practice that we put in place. We must be able to show why and how we have created those codes of practice, so that we can be accountable and there is absolute clarity between regulator and Government.

Mrs Miller

Q Thank you very much to the witnesses who have taken the time to be with us today. We are really grateful. You have already alluded to the fact that you have quite extensive experience in regulation, even in social media spaces. I think the Committee would be really interested in your view, based on your experience, about what is not in the Bill that should be.

Kevin Bakhurst: Richard has been leading this process, so he can give more detail on it, but suffice to say, we have been engaging closely with DCMS over the last year or so, and we appreciate the fact that it has taken on board a number of our concerns. What we felt we needed from the Bill was clarity as far as possible, and a balance between clarity and flexibility for this regime, which is a very fast-moving field. We feel, by and large, that the Bill has achieved that.

We still have concerns about one or two areas, to pick up on your question. We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of “illegal content” is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.

Richard Wronka: I completely agree with Kevin that the Bill as it stands gives us a good framework. I think the pre-legislative scrutiny process has been really helpful in getting us there, and I point out that it is already quite a broad and complex regime. We welcome the introduction of issues such as fraudulent advertising and the regulation of commercial pornographic providers, but I think there is a point about ensuring that the Bill does not expand too much further, because that might raise some practical and operational issues for us.

I completely agree with Kevin that clarity in the Bill regarding illegal content and what constitutes that is really important. An additional area that requires clarity is around some of the complex definitions in the Bill, such as journalistic content and democratically important content. Those are inherently tricky issues, but any extra clarity that Parliament can provide in those areas would be welcome.

Mrs Miller

Q You talk about illegal content and say that Ofcom would not have a view on particular laws, but do you think there are harmful areas of content that are not currently covered by the law? I am thinking particularly about the issue of intimate image abuse, which is currently under Law Commission review, with recommendations expected very soon. Have you had any thoughts, particularly in the area of policy, about how you deal with issues that should be against the law but currently are not, given that part of your regulatory process is to determine whether companies are operating within the law?

Richard Wronka: I would start by saying that this is a fluid area. We have had a number of conversations with the Law Commission in particular and with other stakeholders, which has been really helpful. We recognise that the Bill includes four new offences, so there is already some fluidity in this space. We are aware that there are other Law Commission proposals that the Government are considering. Incitement to self-harm and flashing imagery that might trigger epilepsy are a couple of issues that come to mind there. Ultimately, where the criminal law sits is a matter for Parliament. We are a regulator: our role here is to make sure that the criminal law is reflected in the regulatory regime properly, rather than to determine or offer a view on where the criminal law should sit. Linking back to our point just a minute ago, we think it is really important that there is as much clarity as possible about how platforms can take some of those potentially quite tricky decisions about whether content meets the criminal threshold.

Mrs Miller

Q May I press a little further? The four new offences that you talked about, and others, and indeed the whole approach of regulation, will lead more individuals to seek redress and support. You are not responsible for individuals; you are responsible for regulation, but you must have some thoughts on whether the current system of victim support will cope with the changes in the law and the new regulatory process. What might you want to see put in place to ensure that those victims are not all landing at your door, erroneously thinking that Ofcom will provide them with individual redress? Do you have any thoughts on that?

Kevin Bakhurst: One area that is very important, which is in the Bill and is one of our responsibilities, is making sure that there is a sufficiently robust and reactive complaints process from the platforms—one that people feel they can complain to and be heard by—and an appeals process. We feel that that is in the Bill. We already receive complaints at Ofcom from people who have issues with platforms and who have gone to the platforms but do not feel their complaints have been properly dealt with or recognised. That is within the video-sharing platform regime. Although we are not going to look at individual pieces of material per se, those individual complaints are very useful in alerting us to issues around particular types of offence or harm that the platforms are not dealing with properly. It will be a really important part of the regime to make sure that platforms provide a complaints process that is easy to navigate and that people can use quickly and accessibly.

Richard Wronka: An additional point I would make, building on that, is that this is a really complex ecosystem. We understand that and have spent a lot of the last two or three years trying to get to grips with that complex ecosystem and building relationships with other participants in the ecosystem. It brings in law enforcement, other regulators, and organisations that support victims of crime or online abuse. We will need to find effective ways to work with those organisations. Ultimately, we are a regulator, so there is a limit to what we can do. It is important that those other organisations are able to operate effectively, but that is perhaps slightly outside our role.

Barbara Keeley (Worsley and Eccles South) (Lab)

Q To what extent do you think services should publish publicly the transparency and risk assessments that they will be providing to Ofcom?

Richard Wronka: I think our starting point here is that we think transparency is a really important principle within the regime—a fundamental principle. There are specific provisions in the Bill that speak to that, but more generally we are looking for this regime to usher in a new era of transparency across the tech sector, so that users and other participants in this process can be clearer about what platforms are doing at the moment, how effective that is and what more might be done in the future. That is something that will be a guiding principle for us as we pick up regulation.

Specifically, the Bill provides for transparency reports. Not all services in scope will need to provide transparency reports, but category 1 and 2 services will be required to produce annual transparency reports. We think that is really important. At the moment, risk assessments are not intended to be published—that is not provided for in the Bill—but the transparency reports will show the effectiveness of the systems and processes that those platforms have put in place.

Barbara Keeley

Q That was to be my next question: do you think it is an issue that category 1 services will not have to publish child risk assessments? It seems to me that it would be better if they did.

Richard Wronka: I think what is important for us as a regulator is that we are able to access those risk assessments; and for the biggest services, the category 1 services, we would be expecting to do that routinely through a supervisory approach. We might even do that proactively, or where services have come to us for dialogue around those—

Barbara Keeley

Q But would it not improve transparency if they did have to publish them? Why would they not want to publish them?

Richard Wronka: Some services may wish to publish the risk assessments. There is nothing in the Bill or in our regulatory approach that would prevent that. At the moment, I do not see a requirement in the Bill to do that. Some services may have concerns about the level of confidential information in there. The important point for us is that we have access to those risk assessments.

Kevin Bakhurst: Picking up on the risk assessments, it is a tricky question, because we would expect those assessments to be very comprehensive and to deal with issues such as how algorithms function, and so on. There is a balance between transparency—which, as Richard says, we will drive across the regime—and not publishing information that could help people who are trying to behave badly online or to game the system, as well as what the regulator needs in practical terms. I am sure the platforms will be able to talk to you more about that.

Barbara Keeley

Q May I ask some follow-up questions about resources and timing once the Bill has gone through? You said you are going to open a new digital and technology hub in Manchester, with the creation of 150 jobs. I have a couple of questions on that. Do you think that what is set out in the proposal will be enough? Will you have the resources to carry out the duties set out in the Bill? This is a follow-up point from my colleague’s question earlier.

There is also a question of timing. The reports suggested that the new hub and jobs will come into play in 2025. I am sure that everyone here wants to see the Bill taking effect sooner. Ofcom will need to do a lot of reviews and reporting in the first year after the Bill receives Royal Assent. How will that be possible if people are not in post until 2025?

Kevin Bakhurst: They are both big questions. I will take the first part, and maybe Richard can take the second one, about the timing. On the resourcing, it is important to say publicly—and we feel strongly about this—that, very unusually, we have had funding from Government to prepare for this regime. I know how unusual that is; I was at a meeting with the European regulators last week, and we are almost unique both in having had funding and in the level of funding that we have had.

The funding has meant that we are already well advanced in our preparations. We have a team of around 150 people working on online safety across the organisation. A number are in Manchester, but some are in London or in our other offices around the UK. It is important to say that that funding has helped us to get off to a really strong start in recruiting people across the piece—not just policy people. Importantly, we have set up a new digital function within Ofcom and recruited a new chief technology officer, who came from Amazon Alexa, to head up that function.

The funding has allowed us to really push hard into this space, which is not easy, and to recruit some of the skills we feel we need to deliver this regime as effectively and rapidly as possible. I know that resourcing is not a matter within the Bill; it is a separate Treasury matter. Going forward though, we feel that, in the plans, we have sufficient resourcing to deliver what we are being asked to deliver. The team will probably double in size by the time we actually go live with the regime. It is a significant number of people.

Some significant new duties have been added in, such as fraudulent advertising, which we need to think carefully about. That is an important priority for us. It requires a different skillset. It was not in the original funding plan. If there are significant changes to the Bill, it is important that we remain alive to having the right people and the right number of people in place while trying to deliver with maximum efficiency. Do you want to talk about timing, Richard?

Richard Wronka: All I would add to that, Kevin, is that we are looking to front-load our recruitment so that we are ready to deliver on the Bill’s requirements as quickly as possible once it receives Royal Assent and our powers commence. That is the driving motivation for us. In many cases, that means recruiting people right now, in addition to the people we have already recruited to help with this.

Clearly there is a bit of a gating process for the Bill, so we will need a settled legislative framework and settled priority areas before we can get on with the consultation process. We will look to run that consultation process as swiftly as possible once we have those powers in place. We know that some stakeholders are very keen to see the Bill in place and others are less enthusiastic, so we need to run a robust process that will stand the test of time.

The Bill itself points us towards a phased process. We think that illegal content—thanks to the introduction of priority illegal content and those priority areas in the Bill—is the area on which we can make the quickest progress as soon as the Bill achieves Royal Assent.

The Chair

Thank you. I intend to bring in the Minister at about 10 o’clock. Kirsty Blackman, Kim Leadbeater and Dean Russell have indicated that they wish to ask questions, so let us try to keep to time.

Kirsty Blackman (Aberdeen North) (SNP)

Q I have a few questions, but I will ask them in a short way, and hopefully the witnesses can answer them in a fairly short way too. The chief executive of Ofcom told the Joint Committee on the draft Bill that the Secretary of State’s powers were extremely open ended. You have already touched on this, but do you feel that this will impede Ofcom’s independence as a regulator?

Kevin Bakhurst: There is a particular area—the power for the Secretary of State to direct us on codes for reasons of public policy—that we have some concern about. That concern is more about practicality than independence, but clearly for the platforms, with which we have had a lot of discussions, the independence of the regulator in a regime that is essentially about content is absolutely critical, and it is a priority for us to show that we are independent.

Kirsty Blackman

Q Do you feel that the Bill will allow you to adequately regulate online gaming, which is how an awful lot of young people use the internet, in a way that will keep them safer than they currently are?

Richard Wronka: Yes, we fully anticipate that gaming services, and particularly the messaging functionality that is often integrated into those services, will be captured within the scope of the regime. We do think that the Bill, on the whole, gives us the right tools to regulate those services.

Kirsty Blackman

Q My last question is about future-proofing the Bill. Obviously, an awful lot of things will happen in the online world that do not currently happen there, and some of those we cannot foresee. Do you think the Bill is wide enough and flexible enough to allow changes to be made so that new and emerging platforms can be regulated?

Kevin Bakhurst: Overall, we feel that it is. By and large, the balance between certainty and flexibility in the Bill is probably about right and will allow some flexibility in future, but it is very hard to predict what other harms may emerge. We will remain as flexible as possible.

Richard Wronka: There are some really important updating tools in the Bill. The ability for the Secretary of State to introduce new priority harms or offences—with the approval of Parliament, of course—is really important.

Kim Leadbeater (Batley and Spen) (Lab)

Q Ofcom is required to produce certain codes, for example on terrorism, but others that were floated in the Green Paper are no longer in the Bill. Are you working on such codes, for example on hate crime and wider harm, and if not, what happens in the meantime? I guess that links to my concerns about the democratic importance and journalistic content provisions in the Bill, to which you have alluded. They are very vague protections and I am concerned that they could be exploited by extremists who suddenly want to identify as a journalist or a political candidate. Could you say a little about the codes and about those two particular clauses and what more you think we could do to help you with those?

Richard Wronka: I will cover the codes first. You are absolutely right that the Bill requires Ofcom to publish codes of practice, particularly on CSEA and on terror, as well as on fraudulent advertising and other areas. We are doing the work right now so that we are ready to progress with that process as soon as we get powers and duties, because it is really important that we are ready to move as quickly as possible. We will set out further detail on exactly how we plan to do that in a roadmap document that we are looking to publish before the summer break, so that will provide some of the detail.

A really important point here is that the Bill quite rightly covers a wide set of harms. We are mindful that a code that tried to cover every single harm, however tempting, could be counterproductive and confusing for platforms, even for those that want to comply and do the right thing. One of the balancing acts for us as we produce that code framework will be to get the right coverage of all the issues that everyone is rightly concerned about, while doing that in a way that is streamlined and efficient, so that services can apply the provisions of those codes.

Kevin Bakhurst: Shall I pick up on the second bit very quickly? I think you are right; this is one of our central concerns about the definitions. As far as possible, this should be a matter for Parliament. It is really important that Parliament has a view on this. Ultimately, the regulator will take a view based on what Parliament says. We have some experience in this area, but as Richard said, we recognise the challenge—it is extremely complex. We can see the policy intent, quite rightly, and the importance of enshrining freedom of expression as far as possible, but Parliament can help to add clarity and, as you rightly say, be aware of some of the potential loopholes. At the moment, someone could describe themselves as a citizen journalist; where does that leave us? I am not quite sure. Parliament could help to clarify that, and we would be grateful.

Dean Russell

Q Do the powers in the Bill cover enough to ensure that people will not be sent flashing images if they have photosensitive epilepsy?

Richard Wronka: This picks up the point we discussed earlier, which is that I understand that the Government are considering proposals from the Law Commission to criminalise the sending of those kinds of images. It would not be covered by the illegal content duties as things stand, but if the Government conclude that it is right to criminalise those issues, it would automatically be picked up by the Bill.

Even so, the regime is not, on the whole, going to be able to pick up every instance of harm. It is about making sure that platforms have the right systems and processes. Where there is clear harm to individuals, we would expect those processes to be robust. We know there is work going on in the industry on that particular issue to try and drive forward those processes.

Dean Russell

Q But as the Bill stands, there is a very clear point about stopping harmful content being sent to people, so I imagine that would cover it at least in that sense, would it not?

Kevin Bakhurst: This is a really important point, which Richard just tried to make. The Bill gives us a great range of tools to try to prevent harm as far as possible; I just think we need to get expectations right here. Unfortunately, this Bill will not result in no harm of any type, just because of the nature of the internet and the task that we face. We are ambitious about driving constant improvement and stopping and addressing the main harms, but it is not going to stop every harm. We will absolutely focus on the ones that have a significant impact, but unfortunately that is the nature of the web.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Just to continue the point made by my colleague, you are right to say that Ministry of Justice colleagues are considering the flashing image offence as a separate matter. But would you agree that clause 150, on harmful communications, does criminalise and therefore place into the scope of the Bill communications intended to cause harm to a “likely audience” where such harm is

“psychological harm amounting to serious distress”?

Therefore, sending somebody a flashing image with the intention of inducing an epileptic fit would likely be caught under this new harmful communications offence in clause 150, even before a separate future offence that may be introduced.

Richard Wronka: I think we can certainly understand the argument. I think it is important that the Bill is as clear as possible. Ultimately, it is for the courts to decide whether that offence would pick up these kinds of issues that we are talking about around flashing imagery.

Chris Philp

Q I would suggest that the definition in clause 150 would cover epilepsy trolling.

You mentioned that you met recently with European regulators. Briefly, because we are short of time, were there any particular messages, lessons or insights you picked up in those meetings that might be of interest to the Committee?

Kevin Bakhurst: Yes, there were a number, and liaising with European regulators and other global regulators in this space is a really important strand of our work. It is often said that this regime is a first globally. I think that is true. This is the most comprehensive regime, and it is therefore potentially quite challenging for the regulator. That is widely recognised.

The second thing I would say is that there was absolute recognition of how advanced we are in terms of the recruitment of teams, which I touched on before, because we have had the funding available to do it. Many countries around Europe have recruited between zero and 10 people and are imminently going to take on some of these responsibilities under the Digital Services Act, so I think they are quite jealous.

The last thing is that we see continued collaboration with other regulators around the world as a really important strand, and we welcome the information-sharing powers that are in the Bill. There are some parallels, and we want to take similar approaches on areas such as transparency, where we can collaborate and work together. I think it is important—

The Chair

Order. I am afraid we have come to the end of the allotted time for questions. On behalf of the Committee, I thank our witnesses for their evidence.

Examination of Witnesses

Dame Rachel de Souza, Lynn Perry MBE and Andy Burrows gave evidence.

10:05
The Chair

We will now hear from the Children’s Commissioner, Dame Rachel de Souza; Lynn Perry, chief executive officer of Barnardo’s, who will be appearing via Zoom; and Andy Burrows, head of child safety at the National Society for the Prevention of Cruelty to Children. Could the new witnesses take their places, please?

We have until 10.50 am for this panel. Could the witnesses please introduce themselves for the record? We will take the witnesses in the room first.

Andy Burrows: I am Andy Burrows, the head of online safety policy at the NSPCC.

Dame Rachel de Souza: I am Rachel de Souza, Children’s Commissioner for England.

The Chair

And on the screen—[Interruption.] Uh-oh, it has frozen. We will have to come back to that. We will take evidence from the witnesses in the room until we have sorted out the problem with the screen.

Barbara Keeley

Q Do you think there is enough in the Bill to make sure that the voices of children at risk of online harms are heard? There is a super-complaints mechanism, but do you think it goes far enough for children, and are you confident that the regime will be able to quickly respond to new and emerging harms to children? Could Andy Burrows start?

Andy Burrows: Thank you for the question. We think that more could be built into the Bill to ensure that children’s needs and voices can be fed into the regime.

One of the things that the NSPCC would particularly like to see is provision for statutory user advocacy arrangements, drawing on the examples that we see in multiple other regulated sectors, where we have a model by which the levy on the firms that will cover the costs of the direct regulation also provides for funded user advocacy arrangements that can serve as a source of expertise, setting out children’s needs and experiences.

A comparison here would be the role that Citizens Advice plays in the energy and postal markets as the user voice and champion. We think that would be really important in bolstering the regulatory settlement. That can also help to provide an early warning function—particularly in a sector that is characterised by very rapid technological and market change—to identify new and emerging harms, and bolster and support the regulator in that activity. That, for us, feels like a crucial part of this jigsaw.

Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.

Dame Rachel de Souza: I was very pleased when the Government asked me, when I came into the role, to look at what more could be done to keep children safe online and to make sure that their voices went right through the passage of the Bill. I am committed to doing that. Obviously, as Children’s Commissioner, my role is to elevate children’s voices. I was really pleased to convene a large number of charities, internet safety organisations and violence against women and girls experts in a joint briefing to MPs to try to get children’s voices over.

I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account. I know you have a busy day, but that is the key point that I want to get across.

The Chair

Lynn Perry is back on the screen—welcome. Would you like to introduce yourself for the record and then answer the question? [Interruption.] Oh, she has gone again. Apparently the problem is at Lynn’s end, so we will just have to live with it; there is nothing we can do on this side.

Barbara Keeley

Q Is the Bill future-proof? If you think it is not, how can we ensure that it is responsive to future risks and harms?

Andy Burrows: The systemic regime is important. That will help to ensure that the regime can be future-proofed; clearly, it is important that we are not introducing a set of proposals and then casting them in aspic. But there are ways that the Bill could be more strongly future-proofed, and that links to ensuring that the regime can effectively map on to the dynamics of the child sexual abuse problem in particular.

Let me give a couple of examples of where we think the Bill could be bolstered. One is around placing a duty on companies to consider the cross-platform nature of harm when performing their risk assessment functions, and having a broad, overarching duty to ask companies to work together to tackle the child sexual abuse threat. That is very important in terms of the current dynamics of the problem. We see, for example, very well-established grooming pathways, where abusers will look to exploit the design features of open social networks, such as on Instagram or Snapchat, before moving children and abuse on to perhaps live-streaming sites or encrypted messaging sites.

The cross-platform nature of the threat is only going to intensify in the years ahead as we start to look towards the metaverse, for example. It is clear that the metaverse will be built on the basis of being cross-platform and interdependent in nature. We can also see the potential for unintended consequences from other regulatory regimes. For example, the Digital Markets Act recently passed by the EU has provisions for interoperability. That effectively means that if I wanted to send you a message on platform A, you could receive it on platform B. There is a potential unintended consequence there that needs to be mitigated; we need to ensure that there is a responsibility to address the harm potential that could come from more interoperable services.

This is a significant area where the Bill really can be bolstered to address the current dynamics of the problem and ensure that legislation is as effective as it possibly can be. Looking to the medium to long term, it is crucial to ensure that we have arrangements that are commensurate to the changing nature of technology and the threats that will emerge from that.

Dame Rachel de Souza: A simple answer from me: of course we cannot future-proof it completely, because of the changing nature of online harms and technology. I talked to a large number of 16 to 21-year-olds about what they wished their parents had known about technology and what they had needed to keep them safe, and they listed a range of things. No. 1 was age assurance—they absolutely wanted good age assurance.

However, the list of harms and things they were coming across—cyber-flashing and all this—is very much set in time. It is really important that we deal with those things, but they are going to evolve and change. That is why we have to build in really good cross-platform work, which we have been talking about. We need these tech companies to work together to be able to stay live to the issues. We also need to make sure that we build in proper advocacy and listen to children and deal with the issues that come up, and that the Bill is flexible enough to be able to grow in that way. Any list is going to get timed out. We need to recognise that these harms are there and that they will change.

The Chair

I will bring in Kim Leadbeater and then Maria Miller and Kirsty Blackman, but I will definitely bring in the Minister at 10.45 am.

Kim Leadbeater

Q Thank you, Ms Rees, and thank you to the witnesses. Many websites host pornography without necessarily being pornographic websites, meaning that children can easily stumble across it. Does the Bill do enough to tackle pornography when it is hosted on mainstream websites?

Dame Rachel de Souza: I have argued hard to get pornographic sites brought into the Bill. That is something very positive about the Bill, and I was really pleased to see that. Why? I have surveyed more than half a million children in my Big Ask survey and spoken recently to 2,000 children specifically about this issue. They are seeing pornography, mainly on social media sites—Twitter and other sites. We know the negative effects of that, and it is a major concern.

I am pleased to see that age assurance is in the Bill. We need to challenge the social media companies—I pull them together and meet them every six months—on getting this stuff off their sites and making sure that under-age children are not on their sites seeing some of these things. You cannot go hard enough in challenging the social media companies to get pornography off their sites and away from children.

Andy Burrows: Just to add to that, I would absolutely echo that we are delighted that part 5 of the Bill, with measures around commercial pornography, has been introduced. One of our outstanding areas of concern, which applies to pornography but also more broadly, is around clause 26, the children’s access assessment, where the child safety duties will apply not to all services but to services where there is a significant number of child users or children comprise a significant part of the user base. That would seem to open the door to some small and also problematic services being out of scope. We have expressed concerns previously about whether OnlyFans, for example, which is a very significant problem as a user-generated site with adult content, could be out of scope. Those are concerns that I know the Digital, Culture, Media and Sport Committee has recognised as well. We would very much like to see clause 26 removed from the Bill, which would ensure that we have a really comprehensive package in this legislation that tackles both commercial pornography and user-generated material.

The Chair

I think Lynn Perry is back. Are you with us, Lynn? [Interruption.] No—okay. We will move on to Maria Miller.

Mrs Miller

Q I have a question for the Children’s Commissioner. You talked just now about doing more on the advocacy of individual cases. I asked a question of Ofcom in the first panel about the issue of support for victims. Its response was that complaints processes will be part of what it will regulate. Do you think that will be enough to answer your concerns, or are you expecting more than simply ensuring that platforms do what they should do?

Dame Rachel de Souza: I absolutely think that we need to look at independent advocacy and go further. I do not think the Bill does enough to respond to individual cases of abuse and to understand issues and concerns directly from children. Children should not have to exhaust platforms’ ineffective complaints routes. It can take days, weeks, months. Even a few minutes or hours of a nude image being shared online can be enormously traumatising for children.

That should inform Ofcom’s policies and regulation. As we know, the risks and harms of the online world are changing constantly, and independent advocacy serves a useful purpose as an early warning mechanism within online safety regulation. I would like to see independent advocacy that provides a proper representation service for children. We need to hear from children directly, and I would like to see the Bill go further on this.

Mrs Miller

Q Is there capacity in the sector to deliver what you are talking about?

Dame Rachel de Souza: I think we need to make capacity. There is some—the NSPCC has its Childline and, as Children’s Commissioner, I have my own advocacy service for children in care. I think this should function in that way, with direct access. So I think that we can create it.

Andy Burrows: May I come in briefly? Our proposals for user advocacy reflect the clear “polluter pays” principle that we think should apply here: the levy that covers the direct cost of regulation should also provide for really effective user advocacy, to help build and scale up that capacity. That is really important not only in helping to give victims what they need in frontline services, but in ensuring that there is a strong counterbalance to some of the largest companies in the world for our sector, which has clear ambition but self-evident constraints.

Dame Rachel de Souza: One of the concerns that has come to me from children—I am talking about hundreds of thousands of children—over the past year is that there is not strong enough advocacy for them and that their complaints are not being met. Girls in particular, following the Everyone’s Invited concerns, have tried so hard to get images down. There is this almost medieval bait-out practice of girls’ images being shared right across platforms. It is horrendous, and the tech firms are not acting quickly enough to get those down. We need proper advocacy and support for children, and I think that they would expect that of us in this groundbreaking Bill.

Kirsty Blackman

Q There has not been a huge amount of discussion of online gaming in the context of the Bill, despite the fact that for many young people that is the way in which they interact with other people online. Do you think the Bill covers online gaming adequately? A lot of interaction in online gaming is through oral communication—voice chat messages. Do you think that it is possible to properly regulate oral communications in gaming?

Dame Rachel de Souza: Good question. I applaud the Bill for what it does cover. We are looking at a Bill that, for the first time, is going to start protecting children’s rights online, so I am really pleased to see that. We have looked a bit at gaming in the past. In terms of harms, obviously the Bill does not cover gaming in full, but it does cover the safety aspects of children’s experience.

It is always good for us to be looking further. Gaming, we know, has some extremely harmful issues of its own, particularly around money and the potential for grooming and other safety risks. In terms of communications, one of the reasons that I am so concerned about encryption and communications online is that so much of it happens through gaming. We need to make sure that those elements are really firm.

Andy Burrows: It is vitally important that the gaming sector is in scope. We know that there are high-risk gaming sites—for example, Twitch—and gaming-adjacent services such as Discord. To go back to my earlier point about the need for cross-platform provisions to apply here, in gaming we can see grooming pathways that can take on a different character from those on social networks, for example, where we might see abuse pathways where that grooming is taking place at the same time, rather than sequentially from a gaming streaming service, say, to a gaming-adjacent platform such as Discord. I think it is very important that a regulator is equipped to understand the dynamics of the harms and how they will perhaps apply differently on gaming services. That is a very strong and important argument for user advocacy.

I would say a couple of things on oral communications. One-to-one oral communications are excluded from the Bill’s scope—legitimately—but we should recognise that there is a grooming risk there, particularly when that communication is embedded in a platform with wider functionality. There is an argument for a platform to consider all aspects of its functionality within the risk assessment process. Proactive scanning is a different issue.

There is a broader challenge for the Bill, and this takes us back to the fundamental objectives and the very welcome design based around systemic risk identification and mitigation. We know that right now, in respect of oral communications and livestream communications, the industry response is not as developed in terms of detecting and disrupting harm as it is for, say, text-based chat. In keeping with the risk assessment process, it should be clear that if platforms want to offer that functionality, they should have to demonstrate through the risk assessment process that they have high-quality, effective arrangements in place to detect and disrupt harm, and that should be the price of admission. If companies cannot demonstrate that, they should not be offering their services, because there is a high risk to children.

Kirsty Blackman

Q Do you think it would be reasonable for gaming companies in particular to have a setting whereby children or young people can choose to interact only with people in their friends list? Would that be helpful?

Andy Burrows: I think that aspect is certainly worthy of consideration, because the key objective is that platforms should be incentivised to deliver safety by design initiatives. One area in the Bill that we would like to be amended is the user empowerment mechanism. That gives adults the ability to screen out anonymous accounts, for example, but those provisions do not apply to children. Some of those design features that introduce friction to the user experience are really important to help children, and indeed parents, have greater ownership of their experience.

Kirsty Blackman

Q Finally, could you explain breadcrumbing a little further? What does it mean and does the Bill tackle it adequately?

Andy Burrows: Child abuse breadcrumbing is a major area of concern for us. The term captures a range of techniques whereby abusers are able to use social networks to facilitate the discovery and the dissemination of child sexual abuse. The activity does not meet the criminal threshold in and of itself, but it effectively enables abusers to use online services as a shop window to advertise their sexual interest in children.

I will give a couple of fairly chilling examples of what I mean by that. There is a phenomenon called “tribute sites”. Abusers open social media accounts in the guise of well-known survivors of child sexual abuse. To all of us in this room, that would look perfectly innocuous, but if you are an abuser, the purpose of those accounts is very clear. In the first quarter of last year, those types of accounts received 6 million interactions.

Another example is Facebook groups. We have seen evidence of Facebook refusing to take down groups that have a common interest in, for example, children celebrating their 8th, 9th and 10th birthdays. That is barely disguised at all; we can all see what the purpose is. Indeed, Facebook’s algorithms can see the purpose there, because research has shown that, within a couple of hours of use of the service, the algorithms identify the common characteristic of interest, which is child sexual abuse, and then start recommending accounts in multiple other languages.

We are talking about a significant way in which abusers are able to organise abuse and migrate it to encrypted chat platforms, to the dark web, and to offender fora, where it is, by definition, much harder to catch that activity, which happens after harm has occurred—after child abuse images have been circulated. We really want breadcrumbing to be brought unambiguously into the scope of the Bill. That would close off tens of millions of interactions with accounts that go on to enable abusers to discover and disseminate material and to form offender networks.

We have had some good, constructive relationships with the Home Office in recent weeks. I know that the Home Office is keen to explore how this area can be addressed, and it is vital that it is addressed. If we are going to see the Bill deliver the objective of securing a really effective upstream response, which I think is the clear legislative ambition, this is an area where we really need to see the Bill be amended.

Kirsty Blackman

Q You mostly talked about Facebook. Is it mostly the largest social media platforms, or are we talking about some of the smaller ones, such as Discord, which you mentioned? Would you like to see those in scope as well, or is it just the very biggest ones?

Andy Burrows: Those provisions should apply broadly, but it is a problem that we see particularly on those large sites because of the scale and the potential for algorithmic amplification.

Barbara Keeley

Q I want to ask about the many tragic cases of teenagers who have died by suicide after viewing self-harm material online. Do you think coroners have sufficient powers to access digital data after the death of a child, and should parents have the right to access their children’s digital data following their death?

Dame Rachel de Souza: Baroness Kidron has done some fantastic work on this, and I really support her work. I want to tell you why. I am a former headteacher—I worked for 30 years in schools as a teacher and headteacher. Only in the last five or six years did I start seeing suicides of children and teenagers; I did not see them before. In the year just before I came to be Children’s Commissioner, there was a case of a year 11 girl from a vulnerable family who had a relationship with a boy, and it went all over the social media sites. She looked up self-harm material, went out to the woods and killed herself. She left a note that basically said, “So there. Look what you’ve done.”

It was just horrendous, having to pick up the family and the community of children around her, and seeing the long-term effects of it on her siblings. We did not see things like that before. I am fully supportive of Baroness Kidron and 5Rights campaigning on this issue. It is shocking to read about the enormous waiting and wrangling that parents must go through just to get their children’s information. It is absolutely shocking. I think that is enough from me.

Andy Burrows: I absolutely agree. One of the things we see at the NSPCC is the impact on parents and families in these situations. I think of Ian Russell, whose daughter Molly took her own life, and the extraordinarily protracted process it has taken to get companies to hand over her information. I think of the anguish and heartbreak that comes with this process. The Bill is a fantastic mechanism to be able to redress the balance in terms of children and families, and we would strongly support the amendments around giving parents access to that data, to ensure that this is not the protracted process that it currently all too often is.

Barbara Keeley

Just quickly, do coroners have sufficient powers? Should they have more powers to access digital data after the death of a child?

Andy Burrows: We can see what a protracted process it has been. There have been improvements to the process. It is currently a very lengthy process because of the mutual legal assistance treaty arrangements—MLAT, as they are known—by which injunctions have to be sought to get data from US companies. It has taken determination from some coroners to pursue cases, very often going up against challenges. It is an area where we think the arrangements could certainly be streamlined and simplified. The balance here should shift toward giving parents and families access to the data, so that the process can be gone through quickly and everything can be done to ease the heartbreak for families having to go through those incredibly traumatic situations.

Alex Davies-Jones

Q Very briefly, Dame Rachel, I will build on what you were just saying, based on your experience as a headteacher. When I make my school visits, the teachers overwhelmingly tell me how, on a daily basis, they have to deal with the fallout from an issue that has happened online or on social media. On that matter, the digital media literacy strategy is being removed from the Bill. What is your thinking on that? How important do you see a digital media literacy strategy being at the heart of whatever policy the Government try to make regarding online safety for children?

Dame Rachel de Souza: There is no silver bullet. This is now a huge societal issue and I think that some of the things that I would want to say would be about ensuring that we have in our educational arsenal, if you like, a curriculum that has a really strong digital media literacy element. To that end, the Secretary of State for Education has just asked me to review how online harms and digital literacy are taught in schools—reviewing not the curriculum, but how good the teaching is and what children think about how the subject has been taught, and obviously what parents think, too.

I would absolutely like to see the tech companies putting some significant funding into supporting education of this kind; it is exactly the kind of thing that they should be working together to provide. So we need to look at this issue from many aspects, not least education.

Obviously, in a dream world I would like really good and strong digital media literacy in the Bill, but actually it is all our responsibility. I know from my conversations with Nadhim Zahawi that he is very keen that this subject is taught through the national curriculum, and very strongly.

Kirsty Blackman

Q I have a quick question on parental digital literacy. You mentioned the panel that you put together of 16 to 21-year-olds. Do you think that today’s parents have the experience, understanding, skills and tools to keep their children properly safe online? Even if they are pretty hands-on and want to do that, do you think that they have all the tools they need to be able to do that?

Dame Rachel de Souza: It is a massive concern to parents. Parents talk to me all the time about their worries: “Do we know enough?” They have that anxiety, especially as their children turn nine or 10; they are thinking, “I don’t even know what this world out there is.” I think that our conversations with 16 to 21-year-olds were really reassuring, and we have produced a pamphlet for parents. It has had a massive number of downloads, because parents absolutely want to be educated in this subject.

What did young people tell us? They told us, “Use the age controls; talk to us about how much time we are spending online; keep communication open; and talk to us.” Talk to children when they’re young, particularly boys, who are likely to be shown pornography for the first time, even if there are parental controls, around the age of nine or 10. So have age-appropriate conversations. There was some very good advice about online experiences, such as, “Don’t worry; you’re not an expert but you can talk to us.” I mean, I did not grow up with the internet, but I managed parenting relatively well—my son is 27 now. I think this is a constant concern for parents.

I do think that the tech companies could be doing so much more to assist parents in digital media literacy, and in supporting them in how to keep their child safe. We are doing it as the Office of the Children’s Commissioner. I know that we are all trying to do it, but we want to see everyone step up on this, particularly the tech companies, to support parents on this issue.

Chris Philp

Q Can I start by thanking the NSPCC and you, Dame Rachel, and your office for the huge contribution that you have made to the Bill as it has developed? A number of changes have been made as a result of your interventions, so I would just like to start by putting on the record my thanks to both of you and both your organisations for the work that you have done so far.

Could you outline for the Committee the areas where you think the Bill, as currently drafted, contains the most important provisions to protect children?

Dame Rachel de Souza: I was really glad to see, in the rewrite of the Online Safety Bill, a specific reference to the role of age assurance to prevent children from accessing harmful content. That has come across strongly from children and young people, so I was very pleased to see that. It is not a silver bullet, but for too long children have been using entirely inappropriate services. The No. 1 recommendation from the 16 to 21-year-olds, when asked what they wish their parents had known and what we should do, was age assurance, if you are trying to protect a younger sibling or are looking at children, so I was pleased to see that. Companies cannot hope to protect children if they do not know who the children are on their platforms, so I was extremely pleased to see that.

Chris Philp

Q Sorry to interject, Dame Rachel, but do you agree that it is not just about stopping under-18s viewing pornography; it also includes stopping children under 13 accessing social media entirely, as per those companies’ purported terms and conditions, which are frequently flouted?

Dame Rachel de Souza: Absolutely. I have called together the tech companies. I have met the porn companies, and they reassured me that as long as they were all brought into the scope of this Bill, they would be quite happy as this is obviously a good thing. I brought the tech companies together to challenge them on their use of age assurance. With their artificial intelligence and technology, they know the age of children online, so they need to get those children offline. This Bill is a really good step in that direction; it will hold them to account and ensure they get children offline. That was a critically important one for me.

I was also pleased to see the holding to account of companies, which is very important. On full coverage of pornography, I was pleased to see the offence of cyber-flashing in the Bill. Again, it is particularly about age assurance.

What I would say is that nudge is not working, is it? We need this in the Bill now, and we need to get it there. In my bit of work with those 2,000 young people, we asked what they had seen in the last month, and 40% of them had tried to get bad images taken down and had not succeeded. Those aspects of the Bill are key.

Andy Burrows: This is a landmark Bill, so we thank you and the Government for introducing it. We should not lose sight of the fact that, although this Bill is doing many things, first and foremost it will become a crucial part of the child protection system for decades to come, so it is a hugely important and welcome intervention in that respect.

What is so important about this Bill is that it adopts a systemic approach. It places clear duties on platforms to go through the process of identifying the reasonably foreseeable harms and requiring that reasonable steps be taken to mitigate them. That is hugely important from the point of view of ensuring that this legislation is future-proofed. I know that many companies have argued for a prescriptive checklist, and then it is job done—a simple compliance job—but a systemic approach is hugely important because it is the basis upon which companies have very clear obligations. Our engagement is very much about saying, “How can we make sure this Bill is the best it can possibly be?” But that is on the bedrock of that systemic approach, which is fundamental if we are to see a culture shift in these companies and an emphasis on safety by design—designing out problems that do not have to happen.

I have engaged with companies where child safety considerations are just not there. One company told me that grooming data is a bad headline today and tomorrow’s chip shop wrapper. A systemic approach is the key to ensuring that we start to address that balance.

Chris Philp

Q Thank you. I obviously strongly agree with those comments.

I would like to turn to one or two points that came up in questioning, and then I would like to probe a couple of points that did not. Dame Rachel mentioned advocacy and ensuring that the voice of particular groups—in this context, particularly that of children—is heard. In that context, I would like to have a look at clause 140, which relates to super-complaints. Subsection (4) says that the Secretary of State can, by regulations, nominate which organisations are able to bring super-complaints. These are complaints whereby you go to Ofcom and say that there is a particular company that is failing in its systemic duties.

Subsection (4) makes it clear that the entities nominated to be an authorised super-complainant would include

“a body representing the interests of users of regulated services”,

which would obviously include children. If an organisation such as the Office of the Children’s Commissioner or the NSPCC—I am obviously not prejudicing the future process—were designated as a super-complainant that was able to bring super-complaints to Ofcom, would that address your point about the need for proper advocacy for children?

Dame Rachel de Souza: Absolutely. I stumbled over that a bit when Maria asked me the question, but we absolutely need people who work with children, who know children and are trusted by children, and who can do that nationally in order to be the super-complainants. That is exactly how I would envisage it working.

Andy Burrows: The super-complaint mechanism is part of the well-established arrangements that we see in other sectors, so we are very pleased to see that that is included in the Bill. I think there is scope to go further and look at how the Bill could mirror the arrangements that we see in other sectors—I mentioned the energy, postal and water sectors earlier as examples—so that the statutory user advocacy arrangements for inherently vulnerable children, including children at risk of sexual abuse, mirror the arrangements that we see in those other sectors. That is hugely important as a point of principle, but it is really helpful and appropriate for ensuring that the legislation can unlock the positive regulatory outcomes that we all want to see, so I think it contributes towards really effective regulatory design.

Chris Philp

Q Thank you, Andy. I am conscious of the time, so I will be brief with my final three questions. You made a valid point about large social media platforms receiving complaints generally, but in this case from children, about inappropriate content, such as photographs of them on a social media platform that do not get taken down—the complaint gets ignored, or it takes a very long time. In clause 18, we have duties on the complaints procedures that the big social media firms will now have to follow. I presume that you would join me in urging Ofcom to ensure that how it enforces the duties in clause 18 includes ensuring that big social media firms are responsive and quick in how they deal with complaints. Children are specifically referred to in the clause—for example, in subsection (3) and elsewhere.

Dame Rachel de Souza: Yes, and I was so pleased to see that. The regulator needs to have teeth for it to have any effect—I think that is what we are saying. I want named senior managers to be held accountable for breaches of their safety duties to children, and I think that senior leaders should be liable to criminal sanctions when they do not uphold their duty of care to children.

Chris Philp

Q Good—thank you. I want to say something about gaming, because Kirsty Blackman asked about it. If messages are being sent back and forth in a gaming environment, which is probably the concern, those are in scope of the Bill, because they are user-to-user services.

I will put my last two questions together. Are you concerned about the possibility that encryption in messaging services might impede the automatic scanning for child exploitation and abuse images that takes place, and would you agree that we cannot see encryption happen at the expense of child safety? Secondly, in the context of the Molly Russell reference earlier, are you concerned about the way that algorithms can promote and essentially force-feed children very harmful content? Those are two enormous questions, and you have only two minutes to answer them, so I apologise.

Dame Rachel de Souza: I am going to say yes and yes.

Andy Burrows: I will say yes and yes as well. The point about end-to-end encryption is hugely important. Let us be clear: we are not against end-to-end encryption. Where we have concerns is about the risk profile that end-to-end encryption introduces, and that risk profile, when we are talking about it being introduced into social networking services and bundled with other sector functionality, is very high and needs to be mitigated.

About 70% of child abuse reports could be lost if Meta goes ahead with its end-to-end encryption plans. That is 28 million reports in the past six months, so it is very important that the Bill can require companies to demonstrate that, if they are running such services, they can acquit themselves in terms of the risk assessment processes. We really welcome the simplified child sexual exploitation warning notices in the Bill that will give Ofcom the power to intervene when companies have not demonstrated that they have been able to introduce end-to-end encryption in a safe and effective way.

One area in which we would like to see the Bill—

The Chair

Order. I am afraid that brings us to the end of the time allotted for the Committee to ask questions of this panel. On behalf of the Committee, I thank our witnesses for their evidence, and I am really sorry that we could not get Lynn Perry online. Could we move on to the last panel? Thank you very much.

Examination of Witnesses

Ben Bradley and Katy Minshall gave evidence.

10:51
The Chair

We will now hear from Ben Bradley, government relations and public policy manager at TikTok, and Katy Minshall, head of UK public policy at Twitter. We have until 11.25 for this panel of witnesses. Could the witnesses please introduce themselves for the record?

Ben Bradley: I am Ben Bradley. I am a public policy manager at TikTok, leading on the Bill for TikTok.

Katy Minshall: I am Katy Minshall. I am head of UK public policy for Twitter.

Alex Davies-Jones

Q Good morning, both. Thank you for joining us today. We have recently had it confirmed by the Minister in a written parliamentary question that NFTs—non-fungible tokens—will be included in the scope of the Bill. Concerns have been raised about how that will work in practice, and also in relation to GIFs, memes and other image-based content that is used on your platforms, Twitter specifically. Katy, how do you see that working in practice? Is the Bill workable in its current form to encapsulate all of that?

Katy Minshall: Thank you for inviting me here to talk about the Online Safety Bill. On whether the Bill is workable in its current form, on the one hand, we have long been supportive of an approach that looks at overall systems and processes, which I think would capture some of the emerging technologies that you are talking about. However, we certainly have questions about how aspects of the Bill would work in practice. To give you an example, one of the late additions to the Bill was about user verification requirements, which as I understand it means that all category 1 platforms will need to offer users the opportunity to verify themselves and, in turn, those verified users have the ability to turn off interaction from unverified users. Now, while we share the Government’s policy objective of giving users more control, we certainly have some workability questions.

Just to give you one example, let’s say this existed today, and Boris Johnson turned on the feature. In practice, that would mean one of two things. Either the feature is only applicable to users in the UK, meaning that people around the world—in France, Australia, Germany or wherever it may be—are unable to interact with Boris Johnson, and only people who are verified in the UK can reply to him, tweet at him and so on, or it means the opposite and anyone anywhere can interact with Boris Johnson except those people who have chosen not to verify their identity, perhaps even in his own constituency, who are therefore at a disadvantage in being able to engage with the Prime Minister. That is just one illustration of the sorts of workability questions we have about the Bill at present.

Alex Davies-Jones

Q You brought up the Prime Minister, so we’ll carry on down that route. One of the concerns about the Bill is the issue of protecting content of democratic importance. If there is an exemption for content of democratic importance, would your platforms be able to take that down?

Katy Minshall: I am sorry, do you mean—

Alex Davies-Jones

Q Would you be able to remove the content?

Katy Minshall: At present, what would be expected of companies in that scenario is not entirely clear in the Bill. There are certainly examples of content we have removed over the years for abuse and hateful conduct where the account owner that we suspended would have grounds to say, “Actually, this is content of democratic importance.” At the very least, it is worth pointing out that, in practice, it is likely to slow down our systems because we would have to build in extra steps to understand if a tweet or an account could be considered content of democratic importance, and we would therefore treat it differently.

Alex Davies-Jones

Q That brings me to my next question. Because what would be classed as content of democratic importance is so ambiguous, would your platforms even be able to detect it?

Katy Minshall: That is a really important question. At present, the Bill envisages that we would treat journalistic content differently from other types of content. I think the definition in the Bill—correct me if I get this wrong—is content for the purposes of journalism that is UK linked. That could cover huge swathes of the conversation on Twitter—links to blog posts, citizen journalists posting, front pages of news articles. The Bill envisages our having a system to separate that content from other content, and then treating that content differently. I struggle to understand how that would work in practice, especially when you layer on top the fact that so much of our enforcement is assisted by technology and algorithms. Most of the abusive content we take down is detected using algorithms; we suspend millions of spam accounts every day using automated systems. When you propose to layer something so ambiguous and complicated on top of that, it is worth considering how that might impact on the speed of enforcement across all of our platform.

Alex Davies-Jones

Q Thank you. Given the media carve-out and the journalism exemption in the Bill, how could you detect state actors that are quoting disinformation, or even misinformation?

Katy Minshall: At present, we label a number of accounts as Government actors or state-affiliated media and we take action on those accounts. We take down their tweets and in some cases we do not amplify their content because we have seen in current situations that some Governments are sharing harmful content. Again, I question the ambiguity in the Bill and how it would interact with our existing systems that are designed to ensure safety on Twitter.

Alex Davies-Jones

Q Thank you. Just one final question for Twitter. A query we raised with the Children’s Commissioner and the NSPCC is about pornography and children accessing it. A person needs to be 13 years old to join Twitter—to host a profile on the site—but you do host pornographic content; it is used mainly by sex workers to promote their trade. How does the proposed provision affect your model of business in allowing 13-year-olds and above to access your platform?

Katy Minshall: Until we see the full extent of the definitions and requirements, it is difficult to say exactly what approach we would take under the Bill. Regarding adult content, Twitter is not a service targeting a youth audience, and as you illustrate, we endeavour to give people the ability to express themselves as they see fit. That has to be balanced with the objective of preventing young people from inadvertently stumbling on such content.

Alex Davies-Jones

Q So you are not predominantly aimed at children? If you are an adult service, why is it that people aged 13 or above can access your platform?

Katy Minshall: We find that, in practice, the overwhelming majority of our user base are over the age of 18; both internal and external data show that. Of course young people can access Twitter. I think we have to be very careful that the Bill does not inadvertently lock children out of services they are entitled to use. I am sure we can all think of examples of people under the age of 18 who have used Twitter to campaign, for activism and to organise; there are examples of under-18s using Twitter to that effect. But as I say, predominantly we are not a service targeting a youth audience.

Alex Davies-Jones

Okay. Thank you, Chair.

Dean Russell

Q Before Christmas, I was involved in the Joint Committee that carried out pre-legislative work on the Bill. We heard platforms repeatedly state their belief that they are doing all they can to ensure safety and protect from harm. Actually, they do not even come close. My question to both platforms—and the others we are hearing from later today—is to what extent are you going to have to be dragged, kicking and screaming, to make sure these measures are put in place, or are you willing to work with Ofcom and other organisations to make sure that that is done?

Ben Bradley: Speaking for TikTok, we view ourselves as a second-generation platform. We launched in 2018, and at that time when you launched a product you had to make sure that safety was at the heart of it. I think the Secretary of State herself has said that the Bill process actually predates the launch of TikTok in the UK.

We view ourselves as an entertainment platform, and to express yourself, enjoy yourself and be entertained, you have to feel safe, so I do not think we would be seen as kicking and screaming under this regime. It is something that we have supported for a long time, and we are regulated by Ofcom under the video-sharing platform, or VSP, regime. What the Bill will achieve is to raise the floor of industry standards, a bit like GDPR did for data, so that for all the companies in the future—to Alex’s point, this is about the next five to 10 years—there will be a baseline of standards that everyone must comply with and expectations that you will be regulated. It also takes a lot of these difficult decisions about the balance between safety and expression, privacy and security out of the hands of tech companies and puts them into the hands of a regulator that, of course, will have democratic oversight.

Katy Minshall: I do not have very much more to add. We already engage positively with Ofcom. I remember appearing before a Select Committee back in 2018 or 2019 and at that point saying that we were absolutely supportive of Ofcom taking on this role and of regulation potentially being a game changer. We are supportive of the systems and processes approach and look forward to engaging constructively in the regulation.

Dean Russell

Q In terms of the timing, once the Bill becomes law, there may be a period before it is enforced while everything is set up. Are both your platforms already gearing up to make sure you fulfil the requirements of the Bill from day one?

Katy Minshall: I am glad you asked that question. The problem with the Bill is it depends on so many things that do not exist yet. We are looking at the Bill and thinking how we can prepare and start thinking about what is necessary, but in practice, content that is harmful to adults and harmful to children has not been set out yet. So much of the Bill depends on secondary legislation and codes of practice, and as I described earlier in the question from Alex Davies-Jones, there are such real workability questions around exemptions and ID verification that I worry there would be the risk of substantial delays at the other end, which I do not think anyone wants to see.

Ben Bradley: It is the same from our perspective. We have our community guidelines and we are committed to enforcing those at the moment. A lot of the detail of the Bill will be produced in Ofcom’s codes of practice but I think it is important we think about operationalising the process, what it looks like in practice and whether it is workable.

Something Katy mentioned about the user empowerment duties, namely how prescriptive they would be and how they would work not just for the platforms of today but for those of the future, is really important. For TikTok, to use a similar example on the user empowerment duties, the intent of the platform is that you discover content from all over the world. When you open the app, you are recommended content from all sorts of users, and there is no expectation that those users would be verified. If you have opted into this proposed user empowerment duty, there is a concern that it could exacerbate the risk of filter bubbles, because you would only be receiving content from users within the UK who have verified themselves, and we work very hard to make sure there is a diverse range of recommendations. I think it is a fairly easy fix. Much like elsewhere in the Bill, where Ofcom has flexibility about whether to require specific recommendations, it could have that flexibility in this case as well, considering whether this type of power works for these types of platforms.

To use the example of the metaverse, how would it work once the metaverse is up and running? The whole purpose of the metaverse is a shared environment in which users interact, and because the Bill is so prescriptive at the minute about how this user empowerment duty needs to be achieved, it is not clear, if you were verified and I were unverified and you had opted not to see my content but I moved something in the shared environment, like this glass, whether that would move for everyone. It is a small point, but it just goes to the prescriptiveness of how it is currently drafted and the importance of giving Ofcom the flexibility that it has elsewhere in the Bill, but in this section as well.

Kirsty Blackman

Q I have a few questions, starting with Twitter, in relation to young people using the platform. How do you currently make sure that under-13s do not use the platform? What actions do you take to ensure that happens? Going forward, will that change?

Katy Minshall: At present, we follow the industry standard of age self-declaration. How you manage and verify identity—whether using a real-name system or emerging technologies like blockchain or documentation—is at the heart of a range of industries, not just ours.

Technology will change and new products that we cannot even envisage today will come on to the market. In terms of what we would do in relation to the Bill, as I said, until we see the full extent of the definitions and requirements, we cannot really say what exact approach we would take.

Kirsty Blackman

Q To follow up on that, you said that there is agreement internally and externally that your service is mostly used by over-18s. Does that mean that you do not think you will have a responsibility to undertake the child safety duties?

Katy Minshall: My understanding of the Bill is that if there is a chance a young person could access your service, you would be expected to undertake the child safety duties, so my understanding is that that would be the case.

Kirsty Blackman

Q Okay. Ben, for TikTok, how do you currently ensure that under-13s are not using your service, and how is that likely to change with the Bill coming in?

Ben Bradley: We are a strictly 13-plus platform. There are basically two approaches to preventing under-age access to our platform. The first is preventing them from signing up. We are rated 12+ in the app stores, so if you have parental controls on those app stores, you cannot download the app. We also have a neutral age gate, which I think is similar to Twitter’s. We do not ask people to confirm whether they are over 13—we do not ask them to tick a box; instead we ask them to enter their date of birth. If they enter a date of birth that shows they are under 13, they are blocked from re-entering a date of birth, so they cannot just keep trying. We do not say that it is because they are under age; we just say, “TikTok isn’t right for you right now.” That is the first step.

Secondly, we proactively surface and remove under-age users. Whenever a piece of content is reported on TikTok, for whatever reason, the moderator will look at two things: the reason why it was reported and also whether the user is under 13. They can look at a range of signals to do that. Are they wearing a school uniform? Is there a birthday cake in their biography? Do they say that they are in a certain year of school? They can use those signals.

We actually publish every quarter how many suspected under-13s we remove from our platform. I think we are currently the only company to publish that on a quarterly basis, but we think it is important to be transparent about how we are approaching this, to give a sense of the efficacy of our interventions.

On what specifically might change, that is not clear; obviously, we have to wait for further guidance from Ofcom. However, we did carry out research last year with parents and young people in five countries across Europe, including the UK, where we tested different ideas of age assurance and verification, trying to understand what they would like to see. There was not really a single answer that everyone could get behind, but there were concerns raised around data protection and privacy if you were handing over this type of information to the 50 or 60 apps that might be on your phone.

One idea, which people generally thought was a good one, was that when you first get a device and first sign into the app store, you would verify your age there, and that app store on that device could then pass an additional token to all the apps on your phone suggesting that you are of a certain age, so that we could apply an age-appropriate experience. Obviously that would not stop us doing everything that we currently do, but I think that would be a strong signal. If that were to move forward, we would be happy to explore it.

Kirsty Blackman

Q Both of your sites rely heavily on algorithms for the content that is put in front of people. If you are in the top tweets feed on Twitter, you get algorithmically derived or chosen content, and TikTok relies even more heavily on algorithms. How will this Bill affect the algorithms that you use, particularly regarding some of the content that may get more and more extreme, for example, if people are going down that route? In terms of the legal but harmful content that is likely to come through, how will the Bill affect the algorithms that you use, and is it possible to do that? Does it work?

Ben Bradley: TikTok does not take a filter bubble approach. When you first open the app, you express areas of content that you are interested in and then we recommend content. Because it is short-form, the key to TikTok’s success is sending you diverse content, which allows you to discover things that you might never have previously expressed interest in. I use the example of Nathan Evans, a postman who went on to have a No. 1 song with “Wellerman”, or even Eurovision, for example. These are things that you would not necessarily express interest in, but when they are recommended to you, you are engaged. Because it is short-form content, we cannot show you the same type of material over and over again—you would not be interested in seeing 10 30-second videos on football, for example. We intentionally try to diversify the feed to express those different types of interests.

Katy Minshall: Our algorithms down-rank harmful content. If you want to see an example live on Twitter, if you send a tweet and get loads of replies, there is a chunk that are automatically hidden at the bottom in a “view more replies” section. Our algorithm works in other ways as well to down-rank content that could be violating our rules. We endeavour to amplify credible content as well. In the explore tab, which is the magnifying glass, we will typically be directing you to credible sources of information—news websites and so on.

In terms of how the Bill would affect that, my main hope is that codes of practice go beyond a leave up or take down binary and beyond content moderation and think about the role of algorithms. At present on Twitter, you can turn the algorithm off in the top right-hand corner of the app, on the sparkle icon. In the long term, I think what we will be aiming for is a choice in the range of algorithms that you could use on services like Twitter. I would hope that the code of practice enables that and does not preclude is as a solution to some of the legal but harmful content we may have in mind.
In terms of how the Bill would affect that, my main hope is that codes of practice go beyond a leave up or take down binary and beyond content moderation and think about the role of algorithms. At present on Twitter, you can turn the algorithm off in the top right-hand corner of the app, on the sparkle icon. In the long term, I think what we will be aiming for is a choice in the range of algorithms that you could use on services like Twitter. I would hope that the code of practice enables that and does not preclude it as a solution to some of the legal but harmful content we may have in mind.
Kirsty Blackman

Q Just one more question. We know that women and minorities face more abuse online than men do. Is that something that you have found in your experience, particularly Twitter? What are you doing to ensure that the intersectionality of harms is considered in the work that you are doing to either remove or downgrade content?

Katy Minshall: That is absolutely the case and it has been documented by numerous organisations and research. Social media mirrors society and society has the problems you have just described. In terms of how we ensure intersectionality in our policies and approaches, we are guided by our trust and safety council, which is a network of dozens of organisations around the world, 10 of which are here in the UK, and which represents different communities and different online harms issues. Alongside our research and engagement, the council ensures that when it comes to specific policies, we are constantly considering a range of viewpoints as we develop our safety solutions.

Kim Leadbeater

Q Thank you, Chair, and thank you to the witnesses. I share your concerns about the lack of clarity regarding the journalistic content and democratic content exemptions. Do you think those exemptions should be removed entirely, or can you suggest what we might do to make them clearer in the Bill?

Katy Minshall: At the very least, there must be tighter definitions. I am especially concerned when it comes to the news publisher exemption. The Secretary of State has indicated an amendment that would mean that services like Twitter would have to leave such content up while an appeals process is ongoing. There is no timeline given. The definition in the Bill of a news publisher is, again, fairly vague. If Ben and I were to set up a news website, nominally have some standards and an email address where people could send complaints, that would enable it to be considered a news publisher under the Bill. If we think about some of the accounts that have been suspended from social media over the years, you can absolutely see them creating a news website and saying, “I have a case to come back on,” to Twitter or TikTok or wherever it may be.

Ben Bradley: We share those concerns. There are already duties to protect freedom of expression in clause 19. Those are welcome. It is the breadth of the definition of journalistic and democratic content that is a concern for us, particularly when it comes to things like the expedited and dedicated appeals mechanism, which those people would be able to claim access to if their content was removed. We have already seen people like Tommy Robinson on the far right present themselves as journalists or citizen journalists. Giving them access to a dedicated and expedited appeals mechanism is an area of concern.

There are different ways you could address that, such as greater clarity in those definitions and removing subjective elements. At the minute, the test is whether or not a user considers their content to be journalistic; it is not an objective criterion but a matter of their belief about their own content.

Also, if you look at something like the dedicated and expedited appeals mechanism, could you hold that in reserve, so that if a platform were found to be failing in its duties to journalistic content or in its freedom of expression duties, Ofcom could say, as it can in other areas of the Bill, “Okay, we believe that you need to create this dedicated mechanism, because you have failed to protect those duties”? That would, I think, minimise the risk of exploitation of that mechanism.

Kim Leadbeater

That is really helpful, thank you. A quick question—

The Chair

I am sorry, I have to interrupt because of time. Maria Miller.

Mrs Miller

Q Two hopefully quick questions. I have been listening carefully. Could you summarise the main changes you will make to your products that your users, whether they are children or adults, will notice make them safer? I have heard a lot about problems, but what are the changes you will actually make? Within that, could you talk about how you will improve your complaints system, which earlier witnesses said is inadequate?

Katy Minshall: We would certainly look to engage with Ofcom positively on the requirements it sets out. I am sorry to sound repetitive, but the challenge is that the Bill depends on so many things that do not exist yet and the definitions around what we mean by content harmful to adults or to children. In practice, that makes it challenging to say to you exactly today what approaches we would take. To be clear, we would of course look to continue working with the Government and now Ofcom with the shared objective of making the online space safer for everyone.

Mrs Miller

Q I want to probe you a little on that. You are saying that, because harmful content has not been defined, you will not make any changes other than around that. It is quite a large Bill; surely there are other things you will do differently, no?

Katy Minshall: The lesson of the past three or four years is that we cannot wait for the Bill. We at Twitter are continuing to make changes to our product and our policies to improve safety for everyone, including children.

Mrs Miller

Q So the Bill is irrelevant to you.

Katy Minshall: The Bill is a really important piece of regulation, which is why I was so pleased to come today and share our perspectives. We are continuing to engage positively with Ofcom. What I am trying to say is that until we see the full extent of the requirements and definitions, it is hard to set out exactly what steps we would take with regards to the Bill.

Ben Bradley: To add to that point, it is hard to be specific about some of the changes we would make, because a lot of the detail of the Bill defers to Ofcom guidance and the codes of practice. Obviously we all have the duties around child safety and adult safety, but the Ofcom guidance will suggest specific measures that we can take to do that, some of which we may take already, and some of which may go further than what we already do. Once we see the details of the codes, we will be able to give a clearer answer.

Broadly, from a TikTok perspective, through the design of the product and the way we approach safety, we are in a good place for when the new regime comes in, because we are regulated by Ofcom under the VSP regime, but we would have to wait for the full detail. Beyond some of the companies that you will hear from today, this will touch 20,000 companies and will raise the floor for all the companies that will be regulated under the regime.

Mrs Miller

Q But you cannot give any further detail about specific changes you will make as a result of this legislation because you have not seen the guidance and the codes.

Ben Bradley: Yes, the codes of practice will recommend specific steps that we should take to achieve our duties. Until we see the detail of those codes it is hard to be specific about some of the changes that we would make.

The Chair

Barbara, you have just a couple of minutes.

Barbara Keeley

Q Can I ask about children’s risk assessments? Who in your organisation will write the children’s risk assessments, and at what level in your organisation will they be signed off?

Katy Minshall: At present, we have a range of risk assessment processes. We have a risk committee of the board. We do risk assessments when we make a change about—

Barbara Keeley

Q No, I mean the children’s risk assessment you will have to do as part of what the Bill will bring in.

Katy Minshall: At present, we do not have a specific individual designated to do the children’s risk assessment. The key question is how much does Ofcom’s guidance on risk assessments—once we see it—intersect with our current processes versus changes we would need to make to our risk assessment processes?

Barbara Keeley

Q Okay. At what level in the organisation do you anticipate children’s risk assessment would be signed off? Clearly, this is a very important aspect of the Bill.

Katy Minshall: I would have to go away and review the Bill. I do not know whether a specific level is set out in the Bill, but we would want to engage with the regulation and the requirements set for companies such as Twitter. However, it would be expected that that is what we would—

Barbara Keeley

Q Do you think it should be signed off at a senior level—board level—in your organisation?

Katy Minshall: Already all the biggest decisions that we make as a company are signed off at the most senior level. We report to our chief executive, Parag Agrawal, and then to the board. As I say, there is a risk committee of the board, so I expect that we would continue to make those decisions at the highest level.

Ben Bradley: It is broadly the same from a TikTok perspective. Safety is a priority for every member of the team, regardless of whether they are in a specific trust and safety function. In terms of risk assessments, we will see from the detail of the Bill at what level they need to be signed off, but our CEO has been clear in interviews that trust and safety is a priority for him and everyone at TikTok, so it would be something to which we are all committed.

Barbara Keeley

Do you think you would be likely to sign it off at the board level—

The Chair

Sorry, I have to interrupt you there. I call the Minister.

Chris Philp

Q Thank you for coming to give evidence to the Committee. On the question about user choice around identity verification, is this not conceptually quite similar to the existing blue tick regime that Twitter operates successfully?

Katy Minshall: As I say, we share your policy objective of giving users more choice. For example, at present we are testing a tool where Twitter automatically blocks abusive accounts on your behalf. We make the distinction based on an account’s behaviour and not on whether it has verified itself in some way.

Chris Philp

Q Well, I’d be grateful if you applied that to my account as quickly as possible!

I do not think that the concept would necessarily operate as you suggested at the beginning. You suggested that people might end up not seeing content posted by the Prime Minister or another public figure. The concept is that, assuming a public figure would choose to verify themselves, content that they posted would be visible to everybody because they had self-verified. The content in the other direction may or may not be, depending on whether the Prime Minister or the Leader of the Opposition chose to see all content or just verified content, but their content—if they verified themselves—would be universally visible, regardless of whatever choice anyone else exercised.

Katy Minshall: Yes, sorry if I was unclear. I totally accept that point, but it would mean that some people would be able to reply to Boris Johnson and others would not. I know we are short on time, but it is worth pointing out that in a YouGov poll in April, nearly 80% of people said that they would not choose to provide ID documents to access certain websites. The requirements that you describe are based on the assumption that lots of people will choose to do it, when in reality that might not be the case.

A public figure might think, “Actually, I really appreciate that I get retweets, likes and people replying to my tweets,” but if only a small number of users have taken the opportunity to verify themselves, that is potentially a disincentive even to use this system in the first place—and all the while we were creating a system, we could have been investing in or trying to develop new solutions, such as safety mode, which I described and which tries to prevent abusive users from interacting with you.

Chris Philp

Q I want to move on to the next question because we only have two minutes left.

Ben, you talked about the age verification measures that TikTok currently takes. For people who do not come via an age-protected app store, it is basically self-declared. All somebody has to do is type in a date of birth. My nine-year-old children could just type in a date of birth that was four years earlier than their real date of birth, and off they would go on TikTok. Do you accept that that is wholly inadequate as a mechanism for policing the age limit of 13?

Ben Bradley: That is not the end of our age assurance system; it is just the very start. Those are the first two things that we have to prevent sign-up, but we are also proactive in surfacing and removing under-age accounts. As I said, we publish every quarter how many suspected under-13s get removed.

Chris Philp

Q If I understood your answer correctly, that is only if a particular piece of content comes to the attention of your moderators. I imagine that only 0.01% or some tiny fraction of content on TikTok comes to the attention of your moderators.

Ben Bradley: It is based on a range of signals that they have available to them. As I said, we publish a number every quarter. In the last quarter, we removed 14 million users across the globe who were suspected to be under the age of 13. That is evidence of how seriously we take the issue. We publish that information because we think it is important to be transparent about our efforts in this space, so that we can be judged accordingly.

Chris Philp

Q Thank you. Forgive me for moving on in the interests of time.

Earlier, we debated content of democratic importance and the protections that that and free speech have in the Bill. Do you agree that a requirement to have some level of consistency in the way that that is treated is important, particularly given that there are some glaring inconsistencies in the way in which social media firms treat content at the moment? For example, Donald Trump has been banned, while flagrant disinformation by the Russian regime, lying about what they are doing in Ukraine, is allowed to propagate—including the tweets that I drew to your attention a few weeks ago, Katy.

Katy Minshall: I agree that freedom of expression should be top of mind as companies develop safety and policy solutions. Public interest should always be considered when developing policies. From the perspective of the Bill, I would focus on freedom of expression for everyone, and not limit it to content that could be related to political discussions or journalistic content. As Ben said, there are already wider freedom of expression duties in the Bill.

Chris Philp

Q To be clear, those freedom of expression duties in clause 19(2) do apply to everyone.

Katy Minshall: Sorry, but I do not know the Bill in those terms, so you would have to tell me the definition.

The Chair

Order. I am afraid that that brings us to the end of the time allotted for the Committee to ask questions in this morning’s sitting. On behalf of the Committee, I thank our witnesses for their evidence. We will meet again at 2 pm in this room to hear further oral evidence.

11:26
The Chair adjourned the Committee without Question put (Standing Order No. 88).
Adjourned till this day at Two o’clock.

Online Safety Bill (Second sitting)

Committee stage
Tuesday 24th May 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 24 May 2022
The Committee consisted of the following Members:
Chairs: † Sir Roger Gale, Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Mrs Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Witnesses
Richard Earley, UK Public Policy Manager, Meta
Becky Foreman, UK Corporate Affairs Director, Microsoft
Katie O’Donovan, Director of Government Affairs and Public Policy, Google UK
Professor Clare McGlynn, Professor of Law, Durham University
Jessica Eagelton, Policy and Public Affairs Manager, Refuge
Janaya Walker, Public Affairs Manager, End Violence Against Women
Lulu Freemont, Head of Digital Regulation, techUK
Ian Stevenson, Chair, OSTIA
Adam Hildreth, CEO, Crisp
Jared Sine, Chief Business Affairs and Legal Officer, Match Group
Nima Elmi, Head of Public Policy in Europe, Bumble
Dr Rachel O’Connell, CEO TrustElevate
Rhiannon-Faye McDonald, Victim and Survivor Advocate, Marie Collins Foundation
Susie Hargreaves OBE, Chief Executive, Internet Watch Foundation
Ellen Judson, Lead Researcher at the Centre for the Analysis of Social Media, Demos
Kyle Taylor, Founder and Director, Fair Vote UK
Public Bill Committee
Tuesday 24 May 2022
(Afternoon)
[Sir Roger Gale in the Chair]
Online Safety Bill
14:00
The Committee deliberated in private.
Examination of Witnesses
Richard Earley, Becky Foreman and Katie O’Donovan gave evidence.
14:01
The Chair

Q 68 Good afternoon. We are now sitting in public and these proceedings are being broadcast. This afternoon, we will first hear oral evidence from Richard Earley, the UK public policy manager of Meta, Becky Foreman, the UK corporate affairs director at Microsoft, and Katie O’Donovan, the director of Government affairs and public policy at Google and YouTube. Ladies and gentlemen, thank you very much indeed for joining us. For the sake of the record, could I just ask you to identify yourselves?

Richard Earley: Good afternoon. My name is Richard Earley, and I work in the public policy team at Meta, leading on content issues including the Online Safety Bill.

Becky Foreman: I am Becky Foreman; I am the corporate affairs director for Microsoft UK.

Katie O’Donovan: I am Katie O’Donovan; I am director of Government affairs and public policy for Google in the UK.

The Chair

May I just ask you, for the benefit of Hansard, to try to speak up a little? The sound system is not all that it might be in this room, and the acoustics certainly are not.

Alex Davies-Jones (Pontypridd) (Lab)

Q Thank you to our witnesses for joining us this afternoon. Quite bluntly, I will get into it, because what is frustrating for us, as Parliamentarians, and for our constituents, is the fact that we need this legislation in the first place. Why are you, as platforms, allowing harmful and illegal content to perpetuate on your platforms? Why do we need this legislation for you to take action? It is within your gift to give, and despite all the things I am sure you are about to tell me that you are doing to prevent this issue from happening, it is happening and we are needing to legislate, so why?

The Chair

Mr Earley, I will go left to right to start with, if that is all right with you, so you have drawn the short straw.

Richard Earley: No worries, and thank you very much for giving us the opportunity to speak to you all today; I know that we do not have very much time. In short, we think this legislation is necessary because we believe that it is really important that democratically elected Members of Parliament and Government can provide input into the sorts of decisions that companies such as ours are making, every day, about how people use the internet. We do not believe that it is right for companies such as ours to be taking so many important decisions every single day.

Now, unfortunately, it is the case that social media reflects the society that we live in, so all of the problems that we see in our society also have a reflection on our services. Our priority, speaking for Meta and the services we provide—Facebook, Instagram and WhatsApp—is to do everything we can to make sure our users have as positive an experience as possible on our platform. That is why we have invested more than $13 billion over the past five years in safety and security, and have more than 40,000 people working at our company on safety and security every day.

That said, I fully recognise that we have a lot more areas to work on, and we are not waiting for this Bill to come into effect to do that. We recently launched a whole range of updated tools and technologies on Instagram, for example, to protect young people, including preventing anyone under the age of 18 from being messaged by a person they are not directly connected to. We are also using new technology to identify potentially suspicious accounts to prevent young people from appearing in any search results that those people carry out. We are trying to take steps to address these problems, but I accept there is a lot more to do.

Alex Davies-Jones

Q Before I bring in Becky and Katie to answer that, I just want to bring you back to something you said about social media and your platforms reflecting society like a mirror. That analogy is used time and again, but actually they are not a mirror. The platforms and the algorithms they use amplify, encourage and magnify certain types of content, so they are not a mirror of what we see in society. You do not see a balanced view of the two sides of an issue, for example.

You say that work is already being done to remove this content, but on Instagram, for example, which is a platform predominantly used by women, the Centre for Countering Digital Hate has exposed what they term an “epidemic of misogynistic abuse”, with 90% of misogynistic abuse being sent via direct messaging. It is being ignored by the platform even when it is being reported to the moderators. Why is that happening?

Richard Earley: First, your point about algorithms is really important, but I do not agree that they are being used to promote harmful content. In fact, in our company, we use algorithms to do the reverse of that. We try to identify content that might break our policies—the ones we write with our global network of safety experts—and then remove those posts, or if we find images or posts that we think might be close to breaking those rules, we show them lower in people’s feeds so that they have a lower likelihood of being seen. That is why, over the past two years, we have reduced the prevalence of harmful posts such as hate speech on Facebook so that now only 0.03% of views of posts on Facebook contain that kind of hate speech—we have almost halved the number. That is one type of action that we take in the public parts of social media.

When it comes to direct messages, including on Instagram, there are a range of steps that we take, including giving users additional tools to turn off any words they do not want to see in direct messages from anyone. We have recently rolled out a new feature called “restrict” which enables you to turn off any messages or comments from people who have just recently started to follow you, for example, and have just created their accounts. Those are some of the tools that we are trying to use to address that.

Alex Davies-Jones

Q So the responsibility is on the user, rather than the platform, to take action against abuse?

Richard Earley: No, the responsibility is absolutely shared by those of us who offer platforms, by those who are engaged in abuse in society, and by civil society and users more widely. We want to ensure we are doing everything we can to use the latest technology to stop abuse happening where we can and give people who use our services the power to control their experience and prevent themselves from encountering it.

The Chair

We must allow the other witnesses to participate.

Becky Foreman: Thank you for inviting me to give evidence to you today. Online safety is extremely important to Microsoft and sits right at the heart of everything we do. We have a “safety by design” policy, and responsibility for safety within our organisation sits right across the board, from engineers to operations and policy people. However, it is a complicated, difficult issue. We welcome and support the regulation that is being brought forward.

We have made a lot of investments in this area. For example, we introduced PhotoDNA more than 10 years ago, which is a tool that is used right across the sector and by non-governmental organisations to scan for child sexual abuse material and remove it from their platforms. More recently, we have introduced a grooming tool that automates the process of trying to establish whether a grooming conversation is taking place between an adult and a child. That can then be flagged for human review. We have made that available at no charge to the industry, and it has been licensed by a US NGO called Thorn. We take this really seriously, but it is a complicated issue and we really welcome the regulation and the opportunity to work with the Government and Ofcom on this.

Katie O’Donovan: Thank you so much for having me here today and asking us to give evidence. Thank you for your question. I have worked at Google and YouTube for about seven years and I am really proud of our progress on safety in those years. We think about it in three different ways. First, what products can we design and build to keep our users safer? Similar to Microsoft, we have developed technology that identifies new child sex abuse material and we have made that available across the industry. We have developed new policies and new ways of detecting content on YouTube, which means we have really strict community guidelines, we identify that content and we take it down. Those policies that underlie our products are really important. Finally, we work across education, both in secondary and primary schools, to help inform and educate children through our “Be Internet Legends” programme, which has reached about 4 million people.

There is definitely much more that we can do, and I think the context of a regulatory environment is really important. We also welcome the Bill, and I think it is really going to be meaningful when Ofcom audits how we are meeting the requirements in the legislation—not just how platforms like ours are meeting the requirements in the Bill, but how a wide spectrum of platforms that young people and adults use are doing so. That could have a really positive additive effect on the overall impact.

It is worth pausing and reflecting on legislation that has passed recently, as well. The age-appropriate design code, or the children’s code, which the Information Commissioner’s Office now manages, has also helped us determine new ways to keep our users safe. For example, we have long had a product called SafeSearch, which you can use on Search and which parents can keep locked on; we now also turn it on by default where we use signals to identify people who we think are under 18.

We think that is getting the right balance between providing a safer environment and enabling people to access information. We have not waited for this regulation. This regulation can help us do more, and it can also level the playing field and really make sure that everyone in the industry steps up and meets the best practice that can exist.

Alex Davies-Jones

Q Thank you, both, for adding context to that. If I can bring you back to what is not being done and why we need to legislate, Richard, I come back to you. You mentioned some of the tools and systems that you have put in place so users can stop abuse from happening. Why is it that that 90% of abuse on Instagram in direct messages is being ignored by your moderators?

Richard Earley: I do not accept that figure. I believe that if you look at our quarterly transparency report, which we just released last week, you can see that we find more than 90% of all the content that we remove for breaking our policies ourselves. Whenever somebody reports something on any of our platforms, they get a response from us. I think it is really important, as we are focusing on the Bill, to understand or make the point that, for private messaging, yes, there are different harms and different risks of harm that can apply, which is why the steps that we take differ from the steps that we take in public social media.

One of the things that we have noticed in the final draft of the Bill is that the original distinction between public social media and private messaging, which was contained in the online harms White Paper and in earlier drafts of the Bill, has been lost here. Acknowledging that distinction, and helping companies recognise that there are different risks, and therefore different steps that can be taken, in private messaging compared with public social media, would be a really important thing for the Committee to consider.

Alex Davies-Jones

Q Quite briefly, because I know we are short on time, exactly how many human moderators do you have working to take down disinformation and harmful illegal content on your platforms?

Richard Earley: We have around 40,000 people in total working on safety and security globally and, of those, around half directly review posts and content.

Alex Davies-Jones

Q How many of those are directly employed by you and how many are third party?

Richard Earley: I do not have that figure myself but I know it is predominantly the case that, in terms of the safety functions that we perform, it is not just looking at the pieces of content; it is also designing the technology that finds and surfaces content itself. As I said, more than 90% of the time—more than 95% in most cases—it is our technology that finds and removes content before anyone has to look at it or report it.

Alex Davies-Jones

Q On that technology, we have been told that you are not doing enough to remove harmful and illegal content in minority languages. This is a massive gap. In London alone, more than 250 languages are spoken on a regular basis. How do you explain your inaction on this? Can you really claim that your platform is safe if you are not building and investing in AI systems in a range of languages? What proactive steps are you taking to address this extreme content that is not in English?

Richard Earley: That group of 40,000 people that I mentioned operates 24 hours a day, 7 days a week. They cover more than 70 languages between them, which includes the vast majority of the world’s major spoken languages. I should say that the people working at Meta on these classifiers and reviewing content include people with native proficiency in these languages and people who can build the technology to find and remove things too. It is not just what happens within Meta that makes a difference here, but the work we do with our external partners. We have over 850 safety partners that we work with globally, who help us understand how different terms can be used and how different issues can affect the spread of harm on our platforms. All of that goes into informing both the policies we use to protect people on our platform and the technology we build to ensure those policies are followed.

Alex Davies-Jones

Q Finally, which UK organisations that you use have quality assured any of their moderator training materials?

Richard Earley: I am sorry, could you repeat the question?

Alex Davies-Jones

The vast majority of people are third party. They are not employed directly by Meta to moderate content, so how many of the UK organisations you use have been quality assured to ensure that the training they provide in order to spot this illegal and harmful content is taken on board?

Richard Earley: I do not believe it is correct that for our company, the majority of moderators are employed by—

Alex Davies-Jones

You do not have the figures, so you cannot tell me.

Richard Earley: I haven’t, no, but I will be happy to let you know afterwards in our written submission. Everyone who is involved in reviewing content at Meta goes through an extremely lengthy training process that lasts multiple weeks, covering not just our community standards in total but also the specific area they are focusing on, such as violence and incitement. If it is hate speech, of course, there is a very important language component to that training, but in other areas—nudity or graphic violence—the language component is less important. We have published quite a lot about the work we do to make sure our moderators are as effective as possible and to continue auditing and training them. I would be really happy to share some of that information, if you want.

Alex Davies-Jones

Q But that is only for those employed directly by Meta.

Richard Earley: I will have to get back to you to confirm that, but I think it applies to everyone who reviews content for Meta, whether they are directly employed by Meta or engaged through one of our outsourcing partners.

The Chair

Thank you very much. Don’t worry, ladies; I am sure other colleagues will have questions that they wish to pursue. Dean Russell, please.

Dean Russell (Watford) (Con)

Q Thank you, Chair. I guess this is for all three of you, but it is actually directed primarily at Richard—apologies. I do not mean to be rude—well, I am probably about to be rude.

One of the reasons why we are bringing in this Bill is that platforms such as Facebook—Meta, sorry—just have not fulfilled their moral obligations to protect children from harm. What commitment are you making within your organisation to align yourself to deliver on the requirements of the Bill?

To be frank, the track record up until now is appalling, and all I hear in these witness sessions, including before Christmas on the Joint Committee, is that the big platforms seem to think they are doing a good job—that they are all fine. They have spent billions of pounds and it is not going anywhere, so I want to know what practical measures you are going to be putting into place following this Bill coming into law.

Richard Earley: Of course, I do not accept that we have failed in our moral obligation to our users, particularly our younger users. That is the most important obligation that we have. I work with hundreds of people, and there are thousands of people at our company who spend every single day talking to individuals who have experienced abuse online, people who have lived experience of working with victims of abuse, and human rights defenders—including people in public life such as yourself—to understand the impact that the use of our platform can have, and work every day to make it better.

Dean Russell

Q But do you accept that there is a massive gap between those who you perhaps have been protecting and those who are not protected, hence the need for us to put this law in place?

Richard Earley: Again, we publish this transparency report every quarter, which is our attempt to show how we are doing at enforcing our rules. We publish how many of the posts that break our rules we take down ourselves, and also our estimates of how likely you are to find a piece of harmful content on the platform—as I mentioned, it is around three in every 10,000 for hate speech right now—but we fully recognise that you will not take our word for it. We expect confidence in that work to be earned, not just assumed.

That is why last year, we commissioned EY to carry out a fully independent audit of these systems. It published that report last week when we published our most recent transparency report and, again, I am very happy to share it with you here. The reason we have been calling for many years for pieces of legislation like this Bill to come into effect is that we think having Ofcom, the regulator—as my colleagues just said—able to look in more detail at the work we are doing, assess the work we are doing, and identify areas where we could do more is a really important part of what this Bill can do.

Dean Russell

Q I am conscious of the time, sorry. I know colleagues want to come in, but what are the practical measures? What will you be doing differently moving forward following this Bill?

Richard Earley: To start with, as I said, we are not waiting for the Bill. We are introducing new products and new changes all the time.

Dean Russell

Q Which will do what, sorry? I do not mean to be rude, but what will they be?

Richard Earley: Well, I just spoke about some of the changes we made regarding young people, including defaulting them into private accounts. We have launched additional tools making it possible for people to put in lists of words they do not want to see. Many of those changes are aligned with the core objectives of the Bill, which are about assessing early the risks of any new tools that we launch and looking all the time at how the use of technology changes and what new risks that might bring. It is then about taking proactive steps to try to reduce the risk of those harms.

Dean Russell

Q May I ask you a specific question? Will that include enabling bereaved parents to see their children’s Facebook posts and profile?

Richard Earley: This is an issue we have discussed at length with DCMS, and we have consulted a number of people. It is, of course, one of the most sensitive, delicate and difficult issues we have to deal with, and we deal with those cases very regularly. In the process that exists at present, there are, of course, coronial powers. There is a process in the UK and other countries for coroners to request information.

When it comes to access for parents to individuals’ accounts, at present we have a system for legacy contacts on some of our services, where you can nominate somebody to have access to your account after you pass away. We are looking at how that can be expanded. Unfortunately, there are an awful lot of different obligations we have to consider, not least the obligations to a person who used our services and then passed away, because their privacy rights continue after they have passed away too.

Dean Russell

Okay, so there is a compassion element. I am conscious of time, so I will stop there.

The Chair

One moment, please. I am conscious of the fact that we are going to run out of time. I am not prepared to allow witnesses to leave without feeling they have had a chance to say anything. Ms Foreman, Ms O’Donovan, is there anything you want to comment on from what you have heard so far? If you are happy, that is fine, I just want to make sure you are not being short-changed.

Becky Foreman: No.

Katie O'Donovan: No, I look forward to the next question.

Kirsty Blackman (Aberdeen North) (SNP)

Q Given the size of Facebook, a lot of our questions will be focused towards it—not that you guys do not have very large platforms, but the risks with social media are larger. You mentioned, Richard, that three in every 10,000 views are hate speech. If three in every 10,000 things I said were hate speech, I would be arrested. Do you not think that, given the incredibly high number of views there are on Facebook, there is much more you need to do to reduce the amount of hate speech?

Richard Earley: So, reducing that number—the prevalence figure, as we call it—is the goal that we set our engineers and policy teams, and it is what we are devoted to doing. On whether it is a high number, I think we are quite rare among companies of our size in providing that level of transparency about how effective our systems are, and so to compare whether the amount is high or low, you would require additional transparency from other companies. That is why we really welcome the part of the Bill that allows Ofcom to set standards for what kinds of transparency actually are meaningful for people.

We have alighted on the figure of prevalence, because we think it is the best way for you and the public to hold us to account for how we are doing. As I said, that figure of three in every 10,000 has declined from six in every 10,000 about 12 months ago. I hope the figure continues to go down, but it is not just a matter of what we do on our platform. It is about how all of us in society function and the regulations you will all be creating to help support the work we do.
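A prevalence figure of this kind is, in effect, a sampled rate of violating views per 10,000 views. The sketch below is a minimal illustration of that calculation; the function name and sample data are hypothetical and do not describe Meta's actual measurement pipeline.

```python
# Illustrative sketch of a "prevalence" metric: violating views per 10,000 views.
# All names and data here are hypothetical; this is not Meta's methodology.

def prevalence_per_10k(sampled_views):
    """sampled_views: list of booleans, True if the viewed item broke policy."""
    if not sampled_views:
        return 0.0
    violating = sum(1 for v in sampled_views if v)
    return 10_000 * violating / len(sampled_views)

# Example: 3 violating views in a sample of 10,000 gives a prevalence of 3.0,
# i.e. "three in every 10,000 views" as quoted in the evidence.
sample = [True] * 3 + [False] * 9_997
print(prevalence_per_10k(sample))  # 3.0
```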

Kirsty Blackman

Q I would like to follow up with a question about responding to complaints. The complaints process is incredibly important. Reports need to be made and Facebook needs to respond to those reports. The Centre for Countering Digital Hate said that it put in 100 complaints and that 51 did not get any response from Facebook. It seems as though there is a systemic issue with a lack of response to complaints.

Richard Earley: I do not know the details of that study's methodology. What I can tell you is that every time anyone reports something on Facebook or Instagram, they get a response into their support inbox. We do not put the response directly into your Messenger inbox or IG Direct inbox, because very often when people report something, they do not want to be reminded of what they have seen among messages from their friends and family. Unfortunately, sometimes people do not know about the support inbox and so they miss the response. That could be what happened there, but every time somebody reports something on one of our platforms, they get a response.

Kirsty Blackman

Q Does the response just say, “Thanks for your report”?

Richard Earley: No. I want to be very constructive here. I should say that some of the concerns that are raised around this date from several years ago. I will accept that five or 10 years ago, the experience on our platforms was not this comprehensive, but in the last few years, we have really increased the transparency we give to people. When you submit something and report it for a particular violation, we give you a response that explains the action we took. If we removed it, we would explain what piece of our community standards it broke. It also gives you a link to see that section of our policy so you can understand it.

That is one way we have tried to increase the transparency we give to users. I think there is a lot more we could be doing. I could talk about some of the additional transparency steps we are taking around the way that our algorithms recommend content to people. Those are, again, all welcome parts of the Bill that we look forward to discussing further with Ofcom.

Kirsty Blackman

Q One of the things that has been recommended by a number of charities is increasing cross-platform and cross-company work to identify and take action on emerging threats. Do you think there would be the level of trust necessary for cross-platform co-operation with your competitors in the light of reports that, for example, Facebook employed somebody to put out negative things about TikTok in the US? Do you think that cross-platform working will work in that environment?

Richard Earley: Yes; I think it is already working, in fact. Others on the panel mentioned a few areas in which we have been collaborating in terms of open-sourcing some of the technology we have produced. A few years ago, we produced a grooming classifier—a technology that allows people to spot potentially inappropriate interactions between adults and children—and we open-sourced that and enabled it to be used and improved on by anyone else who is building a social media network.

A number of other areas, such as PhotoDNA, have already been mentioned. An obvious one is the Global Internet Forum to Counter Terrorism, which is a forum for sharing examples of known terrorist content so that those examples can be removed from across the internet. All those areas have been priorities for us in the past. A valuable piece of the Bill is that Ofcom—from what I can see from the reports that it has been asked to make—will do a lot of work to understand where there are further opportunities for collaboration among companies. We will be very keen to continue being involved in that.
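The hash-sharing arrangements mentioned here (PhotoDNA, the Global Internet Forum to Counter Terrorism database) rest on sharing fingerprints of known items rather than the items themselves, with each platform matching uploads against the shared list. The sketch below is a generic illustration of that matching step using a toy fingerprint and a Hamming-distance threshold; it is not the PhotoDNA algorithm, and all values are invented.

```python
# Generic sketch of hash-list matching for known harmful media.
# The hashing scheme, hash values and threshold are toy examples, not PhotoDNA.

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# A shared list of fingerprints of known violating items (hypothetical values).
SHARED_HASH_LIST = {0b1011001011110000, 0b0001111000111100}

def matches_known_content(upload_hash: int, max_distance: int = 2) -> bool:
    """Return True if the upload's fingerprint is close to any shared fingerprint.

    Perceptual hashes tolerate small edits (resizing, re-encoding), so matching
    uses a distance threshold rather than exact equality.
    """
    return any(hamming_distance(upload_hash, h) <= max_distance
               for h in SHARED_HASH_LIST)

print(matches_known_content(0b1011001011110001))  # True: one bit from a known item
print(matches_known_content(0b0000000000000000))  # False: not near any known item
```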

Kirsty Blackman

Q I have a question for Katie on the algorithms that produce suggestions when you begin to type. It has been raised with me and in the evidence that we have received that when you begin to type, you might get a negative suggestion. If somebody types in, “Jews are”, the algorithm might come up with some negative suggestions. What has Google done about that?

Katie O'Donovan: We are very clear that we want the auto-suggestion, as we call it, to be a helpful tool that helps you find the information that you are looking for quicker—that is the core rationale behind the search—but we really do not want it to perpetuate hate speech or harm for protected individuals or wider groups in society. We have changed the way that we use that auto-complete, and it will not auto-complete to harmful suggestions. That is a live process that we review and keep updated. Sometimes terminology, vernacular or slang change, or there is a topical focus on a particular group of people, so we keep it under review. But by our policy and implementation, those auto-suggestions should very much not be happening on Google search.

Kirsty Blackman

Q Would it be technically possible for all of the protected characteristics, for example, to have no auto-complete prompts come up?

Katie O'Donovan: That is an excellent question on where you do not want protections and safety to minimise user or individual impact. If you wanted a protected characteristic for Jewish people, for example, we see that as really important, and we should remove the ability for hate speech. If you wanted to do that for a Jewish cookbook, Jewish culture or Jewish history, and we removed everything, you would really minimise the amount of content that people could access.

The Bill is vital and will have an incredibly significant impact on internet access in the UK, but that is where it is really important to get the balance and nuance right. Asking an algorithm to do something quite bluntly might look at first glance like it will really improve safety, but when you dig into it, you end up with the available information being much less sophisticated, less impactful and less full, which I think nobody really wants—either for the user or for those protected groups.

Kirsty Blackman

Q Would it not be easier to define all the protected characteristics and have a list of associated words than to define every possible instance of hate speech in relation to each?

Katie O'Donovan: The way we do it at the moment is through algorithmic learning. That is the most efficient way to do it because we have millions of different search terms, a large number of which we see for the very first time every single day on Google search. We rarely define things with static terms. We use our search rater guidelines—a guide of about 200 pages—to determine how those algorithms work and make sure that we have a dynamic ability to restrict them.

That means that you do not achieve perfection, and there will be changes and new topical uses that we perhaps did not anticipate—we make sure that we have enough data incoming to adjust to that. That is the most efficient way of doing it, and making sure that it has the nuance to stop the bad autocomplete but give access to the great content that we want people to get to.
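To make the trade-off described here concrete, the sketch below filters individual auto-complete candidates rather than suppressing every completion for a protected-group prefix, so that benign queries such as "jewish cookbook" remain available. The word lists and the stand-in classifier are invented for illustration; Google's actual system relies on learned models and its search rater guidelines rather than static lists.

```python
# Illustrative sketch: filter auto-complete suggestions rather than blocking
# every completion for a protected-group prefix. Names and lists are hypothetical.

PROTECTED_TERMS = {"jews", "jewish", "muslims", "women"}                 # toy list
HARMFUL_PATTERNS = {"are ruining", "should be banned", "are inferior"}   # toy list

def is_harmful(suggestion: str) -> bool:
    """Stand-in for a learned classifier over candidate completions."""
    text = suggestion.lower()
    return any(pattern in text for pattern in HARMFUL_PATTERNS)

def filter_suggestions(prefix: str, candidates: list[str]) -> list[str]:
    """Drop harmful completions; keep benign ones even for protected-group prefixes."""
    touches_protected = any(term in prefix.lower() for term in PROTECTED_TERMS)
    if not touches_protected:
        return candidates
    return [c for c in candidates if not is_harmful(c)]

print(filter_suggestions("jewish", ["jewish cookbook", "jewish history",
                                    "jewish people are ruining everything"]))
# ['jewish cookbook', 'jewish history']
```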

The Chair

Thank you very much. Ms Foreman, do you want to add anything to that? You do not have to.

Becky Foreman: I do not have anything to add.

Barbara Keeley (Worsley and Eccles South) (Lab)

Q I want to come back to transparency, which we touched on with my colleague Alex Davies-Jones earlier. Clearly, it is very important, and I think we could take a big step forward with the Bill. I want to ask you about child risk assessments, and whether they should be available publicly. I also want to ask about reports on the measures that you will have to take, as platforms, to manage the risks and mitigate the impact of harm. Harm is occurring at the moment—for example, content that causes harm is being left up. We heard earlier from the NSPCC that Facebook would not take down birthday groups for eight, nine and 10-year-old children, when it is known what purpose those birthday groups were serving for those young children. I guess my question on transparency is, “Can’t you do much better, and should there be public access to reports on the level of harm?”

Richard Earley: There are quite a few different questions there, and I will try to address them as briefly as I can. On the point about harmful Facebook groups, if a Facebook group is dedicated to breaking any of our rules, we can remove that group, even if no harmful content has been posted in it. I understand that was raised in the context of breadcrumbing, so trying to infer harmful intent from innocuous content. We have teams trying to understand how bad actors circumvent our rules, and to prevent them from doing that. That is a core part of our work, and a core part of what the Bill needs to incentivise us to do. That is why we have rules in place to remove groups that are dedicated to breaking our rules, even if no harmful content is actually posted in them.

On the question you asked about transparency, the Bill does an admirable job of trying to balance different types of transparency. There are some kinds of transparency that we believe are meaningful and valid to give to users. I gave the example a moment ago of explaining why a piece of content was removed and which of our community standards it broke. There is other transparency that we think is best given in a more general sense. We have our transparency report, as I said, where we give the figures for how much content we remove, how much of it we find ourselves—

Barbara Keeley

Q I am not talking here about general figures for what you have removed. I am talking about giving real access to the data on the risks of harm and the measures to mitigate harm. You could make those reports available to academics—we could find a way of doing that—and that would be very valuable. Surely what we want to do is to generate communities, including academics and people who have the aim of improving things, but you need to give them access to the data. You are the only ones who have access to the data, so it will just be you and Ofcom. A greater community out there who can help to improve things will not have that access.

Richard Earley: I completely agree. Apologies for hogging more time, but I think you have hit on an important point there, which is about sharing information with researchers. Last year, we gave data to support the publishing of more than 400 independent research projects, carried out along the lines you have described here. Just yesterday, we announced an expansion of what is called our Facebook open research tool, which expands academics’ ability to access data about advertising.

Barbara Keeley

Q My question is, will you publish the risk assessment and the measures you are taking to mitigate?

Richard Earley: Going back to how the Bill works, when it comes to—

Barbara Keeley

No, I am not just asking about the Bill. Will you do that?

Richard Earley: We have not seen the Ofcom guidance on what those risk assessments should contain yet, so it is not possible to say. I think more transparency should always be the goal. If we can publish more information, we will do so.

Barbara Keeley

Q It would be good to have that goal. Can I come to you, Katie O’Donovan?

Katie O'Donovan: To begin with, I would pick up on the importance of transparency. We at Google and YouTube publish many reports on a quarterly or annual basis to help understand the actions we are taking. That ranges from everything on YouTube, where we publish by country the content we have taken down, why we have taken it down, how it was detected and the number of appeals. That is incredibly important information. It is good for researchers and others to have access to that.

We also do things around ads that we have removed and legal requests from different foreign Governments, which again has real validity. I think it is really important that Ofcom will have access to how we work through this—

Barbara Keeley

Q I was not just asking about Ofcom; I was wanting to go further than that and have wider access.

Katie O'Donovan: I do not want to gloss over the Ofcom point; I want to dwell on it for a second. In anticipation of this Bill, we were able to have conversations with Ofcom about how we work, the risks that we see and how our systems detect that. Hopefully, that is very helpful for Ofcom to understand how it will audit and regulate us, but it also informs how we need to think and improve our systems. I do think that is important.

We make a huge amount of training data available at Google. We publish a lot of shared APIs to help people understand what our data is doing. We are very open to publishing and working with academics.

It is difficult to give a broad statement without knowing the detail of what that data is. One thing I would say—it always sounds a bit glib when people in my position say this—is that, in some cases, we do need to be limited in explaining exactly how our systems work to detect bad content. On YouTube, you have very clear community guidelines, which we know we have to publish, because people have a right to know what content is allowed and what is not, but we will find people who go right up to the line of that content very deliberately and carefully—they understand that, almost from a legal perspective. When it comes to fraudulent services and our ads, we have also seen people pivot the way that they attempt to defraud us. There need to be some safe spaces to share that information. Ofcom is helpful for that too.

The Chair

Okay. Kim Leadbeater, one very quick question. We must move on—I am sorry.

Kim Leadbeater (Batley and Spen) (Lab)

Q Okay, I will try to be very quick. The draft Bill contained a proposed new media literacy duty. That seems to have now disappeared. What are your digital media literacy strategies?

Becky Foreman: We have a range of strategies. One thing I would point to is research that we conduct every year and have done for a number of years called the digital civility index. It is a set of research that speaks to teens and adults in a number of countries around the world to understand what harms they are concerned about online and to ascertain whether those harms are increasing or decreasing and how they vary between different geographies. That is one way in which we are trying to make more data and information available to the general public about the type of harms they might come across online and whether they are increasing or decreasing.

Richard Earley: We have a range of different organisations that we work with in the UK and internationally. One that I would like to draw attention to is the Economist Educational Foundation’s Burnet News Club. We have supported them with increased funding so that they can aim to reach 10% of all state schools with a really immersive and impressive programme that enables young people to understand digital literacy, digital numeracy and the media. We are also members of the media literacy taskforce of the Department for Digital, Culture, Media and Sport at the moment, which has been working to build on the strategy that the Government published.

Overall, there is a really important role for us as platforms to play here. We regularly commission and start new programmes in this space. What is also really important is to have more guidance from Government and civil society organisations that we work with on what is effective, so that we can know where we can put our resources and boost the greatest work.

Katie O'Donovan: Thank you for the question. It is really important. We were disappointed to see the literacy focus lost in the Bill.

We really take the issue seriously. We know there is an absolute responsibility for us when it comes to product, and an absolute responsibility when it comes to policy. Even within the safest products and with the most impressive and on-it parents, people can be exposed to content in ways that are surprising and shocking. That is why you need this holistic approach. We have long invested in a programme that we run with the non-governmental organisation Parent Zone called “Be internet legends”. When we developed that, we did it with the PSHE Association to make sure it was totally compliant with the national curriculum. We regularly review that to check that it is actually making a difference. We did some recent research with MORI and got some really good results back.

We used to deliver that programme face to face in schools up and down the country. Obviously, the pandemic stopped that. We went online and while we did not enjoy it quite as much, we were able to reach real scale and it was really effective. Along with doing the assemblies, which are now back in person, we deliver a pack for teachers so they can also take that up at scale. We run similar programmes through YouTube with teenagers. It is absolutely incumbent on us to do more, but it must be part of the debate, because if you rely just on technological solutions, you will end up reducing access to lawful information, with some of the harms still being prevalent and people not having the skills to navigate them.

The Chair

I am sorry, but I must move on. Minister, I am afraid you only have five minutes.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Welcome to the Committee’s proceedings and thank you for joining us this afternoon. I would like to start on the question of the algorithmic promotion of content. Last week, I met with the Facebook whistleblower, Frances Haugen, who spoke in detail about what she had found when working for Facebook, so I will start with you, Richard. On the question of transparency, which other Members of the Committee have touched on, would you have any objection to sharing all the information you hold internally with trusted researchers?

Richard Earley: What information are you referring to?

Chris Philp

Data, in particular on the operation of algorithmic promotion of particular kinds of content.

Richard Earley: We already do things like that through the direct opportunity that anyone has to see why a single post has been chosen for them in their feed. You can click on the three dots next to any post and see that. For researcher access and support, as I mentioned, we have contributed to the publishing of more than 400 reports over the last year, and we want to do more of that. In fact, the Bill requires Ofcom to conduct a report on how to unlock those sorts of barriers, which we think should be done as soon as possible. Yes, in general we support that sort of research.

I would like to say one thing, though. I have worked at Facebook—now Meta—for almost five years, and nobody at Facebook has any incentive, moral or otherwise, to do anything other than provide people with the best, most positive experience on our platform, because we know that if we do not give people a positive experience, through algorithms or anything else, they will leave our platform and will not use it. They tell us that and they do it, and the advertisers who pay for our services do not want to see that harmful content on our platforms either. All of our incentives are aligned with yours, which are to ensure that our users have a safe and positive experience on our platforms.

Chris Philp

Q Yet the algorithms that select particular content for promotion are optimised for user engagement—views, likes and shares—because that increases user stickiness and keeps them on the site for longer. The evidence seems to suggest that, despite what people say in response to the surveys you have just referenced, what they actually interact with the most—or what a particular proportion of the population chooses to interact with the most—is content that would be considered in some way extreme, divisive, or so on, and that the algorithms, which are optimised for user engagement, notice that and therefore uprank that content. Do you accept that your algorithms are optimised for user engagement?

Richard Earley: I am afraid to say that that is not correct. We have multiple algorithms on our services. Many of them, in fact, do the opposite of what you have just described: they identify posts that might be violent, misleading or harmful and reduce the prevalence of them within our feed products, our recommendation services and other parts of the service.

We optimise the algorithm that shows people things for something called meaningful social interaction. That is not just pure engagement; in fact, its focus—we made a large change to our algorithms in 2018 to focus on this—is on the kinds of activities online that research shows are correlated with positive wellbeing outcomes. Joining a group in your local area or deciding to go to an event that was started by one of your friends—that is what our algorithms are designed to promote. In fact, when we made that switch in 2018, we saw a decrease in more than 50 million hours of Facebook use every day as a result of that change. That is not the action of a company that is just focused on maximising engagement; it is a company that is focused on giving our users a positive experience on our platform.
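As an illustration of the distinction being drawn here between raw engagement and "meaningful social interaction", the sketch below scores posts with weights that favour interactions such as friends' comments, event responses and group joins over raw views, and demotes items that a separate classifier flags as likely harmful. The weights, field names and demotion factor are hypothetical; Meta has not published its ranking function in this form.

```python
# Illustrative ranking sketch: weight "meaningful" interactions above raw
# engagement and demote likely-harmful items. All weights are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    views: int
    friend_comments: int
    event_responses: int
    group_joins: int
    predicted_harm: float  # 0..1 score from a separate integrity classifier

WEIGHTS = {"views": 0.001, "friend_comments": 5.0,
           "event_responses": 8.0, "group_joins": 8.0}
HARM_DEMOTION = 0.1  # multiply the score by this if harm probability is high

def rank_score(post: Post) -> float:
    score = (WEIGHTS["views"] * post.views
             + WEIGHTS["friend_comments"] * post.friend_comments
             + WEIGHTS["event_responses"] * post.event_responses
             + WEIGHTS["group_joins"] * post.group_joins)
    if post.predicted_harm > 0.8:
        score *= HARM_DEMOTION
    return score

viral_but_flagged = Post(views=100_000, friend_comments=2, event_responses=0,
                         group_joins=0, predicted_harm=0.95)
local_event = Post(views=300, friend_comments=12, event_responses=6,
                   group_joins=1, predicted_harm=0.01)
print(rank_score(viral_but_flagged) < rank_score(local_event))  # True under these weights
```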

Chris Philp

Q You have alluded to some elements of the algorithmic landscape, but do you accept that the dominant feature of the algorithm that determines which content is most promoted is based on user engagement, and that the things you have described are essentially second-order modifications to that?

Richard Earley: No, because as I just said, when we gave the algorithm this instruction to focus on meaningful social interaction, it actually decreased the amount of time people spent on our platform.

Chris Philp

Q It might have decreased it, but the meaningful social interaction score is, as you said, not exclusively but principally based on user engagement, isn’t it?

Richard Earley: As I said, it is about ensuring that people who spend time on our platform come away feeling that they have had a positive experience.

Chris Philp

Q That does not quite answer the question.

Richard Earley: I think that a really valuable part of the Bill that we are here to discuss is the fact that Ofcom will be required, and we in our risk assessments will be required, to consider the impact on the experience of our users of multiple different algorithms, of which we have hundreds. We build those algorithms to ensure that we reduce the prevalence of harmful content and give people the power to connect with those around them and build community. That is what we look forward to demonstrating to Ofcom when this legislation is in place.

Chris Philp

Q Yes, but in her testimony to, I think, the Joint Committee and the US Senate, in a document that she released to The Wall Street Journal, and in our conversation last week, Frances Haugen suggested that the culture inside Facebook, now Meta, is that measures that tend to reduce user engagement do not get a very sympathetic hearing internally. However, I think we are about to run out of time. I have one other question, which I will direct, again, to Richard. Forgive me, Katie and Becky, but it is probably most relevant for Meta.

The Chair

Q Just one moment, please. Is there anything that the other witnesses need to say about this before we move on? It will have to be very brief.

Katie O'Donovan: I welcome the opportunity to address the Committee. It is so important that this Bill has parliamentary scrutiny. It is a Bill that the DCMS has spent a lot of time on, getting it right and looking at the systems and the frameworks. However, it will lead to a fundamentally different internet for UK users versus the rest of the world. It is one of the most complicated Bills we are seeing anywhere in the world. I realise that it is very important to have scrutiny of us as platforms to determine what we are doing, but I think it is really important to also look at the substance of the Bill. If we have time, I would welcome the chance to give a little feedback on the substance of the Bill too.

Becky Foreman: I would add that the Committee spent a lot of time talking to Meta, who are obviously a big focus for the Bill, but it is important to remember that there are numerous other networks and services that potentially will be caught by the Bill and that are very different from Meta. It is important to remember that.

Chris Philp

While the Bill is proportionate in its measures, it is not designed to impose undue burdens on companies that are not high risk. I have one more question for Richard. I think Katie was saying that she wanted to make a statement?

The Chair

We are out of time. I am sorry about this; I regard it as woefully unsatisfactory. We have got three witnesses here, a lot of questions that need to be answered, and not enough time to do it. However, we have a raft of witnesses coming in for the rest of the day, so I am going to have to draw a line under this now. I am very grateful to you for taking the trouble to come—the Committee is indebted to you. You must have the opportunity to make your case. Would you be kind enough to put any comments that you wish to make in writing so that the Committee can have them? Feel free to go as broad as you would like, because I feel very strongly that you have been short-changed this afternoon. We are indebted to you. Thank you very much indeed.

Richard Earley: We will certainly do that and look forward to providing comments in writing.

Examination of Witnesses

Professor Clare McGlynn, Jessica Eagelton and Janaya Walker gave evidence.

14:48
The Chair

Good afternoon. We now hear oral evidence from Professor Clare McGlynn, professor of law at Durham University, Jessica Eagelton, policy and public affairs manager at Refuge, and Janaya Walker, public affairs manager at End Violence Against Women. Ladies, thank you very much for taking the trouble to join us this afternoon. We look forward to hearing from you.

Alex Davies-Jones

Q Thank you, Sir Roger, and thank you to the witnesses for joining us. We hear a lot about the negative experiences online of women, particularly women of colour. If violence against women and girls is not mentioned directly in the Bill, if misogyny is not made a priority harm, and if the violence against women and girls code of practice is not adopted in the Bill, what will that mean for the experience of women and girls?

Janaya Walker: Thank you for the opportunity to speak today. As you have addressed there, the real consensus among violence against women and girls organisations is for VAWG to be named in the Bill. The concern is that without that, the requirements that are placed on providers of regulated services will be very narrowly tied to the priority illegal content in schedule 7, as well as other illegal content.

We are very clear that violence against women and girls is part of a continuum with a really broad manifestation of behaviour; some reaches a criminal threshold, but other behaviour is important to understand as part of the wider context. Much of the abuse that women and girls face cannot be understood by looking only through a criminal lens. We have to think about the relationship between the sender and the recipient—if it is an ex-partner, for example—the severity of the abuse they have experienced, the previous history and also the reach of the content. The worry is that the outcome of the Bill will be a missed opportunity in terms of addressing something that the Government have repeatedly committed to as a priority.

As you mentioned, we have worked with Refuge, Clare McGlynn, the NSPCC and 5Rights, bringing together our expertise to produce this full code of practice, which we think the Bill should be amended to include. The code of practice would introduce a cross-cutting duty that tries to mitigate this kind of pocketing of violence against women and girls into those three categories, to ensure that it is addressed really comprehensively.

Alex Davies-Jones

Q To what extent do you think that the provisions on anonymity will assist in reducing online violence against women and girls? Will the provisions currently in the Bill make a difference?

Janaya Walker: I think it will be limited. For the End Violence Against Women Coalition, our priority above all else is having a systems-based approach. Prevention really needs to be at the heart of the Bill. We need to think about the choices that platforms make in the design and operation of their services in order to prevent violence against women and girls in the first instance.

Anonymity has a place in the sense of providing users with agency, particularly in a context where a person is in danger and they could take that step in order to mitigate harm. There is a worry, though, when we look at things through an intersectional lens—thinking about how violence against women and girls intersects with other forms of harm, such as racism and homophobia. Lots of marginalised and minoritised people rely very heavily on being able to participate online anonymously, so we do not want to create a two-tier system whereby some people’s safety is contingent on them being a verified user, which is one option available. We would like the focus to be much more on prevention in the first instance.

The Chair

Professor McGlynn and Ms Eagelton, you must feel free to come in if you wish to.

Alex Davies-Jones

Q My final question is probably directed at you, Professor McGlynn. Although we welcome the new communications offence of cyber-flashing, one of the criticisms is that it will not actually make a difference because of the onus on proving the sender’s intent to cause harm, rather than on whether the recipient consented to receive the material. How do you respond to that?

Professor Clare McGlynn: I think it is great that the Government have recognised the harms of cyber-flashing and put that into the Bill. In the last couple of weeks we have had the case of Gaia Pope, a teenager who went missing and died—an inquest is currently taking place in Dorset. The case has raised the issue of the harms of cyber-flashing, because in the days before she went missing she was sent indecent images that triggered post-traumatic stress disorder from a previous rape. On the day she went missing, her aunt was trying to report that to the police, and one of the police officers was reported as saying that she was “taking the piss”.

What I think that case highlights, interestingly, is that this girl was triggered by receiving these images, and it triggered a lot of adverse consequences. We do not know why that man sent her those images, and I guess my question would be: does it actually matter why he sent them? Unfortunately, the Bill says that why he sent them does matter, despite the harm it caused, because it would only be a criminal offence if it could be proved that he sent them with the intention of causing distress or for sexual gratification and being reckless about causing distress.

That has two main consequences. First, it is not comprehensive, so it does not cover all cases of cyber-flashing. The real risk is that a woman, having seen the headlines and heard the rhetoric about cyber-flashing being criminalised, might go to report it to the police but will then be told, “Actually, your case of cyber-flashing isn’t criminal. Sorry.” That might just undermine women’s confidence in the criminal justice system even further.

Secondly, this threshold of having to prove the intention to cause distress is an evidential threshold, so even if you think, as might well be the case, that he sent the image to cause distress, you need the evidence to prove it. We know from the offence of non-consensual sending of sexual images that it is that threshold that limits prosecutions, but we are repeating that mistake here with this offence. So I think a consent-based, comprehensive, straightforward offence would send a stronger message and be a better message from which education could then take place.

The Chair

You are nodding, Ms Eagelton.

Jessica Eagelton: I agree with Professor McGlynn. Thinking about the broader landscape and intimate image abuse as well, I think there are some significant gaps. There is quite a piecemeal approach at the moment and issues that we are seeing in terms of enforcing measures on domestic abuse as well.

Mrs Maria Miller (Basingstoke) (Con)

Q Thank you to all the panellists; it is incredibly helpful to have you here. The strength of the Bill will really be underpinned by the strength of the criminal law that underpins it, and schedule 7 lists offences that relate to sexual images, including revenge pornography, as priority offences. Can the witnesses say whether they think the law is sufficient to protect women from having their intimate pictures shared without their consent, or indeed whether the Bill will do anything to prevent the making and sharing of deepfake images? What would you like to see?

Professor Clare McGlynn: You make a very good point about how, in essence, criminal offences are now going to play a key part in the obligations of platforms under this Bill. In general, historically, the criminal law has not been a friend to women and girls. The criminal law was not written, designed or interpreted with women’s harms in mind. That means that you have a very piecemeal, confusing, out-of-date criminal law, particularly as regards online abuse, yet that is the basis on which we have to go forward. That is an unfortunate place for us to be, but I think we can strengthen it.

We could strengthen schedule 7 by, for example, including trafficking offences. There are tens of thousands of cases of trafficking, as we know from yourselves and whistleblowers, that platforms could be doing so much more about, but that is not a priority offence. The Obscene Publications Act distribution of unlawful images offence is not included. That means that incest porn, for example, is not a priority offence; it could be if we put obscene publications in that provision. Cyber-flashing, which again companies could take a lot of steps to act against, is not listed as a priority offence. Blackmail—sexual extortion, which has risen exponentially during the pandemic—again is not listed as a priority offence.

Deepfake pornography is a rising phenomenon. It is not an offence in English law to distribute deepfake pornography at the moment. That could be a very straightforward, simple change in the Bill. Only a few words are needed. It is very straightforward to make that a criminal offence, thanks to Scots law, where it is actually an offence to distribute altered images. The way the Bill is structured means the platforms will have to go by the highest standard, so in relation to deepfake porn, it would be interpreted as a priority harm—assuming that schedule 7 is actually altered to include all the Scottish offences, and the Northern Irish ones, which are absent at the moment.

The deepfake example points to a wider problem with the criminal law on online abuse: the laws vary considerably across the jurisdictions. There are very different laws on down-blousing, deepfake porn, intimate image abuse, extreme pornography, across all the different jurisdictions, so among the hundreds of lawyers the platforms are appointing, I hope they are appointing some Scots criminal lawyers, because that is where the highest standard tends to be.

Mrs Miller

Q Would the other panellists like to comment on this?

Jessica Eagelton: I think something that will particularly help in this instance is having that broad code of practice; that is a really important addition that must be made to the Bill. Refuge is the largest specialist provider of gender-based violence services in the country. We have a specialist tech abuse team who specialise in technology-facilitated domestic abuse, and what they have seen is that, pretty consistently, survivors are being let down by the platforms. They wait weeks and weeks for responses—months sometimes—if they get a response at all, and the reporting systems are just not up to scratch.

I think it will help to have the broad code of practice that Janaya mentioned. We collaborated with others to produce a workable example of what that could look like, for Ofcom to hopefully take as a starting point if it is mandated in the Bill. That sets out steps to improve the victim journey through content reporting, for example. Hopefully, via the code of practice, a victim of deepfakes and other forms of intimate image abuse would be able to have a more streamlined, better response from platforms.

I would also like to say, just touching on the point about schedule 7, that from the point of view of domestic abuse, there is another significant gap in that: controlling and coercive behaviour is not listed, but it should be. Controlling and coercive behaviour is one of the most common forms of domestic abuse. It carries serious risk; it is one of the key aggravating factors for domestic homicide, and we are seeing countless examples of that online, so we think that is another gap in schedule 7.

The Chair

Ms Walker?

Janaya Walker: Some of these discussions almost reiterate what I was saying earlier about the problematic nature of this, in that so much of what companies are going to be directed to do will be tied only to the specific schedule 7 offences. There have been lots of discussions about how you respond to some harms that reach a threshold of criminality and others that do not, but that really contrasts with the best practice approach to addressing violence against women and girls, which is really trying to understand the context and all of the ways that it manifests. There is a real worry among violence against women and girls organisations about the minimal response to content that is harmful to adults and children, but will not require taking such a rigorous approach.

Having the definition of violence against women and girls on the face of the Bill allows us to retain those expectations on providers as technology changes and new forms of abuse emerge, because the definition is there. It is VAWG as a whole that we are expecting the companies to address, rather than a changing list of offences that may or may not be captured in criminal law.

Kirsty Blackman

Q Why is it important that we have this? Is this a big thing? What are you guys actually seeing here?

Jessica Eagelton: I can respond to that in terms of what we are seeing as a provider. Technology-facilitated domestic abuse is an increasing form of domestic abuse: technology is providing perpetrators with increasing ways to abuse and harass survivors. What we are seeing on social media is constant abuse, harassment, intimate image abuse, monitoring and hacking of accounts, but when it comes to the responses we are getting from platforms at the moment, while I acknowledge that there is some good practice, the majority experience of survivors is that platforms are not responding sufficiently to the tech abuse they are experiencing.

Our concern is that the Bill could be a really good opportunity for survivors of domestic abuse to have greater protections online that would mean that they are not forced to come offline. At the moment, some of the options being given to survivors are to block the perpetrator—which in some cases has a minimal impact when they can easily set up new fake accounts—or to come offline completely. First, that is not a solution that allows that person to maintain contact with others, stay online and take part in public debate. But secondly, it can actually escalate risk in some cases, because a perpetrator could resort to in-person forms of abuse. If we do not make some of these changes—I am thinking in particular about mandating a VAWG code of practice, and looking at schedule 7 and including controlling and coercive behaviour—the Bill is going to be a missed opportunity. Women and survivors have been waiting long enough, and we need to take this opportunity.

Janaya Walker: If I could add to that, as Jessica has highlighted, there is the direct harm to survivors in terms of the really distressing experience of being exposed to these forms of harm, or the harm they experience offline being exacerbated online, but this is also about indirect harm. We need to think about the ways in which the choices that companies are making are having an impact on the extent to which violence against women and girls is allowed to flourish.

As Jessica said, it impacts our ability to participate in online discourse, because we often see a mirroring online of what happens offline, in the sense that the onus is often on women to take responsibility for keeping themselves safe. That is the status quo we see offline, in terms of the decisions we make about what we are told to wear or where we should go as a response to violence against women and girls. Similarly, online, the onus is often on us to come offline or put our profiles on private, to take all those actions, or to follow up with complaints to various different companies that are not taking action. There is also something about the wider impact on society as a whole by not addressing this within the Bill.

Kirsty Blackman

Q How does the proposed code of practice—or, I suppose, how could the Bill—tackle intersectionality of harms?

Janaya Walker: This is a really important question. We often highlight the fact that, as I have said, violence against women and girls often intersects with other forms of discrimination. For example, we know from research that EVAW conducted with Glitch during the pandemic that black and minoritised women and non-binary people experience a higher proportion of abuse. Similarly, research done by Amnesty International shows that black women experience harassment at a rate 84% higher than that experienced by their white counterparts. It is a real focal point. When we think about the abuse experienced, we see the ways that people’s identities are impacted and how structural discrimination emerges online.

What we have done with the code of practice is try to introduce requirements for the companies to think about things through that lens, so having an overarching human rights and equalities framework and having the Equality Act protected characteristics named as a minimum. We see in the Bill quite vague language when it comes to intersectionality; it talks about people being members of a certain group. We do not have confidence that these companies, which are not famed for their diversity, will interpret that in a way that we regard as robust—thinking very clearly about protected characteristics, human rights and equalities legislation. The vagueness in the Bill is quite concerning. The code of practice is an attempt to be more directive on what we want to see and how to think through issues in a way that considers all survivors, all women and girls.

Professor Clare McGlynn: I wholly agree. The code of practice is one way by which we can explain in detail those sorts of intersecting harms and what companies and platforms should do, but I think it is vital that we also write it into the Bill. For example, on the definitions around certain characteristics and certain groups, in previous iterations reference was made to protected characteristics. I know certain groups can go wider than that, but naming those protected characteristics is really important, so that they are front and centre and the platforms know that that is exactly what they have to cover. That will cover all the bases and ensure that that happens.

Kirsty Blackman

I have a quite specific question on something that is a bit tangential.

The Chair

Last one, please.

Kirsty Blackman

Q If someone has consented to take part in pornography and they later change their mind and would like it to be taken down, do you think they should have the right to ask a porn website, for example, to take it down?

Professor Clare McGlynn: That is quite challenging not only for pornography platforms but for sex workers, in that if you could participate in pornography but at any time thereafter withdraw your consent, it is difficult to understand how a pornography company and the sex worker would be able to make a significant amount of money. The company would be reluctant to invest because it might have to withdraw the material at any time. In my view, that is a quite a challenge. I would not go down that route, because what it highlights is that the industry can be exploitative and that is where the concern comes from. I think there are other ways to deal with an exploitative porn industry and other ways to ensure that the material online has the full consent of participants. You could put some of those provisions into the Bill—for example, making the porn companies verify the age and consent of those who are participating in the videos for them to be uploaded. I think that is a better way to deal with that, and it would ensure that sex workers themselves can still contract to perform in porn and sustain their way of life.

Kim Leadbeater

Q Thank you very much—this is extremely interesting and helpful. You have covered a lot of ground already, but I wonder whether there is anything specific you think the Bill should be doing more about, to protect girls—under-18s or under-16s—in particular?

Janaya Walker: A lot of what we have discussed in terms of naming violence against women and girls on the face of the Bill includes children. We know that four in five offences of sexual communications with a child involved girls, and a lot of child abuse material is targeted at girls specifically. The Bill as a whole takes a very gender-neutral approach, which we do not think is helpful; in fact, we think it is quite harmful to trying to reduce the harm that girls face online.

This goes against the approach taken in the Home Office violence against women and girls strategy and its domestic abuse plan, as well as the gold-standard treaties the UK has signed up to, such as the Istanbul convention, which we signed and have recently committed to ratifying. The convention states explicitly that domestic laws, including on violence against women and girls online, need to take a very gendered approach. Currently, it is almost implied, with references to specific characteristics. We think that in addressing the abuse that girls, specifically, experience, we need to name girls. To clarify, the words “women”, “girls”, “gender” and “sex” do not appear in the Bill, and that is a problem.

Jessica Eagelton: May I add a point that is slightly broader than your question? Another thing that the Bill does not do at the moment is provide for specialist victim support for girls who are experiencing online abuse. There has been some discussion about taking a “polluter pays” approach; where platforms are not compliant with the duties, for example, a percentage of the funds that go to the regulator could go towards victim support services, such as the revenge porn helpline and Refuge’s tech abuse team, that provide support to victims of abuse later on.

Professor Clare McGlynn: I can speak to pornography. Do you want to cover that separately, or shall I do that now?

Kim Leadbeater

That is fine.

Professor Clare McGlynn: I know that there was a discussion this morning about age assurance, which obviously targets children’s access to pornography. I would emphasise that age assurance is not a panacea for the problems with pornography. We are so worried about age assurance only because of the content that is available online. The pornography industry is quite happy with age verification measures. It is a win-win for them: they get public credibility by saying they will adopt it; they can monetise it, because they are going to get more data—especially if they are encouraged to develop age verification measures, which of course they have been; that really is putting the fox in charge of the henhouse—and they know that it will be easily evaded.

One of the most recent surveys of young people in the UK was of 16 and 17-year-olds: 50% of them had used a VPN, which avoids age verification controls, and 25% more knew about that, so 75% of those older children knew how to evade age assurance. This is why the companies are quite happy—they are going to make money. It will stop some people stumbling across it, but it will not stop most older children accessing pornography. We need to focus on the content, and when we do that, we have to go beyond age assurance.

You have just heard Google talking about how it takes safety very seriously. Rape porn and incest porn are one click away on Google. They are freely and easily accessible. There are swathes of that material on Google. Twitter is hiding in plain sight, too. I know that you had a discussion about Twitter this morning. I, like many, thought, “Yes, I know there is porn on Twitter,” but I must confess that until doing some prep over the last few weeks, I did not know the nature of that porn. For example, “Kidnapped in the wood”; “Daddy’s little girl comes home from school; let’s now cheer her up”; “Raped behind the bin”—this is the material that is on Twitter. We know there is a problem with Pornhub, but this is what is on Twitter as well.

As the Minister mentioned this morning, Twitter says you have to be 13, and you have to be 18 to try to access much of this content, but you just put in whatever date of birth is necessary—it is that easy—and you can get all this material. It is freely and easily accessible. Those companies are hiding in plain sight in that sense. The age verification and age assurance provisions, and the safety duties, need to be toughened up.

To an extent, I think this will come down to the regulator. Is the regulator going to accept Google’s SafeSearch as satisfying the safety duties? I am not convinced, because of the easy accessibility of the rape and incest porn I have just talked about. I emphasise that incest porn is not classed as extreme pornography, so it is not a priority offence, but there are swathes of that material on Pornhub as well. In one of the studies that I did, we found that one in eight titles on the mainstream pornography sites described sexually violent material, and the incest material was the highest category in that. There is a lot of that around.

Barbara Keeley

Q We are talking here about pornography when it is hosted on mainstream websites, as opposed to pornographic websites. Could I ask you to confirm what more, specifically, you think the Bill should do to tackle pornography on mainstream websites, as you have just been describing with Twitter? What should the Bill be doing here?

Professor Clare McGlynn: In many ways, it is going to be up to the regulator. Is the regulator going to deem that things such as SafeSearch, or Twitter’s current rules about sensitive information—which rely on the host to identify their material as sensitive—satisfy their obligations to minimise and mitigate the risk? That is, in essence, what it will all come down to.

Are they going to take the terms and conditions of Twitter, for example, at face value? Twitter’s terms and conditions do say that they do not want sexually violent material on there, and they even say that it is because they know it glorifies violence against women and girls, but this material is there and does not appear to get swiftly and easily taken down. Even when you try to block it—I tried to block some cartoon child sexual abuse images, which are easily available on there; you do not have to search very hard, they literally come up when you search for porn—it brings up five or six other options in case you want to report them as well, so you are viewing those as well. Just on the cartoon child sexual abuse images, before anyone asks, they are very clever, because they sit just under the radar of what is actually a prohibited offence.

It is not necessarily that there is more that the Bill itself could do, although the code of practice would ensure that they have to think about these things more. They have to report on their transparency and their risk assessments: for example, what type of content are they taking down? Who is making the reports, and how many are they upholding? But it is then on the regulator as to what they are going to accept as acceptable, frankly.

Barbara Keeley

Do any other panellists want to add to that?

Janaya Walker: Just to draw together the questions about pornography and the question you asked about children, I wanted to highlight one of the things that came up earlier, which was the importance of media literacy. We share the concern that media literacy has been rolled back from earlier versions of the draft Bill.

There has also been a shift, in that the draft Bill placed its emphasis on the impact of harm. That is really important when we are talking about violence against women and girls, and about what is happening in the context of schools and relationship and sex education, where some of these things, like non-consensual image sharing, take place. The Bill as currently drafted talks about media literacy and safe use of the service, rather than about the impact of such material, and does not really point to the collective responsibility that everyone has as good digital citizens—in the language of Glitch—when it comes to online violence against women and girls. That is an area in which the Bill could be strengthened from the way it is currently drafted.

Jessica Eagelton: I completely agree with the media literacy point. In general, we see very low awareness of what tech abuse is. We surveyed some survivors and did some research last year—a public survey—and almost half of survivors told no one about the abuse they experienced online at the hands of their partner or former partner, and many of the survivors we interviewed did not understand what it was until they had come to Refuge and we had provided them with support. There is an aspect of that to the broader media literacy point as well: increasing awareness of what is and is not unacceptable behaviour online, and encouraging members of the public to report that and call it out when they see it.

Barbara Keeley

Q Thank you. Can I ask for a bit more detail on a question that you touched on earlier with my colleague Kirsty Blackman? It is to Professor McGlynn, really. I think you included in your written evidence to the Committee a point about using age and consent verification for pornography sites for people featured in the content of the site—not the age verification or age assurance checks on the sites, but for the content. Could I just draw out from you whether that is feasible, and would it be retrospective for all videos, or just new ones? How would that work?

Professor Clare McGlynn: Inevitably, it would have to work from any time that that requirement was put in place, in reality. That measure is being discussed in the Canadian Parliament at the moment—you might know that Pornhub’s parent company, MindGeek, is based in Canada, which is why they are doing a lot of work in that regard. The provision was also put forward by the European Parliament in its debates on the Digital Services Act. Of course, any of these measures are possible; we could put it into the Bill that that will be a requirement.

Another way of doing it, of course, would be for the regulator to say that one of the ways in which Pornhub, for example—or XVideos or xHamster—should ensure that they are fulfilling their safety duties is by ensuring the age and consent of those featured in the videos that are uploaded. The flipside of that is that we could also introduce an offence of uploading a video and falsely representing that the person in the video had given their consent to that. That would mirror offences in the Fraud Act 2006.

The idea is really about introducing some element of friction so that there is a break before images are uploaded. For example, with intimate image abuse, which we have already talked about, the revenge porn helpline reports that for over half of the cases of such abuse that it deals with, the images go on to porn websites. So those aspects are really important. It is not just about all porn videos; it is also about trying to reduce the distribution of non-consensual videos.

Nick Fletcher (Don Valley) (Con)

Q I think that it would have been better to hear from you three before we heard from the platforms this morning. Unfortunately, you have opened my eyes to a few things that I wish I did not have to know about—I think we all feel the same.

I am concerned about VPNs. Will the Bill stop anyone accessing through VPNs? Is there anything we can do about that? I googled “VPNs” to find out what they were, and apparently there is a genuine need for them when using public networks, because it is safer. Costa Coffee suggests that people do so, for example. I do not know how we could work that.

You have obviously educated me, and probably some of my colleagues, about some of the sites that are available. I do not mix in circles where I would be exposed to that, but obviously children and young people do and there is no filter. If I did know about those things, I would probably not speak to my colleagues about it, because that would probably not be a good thing to do, but younger people might think it is quite funny to talk about. Do you think there is an education piece there for schools and parents? Should these platforms be saying to them, “Look, this is out there, even though you might not have heard of it—some MPs have not heard of it”? We ought to be doing something to protect children by telling parents what to look out for. Could there be something in the Bill to force them to do that? Do you think that would be a good idea? There is an awful lot there to answer—sorry.

Professor Clare McGlynn: On VPNs, I guess it is like so much technology: obviously it can be used for good, but it can also be used to evade regulations. My understanding is that individuals will be able to use a VPN to avoid age verification. On that point, I emphasise that in recent years Pornhub was developing its own VPN app at the very time it was talking to the Government about developing age verification—while saying, “Of course we will comply with your age verification rules.”

Don’t get me wrong: the age assurance provisions are important, because they will stop people stumbling across material, which is particularly important for the very youngest. In reality, 75% know about VPNs now, but once it becomes more widely known that this is how to evade it, I expect that all younger people will know how to do so. I do not think there is anything else you can do in the Bill, because you are not going to outlaw VPNs, for the reasons you identified—they are actually really important in some ways.

That is why the focus needs to be on content, because that is what we are actually concerned about. When you talk about media literacy and understanding, you are absolutely right, because we need to do more to educate all people, including young people—it does not just stop at age 18—about the nature of the pornography and the impact it can have. I guess that goes to the point about media literacy as well. It does also go to the point about fully and expertly resourcing sex and relationships education in school. Pornhub has its own sex education arm, but it is not the sex education arm that I think many of us would want to be encouraging. We need to be doing more in that regard.

Nick Fletcher

Q This might sound like a silly question. Can we not just put age verification on VPN sites, so that you can only have VPN access if you have gone through age verification? Do you understand what I am saying?

Professor Clare McGlynn: I do. We are beginning to reach the limits of my technical knowledge.

Nick Fletcher

You have gone beyond mine anyway.

Professor Clare McGlynn: You might be able to do that through regulations on your phone. If you have a phone that is age-protected, you might not be able to download a particular VPN app, perhaps. Maybe you could do that, but people would find ways to evade that requirement as well. We have to tackle the content. That is why you need to tackle Google and Twitter as well as the likes of Pornhub.

Nick Fletcher

Can we have them back in, Sir Roger?

The Chair

Minister?

Chris Philp

Q Thank you, Sir Roger, and thank you to the witnesses for coming in and giving very clear, helpful and powerful evidence to the Committee this afternoon. On the question of age verification or age assurance that we have just spoken about, clause 11(14) of the Bill sets a standard in the legislation that will be translated into the codes of practice by Ofcom. It says that, for the purposes of the subsection before on whether or not children can access a particular set of content, a platform is

“only entitled to conclude that it is not possible for children to access a service…if there are systems or processes in place…that achieve the result that children are not normally able to access the service”.

Ofcom will then interpret in codes of practice what that means practically. Professor McGlynn, do you think that standard set out there—

“the result that children are not normally able to access the service or that part of it”

—is sufficiently high to address the concerns we have been discussing in the last few minutes?

Professor Clare McGlynn: At the moment, the wording with regard to age assurance in part 5—the pornography providers—is slightly different, compared with the other safety duties. That is one technicality that could be amended. As for whether the provision you just talked about is sufficient, in truth I think it comes down, in the end, to exactly what is required, and of course we do not yet know what the nature of the age verification or age assurance requirements will actually be and what that will actually mean.

I do not know what that will actually mean for something like Twitter. What will they have to do to change it? In principle, that terminology is possibly sufficient, but it kind of depends in practice what it actually means in terms of those codes of practice. We do not yet know what it means, because all we have in the Bill is about age assurance or age verification.

Chris Philp

Q Yes, you are quite right that the Ofcom codes of practice will be important. As far as I can see, the difference between clauses 68 and 11(14) is that one uses the word “access” and the other uses the word “encounter”. Is that your analysis of the difference as well?

Professor Clare McGlynn: My understanding as well is that those terms are, at the moment, being interpreted slightly differently in terms of the requirements that people will be under. I am just making a point about it probably being easier to harmonise those terms.

Chris Philp

Q Thank you very much. I wanted to ask you a different question—one that has not come up so far in this session but has been raised quite frequently in the media. It concerns freedom of speech. This is probably for Professor McGlynn again. I am asking you this in your capacity as a professor of law. Some commentators have suggested that the Bill will have an adverse impact on freedom of speech. I do not agree with that. I have written an article in The Times today making that case, but what is your expert legal analysis of that question?

Professor Clare McGlynn: I read your piece in The Times this morning, which was a robust defence of the legislation, in that it said that it is no threat to freedom of speech, but I hope you read my quote tweet, in which I emphasised that there is a strong case to be made for regulation to free the speech of many others, including women and girls and other marginalised people. For example, the current lack of regulation means that women’s freedom of speech is restricted because we fear going online because of the abuse we might encounter. Regulation frees speech, while your Bill does not unduly limit freedom of speech.

Chris Philp

Q Okay, I take your second point, but did you agree with the point that the Bill as crafted does not restrict what you would ordinarily consider to be free speech?

Professor Clare McGlynn: There are many ways in which speech is regulated. The social media companies already make choices about what speech is online and offline. There are strengths in the Bill, such as the ability to challenge when material is taken offline, because that can impact on women and girls as well. They might want to put forward a story about their experiences of abuse, for example. If that gets taken down, they will want to raise a complaint and have it swiftly dealt with, not just left in an inbox.

There are lots of ways in which speech is regulated, and the idea of having a binary choice between free speech and no free speech is inappropriate. Free speech is always regulated, and it is about how we choose to regulate it. I would keep making the point that the speech of women and girls and other marginalised people is minimised at the moment, so we need regulation to free it. The House of Lords and various other reports about free speech and regulation, for example, around extreme pornography, talk about regulation as being human-rights-enhancing. That is the approach we need to take.

The Chair

Thank you very much indeed. Once again, I am afraid I have to draw the session to a close, and once again we have probably not covered all the ground we would have liked. Professor McGlynn, Ms Walker, Ms Eagelton, thank you very much indeed. As always, if you have further thoughts or comments, please put them in writing and let us know. We are indebted to you.

Examination of Witnesses

Lulu Freemont, Ian Stevenson and Adam Hildreth gave evidence.

15:32
The Chair

We will now hear oral evidence from Lulu Freemont, head of digital regulation at techUK; Ian Stevenson, the chairman of OSTIA; and Adam Hildreth, chief executive officer of Crisp, who is appearing by Zoom—and it works. Thank you all for joining us. I will not waste further time by asking you to identify yourselves, because I have effectively done that for you. Without further ado, I call Alex Davies-Jones.

Alex Davies-Jones

Q Thank you, Sir Roger; thank you, witnesses. We want the UK to become a world leader in tech start-ups. We want those employment opportunities for the future. Does this legislation, as it currently stands, threaten that ability?

Lulu Freemont: Hi everybody. Thank you so much for inviting techUK to give evidence today. Just to give a small intro to techUK, so that you know the perspective I am coming from, we are the trade body for the tech sector. We have roughly 850 tech companies in our membership, the majority of which are small and medium-sized enterprises. We are really focused on how this regime will work for the 25,000 tech companies that are set to be in scope, and our approach is really on the implementation and how the Bill can deliver on the objectives.

Thank you so much for the question. There are some definite risks when we think about smaller businesses and the Online Safety Bill. Today, we have heard a lot of the names that come up with regard to tech companies; they are the larger companies. However, this will be a regime that impacts thousands of different tech companies, with different functionalities and different roles within the ecosystem, all of which contribute to the economy in their own way.

There are specific areas to be addressed in the Bill, where there are some threats to innovation and investment by smaller businesses. First, greater clarity is needed. In order for this regime to be workable for smaller businesses, they need clarity on guidelines and on definitions, and they also need to be confident that the systems and processes that they put in place will be sustainable—in other words, the right ones.

Certain parts of the regime risk not having enough clarity. The first thing that I will point to is around the definitions of harm. We would very much welcome having some definitions of harmful content, or even categories of harmful content, in primary legislation. It might then be for Ofcom to determine how those definitions are interpreted within the codes, but having things to work off and types of harmful content for smaller businesses to start thinking about would be useful; obviously, that will be towards children, given that they are likely to be category 2.

The second risk for smaller businesses is really around the powers of the Secretary of State. I think there is a real concern. The Secretary of State will have some technical powers, which are pretty much normal; they are what you would expect in any form of regulation. However, the Online Safety Bill goes a bit further than that, introducing some amendment powers. So, the Secretary of State can modify codes of practice to align with public policy. In addition to that, there are provisions to allow the Secretary of State to set thresholds between the categories of companies.

Smaller businesses want to start forming a strong relationship with Ofcom and putting systems and processes in place that they can feel confident in. If they do not have that level of confidence, and if the regime could be changed at any point, they might not be able to progress with those systems and processes, and they might not be able to keep up with some of the larger companies that have been referenced in every conversation—which risks pushing them out of the market.

So, we need to think about proportionality, and we need to think about Ofcom’s independence and the kind of relationship that it can form with smaller businesses. We also need to think about balance. This regime is looking to strike a balance between safety, free speech and innovation in the UK’s digital economy. Let us just ensure that we provide enough clarity for businesses so that they can get going and have confidence in what they are doing.

Alex Davies-Jones

Q Thank you, Lulu. Adam and Ian, if either of you want to come in at any point, please just indicate that and I will bring you in.

The Chair

May I just apologise before we go any further, because I got you both the wrong way round? I am sorry. It is Mr Stevenson who is online and it is Adam Hildreth who is here in body and person.

Adam Hildreth: I think we have evolved as a world, actually, when it comes to online safety. If you went back five or 10 years, safety would have come after people had developed their app, their platform or whatever they were creating from a tech perspective. We are now in a world where safety, in various forms, has to be there by default. Moving on to your point, we have to understand what that means for different sizes of businesses. For me, the risk assessment is the critical part. The concern is putting blocks in front of people who are innovating and creating entrepreneurial businesses that make the online world a better place; putting those blocks in without them understanding whether they can compete in an open and fair market is where we do not want to be.

So, getting to the point where it is very easy to understand is important—a bit like where we got to in other areas, such as data protection and where we went with the GDPR. In the end, it became simplified; I will not use the word “simplified” ever again in relation to GDPR, but it did become simplified from where it started. It is really important for anyone developing any type of tech platform that the Online Safety Bill will affect that they understand exactly what they do and do not have to put in place; otherwise, they will be taken out just by not having a legal understanding of what is required.

The other point to add, though, is that there is a whole other side to online safety, which is the online safety tech industry. There are tons of companies in the UK and worldwide that are developing innovative technologies that solve these problems. So, there is a positive as well as an understanding of how the Bill needs to be created and publicised, so that people understand what the boundaries are, if you are a UK business.

The Chair

Mr Stevenson, you are nodding. Do you want to come in?

Ian Stevenson: I agree with the contributions from both Adam and Lulu. For me, one of the strengths of the Bill in terms of the opportunity for innovators is that so much is left to Ofcom to provide codes of practice and so on in the future, but simultaneously that is its weakness in the short term. In the absence of those codes of practice and definitions of exactly where the boundaries between merely undesirable and actually harmful and actionable might lie, the situation is very difficult. It is very difficult for companies like my own and the other members of the Online Safety Tech Industry Association, who are trying to produce technology to support safer experiences online, to know exactly what that technology should do until we know which harms are in scope and exactly what the thresholds are and what the definitions of those harms are. Similarly, it is very hard for anybody building a service to know what technologies, processes and procedures they will need until they have considerably more detailed information than they have at the moment.

I agree that there are certain benefits to having more of that in the Bill, especially when it comes to the harms, but in terms of the aspiration and of what I hear is the objective of the Bill—creating safer online experiences—we really need to understand when we are going to have much more clarity and detail from Ofcom and any other relevant party about exactly what is going to be seen as best practice and acceptable practice, so that people can put in place those measures on their sites and companies in the Online Safety Tech Industry Association can build the tools to help support putting those measures in place.

Alex Davies-Jones

Q Thank you all. Lulu, you mentioned concerns about the Secretary of State’s powers and Ofcom’s independence. Other concerns expressed about Ofcom include its ability to carry out this regulation. It is being hailed as the saviour of the internet by some people. Twenty-five thousand tech companies in the UK will be under these Ofcom regulations, but questions have been asked about its technical and administrative capacity to do this. Just today, there is an online safety regulator funding policy adviser role being advertised by the Department for Digital, Culture, Media and Sport. Part of the key roles and responsibilities are:

“The successful post holder will play a key role in online safety as the policy advisor on Funding for the Online Safety Regulator.”

Basically, their job is to raise money for Ofcom. Does that suggest concerns about the role of Ofcom going forward, its funding, and its resource and capacity to support those 25,000 platforms?

Lulu Freemont: It is a very interesting question. We really support Ofcom in this role. We think that it has a very good track record with other industries that are also in techUK’s membership, such as broadcasters. It has done a very good job at implementing proportionate regulation. We know that it has been increasing its capacity for some time now, and we feel confident that it is working with us as the trade and with a range of other experts to try to understand some of the detail that it will have to understand to regulate.

One of the biggest challenges—we have had this conversation with Ofcom as well—is to understand the functionalities of tech services. The same functionality might be used in a different context, and that functionality could be branded as very high risk in one context but very low risk in another. We are having those conversations now. It is very important that they are being had now, and we would very much welcome Ofcom publishing drafts. We know that is its intention, but it should bring forward everything that fills the gaps in this regulation that are left to Ofcom’s codes, guidance and various other documentation.

Adam Hildreth: One of the challenges that I hear a lot, and that we hear a lot at Crisp in our work, is that people think that the Bill will almost eradicate all harmful content everywhere. The challenge that we have with content is that every time we create a new technology or mechanism that defeats harmful or illegal content, the people who are creating it—they are referred to in lots of ways, but bad actors, ultimately—create another mechanism to do it. It is very unlikely that we will ever get to a situation in which it is eradicated from every platform forever—though I hope we do.

What is even harder for a regulator is to be investigating why a piece of content is on a platform. If we get to a position where people are saying, “I saw this bit of content; it was on a platform,” that will be a really dangerous place to be, because the funding requirement for any regulator will go off the charts—think about how much content we consume. I would much prefer to be in a situation where we think about the processes and procedures that a platform puts in place and making them appropriate, ensuring that if features are aimed at children, they do a risk assessment so that they understand how those features are being used and how they could affect children in particular—or they might have a much more diverse user group, whereby harm is much less likely.

So, risk assessments and, as Ian mentioned, technologies, processes and procedures—that is the bit that a regulator can do well. If your risk assessment is good and your technology, process and procedures are as good as they can be based on a risk assessment, that almost should mean that you are doing the best job you possibly can to stop that content appearing, but you are not eradicating it. It really worries me that we are in a position whereby people are going to expect that they will never see content on a platform again, even though billions of pieces of potentially harmful content could have been removed from those platforms.

Alex Davies-Jones

Q On that point, you mentioned that it is hard to predict the future and to regulate on the basis of what is already there. We have waited a long time for the Bill, and in that time we have had new platforms and new emerging technology appear. How confident are you that the Bill allows for future-proofing, in order that we can react to anything new that might crop up on the internet?

Adam Hildreth: I helped personally in 2000 and 2001, when online grooming did not even exist as a law, so I have been involved in this an awful long time, waiting for laws to exist. I do not think we will ever be in a situation in which they are future-proofed if we keep putting every possibility into law. There needs to be some principles there. There are new features launched every day, and assessments need to be made about who they pose a risk to and the level of risk. In the same way as you would do in all kinds of industries, someone should do an assessment from a health and safety perspective. From that, you then say, “Can we even launch it at all? Is it feasible? Actually, we can, because we can take this amount of risk.” Once they understand those risk assessments, technology providers can go further and develop technology that can combat this.

If we can get to the point where it is more about process and the expectations around people who are creating any types of online environments, apps or technologies, it will be future-proofed. If we start trying to determine exact pieces of content, what will happen is that someone will work out a way around it tomorrow, and that content will not be included in the Bill, or it will take too long to get through and suddenly, the whole principle of why we are here and why we are having this discussion will go out the window. That is what we have faced every day since 1998: every time the technology works out how to combat a new risk—whether that is to children, adults, the economy or society—someone comes along and works out a way around the technology or around the rules and regulations. It needs to move quickly; that will future-proof it.

The Chair

I have four Members plus the Minister to get in, so please be brief. I call Dean Russell.

Dean Russell

Q Thank you, Sir Roger. My question builds on the future-proofing. Obviously, the big focus now is the metaverse and a virtual reality world. My question has two parts. First, is the Bill helping already by encouraging the new start-ups in that space to put safety first? Secondly, do you agree that a Joint Committee of the Houses of Parliament that continued to look at the Act and its evolution over the long term once it had been passed would be beneficial? I will come to you first, Lulu.

Lulu Freemont: On future-proofing, one of the real strengths of the Bill is the approach: it is striving to rely on systems and processes, to be flexible and to adapt to future technologies. If the Bill sticks to that approach, it will have the potential to be future-proof. Some points in the Bill raise a slight concern about how future-proof the regulation will be. There is a risk that mandating specific technologies—I know that is one of Ofcom’s powers under the Bill—would put a bit of a timestamp on the regulation, because those technologies will likely become outdated at some point. Ensuring that the regulation remains flexible enough to build on the levels of risk that individual companies have, and on the technologies that work for the development and innovation of those individual companies, will be a really important feature, so we do have some concerns around the mandating of specific technologies in the Bill.

On the point about setting up a committee, one of the things that techUK has called for over a long time is an independent committee that could think about the current definitions of harm and keep them under review. As companies put in place systems and processes that might mitigate levels of risk of harm, will those levels of harm still be harmful? We need to constantly evolve the regime so that it is true to the harms and risks that are present today, and to evaluate it against human rights implications. Having some sort of democratically led body to think about those definitional points and evaluate them as times change and harm reduces through this regime would be very welcome.

Adam Hildreth: To add to that, are people starting to think differently? Yes, they definitely are. That ultimately, for me, is the purpose of the Bill. It is to get people to start thinking about putting safety as a core principle of what they do as an overall business—not just in the development of their products, but as the overall business. I think that will change things.

A lot of the innovation that comes means that safety is not there as the principal guiding aspect, so businesses do need some help. Once they understand how a particular feature can be exploited, or how it impacts certain demographics or particular age groups—children being one of them—they will look for solutions. A lot of the time, they have no idea before they create this amazing new metaverse, or this new metaverse game, that it could actually be a container for harmful content or new types of harm. I think this is about getting people to think. The risk assessment side is critical, for me—making sure they go through that process or can bring on experts to do that.

Ian Stevenson: I would split the future-proofing question into two parts. There is a part where this Bill will provide Ofcom with a set of powers, and the question will be: does Ofcom have the capacity and agility to keep up with the rate of change in the tech world? Assuming it does, it will be able to act fairly quickly. There is always a risk, however, that once a code of conduct gets issued, it becomes very difficult to update that code of conduct in a responsive way.

There is then a second piece, which is: are the organisations that are in scope of regulation, and the powers that Ofcom has, sufficient as things change? That is where the idea of a long-term committee to keep an eye on this is extremely helpful. That would be most successful if it did not compromise Ofcom’s independence by digging deeply into individual codes of conduct or recommendations, but rather focused on whether Ofcom has the powers and capacity that it needs to regulate as new types of company, platform and technology come along.

Dean Russell

Thank you.

Kirsty Blackman

Q My first question is for Lulu. Do small tech companies have enough staff with technical expertise to be able to fulfil their obligations under the Bill?

Lulu Freemont: It is a great question. One of the biggest challenges is capacity. We hear quite a lot from the smaller tech businesses within our membership that they will have to divert their staff away from existing work to comply with the regime. They do not have compliance teams, and they probably do not have legal counsel. Even at this stage, to try to understand the Bill as it is currently drafted—there are lots of gaps—they are coming to us and saying, “What does this mean in practice?” They do not have the answers, or the capability to identify that. Attendant regulatory costs are really fundamental: thinking about the staff you have and the cost involved, and making sure the regulation is proportionate, given the need to divert people away from business development or whatever other work the business might be doing.

Another real risk, and something in the Bill that smaller businesses are quite concerned about, is the potential proposal to extend the senior management liability provisions. We can understand them being in there to enable the regulators to do their job—information requests—but if there is any extension into individual pieces of content, coupled with a real lack of definitions, those businesses might find themselves in the position of restricting access to their services, removing too much content or feeling like they cannot comply with the regime in a proportionate way. That is obviously a very extreme case study. It will be Ofcom’s role to make sure that those businesses are being proportionate and understand the provisions, but the senior management liability does have a real, chilling impact on the smaller businesses within our membership.

Adam Hildreth: One of the challenges that we have seen over the last few years is that you can have a business that is small in revenue but has a huge global user base, with millions of users, so it is not really a small business; it just has not got to the point where it is getting advertisers and getting users to pay for it. I have a challenge on the definition of a small to medium-sized business. Absolutely, for start-ups with four people in a room—or perhaps even still just two—that do not have legal counsel or anything else, we need to make it simple for those types of businesses to ingest and understand what the principles are and what is expected of them. Hopefully they will be able to do quite a lot early on.

The real challenge comes when someone labels themselves as a small business but they have millions of users across the globe—and sometimes actually quite a lot of people working for them. Some of the biggest tech businesses in the world that we all use had tens of people working for them at one point in time, when they had millions of users. That is the challenge, because there is an expectation for the big-tier providers to be spending an awful lot of money, when the small companies are actually directly competing with them. There is a challenge in understanding the definition of a small business and whether that is revenue-focused, employee-focused or about how many users it has—there may be other metrics.

Ian Stevenson: One of the key questions is how much staffing this will actually take. Every business in the UK that processes data is subject to GDPR from day one. Few of them have a dedicated data protection officer from day one; it is a role or responsibility that gets taken on by somebody within the organisation, or maybe somebody on the board who has some knowledge. That is facilitated by the fact that there are a really clear set of requirements there, and there are a lot of services you can buy and consume that help you deliver compliance. If we can get to a point where we have codes of practice that make very clear recommendations, then even small organisations that perhaps do not have that many staff to divert should be able to achieve some of the basic requirements of online safety by buying in the services and expertise that they need. We have seen with GDPR that many of those services are affordable to small business.

If we can get the clarity of what is required right, then the staff burden does not have to be that great, but we should all remember that the purpose of the Bill is to stop some of the egregiously bad things that happen to people as a result of harmful content, harmful behaviours and harmful contact online. Those things have a cost in the same way that implementing data privacy has a cost. To come back to Lulu’s point, it has to be proportionate to the business.

Mrs Miller

Q Adam, you said a few moments ago that companies are starting to put safety at the core of what they do, which will be welcome to us all—maybe it should have happened a lot earlier. I know you have worked a lot in that area. Regulators and company owners will have to depend on an ethical culture in their organisations if the new regulations are to be abided by, because neither company owners nor regulators can micromanage. Will the Bill do enough to drive that ethical culture? If not, what more could it do or could the industry do? I would be really interested in everybody’s answer to this one, but I will start with Adam.

Adam Hildreth: What we are seeing from the people that are getting really good at this and that really understand it is that they are treating this as a proper risk assessment, at a very serious level, across the globe. When we are talking about tier 1s, they are global businesses. When they do it really well, they understand risk and how they are going to roll out systems, technology, processes and people in order to address that. That can take time. Yes, they understand the risk, who it is impacting and what they are going to do about it, but they still need to train people and develop processes and maybe buy or build technology to do it.

We are starting to see that work being done really well. It is done almost in the same way that you would risk assess anything else: corporate travel, health and safety in the workplace—anything. It should really become one of those pillars. All those areas I have just gone through are regulated. Once you have regulation there, it justifies why someone is doing a risk assessment, and you will get businesses and corporates going through that risk assessment process. We are seeing others that do not do the same level of risk assessment and they do not have that same buy-in.

Mrs Miller

Q Lulu, how do you drive a culture change?

Lulu Freemont: techUK’s membership is really broad. We have cyber and defence companies in our membership, and large platforms and telcos. We speak on behalf of the sector. We would say that there is a real commitment to safety and security.

To bring it back to regulation, the risk-based approach is very much the right one—one that we think has the potential to really deliver—but we have to think about the tech ecosystem and its diversity. Lots of techUK members are on the business-to-business side and are thinking about the role that they play in supporting the infrastructure for many of the platforms to operate. They are not entirely clear that they are exempt in the Bill. We understand that it is a very clear policy intention to exempt those businesses, but they do not have the level of legal clarity that they need to understand their role as access facilities within the tech ecosystem.

That is just one example of a part of the sector that you would not expect to be part of this culture change or regulation but which is being caught in it slightly as an unintended consequence of legal differences or misinterpretations. Coming from that wide-sector perspective, we think that we need clarity on those issues to understand the different functionalities, and each platform and service will be different in their approach to this stuff.

Mrs Miller

Q Ian, how do you drive a culture change in the sector?

Ian Stevenson: I think you have to look at the change you are trying to effect. For many people in the sector, there is a lack of awareness of what happens when safety is not considered first in building features. Even when you realise how many bad things can happen online, if you do not know what to do about it, you tend not to be able to do anything about it.

If we want to change culture—it is the same for individual organisations as for the sector as a whole—we have to educate people on what the problem is and give them the tools to feel empowered to do something about it. If you educate and empower people, you remove the barrier to change. In some places, an extremely ethical people-centric and safety-focused culture very naturally emerges, but in others, less so. That is precisely where making it a first-class citizen in terms of risk assessment for boards and management becomes so important. When people see management caring about things, that gets pushed out through the organisations.

Kim Leadbeater

Q In your view, what needs to be added or taken away from the Bill to help it achieve the Government’s aim of making the UK

“the safest place in the world to be online”?

Lulu Freemont: First, I want to outline that there are some strong parts in the Bill that the sector really supports. I think the majority of stakeholders would agree that the objectives are the right ones. The Bill tries to strike a balance between safety, free speech and encouraging innovation and investment in the UK’s digital economy. The approach—risk-based, systems-led and proportionate—is the right one for the 25,000 companies that are in scope. As it does not focus on individual pieces of content, it has the potential to be future-proof and to achieve longer-term outcomes.

The second area in the Bill that we think is strong is the prioritisation of illegal content. We very much welcome the clear definitions of illegal content on the face of the Bill, which are incredibly useful for businesses as they start to think about preparing for their risk assessment on illegal content. We really support Ofcom as the appropriate regulator.

There are some parts of the Bill that need specific focus and, potentially, amendments, to enable it to deliver on those objectives without unintended consequences. I have already mentioned a few of those areas. The first is defining harmful content in primary legislation. We can leave it to codes to identify the interpretations around that, but we need definitions of harmful content so that businesses can start to understand what they need to do.

Secondly, we need clarity that businesses will not be required to monitor every piece of content as a result of the Bill. General monitoring is prohibited in other regions, and we have concerns that the Online Safety Bill is drifting away from those norms. The challenges of general monitoring are well known: it encroaches on individual rights and could result in the over-removal of content. Again, we do not think that the intention is to require companies of all sizes to look at every piece of content on their site, but it might be one of the unintended consequences, so we would like an explicit prohibition of general monitoring on the face of the Bill.

We would like to remove the far-reaching amendment powers of the Secretary of State. We understand the need for technical powers, which are best practice within regulation, but taking those further so that the Secretary of State can amend the regime in such an extreme way to align with public policy is of real concern, particularly to smaller businesses looking to confidently put in place systems and processes. We would like some consideration of keeping senior management liability as it is. Extending that further is only going to increase the chilling effect it is having and the environment it is creating for UK investment. The final area, which I have just spoken about, is clarifying the scope. The business-to-business companies in our membership need clarity that they are not in scope and for that intention to be made clear on the face of the Bill.

We really support the Bill. We think it has the potential to deliver. There are just a few key areas that need to be changed or amended slightly to provide businesses with clarity and reassurances that the policy intentions are being delivered on.

Adam Hildreth: To add to that—Lulu has covered absolutely everything, and I agree—the critical bit is not monitoring individual pieces of content. Once you have done your risk assessment and put in place your systems, processes, people and technology, that is what people are signing up for. They are not signing up to an end assessment where, because one piece of harmful content is found to exist—or maybe many—they have failed to abide by what they really signed up to.

That is the worry from my perspective: that people do a full risk assessment, implement all the systems, put in place all the people, technology and processes that they need, do the best job they can and have understood what investment they are putting in, and someone comes along and makes a report to a regulator—Ofcom, in this sense—and says, “I found this piece of content there.” That may expose weaknesses, but the very best risk assessments are ongoing ones anyway, where you do not just put it away in a filing cabinet somewhere and say, “That’s done.” The definitions of online harms and harmful content change on a daily basis, even for the biggest social media platforms; they change all the time. There was talk earlier about child sexual abuse material that appears as cartoons, which would not necessarily be defined by certain legislation as illegal. Hopefully the legislation will catch up, but that is where that risk assessment needs to be made again, and policies may need to be changed and everything else. I just hope we do not get to the point where the individual monitoring of content, or individual content misses, becomes the measure of the Bill, rather than this overall approach to online safety.

The Chair

Thank you. I call the Minister.

Chris Philp

Q Thank you, Sir Roger, and thank you very much indeed for joining us for this afternoon’s session. Adam, we almost met you in Leeds last October or November, but I think you were off with covid at the time.

Adam Hildreth: I had covid at the time, yes.

Chris Philp

Covid struck. I would like to ask Adam and Ian in particular about the opportunities provided by emerging and new technology to deliver the Bill’s objectives. I would like you both to give examples of where you think new tech can help deliver these safety duties. I ask you to comment particularly on what it might do on, first, age assurance—which we debated in our last session—and secondly, scanning for child sexual abuse images in an end-to-end encrypted environment. Adam, do you want to go first?

Adam Hildreth: Well, if Ian goes first, the second question would be great for him to answer, because we worked on it together.

Chris Philp

Fair enough. Ian?

Ian Stevenson: Yes, absolutely. The key thing to recognise is that there is a huge and growing cohort of companies, around the world but especially in the UK, that are working on technologies precisely to try to support those kinds of safety measures. Some of those have been supported directly by the UK Government, through the safety tech challenge fund, to explore what can be done around end-to-end encrypted messaging. I cannot speak for all the participants, but I know that many of them are members of the safety tech industry association.

Between us, we have demonstrated a number of different approaches. My own company, Cyacomb, demonstrated technology that could block known child abuse within encrypted messaging environments without compromising the privacy of users’ messages and communications. Other companies in the UK, including DragonflAI and Yoti, demonstrated solutions based on detecting nudity and looking at the ages of the people in those images, which are again hugely valuable in this space. Until we know exactly what the regulation is going to demand, we cannot say exactly what the right technology to solve it is.

However, I think that the fact that that challenge alone produced five different solutions looking at the problem from different angles shows just how vibrant the innovation ecosystem can be. My background in technology is long and mixed, but I have seen a number of sectors emerge—including cyber-security and fintech—where, once the foundations for change have been created, the ability of innovators to come up with answers to difficult questions is enormous. The capacity to do that is enormous.

There are a couple of potential barriers to that. The strength of the regulation is that it is future proof. However, until we start answering the question, “What do we need to do and when? What will platforms need to do and when will they need to do it?” we do not really create in the commercial market the innovation drivers for the technical solutions that will deliver this. We do not create the drivers for investment. It is really important to be as specific as we can about what needs to be done and when.

The other potential barrier is regulation. We have already had a comment about how there should be a prohibition of general monitoring. We have seen what has happened in the EU recently over concerns about safety technologies that are somehow looking at traffic on services. We need to be really clear that, while safety technologies must protect privacy, there needs to be a mechanism so that companies can understand when they can deploy safety technologies. At the moment there are situations where we talk to potential customers for safety technologies and they are unclear as to whether it would be proportionate to deploy those under, for example, data protection law. There are areas, even within the safety tech challenge fund work on end-to-end encrypted messaging, where it was unclear whether some of the technologies—however brilliant they were at preventing child abuse in those encrypted environments—would be deployable under current data protection and privacy of electronic communications regulations.

There are questions there. We need to make sure that when the Online Safety Bill comes through, it makes clear what is required and how it fits together with other regulations to enable that. Innovators can do almost anything if you give them time and space. They need the certainty of knowing what is required, and an environment where solutions can be deployed and delivered.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Ian, thank you very much. I am encouraged by your optimism about what innovation can ultimately deliver. Adam, let me turn to you.

Adam Hildreth: I agree with Ian that the level of innovation is amazing. If we start talking about age verification and end-to-end encryptions, for me—I am going to say that same risk assessment phrase again—it absolutely depends on the type of service, who is using the service and who is exploiting the service, as to which safety technologies should be employed. I think it is dangerous to say, “We are demanding this type of technology or this specific technology to be deployed in this type of instance,” because that removes the responsibility from the people who are creating it.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Sorry to interject, but to be clear, the Bill does not do that. The Bill specifies the objectives, but it is tech agnostic. The manner of delivering those is, of course, not specified, either in the Bill or by Ofcom.

Adam Hildreth: Absolutely. Sorry, I was saying that I agree with how it has been worded. We know what is available, but technology changes all the time and solutions change all the time—we can do things in really innovative ways. However, the risk assessment has to bring together freedom of speech versus the types of users at risk of abuse. Is it children who are at risk, and if so, what are they at risk from? That changes the space massively when compared with some adult gaming communities, where what is harmful to them is very different from what harms other audiences. That should dictate for them what system and technology is deployed. Once we understand what best of breed looks like for those types of companies, we should know what good is.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you, Adam. We only have one minute left, so what is your prediction for the potential possibilities that emerging tech presents to deal with the issues of age assurance, which are difficult, and CSEA scanning, given end-to-end encrypted environments?

Adam Hildreth: The technology is there. It exists and it is absolutely deployable in the environments that need it. I am sure Ian would agree; we have seen it and done a lot of testing on it. The technology exists in the environments that need it.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Including inside the end-to-end encrypted environment, rather than just at the device level? Quite a few of the safety challenge solutions that Ian mentioned are at the device level; they are not inside the encryption.

Adam Hildreth: There are ways that can work. Again, it brings in freedom of expression, global businesses and some other areas, so it is more about regulation and consumer concerns about the security of data, rather than whether technological solutions are available.

None Portrait The Chair
- Hansard -

Ms Freemont, Mr Hildreth and Mr Stevenson, thank you all very much indeed. We have run out of time. As ever, if you have any further observations that you wish to make, please put them in writing and let the Committee have them; we shall welcome them. Thank you for your time this afternoon. We are very grateful to you.

Examination of Witnesses

Jared Sine, Nima Elmi and Dr Rachel O’Connell gave evidence.

16:16
None Portrait The Chair
- Hansard -

We are now going to hear from Jared Sine, who is the chief business affairs and legal officer at Match Group, and Nima Elmi, the head of public policy in Europe at Bumble, who is appearing by Zoom. Thank you for joining us. I hope you can hear us all right. Wave if you can.

Nima Elmi indicated assent.

None Portrait The Chair
- Hansard -

We also have Dr Rachel O’Connell, who is the CEO of TrustElevate. Good afternoon.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Does the Bill differentiate enough between services that have different business models? If not, what do you think are the consequences of the lack of differentiation, and where could more differentiation be introduced? Shall we start with you, Jared Sine?

Jared Sine: Sure—thank you for the question. Business models play a pretty distinct role in the incentives of the companies. When we talk to people about Match Group and online dating, we try to point out a couple of really important things that differentiate what we do in the dating space from what many technology companies are doing in the social media space. One of those things is how we generate our revenue. The overwhelming majority of it is subscription-based, so we are focused not on time on platform or time on device, but on whether you are having a great experience, because if you are, you are going to come back and pay again, or you are going to continue your subscription with us. That is a really big differentiator, in terms of the business model and where incentives lie, because we want to make sure they have a great experience.

Secondly, we know we are helping people meet in real life. Again, if people are to have a great experience on our platforms, they are going to have to feel safe on them, so that becomes a really big focus for us.

Finally, we are more of a one-to-one platform, so people are not generally communicating to large groups, so that protects us from a lot of the other issues you see on some of these larger platforms. Ultimately, what that means is that, for our business to be successful, we really have to focus on safety. We have to make sure users come, have a good, safe experience, and we have to have tools for them to use and put in place to empower themselves so that they can be safe and have a great experience. Otherwise, they will not come back and tell their friends.

The last thing about our platforms is that ultimately, if they are successful, our users leave them because they are engaged in a relationship, get married or just decide they are done with dating all together—that happens on occasion, too. Ultimately, our goal is to make sure that people have that experience, so safety becomes a core part of what we do. Other platforms are more focused on eyeballs, advertising sales and attention—if it bleeds, it leads—but those things are just not part of the equation for us.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q And do you think the Bill differentiates enough? If not, what more could be done in it?

Jared Sine: We are very encouraged by the Bill. We think it allows for different codes of conduct or policy, as it relates to the various different types of businesses, based on the business models. That is exciting for us because we think that ultimately those things need to be taken into account. What are the drivers and the incentives in place for those businesses? Let us make sure that we have regulations in place that address those needs, based on the approaches of the businesses.

None Portrait The Chair
- Hansard -

Nima, would you like go next?

Nima Elmi: Thank you very much for inviting me along to this discussion. Building on what Jared said, currently the Bill is not very clear in terms of references to categorisations of services. It clusters together a number of very disparate platforms that have different platform designs, business models and corporate aims. Similarly to Match Group, our platform is focused much more on one-to-one communications and subscription-based business models. There is an important need for the Bill to acknowledge these different types of platforms and how they engage with users, and to ensure appropriate guidance from Ofcom on how they should be categorised, rather than clustering together a rather significant number of companies that have very different business aims in this space.

None Portrait The Chair
- Hansard -

Dr O’Connell, would you like to answer?

Dr Rachel O'Connell: Absolutely. I think those are really good points that you guys have raised. I would urge a little bit of caution around that though, because I think about Yellow Tinder, which was the Tinder for teens, which has been rebranded as Yubo. It transgresses: it is a social media platform; it enables livestreaming of teens to connect with each other; it is ultimately for dating. So there is a huge amount of risk. It is not a subscription-based service.

I get the industry drive to say, “Let’s differentiate and let’s have clarity”, but in a Bill, essentially the principles are supposed to be there. Then it is for the regulator, in my view, to say, at a granular level, that when you conduct a risk impact assessment, you understand whether the company has a subscription-based business model, so the risk is lower, and also if there is age checking to make sure those users are 18-plus. However, you must also consider that there are teen dating sites, which would definitely fall under the scope of this Bill and the provisions that it is trying to make to protect kids and to reduce the risk of harm.

While I think there is a need for clarity, I would urge caution. For the Bill to have some longevity, being that specific about the categorisations will have some potential unintended consequences, particularly as it relates to children and young people.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q The next question is really about age verification, which you have touched on, so let us start with you, Dr O’Connell. What do you think the Bill should contain to enable age verification or the age assurance needed to protect children online?

Dr Rachel O'Connell: There is a mention of age assurance in the Bill. There is an opportunity to clarify that a little further, and also to bring age verification services under the remit of the Bill, as they are providing these services, to make sure that they are mitigating risk. There was a very clear outline by Elizabeth Denham when we were negotiating the Digital Economy Act in relation to age verification and adult content sites; she was very specific when she came to Committee and said it should be a third party conducting the checks. If you want to preserve privacy and security, it should be a third-party provider that runs the checks, rather than companies saying, “You know what? We’ll track everybody for the purposes of age verification.”

There needs to be a clear delineation, which currently in clause 50 is not very clear. I would recommend that that be looked at again and that some digital identity experts be brought into that discussion, so that there is a full appreciation. Currently, there is a lot of latitude for companies to develop their own services in-house for age verification, without, I think, a proper risk assessment of what that might mean for end users in terms of eroding their privacy.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q TikTok were talking to us earlier about their age verification. If companies do it themselves rather than it being a third party, where does that fall down?

Dr Rachel O'Connell: That means you have to track and analyse people’s activities and you are garnering a huge amount of data. If you are then handling people under the age of 13, under the Data Protection Act, you must obtain parental consent prior to processing data. By definition, you have to gather the data from parents. I have been working in this space for 25 years. I remember, in 2008, when the Attorneys General brought all the companies together to consider age verification as part of the internet safety technical task force, the arguments of industry—I was in industry at the time—were that it would be overly burdensome and a privacy risk. Looking back through history, industry has said that it does not want to do that. Now, there is an incentive to potentially do that, because you do not have to pay for a third party to do it, but what are the consequences for the erosion of privacy and so on?

I urge people to think carefully about that, in particular when it comes to children. It would require tracking children’s activities over time. We do not want our kids growing up in a surveillance society where they are being monitored like that from the get-go. The advantage of a third-party provider is that they can have a zero data model. They can run the checks without holding the data, so you are not creating a data lake. The parent or child provides information that can be hashed on the device and checked against data sources that are hashed, which means there is no knowledge. It is a zero data model.

The information resides on the user’s device, which is pretty cool. The checks are done, but there is no exposure and no potential for man-in-the-middle attacks. The company then gets a token that says “This person is over 18”, or “This person is below 12. We have verified parental responsibility and that verified parent has given consent.” You are dealing with tokens that do not contain any personal information, which is a far better approach than companies developing things in-house.
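
To make the zero data model concrete, the following is a minimal sketch of the flow Dr O'Connell describes, not any provider's actual implementation: identity attributes are hashed on the user's device, compared against an already-hashed reference source, and the relying platform receives only a signed age-band token. All names, data and the salting scheme here are illustrative assumptions.

```python
import hashlib
import hmac
import secrets

def hash_record(full_name: str, date_of_birth: str, salt: bytes) -> str:
    """Hash identity attributes on-device; the raw values never leave it."""
    data = f"{full_name.lower()}|{date_of_birth}".encode("utf-8")
    return hashlib.sha256(salt + data).hexdigest()

# Hypothetical reference source that holds only salted hashes, e.g. built
# from records that have already been verified elsewhere.
SALT = b"illustrative-shared-salt"
HASHED_REFERENCE = {
    hash_record("Jane Example", "2010-04-01", SALT): "under_13",
    hash_record("John Example", "1985-09-12", SALT): "over_18",
}

SIGNING_KEY = secrets.token_bytes(32)  # held by the verification service

def issue_token(full_name: str, date_of_birth: str) -> dict | None:
    """Return a signed age-band token, or None if no match is found."""
    digest = hash_record(full_name, date_of_birth, SALT)
    band = HASHED_REFERENCE.get(digest)
    if band is None:
        return None
    signature = hmac.new(SIGNING_KEY, band.encode(), hashlib.sha256).hexdigest()
    # The relying platform sees only the band and its signature, never the
    # underlying identity data.
    return {"age_band": band, "signature": signature}

print(issue_token("Jane Example", "2010-04-01"))  # {'age_band': 'under_13', ...}
```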

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q I think the TikTok example was looking at materials and videos and seeing whether they mention school or birthdays as a way of verifying age. As you say, that does involve scanning the child’s data.

None Portrait The Chair
- Hansard -

Q Can I see if Ms Elmi wants to come in? She tends to get left out on a limb, on the screen. Are you okay down there? Do you need to come in on this, or are you happy?

Nima Elmi: Yes, I am. I have nothing to add.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Jared Sine, did you have anything to add?

Jared Sine: Sure. I would add a couple of thoughts. We run our own age verification scans, which we do through the traditional age gate but also through a number of other scans that we run.

Again, online dating platforms are a little different. We warn our users upfront that, as they are going to be meeting people in real life, there is a fine balance between safety and privacy, and we tend to lean a little more towards safety. We announce to our users that we are going to run message scans to make sure there is no inappropriate behaviour. In fact, one of the tools we have rolled out is called “Are you sure? Does this bother you?”, through which our AI looks at the message a user is planning to send and, if it is an inappropriate message, a flag will pop up that says, “Are you sure you want to send this?” Then, if they go ahead and send it, the person receiving it at the other end will get a pop-up that says, “This may not be something you want to see. Go ahead and click here if you want to.” If they open it, they then get another pop-up that asks “Does this bother you?” and, if it does, you can report the user immediately.

We think that is an important step to keep our platform safe. We make sure our users know that it is happening, so it is not under the table. However, we think there has to be a balance between safety and privacy, especially when we have users who are meeting in person. We have actually demonstrated on our platforms that this reduces harassment and behaviour that would otherwise be untoward or that you would not want on the platform.
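
As a rough illustration of the intervention flow described above (not Match Group's actual code), the sketch below shows the general pattern: a classifier flags a draft message, the sender is asked to confirm before sending, and a flagged message is shielded on the recipient's side until they choose to open it, at which point they can report it. The keyword check stands in for the real AI model, and every name here is hypothetical.

```python
from dataclasses import dataclass

FLAGGED_TERMS = {"insult", "threat"}  # toy stand-in for a trained classifier

def looks_inappropriate(text: str) -> bool:
    """Score a draft message; a real system would call an ML model here."""
    return any(term in text.lower() for term in FLAGGED_TERMS)

@dataclass
class Delivery:
    delivered: bool
    sender_prompted: bool      # "Are you sure you want to send this?"
    recipient_shielded: bool   # "This may not be something you want to see."

def send_message(text: str, sender_confirms: bool, recipient_opens: bool) -> Delivery:
    flagged = looks_inappropriate(text)
    if flagged and not sender_confirms:
        # Sender is warned and decides not to send after all.
        return Delivery(delivered=False, sender_prompted=True, recipient_shielded=False)
    if flagged:
        # Delivered but hidden behind a warning; opening it triggers the
        # "Does this bother you?" prompt and a one-tap report option.
        return Delivery(delivered=True, sender_prompted=True,
                        recipient_shielded=not recipient_opens)
    return Delivery(delivered=True, sender_prompted=False, recipient_shielded=False)

print(send_message("that reads as a threat", sender_confirms=False, recipient_opens=False))
```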

We think that we have to be careful not to tie the hands of industry to be able to come up with technological solutions and advances that can work side by side with third-party tools and solutions. We have third-party ID verification tools that we use. If we identify or believe a user is under the age of 18, we push them through an ID verification process.

The other thing to remember, particularly as it relates to online dating, is that companies such as ours and Bumble have done the right thing by saying “18-plus only on our platforms”. There is no law that says that an online dating platform has to be 18-plus, but we think it is the right thing to do. I am a father of five kids; I would not want kids on my platform. We are very vigilant in taking steps to make sure we are using the latest and greatest tools available to try to make sure that our platforms are safe.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Rachel, we have, in you, what we are told is a leading, pre-eminent authority on the issue of age verification, so we are listening very carefully to what you say. I am thinking about the evidence we had earlier today, which said that it is reasonably straightforward for a large majority of young people to subvert age verification through the use of VPNs. You have been advocating third-party verification. How could we also deal with this issue of subverting the process through the use of the VPNs?

Dr Rachel O'Connell: I am the author of the technical standard PAS 1296, an age checking code of practice, which is becoming a global standard at the moment. We worked a lot with privacy and security and identity experts. It should have taken nine months, but it took a bit longer. There was a lot of thought that went into it. Those systems were developed to, as I just described, ensure a zero data, zero knowledge kind of model. What they do is enable those verifications to take place and reduce the requirement. There is a distinction between monitoring your systems, as was said earlier, for age verification purposes and abuse management. They are very different. You have to have abuse management systems. It is like saying that if you have a nightclub, you have to have bouncers. Of course you have to check things out. You need bouncers at the door. You cannot let people go into the venue, then afterwards say that you are spotting bad behaviour. You have to check at the door that they are the appropriate age to get into the venue.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Can they not just hop on a VPN and bypass the whole system anyway?

Dr Rachel O'Connell: I think you guys will be aware of the DCMS programme of work about the verification of children last year. As part of that, there was a piece of research that asked children what they would think about age verification. The predominant thing that came across from young children is that they are really tired of having to deal with weirdos and pervs. It is an everyday occurrence for them.

To just deviate slightly to the business model, my PhD is in forensics and tracking paedophile activity on the internet way back in the ’90s. At that time, guys would have to look for kids. Nowadays, on TikTok and various livestream platforms, the algorithms recognise that an individual—a man, for example—is very interested in looking at content produced by kids. The algorithms see that a couple of times and go, “You don’t have to look anymore. We are going to seamlessly connect you with kids who livestream. We are also going to connect you with other men that like looking at this stuff.”

If you are on these livestream sites at 3 o’clock in the morning, you can see these kids who are having sleepovers or something. They put their phone down to record whatever the latest TikTok dance is, and they think that they are broadcasting to other kids. You would assume that, but what they then hear is the little pops of love hearts coming on to the screen and guys’ voices saying, “Hey sweetie, you look really cute. Lick your lips. Spread your legs.” You know where I am going with this.

The Online Safety Bill should look at the systems and processes that underpin these platforms, because there is gamification of kids. Kids want to become influencers—maybe become really famous. They see the views counter and think, “Wow, there are 200 people looking at us.” Those people are often men, who will co-ordinate their activities at the back. They will push the boys a little bit further, and if a girl is on her own, they will see. If the child does not respond to the request, they will drop off. The kid will think, “Oh my God. Well, maybe I should do it this one time.”

What we have seen is a quadrupling of child sexual abuse material online that has been termed “self-generated”, because the individual offender hasn’t actually produced it. From a psychological perspective, it is a really bad name, but that is a separate topic. Imagine if that was your kid who had been coerced into something that had then been labelled as “self-generated”. The business models that underpin those processes that happen online are certainly something that should be really within scope.

We do not spend enough time thinking about the implications of the use of recommendation engines and so on. I think the idea of the VPN is a bit of a red herring. Children want safety. They do not want to have to deal with this sort of stuff online. There are other elements. If you were a child and felt that you might be a little bit fat, you could go on YouTube and see whether you could diet or something. The algorithms will pick that up also. There is a tsunami of dieting and thinspiration stuff. There is psychological harm to children as a result of the systems and processes that these companies operate.

There was research into age verification solutions and trials run with BT. Basically, the feedback from both parents and children was, “Why doesn’t this exist already?”. If you go into your native EE app where it says, “Manage my family” and put in your first name, last name and mobile number and your child’s first name, last name and date of birth, it is then verified that you are their parent. When the child goes on Instagram or TikTok, they put in their first and last name. The only additional data point is the parent’s mobile number. The parent gets a notification and they say yes or no to access.

There are solutions out there. As others have mentioned, the young people want them and the parents want them. Will people try to work around them? That can happen, but if it is a parent-initiated process or a child-initiated process, you have the means to know the age bands of the users. From a business perspective, it makes a lot of sense because you can have a granular approach to the offerings you give to each of your customers in different age bands.
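
A minimal sketch, under assumed names and data structures, of the parent-initiated flow just described: the parent registers the child with a verified provider in advance; when the child signs up to a service, the only additional data point is the parent's mobile number, and the parent approves or declines the request by notification.

```python
# Verified parent-child records, registered in advance (e.g. via a mobile
# operator's family-management app). Keys and values are placeholders.
registered_children = {
    # (child first name, child last name, parent mobile) -> date of birth
    ("ava", "example", "+447700900001"): "2011-06-30",
}

pending_requests = []

def child_signup(first: str, last: str, parent_mobile: str, platform: str) -> bool:
    """Match the sign-up against a verified record and notify the parent."""
    key = (first.lower(), last.lower(), parent_mobile)
    if key not in registered_children:
        return False
    pending_requests.append({"child": key, "platform": platform})
    print(f"SMS to {parent_mobile}: approve {first}'s access to {platform}? yes/no")
    return True

def parent_decision(parent_mobile: str, platform: str, approve: bool) -> str:
    """Record the parent's yes/no response to a pending request."""
    for request in pending_requests:
        if request["child"][2] == parent_mobile and request["platform"] == platform:
            pending_requests.remove(request)
            return "access granted" if approve else "access declined"
    return "no matching request"

child_signup("Ava", "Example", "+447700900001", "ExampleApp")
print(parent_decision("+447700900001", "ExampleApp", approve=True))
```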

Nima Elmi: Just to add to what Rachel has said, I think she has articulated extremely well the complexities of the issues around not only age verification, but business models. Ultimately, this is such a complex matter that it requires continued consultation across industry, experts and civil society to identify pragmatic recommendations for industry when it comes to not only verifying the age of their users, but thinking about the nuanced differences between platforms, purposes, functionality and business models, and what that means.

In the context of the work we do here at Bumble, we are clear about our guidelines requiring people to be 18-plus to download our products from app stores, as well as ensuring that we have robust moderation processes to identify and remove under-18s from our platforms. There is an opportunity here for the Bill to go further in providing clarity and guidance on the issue of accessibility of children to services.

Many others have said over the course of today’s evidence that there needs to be a bit more colour put into definitions, particularly when certain sections of the Bill refer to what constitutes a “significant number of users” for determining child accessibility to platforms. Coupled with the fact that age verification or assurance is a complex area in and of itself and the nuance between how social media may engage with it versus a dating or social networking platform, I think that more guidance is very much needed and a much more nuanced approach would be welcome.

None Portrait The Chair
- Hansard -

I have three Members and the Minister to get in before 5 o’clock, so I urge brief questions and answers please.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Is it technically possible—I do not need to know how—to verify the age of children who are under 16, for example?

Dr Rachel O'Connell: Yes.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q So technology exists out there for that to happen.

Dr Rachel O'Connell: Yes.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Once we have the verification of those ages, do you think it would be possible or desirable to limit children's interactions so that they are only with other children? Is that the direction you were going in?

Dr Rachel O'Connell: I will give an example. If you go to an amusement park, kids who are below four feet, for example, cannot get on the adult rides, so the equivalent would be that they should not be on an 18-plus dating site. The service can create it at a granular level so the kids can interact with kids in the same age group or a little bit older, but they can also interact with family. You can create circles of trust among verified people.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

For a game like Roblox, which is aimed at kids—it is a kids platform—if you had the age verification and if that worked, you could have a situation where a 13-year-old on Roblox could only interact with children who are between 12 and 14. Does the technology exist to make that work?

Dr Rachel O'Connell: You could do. Then if you were using it in esports or there was a competition, you could broaden it out. The service can set the parameters, and you can involve the parents in making decisions around what age bands their child can play with. Also, kids are really into esports and that is their future, so there are different circumstances and contexts that the technology could enable.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Finally, do you think it would be desirable for Ofcom to consider a system with more consistency in parental controls, so that parents can always ensure that their children cannot talk to anybody outside their circle? Would that be helpful?

Dr Rachel O'Connell: There is a history of parental controls, and only 36% of parents use them. Ofcom research consistently says that it is 70%, but in reality, it is lower. With age verification, parents are relieved of having to watch everything; it is the platform that is providing the digital playground. In the same way, when you go on swings and slides, there is bouncy tarmac because you know the kids are going to use them. It is like creating that health and safety environment in a digital playground.

When parents receive a notification that their child wants to access something, there could be a colour-coded nutrition-style thing for social media, livestreaming and so on, and the parents could make an informed choice. It is then up to the platform to maintain that digital playground and run those kinds of detection systems to see if there are any bad actors in there. That is better than parental controls because the parent is consenting and it is the responsibility of the platform to create the safer environment. It is not the responsibility of the parent to look over the child’s shoulder 24/7 when they are online.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q The age verification stuff is really interesting, so thank you to our witnesses. On violence against women and girls, clauses 150 to 155 set out three new communications offences. Do you think those offences will protect women from receiving offensive comments, trolling and threats online? What will the Bill mean for changing the way you manage those risks on your platforms?

Jared Sine: I do not know the specific provisions but I am familiar with the general concept of them. Any time you put something in law, it can either be criminalised or have enforcement behind it, and I think that helps. Ultimately, it will be up to the platforms to come up with innovative technologies or systems such as “Are You Sure?” and “Does This Bother You?” which say that although the law says x, we are going to go beyond that to find tools and systems that make it happen on our platform. Although I think it is clearly a benefit to have those types of provisions in law, it will really come down to the platforms taking those extra steps in the future. We work with our own advisory council, which includes the founder of the #MeToo movement, REIGN and others, who advise us on how to make platforms safer for those things. That is where the real bread gets buttered, so to speak.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Do you think that is consistent across your industry? It sounds like you are taking a very proactive approach to it.

Jared Sine: We are proactive about it, and I know our colleagues and friends over at Bumble are proactive about it as well. Our heads of trust and safety both came from the same company—Uber—before coming to us, so I know that they compare notes quite regularly. Because of the way the legislation is set up, there can be codes of conduct applying specifically to online dating, and to the extent that that technology exists, you need to deploy it.

None Portrait The Chair
- Hansard -

Shall we ask our friends at Bumble if they would like to come in?

Nima Elmi: It is a great question. There are three points that I want to address, and I will try to be brief. First, Bumble is very much a uniquely female-founded and female-led tech company that adopts a safety-by-design ethos. It is baked within our DNA. The majority of our board are women, and they are public figures who, unfortunately, have to some extent experienced online harms targeting women.

We believe it is incredibly important that the Bill acknowledges that women are disproportionately impacted by online harms. Some studies have found that women are 27 times more likely than men to suffer online harassment and online harms. Currently, the Bill does not acknowledge or reference gender or women at all, so a lot more can be done, and we have submitted some recommendations.

Not every company in our industry or across the tech sector is female-founded and female-led, and they prioritise the harms that they want to tackle on their platforms very differently—that is important. Our systems-based approach, which bakes in safety-by-design principles, puts women at the centre of how our products are designed and used. We deploy corrective action and safety tools to make sure that our female members feel not only safe but empowered on our platforms. When it comes to managing risk, it is central to us to ensure that women feel safe on our products and services. We are here advocating for the fact that it should not just be our products that are safe for women—it should be the internet as a whole. In our view, the Bill does not currently go far enough to make sure that that happens.

We welcome the inclusion of the communications offences in clauses 150 to 155 and also welcome the offence of cyber-flashing, the inclusion of which we have been advocating for publicly for several months. However, in both instances, and particularly with cyber-flashing, the Bill does not go far enough in acknowledging that it is an offence, as Professor McGlynn has highlighted, that should be grounded on consent rather than the motivation of the perpetrator.

Essentially, there are a number of inclusions that are a step in the right direction, but we would welcome significant changes to the Bill, predominantly through including a safety duty for women, to ensure that all platforms are consistent in their approach and prioritise how their female users engage with their services, so that they feel protected, and to ensure that determining those features is not predicated on the composition of the board or who the founder is.

None Portrait The Chair
- Hansard -

Right. For once, we seem to have run out of questions. Minister, do you wish to contribute?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Everything I was going to ask has already been asked by my colleagues, so I will not duplicate that.

None Portrait The Chair
- Hansard -

Q In that case, given that we have the time, rather than doing what I normally do and inviting you to make any further submissions in writing, if there are any further comments that you would like to make about the Bill, the floor is yours. Let us start with Mr Sine.

Jared Sine: I would just make one brief comment. I think it has been mentioned by everyone here. Everyone has a role to play. Clearly, the Government have a role in proposing and pushing forward the legislation. The platforms that have the content have an obligation and a responsibility to try to make sure that their users are safe. One of the things that Dr O’Connell mentioned is age verification and trying to make sure that we keep young kids off platforms where they should not be.

I think there is a big role to play for the big tech platforms—the Apples and Googles—who distribute our apps. Over the years, we have said again and again to both of those companies, “We have age-gated our apps at 18, yet you will allow a user you know is 15, 14, 16—whatever it is—to download that app. That person has entered that information and yet you still allow that app to be downloaded.” We have begged and pleaded with them to stop and they will not stop. I am not sure that that can be included in the Bill, but if it could be, it would be powerful.

If Apple and Google could not distribute any of our apps—Hinge, Match, Tinder—to anyone under the age of 18, that solves it right there. It is the same methodology that has been used at clubs with bouncers—you have a bouncer at the door who makes sure you are 21 before you go in and have a drink. It should be the same thing with these technology platforms. If they are going to distribute and have these app stores, the store should then have rules that show age-gated apps—“This is for 17-plus or 18-plus”—and should also enforce that. It is very unfortunate that our calls on this front have gone unanswered. If the Bill could be modified to include that, it would really help to address the issue.

Dr Rachel O'Connell: Absolutely. I 100% support that. There is a tendency for people to say, “It is very complex. We need a huge amount of further consultation.” I started my PhD in 1996. This stuff has been going on for all that time. In 2008, there was a huge push by the Attorneys General, which I mentioned already, which brought all of the industry together. That was 2008. We are in 2022 now. 2017 was the Internet Safety Strategy Green Paper. We know what the risks are. They are known; we understand what they are. We understand the systems and processes that facilitate them. We understand what needs to be done to mitigate those risks and harms. Let’s keep on the track that we are going on.

Regarding industry’s concerns, a lot of them will be ironed out when companies are required to conduct risk assessments and impact assessments. They might ask, what are the age bands of your users? What are the risks associated with the product features that you are making available? What are the behaviour modification techniques that you are using, like endless scroll and loot boxes that get kids completely addicted? Are those appropriate for those ages? Then you surface the decision making within the business that results in harms and also the mitigations.

I urge you to keep going on this; do not be deterred from it. Keep the timeframe within which it comes into law fairly tight, because there are children out there who are suffering. As for the harassment—I have experienced it myself, it is horrible.

Those would be my final words.

None Portrait The Chair
- Hansard -

Thank you. Finally, Nima Elmi, please.

Nima Elmi: Thank you again for your time. I want to re-emphasise a couple of points, since we have a few minutes.

First, on the point around gendered harms, I think it is important for the Committee to really think about whether this is an opportunity to make reference in the Bill to acknowledge that women are experiencing online harms at a significantly higher rate than men. That is meant to futureproof the Bill, as new forms of online harms are, unfortunately, usually felt by women first. I know that Maria Miller, for example, has been doing extensive work around the issue of AI nudification tools, which, in the current framing of the Bill, would not be captured.

We would certainly urge that there is a greater focus in the Bill on gendered harms, whether that is through a specific safety duty, acknowledgement as a category within risk assessment, a designated code of practice—which I know Clare McGlynn, Refuge and EVAW have also advocated for—or acknowledgement of gender-based violence in transparency reporting.

Right now, the nature of moderation of technology platforms is very much grounded in the prioritisation of issues based on the leadership and usage of certain platforms, and this is an opportunity for the Government and Parliament to provide a standard setting that ensures consistency across the board while acknowledging the nuanced differences between the platforms and their business models, and their end goals. I would really like to emphasise that point.

The second point I want to emphasise, on cyber-flashing in particular, is the fact that we have an opportunity to bake in what should be societal standards that we want to hold people accountable to, both offline and online. Offences captured by the Bill that do not create a threshold where you will see prosecutions and a change in behaviour—for example, in the current formulation of the cyber-flashing offence, which is grounded in the perpetrator’s motivation rather than in consent—will have little impact in changing the hearts and minds of individuals and stopping that behaviour, because the threshold will be so high.

We would definitely encourage the Committee to reflect on the pragmatic ways in which the Bill can be refined. In particular, I want to emphasise that it will be important to acknowledge that online harms are sadly very much experienced by women—both emerging forms and existing forms of harms. I welcome this opportunity to share this feedback with the Committee.

None Portrait The Chair
- Hansard -

Ms Elmi, Dr O’Connell and Mr Sine, thank you all very much indeed; the Committee is indebted to you. Thank you so much.

Examination of Witnesses

Rhiannon-Faye McDonald and Susie Hargreaves OBE gave evidence.

16:55
None Portrait The Chair
- Hansard -

We will now hear from Rhiannon-Faye McDonald, victim and survivor advocate at the Marie Collins Foundation, and Susie Hargreaves, chief executive at the Internet Watch Foundation. Thank you for joining us this afternoon; first question, please.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you both for joining us this afternoon. One of the key objectives of the legislation is to ensure that a high level of protection for children and adults is in place. In your view, does the Bill in its current form achieve that?

Susie Hargreaves: Thank you very much for inviting me today. I think the Bill is working in the right direction. Obviously, the area that we at the IWF are concerned with is child sexual abuse online, and from our point of view, the Bill does need to make a few changes in order to put those full protections in place for children.

In particular, we have drafted an amendment to put co-designation on the face of the Bill. When it comes to child sexual abuse, we do not think that contracting out is an acceptable approach, because we are talking about the most egregious form of illegal material—we are talking about children—and we need to ensure that Ofcom is not just working in a collaborative way, but is working with experts in the field. What is really important for us at the moment is that there is nothing in the Bill to ensure that the good work that has been happening over 25 years in this country, where the IWF is held up as a world leader, is recognised, and that that expertise is assured on the face of the Bill. We would like to see that amendment in particular adopted, because the Bill needs to ensure that there are systems and processes in place for dealing with illegal material. The IWF already works with internet companies to ensure that they take up technical services.

There needs to be a strong integration with law enforcement—again, that is already in place with the memorandum of understanding between CPS, the National Police Chiefs’ Council and the IWF. We also need clarity about the relationship with Ofcom so that child sexual abuse, which is such a terrible situation and such a terrible crime, is not just pushed into the big pot with other harms. We would like to see those specific changes.

Rhiannon-Faye McDonald: Generally, we think the Bill is providing a higher standard of care for children, but there is one thing in particular that I would like to raise. Like the IWF, the Marie Collins Foundation specialises in child sexual abuse online, specifically the recovery of people who have been affected by child sexual abuse.

The concern I would like to raise is around the contextual CSA issue. I know this has been raised before, and I am aware that the Obscene Publications Act 1959 has been brought into the list of priority offences. I am concerned that that might not cover all contextual elements of child sexual abuse: for example, where images are carefully edited and uploaded to evade content moderation, or where there are networks of offenders who are able to gain new members, share information with each other, and lead other people to third-party sites where illegal content is held. Those things might not necessarily be caught by the illegal content provisions; I understand that they will be dealt with through the “legal but harmful” measures.

My concern is that the “legal but harmful” measures do not need to be implemented by every company, only those that are likely to be accessed by children. There are companies that can legitimately say that the majority of their user base is not children, and therefore would not have to deal with that, but that provides a space for this contextual CSA to happen. While those platforms may not be accessed by children as much as other platforms, it still provides a place for this to happen—the harm can still occur, even if children do not come across it as much as they would elsewhere.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q On that point, one of the concerns that has been raised by other stakeholders is about the categorisation of platforms—for example, category 1 and category 2B have different duties on them, as Ofcom is the regulator. Would you rather see a risk-based approach to platforms, rather than categorisation? What are your thoughts on that?

Susie Hargreaves: We certainly support the concept of a risk-based approach. We host very little child sexual abuse content in the UK, with the majority of the content we see being hosted on smaller platforms in the Netherlands and other countries. It is really important that we take a risk-based approach, which might be in relation to where the content is—obviously, we are dealing with illegal content—or in relation to where children are. Having a balance there is really important.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q A final question from me. We heard concerns from children’s charities and the Children’s Commissioner that the Bill does not account for breadcrumbing—the cross-platform grooming that happens on platforms. What more could the Bill do to address that, and do you see it as an omission and a risk?

Susie Hargreaves: I think we probably have a slightly different line from that of some of the other charities you heard from this morning, because we think it is very tricky and nuanced. What we are trying to do at the moment is define what it actually means and how we would have to deal with it, and we are working very closely with the Home Office to go through some of those quite intense discussions. At the moment, “harmful” versus “illegal” is not clearly defined in law, and it could potentially overwhelm certain organisations if we focus on the higher-level harms and the illegal material. We think anything that protects children is essential and needs to be in the Bill, but we need to have those conversations and to do some more work on what that means in reality. We are more interested in the discussions at the moment about the nuance of the issue, which needs to be mapped out properly.

One of the things that we are very keen on in the Bill as a whole is that there should be a principles-based approach, because we are dealing with new harms all the time. For example, until 2012 we had not seen self-generated content, which now accounts for 75% of the content we remove. So we need constantly to change and adapt to new threats as they come online, and we should not make the Bill too prescriptive.

None Portrait The Chair
- Hansard -

Ms McDonald?

Rhiannon-Faye McDonald: I was just thinking of what I could add to what Susie has said. My understanding is that it is difficult to deal with cross-platform abuse because of the difficulty of sharing information between different platforms—for example, where a platform has identified an issue or an offender but has not shared that information with other platforms on which someone may continue the abuse. I am not an expert in tech and cannot present you with a solution to that, but I feel that sharing intelligence would be an important part of the solution.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q What risks do end-to-end encrypted platforms pose to children, and how should the Bill seek to mitigate those risks specifically?

Susie Hargreaves: We are very clear that end-to-end encryption should be within scope, as you have heard from other speakers today. Obviously, the huge threat on the horizon is the end-to-end encryption on Messenger, which would result in the loss of millions of images of child sexual abuse. In common with previous speakers, we believe that the technology is there. We need not to demonise end-to-end encryption, which in itself is not bad; what we need to do is ensure that children do not suffer as a consequence. We must have mitigations and safety mechanisms in place so that we do not lose these child sexual abuse images, because that means that we will not be able to find and support those children.

Alongside all the child protection charities, we are looking to ensure that protections equivalent to the current ones are in place in the future. We do not accept that the internet industry cannot put them in place. We know from experts such as Dr Hany Farid, who created PhotoDNA, that those mechanisms and protections exist, and we need to ensure that they are put in place so that children do not suffer as a consequence of the introduction of end-to-end encryption. Rhiannon has her own experiences as a survivor, so I am sure she would agree with that.

Rhiannon-Faye McDonald: I absolutely would. I feel very strongly about this issue, which has been concerning me for quite some time. I do not want to share too much, but I am a victim of online grooming and child sex abuse. There were images and videos involved, and I do not know where they are and who has seen them. I will never know that. I will never have any control over it. It is horrifying. Even though my abuse happened 19 years ago, I still walk down the street wondering whether somebody has seen those images and recognises me from them. It has a lifelong impact on the child, and it impacts on recovery. I feel very strongly that if end-to-end encryption is implemented on platforms, there must be safeguards in place to ensure we can continue to find and remove these images, because I know how important that is to the subject of those images.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q So what needs to change in the Bill to make sure that happens? I am not clear.

Susie Hargreaves: We just want to make sure that the ability to scan in an end-to-end encrypted environment is included in the Bill in some way.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q The ability to scan is there right now—we have got that—so you are just trying to make sure we are standing still, basically. Am I correct in my understanding?

Susie Hargreaves: I think with technology you can never stand still. We do not know what is coming down the line. We have to deal with the here and now, but we also need to be prepared to deal with whatever comes down the line. The answer, “Okay, we will just get people to report,” is not a good enough replacement for the ability to scan for images.

When the privacy directive was introduced in Europe and Facebook stopped scanning for a short period, we lost millions of images. What we know is that we must continue to have those safety mechanisms in place. We need to work collectively to do that, because it is not acceptable to lose millions of images of child sexual abuse and create a forum where people can safely share them without any repercussions, as Rhiannon says. One survivor we talked to in this space said that one of her images had been recirculated 70,000 times. The ability to have a hash of a unique image, go out and find those duplicates and make sure they are removed means that people are not re-victimised on a daily basis. That is essential.
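
The hash-matching workflow Susie Hargreaves refers to can be sketched roughly as below. Real systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding; the plain SHA-256 used here catches only exact duplicates and is meant purely to show the shape of the workflow, with placeholder values throughout.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list supplied by a body such as the IWF; the entry
# below is a placeholder, not a real hash.
known_hashes = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def file_hash(path: Path) -> str:
    """Exact-match digest; a deployed system would use a perceptual hash."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_uploads(upload_dir: Path) -> list[Path]:
    """Return uploaded files whose hash appears on the known-image list."""
    matches = []
    for item in upload_dir.glob("*"):
        if item.is_file() and file_hash(item) in known_hashes:
            matches.append(item)  # queue for removal and reporting
    return matches
```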

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Focusing on thinking about how to prevent grooming behaviour, does the Bill have enough in place to protect children from conversations that they may have with adults, or from facing grooming behaviour online?

Rhiannon-Faye McDonald: There is one specific point that I would like to raise about this. I am concerned about private communications. We know that many offenders identify and target children on more open platforms, and then very quickly move them to more private platforms to continue the grooming and abuse. We were very pleased to see that private communications were brought into scope. However, there is a difficulty with the code of practice: when it is drafted, Ofcom is not going to be able to require proactive tools to be used to identify grooming. That includes things such as PhotoDNA and image-based and text-based classifiers.

So although we have tools that we can use currently, which can identify conversations where grooming is happening, we are not going to be using those immediately on private platforms, on private communications where the majority of grooming is going to happen. That means there will be a delay while Ofcom establishes that there is a significant problem with grooming on the platform, and then issues a notice to require those tools to be used.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q You mentioned the reporting mechanisms that are in place, Susie. Yes, they are not the only tool, and should not be the only tool—many more things should be happening—but are the reporting mechanisms that will be in place, once the Bill has come in and is being embedded, sufficient, or do the requirements for platforms to have reporting mechanisms need to be improved as well?

Susie Hargreaves: An awful lot of work has already gone into this over the past few years. We have been working closely with Departments on the draft code of practice. We think that, as it stands, it is in pretty good shape. We need to work more closely with Ofcom as those codes are developed—us and other experts in the field. Again, it needs to be very much not too directive, in the sense that we do not want to limit people, and to remain adaptable when technology changes in the future. It is looking in the right shape, but of course we will all be part of the consultation and of the development of those practices as they go. It requires people to scan their networks, to check for child sexual abuse and—I guess for the first time, the main thing—to report on it. It is going to be a regulated thing. In itself, that is a huge development, which we very much welcome.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q I have one last question. Rhiannon, a suggestion was made earlier by Dr Rachel O'Connell about age verification and only allowing children to interact with other children whose age is verified to be within a certain range. Do you think that would help to prevent online grooming?

Rhiannon-Faye McDonald: It is very difficult. While I feel strongly about protecting children from encountering perpetrators, I also recognise that children need to have freedoms and the ability to use the internet in the ways that they like. I think if that was implemented and it was 100% certain that no adult could pose as a 13-year-old and therefore interact with actual 13-year-olds, that would help, but I think it is tricky.

Susie Hargreaves: One of the things we need to be clear about, particularly where we see children groomed—we are seeing younger and younger children—is that we will not ever sort this just with technology; the education piece is huge. We are now seeing children as young as three in self-generated content, and we are seeing children in bedrooms and domestic settings being tricked, coerced and encouraged into engaging in very serious sexual activities, often using pornographic language. Actually, a whole education piece needs to happen. We can put filters and different technology in place, but remember that the IWF acts after the event—by the time we see this, the crime has been committed, the image has been shared and the child has already been abused. We need to bump up the education side, because parents, carers, teachers and children themselves have to be able to understand the dangers of being online and be supported to build their resilience online. They are definitely not to be blamed for things that happen online. Rhiannon’s own story shows how quickly it can happen, and how vulnerable children are at the moment.

Rhiannon-Faye McDonald: For those of you who don’t know, it happened very quickly to me, within the space of 24 hours, from the start of the conversation to the perpetrator coming to my bedroom and sexually assaulting me. I have heard other instances where it has happened much more quickly than that. It can escalate extremely quickly.

Just to add to Susie’s point about education, I strongly believe that education plays a huge part in this. However, we must be very careful in how we educate children, so that the focus is not on how to keep themselves safe, because that puts the responsibility on them, which in turn increases the feelings of responsibility when things do go wrong. That increased feeling of responsibility makes it less likely that they will disclose that something has happened to them, because they feel that they will be blamed. It will decrease the chance that children will tell us that something has happened.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Just to follow up on a couple of things, mainly with Susie Hargreaves. You mentioned reporting mechanisms and said that reporting will be a step forward. However, the Joint Committee on the draft Bill recommended that the highest-risk services should have to report quarterly data to Ofcom on the results of their child sexual exploitation and abuse removal systems. What difference would access to that kind of data make to your work?

Susie Hargreaves: We already work with the internet industry. They currently take our services and we work closely with them on things such as engineering support. They also pay for our hotline, which is how we find child sexual abuse. However, the difference it would make is that we hope then to be able to undertake work where we are directly working with them to understand the level of their reports and data within their organisations.

At the moment, we do not receive that information from them. It is very much that we work on behalf of the public and they take our services. However, if we were suddenly able to work directly with them—have information about the scale of the issue within their own organisations and work more directly on that—then that would help to feed into our work. It is a very iterative process; we are constantly developing the technology to deal with the current threats.

It would also help us by giving us more intelligence and by allowing us to share that information, on an aggregated basis, more widely. It would certainly also help us to understand that they are definitely tackling the problem. We do believe that they are tackling the problem, because it is not in their business interests not to, but it just gives a level of accountability and transparency that does not exist at the moment.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q You also said earlier that there was nothing in the Bill on co-designation—nothing to recognise the Internet Watch Foundation’s 25 years of experience. Do you still expect to be co-designated as a regulator by Ofcom, and if so, what do you expect your role to be?

Susie Hargreaves: At the moment, there is nothing on the face of the Bill on co-designation. We do think that child sexual abuse is different from other types of harm, and when you think about the huge number of harms, and the scale and complexity of the Bill, Ofcom has so much to work with.

We have been working with Ofcom for the past year to look at exactly what our role would be. However, because we are the country’s experts on dealing with child sexual abuse material, because we have the relationships with the companies, and because we are an internationally renowned organisation, we are able to have that trusted relationship and then undertake a number of functions for Ofcom. We could help to undertake specific investigations, help update the code, or provide that interface between Ofcom and the companies where we undertake that work on their behalf.

We very much feel that we should be doing that. It is not about being self-serving, but about recognising the track record of the organisation and the fact that the relationships and technology are in place. We are already experts in this area, so we are able to work directly with those companies because we already work with them and they trust us. Basically, we have a memorandum of understanding with the CPS and the National Police Chiefs’ Council that protects our staff from prosecution but the companies all work with us on a voluntary basis. They already work with us, they trust our data, and we have that unique relationship with them.

We are able to provide that service to take the pressure off Ofcom because we are the experts in the field. We would like that clarified because we want this to be right for children from day one—you cannot get it wrong when dealing with child sexual abuse. We must not undo or undermine the work that has happened over the last 25 years.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Just to be clear, is there uncertainty somewhere in there? I am just trying to comprehend.

Susie Hargreaves: There is uncertainty, because we do not know exactly what our relationship with Ofcom is going to be. We are having discussions and getting on very well, but we do not know anything about what the relationship will be or what the criteria and timetable for the relationship are. We have been working on this for nearly five years. We have analysts who work every single day looking at child sexual abuse; we have 70 members of staff, and about half of them look at child sexual abuse every day. They are dealing with some of the worst material imaginable, they are already in a highly stressful situation and they have clear welfare needs; uncertainty does not help. What we are looking for is certainty and clarity that child sexual abuse is so important that it is included on the face of the Bill, and that should include co-designation.

None Portrait The Chair
- Hansard -

Thank you. One question from Kim Leadbeater.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you for your very powerful testimony, Rhiannon. I appreciate that could not have been easy. Going back to the digital literacy piece, it feels like we were talking about digital literacy in the Bill when it started coming through, and that has been removed now. How important do you think it is that we have a digital literacy strategy, and that we hold social media providers in particular to having a strategy on digital education for young people?

Rhiannon-Faye McDonald: It is incredibly important that we have this education piece. Like Susie said, we cannot rely on technology or any single part of this to solve child sexual abuse, and we cannot rely on the police to arrest their way out of the problem. Education really is the key. That is education in all areas—educating the child in an appropriate way and educating parents. We hold parenting workshops. Parents are terrified; they do not know what to do, what platforms are doing what, or what to do when things go wrong. They do not even know how to talk to children about the issue; it is embarrassing for them and they cannot bring it up. Educating parents is a huge thing. Companies have a big responsibility there. They should have key strategies in place on how they are going to improve education.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Can I start by thanking both Rhiannon-Faye and Susie for coming and giving evidence, and for all the work they are doing in this area? I know it has been done over many years in both cases.

I would like to pick up on a point that has arisen in the discussion so far—the point that Susie raised about the risks posed by Meta introducing end-to-end encryption, particularly on the Facebook Messenger service. You have referenced the fact that huge numbers of child sexual exploitation images are identified by scanning those communications, leading to the arrests of thousands of paedophiles each year. You also referenced the fact that when this was temporarily turned off in Europe owing to the privacy laws there—briefly, thankfully—there was a huge loss of information. We will come on to the Bill in a minute, but as technology stands now, if Meta did proceed with end-to-end encryption, would that scanning ability be lost?

Susie Hargreaves: Yes. It would not affect the Internet Watch Foundation, but it would affect the National Centre for Missing and Exploited Children. Facebook, as a US company, has a responsibility to do mandatory reporting to NCMEC, which will be brought in with the Bill in this country. Those millions of images would be lost, as of today, if they brought end-to-end encryption in now.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Why would it not affect the Internet Watch Foundation?

Susie Hargreaves: Because they are scanning Facebook—sorry, I am just trying to unpack the way it works. It will affect us, actually. Basically, when we provide our hash list to Facebook, it uses that to scan Messenger, but the actual images that are found—the matches—are not reported to us; they are reported to NCMEC. Facebook does take our hash list. For those of you who do not know about hashing, a hash list is a list of digital fingerprints of unique images of child sexual abuse; we currently have about 1.3 million of them. Facebook does use our hash list, so yes, it does affect us, because it would still take our hash list to use on other platforms, but it would not use it on Messenger. The actual matches would go to NCMEC. We do not know how many matches it gets against our hash list, because they go into NCMEC.
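For readers unfamiliar with the hash matching Susie Hargreaves describes, here is a minimal sketch of the general idea: an uploaded file is fingerprinted and the fingerprint is checked against a list of known hashes. The code is purely illustrative; the hash value and function names are hypothetical, and real deployments use perceptual hashes (such as PhotoDNA) that still match an image after resizing or re-encoding, rather than the byte-exact SHA-256 digest shown here.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known digital fingerprints (hex digests).
# A real deployment would load a vetted hash list, such as the IWF's,
# and would use perceptual hashing rather than SHA-256.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(path: Path) -> str:
    """Return the SHA-256 digest of a file as a hex string."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_list(path: Path) -> bool:
    """Check a file against the hash list before it is shared onwards."""
    return fingerprint(path) in KNOWN_HASHES
```

The point of the design is that the service only ever handles fingerprints, not the underlying imagery, and the check can run wherever content is visible to the service, which is why end-to-end encrypting a channel removes the opportunity to run it server-side.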

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q But its ability to check images going across Messenger against your list would effectively terminate.

Susie Hargreaves: Yes, sorry—I was unclear about that. Yes, it would on Messenger.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Clearly the Bill cannot compel the creation of technology that does not exist yet. It is hoped that there will be technology—we heard evidence earlier suggesting that it is very close to existing—that allows scanning in an end-to-end encrypted environment. Do you have any update on that that you can give the Committee? If there is no such technology, how do you think the Bill should address that? Effectively there would be a forced choice between end-to-end encryption and scanning for CSEA content.

Susie Hargreaves: As I said before, it is essential that we do not demonise end-to-end encryption. It is really important. There are lots of reasons why, from a security and privacy point of view, people want to be able to use end-to-end encryption.

In terms of whether the technology is there, we all know that there are things on the horizon. As Ian said in the previous session, the technology is there and is about to be tried out. I cannot give any update at this meeting, but in terms of what we would do if end-to-end encryption is introduced and there is no ability to scan, we could look at on-device scanning, which I believe you mentioned before, Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes.

Susie Hargreaves: That is an option. That could be a backstop position. I think that, at the moment, we should stand our ground on this and say, “No, we need to ensure that we have some form of scanning in place if end-to-end encryption is introduced.”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q For complete clarity, do you agree that the use of end-to-end encryption cannot be allowed at the expense of child safety?

Susie Hargreaves: I agree 100%.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Good. Thank you.

None Portrait The Chair
- Hansard -

Thank you very much indeed, Ms McDonald and Ms Hargreaves. We are most grateful to you; thank you for your help.

Examination of Witnesses

Ellen Judson and Kyle Taylor gave evidence.

17:29
None Portrait The Chair
- Hansard -

Finally this afternoon, we will hear from Ellen Judson, who is the lead researcher at the Centre for the Analysis of Social Media at Demos, and Kyle Taylor, who is the founder and director of Fair Vote. Thank you for joining us this afternoon.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you both for joining us, and for waiting until the end of a very long day. We appreciate it.

There is a wide exemption in the Bill for the media and for journalistic content. Are you concerned that that is open to abuse?

Kyle Taylor: Oh, absolutely. There are aspects of the Bill that are extremely worrying from an online safety perspective: the media exemption, the speech of democratic importance exemption, and the fact that a majority of paid ads are out of scope. We know that a majority of harmful content originates from or is amplified by entities that meet one of those exceptions. What that means is that the objective of the Bill, which is to make the online world safer, might not actually be achievable: at present, platforms are able to take some action against that content through their current terms and conditions, but the Bill will say explicitly that they cannot act.

One real-world example is the white supremacist terror attack just last week in Buffalo, in the United States. The “great replacement” theory that inspired the terrorist was pushed by Tucker Carlson of Fox News, who would meet the media exemption; by right-wing blogs, which were set up by people who claim to be journalists and so would meet the journalistic standards exemption; by the third-ranking House Republican, who would meet the democratic importance exemption; and it was even run as paid ads by those candidates. In that one example, you would not be able to capture a majority of the way that harm spreads online.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Is there a way in which the exemptions could be limited to ensure that the extremists you have mentioned cannot take advantage of them?

Ellen Judson: I think there are several options. The primary option, as we would see it, is that the exemptions are removed altogether, on the basis that if the Bill is really promoting a systems-based approach rather than focusing on individual small categories of content, then platforms should be required to address their systems and processes whenever those lead to an increased risk of harm. If that leads to demotion of media content that meets those harmful thresholds, that would seem appropriate within that response.

If the exemptions are not to be removed, they could be improved. Certainly, with regard to the media exemption specifically, I think the thresholds for who qualifies as a recognised news publisher could be raised to make it more difficult for bad actors and extremists, as Kyle mentioned, simply to set up a website, add a complaints policy, have an editorial code of conduct and then say that they are a news publisher. That could involve linking to existing publishers that are already registered with existing regulators, but I think there are various ways that could be strengthened.

On the democratic importance and journalism exemptions, I think the issue is that the definitions are very broad and vague; they could easily be interpreted in any way. Either they could be interpreted very narrowly, in which case they might not have much of an impact on how platforms treat freedom of expression, as I think they were intended to do; or they could be interpreted very broadly, and then anyone who thinks or who can claim to think that their content is democratically important or journalistic, even if it is clearly abusive and breaches the platform’s terms and conditions, would be able to claim that.

One option put forward by the Joint Committee is to introduce a public interest exemption, so that platforms would have to think about how they are treating content that is in the public interest. That would at least remove some of the concerns. The easiest way for platforms to interpret what is democratically important speech and what is journalistic speech is based on who the user is: are they a politician or political candidate, or are they a journalist? That risks them privileging certain people’s forms of speech over that of everyday users, even if that speech is in fact politically relevant. I think that having something that moves the threshold further away from focusing on who a user is as a proxy for whether their speech is likely to deserve extra protection would be a good start.

Kyle Taylor: It is basically just saying that content can somehow become less harmful depending on who says it. A systems-based approach is user-neutral, so its only metric is: does this potentially cause harm at scale? It does not matter who is saying it; it is simply a harm-based approach and a system solution. If you have exemptions, exceptions and exclusions, a system will not function. It suggests that a normal punter with six followers saying that the election was stolen is somehow more harmful than the President of the United States saying that an election is stolen. That is just the reality of how online systems work and how privileged and powerful users are more likely to cause harm.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q You are creating a two-tier internet, effectively, between the normal user and those who are exempt, which large swathes of people will be because it is so ambiguous. One of the other concerns that have been raised is the fact that the comments sections on newspaper websites are exempt from the Bill. Do you see an issue with that?

Ellen Judson: There is certainly an issue as that is often where we see a lot of abuse and harm, such that if that same content were replicated on a social media platform, it would almost certainly be within the scope of the Bill. There is a question, which is for Ofcom to consider in its risk profiles and risk registers, about where content at scale has the potential to cause the most harm. The reach of a small news outlet’s comments section would be much less than the reach of Donald Trump’s Twitter account, for instance. Certainly, if the risk assessments are done and comments sections of news websites have similar reach and scale and could cause significant harm, I think it would be reasonable for the regulator to consider that.

Kyle Taylor: It is also that they are publicly available. I can speak from personal experience. Just last week, there was a piece about me. The comments section simultaneously said that I should be at Nuremberg 2.0 because I was a Nazi, but also that I should be in a gas chamber. Hate is perpetuated in a comments section just as it is on a social media platform. The idea that it is somehow less harmful because it is here and not there is inconsistent and incoherent with the regime, where the clue is in the name: the Online Safety Bill. We are trying to make the online world safer.

On media I would add that we have to think about how easy it is, based on the criteria in the Bill, to become exempt as a media entity. We can think about that domestically, but what happens when a company is only meant to enforce their terms and conditions in that country, but can broadcast to the world? The UK could become the world’s disinformation laundromat because you can come here, meet the media exemption and then blast content to other places in the world. I do not think that is something that we are hoping to achieve through this Bill. We want to be the safest place in the world to go online and to set a global benchmark for what good regulation looks like.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q I suppose, yes. Under the current media carve-out, how do you see platforms being able to detect state actors that are quoting misinformation or perpetuating disinformation on their platforms?

Ellen Judson: I think it is a real challenge with the media exemptions, because it is a recognised tactic of state-based actors, state-aligned actors and non-state actors to use media platforms as ways to disseminate disinformation. If you can make a big enough story out of something, it gets into the media and that perpetuates the campaign of abuse, harassment and disinformation. If there are protections in place, it will not take disinformation actors very long to work out that if there are ways that they can get stories into the press, they are effectively covered.

In terms of platform enforceability, if platforms are asked, for instance, to look at their systems of amplification and the metrics they use to recommend or promote content to users, and to do that from a risk-based, harm-based perspective except when media content is involved, then it all becomes a bit fuzzy what a platform would actually be expected to do in terms of curating those sorts of content.

Kyle Taylor: As an example, Russia Today, until its broadcast licence was revoked about three months ago, would have qualified for the media exemption. Disinformation from Russia Today is not new; it has been spreading disinformation for years and years, and would have qualified for the media exemption until very recently.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q So as a result of these exemptions, the Bill as it stands could make the internet less safe than it currently is.

Kyle Taylor: The Bill as it stands could absolutely make the internet less safe than it currently is.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q You have done a really good job of explaining the concerns about journalistic content. Thinking about the rest of the Bill for a moment, do you think the balance between requiring the removal of content and the prioritisation of content is right? Do you think it will be different from how things are now? Do you think there is a better way it could be done in the Bill?

Ellen Judson: The focus at the moment is too heavily on content. There is a sort of tacit equation of content removal—sometimes content deprioritisation, but primarily content removal—as the way to protect users from harm, and as the threat to freedom of expression. That is where the tension comes in with how to manage both those things at once. What we would want from a Bill that was taking more of a systems approach is thinking: where are platforms making decisions about how they are designing their services, and how they are operating their services at all levels? Content moderation policy is certainly included, but it goes back to questions of how a recommendation algorithm is designed and trained, who is involved in that process, and how human moderators are trained and supported. It is also about what functionality users are given and what behaviour is incentivised and encouraged. There is a lot of mitigation that platforms can put in place that does not talk about directly affecting user content.

I think we should have risk assessments that focus on the risks of harms to users, as opposed to the risk of users encountering harmful content. Obviously there is a relationship, but one piece of content may have very different effects when it is encountered by different users. It may cause a lot of harm to one user, whereas it may not cause a lot of harm to another. We know that when certain kinds of content are scaled and amplified, and certain kinds of behaviour are encouraged or incentivised, we see harms at a scale that the Bill is trying to tackle. That is a concern for us. We want more of a focus on some things that are mentioned in the Bill—business models, platform algorithms, platform designs and systems and processes. They often take a backseat to the issues of content identification and removal.

Kyle Taylor: I will use the algorithm as an example, because this word flies around a lot when we talk about social media. An algorithm is a calculation that is learning from people’s behaviour. If society is racist, an algorithm will be racist. If society is white, an algorithm will be white. You can train an algorithm to do different things, but you have to remember that these companies are for-profit businesses that sell ad space. The only thing they are optimising for in an algorithm is engagement.

What we can do, as Ellen said, through a system is force optimisation around certain things, or drive algorithms away from certain types of content, but again, an algorithm is user-neutral. An algorithm does not care what user is saying what; it is just “What are people clicking on?”, regardless of what it is or who said it. An approach to safety has to follow the same methodology and say, “We are user-neutral. We are focused entirely on propensity to cause harm.”

The second piece is all the mitigation measures you can take once a post is up. There has been a real binary of “Leave it up” and “Take it down”, but there is a whole range of stuff—the most common word used is “friction”—to talk about what you can do with content once it is in the system. You have to say to yourself, “Okay, we absolutely must have free speech protections that exceed the platform’s current policies, because they are not implemented equally.” At the same time, you can preserve someone’s free expression by demonetising content to reduce the incentive of the company to push that content or user through its system. That is a way of achieving both a reduction in harm and the preservation of free expression.
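Purely as an illustration of the user-neutral, system-level interventions described above, the sketch below shows how a ranking function that optimises for engagement might apply "friction" (demotion and demonetisation) once a harm classifier flags a post, regardless of who posted it. All names, weights and thresholds here are assumptions made for the example, not a description of any platform's actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_engagement: float  # expected clicks/shares from an engagement model
    harm_score: float            # 0.0-1.0 output of a hypothetical harm classifier
    monetised: bool = True

def ranking_score(post: Post, friction_threshold: float = 0.7) -> float:
    """User-neutral ranking: start from predicted engagement, then apply
    friction (demotion and demonetisation) to likely-harmful content,
    leaving the post up rather than removing it."""
    score = post.predicted_engagement
    if post.harm_score >= friction_threshold:
        score *= 0.2            # demote: reduce amplification without deleting
        post.monetised = False  # demonetise: remove the incentive to push it
    return score
```

The post stays visible, so free expression is preserved, but the system no longer amplifies or profits from it, which is the balance between reducing harm and preserving expression that the witness describes.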

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

May I just ask one more question, Chair?

None Portrait The Chair
- Hansard -

Briefly, because there are two other Members and the Minister wishing to ask questions.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Thanks. On the propensity to cause harm, we heard earlier that a company might create a great new feature and put it out, but then there is a period—a lag, if you like—before they realise the harm that is being caused. Do you trust that companies would have the ability to understand in advance of doing something what harm it may cause, and adequately to assess that?

Ellen Judson: I think there are a lot of things that companies could be doing. Some of these things are in research that they probably are conducting. As we have seen from the Facebook files, companies are conducting that sort of research, but we aren’t privy to the results. I think there are a couple of things we want to see. First, we want companies to have to be more transparent about what kind of testing they have done, or, if not testing, about who they have consulted when designing these products. Are they consulting human rights experts? Are they consulting people who are affected by identity-based harm, or are they just consulting their shareholders? Even that would be a step in the right direction, and that is why it is really important.

We feel that there need to be stronger provisions in the Bill for independent researcher and civil society access to data. Companies will be able to do certain amounts of things, and regulators will have certain powers to investigate and do their own research, but it requires the added efforts of civil society properly to hold companies to account for the effects of certain changes they have made—and also to help them in identifying what the effects of those changes to design have been. I think that is really crucial.

None Portrait The Chair
- Hansard -

We are playing “Beat the clock”. I am going to ask for brief answers and brief questions, please. I will take one question from Kim Leadbeater and one from Barbara Keeley.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Gosh, right. I think we are clear that your view is that these two exceptions could potentially do more harm than good. The ideal scenario from your perspective would be to remove them, but again, the challenge is how we balance the freedom of speech issue with protecting the rights of people online who are vulnerable to abuse and harassment. How would you respond to those who say that the Bill risks setting an unwitting precedent for non-democratic countries that would seek to restrict the freedom of expression of their citizens?

Ellen Judson: There is absolutely a risk of over-moderation, and of the Bill incentivising over-moderation, particularly because of the very heavy content focus. Even with illegal content, there is a very broad range of content that companies are expected proactively to monitor for, even when the technical systems to identify that content reliably at scale are perhaps not in place. I absolutely understand and share the concern about over-moderation.

Our response would be that we should look to strengthen the freedom of expression duties currently in the Bill. At the moment, there is a quite vague duty to have regard to the importance of freedom of expression, but it is not at all clear what that would actually mean, and what would be expected from the platforms. One change we would want would be for rights—including freedom of expression and privacy—to be included in the online safety objectives, and to establish that part of the purpose of this regime is to ensure that services are being designed to protect and promote human rights, including freedom of expression. We think that would be a way to bring freedom of expression much more into the centre of the regime and the focus of the Bill, without having to have those add-on exemptions after the fact.

Kyle Taylor: And it creates a level playing field—it says, “These rules apply to everyone equally.”

On the second point—authoritarian states, absolutely—but the other area that is really important is fragile democracies. For example, if you look at Hungary, just last week Viktor Orbán said, “You know what you need? Your own media.” If we are setting a standard that says it is totally fine to exempt people in politics and media, then in fragile democracies, where those groups control most aspects of information sharing, we are explicitly saying that it is okay to privilege them over others. That is a very dangerous precedent to set when we have the opportunity to set best global standards here with the Bill.

None Portrait The Chair
- Hansard -

Barbara Keeley?

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q I have a really simple question. You have touched on the balance between free speech rights and the rights of people who are experiencing harassment, but does the Bill do enough to protect human rights?

Ellen Judson: At the moment, no. The rights that are discussed in the Bill at the minute are quite limited: primarily, it is about freedom of expression and privacy, and the way that protections around privacy have been drafted is less strong than for those around freedom of expression. Picking up on the question about setting precedents, if we have a Bill that is likely to lead to more content moderation and things like age verification and user identity verification, and if we do not have strong protections for privacy and anonymity online, we are absolutely setting a bad precedent. We would want to see much more integration with existing human rights legislation in the Bill.

Kyle Taylor: All I would add is that if you look at the exception for content of democratic importance, and the idea of “active political issue”, right now, conversion therapy for trans people—that has been described by UN experts as torture—is an active political issue. Currently, the human rights of trans people are effectively set aside because we are actively debating their lives. That is another example of how minority and marginalised people can be negatively impacted by this Bill if it is not more human rights-centred.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Let me start with this concept—this suggestion, this claim—that there is special protection for politicians and journalists. I will come to clause 50, which is the recognised news publisher exemption, in a moment, but I think you are referring to clauses 15 and 16. If we turn to those clauses and read them carefully, they do not specifically protect politicians and journalists, but “content of democratic importance” and “journalistic content”. It is about protecting the nature of the content, not the person who is speaking it. Would you accept that?

Ellen Judson: I accept that that is what the Bill currently says. Our point was thinking about how it will be implemented in practice. If platforms are expected to prove to a regulator that they are taking certain steps to protect content of democratic importance—in the explanatory notes, that is content related to Government policy and political parties—and they are expected to prove that they are taking a special consideration of journalistic content, the most straightforward way for them to do that will be in relation to journalists and politicians. Given that it is such a broad category and definition, that seems to be the most likely effect of the regime.

Kyle Taylor: It is potentially—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Sorry, Kyle, do come in in a second, but I just want to come back on that point.

Is it not true that a member of the public or anyone debating a legitimate political topic would also benefit from these measures? It is likely that MPs would automatically benefit—near automatically—but a member of the public might equally benefit if the topic they are talking about is of democratic or journalistic importance.

Ellen Judson: Our concern is that defining what is a legitimate political debate is itself already privileging. As you said, an MP is very likely automatically to benefit.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Well, it is likely; I would not say it is guaranteed.

Ellen Judson: A member of the public may be discussing something—for example, an active political debate that is not about the United Kingdom, which I believe would be out of scope of that protection. They would be engaged in political discussion and exercising freedom of expression, and if they were not doing so in a way that met the threshold for action based on harm, their speech should also come under those protections.

Kyle Taylor: I would add that the way in which you have described it would be so broad as to effectively be meaningless in the context of the Bill, and that instead we should be looking for universal free expression protections in that part of the Bill, and removing this provision. Because what is not, in a liberal democracy, speech of democratic importance? Really, that is everything. When does it reach the threshold where it is an active political debate? Is it when enough people speak about it or enough politicians bring it up? It is so subjective and so broad as effectively to mean that everything could qualify. Again, this is not taking a harms-based approach to online safety, because the question is not “Who is saying it?” or “In what context?”; the question is, “Does this have the propensity to cause harm at scale?”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q The harms are covered elsewhere in the Bill. This is saying what you have to take into account. In fact, at the very beginning of your remarks, Kyle, you said that some of the stuff in the US a week or two ago might have been allowed to stand under these provisions, but the provision does not provide an absolute protection; it simply says that the provider has to take it into account. It is a balancing exercise. Other parts of the Bill say, “You’ve got to look at the harm on a systemic basis.” This is saying, “You’ve got to take into account whether the content is of democratic or journalistic importance.” You made a point a second ago about general protection on free speech, which is in clause 19(2).

Kyle Taylor: Can I respond to that?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes, sure.

Kyle Taylor: My point is that if there is a provision in the Bill about freedom of expression, it should be robust enough that this protection does not have to be in the Bill. To me, this is saying, “Actually, our free expression bit isn’t strong enough, so we’re going to reiterate it here in a very specific context, using very select language”. That may mean that platforms decide not to act for fear of reprisal, as opposed to pursuing online safety. I suggest strengthening the freedom of expression section so that it hits all the points that the Government intend to hit, and removing those qualifiers that create loopholes and uncertainty for a regime that, if it is systems-based, does not have loopholes.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I understand the point you are making, logically. Someone mentioned the human rights element earlier. Of course, article 10 of the European convention on human rights expresses the right to freedom of speech. The case law deriving from that ECHR article provides an enhanced level of protection, particularly for freedom of the press relative to other forms of expression, so there is some established case law which makes that point. You were talking about human rights earlier, weren’t you?

Ellen Judson: We absolutely recognise that. There is discussion in terms of meeting certain standards of responsible journalism in relation to those protections. Our concern is very much that the people and actors who would most benefit from the journalistic protections specifically would be people who do not meet those standards and cannot prove that they meet those standards, because the standards are very broad. If you intend your content to be journalistic, you are in scope, and that could apply to extremists as much as to people meeting standards of responsible journalism.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q If you are talking about clause 16, it is not that you intend it to be journalistic content; it is that it is journalistic content. You might be talking about clause 50, which is the general exemption to recognise news publishers from the provisions of the Bill. That of course does not prevent social media platforms from choosing to apply their terms and conditions to people who are recognised news publishers; it is just that the Bill is not compelling them. It is important to make that clear—that goes back to the point you made right at the beginning, Kyle. A couple of times in your testimony so far, you have said that you think the way the definition of “recognised news publisher” is drafted in clause 50 is too wide, and potentially susceptible to, basically, abuse by people who are in essence pretending to be news publishers, but who are not really. They are using this as a way to get a free pass from the provisions of the Bill. I completely understand that concern. Do you have any specific suggestions for the Committee about how that concern might be addressed? How could we change the drafting of the Bill to deal with that issue?

Kyle Taylor: Remove the exemption.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q You mean completely? Just delete it?

Kyle Taylor: Well, I am struggling to understand how we can look at the Bill and say, “If this entity says it, it is somehow less harmful than if this entity says it.” That is a two-tiered system and that will not lead to online safety, especially when those entities that are being given privilege are the most likely and largest sources and amplifiers of harmful content online. We sit on the frontlines of this every day, looking at social media, and we can point to countless examples from around the world that will show that, with these exemptions, exceptions and exclusions, you will actually empower those actors, because you explicitly say that they are special. You explicitly say that if they cause harm, it is somehow not as bad as if a normal user with six followers on Twitter causes harm. That is the inconsistency and incoherency in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We are talking here about the press, not about politicians—

Kyle Taylor: Yes, but the press and media entities spread a lot of disinformation—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I get that. You have mentioned Viktor Orbán and the press already in your comments. There is a long-standing western tradition of treating freedom of the press as something that is sacrosanct and so foundational to the functioning of democracy that you should not infringe or impair it in any way. That is the philosophy that underpins this exclusion.

Kyle Taylor: Except that that is inconsistent in the Bill, because you are saying that for broadcast, they must have a licence, but for print press, they do not have to subscribe to an independent standards authority or code. Even within the media, there is this inconsistency within the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

That is a point that applies regardless of the Bill. The fact is that UK broadcast is regulated whereas UK newspapers are not regulated, and that has been the case for half a century. You can debate whether that is right or wrong, but—

Kyle Taylor: We are accepting that newspapers are not regulated then.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q That matter stands outside the scope of the Bill. If one was minded to tighten this up—I know that you have expressed a contrary view to the thing just being deleted—and if you were to accept that the freedom of the press is something pretty sacrosanct, but equally you don’t want it to be abused by people using it as a fig leaf to cover malfeasant activity, do you have any particular suggestions as to how we can improve the drafting of that clause?

Kyle Taylor: I am not suggesting that the freedom of the press is not sacrosanct. Actually, I am expressing the opposite, which is that I believe that it is so sacrosanct that it should be essential to the freedom-of-expression portion of the Bill, and that the press should be set to a standard that meets international human rights and journalistic standards. I want to be really clear that I absolutely believe in freedom of the press, and it is really important that we don’t leave here suggesting that we don’t think that the press should be free—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I got that, but as I say, article 10 case law does treat the press a little differently. We are about to run out of time. I wanted to ask about algorithms, which I will probably not have a chance to do, but are there any specific changes to the clause that you would urge us to make?

Ellen Judson: To the media exemption—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

To clause 50, “Recognised news publisher”.

Ellen Judson: One of the changes that the Government have indicated that they are minded to make—please correct me if I misunderstood—is to introduce a right to appeal.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Correct.

Ellen Judson: I would very much urge against a requirement for content to stay online while the appeal is taking place, on the grounds that the content might then be found to be incredibly harmful, and by the time you have got through an appeals process, it will already have done the damage it was going to do. So, if there is a right to appeal—I would urge there not to be a particular right to appeal beyond what is already in the Bill, but if one is to be included, then not having the restriction that the platforms must carry the content while the appeal process is ongoing would be important.

Kyle Taylor: You could require an independent standards code as a benchmark at least.

None Portrait The Chair
- Hansard -

Order. I am afraid that brings us to the end of the time allotted for the Committee to ask questions. It also brings us to the end of the day’s sitting. On behalf of the Committee, I thank the witnesses for your evidence. As you ran out of time and the opportunity to frame answers, if you want to put them in writing and offer them to the Minister, I am sure they will be most welcome. The Committee will meet again on Thursday at 11.30 am in this room to hear further evidence on the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

18:00
Adjourned till Thursday 26 May at half-past Eleven o’clock.
Written evidence to be reported to the House
OSB01 Professional Publishers Association (PPA)
OSB02 Neil Kendall and others
OSB03 Girlguiding
OSB04 Alliance to Counter Crime Online and the World Parrot Trust (joint submission)
OSB05 Which?
OSB06 Index on Censorship
OSB07 Alliance for Intellectual Property
OSB08 Internet Services Providers’ Association (ISPA UK)
OSB09 International Justice Mission (IJM UK)
OSB10 Local Government Association (LGA)
OSB11 Russ Elliott
OSB12 SWGfL - Safety & Security Online
OSB13 Action for Primates and Lady Freethinker
OSB14 Association of British Insurers (ABI)
OSB15 Microsoft
OSB16 Age Verification Providers Association
OSB17 techUK
OSB18 UK Interactive Entertainment (Ukie), the trade association for the UK’s video games industry
OSB19 Centre for Media Monitoring
OSB20 Save Online Speech Coalition
OSB21 Bumble
OSB22 Professor Clare McGlynn
OSB23 Antisemitism Policy Trust

Online Safety Bill (Third sitting)

Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022

(1 year, 11 months ago)

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 26 May 2022 - (26 May 2022)
The Committee consisted of the following Members:
Chairs: Sir Roger Gale, † Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
Bailey, Shaun (West Bromwich West) (Con)
Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
Fletcher, Nick (Don Valley) (Con)
Holden, Mr Richard (North West Durham) (Con)
Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Mrs Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
Russell, Dean (Watford) (Con)
Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Witnesses
Mat Ilic, Chief Development Officer, Catch22
William Moy, Chief Executive, Full Fact
William Perrin OBE, Board Member, Carnegie UK Trust
Professor Lorna Woods, Author (Professor of Internet Law, University of Essex), Carnegie UK Trust
Danny Stone MBE, Chief Executive, Antisemitism Policy Trust
Stephen Kinsella OBE, Founder, Clean up the Internet
Liron Velleman, Political Organiser, HOPE not hate
Public Bill Committee
Thursday 26 May 2022
(Morning)
[Christina Rees in the Chair]
Online Safety Bill
11:30
The Committee deliberated in private.
11:31
On resuming—
None Portrait The Chair
- Hansard -

We are now sitting in public and the proceedings are being broadcast. I understand that the Government wish to move a motion to amend the programme order agreed by the Committee on Tuesday. The Football Association is unable to attend and, following the technical difficulties on Tuesday, we will replace it with Barnardo’s.

Ordered,

That the Order of the Committee of 24 May 2022 be amended, in paragraph (2), in the Table, in the entry for Thursday 26 May until no later than 2.55 pm, leaving out “The Football Association” and inserting “Barnardo’s”.—(Chris Philp.)

None Portrait The Chair
- Hansard -

Before we hear oral evidence, I invite Members to declare any interests in connection with the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

I need to declare an interest, Ms Rees. Danny Stone from the Antisemitism Policy Trust provides informal secretariat in a personal capacity to the all-party parliamentary group on wrestling, which I co-chair.

None Portrait The Chair
- Hansard -

That is noted. Thank you.

Examination of Witnesses

Mat Ilic, William Moy, Professor Lorna Woods MBE and William Perrin OBE gave evidence.

11:33
None Portrait The Chair
- Hansard -

We will now hear oral evidence from Mat Ilic, chief development officer at Catch22; William Moy, chief executive at Full Fact; and Professor Lorna Woods and William Perrin of the Carnegie UK Trust. Before calling the first Member, I remind all Members that questions should be limited to matters within the scope of the Bill and that we must stick to the timings in the programme order that the Committee agreed. For this session, we have until 12.15 pm. I call Alex Davies-Jones to begin the questioning.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q187 Good morning to our witnesses. Thank you for joining us today. One of the main criticisms of the Bill is that the vast majority of the detail will not be available until after the legislation is enacted, under secondary legislation and so on. Part of the problem is that we are having difficulty in differentiating the “legal but harmful” content. What impact does that have?

William Perrin: At Carnegie, we saw this problem coming some time ago, and we worked in the other place with Lord McNally on a private Member’s Bill —the Online Harms Reduction Regulator (Report) Bill—that, had it carried, would have required Ofcom to make a report on a wide range of risks and harms, to inform and fill in the gaps that you have described.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

On a point of order, Ms Rees. There is a gentleman taking photographs in the Gallery.

None Portrait The Chair
- Hansard -

There is no photography allowed here.

William Perrin: Unfortunately, that Bill did not pass and the Government did not quite take the hint that it might be good to do some prep work with Ofcom to provide some early analysis to fill in holes in a framework Bill. The Government have also chosen in the framework not to bring forward draft statutory instruments or to give indications of their thinking in a number of key areas of the Bill, particularly priority harms to adults and the two different types of harms to children. That creates uncertainty for companies and for victims, and it makes the Bill rather hard to scrutinise.

I thought it was promising that the Government brought forward a list of priority offences in schedule 7—I think that is where it is; I get these things mixed up, despite spending hours reading the thing. That was helpful to some extent, but the burden is on the Government to reduce complexity by filling in some of the blanks. It may well be better to table an amendment to bring some of these things into new schedules, as we at Carnegie have suggested—a schedule 7A for priority harms to adults, perhaps, and a 7B and 7C for children and so on—and then start to fill in some of the blanks in the regime, particularly to reassure victims.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Thank you. Does anybody else want to comment?

William Moy: There is also a point of principle about whether these decisions should be made by Government later or through open, democratic, transparent decision making in Parliament.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q That brings me on to my next point, William, relating to concerns about the powers that the Bill gives to the Secretary of State and about the independence of the regulator and the impact that could have. Do you have any comments on that?

William Moy: Sure. I should point out—we will need to get to this later—the fact that the Bill is not seriously trying to address misinformation and disinformation at this point, but in that context, we all know that there will be another information incident that will have a major effect on the public. We have lived through the pandemic, when information quality has been a matter of life and death; we are living through information warfare in the context of Ukraine, and more will come. The only response to that in the Bill is in clause 146, which gives the Secretary of State power to direct Ofcom to use relatively weak media literacy duties to respond.

We think that in an open society there should be an open mechanism for responding to information incidents—outbreaks of misinformation and disinformation that affect people’s lives. That should be set out in the roles of the regulator, the Government and internet companies, so that there is a framework that the public understand and that is open, democratic and transparent in declaring a misinformation and disinformation incident, creating proportionate responses to it, and monitoring the effects of those responses and how the incident is managed. At the moment, it largely happens behind closed doors and it involves a huge amount of restricting what people can see and share online. That is not a healthy approach in an open society.

William Perrin: I should add that as recently as April this year, the Government signed up to a recommendation of the Council of Ministers of the Council of Europe on principles for media and communication governance, which said that

“media and communication governance should be independent and impartial to avoid undue influence…discriminatory treatment and preferential treatment of powerful groups, including those with significant political or economic power.”

That is great. That is what the UK has done for 50 to 60 years in media regulation, where there are very few powers for the Secretary of State or even Parliament to get involved in the day-to-day working of communications regulators. Similarly, we have had independent regulation of cinema by the industry since 1913 and regulation of advertising independent of Government, and those systems have worked extremely well. However, this regime—which, I stress, Carnegie supports—goes a little too far in introducing a range of powers for the Secretary of State to interfere with Ofcom’s day-to-day conduct of its business.

Clause 40 is particularly egregious, in that it gives the Secretary of State powers of direction over Ofcom’s codes of practice and, very strangely, introduces an almost infinite ability for the Government to keep rejecting Ofcom’s advice—presumably, until they are happy with the advice they get. That is a little odd, because Ofcom has a long track record as an independent, evidence-based regulator, and as Ofcom hinted in a terribly polite way when it gave evidence to this Committee, some of these powers may go a little too far. Similarly, in clause 147, the Secretary of State can give tactical guidance to Ofcom on its exercise of its powers. Ofcom may ignore that advice, but it is against convention that the Secretary of State can give that advice at all. The Secretary of State should be able to give strategic guidance to Ofcom roughly one or one and a half times per Parliament to indicate its priorities. That is absolutely fine, and is in accordance with convention in western Europe and most democracies, but the ability to give detailed guidance is rather odd.

Then, as Mr Moy has mentioned, clause 146, “Directions in special circumstances”, is a very unusual power. The Secretary of State can direct Ofcom to direct companies to make notices about things and can direct particular companies to do things without a particularly high threshold. There just have to be “reasonable grounds to believe”. There is no urgency threshold, nor is there a strong national security threshold in there, or anyone from whom the Secretary of State has to take advice in forming that judgment. That is something that we think can easily be amended down.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you. Mr Moy, you brought up the issue of misinformation and disinformation being removed from the scope of the Bill. Can you expand on your thoughts on that point?

William Moy: Absolutely. It is an extraordinary decision in a context where we are just coming through the pandemic, where information quality was such a universal concern, and we are in an information war, with the heightened risk of attempts to interfere in future elections and other misinformation and disinformation risks. It is also extraordinary because of the Minister’s excellent and thoughtful Times article, in which he pointed out that at the moment, tech companies censor legal social media posts at vast scale, and this Bill does nothing to stop that. In fact, the Government have actively asked internet companies to do that censorship—they have told them to do so. I see the Minister looking surprised, so let me quote from BBC News on 5 April 2020:

“The culture secretary is to order social media companies to be more aggressive in their response to conspiracy theories linking 5G networks to the coronavirus pandemic.”

In that meeting, essentially, the internet companies were asked to make sure they were taking down that kind of content from their services. Now, in the context of a Bill where, I think, the Minister and I completely agree about our goal—tackling misinformation in an open society—there is an opportunity for this Bill to be an example to the free world of how open societies respond to misinformation, and a beacon for the authoritarian world as well.

This is the way to do that. First, set out that the Bill must cover misinformation and disinformation. We cannot leave it to internet companies, with their political incentives, their commercial convenience and their censoring instincts, to do what they like. The Bill must cover misinformation and set out an open society response to it. Secondly, we must recognise that the open society response is about empowering people. The draft Bill had a recognition that we need to modernise the media literacy framework, but we do not have that in this Bill, which is really regrettable. It would be a relatively easy improvement to create a modern, harms and safety-based media literacy framework in this Bill, empowering users to make their own decisions with good information.

Then, the Bill would need to deal with three main threats to freedom of expression that threaten the good information in our landscape. Full Fact as a charity exists to promote informed and improved public debate, and in the long run we do that by protecting freedom of expression. Those three main threats are artificial intelligence, the internet companies and our own Government, and there are three responses to them. First, we must recognise that the artificial intelligence that internet companies use is highly error-prone, and it is a safety-critical technology. Content moderation affects what we can all see and share; it affects our democracy, it affects our health, and it is safety-critical. In every other safety-critical industry, that kind of technology would be subject to independent third-party open testing. Cars are crashed against walls, water samples are taken and tested, even sofas are sat on thousands of times to check they are safe, but internet companies are subject to no third-party independent open scrutiny. The Bill must change that, and the crash test dummy test is the one I would urge Members to apply.

The second big threat, as I said, is the internet companies themselves, which too often reach for content restrictions rather than free speech-based and information-based interventions. There are lots of things you can do to tackle misinformation in a content-neutral way—creating friction in sharing, asking people to read a post before they share it—or you can tackle misinformation by giving people information, rather than restricting what they can do; fact-checking is an example of that. The Bill should say that we prefer content-neutral and free speech-based interventions to tackle misinformation to content-restricting ones. At the moment the Bill does not touch that, and thus leaves the existing system of censorship, which the Minister has warned about, in place. That is a real risk to our open society.

The final risk to freedom of expression, and therefore to tackling misinformation, are the Government themselves. I have just read you an example of a Government bringing in internet companies to order them around by designating their terms and conditions and saying certain content is unacceptable. That content then starts to get automatically filtered out, and people are stopped from seeing it and sharing it online. That is a real risk. Apart from the fact that they press released it, that is happening behind closed doors. Is that acceptable in an open democratic society, or do we think there should be a legal framework governing when Governments can seek to put pressure on internet companies to affect what we can all see and share? I think that should be governed by a clear legislative framework that sets out if those functions need to exist, what they are and what their parameters are. That is just what we would expect for any similarly sensitive function that Government carry out.

The Chair

Thank you. I am going to bring Maria Miller in now.

Mrs Miller

Q This evidence session is underlining to me how complicated these issues are. I am really grateful for your expertise, because we are navigating through a lot of issues. With slight trepidation I open the conversation up into another area—the issue of protection for children. One of the key objectives of the legislation is to ensure a higher level of protection for children than for adults. In your view, does the Bill achieve that? I am particularly interested in your views on whether the risks of harm to children should be set out on the face of the Bill, and if so, what harms should be included. Can I bring Mat in here?

Mat Ilic: Thank you so much. The impact of social media in children’s lives has been a feature of our work since 2015, if not earlier; we have certainly researched it from that period. We found that it was a catalyst to serious youth violence and other harms. Increasingly, we are seeing it as a primary issue in lots of the child exploitation and missing cases that we deal with—in fact, in half of the cases we have seen in some of the areas that we work in it featured as the primary reason rather than as a coincidental reason. The online harm is the starting point rather than a conduit.

In relation to the legislation, all our public statements on this have been informed by user research. I would say that is one of the central principles to think through in the primary legislation—a safety-by-design focus. We have previously called this the toy car principle, which means any content or product that is designed with children in mind needs to be tested in a way that is explicitly for children, as Mr Moy talked about. It needs to have some age-specific frameworks built in, but we also need to go further than that by thinking about how we might raise the floor, rather than necessarily trying to tackle explicit harms. Our point is that we need to remain focused on online safety for children and the drivers of online harm and not the content.

The question is, how can that be done? One way is the legal design requirement for safety, and how that might play out, as opposed to having guiding principles that companies might adopt. Another way is greater transparency on how companies make particular decisions, and that includes creating or taking down content that pertains to children. I want to underline the point about empowerment for children who have been exposed to or experience harm online, or offline as a result of online harm. That includes some kind of recourse to be able to bring forward cases where complaints, or other issues, were not taken seriously by the platforms.

If you read the terms and conditions of any given technology platform, which lots of young people do not do on signing up—I am sure lots of adults do not do that either—you realise that even with the current non-legislative frameworks that the companies deploy to self-regulate, there is not enough enforcement in the process. For example, if I experience some kind of abuse and complain, it might never be properly addressed. We would really chime on the enforcement of the regulatory environment; we would try to raise the floor rather than chase specific threats and harms with the legislation.

Mrs Miller

Q Can I bring Lorna in here? We are talking about moving from content to the drivers of harm. Where would you suggest that should be achieved within the Bill?

Professor Lorna Woods: I think by an overarching risk assessment rather than one that is broken down into the different types of content, because that, in a way, assumes a certain knowledge of the type of content before you can do a risk assessment, so you are into a certain circular mode there. Rather than prejudging types of content, I think it would be more helpful to look at what is there and what the system is doing. Then we could look at what a proportionate response would be—looking, as people have said, at the design and the features. Rather than waiting for content to be created and then trying to deal with it, we could look at more friction at an earlier stage.

If I may add a technical point, I think there is a gap relating to search engines. The draft Bill excluded paid-for content advertising. It seems that, for user-to-user content, this is now in the Bill, bringing it more into line with the current standards for children under the video-sharing platform provisions. That does not apply to search. Search engines have duties only in relation to search content, and search content excludes advertising. That means, as I read it, that search engines would have absolutely no duties to children under their child safety duty in relation to advertising content. You could, for example, target a child with pornography and it would fall outside the regime. I think that is a bit of a gap.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you, witnesses, for your time this morning. I am going to focus initially on journalistic content. Is it fair that the platforms themselves are having to try to define what journalistic content is and, by default, what a journalist is? Do you see a way around this?

William Moy: No, no, yes. First, no, it is not fair to put that all on the platforms, particularly because—I think this a crucial thing for the Committee across the Bill as a whole—for anything to be done at internet scale, it has to be able to be done by dumb robots. Whatever the internet companies tell you about the abilities of their technology, it is not magic, and it is highly error-prone. For this duty to be meaningful, it has to be essentially exercised in machine learning. That is really important to bear in mind. Therefore, being clear about what it is going to tackle in a way that can be operationalised is important.

To your second point, it is really important in this day and age to question whether journalistic content and journalists equate to one another. I think this has come up in a previous session. Nowadays, journalism, or what we used to think of as journalism, is done by all kinds of people. That includes the same function of scrutiny and informing others and so on. It is that function that we care about—the passing of information between people in a democracy. We need to protect that public interest function. I think it is really important to get at that. I am sure there are better ways of protecting the public interest in this Bill by targeted protections or specifically protecting freedom of expression in specific ways, rather than these very broad, vague and general duties.

Kim Leadbeater

Q Is there a body that sets out a framework around journalistic standards that the Bill could possibly refer to?

William Moy: No.

William Perrin: At Carnegie, in our earliest work on this in 2018, we were very clear that this Bill should not be a route to regulating the press and media beyond what the social settlement was. Many people are grumpy about that settlement, and many people are happy with it, but it is a classic system in tension. We welcome the Government’s attempt to carve journalism out one way or another, but there is still a great problem in defining journalists and journalism.

I think some of the issues around news provider organisations do give a sense in the Bill of a heavy-duty organisation, not some fly-by-night thing that has been set up to evade the rules. As Will was pointing out, the issue then comes down to individual journalists, who are plying their trade in new ways that the new media allows them to do. I remember many years ago, when I ran a media business, having a surreal meeting at DCMS during Leveson, where I had to explain to them what a blogger was. Sadly, we have not quite yet got that precision of how one achieves the intended effect around, in particular, individual journalists.

Professor Lorna Woods: I emphasise what Mr Moy said about the fact that this is going to have to be a system. It is not a decision on every individual item of content, and it is not about a decision on individual speakers. It is going to be about how the characteristics that we care about—the function of journalism—are recognised in an automated system.

On the drafting of the Bill, I wonder whether there is any overlap between the user-generated content and citizen journalism in clause 16 and the recognition in clause 15 of user-generated content in relation to democratic speech. I am not sure whether one is not a subset of the other.

Kim Leadbeater

Q What would you change about clauses 15 and 16? Is there an argument that they should not be there at all?

Professor Lorna Woods: I have to confess that I have not really looked at them in great detail, although I have read them. I do not think they work, but I have not got to a solution because that is actually quite a difficult thing to define.

William Moy: I should declare an interest in clause 15 and the news publisher content exemption, because Full Fact would be covered by that exemption. I do not welcome that; I find it very awkward that we could be fact-checking things and some of the people we are fact-checking would not be covered by the exemption.

It is regrettable that we are asking for those exemptions in the Bill. The Bill should protect freedom of expression for everyone. Given the political reality of that clause, it does not do the job that it tries to do. The reason why is essentially because you can set yourself up to pass the test in that clause very easily. The Minister asked about that in a previous session and recognised that there is probably room to tighten the drafting, and I am very happy to work with his officials and talk about how, if that is Parliament’s political intention, we can do it in as practical a way as possible.

Kim Leadbeater

Q Thank you. How could the Bill protect people who are involved in elections, be they parliamentary candidates, people standing in local elections, staff, or election officers? Could that be worked on, and where would it go in the Bill?

William Perrin: The Bill is a risk-management regime. As part of a risk-management regime, one should routinely identify people who are at high risk and high-risk events, where they intersect and how you assess and mitigate that risk. As someone who was a civil servant for 15 years and has worked in public policy since, I hugely respect the functioning of the election process. At the very extreme end, we have seen hideous events occur in recent years, but there is also the routine abuse of politicians and, to some extent, an attempt to terrorise women politicians off certain platforms, which has been quite grotesque.

I feel that there is a space, within the spirit of the Bill as a risk-management regime, to draw out the particular risks faced by people who participate in elections. They are not just candidates and office holders, as you say, but the staff who administer elections—we saw the terrible abuse heaped on them in recent American elections; let us hope that that does not come across here—and possibly even journalists, who do the difficult job of reporting on elections, which is a fundamental part of democracy.

The best way to address those issues might be to require Ofcom to produce a straightforward code of practice—particularly for large, category 1 platforms—so that platforms regard elections and the people who take part in them as high-risk events and high-harm individuals, and take appropriate steps. One appropriate step would be to do a forward look at what the risks might be and when they might arise. Every year, the BBC produces an elections forward look to help it manage the particular risks of public service broadcasting around elections. Could a platform be asked to produce and publish an elections forward look, discussing with people who take part in elections their experience of the risks that they face and how best to mitigate them in a risk-management regime? That could also involve the National Police Chiefs’ Council, which already produces guidance at each election.

We are sitting here having this discussion in a highly fortified, bomb-proof building surrounded by heavily armed police. I do not think any member of the public would begrudge Members of Parliament and the people who come here that sort of protection. We sometimes hear the argument that MPs should not be recognised as special or get special protection. I do not buy that; no one begrudges the security here. It is a simple step to ask platforms to do a risk assessment that involves potential victims of harm, and to publish it and have a dialogue with those who take part, to ensure that the platforms are safe places for democratic discussion.

Kim Leadbeater

Q Thank you. Just to finish, you are right that the point people have made is, “Why should MPs or elected officials be any different from anybody else?” I understand that. What worries me, from some of the work I have done, is that this is about not just the safety of human beings but the impact on democracy. Threatening and abusive behaviour directed at elected politicians can affect the way they feel about doing their job, and that worries me. Do you think it should be a specific stand-alone offence to send harmful or threatening communications to elected people—MPs, councillors, Mayors or police and crime commissioners? Do you think that warrants a separate, stand-alone offence?

William Perrin: The Government have, to their credit, introduced in this Bill offences of sending messages with the intent to harm, but it will take many years for them to work their way through CPS guidance and to establish a body of case law so that it is understood how they are applied. Of course, these cases are heard in magistrates’ courts, so they do not get reported very well.

One of the reasons we are here discussing this is that the criminal law has failed to provide adequate measures of public protection across social media. If the criminal law and the operation of the police and the CPS worked, we would not need to have this discussion. This discussion is about a civil regulatory regime to make up for the inadequacies in the working of the criminal law, and about making it work a little smoother. We see that in many areas of regulated activity. I would rather get a quicker start by doing some risk assessment and risk mitigation before, in many years’ time, one gets to an effective operational criminal offence. I note that the Government suggested such an offence a few years ago, but I am not quite clear where it got to.

William Moy: To echo Ms Leadbeater’s call for a holistic approach to this, treating as criminal some of the abuse that MPs receive is entirely appropriate. The cost to all of us of women and people of colour being deterred from public life is real and serious. There is also the point that the Bill deals only with personal harms, and a lot of the risk to elections is risk to the democratic system as a whole. You are absolutely right to highlight that that is a gap in what the Bill is doing. We think, certainly from a misinformation point of view, that you cannot adequately address the predictable misinformation and disinformation campaigns around elections simply by focusing on personal harm.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Thank you to the witnesses for joining us and giving us such thorough and clear responses to the various questions. I want to start on a topic that William Perrin and William Moy touched on—the exemption for recognised news publishers, set out in clause 50. You both said you have some views on how that is drafted. As you said, I asked questions on Tuesday about whether there are ways in which it could be improved to avoid loopholes—not that I am suggesting there are any, by the way. Mr Perrin and Mr Moy, could you elaborate on the specific areas where you think it might be improved?

William Moy: Essentially, the tests are such that almost anyone could pass them. Without opening the Bill, you have to have a standards code, which you can make up for yourself, a registered office in the UK and so on. It is not very difficult for a deliberate disinformation actor to pass the set of tests in clause 50 as they currently stand.

Chris Philp

Q How would you change it to address that, if you think it is an issue?

William Moy: This would need a discussion. I have not come here with a draft amendment—frankly, that is the Government’s job. There are two areas of policy thinking over the last 10 years that provide the right seeds and the right material to go into. One is the line of thinking that has been done about public benefit journalism, which has been taken up in the House of Lords Communications and Digital Committee inquiry and the Cairncross review, and is now reflected in recent Charity Commission decisions. Part of Full Fact’s charitable remit is as a publisher of public interest journalism, which is a relatively new innovation, reflecting the Cairncross review. If you take that line of thinking, there might be some useful criteria in there that could be reflected in this clause.

I hate to mention the L-word in this context, but the other line of thinking is the criteria developed in the context of the Leveson inquiry for what makes a sensible level of self-regulation for a media organisation. Although I recognise that that is a past thing, there are still useful criteria in that line of thinking, which would be worth thinking about in this context. As I said, I would be happy to sit down, as a publisher of journalism, with your officials and industry representatives to work out a viable way of achieving your political objectives as effectively as possible.

William Perrin: Such a definition, of course, must satisfy those who are in the industry, so I would say that these definitions need to be firmly industry-led, not simply by the big beasts—for whom we are grateful, every day, for their incredibly incisive journalism—but by this whole spectrum of new types of news providers that are emerging. I have mentioned my experience many years ago of explaining what a blog was to DCMS.

The news industry is changing massively. I should declare an interest: I was involved in some of the work on public-benefit journalism in another capacity. We have national broadcasters, national newspapers, local papers, local broadcasters, local bloggers and local Twitter feeds, all of which form a new and exciting news media ecosystem, and this code needs to work for all of them. I suppose that you would need a very deep-dive exercise with those practitioners to ensure that they fit within this code, so that you achieve your policy objective.

Chris Philp

Q Okay, thank you. I am not sure that I can take anything specific away from that. Perhaps that illustrates the difficulty of legislating. The clause, as drafted, obviously represents the best efforts, thus far, to deal with an obviously difficult and complicated issue.

We heard some commentary earlier—I think from Mr Moy—about the need to address misinformation, particularly in the context of a serious situation such as the recent pandemic. I think you were saying that there was a meeting, in March or April 2020, for the then Secretary of State and social media firms to discuss the issue and what steps they might take to deal with it. You said that it was a private meeting and that it should perhaps have happened more transparently.

Do you accept that the powers conferred in clause 146, as drafted, do, in fact, address that issue? They give the Secretary of State powers, in emergency situations—a public health situation or a national security situation, as set out in clause 146(1)—to address precisely that issue of misinformation in an emergency context. Under that clause, it would happen in a way that was statutory, open and transparent. In that context, is it not a very welcome clause?

William Moy: I am sorry to disappoint you, Minister, but no, I do not accept that. The clause basically attaches to Ofcom’s fairly weak media literacy duties, which, as we have already discussed, need to be modernised and made harms-based and safety-based.

However, more to the point, the point that I was trying to make is that we have normalised a level of censorship that was unimaginable in previous generations. A significant part of the pandemic response was, essentially, some of the main information platforms in all of our day-to-day lives taking down content in vast numbers and restricting what we can all see and share. We have started to treat that as a normal part of our lives, and, as someone who believes that the best way to inform debate in an open society is freedom of expression, which I know you believe, too, Minister, I am deeply concerned that we have normalised that. In fact, you referred to it in your Times article.

I think that the Bill needs to step in and prevent that kind of overreach, as well as the triggering of unneeded reactions. In the pandemic, the political pressure was all on taking down harmful health content; there was no countervailing pressure to ensure that the systems did not overreach. We therefore found ridiculous examples, such as police posts warning of fraud around covid being taken down by the internet companies’ automated systems because those systems were set to, essentially, not worry about overreach.

That is why we are saying that we need, in the Bill, a modern, open-society approach to misinformation. That starts with it recognising misinformation in the first place. That is vital, of course. It should then go on to create a modern, harms-based media literacy framework, and to prefer content-neutral and free-speech-based interventions over content-restricting interventions. That was not what was happening during the pandemic, and it is not what will happen by default. It takes Parliament to step in and get away from this habitual, content-restriction reaction and push us into an open-society-based response to misinformation.

William Perrin: Can I just add that it does not say “emergency”? It does not say that at all. It says “reasonable grounds” that “present a threat”—not a big threat—under “special circumstances”. We do not know what any of that means, frankly. With this clause, I get the intent—that it is important for national security, at times, to send messages—but this has not been done in the history of public communication before. If we go back through 50 or 60 years, even 70 years, of Government communication, the Government have bought adverts and put messages transparently in place. Apart from D-notices, the Government have never sought to interfere in the operations of media companies in quite the way that is set out here.

If this clause is to stand, it certainly needs a much higher threshold before the Secretary of State can act—such as who they are receiving advice from. Are they receiving advice from directors of public health, from the National Police Chiefs’ Council or from the national security threat assessment machinery? I should declare an interest; I worked in there a long time ago. It needs a higher threshold and greater clarity, but you could dispense with this by writing to Ofcom and saying, “Ofcom, you should have regard to these ‘special circumstances’. Why don’t you take actions that you might see fit to address them?”

Many circumstances, such as health or safety, are national security issues anyway if they reach a high enough level for intervention, so just boil it all down to national security and be done with it.

Professor Lorna Woods: If I may add something about the treatment of misinformation more generally, I suspect that if it is included in the regime, or if some subset such as health misinformation is included in the regime, it will be under the heading of “harmful to adults”. I am picking up on the point that Mr Moy made that the sorts of interventions will be more about friction and looking at how disinformation is incentivised and spread at an earlier stage, rather than reactive takedown.

Unfortunately, the measures that the Bill currently envisages for “harmful but legal” seem to focus more on the end point of the distribution chain. We are talking about taking down content and restricting access. Clause 13(4) gives the list of measures that a company could employ in relation to priority content harmful to adults.

I suppose that you could say, “Companies are free to take a wider range of actions”, but my question then is this: where does it leave Ofcom, if it is trying to assess compliance with a safety duty, if a company is doing something that is not envisaged by the Act? For example, taking bot networks offline, if that is thought to be a key factor in the spreading of disinformation—I see that Mr Moy is nodding. A rational response might be, “Let’s get rid of bot networks”, but that, as I read it, does not seem to be envisaged by clause 13(4).

I think that is an example of a more general problem. With “harmful but legal”, we would want to see less emphasis on takedown and more emphasis on friction, but the measures listed as envisaged do not go that far up the chain.

The Chair

Minister, we have just got a couple of minutes left, so perhaps this should be your last question.

Chris Philp

Q Yes. On clause 13(4), the actions listed there are quite wide, given that they include not just “taking down the content”—as set out in clause 13(4)(a)—but also

“(b) restricting users’ access to the content;

(c) limiting the recommendation or promotion of the content;

(d) recommending or promoting the content.”

I would suggest that those actions are pretty wide, as drafted.

One of the witnesses—I think it was Mr Moy—talked about what were essentially content-agnostic measures to impede virality, and used the word “friction”. Can you elaborate a little bit on what you mean by that in practical terms?

William Moy: Yes, I will give a couple of quick examples. WhatsApp put a forwarding limit on WhatsApp messages during the pandemic. We knew that WhatsApp was a vector through which misinformation could spread, because forwarding is so easy. They restricted it to, I think, six forwards, and then you were not able to forward the message again. That is an example of friction. Twitter has a note whereby if you go to retweet something but you have not clicked on the link, it says, “Do you want to read the article before you share this?” You can still share it, but it creates that moment of pause for people to make a more informed decision.

Chris Philp

Q Thank you. Would you accept that the level of specificity that you have just outlined there is very difficult, if not impossible, to put in a piece of primary legislation?

William Moy: But that is not what I am suggesting you do. I am suggesting you say that this Parliament prefers interventions that are content-neutral or free speech-based, and that inform users and help them make up their own minds, to interventions that restrict what people can see and share.

Chris Philp

Q But a piece of legislation has to do more than express a preference; it has to create a statutory duty. I am just saying that that is quite challenging in this context.

William Moy: I do not think it is any more challenging than most of the risk assessments, codes of practice and so on, but I am willing to spend as many hours as it takes to talk through it with you.

The Chair

Order. I am afraid that we have come to the end of our allotted time for questions. On behalf of the Committee, I thank the witnesses for all their evidence.

Examination of Witnesses

Danny Stone MBE, Stephen Kinsella OBE and Liron Velleman gave evidence.

12:15
The Chair

We will now hear from Danny Stone, chief executive of the Antisemitism Policy Trust; Stephen Kinsella, founder of Clean up the Internet; and Liron Velleman, political organiser at HOPE not hate. We have until 1 pm for this panel.

Alex Davies-Jones

Q Good morning, witnesses. Thank you for joining us today. Does the Bill give Ofcom discretion to regulate on the smaller but high-risk platforms?

Danny Stone: First, thank you for having me today. We have made various representations about the problems that we think there are with small, high-harm platforms. The Bill creates various categories, and the toughest risk mitigation is on the larger services. They are defined by their size and functionality. Of course, if I am determined to create a platform that will spread harm, I may look at the size threshold that is set and make a platform that falls just below it, in order to spread harm.

It is probably important to set out what this looks like. The Community Security Trust, which is an excellent organisation that researches antisemitism and produces incident figures, released a report called “Hate Fuel” in June 2020. It looked at the various small platforms and highlighted that, in the wake of the Pittsburgh antisemitic murders, there had been 26 threads, I think, with explicit calls for Jews to be killed. One month prior to that, in May 2020, a man called Payton Gendron found footage of the Christchurch attacks. Among this was legal but harmful content, which included the “great replacement” theory, GIFs and memes, and he went on a two-year journey of incitement. A week or so ago, he targeted and killed 10 people in Buffalo. One of the things that he posted was:

“Every Time I think maybe I shouldn’t commit to an attack I spend 5 min of /pol/”—

which is a thread on the small 4chan platform—

“then my motivation returns”.

That is the kind of material that we are seeing: legal but harmful material that is inspiring people to go out and create real-world harm. At the moment, the small platforms do not have that additional regulatory burden. These are public-facing message boards, and this is freely available content that is promoted to users. The risks of engaging with such content are highest. There is no real obligation, and there are no consequences. It is the most available extremism, and it is the least regulated in respect of the Bill. I know that Members have raised this issue and the Minister has indicated that the Government are looking at it, but I would urge that something is done to ensure that it is properly captured in the Bill, because the consequences are too high if it is not.

Alex Davies-Jones

Q Thanks, Danny. So in your opinion, you would rather see a risk-based approach, as opposed to size and functionality.

Danny Stone: I think there are various options. Either you go for a risk-based approach—categorisation—or you could potentially amend it so that it is not just size and functionality. You would take into account other things—for example, characteristics are already defined in the Bill, and that might be an option for doing it.

Alex Davies-Jones

Q Does anybody else want to come in on small platforms? Liron?

Liron Velleman: From the perspective of HOPE not hate, most of our work targeting and looking at far-right groups is spent on some of those smaller platforms. I think that the original intention of the Bill, when it was first written, may have been a more sensible way of looking at the social media ecospace: larger platforms could host some of this content, while other platforms were just functionally not ready to host large, international far-right groups. That has changed radically, especially during the pandemic.

Now, there are so many smaller platforms—whether small means hundreds of thousands, tens of thousands or even smaller than that—that are almost as easy to use as some of the larger platforms we all know so well. Some of the content on those smaller platforms is definitely the most extreme. There are mechanisms utilised by the far-right—not just in the UK, but around the world—to move that content and move people from some of the larger platforms, where they can recruit, on to the smaller platforms. To have a situation in which that harmful content is not looked at as stringently as content on the larger platforms is a miscategorisation of the internet.

Alex Davies-Jones

Q One of our concerns with the Bill, which we raised with the regulator, Ofcom, in Tuesday’s evidence session, is what would happen in the interim if one of those smaller categorised platforms was to grow substantially and then need to be recategorised. Our concern is about what would happen in the interim, during the recategorisation process, while that platform was allowed to disseminate harmful content. What would you like to see happen as an interim measure during recategorisation, if that provision remained in the Bill?

Liron Velleman: We have seen this similarly with the proscription of far-right terrorist groups in other legislation. It was originally quite easy to say that, eventually, the Government would proscribe National Action as a far-right terror group. What has happened since is that aliases and very similar organisations are set up, and it then takes months or sometimes years for the Government to be able to proscribe those organisations. We have to spend our time making the case as to why those groups should be banned.

We can foresee a similar circumstance here. We turn around and say, “Here is BitChute” or hundreds of other platforms that should be banned. We spend six months saying to the Government that it needs to be banned. Eventually, it is, but then almost immediately an offshoot starts. We think that Ofcom should have delegated power to make sure that it is able to bring those platforms into category 1 almost immediately, if the categorisations stay as they are.

Danny Stone: It could serve a notice and ensure that platforms prepare for that. There will, understandably, be a number of small platforms that are wary and do not want to be brought into that category, but some of them will need to be brought in because of the risk of harm. Let us be clear: a lot of this content may well—probably will—stay on the platform, but, at the very least, they will be forced to risk assess for it. They will be forced to apply their terms and conditions consistently. It is a step better than what they will be doing without it. Serving a notice to try to bring them into that regime as quickly as possible and ensure that they are preparing measures to comply with category 1 obligations would be helpful.

Alex Davies-Jones

Q Thank you. The Antisemitism Policy Trust has made the case that search services should be eligible for inclusion as a high-risk category. Is that still your position? What is the danger, currently, of excluding them from that provision?

Danny Stone: Very much so. You heard earlier about the problems with advertising. I recognise that search services are not the same as user-to-user services, so there does need to be some different thinking. However, at present, they are not required to address legal harms, and the harms are there.

I appeared before the Joint Committee on the draft Bill and talked about Microsoft Bing, which, in its search bar, was prompting people with “Jews are” and then a rude word. You look at “Gays are”, today, and it is prompting people with “Gays are using windmills to waft homosexual mists into your home”. That is from the search bar. The first return is a harmful article. Do the same in Google, for what it’s worth, and you get “10 anti-gay myths debunked.” They have seen this stuff. I have talked to them about it. They are not doing the work to try to address it.

Last night, using Amazon Alexa, I searched “Is George Soros evil?” and the response was “Yes, he is. According to an Alexa Answers contributor, every corrupt political event.” “Are the White Helmets fake?” “Yes, they are set up by an ex-intelligence officer.” The problem with that is that the search prompts—the things that you are being directed to; the systems here—are problematic, because one person could give an answer to Amazon and that prompts the response. The second one, about the White Helmets, was a comment on a website that led Alexa to give that answer.

Search returns are not necessarily covered because, as I say, they are not the responsibility of the internet companies, but the systems that they design as to how those things are indexed and the systems to prevent them going to harmful sites by default are their responsibility, and at present the Bill does not address that. Something that forces those search companies to have appropriate risk assessments in place for the priority harms that Parliament sets, and to enforce those terms and conditions consistently, would be very wise.

Kim Leadbeater

Q Thank you to the witnesses for joining us today. The Bill contains duties to protect content of “democratic importance” and “journalistic content”. What is your view of these measures and their likely effectiveness?

Liron Velleman: These are both pretty dangerous clauses. We are very concerned about what I would probably be kind and call their unintended consequences. They are loopholes that could allow some of the most harmful and hateful actors to spread harm on social media. I will take “journalistic” first and then move on to “democratic”.

A number of companies mentioned in the previous evidence session are outlets that could be media publications just by adding a complaints system to their website. There is a far-right outlet called Urban Scoop that is run by Tommy Robinson. They just need to add a complaints system to their website and then they would be included as a journalist. There are a number of citizen journalists who specifically go to our borders to harass people who are seeking refuge in this country. They call themselves journalists; Tommy Robinson himself calls himself a journalist. These people have been specifically taken off platforms because they have repeatedly broken the terms of service of those platforms, and we see this as a potential avenue for them to make the case that they should return.

We also see mainstream publications falling foul of the terms of service of social media companies. If I take the example of the Christchurch massacre, social media companies spent a lot of time trying to take down both the livestream of the attack in New Zealand and the manifesto of the terrorist, but the manifesto was then put on the Daily Mail website—you could download the manifesto straight from the Daily Mail website—and the livestream was on the Daily Mirror and The Sun’s websites. We would be in a situation where social media companies could take that down from anyone else, but they would not be able to take it down from those news media organisations. I do not see why we should allow harmful content to exist on the platform just because it comes from a journalist.

On “democratic”, it is still pretty unclear what the definition of democratic speech is within the Bill. If we take it to be pretty narrow and just talk about elected officials and candidates, we know that far-right organisations that have been de-platformed from social media companies for repeatedly breaking the terms of service—groups such as Britain First and, again, Tommy Robinson—are registered with the Electoral Commission. Britain First ran candidates in the local elections in 2022 and they are running in the Wakefield by-election, so, by any measure, they are potentially of “democratic importance”, but I do not see why they should be allowed to break terms of service just because they happen to have candidates in elections.

If we take it on a wider scale and say that it is anything of “democratic importance”, anyone who is looking to cause harm could say, “A live political issue is hatred of the Muslim community.” Someone could argue that that or the political debate around the trans community in the UK is a live political debate, and that would allow anyone to go on the platform and say, “I’ve got 60 users and I’ve got something to say on this live political issue, and therefore I should be on the platform,” in order to cause that harm. To us, that is unacceptable and should be removed from the Bill. We do not want a two-tier internet where some people have the right to be racist online, so we think those two clauses should be removed.

Stephen Kinsella: At Clean up the Internet this is not our focus, although the proposals we have made, which we have been very pleased to see taken up in the Bill, will certainly introduce friction. We keep coming back to friction being one of the solutions. I am not wearing this hat today, but I am on the board of Hacked Off, and if Hacked Off were here, I think they would say that the solution—although not a perfect solution—might be to say that a journalist, or a journalistic outlet, will be one that has subjected itself to proper press regulation by a recognised press regulator. We could then possibly take quite a lot of this out of the scope of social media regulation and leave it where I think it might belong, with proper, responsible press regulation. That would, though, lead on to a different conversation about whether we have independent press regulation at the moment.

Kim Leadbeater

Q I think someone has alluded to this already, but should the comments section on news publisher platforms be included in the scope of the Bill?

Danny Stone: I feel quite strongly that they should. I think this is about clauses 39(2) and (5). When they had an exemption last time, we were told they were already regulated, because various newspapers have their own systems, because of IPSO or whatever it might be. There was a written question in the House from Emma Hardy, and the Government responded that they had no data—no assessment of moderator system effectiveness or the harms caused. The Secretary of State said to the DCMS Select Committee that he was confident that these platforms have appropriate moderation policies in place, but was deeply sceptical about IPSO involvement. The Law Commission said that it was not going to give legal exemption to comments boards because they host an abundance of harmful material and abuse, and there are articles in, say, The Times:

“Pro-Kremlin trolls have infiltrated the reader comments on the websites of news organisations, including The Times, the Daily Mail and Fox News, as part of a ‘major influence operation’”.

A number of years ago, we worked—through the all-party parliamentary group against antisemitism, to which we provide the secretariat—on a piece with the Society of Editors on comment moderation on websites, so there have been efforts in the past, but this is a place where there is serious harm caused. You can go on The Sun or wherever now and find comments that will potentially be read by millions of people, so having some kind of appropriate risk assessment, minimum standard or quality assurance in respect of comments boards would seem to be a reasonable step. If it does not get into the Bill, I would in any event urge the Minister to develop some guidance or work with the industry to ensure they have some of those standards in place, but ideally, you would want to lose that carve-out in the Bill.

Kim Leadbeater

Q Thank you. Stephen, just to finish—

The Chair

Just a short question.

Kim Leadbeater

Yes, sorry. Is there a body that sets a framework around journalistic standards that the Bill could refer to?

Stephen Kinsella: Obviously, there are the regulators. There is IMPRESS and IPSO, at the very least. I am afraid that I do not know the answer; there must also be journalistic trade bodies, but the regulators would probably be the first port of call for me.

Caroline Ansell (Eastbourne) (Con)

Q May I ask about anonymity? It is mentioned in the Bill, but only once. Do you think there is a need for more expansive coverage of this issue? Do you think people should be able to use the internet while remaining anonymous, and if not, to whom would users disclose their identity? Would it be to the platform, or would it be more publicly than that?

Stephen Kinsella: There are a few questions there, obviously. I should say that we are happy with the approach in the Bill. We always felt that focusing on anonymity was the wrong place to start. Instead, we thought that a positive right to be verified, and then a right to screen out replies and posts from unverified accounts, was the way to go.

In terms of who one should make the disclosure to, or who would provide the verification, our concern was always that we did not want to provide another trove of data that the platforms could use to target us with adverts and otherwise monetise. While we have tried to be agnostic on the solution—again, we welcome the approach in the Bill, which is more about principles and systems than trying to pick outcomes—there are third-party providers out there that could provide one-stop verification. Some of them, for instance, rely on the open banking principles. The good thing about the banks is that under law, under the payment services directive and others, we are the owners of our own data. It is a much greyer area whether we are the owners of the data that the social media platforms hold on us, so using that data that the banks have—there is a solution called One ID, for instance—they will provide verification, and you could then use that to open your social media accounts without having to give that data to the platforms.

I saw in the evidence given to you on Tuesday that it was claimed that 80% of users are reluctant to give their data to platforms. We were surprised by that, and so we looked at it. They chose their words carefully. They said users were reluctant to give their data to “certain websites”. What they meant was porn sites. In the polling they were referring to, the question was specifically about willingness to share data with porn sites, and people are, understandably, reluctant to do that. When using open banking or other systems, there are good third-party providers, I would suggest, for verification.

Caroline Ansell

Q May I ask a quick supplementary about positive verification, before others contribute? A contributor to a previous session said there was a reluctance—a genuinely held reluctance—by some to be verified. In that way, it suppressed democratic engagement. Do you recognise that as an issue or a fault line in the verification argument?

Stephen Kinsella: Very much not. We have conducted polling using YouGov. Compassion in Politics did polling using Opinium. The figures vary slightly, but at a minimum, two in three citizens—often four out of five citizens—are very willing to be verified and would like the opportunity to be verified if it meant that they could then screen out replies from unverified accounts. I would say there is a weight of evidence on this from the polling. By the way, we would be very happy to conduct further polling, and we would be very happy to consult with the Committee on the wording of the questions that should be put, if that would be helpful, but I think we are quite confident what the response would be.

Liron Velleman: We set two clear tests for the situation on anonymity on platforms. First, will it harm the ability of some groups in society to have freedom of speech online? We are concerned that verification could harm the ability of LGBT people and domestic abuse survivors to use the platforms in the full ways they wish to. For example, if a constituent who is, say, a domestic abuse survivor or LGBT, wished to get in touch with you but was not verified on the platform, it would be one restriction that you would not be able to get around if you chose to change your settings.

Caroline Ansell

Q Would that be an argument for their identity verification being at platform level, rather than any wider public identity?

Liron Velleman: That could be very possible. One of our key questions is whether verification would mean that you had to use your real name on the platform or whether you had to verify that you were a person who was using a platform, but could then use a pseudonym on the front face of the website. I could sign up and say, “Here is my ID for the platform verification”, but if I did not wish to use my name, in order to protect my actual identity publicly on the platform, I could choose not to but still be verified as a real person. It would be different to having to have my name, Liron Velleman, as the user for Facebook or Twitter or any other platform.

The second test for us is whether it is going to make a real difference to reducing online harm. With a lot of the harm we see, people are very happy to put their names to the racism, misogyny and sexism and homophobia that they put online. We would not want to see a huge focus on anonymity, whereby we “ended” anonymity online, and yet online harm continued to propagate. We believe it would still continue, and we would not want people to be disappointed that that had not completely solved the issue. Of course, there are a huge number of anonymous accounts online that carry out abuse. Anything we can do to reduce that is welcome, but we do not see it as the silver bullet that could end racism online.

Stephen Kinsella: Obviously, we have not suggested that there is a silver bullet. We are talking about responding to what users want. A lot of users want the ability to say that they do not want to interact with people who are not using their real name. That does not mean that one could not envisage other levels of filter. You could have a different filter that said, “I am happy to interact with people who are verified to be real, but I don’t require that they have given their name”. The technology exists there, certainly to provide a menu of solutions. If you could only have one, we happen to think ours is the best, and that the evidence shows it would reduce a significant amount of disinformation spread and, certainly, abuse.

Danny Stone: I think one issue will be Ofcom’s ability to ensure consistency in policing. It is very difficult, actually, to find out where crimes have happened and who an individual is. Sometimes, the police have the power to compel the revelation of identity. The way the platforms respond is, I think, patchy, so Ofcom’s position in its guidance here will be pretty important.

The Chair

Thank you. We have time for a question from Navendu Mishra before we bring the Minister in.

Navendu Mishra (Stockport) (Lab)

Q Mr Stone, do you think that the Bill gives sufficient protection to groups who suffer disproportionate abuse online because of protected characteristics? Do you think that those protections should be clarified in the Bill?

Danny Stone: If we are talking about the “legal but harmful” provisions, I would reflect what the witnesses from the Carnegie Trust—who are brilliant—were saying earlier. There is a principle that has been established in the Bill to list priority illegal harms, and there is no reason why priority harms against adults should not be listed. Racism and misogyny are not going anywhere. The Joint Committee suggested leaning into existing legislation, and I think that is a good principle. The Equality Act established protected characteristics, so I think that is a start—it is a good guide. I think there could be further reference to the Equality Act in the Bill, including in relation to anonymity and other areas.

Navendu Mishra

Q So it could be clarified?

Danny Stone: Yes.

The Chair

Would any other witness like to contribute? No.

Chris Philp

Q Thank you again to the witnesses for joining us this morning. I will start with Stephen Kinsella. You have spoken already about some of the issues to do with anonymity. Can you share with the Committee your view on the amendments made to the Bill, when it was introduced a couple of months ago, to give users choices over self-verification and the content they see? Do you think they are useful and helpful updates to the Bill?

Stephen Kinsella: Yes. We think they are extremely helpful. We welcome what we see in clause 14 and clause 57. There is thus a very clear right to be verified, and an ability to screen out interactions with unverified accounts, which is precisely what we asked for. The Committee will be aware that we have put forward some further proposals. I would really hesitate to describe them as amendments; I see them as shading-in areas—we are not trying to add anything. We think that it would be helpful, for instance, when someone is entitled to be verified, that verification status should also be visible to other users. We think that should be implicit, because it is meant to act as a signal to others as to whether someone is verified. We hope that would be visible, and we have suggested the addition of just a few words into clause 14 on that.

We think that the Bill would benefit from a further definition of what it means by “user identity verification”. We have put forward a proposal on that. It is such an important term that I think it would be helpful to have it as a defined term in clause 189. Finally, we have suggested a little bit more precision on the things that Ofcom should take into account when dealing with platforms. I have been a regulatory lawyer for nearly 40 years, and I know that regulators often benefit from having that sort of clarity. There is going to be negotiation between Ofcom and the platforms. If Ofcom can refer to a more detailed list of the factors it is supposed to take into account, I think that will speed the process up.

One of the reasons we particularly welcomed the structure of the Bill is that there is no wait for detailed codes of conduct, because these are duties that will take effect immediately. I hope Ofcom is working on the guidance already, but the guidance could come out pretty quickly. Then there would be the process of—maybe negotiating is the wrong word—to-and-fro with the platforms. I would be very reluctant to take too much on trust. I do not mean on trust from the Government; I mean on trust from the platforms—I saw the Minister look up quickly then. We have confidence in Government; it is the platforms we are a little bit wary of. I heard the frustration expressed on Tuesday.

Alex Davies-Jones

indicated assent.

Stephen Kinsella: I think you said, “If platforms care about the users, why aren’t they already implementing this?” Another Member, who is not here today, said, “Why do they have to be brought kicking and screaming?” Yet, every time platforms were asked, we heard them say, “We will have to wait until we see the detail of—”, and then they would fill in whatever thing is likely to come last in the process. So we welcome the approach. Our suggestions are very modest and we are very happy to discuss them with you.

Chris Philp

Q Yes, and thank you for the work that you have done on this issue, together with Siobhan Baillie, my hon. Friend the Member for Stroud, which the Government adopted. Some of the areas that you have referred to could be dealt with in subsequent Ofcom codes of practice, but we are certainly happy to look at your submissions. Thank you for the work that you have done in this area.

Danny, we have had some fairly extensive discussions on the question of small but toxic platforms such as 4chan and BitChute—thank you for coming to the Department to discuss them. I heard your earlier response to the shadow Minister, but do you accept that those platforms should be subject to duties in the Bill in relation to content that is illegal and content that is already harmful to children?

Danny Stone: Yes, that is accurate. My position has always been that that is a good thing. The extent and the nature of the content that is harmful to adults on such platforms—you mentioned BitChute but there are plenty of others—require an additional level of regulatory burden and closer proximity to the regulator. Those platforms should have to account for it and say, “We are the platforms; we are happy that this harm is on our platform and”—as the Bill says—“we are promoting it.” You are right that it is captured to some degree; I think it could be captured further.

Chris Philp

Q I understand; thank you. Liron, in an earlier answer, you referred to the protections for content of democratic importance and journalistic content, which are set out in clauses 15 and 16. You were concerned that they could act as a bar to hateful, prohibited or even illegal speech being properly enforced against. Do you accept that clauses 15 and 16 do not provide an absolute protection for content of democratic importance or journalistic content, and that they do not exempt such content from the Bill’s provisions? They simply say that in discharging duties under the Bill, operators must use

“proportionate systems and processes…to ensure that…content of democratic”—

or journalistic—

“importance is taken into account”.

That is not an absolute protection; it is simply a requirement to take into account and perform a proportionate and reasonable balancing exercise. Is that not reasonable?

Liron Velleman: I have a couple of things to say on that. First, we and others in civil society have spent a decade trying to de-platform some of the most harmful actors from mainstream social media companies. What we do not want to see after the Bill becomes an Act are massive test cases where we do not know which way they will go and where it will be up to either the courts or social media companies to make their own decisions on how much regard they place in those exemptions at the same time as all the other clauses.

Secondly, one of our main concerns is the time it takes for some of that content to be removed. If we have a situation in which there is an expedited process for complaints to be made, and for journalistic content to remain on the platform for an announced time until the platform is able to take it down, that could move far outside the realms of that journalistic or democratically important content. Again, using the earlier examples, it does not take long for content such as a livestream of a terrorist attack to be up on the Sun or the Daily Mirror websites and for lots of people to modify that video and bypass content, which can then be shared and used to recruit new terrorists and allow copycat attacks to happen, and can go into the worst sewers of the internet. Any friction that is placed on platforms’ ability to take down some of that harm is definitely of particular concern to us.

Finally, as we heard on Tuesday, social media platforms—I am not sure I would agree with much of what they would say about the Bill, but I think this is true—do not really understand what they are meant to do with these clauses. Some of them are talking about flowcharts and whether this is a point-scoring system that says, “You get plus one for being a journalist, but minus two for being a racist.” I am not entirely sure that platforms will exercise the same level of regard. If, with some of the better-faith actors in the social media space, we have successfully taken down huge reams of the most harmful content and moved it away from where millions of people can see it to where only tens of thousands can see it, we do not want to open up, in any way, the risk that hundreds of people could argue that they should be back on platforms when they are currently not there.

Chris Philp

Q Okay, thank you. My last question touches on those issues and is for each of the panel in turn. Some people have claimed—I think wrongly—that the provisions in the Bill in some way threaten free speech. As you will have seen in the article I wrote in The Times earlier this week, I do not think, for a number of reasons, that that is remotely true, but I would be interested in hearing the views of each of the panel members on whether there is any risk to freedom of speech in the work that the Bill does in terms of protecting people from illegal content, harm to children and content that is potentially harmful to adults.

Danny Stone: My take on this—I think people have misunderstood the Bill—is that it ultimately creates a regulated marketplace of harm. As a user, you get to determine how harmful a platform you wish to engage with—that is ultimately what it does. I do not think that it enforces content take-downs, except in relation to illegal material. It is about systems, and in some places, as you have heard today, it should be more about systems, introducing friction, risk-assessing and showing the extent to which harm is served up to people. That has its problems.

The only other thing on free speech is that we sometimes take too narrow a view of it. People are crowded out of spaces, particularly minority groups. If I, as a Jewish person, want to go on 4chan, it is highly unlikely that I will get a fair hearing there. I will be threatened or bullied out of that space. Free speech has to apply across the piece; it is not limited. We need to think about those overlapping harms when it comes to human rights—not just free speech but freedom from discrimination. We need to be thinking about free speech in its widest context.

Chris Philp

Q Thank you. You made a very important point: there is nothing in the Bill that requires censorship or prohibition of content that is legal and harmless to children. That is a really important point.

Stephen Kinsella: I agree entirely with what Danny was saying. Of course, we would say that our proposals have no implications for free speech. What we are talking about is the freedom not to be shouted at—that is really what we are introducing.

On disinformation, we did some research in the early days of our campaign that showed that a vast amount of the misinformation and disinformation around the 5G covid conspiracy was spread and amplified by anonymous or unverified accounts, so they play a disproportionate role in disseminating that. They also play a disproportionate role in disseminating abuse, and I think you may have a separate session with Kick It Out and the other football bodies. They have some very good research that shows the extent to which abusive language is from unverified or anonymous accounts. So, no, we do not have any free speech concerns at Clean up the Internet.

Chris Philp

Q Good. Thank you, Stephen. Liron?

Liron Velleman: We are satisfied that the Bill adequately protects freedom of speech. Our key view is that, if people are worried that it does not, beefing up the universal protections for freedom of speech should be the priority, instead of what we believe are potentially harmful exemptions in the Bill. We think that freedom of speech for all should be protected, and we very much agree with what Danny said—that the Bill should be about enhancing freedom of speech. There are so many communities that do not use social media platforms because of the harm that exists currently on platforms.

On children, the Bill should not be about limiting freedom of speech, but a large amount of our work covers the growth of youth radicalisation, particularly in the far right, which exists primarily online and which can then lead to offline consequences. You just have to look at the number of arrests of teenagers for far-right terrorism, and so much of that comes from the internet. Part of the Bill is about moderating online content, but it definitely serves to protect against some of the offline consequences of what exists on the platform. We would hope that if people are looking to strengthen freedom of speech, it is as a universal principle in the Bill, not one that applies to some groups but not others.

Chris Philp

Good. Thank you. I hope the Committee is reassured by those comments on the freedom of speech question.

Alex Davies-Jones

Q I will use the small amount of time we have left to ask one question. A number of other stakeholders and witnesses have expressed concerns regarding the removal of a digital media literacy strategy from the Bill. What role do you see a digital media literacy strategy playing in preventing the kind of abuse that you have been describing?

Danny Stone: I think that a media literacy strategy is really important. There is, for example, UCL data on the lack of knowledge of the word “antisemitism”: 68% of nearly 8,000 students were unfamiliar with the term’s meaning. Dr Tom Harrison has discussed cultivating cyber-phronesis—this was also in an article by Nicky Morgan in the “Red Box” column some time ago—which is a method of building practical knowledge over time to make the right decisions when presented with a moral challenge. We are not well geared up as a society—I am looking at my own kids—to educate young people about their interactions, about what it means when they are online in front of that box and about to type something, and about what might be received back. I have talked about some of the harms people might be directed to, even through Alexa, but some kind of wider strategy, which goes beyond what is already there from Ofcom—during the Joint Committee process, the Government said that Ofcom already has its media literacy requirements—and which, as you heard earlier, updates it to make it more fit for purpose for the modern age, would be very appropriate.

Stephen Kinsella: I echo that. We also think that that would be welcome. When we talk about media literacy, we often find ourselves with the platforms throwing all the obligation back on to the users. Frankly, that is one of the reasons why we put forward our proposal, because we think that verification is quite a strong signal. It can tell you quite a lot about how likely it is that what you are seeing or reading is going to be true if someone is willing to put their name to it. Seeing verification is just one contribution. We are really talking about trying to build or rebuild trust online, because that is what is seriously lacking. That is a system and design failure in the way that these platforms have been built and allowed to operate.

Chris Philp

Q The shadow Minister’s question is related to the removal of what was clause 103 in the old draft of the Bill. As she said, that related to media literacy. Does the panel draw any comfort from three facts? First, there is already a media literacy duty on Ofcom under section 11 of the Communications Act 2003—the now deleted clause 103 simply provided clarification on an existing duty. Secondly, last December, after the Joint Committee’s deliberations, but before the updated Bill was published, Ofcom published its own updated approach to online media literacy, which laid out the fact that it was going to expand its media literacy programme beyond what used to be in the former clause 103. Finally, the Government also have their own media literacy strategy, which is being funded and rolled out. Do those three things—including, critically, Ofcom’s own updated guidance last December—give the panel comfort and confidence that media literacy is being well addressed?

Liron Velleman: If the Bill is seeking to make the UK the safest place to be on the internet, it seems to be the obvious place to put in something about media literacy. I completely agree with what Danny said earlier: we would also want to specifically ensure—although I am sure this already exists in some other parts of Ofcom and Government business—that there is much greater media literacy for adults as well as children. There are lots of conversations about how children understand use of the internet, but what we have seen, especially during the pandemic, is the proliferation of things like community Facebook groups, which used to be about bins and a fair that is going on this weekend, becoming about the worst excesses of harmful content. People have seen conspiracy theories, and that is where we have seen some of the big changes to how the far-right and other hateful groups operate, in terms of being able to use some of those platforms. That is because of a lack of media literacy not just among children, but among the adult population. I definitely would encourage that being in the Bill, as well as anywhere else, so that we can remove some of those harms.

Danny Stone: I think it will need further funding, beyond what has already been announced. That might put a smile on the faces of some Department for Education officials, who looked so sad during some of the consultation process—trying to ensure that there is proper funding. If you are going to roll this out across the country and make it fit for purpose, it is going to cost a lot of money.

The Chair

Thank you. As there are no further questions from Members, I thank the witnesses for their evidence. That concludes this morning’s sitting.

Ordered, That further consideration be now adjourned. —(Steve Double.)

13:00
Adjourned till this day at Two o’clock.

Online Safety Bill (Fourth sitting)

Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 26 May 2022
The Committee consisted of the following Members:
Chairs: † Sir Roger Gale, Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
Fletcher, Nick (Don Valley) (Con)
Holden, Mr Richard (North West Durham) (Con)
Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Mrs Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
Russell, Dean (Watford) (Con)
Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Witnesses
Stephen Almond, Director of Technology and Innovation, Information Commissioner’s Office
Sanjay Bhandari, Chair, Kick It Out
Eva Hartshorn-Sanders, Head of Policy, Center for Countering Digital Hate
Poppy Wood, UK Director, Reset.tech
Owen Meredith, Chief Executive, News Media Association
Matt Rogerson, Director of Public Policy, Guardian Media Group
Tim Fassam, Director of Government Relations and Policy, Personal Investment Management and Financial Advice Association
Rocio Concha, Director of Policy and Advocacy, Which?
Martin Lewis CBE, Founder, MoneySavingExpert.com
Frances Haugen
Public Bill Committee
Thursday 26 May 2022
(Afternoon)
[Sir Roger Gale in the Chair]
Online Safety Bill
14:00
The Committee deliberated in private.
Examination of Witness
Stephen Almond gave evidence.
14:02
The Chair

Good afternoon, ladies and gentlemen. We are now sitting in public and the proceedings are being broadcast. Thank you all for joining us.

We will now hear oral evidence from Stephen Almond, the director of technology and innovation in the Information Commissioner’s Office. Mr Almond, thank you for coming. As I have introduced you, I am not going to ask you to introduce yourself, so we can go straight into the questions. I call the shadow Front-Bench spokesman.

Alex Davies-Jones (Pontypridd) (Lab)

Q 224 Thank you for coming to give evidence to us this afternoon, Mr Almond. There has been a lot of debate about the risk end-to-end encrypted platforms pose to online safety. What need is there to mitigate that risk in the Bill?

Stephen Almond: Let me start by saying that the ICO warmly welcomes the Bill and its mission to make the UK the safest place in the world to be online. End-to-end encryption supports the security and privacy of online communication and keeps people safe online, but the same characteristics that create a private space for the public to communicate can also provide a safe harbour for more malicious actors, and there are valid concerns that encrypted channels may be creating spaces where children are at risk.

Our view is that the Bill has the balance right. All services in scope, whether encrypted or not, must assess the level of risk that they present and take proportionate action to address it. Moreover, where Ofcom considers it necessary and proportionate, it will have the power to issue technology notices to regulated services to require them to deal with child sexual abuse and exploitation material. We think this presents a proportionate way of addressing the risk that is present on encrypted channels.

It is worth saying that I would not favour provisions that sought to introduce some form of outright ban on encryption in a generalised way. It is vital that the online safety regime does not seek to trade off one sort of online safety risk for another. Instead, I urge those advancing more fundamentalist positions around privacy or safety to move towards the question of how we can incentivise companies to develop technological innovation that will enable the detection of harmful content without compromising privacy. It is one reason why the ICO has been very pleased to support the Government’s safety tech challenge, which has really sought to incentivise the development of technological innovation in this area. Really what we would like to see is progress in that space.

Alex Davies-Jones

Q On that point around technological advances and enabling people to access the internet, people have raised concerns that tech-savvy children will be able to use VPNs, Tor Browser and other tricks to easily circumvent the measures that will be in the Bill, especially around age verification and user identity. How do you respond to that, and how do you suggest we close those loopholes, if we can?

Stephen Almond: First and foremost, it is incredibly important that the Bill has the appropriate flexibility to enable Ofcom as the online safety regulator to be agile in responding to technological advances and novel threats in this area. I think the question of VPNs is ultimately going to be one that Ofcom and the regulated services themselves are going to have to work around. VPNs play an important role in supporting a variety of different functions, such as the security of communications, but ultimately it is going to be critical to make sure that services are able to carry out their duties. That is going to require some questions to be asked in this area.

Alex Davies-Jones

Q One final question from me. I would like to discuss your thoughts on transparency and how we can make social media companies like Meta be more transparent and open with their data, beyond the measures we currently have in the Bill. For instance, we could create statute to allow academics or researchers in to examine their data. Do you have any thoughts on how this can be incentivised?

Stephen Almond: Transparency is a key foundation of data protection law in and of itself. As the regulator in this space, I would say that there is a significant emphasis within the data protection regime on ensuring that companies are transparent about the processing of personal data that they undertake. We think that that provides proportionate safeguards in this space. I would not recommend an amendment to the Bill on this point, because I would be keen to avoid duplication or an overlap between the regimes, but it is critical; we want companies to be very clear about how people’s personal data is being processed. It is an area that we are going to continue to scrutinise.

Mrs Maria Miller (Basingstoke) (Con)

May I ask a supplementary to that before I come on to my main question?

The Chair

Absolutely.

Mrs Miller

Q Thank you so much for coming along. You spoke in your initial comments to my colleague about encryption. The challenges of encryption around child abuse images have been raised with us previously. How can we balance the need to allow people to have encrypted options, if possible, with the need to ensure that this does not adversely affect organisations such as the Internet Watch Foundation, which does so much good in protecting children and rooting out child abuse imagery?

Stephen Almond: I share your concern about this. To go back to what I was saying before, I think the approach that is set out in the Bill is proportionate and targeted. The granting of, ultimately, backstop powers to Ofcom to issue technology notices and to require services to deal with this horrendous material will have a significant impact. I think this will ensure that the regime operates in a risk-based way, where risks can be identified. There will be the firm expectation on service providers to take action, and that will require them to think about all the potential technological solutions that are available to them, be they content scanning or alternative ways of meeting their safety duties.

Mrs Miller

Q My main question is about child safety, which is a prime objective for the Government in this legislation. Do you feel that the Bill’s definition of “likely to be accessed by children” should be more closely aligned with the one used in the ICO’s age-appropriate design code?

Stephen Almond: The objectives of both the Online Safety Bill and the children’s code are firmly aligned in respect of protecting children online. We have reviewed the definitions and, from our perspective, there are distinctions in the definition that is applied in the Bill and the children’s code, but we find no significant tension between them. My focus at the ICO, working in co-operation with Ofcom, will ultimately be on ensuring that there is clarity for business on how the definitions apply to their services, and that organisations know when they are in scope of the children’s code and what actions they should take.

Mrs Miller

Q Do you think any further aspects of the age-appropriate design code should be incorporated into the Bill?

Stephen Almond: We are not seeking to incorporate further aspects of the code into the Bill. We think it is important that the regimes fit together coherently, but that that is best achieved through regulatory co-operation between the ICO and Ofcom. The incorporation of the children’s code would risk creating some form of regulatory overlap and confusion.

I can give you a strong assurance that we have a good track record of working closely with Ofcom in this area. Last year, the children’s code came into force, and not too long after it, Ofcom’s video-sharing platform regime came into force. We have worked very closely to make sure that those regimes are introduced in a harmonised way and that people understand how they fit together.

Mrs Miller

Q Working closely with Ofcom is really good, but do you think there needs to be a duty to co-operate with Ofcom, or indeed with other regulators—to be specified in the Bill—in case relations become more tense in future?

Stephen Almond: The Bill has, in my view, been designed to work closely alongside data protection law. It supports effective co-operation between us and Ofcom by setting out a series of duties requiring Ofcom to consult the ICO on the development of any codes of practice or formal guidance with an impact on privacy. With that framework in mind, I do not think there is a case to instil further co-operation duties in that way. I hope I can give you confidence that we and Ofcom will be working tirelessly together to promote the safety and privacy of citizens online. It is firmly in our interests and in the interest of society as a whole to do so.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you for joining us, Mr Almond. You stated the aim of making the UK the

“safest place in the world to be online”.

In your view, what needs to be added or taken away from the Bill to achieve that?

Stephen Almond: I am not best placed to comment on the questions of online safety and online harms. You will speak to a variety of different experts who can comment on that point. From my perspective as a digital regulator, one of the most important things will be ensuring that the Bill is responsive to future challenges. The digital world is rapidly evolving, and we cannot necessarily envisage all the developments in technology that will come, or the emergence of new harms. The data protection regime is a principles-based piece of legislation. That gives us a great degree of flexibility and discretion to adapt to novel forms of technology and to provide appropriate guidance as challenges emerge. I really recommend retaining that risk-based, principles-based approach to regulation that is envisaged currently in the Online Safety Bill.

Kim Leadbeater

Q There has been much talk about trying to future-proof the Bill. Is there anything you could recommend that should be in the Bill to try to help with that?

Stephen Almond: Again, I would say that the most important thing I can recommend around this is to retain that flexibility within the Bill. I know that a temptation will emerge to offer prescription, whether for the purpose of giving companies clarity today or for addressing present harms, but it is going to be really important to make sure that there is due flexibility to enable the legislation to be responsive to future harms.

Kim Leadbeater

Q Under clause 40, the Secretary of State can modify codes of practice to reflect public policy. How do you respond to criticism that this provision risks undermining the independence of the regulator?

Stephen Almond: Ultimately, it is for Ofcom to raise any concerns about the impact of the regime on its ability to apply its duties appropriately, independently and with due accountability to Parliament and the public. As a regulator, I would say that it is important to have a proper and proportionate degree of independence, so that businesses and the public can have trust in how regulation is carried out. Ultimately though, it is for Government and Parliament to determine what the right level of independence is.

Kim Leadbeater

Q You have no concerns about that.

Stephen Almond: No.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Mr Almond, welcome to the Committee. Thank you for joining us this afternoon. Can I start with co-operation? You mentioned a moment ago in answer to Maria Miller that co-operation between regulators, particularly in this context the ICO and Ofcom, was going to be very important. Would you describe the co-operative work that is happening already and that you will be undertaking in the future, and comment on the role that the Digital Regulation Cooperation Forum has in facilitating that?

Stephen Almond: Thank you very much. I will start by explaining the Digital Regulation Cooperation Forum. It is a voluntary, not statutory, forum that brings together ourselves, Ofcom, the Competition and Markets Authority and the Financial Conduct Authority—some of the regulators with the greatest interest in digital regulation—to make sure that we have a coherent approach to the regulation of digital services in the interests of the public and indeed the economy.

We are brought together through our common interest. We do not require a series of duties or statutory frameworks to make us co-operate, because the case for co-operation is very, very clear. We will deliver better outcomes by working together and by joining up where our powers align. I think that is what you are seeing in practice in some of the work we have done jointly—for example, around the implementation of the children’s code alongside Ofcom’s implementation of the video-sharing platform regime. A joined-up approach to questions about, for example, how you assure the age of children online is really important. That gives me real confidence in reassuring the Committee that the ICO, Ofcom and other digital regulators will be able to take a very joined-up approach to regulating in the context of the new online safety regime.

Chris Philp

Q Thank you very much. That is extremely helpful. From the perspective of privacy, how satisfied are you that the Bill as constructed gives the appropriate protections to users’ privacy?

Stephen Almond: In our view, the Bill strikes an appropriate balance between privacy and online safety. The duties in the Bill should leave service providers in no doubt that they must comply with data protection law, and that they should guard against unwarranted intrusion of privacy. In my discourse with firms, I am very clear that this is not a trade-off between online safety and privacy: it is both. We are firmly expecting that companies take that forward and work out how they are going to adopt both a “privacy by design” and a “safety by design” approach to the delivery of their services. They must deliver both.

Chris Philp

Q Thank you. My final question is this: do you feel the Bill has been constructed in such a way that it works consistently with the data protection provisions, such as UK GDPR and the Data Protection Act 2018?

Stephen Almond: In brief, yes. We feel that the Bill has been designed to work alongside data protection law, for which we remain the statutory regulator, but with appropriate mechanisms for co-operation with the ICO—so, with this series of consultation duties where codes of practice or guidance that could be issued by Ofcom may have an impact on privacy. We think that is the best way of assuring regulatory coherence in this area.

Chris Philp

That is very helpful. Thank you very much indeed.

The Chair

Mr Almond, we are trying to get a pint into a half-pint pot doing this, so we are rushing a bit. If, when you leave the room, you have an “I wish I’d said that” moment, please feel free to put it in writing to us. We are indebted to you. Thank you very much indeed.

Examination of Witnesses

Sanjay Bhandari and Lynn Perry gave evidence.

14:22
The Chair

Moving, I hope, seamlessly on, we are now going to hear oral evidence from Sanjay Bhandari, who is the chairman of Kick It Out, and—as the Committee agreed this morning—after Tuesday’s technical problems, if we do not have further technical problems, we are going to hear from Lynn Perry from Barnardo’s, again by Zoom. Is Lynn Perry on the line? [Interruption.] Lynn Perry is not on the line. We’ve got pictures; now all we need is Lynn Perry in the pictures.

I am afraid we must start, but if Lynn Perry is able to join, we will be delighted to hear from her. We have Mr Bhandari, so we will press on, because we are very short of time as it is. We hope that Lynn Perry will join us.

Alex Davies-Jones

Q Good afternoon, Mr Bhandari; thank you for joining us. What response have you as a football charity seen from the social media companies to the abuse that has been suffered by our sports players online? We all saw the horrendous abuse that our football heroes suffered during the Euros last year. What has been the reaction of the social media companies when this has been raised? Why has it not been tackled?

Sanjay Bhandari: I think you would have to ask them why it has not been tackled. My perception of their reaction is that it has been a bit like the curate’s egg: it has been good in parts and bad in parts, and maybe like the original meaning of that allegory, it is a polite way of saying something is really terrible.

Before the abuse from the Euros, actually, we convened a football online hate working group with the social media companies. They have made some helpful interventions: when I gave evidence to the Joint Committee, I talked about wanting to have greater friction in the system, and they are certainly starting to do that with things like asking people, “Do you really want to send this?” before they post something. We understand that that is having some impact, but of course, it is against the backdrop of a growing number of trolls online. Also, we have had some experiences where we make a suggestion, around verification for instance, where we are introducing third-party companies to social media companies, and very often the response we get is different between London and California. London will say “maybe”, and California then says “no”. I have no reason to distrust the people we meet locally here, but I do not think they always have the power to actually help and respond. The short answer is that there are certainly good intentions from the people we meet locally and there is some action. However, the reality is that we still see quite a lot of content coming through.

Alex Davies-Jones

Q Thank you for that. The Centre for Countering Digital Hate, which we will hear from later this afternoon, has identified that, as well as the vast majority of abuse being directed at public profiles, it is also done via direct messaging, in private and sometimes on those smaller high-harm platforms. There are concerns raised by others that this would not be covered by the Bill. Do you have any thoughts on that, and what would you like to see?

Sanjay Bhandari: I think we need to work that through. I am sorry that my colleagues from the Premier League and the Football Association could not be here today; I did speak to them earlier this week but unfortunately they have got some clashes. One thing we are talking about is how we tag this new framework to existing content. We have a few hundred complaints that the Premier League investigates, and we have got a few thousand items that are proactively identified by Signify, working with us and the Professional Footballers’ Association. Our intention is to take that data and map it to the new framework and say, “Is this caught? What is caught by the new definition of harm? What is caught by priority illegal content? What is caught by the new communication offences, and what residue in that content might be harmful to adults?” We can then peg that dialogue to real-world content rather than theoretical debate. We know that a lot of complaints we receive are in relation to direct messaging, so we are going to do that exercise. It may take us a little bit of time, but we are going to do that.

The Chair

Lynn Perry is on the line, but we have lost her for the moment. I am afraid we are going to have to press on.

Mrs Miller

Q I want to focus on one particular issue, which is anonymity. Kick It Out has done so much with the FA to raise awareness of that issue. I was interested in your views on how the Bill treats that. The Bill mentions anonymity and pseudonymity, but it does so only once. Should the Bill take a clearer stance on online anonymity? Do you have any views on whether people should be able to use the internet fully anonymously, or should they disclose their identity to the platform? Do you have any thoughts on that? You have done a huge amount of work on it.

Sanjay Bhandari: There is quite a lot in that question. In terms of whether people should be fully anonymous or not, it depends on what you mean by fully. I am a lawyer, so I have spent 30 years specialising in the grey, rather than in the black and white. It really does depend on what you mean by fully. In my experience, nothing is absolute. There is no absolute right to freedom of speech; I cannot come in here and shout “Fire!” and make you all panic. There is also no absolute right to anonymity; I cannot use my anonymity online as a cloak to commit fraud. Everything is qualified. It is a question of what the balance of those qualifications is and what those qualifications should be, in the particular context of the problem that we are seeking to address.

The question in this context is around the fact that anonymity online is actually very important in some contexts. If you are gay in a country where that is illegal, being anonymous is a fantastic way to be able to connect with people like you. In a country that has a more oppressive regime, anonymity is another link to the outside world. The point of the Bill is to try to get the balance so that anonymity is not abused. For example, when a football player misses a penalty in a cup final, the point of the Bill is that you cannot create a burner account and instantly send them a message racially abusing them and then delete the account—because that is what happens now. The point of the Bill, which we are certainly happy with in general terms, is to draw a balance in the way that identity verification must be offered as an option, and to give users more power over who they interact with, including whether they wish to engage only with verified accounts.

We will come back and look in more detail at whether we would like more amendments, and we will also work with other organisations. I know that my colleague Stephen Kinsella of Clean up the Internet has been looking at those anonymity provisions and at whether verification should be defined and someone’s status visible on the face of the platforms, for instance. I hope that answers those two or three questions.

Mrs Miller

That is very helpful; thank you.

The Chair

I saw you nodding, Ms Perry. Do you wish to add anything?

Lynn Perry: I agree. The important thing, particularly from the perspective of Barnardo’s as a children’s charity, is the right of children to remain safe and protected online and in no way compromised by privacy or anonymity considerations online. I was nodding along at certain points to endorse the need to ensure that the right balance is struck for protections for those who might be most vulnerable.

Kirsty Blackman (Aberdeen North) (SNP)

Q Lynn, does the Bill ensure that children are kept as safe as possible online? If not, what improvements need to be made to it so that they are?

Lynn Perry: There are several things that we welcome as a children’s charity. One of them, age verification, has just been mentioned. We are particularly concerned and have written about children’s access to harmful and extreme pornography—they are sometimes only a couple of clicks away from harmful online commercial pornography—and we welcome the age-verification measures in the Bill. However, we are concerned about the length of time that it may take to implement those measures, during which children and young people will remain at risk and exposed to content that is potentially harmful to their development. We would welcome measures to strengthen that and to compel those companies to implement the measures earlier. If there were a commencement date for that, those provisions could be strengthened.

Kirsty Blackman

Q How much of an impact will the Bill have on the likelihood of children being subjected to online grooming and predatory behaviour?

Lynn Perry: There are some contextual considerations that we have been reflecting on as a charity, influenced by what we have heard from children, young people, parents and carers. We know that more children have had access to digital devices and have spent more time online over the last couple of years in particular. In that sense, we are concerned that the Bill needs to be strengthened because of the volume of access, the age at which children and young people now access digital content, and the amount of time that they spend online.

There are some other contextual things in respect of grooming. We welcome the fact that offences are named on the face of the Bill, for example, but one of the things that is not currently included is the criminal exploitation of children. We think that there is another opportunity to name criminal exploitation, where young people are often targeted by organised criminal gangs. We have seen more grooming of that type during the pandemic period as offenders have changed the ways in which they seek to engage young people. That is another area that we would welcome some consideration of.

Kirsty Blackman

Q In terms of online gaming, and predators moving children from more mainstream to less regulated platforms, do you think there are improvements in the Bill that relate to that, or do you think more can be done?

Lynn Perry: Grooming does happen within gaming, and we know that online video games offer some user-to-user interaction. Users sometimes have the ability to create content within platforms, which is in scope for the Bill. The important thing will be enforcement and compliance in relation to those provisions. We work with lots of children and young people who have been sexually exploited and abused, and who have had contact through gaming sites. It is crucial that this area is in focus from the perspective of building in, by design, safety measures that stop perpetrators being able to communicate directly with children.

Private messaging is another area for focus. We also consider it important for Ofcom to have regulatory powers to compel firms to use technology that could identify child abuse and grooming.

Kim Leadbeater

Q If I could address one question to each witness, that would be fantastic. I do a lot of work with women in sport, including football. Obviously, we have the Women’s Euros coming up, and I have my Panini sticker album at the ready. Do you think the Bill could do more to address the pervasive issue of online threats of violence and abuse against women and girls, including those directed at women in sport, be they players, officials or journalists?

Sanjay Bhandari: I can see that there is something specific in the communications offences and that first limb around threatening communications, which will cover a lot of the things we see directed at female football pundits, like rape threats. It looks as though it would come under that. With our colleagues in other civil society organisations, particularly Carnegie UK Trust, we are looking at whether more should be done specifically about tackling misogyny and violence against women and girls. It is something that we are looking at, and we will also work with our colleagues in other organisations.

The Chair

Q Ms Perry, do you want to add anything to that?

Lynn Perry: When we were looking at children and young people’s access to harmful pornographic content, one thing we were particularly concerned about was their seeing extreme, harmful and violent content, often perpetrated against women. In respect of younger children, violence against women and girls and gender-based violence are considerations that concern us in that context.

Kim Leadbeater

Q Do you have any thoughts on the Bill committing to a statutory user advocacy body representing the interests of children? If you do, how do you think that that could be funded?

Lynn Perry: I am sorry—that was a question about advocacy, I think.

Kim Leadbeater

Yes, the idea of having a statutory user advocacy body that would represent the interests of children. This is something that has been talked about. Is that something you have any thoughts about?

Lynn Perry: We certainly have a lot of representation from children and young people directly. Last year, we worked with more than 380,000 children and young people. We think that advocacy and representation on behalf of children and young people can be used to powerful effect. Making sure that the voices of children and young people, their views, wishes and experiences, are heard and influence legislation that could safeguard and protect them effectively is something that we are supportive of.

Kim Leadbeater

Q Should the Bill commit to that?

Lynn Perry: As a recommendation, we think that could only strengthen the protections of children.

Chris Philp

Q Picking up that last point about representation for particular groups of users, including children, Ms Perry, do you agree that the ability to designate organisations that can make super-complaints might be an extremely valuable avenue, in particular for organisations that represent user groups such as children? Organisations such as yours could get designated and then speak on behalf of children in a formal context. You could raise super-complaints with the regulator on behalf of the children you speak for. Is that something to welcome? Would it address the point made by my colleague, Kim Leadbeater, a moment ago?

Lynn Perry: We would welcome provision to be able to bring particularly significant evidence of concern. That is certainly something that organisations, large charities in the sector and those responsible for representing the rights of children and young people would welcome. On some of these issues, we work in coalition to make representations on behalf of children and young people, as well as of parents and carers, who also raise some concerns. The ability to do that and to strengthen the response is something that would be welcomed.

Chris Philp

Q I am glad you welcome that. I have a question for both witnesses, briefly. You have commented in some detail on various aspects of the Bill, but do you feel that the Bill as a whole represents a substantial step forward in protecting children, in your case, Ms Perry, and those you speak for, Sanjay?

Sanjay Bhandari: Our beneficiaries are under-represented or minority communities in sports. I agree, I think that the Bill goes a substantial way to protecting them and to dealing with some of the issues that we saw most acutely after the Euro 2020 finals.

We have to look at the Bill in context. This is revolutionary legislation, which we are not seeing anywhere else in the world. We are going first. The basic sanctions framework and the 10% fines I have seen working in other areas—anti-trust in particular. In Europe, that has a long history. The definition of harm being in the manner of dissemination will pick up pile-ons and some forms of trolling that we see a lot of. Hate crime being designated as priority illegal content is a big one for us, because it puts the proactive duty on the platforms. That too will take away quite a lot of content, we think. The new threatening communications offence we have talked about will deal with rape and death threats. Often the focus is on, quite rightly, the experience of black professional footballers, but there are also other people who play, watch and work in the game, including our female pundits and our LGBT fan groups, who also get loads of this abuse online. The harm-based offence—communications sent to cause harm without reasonable excuse—will likely cover things such as malicious tagging and other forms of trolling. I have already talked about the identification, verification and anonymity provisions.

I think that the Bill will go a substantial way. I am still interested in what fits into that residual category of content harmful to adults, but rather than enter into an arid philosophical and theoretical debate, I will take the spirit of the Bill and try to tag it to real content.

Chris Philp

Q Before I turn to Ms Perry with the same question about the Bill’s general effect, Sanjay, you mentioned the terrible abuse that the three England footballers received after the penalties last summer. Do you think the social media firms’ response to that incident was adequate, or anywhere close to adequate? If not, does that underline the need for this legislation?

Sanjay Bhandari: I do not think it was adequate because we still see stuff coming through. They have the greatest power to stop it. One thing we are interested in is improving transparency reporting. I have asked them a number of times, “Someone does not become a troll overnight, in the same way that someone does not become a heroin addict overnight, or commit an extremist act of terrorism overnight. There is a pathway where people start off, and you have that data. Can I have it?” I have lost count of the number of times that I have asked for that data. Now I want Ofcom to ask them for it.

Chris Philp

Q Yes. There are strong powers in the Bill for Ofcom to do precisely that. Ms Perry, may I ask you same general question? Do you feel that the Bill represents a very substantial step forward in protecting children?

Lynn Perry: We do. Barnardo’s really welcomes the Bill. We think it is a unique and once-in-a-generation opportunity to achieve some really long-term changes to protect children from a range of online harms. There are some areas in which the Bill could go further, which we have talked about today. The opportunity that we see here is to make the UK the safest place in the world for children to be online. There are some very important provisions that we welcome, not least on age verification, the ability to raise issues through super-complaints, which you have asked me about, and the accountability in various places throughout the Bill.

Chris Philp

Q Thank you, Ms Perry. Finally, Mr Bhandari, some people have raised concerns about free speech. I do not share those concerns—in fact, I rebutted them in a Times article earlier this week—but does the Bill cause you any concern from a free-speech perspective?

Sanjay Bhandari: As I said earlier, there are no absolute rights. There is no absolute right to freedom of speech—I cannot shout “Fire!” here—and there is no absolute right to privacy; I cannot use my anonymity as a cloak for criminality. It is a question of drawing an appropriate balance. In my opinion, the Bill draws an appropriate balance between the right to freedom of speech and the right to privacy. I believe in both, but in the same way that I believe in motherhood and apple pie: of course I believe in them. It is really about the balancing exercise, and I think this is a sensible, pragmatic balancing exercise.

The Chair

Ms Perry, I am very pleased that we were finally able to hear from you. Thank you very much indeed—you have been very patient. Thank you very much, Mr Bhandari. If either of you, as a result of what you have heard and been asked today, have any further thoughts that you wish to submit, please do so.

Examination of Witnesses

Eva Hartshorn-Sanders and Poppy Wood gave evidence.

14:48
The Chair

We will hear oral evidence first from Eva Hartshorn-Sanders, who is the head of policy at the Centre for Countering Digital Hate. We shall be joined in due course by Poppy Wood. Without further ado, I call the shadow Minister.

Alex Davies-Jones

Q Thank you for joining us this afternoon. I have quoted a lot of the stats that the Centre for Countering Digital Hate has produced on online abuse directed at individuals with protected characteristics. In the previous panel, I mentioned that the vast majority is done via direct messaging, sometimes through end-to-end encryption on platforms. What are your concerns about this issue in the Bill? Does the Bill adequately account for tackling that form of abuse?

Eva Hartshorn-Sanders: That is obviously an important area. The main mechanisms to look at are the complaints pathways: ensuring that when reports are made, action is taken, and that that is included in risk assessments as well. In our “Hidden Hate” report, we found that 90% of misogynist abuse, which included quite serious sexual harassment and abuse, videos and death threats, was not acted on by Instagram, even when we used the current pathways for the complainant. This is an important area.

Alex Davies-Jones

Q Part of the issue is that the regulated service providers have to rely heavily on the use of AI to facilitate monitoring and take down problematic content in order to comply with the Bill, but, as several stakeholders have said, algorithmic moderation is inadequate at recognising the nuance and subtleties needed to actively and effectively take down that content. What more would you like to see in the Bill to counteract that issue?

Eva Hartshorn-Sanders: There has to be human intervention as part of that process as well. Whatever system is in place—the relationship between Ofcom and the provider is going to vary by platform and by search provider too, possibly—if you are making those sorts of decisions, you want to have it adequately resourced. That is what we are saying is not happening at the moment, partly because there is not yet the motivation or the incentive for them to do anything differently. They are doing the minimum; what they say they are going to do often comes out through press releases or policies, and then it is not followed through.

Alex Davies-Jones

Q You mentioned that there is not adequate transparency and openness on how these things work. What systems would you like to see the Bill put in place to ensure the transparency, independence and accountability of Ofcom, but also the transparency and openness of the tech companies and the platforms that we are seeking to regulate?

Eva Hartshorn-Sanders: I think there is a role for independent civil society, working with the regulator, to hold those companies to account and to be accessing that data in a way that can be used to show how they are performing against their responsibilities under the Bill. I know Poppy from Reset.tech will talk to this area a bit more. We have just had a global summit on online harms and misinformation. Part of the outcome of that was looking at a framework for how we evaluate global efforts at legislation and the transparency of algorithms and rules enforcement, and the economics that are driving online harms and misinformation. That is an essential part of ensuring that we are dealing with the problems.

The Chair

May I say, for the sake of the record, that we have now been joined by Poppy Wood, the UK director of Reset.tech? Ms Wood, you are not late; we were early. We are trying to make as much use as we can of the limited time. I started with the Opposition Front Bencher. If you have any questions for Poppy Wood, go ahead.

Alex Davies-Jones

Q I do—thank you, Sir Roger. I am not sure if you managed to hear any of that interaction, Poppy. Do you have any comments to make on those points before I move on?

Poppy Wood: I did not hear your first set of questions—I apologise.

Alex Davies-Jones

That is fine. I will just ask you what you think the impact is of the decision to remove misinformation and disinformation from the scope of the Bill, particularly in relation to state actors?

Poppy Wood: Thank you very much, and thank you for having me here today. There is a big question about how this Bill tackles co-ordinated state actors—co-ordinated campaigns of disinformation and misinformation. It is a real gap in the Bill. I know you have heard from Full Fact and other groups about how the Bill can be beefed up for mis- and disinformation. There is the advisory committee, but I think that is pretty weak, really. The Bill is sort of saying that disinformation is a question that we need to explore down the line, but we all know that it is a really live issue that needs to be tackled now.

First of all, I would make sure that civil society organisations are on that committee and that its report is brought forward in months, not years, but then I would say there is just a real gap about co-ordinated inauthentic behaviour, which is not referenced. We are seeing a lot of it live with everything that is going on with Russia and Ukraine, but it has been going on for years. I would certainly encourage the Government to think about how we account for some of the risks that the platforms promote around co-ordinated inauthentic behaviour, particularly with regard to disinformation and misinformation.

Alex Davies-Jones

Q We have heard a lot from other witnesses about the ability of Ofcom to regulate the smaller high-risk platforms. What is your view on that?

Poppy Wood: Absolutely, and I agree with what was said earlier, particularly by groups such as HOPE not hate and Antisemitism Policy Trust. There are a few ways to do this, I suppose. As we are saying, at the moment the small but high-risk platforms just are not really caught in the current categorisation of platforms. Of course, the categories are not even defined in the Bill; we know there are going to be categories, but we do not know what they will be.

I suppose there are different ways to do this. One is to go back to where this Bill started, which was not to have categories of companies at all but to have a proportionality regime, where depending on your size and your functionality you had to account for your risk profile, and it was not set by Ofcom or the Government. The problem of having very prescriptive categories—category 1, category 2A, category 2B—is, of course, that it becomes a race to the bottom in getting out of these regulations without having to comply with the most onerous ones, which of course are category 1.

There is also a real question about search. I do not know how they have wriggled out of this, but it was one of the biggest surprises in the latest version of the Bill that search had been given its own category without many obligations around adult harm. I think that really should be revisited. All the examples that were given earlier today are absolutely the sort of thing we should be worrying about. If someone can google a tractor in their workplace and end up looking at a dark part of the web, there is a problem with search, and I think we should be thinking about those sorts of things. Apologies for the example, but it is a really, really live one and it is a really good thing to think about how search promotes these kinds of content.

Mrs Miller

Q I want to touch on something we have not talked about a lot today, which is enforcement and the enforcement powers in the Bill. There are significant enforcement powers in the Bill, but do our two witnesses here think those enforcement powers are enough? Eva?

Eva Hartshorn-Sanders: Are you specifically asking about the takedown notices and the takedown powers?

Mrs Miller

No, I am talking about director liability and the enforcement on companies.

Eva Hartshorn-Sanders: Right. I think the responsibility on both companies and senior executives is a really critical part of this legislative package. You see how adding liability alongside financial penalties works in health and safety legislation and corporate manslaughter provisions to motivate changes not only within company culture but in the work that they are doing and what they factor into the decisions they make. It is a critical part of this Bill.

Mrs Miller

Q Is there more that could or should be added to the Bill?

Eva Hartshorn-Sanders: I think it is a good start. I would want to have another look at it to say more. There is a review after two years, as set out in clause 149, so there could be a factor that gets added into that, as well.

Mrs Miller

Poppy, do you have anything to add?

Poppy Wood: Yes. I think we could go much further on enforcement. One of the things that I really worry about is that if the platforms make an inadequate risk assessment, there is not much that Ofcom can do about it. I would really like to see powers for Ofcom to say, “Okay, your risk assessment hasn’t met the expectations that we put on you, so we want you to redo it. And while you’re redoing it, we may want to put you into a different category, because we may want to have higher expectations of you.” That way, you cannot start a process where you intentionally make an inadequate risk assessment in order to extend the process of you being properly regulated. I think that is one thing.

Then, going back to the point about categorisation, I think that Ofcom should be given the power to recategorise companies quickly. If you think that a category 2B company should be a category 1 company, what powers are there for Ofcom to do that? I do not believe that there are any for Ofcom to do that, certainly not to do it quickly, and when we are talking about small but high-risk companies, that is absolutely the sort of thing that Ofcom should be able to do—to say, “Okay, you are now acting like a category 1 company.” TikTok, Snapchat—they all started really small and they accelerated their growth in ways that we just could not have predicted. When we are talking about the emergence of new platforms, we need to have a regulator that can account for the scale and the pace at which these platforms grow. I think that is a place where I would really like to see Ofcom focusing.

Kirsty Blackman

Q I have a question for the Centre for Countering Digital Hate. I raised some of your stats on reporting with Meta—Facebook—when they were here, such as the number of reports that are responded to. They basically said, “This is not true any more; we’re now great”—I am paraphrasing, obviously. Could you please let us know whether the reporting mechanism on major platforms—particularly Facebook—is now completely fixed, or whether there are still lots of issues with it?

Eva Hartshorn-Sanders: There are still lots of issues with it. We recently put a report out on anti-Muslim hatred and found that 90% of the content that was reported was not acted on. That was collectively, across the platforms, so it was not just Facebook. Facebook was in the mid-90s, I think, in terms of its failure to act on that type of harmful content. There are absolutely still issues with it, and this regulation—this law—is absolutely necessary to drive change and the investment that needs to go into it.

Kirsty Blackman

Q I have a quick question for Poppy, although I am afraid it might not have a quick answer. How much of an impact does the algorithmic categorisation of things—the way we are fed things on social media—have on our lives? Do you think it is steering people towards more and more extreme content? Or is it a totally capitalist thing that is not harmful, and just something that sells us things every so often?

Poppy Wood: I think it goes without saying that the algorithmic promotion of harmful content is one of the biggest issues with the model we have in big tech today. It is not the individual pieces of content in themselves that are harmful. It is the scale over which they spread out—the amplification of them; the targeting; the bombardment.

If I see one piece of flat-earth content, that does not necessarily harm me; I probably have other counter-narratives that I can explore. What we see online, though, is that if you engage with that one piece of flat-earth content, you are quickly recommended something else—“You like this, so you’ll probably like that”—and then, before you know it, you are in a QAnon conspiracy theory group. I would absolutely say that the algorithmic promotion of harmful content is a real problem. Does that mean we ban algorithms? No. That would be like turning off the internet. You have to go back and ask how it is that that kind of harm is promoted, and how it is that human behaviour is being exploited. It is human nature to be drawn to things that we cannot resist. That is something that the Bill really needs to look at.

The risk assessments, particularly for illegal content and content that is harmful to children, explicitly reference algorithmic promotion and the business model. Those are two really big things that you touched on in the question. The business model is to make money from our time spent online, and the algorithms serve us up the content that keeps us online. That is accounted for very well in the risk assessments. However, some of the safety duties do not necessarily account for that, even though you are risk assessing for it. Say you identify that your business model does promote harmful content; under the Bill, you do not have to mitigate that all the time. So I think there are questions around whether the Bill could go further on algorithmic promotion.

If you do not mind, I will quickly come back to the question you asked Eva about reporting. We just do not know whether reporting is really working because we cannot see—we cannot shine a light into these platforms. We just have to rely on them to tell us, “Hey, reporting is working. This many pieces of content were reported and this many pieces of content were taken down.” We just do not know if that is true. A big part of this regime has to be about transparency. It already is, but I think it could go much further in enabling Ofcom, Government, civil society and researchers to say, “Hey, you said that many pieces of content were reported and that many pieces of content were taken down, but actually, it turns out that none of that is true. We are still seeing that stuff online.” Transparency is a big part of the solution around understanding whether reporting is really working and whether the platforms are true to their word.

Caroline Ansell (Eastbourne) (Con)

Q May I ask a follow-up question on that? Poppy, you referenced risk assessments. Would you value and welcome more specifics around quality standards and minimum requirements on risk assessments? My main question is about privacy and anonymity, but I would appreciate a word on risk assessments.

Poppy Wood: Absolutely. I know that children’s groups are asking for minimum standards for children’s risk assessments, but I agree that they should be across the board. We should be looking for the best standards that we can get. I really do not trust the platforms to do these things properly, so I think we have to be really tough with them about what we expect from them. We should absolutely see minimum standards.

Caroline Ansell

Q Do you think Ofcom has the resources that it would require to push for an independent audit of risk assessments?

Poppy Wood: Obviously Ofcom is growing. The team at Ofcom are fantastic, and they are hiring really top talent. They have their work cut out in dealing with some of the biggest and wealthiest companies in the world. They need to be able to rely on civil society and researchers to help them to do their job, but I do not think we should rule out Ofcom being able to do these things. We should give it the powers to do them, because that makes this regime have proper teeth. If we find down the line that, actually, it is too much, that is for the Government to sort out with resourcing, or for civil society and researchers to support, but I would not want to rule things out of the Bill just because we think Ofcom cannot do them.

Caroline Ansell

Q What are your thoughts on the balance between privacy and anonymity?

Poppy Wood: Of course, the Bill has quite a unique provision for looking at anonymity online. We have done a big comparison of online safety regulations across the world, and nobody is looking at anonymity in the same way as the UK. It is novel, and with that comes risk. Let us remember that anonymity is a harm reduction mechanism. For lots of people in authoritarian regimes, and even for those in the UK who are survivors of domestic abuse or who want to explore their sexuality, anonymity is a really powerful tool for reducing harm, so we need to remember that when we are talking about anonymity online.

One of my worries about the anonymity agenda in the Bill is that it sounds really good and will resonate really well with the public, but it is very easy to get around, and it would be easy to oversell it as a silver bullet for online harm. VPNs exist so that you can be anonymous. They will continue to exist, and people will get around the rules, so we need to be really careful with the messaging on what the clauses on anonymity really do. I would say that the whole regime should be a privacy-first regime. There is much more that the regime can do on privacy. With age verification, it should be privacy first, and anonymity should be privacy first.

I also have some concerns about the watering down of privacy protections from the draft version of the Bill. I think the language was “duty to account for the right to privacy”, or something, and that right-to-privacy language has been taken out. The Bill could do more on privacy, remembering that anonymity is a harm-reducing tool.

Caroline Ansell

Q Eva, there is just one reference to anonymity in the Bill currently. Do you think there is an opportunity to express a fuller, more settled opinion and potentially expand on that juxtaposition?

Eva Hartshorn-Sanders: I heard the advice that the representative of the Information Commissioner’s Office gave earlier—he feels that the balance is right at the moment. It is important to incorporate freedom of speech and privacy within this framework in a democratic country. I do not think we need to add anything more than that.

Kim Leadbeater

Q Thank you to the witnesses for joining us this afternoon. May I ask for your views on the clauses on journalistic content exemption and democratic content exemption? Do you think that these measures are likely to be effective?

Poppy Wood: I know you have spoken a lot about this over the past few days, but the content of democratic importance clause is a layer of the Bill that makes the Bill very complicated and hard to implement. My concern about these layers of free speech—whether it is the journalistic exemption, the news media exemption or the content of democratic importance clause—is that, as you heard from the tech companies, they just do not really know what to do with it. What we need is a Bill that can be implemented, so I would definitely err on the side of paring back the Bill so that it is easy to understand and clear. We should revisit anything that causes confusion or is obscure.

The clause on content of democratic importance is highly problematic—not just because it makes the Bill hard to implement and we are asking the platforms to decide what democratic speech is, but because I think it will become a gateway for the sorts of co-ordinated disinformation that we spoke about earlier. Covid disinformation for the past two years would easily have been a matter of public policy, and I think the platforms, because of this clause, would have said, “Well, if someone’s telling you to drink hydroxychloroquine as a cure for covid, we can’t touch that now, because it’s content of democratic importance.”

I have another example. In 2018, Facebook said that it had identified and taken down a Facebook page called “Free Scotland 2014”—it had taken four years to identify it. It was a Russian/Iranian-backed page that was promoting falsehoods in support of Scottish independence using fake news websites, with articles about the Queen and Prince Philip wanting to give themselves a pay rise by stealing from the poor. It was total nonsense, but that is easily content of democratic importance. Even though it was backed by fake actors—as we have said, I do not think there is anything in the Bill to preclude that at the moment, or at least to get the companies to focus on it—in 2014, that content would have been content of democratic importance, and the platforms took four years to take it down.

I think this clause would mean that that stuff became legitimate. It would be a major loophole for hate and disinformation. The best thing to do is to take that clause out completely. Clause 15(3) talks about content of democratic importance applying to speech across a diverse range of political opinion. Take that line in that subsection and put it in the freedom of expression clause—clause 19. What you then have is a really beefed-up freedom of expression clause that talks about political diversity, but you do not have layers on top of it that mean bad actors can promote hate and disinformation. I would say that is a solution, and that will make the Bill much easier to implement.

Kim Leadbeater

Q Thank you, Poppy. Eva?

Eva Hartshorn-Sanders: I think the principle behind the duty is correct and that they should consider the democratic importance of content when they are making moderation decisions, but what we know from our work is that misinformation and disinformation on social media poses a real threat to elections and democracies around the world. As an international organisation, we have studied the real harms caused by online election disinformation in countries like the US. We saw websites like The Gateway Pundit profit from Google ads to the tune of over $1 million while spreading election disinformation. That has led to real-world death threats sent to election officials and contributed to the events of 6 January. It is not something we want to see replicated in the UK.

The problem with the democratic importance duty is that it is framed negatively about preventing platforms from removing content, rather than positively about addressing content that undermines elections. That is concerning because it is the latter that has proved to be damaging in the real world. I think where we are getting to is that there should be a positive duty on platforms to act on content that is designed and intended to undermine our democracy and our elections.

To add to that, the Joint Committee on the draft Bill looked specifically at having misinformation and disinformation on elections and public health on the face of the Bill rather than leaving it to secondary legislation. That is a position that we would support. The type of harm we have seen over the last couple of years through covid is a known harm and it is one that we should be addressing. It has led to the deaths of millions of people around the world.

Kim Leadbeater

Q That is really helpful; thank you. You raised the point about the abuse that was directed at election officials in America. Do you think it should almost be a stand-alone offence to send harmful or threatening communications to elected people—MPs, councillors, mayors or police and crime commissioners—or possibly even election officials, the people who are involved in the democratic process, because of the risk that such abuse and threats pose to democracy?

Eva Hartshorn-Sanders: Obviously abuse is unacceptable, and there have been real issues with that globally and, I know, in the UK, from the work we have done with MPs here, including through the misogyny research. I guess this is the balance—people may have concerns about legitimate political decisions that are being made—but that is why you have an independent regulator who can assess that content.

Kim Leadbeater

Q Poppy, do you have any thoughts on that?

Poppy Wood: We are seeing people who put themselves forward in public life receiving all sorts of horrible abuse, which was cited as a big reason for women and people of colour removing themselves from public life in recent elections. My understanding is that the threatening communications offences brought in under the illegal duties will probably cover quite a lot of that. The idea that Eva just gave of an election risk assessment or something might, coupled with the threatening communications offences, mean that you are accounting for how your platform promotes that sort of hate.

One of the things that you would want to try to avoid is making better protections for politicians than for everyone else, but I think that threatening communications already covers some of that stuff. Coupled with an elections risk assessment, that would hopefully mean that there are mitigating effects on the risks identified in those risk assessments to tackle the sorts of things that you were just talking about.

Eva Hartshorn-Sanders: Just to add to that, from our work on “Don’t Feed the Trolls”, we know that a lot of these hate campaigns are quite co-ordinated. There is a whole lot of supporting evidence behind that. They will often target people who raise themselves up in whatever position, whether elected or a different type. The misogyny report we have just done had a mix of women who were celebrities or just had a profile and a large Instagram following and who were, again, subject to that abuse.

Kim Leadbeater

Q Should there be more in the Bill with a specific reference to violence against women and girls, abuse and threats, and misogyny?

Eva Hartshorn-Sanders: There are definitely parts of the Bill that could be strengthened in that area. Part of that relates to incels and how they are treated, or not, as a terrorist organisation; or how small sites might be treated under the Bill. I can elaborate on that if you like.

The Chair

Thank you. Minister.

Chris Philp

Q Thank you for joining us this afternoon and for giving us your evidence so far. At the beginning of your testimony, Ms Hartshorn-Sanders, I think you mentioned—I want to ensure I heard correctly—that you believe, or have evidence, that Instagram is still, even today, failing to take down 90% of inappropriate content that is flagged to it.

Eva Hartshorn-Sanders: Our “Hidden Hate” report was on DMs—direct messages—that were shared by the participants in the study. One in 15 of those broke the terms and conditions that Instagram had set out related to misogynist abuse—sexual abuse. That was in the wake of the World Cup, so after Instagram had done a big promotion about how great it was going to be in having policies on these issues going forward. We found that 90% of that content was not acted on when we reported it. This was not even them going out proactively to find the content and not doing anything with it; it was raised for their attention, using their systems.

Chris Philp

Q That clearly illustrates the problem we have. Two parts of the Bill are designed to address this: first, the ability for designated user representation groups to raise super-complaints—an issue such as the one you just mentioned, a systemic issue, could be the subject of such a super-complaint to Ofcom, in this case about Instagram—and, secondly, at clause 18, the Bill imposes duties on the platforms to have proper complaints procedures, through which they have to deal with complaints properly. Do those two provisions, the super-complaints mechanism for representative groups and clause 18 on complaints procedures, go a long way towards addressing the issue that you helpfully and rightly identified?

Eva Hartshorn-Sanders: That will depend on transparency, as Poppy mentioned. How much of that information can be shared? We are doing research at the moment on data that is shared personally, or is publicly available through the different tools that we have. So it is strengthening access to that data.

There is an information asymmetry at the moment, where big tech is able to see patterns of abuse. In some cases, as in the misogyny report, you have situations where a woman might be subject to abuse from one person over and over again. The way that is treated in the EU is that Instagram will go back and look at the last 30 historically to see the pattern of abuse that exists. They are not applying that same type of rigour to other jurisdictions. So it is about having access to that data in the audits that are able to happen. Everyone should be safe online, so this should be a safety-by-design feature that the companies have.

Chris Philp

Q Meta claimed in evidence to the Committee on Tuesday that it gave researchers good access to its data. Do you think that is true?

Eva Hartshorn-Sanders: I think it depends on who the researchers are. I personally do not have experience of it, so I cannot speak to that. On transparency, at the moment, the platforms generally choose what they share. They do not necessarily give you the data that you need. You can hear from my accent that I am originally from New Zealand. I know that in the wake of the Christchurch mosque terrorist attack, they were not prepared to provide the independent regulator with data on how many New Zealanders had seen the footage of the livestream, which had gone viral globally. That is inexcusable, really.

The Chair

Q Ms Wood, do you want to comment on any of this before we move on?

Poppy Wood: On the point about access to data, I do not believe that the platforms go as far as they could, or even as far as they say they do. Meta have a tool called CrowdTangle, which they use to provide data access for certain researchers who are privileged enough to have it. That does not even include comments on posts; it covers only the posts themselves. The platforms pull the rug out from under researchers all the time when they are investigating things that the platforms do not like. We saw that with Laura Edelson at New York University, whom they simply cut off—that is one of the most famous cases. I think it is quite egregious of Meta to say that they give lots of access to data.

We know from the revelations of whistleblowers that Meta do their own internal research, and when they do not like the results, they just bury it. They might give certain researchers access to data under certain provisions, but independent researchers who want to investigate a certain emergent harm or a certain problem are not being given the sort of access that they really need to get insights that move the needle. I am afraid that I just do not believe that at all.

The Bill could go much further. A provision on access to data in clause 136 states that Ofcom has two years to issue a report on whether researchers should get access to data. I think we know that researchers should have access to data, so I would, as a bare minimum, shorten the time that Ofcom has to do that report from two years to six months. You could turn that into a question of how to give researchers access to data rather than of whether they should get it. The Digital Services Act—the EU equivalent of the Bill—goes a bit further on access to data than our Bill. One result of that might be that researchers go to the EU to get their data because they can get it sooner.

Improving the Bill’s access to data provisions is a no-brainer. It is a good thing for the Government because we will see more stuff coming out of academia, and it is a good thing for the safety tech sector, because the more research is out there, the more tools can be built to tackle online harms. I certainly call on the Government to think about whether clause 136 could go further.

The Chair

Thank you. Last brief question, Minister.

Chris Philp

Goodness! There is a lot to ask about.

The Chair

Sorry, we are running out of time.

Chris Philp

Q I appreciate that; thank you, Sir Roger. Ms Wood, you mentioned misinformation in your earlier remarks—I say “misinformation” rather than “state-sponsored disinformation”, which is a bit different. It is very difficult to define that in statute and to have an approach that does not lead to bias or to what might be construed as censorship. Do you have any particular thoughts on how misinformation could be concretely and tangibly addressed?

Poppy Wood: It is not an easy problem to solve, for sure. What everybody is saying is that you do it in a content-neutral way, so that you are not talking about listing specific types of misinformation but about the risks that are built into your system and that need to be mitigated. This is a safety by design question. We have heard a lot about introducing more friction into the system, checking the virality threshold, and being more transparent. If you can get better on transparency, I think you will get better on misinformation.

If there is more of an obligation on the platforms to, first, do a broader risk assessment outside of the content that will be listed as priority content and, secondly, introduce some “harm reduction by design” mechanisms, through friction and stemming virality, that are not specific to certain types of misinformation, but are much more about safety by design features—if we can do that, we are part of the way there. You are not going to solve this problem straightaway, but you should have more friction in the system, be it through a code of practice or a duty somewhere to account for risk and build safer systems. It cannot be a content play; it has to be a systems play.

The Chair

Thank you. I am sorry, but that brings us to the end of the time allotted to this session. Ladies, if either of you wishes to make a submission in writing in the light of what you have not answered or not been able to answer, please do. Ms Wood, Ms Hartshorn-Sanders, thank you very much indeed for joining us.

Examination of Witnesses

Owen Meredith and Matt Rogerson gave evidence.

15:25
The Chair

We shall now hear from Owen Meredith, chief executive of the News Media Association, and Matt Rogerson, director of public policy at Guardian Media Group.

Alex Davies-Jones

Q Good afternoon, both, and thank you for coming this afternoon. We have heard a lot about the journalistic content exemption. What is your view of the current measures in the Bill and their likely consequences?

Owen Meredith: You may be aware that we submitted evidence to the Joint Committee that did prelegislative scrutiny of the draft Bill, because although the Government’s stated intention is to have content from recognised news media publishers, whom I represent, outside the scope of the Bill, we do not believe that the drafting, as it was and still is, achieves that. Ministers and the Secretary of State have confirmed, both in public appearances and on Second Reading, that they wish to table further amendments to achieve the aim that the Government have set out, which is to ensure that content from recognised news publishers is fully out of scope of the Bill. It needs to go further, but I understand that there will be amendments coming before you at some point to achieve that.

Alex Davies-Jones

Q What further would you like to see?

Owen Meredith: I would like to see a full exemption for recognised news publisher content.

Alex Davies-Jones

Q You would like to see a full exemption. Matt, do you have any thoughts on that?

Matt Rogerson: Yes. I would step back a bit and point to the evidence that a few of your witnesses gave today and Tuesday. I think Fair Vote gave evidence on this point. At the moment, our concern is that we do not know what the legal but harmful category of content that will be included in the Bill will look like. That is clearly going to be done after the event, through codes of practice. There is definitely a danger that news publisher content gets caught by the platforms imposing that. The reason for having a news publisher exemption is to enable users of platforms such as Facebook, Twitter and others to access the same news as they would via search. I agree with Owen’s point. I think the Government are going in the right direction with the exemption for broadcasters such as the BBC, The Times and The Guardian, but we would like to see it strengthened a bit to ensure a cast-iron protection.

Alex Davies-Jones

Q Currently, is the definition of journalistic content used in the Bill clear, or do you find it ambiguous?

Matt Rogerson: I think it is quite difficult for platforms to interpret that. It is a relatively narrow version of what journalism is—it is narrower than the article 10 description of what journalism is. The legal definitions of journalism in the Official Secrets Act and the Information Commissioner’s Office journalism code are slightly more expansive and cover not just media organisations but acts of journalism. Gavin Millar has put together a paper for Index on Censorship, in which he talks about that potentially being a way to expand the definition slightly.

The challenge for the platforms is, first, that they have to take account of journalistic content, and there is not a firm view of what they should do with it. Secondly, defining what a piece of journalism or an act of journalism is takes a judge, generally with a lot of experience. Legal cases involving the media are heard through a specific bench of judges—the media and communications division—and they opine on what is and is not an act of journalism. There is a real challenge, which is that you are asking the platforms to—one assumes—use machine learning tools to start with to identify what is a potential act of journalism. Then an individual, whether they are based in California or, more likely, outsourced via an Accenture call centre, then determines within that whether it is an act of journalism and what to do with it. That places quite a lot of responsibility on the platforms. Again, I would come back to the fact that I think if the Bill was stripped back to focus on illegal content, rather than legal but harmful content, you would have fewer of these situations where there was concern that that sort of content was going to be caught.

Alex Davies-Jones

Q We have heard a lot of concern about disinformation by state actors purporting to be journalists and using that exemption, which could cause harm. Do you have any thoughts on that?

Matt Rogerson: Yes, a few. The first thing that is missing from the Bill is a focus on advertising. The reason we should focus on advertising is that that is why a lot of people get involved in misinformation. Ad networks at the moment are able to channel money to “unknown” sites in ways that mean that disinformation or misinformation is highly profitable. For example, a million dollars was spent via Google’s ad exchanges in the US; the second biggest recipient of that million dollars was “Unknown sites”—sites that do not categorise themselves as doing anything of any purpose. You can see how the online advertising market is channelling cash to the sort of sites that you are talking about.

In terms of state actors, and how they relate to the definition, the definition is set out quite broadly in the Bill, and it is more lengthy than the definition in the Crime and Courts Act 2013. On top of that definition, Ofcom would produce guidance, which is subject to a full and open public consultation, which would then work out how you are going to apply the definition in practice. Even once you have that guidance in place, there will be a period of case law developing where people will appeal to be inside of that exemption and people will be thrown out of that exemption. Between the platforms and Ofcom, you will get that iteration of case law developing. So I suppose I am slightly more confident that the exemption would work in practice and that Ofcom could find a workable way of making sure that bad actors do not make use of it.

The Chair

Mr Meredith, do you wish to add to that?

Owen Meredith: No, I would echo almost entirely what Matt has said on that. I know you are conscious of time.

The Chair

Thank you. Maria Miller.

Mrs Miller

Q A great deal of the discussion we are having about this Bill is its scope—what is covered and what is not covered. Many of us will look regularly at newspapers online, particularly the comments sections, which can be quite colourful. Should comments on newspaper publisher platforms be included in the scope of the Bill?

Owen Meredith: Yes, I think they should be included within the news publisher exemption as it is spelt out. As far as I understand, that has always been the intention, since the original White Paper many years ago that led to where we are today. There is a very good reason for that, not least the fact that the comments on news publisher websites are still subject to the responsibility of the editor and the publisher; they are subject to the regulation of the Independent Press Standards Organisation, in the case of those publishers who are regulated under the self-regulation system by IPSO, as the majority of my members are. There is a very different environment in news publisher websites’ comments sections, where you are actively seeking to engage with those and read those as a user, whereas on social media platforms that content can come to you without you wishing to engage with it.

Mrs Miller

Q Can I just probe on that slightly? You say the comments are the responsibility of the editor. Does that mean that if something is published on there that is defamatory, it would then be attributed to the editor?

Owen Meredith: Everything published by the news site is ultimately the responsibility of the editor.

Matt Rogerson: I think there are various cases. I think Delfi is the relevant case in relation to comments, where if a publisher is notified of a defamatory comment within their comments section, they are legally liable for it if they do not take it down. To speak from a Guardian perspective, we would like comments sections to be included within the exemption. The self-regulation we have in place for our comments section has been quite a journey. We undertook quite a big bit of research on all the comments that had been left over an 11-year period. We tightened up significantly the processes that we had in place. We currently use a couple of steps to make sure those comments sections are well moderated. We use machine learning against very tightly defined terms, and then every single comment that is taken down is subject to human review. I think that works in the context of a relatively small website such as The Guardian, but it would be a much bigger challenge for a platform of the size of Facebook.

The Chair

Kim Leadbeater?

Kim Leadbeater

Q Thank you, Chair, and thank you to the witnesses. I just want to clarify something. We were talking about the journalistic content definition as it is. You are saying that you do not think it is reasonable to expect service providers to identify journalistic content using the definition contained in the Bill. Do you think the Bill should be clearer about what it means by journalistic content and journalism?

Matt Rogerson: My point is that there is a lack of definition in the journalistic content exemption, and that without the news media exemption platforms would have to identify whether every piece of content on their platform was journalism, which would be very difficult for them to implement. That is why, for trusted news brands such as the BBC, The Times and The Guardian, the news media exemption is really important.

What we do not know, and what Gavin Millar suggested in his paper to Index on Censorship, is how that journalistic content exemption will be interpreted by the platforms. His fear in the paper is that the current definition means that the content has to be UK-linked. It could mean, for example, that a blog or a journalist that talks about issues in the Gulf or Ukraine would not be seen as journalistic content and therefore would not be able to take advantage of the systems that the platforms put in place. I think his view is that it should be in line with the article 10 definition of journalistic content, which would seem to make sense.

Owen Meredith: If I could add to that, speaking from my members’ perspective, they would all fall under the recognised news publisher definition. I think that is why it is an important definition. It is not an easy thing to get right, and I think the Department has done a good job in drafting the Bill. I think it captures everyone we would expect it to capture. I think actually it does set a relatively high bar for anyone else who is seeking to use that. I do not think it is possible for someone to simply claim that they are a recognised news publisher if they are operating in a way that we would not expect of such a person or entity. I think it is very important that that definition is clear. I think it is clear and workable.

Kim Leadbeater

Q I suppose there are two separate clauses there. There is the news publisher clause and the journalistic content clause. Just so I am clear, you are happy with the news publisher clause?

Owen Meredith: Yes.

Matt Rogerson: Yes.

Kim Leadbeater

Q What about the journalistic content clause? This is an expression that was new to me—this idea of a citizen journalist. I do not even know what that means. Are we confident that this clause, which talks about journalistic content, is the worrying one?

Owen Meredith: Matt spoke to this a little bit, but from my perspective, my focus has been on making sure that the recognised news publisher clause is right, because everything that my members publish is journalistic content. Therefore, the bulk of journalistic content that is out there will be covered by that. I think where there are elements of what else could be considered journalistic content, the journalistic content clause will pick those up.

Kim Leadbeater

Q As journalists, does that worry you?

Matt Rogerson: I wish I was a journalist.

Kim Leadbeater

Sorry, as representatives of journalists.

Matt Rogerson: It worries me in the sense that we want a plural media ecosystem in this country, and we want individuals who are journalists to have their content published on platforms, so that it can be read by the 50% of the UK population that get their news from Facebook. I think it is potentially problematic that they won’t be able to publish on that platform if they talk about issues that are in the “legal but harmful” bucket of harms, as defined after the Bill is passed. I think there is concern for those groups.

There are suggestions for how you could change the clause to enable them to have more protection. As I say, Gavin Millar has outlined that in his paper. Even then, once you have got that in place, if you have a series of legal but harmful harms that are relatively unclear, the challenge for the platforms will be interpreting that and interpreting it against the journalistic content clause.

Kim Leadbeater

Q My only concern is that someone who just decides to call themselves a journalist will be able to say what they want.

Owen Meredith: I do not think that would be allowable under the Bill, because of the distinction between a recognised news publisher publishing what we would all recognise as journalistic content, versus the journalistic content exemption. I think that is why they are treated differently.

Chris Philp

Q Can I start by clarifying a comment that Owen Meredith made at the very beginning? You were commenting on where you would like the Bill to go further in protecting media organisations, and you said that you wanted there to be a wholesale exemption for recognised news publishers. I think there already is a wholesale exemption for recognised news publishers. The area where the Government have said they are looking at going further is in relation to what some people call a temporary “must carry” provision, or a mandatory right of appeal for recognised news publishers. Can I just clarify that that is what you meant?

Owen Meredith: Yes. I think the issue is how that exemption will work in practice. I think that what the Government have said they are looking at and will bring forward does address how it will operate in practice.

Chris Philp

Q Thank you. Can I move on to the question that Kim Leadbeater asked a moment ago, and that a number of Members have raised? You very kindly said a moment ago that you thought that clause 50, which sets out the definition of “recognised news publisher”, works as drafted. I would like to test that a bit, because some witnesses have said that it is quite widely drawn, and suggested that it would be relatively easy for somebody to set themselves up in a manner that met the test laid out in clause 50. Given the criticism that we have heard a few times today and on Tuesday, can you just expand for the Committee why you think that is not the case?

Owen Meredith: As I alluded to earlier, it is a real challenge to set out this legal definition in a country that believes, rightly, in the freedom of the press as a fourth pillar of democracy. It is a huge challenge to start with, and therefore we have to set out criteria that cover the vast majority of news publishers but do not end up with a backdoor licensing system for the press, which I think we are all keen to avoid. I think it meets that criterion.

On the so-called bad actors seeking to abuse that, I have listened to and read some of the evidence that you have had from others—not extensively, I must say, due to other commitments this week—and I think that it would be very hard for someone to meet all those criteria as set out in order to take advantage of this. I think that, as Matt has said, there will clearly be tests and challenges to that over time. It will rightly be challenged in court or go through the usual judicial process.

Matt Rogerson: It seems to me that the whole Bill will be an iterative process. The internet will not suddenly become safe when the Bill receives Royal Assent, so there will be this process whereby guidance and case law are developed, in terms of what a newspaper is, against the criteria. There are exemptions for news publishers in a whole range of other laws that are perfectly workable. I think that Ofcom is perfectly well equipped to create guidance that enables it to be perfectly workable.

Chris Philp

Q Thank you. So you are categorically satisfied, regarding the risks that we have heard articulated, that maleficent actors would not be able to set themselves up in such a way that they benefit from this exemption?

Matt Rogerson: Subject to the guidance developed by Ofcom, which we will be engaged in developing, I do think so. The other thing to bear in mind is that the platforms already have lists of trusted publishers. For example, Google has a list in relation to Google News—I think it has about 65,000 publishers—which it automates to push through Google News as trusted news publishers. Similarly, Facebook has a list of trusted news publishers that it uses as a signal for the Facebook newsfeed. So I do not buy the idea that you can’t automate the use of trusted news sources within those products.

Chris Philp

Q Thank you; that is very helpful. I have only one other question. In relation to questions concerning freedom of speech, the Government believe, and I believe, that the Bill very powerfully protects freedom of speech. Indeed, it does so explicitly through clause 19, in addition to the protections for recognised news publishers that we have discussed already and the additional protections for content of journalistic and democratic importance, notwithstanding the definitional questions that have been raised. Would you agree that this Bill respects and protects free speech, while also delivering the safety objectives that it quite rightly has?

Owen Meredith: If I can speak to the point that directly relates to my members and those I represent, which is “Does it protect press freedom?”, which is perhaps an extension of your question, I would say that it is seeking to. Given the assurances you have given about the detailed amendments that you intend to bring forward—if those are correct, and I am very happy to write to the Committee and comment once we have seen the detail, if it would be helpful to do so—and everything I have heard about what you are intending to do, I believe it will. But I do not believe that the current draft properly and adequately protects press freedom, which is why, I think, you will be bringing forward amendments.

Chris Philp

Q Yes, but with the amendment committed to on Second Reading, you would say that the Bill does meet those freedom of speech objectives, subject to the detail.

Owen Meredith: Subject to seeing the drafting, but I believe the intention—yes.

Chris Philp

Thank you. That is very helpful. Mr Rogerson?

Matt Rogerson: As we know, this is a world first: regulation of the internet, regulation of speech acts on the internet. From a news publisher perspective, I think all the principles are right in terms of what the Government are trying to do. In terms of free speech more broadly, a lot of it will come down to how the platforms implement the Bill in practice. Only time will tell in terms of the guidance that Ofcom develops and how the platforms implement that at vast scale. That is when we will see what impact the Bill actually has in practice.

Chris Philp

Q From a general free speech perspective—which obviously includes the press’s freedom of speech, but everybody else’s as well—what do you think about the right enshrined in clause 19(2), where, for the first time ever, the platforms’ obligation to have regard to the importance of protecting users’ right to freedom of speech is put on the face of a Bill? Do you think that is helpful? It is a legal obligation they do not currently have, but they will have it after the passage of the Bill. In relation to “legal but harmful” duties, platforms will also have an obligation to be consistent in the application of their own terms and conditions, which they do not have to be at the moment. Very often, they are not consistent; very often, they are arbitrary. Do you think those two changes will help general freedom of speech?

Matt Rogerson: Yes. With the development of the online platforms to the dominant position they are in today, that will be a big step forward. The only thing I would add is that, as well as this Bill, the other Bill that will make a massive difference when it comes through is the digital markets unit Bill. We need competition to Facebook so that consumers have a choice and so that they can decide which social network they want to be on, not just the one dominant social network that is available to them in this country.

Chris Philp

I commend your ingenuity in levering an appeal for more digital competition into this discussion. Thank you.

The Chair

One final quick question from the Opposition Front Bench.

Alex Davies-Jones

Q Mr Rogerson, you mentioned that platforms and tech companies currently have a list of approved broadcasters that they are enabled to use, to ensure they have that content. Isn’t it true that one of those broadcasters was Russia Today, and it was only because Ofcom intervened to remove it from social media that it was taken down, but under the current provisions in this Bill, Ofcom would not be able to do that and Russia Today would be allowed to spread disinformation on social media platforms?

Matt Rogerson: On the Russia Today problem, I think Russia Today had a licence from Ofcom, so the platforms probably took their cue from the fact that Russia Today was beamed into British homes via Freeview. Once that changed, the position of having their content available on social media changed as well. Ultimately, if it was allowed to go via broadcast, if it had a broadcast licence, I would imagine that social media companies took that as meaning that it was a—

Alex Davies-Jones

Q But under the new Bill, as journalistic content, it would be allowed to remain on those social media platforms.

Matt Rogerson: I think that would be subject to the guidance that Ofcom creates and the consultation on that guidance. I do not believe that Russia Today would be allowed under the definitions. If it is helpful, I could write to you to set out why.

The Chair

Mr Meredith, Mr Rogerson, thank you very much. If you have any further comments that you wish to make, you are free to put them in writing.

Examination of Witnesses

Tim Fassam, Rocio Concha and Martin Lewis gave evidence.

15:50
None Portrait The Chair
- Hansard -

We will now hear from Tim Fassam, the director of government relations and policy at PIMFA, the Personal Investment Management & Financial Advice Association, and from Rocio Concha, director of policy and advocacy at Which? We will be joined by Martin Lewis, of MoneySavingExpert, in due course. Thank you to the witnesses for joining us. I call the Opposition Front Bench.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you for joining us this afternoon. As a constituency MP, I am sure I am not alone in saying that a vast amount of my casework comes from members of my community writing to me to say that they have been scammed online, that they have been subject to fraud and that they feel horrendous about it. They feel shame and they do not know what to do about it. It is the single biggest crime in the UK, with victims losing an estimated £2.3 billion. In your opinion, does the Bill go far enough to tackle that?

Rocio Concha: This Bill is very important in tackling fraud. It is very important for Which? We were very pleased when fraud was included to tackle the issue that you mentioned and also when paid-for advertising was included. It was a very important step, and it is a very good Bill, so we commend DCMS for producing it.

However, we have found some weaknesses in the Bill, and those can be solved with very simple amendments that will have a big impact in terms of achieving the Bill’s objective. For example, at the moment, search engines such as Google and Yahoo! are not subject to the same duties to protect consumers from fraudulent advertising as social media platforms are. There is no reason for Google and Yahoo! to have weaker duties in the Bill, so we need to solve that.

The second area is boosted content. Boosted content is user-generated content, but it is also advertising. Under the current definition of fraudulent advertising in the Bill, boosted content is not covered. For example, if a criminal creates a Facebook page and starts publishing things about fake investments, and then pays Facebook to boost that content so that it reaches more people, the Bill, at the moment, does not cover that fraudulent advertising.

The last part is that, at the moment, the risk checks that platforms need to do, the transparency reporting that they need to do to say, “We are finding this illegal content and this is what we are doing about it,” and the requirement to give users a way to report illegal content or to complain when platforms are not tackling it, apply only to priority illegal content. They do not apply to fraudulent advertising, but we think they need to.

Paid-for advertising is the most effective way for criminals to reach a lot of people. The good news, as I said before, is that this can be solved with very simple amendments to the Bill. We will send you suggestions for those amendments and, if we fix the problem, we think the Bill will really achieve its objective.

None Portrait The Chair
- Hansard -

One moment—I think we have been joined by Martin Lewis on audio. I hope you can hear us, Mr Lewis. You are not late; we started early. I will bring you in as soon as we have you on video, preferably, but otherwise on audio.

Tim Fassam: I would echo everything my colleague from Which? has said. The industry, consumer groups and the financial services regulators are largely in agreement. We were delighted to see fraudulent advertising and wider issues of economic crime included in the Bill when they were not in the initial draft. We would also support all the amendments that Which? are putting forward, especially the equality between search and social media.

Our members compiled a dossier of examples of fraudulent activity, and the overwhelming majority of the fraudulent adverts were on search, rather than social media. We would also argue that search is potentially higher risk, because the act of searching is an indication that you may be ready to take action. If you are searching “invest my pension”, hopefully you will come across Martin’s site or one of our members’ sites, but if you come across a fraudulent advert at that moment, you are more likely to fall foul of it.

We would also highlight two other areas where we think the Bill needs further work. These are predominantly linked to the interaction between Ofcom, the police and the Financial Conduct Authority, because the definitions of fraudulent adverts and fraudulent behaviour are technical and complex. It is not reasonable to expect Ofcom to be able to ascertain whether an advert or piece of content is in breach of the Financial Services and Markets Act 2000; that is the FCA’s day job. Is it fraud? That is Action Fraud’s and the police’s day job. We would therefore suggest that the Bill go as far as allowing the police and the FCA to direct Ofcom to have content removed, and creating an MOU that enables Ofcom to refer things to the FCA and the police for their expert analysis of whether it breaches those definitions of fraudulent adverts or fraudulent activity.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you, both. You mentioned that search is a concern, especially because it is currently out of scope of the Bill in terms of this issue. Another issue is that when people do use search to look for a financial service or something that they wish to purchase, the cookies are remembered. The algorithms on social media platforms are then triggered to promote specific adverts to them as a result of that search history or things they have mentioned via voice control to their home help devices. That is a concern. Digital advertising that you see on third-party websites is also not within scope. That has been raised as well. Do you have any thoughts on those points?

Rocio Concha: Yes. Open display advertising is not part of the Bill, and that also needs to be tackled; I think the online advertising programme should be the vehicle for that. I agree with you: this Bill is a very important step in the right direction, and it will make a huge difference if we fix these small weaknesses in the current scope. However, there are still areas out there that need to be tackled.

None Portrait The Chair
- Hansard -

Mr Lewis, I am living in hope that we may be able to see you soon—although that may be a forlorn hope. However, I am hoping that you can hear us. Do you want to come in and comment at all at this point? [Interruption.] Oh, we have got you on the screen. Thank you very much for joining us.

Martin Lewis: Hurrah. I am so sorry, everybody—for obvious reasons, it has been quite a busy day on other issues for me, so you’ll forgive me.

None Portrait The Chair
- Hansard -

I can’t think why it has been.

Martin Lewis: I certainly agree with the other two witnesses. Those three issues are all very important to be brought in. From a wider perspective, I was vociferously campaigning to have scam adverts brought within the scope of the Online Safety Bill. I am delighted that that has happened, but let us be honest among ourselves: it is far from a panacea.

Adverts and scams come in so many places—on social media, in search engines and in display advertising, which is very common and is not covered. While I accept that the online advertising programme will address that, if I had my way I would be bringing it all into the Online Safety Bill. However, the realpolitik is that that is not going to happen, so we have to have the support in the OAP coming later.

It is also worth mentioning just for context that, although I think there is little that we can do about this—or it would take brighter people than me—one of the biggest routes for scams is email. Everybody is being emailed—often with my face, which is deeply frustrating. We have flaccid policing of what is going on on social media, and I hope the Bill will improve it, but at least there is some policing, even though it is flaccid, and it is the same on search engines. There is nothing on email, so whatever we do in this Bill, it will not stop scams reaching people. There are many things that would improve that, certainly including far better resourcing for policing so that people who scam individuals get at least arrested and possibly even punished and sentenced. Of course, that does not happen at the moment, because scamming is a crime that you can undertake with near impunity.

There is a lot that needs to be done to make the situation work, but in general the moves in the Online Safety Bill to include scam advertising are positive. I would like to see search engines and display advertising brought into that. I absolutely support the call for the FCA to be involved, because what is and is not a scam can certainly be complicated. There are more obvious ones and less obvious ones. We saw that with the sale of bonds at 5% or 6%, which pretend to be deposit bonds but are nothing of the sort. That might get a bit more difficult for Ofcom, and it would be great to see the regulator involved. I support all the calls of the other witnesses, but we need to be honest with ourselves: even if we do all that, we are still a long way from seeing the back of all scam adverts and all scams.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you, Mr Lewis. My final question is not necessarily about financial services advertising. With the rise of influencer culture, specifically on social media platforms such as TikTok and Instagram, we are seeing a failure to disclose adverts correctly and the potential for harmful advertising. Slimming products, for example, that are not particularly safe, especially for children, are being targeted at children. What more would you like to see this Bill do to tackle some of that? I know the ASA has taken action against some prolific offenders, but what more would you like to see in this Bill to tackle that and keep children safe from adverts that are not marked as such?

Rocio Concha: To be honest, in this area we do not have any specific proposals. I completely agree with you that this is an area that needs to be tackled, but I do not have a specific proposal for this Bill.

Tim Fassam: This is an area that we have raised with the Financial Conduct Authority—particularly the trend for financial advice on TikTok and adverts for non-traditional investments, such as whisky barrels or wine, which do not meet the standards required by the FCA for other investment products. That is also true of a number of cryptocurrency adverts and formats. We have been working with the FCA to try to identify ways to introduce more consistency in the application of the rules. There has been a welcome expansion by the Treasury on the promotion of high-risk investments, which is now a regulated activity in and of itself.

I go back to my initial point. We do not believe that there is any circumstance in which the FCA would want content in any place taken down where that content should not be removed, because they are the experts in identifying consumer harm in this space.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Mr Lewis, do you have anything to add?

Martin Lewis: I still believe that most of this comes down to an issue of policing. The rules are there and are not being enforced strongly enough. The people who have to enforce the rules are not resourced well enough to do that. Therefore, you get people who are able to work around the rules with impunity.

Advertising in the UK, especially online, has been the wild west for a very long time, and it will continue to be so for quite a while. The Advertising Standards Authority is actually better at dealing with the influencer issue, because of course it is primarily strong at dealing with people who listen to the Advertising Standards Authority. It is not very good at dealing with criminal scammers based outside the European Union, who frankly cannot be bothered and will not reply—they are not going to stop—but it is better at dealing with influencers who have a reputation.

We all know it is still extremely fast and loose out there. We need to adequately resource it; putting rules and laws in place is only one step. Resourcing the policing and the execution of those rules and laws is a secondary step, and I have doubts that we will ever quite get there, because resources are always squeezed and put on the back burner.

None Portrait The Chair
- Hansard -

Thank you. Do I have any questions from Government Back Benchers? No. Does anyone have any further questions?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Yes, I do. If nobody else has questions, I will have another bite of the cherry.

None Portrait The Chair
- Hansard -

The Minister is going to come in in a minute.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q I would just like to query your thoughts on a right to redress for victims. Do you think that having an ombudsman in the Bill would be appropriate, and what would you like to see to support victims of fraud?

Martin Lewis: As you will know, I had to sue Facebook for defamation, which is a ridiculous thing to do in order to stop scam adverts. I was unable to report the scam adverts to the police, because I had not been scammed—even though it was my face that was in them—and many victims were not willing to come forward. That is a rather bizarre situation, and we got Facebook to put forward £3 million to set up Citizens Advice Scam Action—that is what I settled for, as well as a scam ad reporting tool.

There are two levels here. The problem is who is at fault. Of course, those mainly at fault for scams are the scammers. They are criminals and should be prosecuted, but not enough of them are. There are times when it is the bank’s fault. And if a company has not put proper precautions in place, and people have been scammed because it put up adverts or posts that it should have prevented, it absolutely needs to bear some responsibility for that. I think you will struggle to have a direct redress system put in place. I would like to see it, but it would be difficult.

Interestingly, I am worried that the £3 million for Citizens Advice Scam Action, which was at least meant to provide help and support for victims of scams, is going to run out. I have not seen any more money coming from Facebook, Google or any of the other big players out there. If we are not going to fund direct redress, we could at least make sure that they fund a collective form of redress and help for the victims of scams, as a bare minimum. It is very strange that these firms go so quiet on this; what they say is, “We are doing everything we can.”

From my meetings with these firms—these are meetings with lawyers in the room, so I have to be slightly careful—one of the things that I would warn the Committee about is that they tend to get you in and give you a presentation on all the technological reasons why they cannot stop scam adverts. My answer to them after about 30 seconds, having stopped what was meant to be an hour-long presentation, is, “I have not said that you need a technological solution. I have said you need a solution. If the answer to stopping scam adverts, and to stopping scams, is that you have to pre-vet every single advert, as old-fashioned media did, and that every advert that you put up has to have been vetted by a human being, so be it. You’re making it a function of technology, but let’s be honest: this is a function of profitability.” We have to look at the profitability of these companies when it comes to redress. Your job—if you forgive me saying this—is to make sure that it costs them more money to let people be scammed than it does to stop people being scammed. If we solve that, we will have a lot fewer scams on social media and in search advertising.

Rocio Concha: I completely agree with everything that Martin says. At the moment, the provisions in the Bill for “priority illegal content” require the platforms to publish reports that say, “This is how much illegal content we are seeing on the platform, and these are the measures that we are going to take.” They are also required to have a way for users to report it and to complain when they think that the platforms are not doing the right thing. At the moment, that does not apply to fraudulent advertising, so you have an opportunity to fix that in the Bill very easily, to at least get the transparency out there. The platform has to say, “We are finding this”—that puts pressure on the platform, because it is there and is also with the regulator—“and these are the measures that we are taking.” That gives us transparency to say, “Are these measures enough?” There should also be an easy way for the user to complain when they think that platforms are not doing the right thing. It is a complex question, but there are many things in the Bill that you can improve in order to improve the situation.

Tim Fassam: I wonder if it would be useful to give the Committee a case study. Members may be familiar with London Capital & Finance. Now, London Capital & Finance is one of the most significant recent scams. It sold mini-bonds fraudulently, at a very high advertised return, which then collapsed, with individuals losing all their money.

Those individuals were compensated through two vehicles. One was a Government Bill, so they were compensated by the taxpayer. The others, because they were found to have been given financial advice despite LCF not having advice permissions or operating through a regulated product, went to the Financial Services Compensation Scheme, which, among others, our members pay for; legitimate financial services companies pay for it. The most recent estimate is over £650 million, and the expectation is that it will reach £1 billion at some point over the next few years, in terms of cost to the economy.

LCF was heavily driven by online advertising, and we would argue that the online platforms were in fact probably the only people who could have stopped it happening. They have profited from those adverts and they have not contributed anything to either of those two schemes. We would argue—possibly not for this Bill—that serious consideration should be given to the tech platforms being part of the financial services compensation scheme architecture and contributing to the costs of scams that individuals have fallen foul of, as an additional incentive for them to get on top of this problem.

Martin Lewis: That is a very important point, but I will just pick up on what Rocio was saying. One of the things that I would like to see is much more rigid requirements for how the reporting of scams is put in place—because I cannot see proper pre-vetting happening with these technology companies, so we have to be able to rely at least on social policing and the reporting of scams. There are many people who recognise a scam, just as there are many people who do not recognise a scam.

I also think this is a wonderful opportunity to make sure that the method, the language and the symbols used for reporting scams are universal in the UK, so that whatever site you are on, if you see a scam advert you click the same symbol and the process works in the same way. That would make reporting a scam simpler on every site, we could train people in how to do it, and we could make the processes work.

Then, of course, we have to make sure that they act on the back of reports. But the various ways scams are reported, the complexity and the number of clicks you need to make mean that it is generally a lot easier to click on an advert than it is to report an advert that is a scam. With so many scams out there, I think there should be parity of ease between those two things.

Caroline Ansell Portrait Caroline Ansell
- Hansard - - - Excerpts

Q May I ask, directly related to that, about the complaints procedure? What would you like to see in terms of changes there, to make it more unified, more universal and simpler? It has been suggested that it is not robust enough, not dynamic enough and not fast enough.

Rocio Concha: You asked about complaints from users. At the moment, the Bill does not require this for fraudulent advertising, so we need to make sure that platforms are required to have an easy tool for people to complain and to report when they see something that is fraudulent. It is an easy fix; you can do it. Then the user will have that tool. It would also give transparency to the regulator and to organisations such as ours, so that we can see what is happening and what measures the platforms are taking.

Tim Fassam: I would agree with that. I would also highlight a particular problem that our members have flagged, and we have flagged directly with Meta and Instagram. Within the definition in the Bill of individuals who can raise concern about social media platforms, our members find they fall between two stools, because quite often what is happening is that people are claiming an association with a legitimate firm. So they will have a firm’s logo, or a firm’s web address, in their profile for their social media and then they will not directly claim to be a financial adviser but imply an association with a legitimate financial advice firm. This happens surprisingly frequently.

Our members find it incredibly difficult to get those accounts taken down, because it is not, strictly, a fraudulent account: the individual is not pretending to be someone else, and the firm is not the person being directly impersonated. The individual is not directly claiming to be an employee; they could just say they are a fan of the company. And the firm is not a direct victim of that individual. What happens is that when our members report the account, it goes into a volume algorithm, and only if a very large number of complaints are made does that particular account get taken down. I think that could be expanded to include complaints from those affected by the account, rather than only from those the account is directly pretending to be.

None Portrait The Chair
- Hansard -

Mr Lewis, you were nodding.

Martin Lewis: I was nodding—I was smiling and thinking, “If it makes you feel any better, Tim, there are pictures of me telling people to invest money that are clearly fake, because I don’t do any adverts, and it is still an absolute pain in the backside for me to get them taken down, having sued Facebook.” So, if your members want to feel any sense of comradeship, they are not alone in this; it is very difficult.

I think the interesting thing is about that volumetric algorithm. Of course, we go back to the fact that these big companies like to err on the side of making money and err away from the side of protecting consumers, because those two, when it comes to scams, are diametrically opposed. The sooner we tidy it up, the better. You could have a process where once there has been a certain number of reports—I absolutely get Tim’s point that in certain cases there is not a big enough volume—the advert is taken down and then the company has to proactively decide to put it back up and effectively say, “We believe this is a valid advert.” Then the system would certainly work better, especially if you bring down the required number of reports. At the moment, I think, there tends to be an erring on the side of, “Keep it up as long as it’s making us money, unless it absolutely goes over the top.”

Many tech experts have shown me adverts with my face in them on various social media platforms. They say it would take them less than five minutes to write a program to screen them out, but those adverts continue to appear. We just have to be conscious here that there is often a move towards self-regulation. Let me be plain, as I am giving evidence. I do not trust any of these companies to have the user and the consumer interest at heart when it comes to their advertising; what they have at heart is their own profits. So if we want to stop them, we have to make this Bill robust enough to stop them, because that is the only way it will stop. Do not rely on them trying to do good, because they are trying to make profit and they will err on the side of that over the side of protecting individuals from scam adverts.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q I thank the witnesses for coming. In terms of regulation, I was going to ask whether you believe that Ofcom is the most suitable regulator to operate in this area. You have almost alluded to the fact that you might not. On that basis, should we specify in the Bill a duty for Ofcom to co-operate with other regulators—for example, the Competition and Markets Authority, the Financial Conduct Authority, Action Fraud or whoever else?

Tim Fassam: I believe that would be helpful. I think Ofcom is the right organisation to manage the relationship with the platforms, because it is going to be much broader than the topics we are talking about in our session, but we do think the FCA, Action Fraud and potentially the CMA should be able to direct, and be very clear with Ofcom, that action needs to be taken. Ofcom should have the ability to ask for things to be reviewed to see whether they break the rules.

The other area where we think action probably needs to be taken is where firms are under investigation, because the Bill assumes it is clear cut whether something is fraud, a scam, a breach of the regulations or not. In some circumstances, that can take six months or a year to establish through investigation. We believe that if, for example, the FCA feels that something is high risk, it should be able to ask Ofcom to suspend an advert, or a firm from advertising, pending an investigation to assess whether it is a breach of the regulation.

Rocio Concha: I agree that Ofcom is the right regulator, the main regulator, but it needs to work with the other regulators—with the FCA, ASA and CMA—to enforce the Bill effectively. There is another area. Basically, we need to make sure that Ofcom and all the regulators involved have the right resources. When the initial version of the Bill was published, Ofcom got additional resources to enable it to enforce the Bill. But the Bill has increased in scope, because now it includes fraud and fraudulent advertising. We need to make sure that Ofcom has the right resources to enforce the full Bill effectively. That is something that the Government really need to consider.

Martin Lewis: I was going to make exactly that point, but it has just been made brilliantly so I will not waste your time.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I thank the witnesses for joining us this afternoon, and particularly Martin Lewis for his campaigning in this area.

I will start by agreeing with the point that Martin Lewis made a minute or two ago—that we cannot trust these companies to work on their own. Mr Lewis, I am not sure whether you have had a chance to go through clause 34, which we inserted into the Bill following your evidence to the Joint Committee last year. It imposes a duty on these companies to take steps and implement systems to

“prevent individuals from encountering content consisting of fraudulent advertisements”.

There is a clear duty to stop them from doing this, rather as you were asking a minute ago when you described the presentation. Does that strong requirement in clause 34, to stop individuals from encountering fraudulent advertisement content, meet the objective that you were asking for last year?

Martin Lewis: Let me start by saying that I am very grateful that you have put it in there and thankful that the Government have listened to our campaign. What I am about to say is not intended as criticism.

It is very difficult to know how this will work in practice. The issue is all about thresholds. How many scam adverts can we stomach? I still have, daily—even from the platform that I sued, never mind the others—tens of reports directly to me of scam adverts with my face on. Even though there is a promise that they will try to mitigate that, the companies are not doing it. We have to have a legitimate understanding that we are not going to have zero scam adverts on these platforms; unless they were to pre-vet, which I do not think they will, the way they operate means that will not happen.

I am not a lawyer but my concern is that the Bill should make it clear, and that any interpretation of the Bill from Ofcom should be clear, about exactly what threshold of scam adverts is acceptable—we know that they are going to happen—and what threshold is not acceptable. I do not have the expertise to answer your question; I have to rely on your expertise to do that. But I ask the Committee to think properly about what the threshold level should be.

What is and is not acceptable? What counts as “doing everything they can”? They are going to get big lawyers involved if you say there must be zero scam adverts—that is not going to happen. How many scam adverts are acceptable and how many are not? I am so sorry to throw that back as a question when I am a witness, but I do not have the expertise to answer. But that is my concern: I am not 100% convinced of the threshold level that you are setting.

None Portrait The Chair
- Hansard -

Q Mr Fassam, do you have the answer?

Tim Fassam: I think we are positive about the actions that have been taken regarding social media; our concern is that the clause is not applied to search and that it excludes paid-for ads that are also user-generated content—promoted tweets or promoted posts, for example. We would ensure that that applied to all paid-for adverts and that it was consistent between social media and search.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Mr Fassam, I will address those two questions, if I may. Search is covered by clause 35, and user-generated content is subject to the Bill’s general provisions on user-generated content. Included in the scope of those provisions are the priority illegal offences defined in schedule 7. Among them, on page 185—not that I expect you to have memorised the Bill—are financial services offences, including a number of offences to do with pretending to carry out regulated financial activity when in fact you are not regulated. Also included are the fraud offences—the various offences under the Fraud Act 2006. Do come back if you think I have this wrong, but I believe that we have search covered in clause 35 and promoted user-generated content covered via schedule 7, on page 185.

Tim Fassam: You absolutely do, but to a weaker standard than in clause 34.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q In clause 35 there is the drafting point that we are looking at. It says “minimise the risk” instead of “prevent”. You are right to point out that drafting issue. In relation to the user-generated stuff, there is a duty on the platforms to proactively stop priority illegal content, as defined in schedule 7. I do take your drafting point on clause 35.

Tim Fassam: Thank you.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I want to pick up on Martin Lewis’s point about enforcement. He said that he had to sue Facebook himself, which was no doubt an onerous, painful and costly enterprise—at least costly initially, because hopefully you got your expenses back. Under the Bill, enforcement will fall to Ofcom. The penalties that social media firms could be handed by Ofcom for failing to meet the duties we have discussed include a fine amounting to 10% of global revenue as a maximum, which runs into billions of pounds. Do the witnesses feel that level of sanction—10% of global revenue and ultimately denial of service—is adequately punitive? Will it provide an adequate deterrent to the social media firms that we are considering?

None Portrait The Chair
- Hansard -

Mr Lewis, as you were named, I think you had better start.

Martin Lewis: Ten per cent. of the global revenue of a major social media or search player is a lot of money—it certainly would hit them in the pocket. I reiterate my previous point: it is all about the threshold at which that comes in and how rigidly Ofcom is enforcing it. There are very few organisations that have the resources, legally, to take on big institutions of state, regulators and Governments; if any do, it is the gigantic tech firms. Absolutely, 10% of global revenue sounds like a suitable wall to prevent them jumping over. That is the aim, because we want those companies to work for people; we don’t want them to run scam ads. We want them to work well, and we want them never to be fined, because there is no reason to fine them.

The proof of the pudding will be in how robust Ofcom feels it can be, off the back of the Bill, taking those companies on. I go back to needing to understand how many scam ads you permit under the duty to prevent scam ads. It clearly is not zero—you are not going to tell me it is zero. So how many are allowed, what are the protocols that come into place and how quickly do they have to take the ads down? Ultimately, I think that is going to be a decision for Ofcom, but it is the level of stringency that you put on Ofcom in order for it to interpret how it takes that decision that is going to decide whether this works or not.

Rocio Concha: I completely agree with Martin. Ofcom needs to have the right resources in order to monitor how the platforms are doing that, and it needs to have the right powers. At the moment, Ofcom can ask for information in a number of areas, including fraud, but not advertising. We need to make sure that Ofcom can ask for that information so that it can monitor what the platforms are doing. We need to make sure that it has the right powers and the right resources to enforce the Bill effectively.

Tim Fassam: You would hope that 10% would certainly be a significant disincentive. Our focus would be on whether companies are contributing to compensating the victims of fraud and scams, and whether they have been brought into the architecture that is utilised to compensate victims of fraud and scams. That would be the right aim in terms of financial consequences for the firms.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I have one final question that again relates to the question of reporting scams, which I think two or three witnesses have referred to. I will briefly outline the provisions in the Bill that address that. I would like to ask the witnesses if they think those provisions are adequate. First, in clause 18, the Bill imposes on large social media firms an obligation to have a proper complaints procedure so that complaints are not ignored, as appears to happen on a shockingly frequent basis. That is at the level of individual complaints. Of course, if social media firms do not do that, it will be for Ofcom to enforce against them.

Secondly, clauses 140 and 141 contain a procedure for so-called super-complaints, where a body that represents users—it could be Which? or an organisation like it—is able to bring something almost like a class action or group complaint to Ofcom if it thinks a particular social media firm has systemic problems. Will those two clauses address the issue of complaints not being properly handled or, in some cases, not being dealt with at all?

Martin Lewis: Everything helps. I think the super-complaint point is really important. We must remember that many victims of scams are not so good at complaining and, by the nature of the crossover of individuals, there is a huge mental health issue at stake with scams. There is both the impact on people with mental health issues and the impact on people’s mental health of being scammed, which means that they may not be as robust and up for the fight or for complaining. As long as it works and applies to all the different categories that are repeated here, the super-complaint status is a good measure.

We absolutely need proper reporting lines. I urge you, Minister—I am not sure that this is in the Bill—to standardise this so that we can talk about what someone should do when they report: the same imagery, the same button. With that, people will know what to do. The more we can do that, the easier and better the system will be.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q That is a really important point—you made it earlier—about the complaints process being hidden. Clause 18(2)(c) says that the complaints system must be

“easy to access, easy to use (including by children) and transparent.”

The previous paragraph, (b), states that the system must be one that

“provides for appropriate action to be taken by the provider of the service in response to complaints of a relevant kind”.

The Bill is saying that a complaints process must do those two things, because if it does not, Ofcom will be on the company’s back.

Martin Lewis: I absolutely support all of that. I am just pushing for that tiny bit more leadership, whether it is from you or Ofcom, that comes up with a standardised system with standardised imagery and placing, so that everybody knows that on the top left of the advert you have the button that you click to fill in a form to report it. The more we have that cross-platform and cross-search and cross-social media, the easier it will be for people. I am not sure it is a position for the Bill in itself, but Government leadership would work really well on that.

Tim Fassam: They are both welcome—the super-complaint and the new complaints process. We want to ensure that we have a system that looks not just at the number of complaints, but at their content. In particular, you may find on the super-complaint point that, for example, the firm that a fraudster is pretending to be is the organisation that has the best grasp of the issue, so do not forget about commercial organisations as well as consumer organisations when thinking about who is appropriate to make super-complaints.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Well, your organisation, as one that represents firms in this space, could in fact be designated as a super-complainant to represent your members, as much as someone like Which? could be designated to represent the man on the street like you or me.

Tim Fassam: Absolutely. We suggested to Meta when we met them about 18 months ago that we could be a clearing house to identify for them whether they need to take something seriously, because our members have analysed it and consider it to represent a real risk.

None Portrait The Chair
- Hansard -

Last word to Rocio Concha.

Rocio Concha: I completely agree about the super-complaint. We as a consumer organisation have super-complaint powers. As with other regulators, we would like to have it in this context as well. We have done many super-complaints representing consumers in particular areas with the regulators, so I think we need it in this Bill as well.

On reporting, I want to clarify something. At the moment, the Bill does not require platforms to give users a way to complain about or report fraudulent advertising. That requirement exists for priority illegal content, but our assessment of the Bill is that it is unclear whether it applies to fraudulent advertising. We probably do not have time to look at this now, but we sent you amendments for where we thought the Bill had weaknesses. We agree with you that users should have an easy and transparent way to report illegal content or fraudulent advertising, and they should have an easy way to complain about it. At the moment, it is not clear that the Bill will require that for fraudulent advertising.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yes, that is a very good question. Please do write to us about that. Clause 140, on super-complaints, refers to “regulated services”. My very quick, off-the-cuff interpretation is that that would include everything covered and regulated by the Bill. I notice that there is a reference to user-to-user services in clause 18. Do write to us on that point. We would be happy to look at it in detail. Do not take my comment as definitive, because I have only just looked at it in the last 20 seconds.

Rocio Concha: My comment was in relation not to the super-complaints but to the requirements. We already sent you our comments with suggestions on how you can fix this in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am very grateful. Thank you.

None Portrait The Chair
- Hansard -

Ms Concha and Mr Fassam, thank you very much. Do please write in if you have further comments. Mr Lewis, we are deeply grateful to you. You can now go back to your day job and tell us whether we are going to be worse or better off as a result of the statement today—please don’t answer that now.

Martin Lewis: I am interviewing the Chancellor in 15 minutes.

None Portrait The Chair
- Hansard -

Thank you all very much.

Examination of Witness

Frances Haugen gave evidence.

16:36
None Portrait The Chair
- Hansard -

We now have Frances Haugen, a former Facebook employee. Thank you for joining us.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Good afternoon, Frances. Thank you for joining us.

Frances Haugen: Thank you so much for inviting me.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

No problem. Could you give us a brief overview of how, in your opinion, platforms such as Meta will be able to respond to the Bill if it is enacted in its current form?

Frances Haugen: There are going to be some pretty strong challenges in implementing the Bill as it is currently written. I want to be really honest with you about the limitations of artificial intelligence. We call it artificial intelligence, but people who actually build these systems call it machine learning, because it is not actually intelligent. One of the major limitations in the Bill is that there are carve-outs, such as “content of democratic importance”, that computers will not be able to distinguish. That might have very serious implications. If the computers cannot differentiate between whether something is or is not hate speech, imagine a concept even more ambiguous that requires even more context, such as defining what is of democratic importance. If we have carve-outs like that, it may actually prevent the platforms from doing any content moderation, because they will never know whether a piece of content is safe or not safe.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q You have just answered my question on AI and algorithmic intention. When I questioned Meta in Tuesday’s oral evidence session, they were unable to tell me how many human moderators they had directly working for them and how many had abided by a UK standard and code of conduct. Do you see the lack of human moderators being a problem as the Bill is enacted by platforms such as Meta?

Frances Haugen: I think it is unacceptable that large corporations such as this do not answer very basic questions. I guarantee you that they know exactly how many moderators they have hired—they have dashboards to track these numbers. The fact that they do not disclose those numbers shows why we need to pass laws to have mandatory accountability. The role of moderators is vital, especially for things like people questioning judgment decisions. Remember, no AI system is going to be perfect, and one of the major ways people can have accountability is to be able to complain and say, “This was inaccurately judged by a computer.” We need to ensure that there is always enough staffing and that moderators can play an active role in this process.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q One final question from me, because I know others will want to come in. How do you think platforms such as Meta—I know we have used Meta as an example, but there are others—can be incentivised, beyond the statutory duty that we are currently imposing, to publish their data to allow academics and researchers into their platforms to examine exactly what is going on? Or is this the only way?

Frances Haugen: All industries that live in democratic societies must live within democratic processes, so I do believe that it is absolutely essential that we the public, through our democratic representatives like yourself, have mandatory transparency. The only two other paths I currently see towards getting any transparency out of Meta, because Meta has demonstrated that it does not want to give even the slightest slivers of data—for example, how many moderators there are—are via ESG, so we can threaten them with divestment by saying, “Prosocial companies are transparent with their data,” and via litigation. In the United States, sometimes we can get data out of these companies through the discovery process. If we want consistent and guaranteed access to data, we must put it in the Bill, because those two routes are probabilistic—we cannot ensure that we will get a steady, consistent flow of data, which is what we need to have these systems live within a democratic process.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Turning to the issue of child safety and online abuse with images involving children, what should be added to or removed from the Bill to improve how it protects children online? Have you got any thoughts on that? Some groups have described the Bill’s content as overly broad. Would you make any comments on how effective it will be in terms of online safety for children?

Frances Haugen: I am not well versed on the exact provisions in the Bill regarding child safety. What I can say is that one of the most important things that we need to have in there is transparency around how the platforms in general keep children under the age of 13 off their systems—transparency on those processes—because we know that Facebook is doing an inadequate job. That is the single biggest lever in terms of child safety.

I have talked to researchers at places like Oxford and they talk about how, with social media, one of the critical windows is when children transition through puberty, because they are more sensitive on issues, they do not have great judgment yet and their lives are changing in really profound ways. Having mandatory transparency on what platforms are doing to keep kids off their platforms, and the ability to push for stronger interventions, is vital, because keeping kids off them until they are at least 13, if not 16, is probably the biggest single thing we can do to move the ball down the field for child safety.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q You say that transparency is so important. Can you give us any specifics about particular areas that should be subject to transparency?

Frances Haugen: Specifically for children or across the whole platform?

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Specifically for children.

Frances Haugen: I will give you an example. Facebook has estimated ages for every single person on the platform, because the reality is that lots of adults also lie about their ages when they join, and advertisers want to target very specific demographics—for example, if you are selling a kit for a 40th birthday, you do not want to mis-target that by 10 years. Facebook has estimated ages for everyone on the platform. It could be required to publish those every year, so that we could say, “Hey, for the kids on the platform who you currently believe, using your estimated ages, are 14 years old—based not on how old they say they are, but on your estimate that they are 14 years old—when did they join the platform? What fraction of your 14-year-olds have been on the platform since they were 10?” That is a vital statistic.

If the platforms were required to publish that every single quarter, we could say, “Wow! You were doing really badly four years ago, and you need to get a lot better.” Those kinds of lagging metrics are a way of allowing the public to grade Facebook’s homework, instead of just trusting Facebook to do a good job.

Facebook already does analyses like this today. They already know that on Facebook Blue, for example, for some age cohorts, 20% of 11-year-olds were on the platform—and back then, not that many kids were online. Today, I would guess a much larger fraction of 11-year-olds are on Instagram. We need to have transparency into how badly they are doing their jobs.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Frances, do you think that the Bill needs to set statutory minimum standards for things such as risk assessments and codes of practice? What will a company such as Facebook do without a minimum standard to go by?

Frances Haugen: It is vital to get minimum standards for things such as risk assessments and codes of conduct into the statute. Facebook has demonstrated time and again—the reality is that other social media platforms have too—that it does the bare minimum to avoid really egregious reputational damage. It does not ensure the level of quality needed for public safety. If you do not put that into the Bill, I worry that it will be watered down by the mountains of lobbyists that Facebook will throw at this problem.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you. You alluded earlier to the fact that the Bill contains duties to protect content of democratic importance and journalistic content. What is your view on those measures and their likely effectiveness?

Frances Haugen: I want to reiterate that AI struggles to do even really basic tasks. For example, Facebook’s own document said that it only took down 0.8% of violence-inciting content. Let us look at a much broader category, such as content of democratic importance—if you include that in the Bill, I guarantee you that the platforms will come back to you and say that they have no idea how to implement the Bill. There is no chance that AI will do a good job of identifying content of democratic importance at any point in the next 30 years.

The second question is about carve-outs for media. At a minimum, we need to greatly tighten the standards for what counts as a publication. Right now, I could get together with a friend and start a blog and, as citizen journalists, get the exact same protections as an established, thoughtful, well-staffed publication with an editorial board and other forms of accountability. Time and again, we have seen countries such as Russia use small media outlets as part of their misinformation and disinformation strategies. At a minimum, we need to really tighten that standard.

We have even seen situations where they will use very established publications, such as CNN. They will take an article that says, “Ukrainians destroyed a bunch of Russian tanks,” and intentionally have their bot networks spread that out. They will just paste the link and say, “Russia destroyed a bunch of tanks.” People briefly glance at the snippet, they see the picture of the tank, they see “CNN”, and they think, “Ah, Russia is winning.” We need to remember that even real media outlets can be abused by our enemies to manipulate the public.

Caroline Ansell Portrait Caroline Ansell
- Hansard - - - Excerpts

Q Good afternoon, Frances. I want to ask you about anonymity and striking a balance. We have heard variously that anonymity affords some users safe engagement and actually reduces harm, while for others anonymity has been seen to fuel abuse. How do you see the balance, and how do you see the Bill striving to achieve that?

Frances Haugen: It is important for people to understand what anonymity really is and what it would really mean to have confirmed identities. Platforms already have a huge amount of data on their users. We bleed information about ourselves on to these platforms. It is not about whether the platforms could identify people to the authorities; it is that they choose not to do that.

Secondly, if we did, say, mandate IDs, platforms would have two choices. The first would be to require IDs, so that every single user on their platform would have to have an ID that is verifiable via a computer database—you would have to show your ID and the platform would confirm it off the computer. Platforms would suddenly lose users in many countries around the world that do not have well-integrated computerised databases. The platforms will come back to you and say that they cannot lose a third or half of their users. As long as they are allowed to have users from countries that do not have those levels of sophisticated systems, users in the UK will just use VPNs—a kind of software that allows you to kind of teleport to a different place in the world—and pretend to be users from those other places. Things such as ID verification are not very effective.

Lastly, we need to remember that there is a lot of nuance in things like encryption and anonymity. As a whistleblower, I believe there is a vital need for having access to private communications, but I believe we need to view these things in context. There is a huge difference between, say, Signal, which is open source and anyone in the world can read the code for it—the US Department of Defence only endorses Signal for its employees, because it knows exactly what is being used—and something like Messenger. Messenger is very different, because we have no idea how it actually works. Facebook says, “We use this protocol,” but we cannot see the code; we have no idea. It is the same for Telegram; it is a private company with dubious connections.

If people think that they are safe and anonymous, but they are not actually anonymous, they can put themselves at a lot of risk. The secondary thing is that when we have anonymity in context with more sensitive data—for example, Instagram and Facebook act like directories for finding children—that is a very different context for having anonymity and privacy from something like Signal, where you have to know someone’s phone number in order to contact them.

These things are not cut-and-dried, black-or-white issues. I think it is difficult to have mandatory identity. I think it is really important to have privacy. We have to view them in context.

Caroline Ansell Portrait Caroline Ansell
- Hansard - - - Excerpts

Thank you. That is very helpful.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you for joining us and giving evidence, Frances; it is nice to see you again. We had evidence from Meta, your former employer, on Tuesday, in which its representative suggested that it engages in open and constructive co-operation with researchers. Do you think that testimony was true?

Frances Haugen: I think that shows a commendable level of chutzpah. Researchers have been trying to get really basic datasets out of Facebook for years. When I talk about a basic dataset, it is things as simple as, “Just show us the top 10,000 links that are distributed in any given week.” When you ask for information like that in a country like the United States, no one’s privacy is violated: every one of those links will have been viewed by hundreds of thousands, if not millions of people. Facebook will not give out even basic data like that, even though hundreds if not thousands of academics have begged for this data.

The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes—and remember, it does not even say that it will happen; Ofcom might say, “Oh, maybe not.” We need to take a page from the Digital Services Act and say, “On the day that the Bill passes, we get access to data,” or, at worst, “Within three months, we are going to figure out how to do it.” It needs to be not, “Should we do it?” but “How will we do it?”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q When I was asking questions on Tuesday, the representative of Meta made a second claim that raised my eyebrow. He claimed that, in designing its algorithms, it did not primarily seek to optimise for engagement. Do you think that was true?

Frances Haugen: First, I left the company a year ago. Because we have no transparency with these companies, they do not have to publish their algorithms or the consequences of their algorithms, so who knows? Maybe they use astrology now to rank the content. We have no idea. All I know is that Meta definitely still uses signals—did users click on it, did they dwell on it, did they re-share it, or did they put a comment on it? There is no way it is not using those. It is very unlikely that they do not still use engagement in their ranking.

The secondary question is, do they optimise for engagement? Are they trying to maximise it? It is possible that they might interpret that and say, “No, we have multiple things we optimise for,” because that is true. They look at multiple metrics every single time they try to decide whether or not to shift things. But I think it is very likely that they are still trying to optimise for engagement, either as their top metric or as one of their top metrics.

Remember, Meta is not trying to optimise for engagement to keep you there as long as possible; it is optimising for engagement to get you and your friends to produce as much content as possible, because without content production, there can be no content consumption. So that is another thing. They might say, “No, we are optimising for content production, not engagement,” but that is one step off.

Chris Philp

Q The Bill contains provisions that require companies to do risk assessments that cover their algorithms, and then to be transparent about those risk assessments with Ofcom. Do you think those provisions will deliver the change required in the approach that the companies take?

Frances Haugen: I have a feeling that there is going to be a period of growing pains after the first time these risk assessments happen. I can almost entirely guarantee you that Facebook will try to give you very little. It will likely be a process of back and forth with the regulator, where you are going to have to have very specific standards for the level of transparency, because Facebook is always going to try to give you the least possible.

One of the things that I am actually quite scared about is that, in things like the Digital Services Act, penalties go up to 10% of global profits. Facebook as a company has something like 35% profit margins. One of the things I fear is that these reports may be so damning—that we have such strong opinions after we see the real, hard consequences of what they are doing—that Facebook might say, “This isn’t worth the risk. We’re just going to give you 10% of our profits.” That is one of the things I worry about: that they may just say, “Okay, now we’re 25% profitable instead of 35% profitable. We’re that ashamed.”

Chris Philp

Q Let me offer a word of reassurance on that. In this Bill, the penalties are up to 10% of global revenue, not profit. Secondly, in relation to the provision of information to Ofcom, there is personal criminal liability for named executives, with a period of incarceration of up to two years, for the reason you mentioned.

Frances Haugen: Oh, good. That’s wonderful.

Chris Philp

We had a case last year where Facebook—it was actually Facebook—failed to provide some information to the CMA in a takeover case, and it paid a £50 million fine rather than provide the information, hence the provision for personal criminal liability for failing to provide information that is now in this Bill.

My final question is a simple one. From your perspective, at the moment, when online tech companies are making product design decisions, what priority do they give to safety versus profit?

Frances Haugen: What I saw when I was at Facebook was that there was a culture that encouraged people to always have the most positive interpretation of things. If things are still the same as when I left—like I said, I do not know; I left last May—what I saw was that people routinely had to weigh little changes in growth versus changes in safety metrics, and unless they were major changes in safety metrics, they would continue to pursue growth. The only problem with a strategy like that is that those little deficits add up to very large harms over time, so we must have mandated transparency. The public have to have access to data, because unless Facebook has to account for the public cost of the harm of its products, it is not going to give those little incremental harms enough priority as they add up.

Chris Philp

Thank you very much.

The Chair

Ms Haugen, thank you very much indeed for joining us today, and thank you also for the candour with which you have answered our questions. We are very grateful to you indeed.

The Committee will meet again on Tuesday 7 June at 9.25 am for the start of its line-by-line consideration of the Bill. That session will be in Committee Room 14.

Ordered, That further consideration be now adjourned. —(Steve Double.)

16:58
Adjourned till Tuesday 7 June at twenty-five minutes past Nine o’clock.
Written evidence to be reported to the House
OSB24 The Investment Association
OSB25 Jeremy Peckam
OSB26 Mid-Sized Platform Group
OSB27 Carnegie UK
OSB28 Full Fact
OSB29 Together Association
OSB30 The Christian Institute
OSB31 Clean up the Internet
OSB32 Joint Submission on Children's Amendments on the Online Safety Bill submitted by 5Rights Foundation, NSPCC and Children’s Charities’ Coalition on Internet Safety (CHIS) (and others)
OSB33 Internet Advertising Bureau UK (IAB UK)
OSB33A Annex - IAB UK Digital advertising industry commitment to tackle scam advertising online
OSB34 Victims’ Commissioner
OSB35 The British Psychological Society
OSB36 Paul Wragg
OSB37 Joint submission from Global Encryption Coalition signatories
OSB38 Internet Matters

Online Safety Bill (Fifth sitting)

Committee stage
Tuesday 7th June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 7 June 2022
The Committee consisted of the following Members:
Chairs: † Sir Roger Gale, Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 7 June 2022
(Morning)
[Sir Roger Gale in the Chair]
Online Safety Bill
09:25
The Chair

Good morning, ladies and gentlemen. If anybody wishes to take their jacket off, they are at liberty to do so when I am in the Chair—my co-Chairman is joining us, and I am sure she will adopt the same procedure. I have a couple of preliminary announcements. Please make sure that all mobile phones are switched off. Tea and coffee are not allowed in the Committee, I am afraid. I think they used to be available outside in the corridor, but I do not know whether that is still the case.

We now start line-by-line consideration of the Bill. The selection and grouping list for the sitting is available on the table in the room for anybody who does not have it. It shows how the clauses and selected amendments have been grouped for debate. Grouped amendments are generally on the same subject or a similar issue.

Now for a slight tutorial to remind me and anybody else who is interested, including anybody who perhaps has not engaged in this arcane procedure before, of the proceedings. Each group has a lead amendment, and that amendment is moved first. The other grouped amendments may be moved later, but they are not necessarily voted on at that point, because some of them relate to matters that appear later in the Bill. Do not panic; that does not mean that we have forgotten them, but that we will vote on them—if anybody wants to press them to a Division—when they are reached in order in the Bill. However, if you are in any doubt and feel that we have missed something—occasionally I do; the Clerks never do—just let us know. I am relaxed about this, so if anybody wants to ask a question about anything that they do not understand, please interrupt and ask, and we will endeavour to confuse you further.

The Member who has put their name to the lead amendment, and only the lead amendment, is usually called to speak first. At the end of the debate, the Minister will wind up, and the mover of the lead amendment—that might be the Minister if it is a Government amendment, or it might be an Opposition Member—will indicate whether they want a vote on that amendment. We deal with that first, then we deal with everything else in the order in which it arises. I hope all that is clear, but as I said, if there are any questions, please interrupt and ask.

We start consideration of the Bill with clause 1, to which there are no amendments. Usually, the Minister would wind up at the end of each debate, but as there are no amendments to clause 1, the Minister has indicated that he would like to say a few words about the clause.

Clause 1

Overview of Act

Question proposed, That the clause stand part of the Bill.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Thank you, Sir Roger; it is a pleasure to serve under your chairmanship once again. It may be appropriate to take this opportunity to congratulate my right hon. Friend the Member for Basingstoke on her damehood in the Queen’s birthday honours, which was very well deserved indeed.

This simple clause provides a high-level overview of the different parts of the Bill and how they come together to form the legislation.

The Chair

The Minister was completely out of order in congratulating the right hon. Lady, but I concur with him. I call the shadow Minister.

Alex Davies-Jones (Pontypridd) (Lab)

Thank you, Sir Roger; it is a genuine privilege and an honour to serve under your chairship today and for the duration of the Committee. I concur with congratulations to the right hon. Member for Basingstoke and I, too, congratulate her.

If you would indulge me, Sir Roger, this is the first time I have led on behalf of the Opposition in a Bill Committee of this magnitude. I am very much looking forward to getting stuck into the hours of important debate that we have ahead of us. I would also like to take this opportunity to place on record an early apology for any slight procedural errors I may inadvertently make as we proceed. However, I am very grateful to be joined by my hon. Friend the Member for Worsley and Eccles South, who is much more experienced in these matters. I place on record my gratitude for her support. Along with your guidance, Sir Roger, I expect that I will quickly pick up the correct parliamentary procedure as we make our way through this colossal legislation. After all, we can agree that it is a very important piece of legislation that we all need to get right.

I want to say clearly that the Opposition welcome the Bill in principle; the Minister knows that, as we voted in favour of it at Second Reading. However, it will come as no surprise that we have a number of concerns about areas where we feel the Bill is lacking, which we will explore further. We have many reservations about how the Bill has been drafted. The structure and drafting push services into addressing harmful content—often in a reactive, rather than proactive, way—instead of harmful systems, business models and algorithms, which would be a more lasting and systemic approach.

Despite that, we all want the Bill to work and we know that it has the potential to go far. We also recognise that the world is watching, so the Opposition look forward to working together to do the right thing, making the internet a truly safe space for all users across the UK. We will therefore not oppose clause 1.

Dan Carden (Liverpool, Walton) (Lab)

It is a pleasure to serve on the Committee. I want to apologise for missing the evidence sessions. Unfortunately, I came down with covid, but I have been following the progress of the Committee.

This is important legislation. We spend so much of our lives online these days, yet there has never been an attempt to regulate the space, or for democratically elected Members to contribute towards its regulation. Clause 1 gives a general outline of what to expect in the Bill. I have no doubt that this legislation is required, but also that it will not get everything right, and that it will have to change over the years. We may see many more Bills of this nature in this place.

I have concerns that some clauses have been dropped, and I hope that there will be future opportunities to amend the Bill, not least with regard to how we educate and ensure that social media companies promote media literacy, so that information that is spread widely online is understood in its context—that it is not always correct or truthful. The Bill, I hope, will go some way towards ensuring that we can rely more on the internet, which should provide a safer space for all its users.

Dame Maria Miller (Basingstoke) (Con)

May I join others in welcoming line-by-line scrutiny of the Bill? I am sure that the Minister will urge us to ensure that we do not make the perfect the enemy of the good. This is a very lengthy and complex Bill, and a great deal of time and scrutiny has already gone into it. I am sure that we will all pay due regard to that excellent work.

The hon. Member for Pontypridd is absolutely right to say that in many ways the world is watching what the Government are doing regarding online regulation. This will set a framework for many countries around the world, and we must get it right. We are ending the myth that social media and search engines are not responsible for their content. Their use of algorithms alone demonstrates that, while they may not publish all of the information on their sites, they are the editors at the very least and must take responsibility.

We will no doubt hear many arguments about the importance of free speech during these debates and others. I would like gently to remind people that there are many who feel that their free speech is currently undermined by the way in which the online world operates. Women are subject to harassment and worse online, and children are accessing inappropriate material. There are a number of areas that require specific further debate, particularly around the safeguarding of children, adequate support for victims, ensuring that the criminal law is future-proof within this framework, and ensuring that we pick up on the comments made in the evidence sessions regarding the importance of guidance and codes of practice. It was slightly shocking to hear from some of those giving evidence that the operators did not know what was harmful, as much has been written about the harm caused by the internet.

I will listen keenly to the Minister’s responses on guidance and codes of practice, and secondary legislation more generally, because it is critical to how the Bill works. I am sure we will have many hours of interesting and informed debate on this piece of legislation. While there has already been a great deal of scrutiny, the Committee’s role is pivotal to ensure that the Bill is as good as it can be.

Question put and agreed to.

Clause 1 accordingly ordered to stand part of the Bill.

Clause 2

Key Definitions

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss the following:

Clause 3 stand part.

That schedules 1 and 2 be the First and Second schedules to the Bill.

Clause 4 stand part.

Alex Davies-Jones

We do not oppose clauses 2, 3 or 4, or the intentions of schedules 1 and 2, and have not sought to amend them at this stage, but this is an important opportunity to place on record some of the Opposition’s concerns as the Bill proceeds.

The first important thing to note is the broadness in the drafting of all the definitions. A service has links to the UK if it has a significant number of users in the UK, if the UK users are a target market, or if

“there are reasonable grounds to believe there is a material risk of significant harm to individuals”

in the UK using the service. Thus, territorially, a very wide range of online services could be caught. The Government have estimated in their impact assessment that 25,100 platforms will be in scope of the new regime, which is perhaps a conservative estimate. The impact assessment also notes that approximately 180,000 platforms could potentially be considered in scope of the Bill.

The provisions on extraterritorial jurisdiction are, again, extremely broad and could lead to some international platforms seeking to block UK users in a way similar to that seen following the introduction of GDPR. Furthermore, as has been the case under GDPR, those potentially in scope through the extraterritorial provisions may vigorously resist attempts to assert jurisdiction.

Notably absent from schedule 1 is an attempt to include or define how the Bill and its definitions of services that are exempt may adapt to emerging future technologies. The Minister may consider that a matter for secondary legislation, but as he knows, the Opposition feel that the Bill already leaves too many important matters to be determined at a later stage via statutory instruments. Although it is good to see that the Bill has incorporated everyday internet behaviour such as a like or dislike button, as well as factoring in the use of emojis and symbols, it fails to consider how technologies such as artificial intelligence will sit within the framework as it stands.

It is quite right that there are exemptions for everyday user-to-user services such as email, SMS, and MMS services, and there is an all-important balance to strike between our fundamental right to privacy and keeping people safe online. That is where some difficult questions arise on platforms such as WhatsApp, which are embedded with end-to-end encryption as a standard feature. Concerns have been raised about Meta’s plans to extend that feature to Instagram and Facebook Messenger.

The Opposition also have concerns about private messaging features more widely. Research from the Centre for Missing and Exploited Children highlighted the fact that a significant majority of online child abuse takes place in private messages. For example, 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Furthermore, recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people they have not met offline before. Nearly three quarters—74%—of cases of children contacted by someone they do not know initially take place by private message. We will address this issue further in new clause 20, but I wanted to highlight those exemptions early on, as they are relevant to schedule 1.

On a similar point, we remain concerned about how emerging online systems such as the metaverse have had no consideration in the Bill as it stands. Only last week, colleagues will have read about a researcher from SumOfUs, a non-profit organisation that seeks to limit the power of large corporations, who claimed that she experienced sexual assault by a stranger in Meta’s virtual reality space, Horizon Worlds. The organisation’s report said:

“About an hour into using the platform, a SumOfUs researcher was led into a private room at a party where she was raped by a user who kept telling her to turn around so he could do it from behind while users outside the window could see—all while another user in the room watched and passed around a vodka bottle.”

There is currently no clarity about how these very real technologies will sit in the Bill more widely. Even more worryingly, there has been no consideration of how artificial intelligence systems such as Horizon Worlds, with clear user-to-user functions, fit within the exemptions in schedule 1. If we are to see exemptions for internal business services or services provided by public bodies, along with many others, as outlined in the schedule, we need to make sure that the exemptions are fit for purpose and in line with the rapidly evolving technology that is widely available overseas. Before long, I am sure that reality spaces such as Horizon Worlds will become more and more commonplace in the UK too.

I hope that the Minister can reassure us all of his plans to ensure that the Bill is adequately future-proofed to cope with the rising expansion of the online space. Although we do not formally oppose the provisions outlined in schedule 1, I hope that the Minister will see that there is much work to be done to ensure that the Bill is adequately future-proofed to ensure that the current exemptions are applicable to future technologies too.

Turning to schedule 2, the draft Bill was hugely lacking in provisions to tackle pornographic content, so it is a welcome step that we now see some attempts to tackle the rate at which pornographic content is easily accessed by children across the country. As we all know, the draft Bill only covered pornography websites that allow user-generated content such as OnlyFans. I am pleased to see that commercial pornography sites have now been brought within scope. This positive step forward has been made possible thanks to the incredible efforts of campaigning groups, of which there are far too many to mention, and from some of which we took evidence. I pay tribute to them today. Over the years, it is thanks to their persistence that the Government have been forced to take notice and take action.

Once again—I hate to repeat myself—I urge the Minister to consider how far the current definitions outlined in schedule 2 relating to regulated provider pornographic content will go in covering virtual technologies such as those I referred to earlier. We are seeing an increase in all types of pornographic and semi-pornographic content that draws on AI or virtual technology. An obvious example is the now thankfully defunct app that was making the rounds online in 2016 called DeepNude. While available, the app used neural networks to remove clothing from images of women, making them look realistically nude. The ramifications and potential for technology like this to take over the pornographic content space are essentially limitless.

I urge the Minister carefully to keep in mind the future of the online space as we proceed. More specifically, the regulation of pornographic content in the context of keeping children safe is an area where we can all surely get on board. The Opposition have no formal objection at this stage to the provisions outlined in schedule 2.

Kirsty Blackman (Aberdeen North) (SNP)

Thank you, Sir Roger, for chairing our sittings. It is a pleasure to be part of this Bill Committee. I have a couple of comments on clause 2 and more generally.

The Opposition spokesperson, the hon. Member for Pontypridd, made some points about making sure that we are future-proofing the Bill. There are some key issues where we need to make sure that we are not going backwards. That particularly includes private messaging. We need to make sure that the ability to use AI to find content that is illegal, involving child sexual abuse for example, in private messages is still included in the way that it is currently and that the Bill does not accidentally bar those very important safeguards from continuing. That is one way in which we need to be clear on the best means to go forward with the Bill.

Future-proofing is important—I absolutely agree that we need to ensure that the Bill either takes into account the metaverse and virtual reality or ensures that provisions can be amended in future to take into account the metaverse, virtual reality and any other emerging technologies that we do not know about and cannot even foresee today. I saw a meme online the other day that was somebody taking a selfie of themselves wearing a mask and it said, “Can you imagine if we had shown somebody this in 1995 and asked them what this was? They wouldn’t have had the faintest idea.” The internet changes so quickly that we need to ensure that the Bill is future-proofed, but we also need to make sure that it is today-proofed.

I still have concerns, which I raised on Second Reading, about whether the Bill adequately encompasses the online gaming world, where a huge number of children use the internet—and where they should use it—to interact with their friends in a safe way. A lot of online gaming is free from the bullying that can be seen in places such as WhatsApp, Snapchat and Instagram. We need to ensure that those safeguards are included for online gaming. Private messaging is a thing in a significant number of online games, but many people use oral communication—I am thinking of things such as Fortnite and Roblox, which is apparently a safe space, according to Roblox Corporation, but according to many researchers is a place where an awful lot of grooming takes place.

My other question for the Minister—I am not bothered if I do not get an answer today, as I would rather have a proper answer than have the Minister try to come up with an answer right at this moment—is about what category the App Store and the Google Play Store fall into.

09:45
Alex Davies-Jones

On a point of order, Sir Roger. The livestream is not working. In the interest of transparency we should pause the Committee while it is fixed so that people can observe.

The Chair

I am reluctant to do that. It is a technical fault and it is clearly undesirable, but I do not think we can suspend the Committee for the sake of a technical problem. Every member of the public who wishes to express an interest in these proceedings is able to be present if they choose to do so. Although I understand the hon. Lady’s concern, we have to continue. We will get it fixed as soon as we can.

Kim Leadbeater (Batley and Spen) (Lab)

Will the hon. Lady give way?

Kim Leadbeater

You are making some really important points about the world of the internet and online gaming for children and young people. That is where we need some serious consideration of the obligations on providers regarding media literacy for both children and grown-ups. Many people with children know that this is a really dangerous space for young people, but we are not quite sure we have enough information to understand what the threats, risks and harms are. That point about media literacy, particularly in regard to the gaming world, is really important.

The Chair

Order. Before we proceed, the same rules apply in Committee as on the Floor of the House to this extent: the Chair is “you”, and you speak through the Chair, so it is “the hon. Lady”. [Interruption.] One moment.

While I am on my feet, I should perhaps have said earlier, and will now say for clarification, that interventions are permitted in exactly the same way as they are on the Floor of the House. In exactly the same way, it is up to the Member who has the Floor to decide whether to give way or not. The difference between these debates and those on the Floor of the House is of course that on the Floor of the House a Member can speak only once, whereas in Committee you have the opportunity to come back and speak again if you choose to do so. Once the Minister is winding up, that is the end of the debate. The Chair would not normally admit, except under exceptional circumstances, any further speech, as opposed to an intervention.

Kirsty Blackman

Thank you, Sir Roger.

I do not want to get sidetracked, but I agree that there is a major parental knowledge gap. Tomorrow’s parents will have grown up on the internet, so in 20 years’ time we will not have that knowledge gap, but today media literacy is lacking, particularly among parents as well as among children. In Scotland, media literacy is embedded in the curriculum; I am not entirely sure what the system is in the rest of the UK. My children are learning media literacy in school, but there is still a media literacy gap for parents. My local authority is doing a media literacy training session for parents tomorrow night, which I am very much looking forward to attending so that I can find out even more about how to keep my children safe online.

I was asking the Minister about the App Store and the Google Play Store. I do not need an answer today, but one at some point would be really helpful. Do the App Store, the Google Play Store and other stores of that nature fall under the definition of search engines or of user-to-user content? The reality is that if somebody creates an app, presumably they are a user. Yes, it has to go through an approval process by Apple or Google, but once it is accepted by them, it is not owned by them; it is still owned by the person who generated it. Therefore, are those stores considered search engines, in that they are simply curating content, albeit moderated content, or are they considered user-to-user services?

That is really important, particularly when we are talking about age verification and children being able to access various apps. The stores are the key gateways where children get apps. Once they have an app, they can use all the online services that are available on it, in line with whatever parental controls parents choose to put in place. I would appreciate an answer from the Minister, but he does not need to provide it today. I am happy to receive it at a later time, if that is helpful.

Dame Maria Miller

I want to pick up on two issues, which I hope the Minister can clarify in his comments at the end of this section.

First, when we took evidence, the Internet Watch Foundation underlined the importance of end-to-end encryption being in scope of the Bill, so that it does not lose the ability to pick up child abuse images, as has already been referred to in the debate. The ability to scan end-to-end encrypted content is crucial. Will the Minister clarify whether that is in scope and whether the IWF will be able to continue its important work in safeguarding children?

Kirsty Blackman

A number of people have raised concerns about freedom of speech in relation to end-to-end encryption. Does the right hon. Lady agree with me that there should not be freedom of speech when it comes to child sexual abuse images, and that it is reasonable for those systems to check for child sexual abuse images?

Dame Maria Miller

The hon. Lady is right to pick up on the nuance and the balance that we have to strike in legislation between freedom of speech and the protection of vulnerable individuals and children. I do not think there can be many people, particularly among those here today, who would want anything to trump the safeguarding of children. Will the Minister clarify exactly how the Bill works in relation to such important work?

Secondly, it is important that the Government have made the changes to schedule 2. They have listened closely on the issue of pornography and extended the provisions of the Bill to cover commercial pornography. However, the hon. Member for Pontypridd mentioned nudification software, and I am unclear whether the Bill would outlaw such software, which is designed to sexually harass women. That software works only on photographs of women, because its database relates only to female figures, and it makes them appear to be completely naked. Does that software fall in scope of the Bill? If not, will the Minister do something about that? The software is available and we have to regulate it to ensure that we safeguard women’s rights to live without harassment in their day-to-day life.

Dan Carden

This part of the Bill deals with the definitions of services and which services would be exempt. I consider myself a millennial; most people my age or older are Facebook and Twitter users, and people a couple of years younger might use TikTok and other services. The way in which the online space is used by different generations, particularly by young people, changes rapidly. Given the definitions in the Bill, how does the Minister intend to keep pace with the changing ways in which people communicate? Most online games now allow interaction between users in different places, which was not the case a few years ago. Understanding how the Government intend the Bill to keep up with such changes is important. Will the Minister tell us about that?

Chris Philp

Let me briefly speak to the purpose of these clauses and then respond to some of the points made in the debate.

As the shadow Minister, the hon. Member for Pontypridd, touched on, clauses 2 and 3 define some of the key terms in the Bill, including “user-to-user services” and “search services”—key definitions that the rest of the Bill builds on. As she said, schedule 1 and clause 4 contain specific exemptions where we believe the services concerned present very low risk of harm. Schedule 2 sets out exemptions relating to the new duties that apply to commercial providers of pornography. I thank the shadow Minister and my right hon. Friend the Member for Basingstoke for noting the fact that the Government have substantially expanded the scope of the Bill to now include commercial pornography, in response to widespread feedback from Members of Parliament across the House and the various Committees that scrutinised the Bill.

The shadow Minister is quite right to say that the range of platforms to which the Bill applies is very wide. [Interruption.] Bless you—or bless my hon. Friend the Member for North West Durham, I should say, Sir Roger, although he is near sanctified already. As I was saying, we are necessarily trying to protect UK users, and with many of these platforms not located in the UK, we are seeking to apply these duties to those companies as well as ones that are domestically located. When we come to discuss the enforcement powers, I hope the Committee will see that those powers are very strong.

The shadow Minister, the hon. Member for Liverpool, Walton, and others asked about future technologies and whether the Bill will accommodate technologies that we cannot even imagine today. The metaverse is a good example: it did not exist when the Bill was first contemplated and the White Paper produced. Actually, I think Snapchat did not exist when the White Paper that preceded the Bill was first conceived. For that reason, the Bill is tech agnostic. We do not talk about specific technologies; we talk about the duties that apply to companies and the harms they are obligated to prevent.

The whole Bill is tech agnostic because we as parliamentarians today cannot anticipate future developments. When those future developments arise, as they inevitably will, the duties under the Bill will apply to them as well. The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic. That is an extremely important point to make.

The hon. Member for Aberdeen North asked about gaming. Parents are concerned because lots of children, including quite young children, use games. My own son has started playing Minecraft even though he is very young. To the extent that those games have user-to-user features—for example, user-to-user messaging, particularly where those messages can be sent widely and publicly—those user-to-user components are within the scope of the Bill.

The hon. Member for Aberdeen North also asked about the App Store. I will respond quickly to her question now rather than later, to avoid leaving the Committee in a state of tingling anticipation and suspense. The App Store, or app stores generally, are not in the scope of the Bill, because they are not providing, for example, user-to-user services, and the functionality they provide to basically buy apps does not count as a search service. However, any app that is purchased in an app store, to the extent that it has either search functionality, user-to-user functionality or purveys or conveys pornography, is in scope. If an app that is sold on one of these app stores turns out to provide a service that breaks the terms of the Bill, that app will be subject to regulatory enforcement directly by Ofcom.

The hon. Members for Aberdeen North and for Liverpool, Walton touched on media literacy, noting that there has been a change to the Bill since the previous version. We will probably debate this later, so I will be brief. The Government published a media literacy strategy, backed by funding, to address this point. It was launched about a year ago. Ofcom also has existing statutory duties—arising under the Communications Act 2003, I believe. The critical change made since the previous draft of the Bill—it was made in December last year, I believe—is that Ofcom published an updated set of policy intentions around media literacy that went even further than we had previously intended. That is the landscape around media literacy.

10:00
Dan Carden

On the way that media literacy relates to misinformation and disinformation, we heard from William Moy, chief executive of Full Fact. His view was that the Bill does nothing to tackle disinformation and that another information incident, as we have seen with covid and Ukraine recently, is inevitable. Full Fact’s view was that the Bill should give the regulator the power to declare misinformation incidents. Is that something the Minister has considered?

Chris Philp

I am sure we will discuss this topic a bit more as the Bill progresses.

I will make a few points on disinformation. The first is that, non-legislatively, the Government have a counter-disinformation unit, which sits within the Department for Digital, Culture, Media and Sport. It basically scans for disinformation incidents. For the past two years it has been primarily covid-focused, but in the last three or four months it has been primarily Russia/Ukraine-focused. When it identifies disinformation being spread on social media platforms, the unit works actively with the platforms to get it taken down. In the course of the Russia-Ukraine conflict, and as a result of the work of that unit, I have personally called in some of the platforms to complain about the stuff they have left up. I did not have a chance to make this point in the evidence session, but when the person from Twitter came to see us, I said that there was some content on Russian embassy Twitter accounts that, in my view, was blatant disinformation—denial of the atrocities that have been committed in Bucha. Twitter had allowed it to stay up, which I thought was wrong. Twitter often takes down such content, but in that example, wrongly and sadly, it did not. We are doing that work operationally.

Secondly, to the extent that disinformation can cause harm to an individual, which I suspect includes a lot of covid disinformation—drinking bleach is clearly not very good for people—that would fall under the terms of the legal but harmful provisions in the Bill.

Thirdly, when it comes to state-sponsored disinformation of the kind that we know Russia engages in on an industrial scale via the St Petersburg Internet Research Agency and elsewhere, the Home Office has introduced the National Security Bill—in fact, it had its Second Reading yesterday afternoon, when some of us were slightly distracted. One of the provisions in that Bill is a foreign interference offence. It is worth reading, because it is very widely drawn and it criminalises foreign interference, which includes disinformation. I suggest the Committee has a look at the foreign interference offence in the National Security Bill.

Alex Davies-Jones

I am grateful for the Minister’s intervention in bringing in the platforms to discuss disinformation put out by hostile nation states. Does he accept that if Russia Today had put out some of that disinformation, the platforms would be unable to take such content down as a result of the journalistic exemption in the Bill?

Chris Philp

We will no doubt discuss in due course clauses 15 and 50, which are the two that I think the shadow Minister alludes to. If a platform is exempt from the duties of the Bill owing to its qualification as a recognised news publisher under clause 50, that exemption removes the obligation to act under the Bill, but it does not prevent action. Social media platforms can still choose to act. Also, it is not a totally straightforward matter to qualify as a recognised news publisher under clause 50. We saw the effect of sanctions: when Russia Today was sanctioned, it was removed from many platforms as a result of the sanctioning process. There are measures outside the Bill, such as sanctions, that can help to address the shocking disinformation that Russia Today was pumping out.

The last point I want to pick up on was rightly raised by my right hon. Friend the Member for Basingstoke and the hon. Member for Aberdeen North. It concerns child sexual exploitation and abuse images, and particularly the ability of platforms to scan for those. Many images are detected as a result of scanning messages, and many paedophiles or potential paedophiles are arrested as a result of that scanning. We saw a terrible situation a little while ago, when—for a limited period, owing to a misconception of privacy laws—Meta, or Facebook, temporarily suspended scanning in the European Union; as a result, loads of images that would otherwise have been intercepted were not.

I agree with the hon. Member for Aberdeen North that privacy concerns, including end-to-end encryption, should not trump the ability of organisations to scan for child sexual exploitation and abuse images. Speaking as a parent—I know she is, too—there is, frankly, nothing more important than protecting children from sexual exploitation and abuse. Some provisions in clause 103 speak to this point, and I am sure we will debate those in more detail when we come to that clause. I mention clause 103 to put down a marker as the place to go for the issue being raised. I trust that I have responded to the points raised in the debate, and I commend the clause to the Committee.

Question put and agreed to.

Clause 2 accordingly ordered to stand part of the Bill.

Clause 3 ordered to stand part of the Bill.

Schedules 1 and 2 agreed to.

Clause 4 ordered to stand part of the Bill.

The Chair

Before we move on, we have raised the issue of the live feed. The audio will be online later today. There is a problem with the feed—it is reaching the broadcasters, but it is not being broadcast at the moment.

As we are not certain we can sort out the technicalities between now and this afternoon, the Committee will move to Committee Room 9 for this afternoon’s sitting to ensure that the live stream is available. Mr Double, if Mr Russell intends to be present—he may not; that is up to you—it would be helpful if you would let him know. Ms Blackman, if John Nicolson intends to be present this afternoon, would you please tell him that Committee Room 9 will be used?

It would normally be possible to leave papers and other bits and pieces in the room, because it is usually locked between the morning and afternoon sittings. Clearly, because we are moving rooms, you will all need to take your papers and laptops with you.

Clause 5

Overview of Part 3

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

I want to just put it on the record that the irony is not lost on me that we are having tech issues relating to the discussion of the Online Safety Bill. The Opposition have huge concerns regarding clause 5. We share the frustrations of stakeholders who have been working on these important issues for many years and who feel the Bill has been drafted in an overly complex way. In its evidence, the Carnegie UK Trust outlined its concerns over the complexity of the Bill, which will likely lead to ineffective regulation for both service users and companies. While the Minister is fortunate to have a team of civil servants behind him, he will know that the Opposition sadly do not share the same level of resources—although I would like to place on the record my sincere thanks to my researcher, Freddie Cook, who is an army of one all by herself. Without her support, I would genuinely not know where I was today.

Complexity is an issue that crops up time and again when speaking with charities, stakeholders and civil society. We all recognise that the Bill will have a huge impact however it passes, but the complexity of its drafting is a huge barrier to implementation. The same can be said for the regulation. A Bill as complex as this is likely to lead to ineffective regulation for both service users and companies, who, for the first time, will be subject to specific requirements placed on them by the regulator. That being said, we absolutely support steps to ensure that providers of regulated user-to-user services and regulated search services have to abide by a duty of care regime, which will also see the regulator able to issue codes of practice.

I would also like to place on record my gratitude—lots of gratitude today—to Professor Lorna Woods and Will Perrin, who we heard from in evidence sessions last week. Alongside many others, they have been and continue to be an incredible source of knowledge and guidance for my team and for me as we seek to unpick the detail of this overly complex Bill. Colleagues will also be aware that Professor Woods and Mr Perrin originally developed the idea of a duty of care a few years ago now; their model was based on the idea that social media providers should be,

“seen as responsible for public space they have created, much as property owners or operators are in a physical world.”

It will come as no surprise to the Minister that Members of the Opposition fully fall behind that definition and firmly believe that forcing platforms to identify and act on harms that present a reasonable chance of risk is a positive step forward.

More broadly, we welcome moves by the Government to include specific duties on providers of services likely to be accessed by children, although I have some concerns about just how far they will stretch. Similarly, although I am sure we will come to address those matters in the debates that follow, we welcome steps to require Ofcom to issue codes of practice, but have fundamental concerns about how effective they will be if Ofcom is not allowed to remain fully independent and free from Government influence.

Lastly, on subsection 7, I imagine our debate on chapter 7 will be a key focus for Members. I know attempts to define key terms such as “priority content” will be a challenge for the Minister and his officials but we remain concerned that there are important omissions, which we will come to later. It is vital that those key terms are broad enough to encapsulate all the harms that we face online. Ultimately, what is illegal offline must be approached in the same way online if the Bill is to have any meaningful positive impact, which is ultimately what we all want.

Kirsty Blackman

I want to make a couple of brief comments. Unfortunately, my hon. Friend the Member for Ochil and South Perthshire is not here as, ironically, he is at the DCMS committee taking evidence on the Online Safety Bill. That is a pretty unfortunate clash of timing, but that is why I am here solo for the morning.

I wanted to make a quick comment on subsection 7. The Minister will have heard the evidence given on schedule 7 and the fact that the other schedules, particularly schedule 6, have a Scotland-specific section detailing the Scottish legislation that applies. Schedule 7 has no Scotland-specific section and does not adequately cover the Scottish legislation. I appreciate that the Minister has tabled amendment 126, which talks about the Scottish and Northern Irish legislation that may be different from England and Wales legislation, but will he give me some comfort that he does intend Scottish-specific offences to be added to schedule 7 through secondary legislation? There is a difference between an amendment on how to add them and a commitment that they will be added if necessary and if he feels that that will add something to the Bill. If he could commit that that will happen, I would appreciate that—obviously, in discussion with Scottish Ministers if amendment 126 is agreed. It would give me a measure of comfort and would assist, given the oral evidence we heard, in overcoming some of the concerns raised about schedule 7 and the lack of inclusion of Scottish offences.

Dame Maria Miller

In many ways, clause 6 is the central meat of the Bill. It brings into play a duty of care, which means that people operating online will be subject to the same rules as the rest of us when it comes to the provision of services. But when it comes to the detail, the guidance and codes that will be issued by Ofcom will play a central role. My question for the Minister is: in the light of the evidence that we received, I think in panel three, where the providers were unable to define what was harmful because they had not yet seen codes of practice from Ofcom, could he update us on when those codes and guidance might be available? I understand thoroughly why they may not be available at this point, and they certainly should not form part of the Bill because they need to be flexible enough to be changed in future, but it is important that we know how the guidance and codes work and that they work properly.

Will the Minister update the Committee on what further consideration he and other Ministers have given to the establishment of a standing committee to scrutinise the implementation of the Bill? Unless we have that in place, it will be difficult to know whether his legislation will work.

Dan Carden

Some of the evidence we heard suggested that the current precedent was that the Secretary of State had very little to do with independent regulators in this realm, but that the Bill overturns that precedent. Does the right hon. Lady have any concerns that the Bill hands too much power to the Secretary of State to intervene and influence regulators that should be independent?

10:16
Dame Maria Miller

The hon. Gentleman brings up an important point. We did hear about that in the evidence. I have no doubt the Secretary of State will not want to interfere in the workings of Ofcom. Having been in his position, I know there would be no desire for the Department to get involved in that, but I can understand why the Government might want the power to ensure things are working as they should. Perhaps the answer to the hon. Gentleman’s question is to have a standing committee scrutinising the effectiveness of the legislation and the way in which it is put into practice. That committee could be a further safeguard against what he implies: an unnecessary overreach of the Secretary of State’s powers.

Kirsty Blackman

Thank you, Sir Roger, for allowing me to intervene again. I was not expecting the standing committee issue to be brought up at this point, but I agree that there needs to be a post-implementation review of the Bill. I asked a series of written questions to Departments about post-legislative review and whether legislation that the Government have passed has had the intended effect. Most of the Departments that answered could not provide information on the number of post-legislative reviews. Of those that could provide me with the information, none of them had managed to do 100% of the post-implementation reviews that they were supposed to do.

It is important that we know how the Bill’s impact will be scrutinised. I do not think it is sufficient for the Government to say, “We will scrutinise it through the normal processes that we normally use,” because it is clear that those normal processes do not work. The Government cannot say that legislation they have passed has achieved the intended effect. Some of it will have and some of it will not have, but we do not know because we do not have enough information. We need a standing committee or another way to scrutinise the implementation.

Dame Maria Miller

I thank the hon. Lady for raising this point. Having also chaired a Select Committee, I can understand the sensitivities, given that this might fall under the current DCMS Committee, but the reality is that the Bill’s complexity and other pressures on the DCMS Committee mean that this perhaps should be seen as an exceptional circumstance—in no way is that meant as disrespect to that Select Committee, which is extremely effective in what it does.

Kirsty Blackman

I completely agree. Having sat on several Select Committees, I am aware of the tight timescales. There are not enough hours in the day for Select Committees to do everything that they would like to do. It would be unfortunate and undesirable were this matter to be one that fell between the cracks. Perhaps DCMS will bring forward more legislation in future that could fall between the cracks. If the Minister is willing to commit to a standing committee or anything in excess of the normal governmental procedures for review, that would be a step forward from the position that we are currently in. I look forward to hearing the Minister’s views on that.

Dan Carden

I want to add my voice to the calls for ways to monitor the success or failure of this legislation. We are starting from a position of self-regulation where companies write the rules and regulate themselves. It is right that we are improving on that, but with it come further concerns around the powers of the Secretary of State and the effectiveness of Ofcom. As the issues are fundamental to freedom of speech and expression, and to the protection of vulnerable and young people, will the Minister consider how we better monitor whether the legislation does what it says on the tin?

Chris Philp

Clause 5 simply provides an overview of part 3 of the Bill. Several good points have been raised in the course of this discussion. I will defer replying to the substance of a number of them until we come to the relevant clause, but I will address two or three of them now.

The shadow Minister said that the Bill is complex, and she is right; it is 193-odd clauses long and a world-leading piece of legislation. The duties that we are imposing on social media firms and internet companies do not already exist; we have no precedent to build on. Most matters on which Parliament legislates have been considered and dealt with before, so we build on an existing body of legislation that has been built up over decades or, in some cases in the criminal law, over centuries. In this case, we are constructing a new legislative edifice from the ground up. Nothing precedes this piece of legislation—we are creating anew—and the task is necessarily complicated by virtue of its novelty. However, I think we have tried to frame the Bill in a way that keeps it as straightforward and as future-proof as possible.

The shadow Minister is right to point to the codes of practice as the source of practical guidance to the public and to social media firms on how the obligations operate in practice. We are working with Ofcom to ensure that those codes of practice are published as quickly as possible and, where possible, prepared in parallel with the passage of the legislation. That is one reason why we have provided £88 million of up-front funding to Ofcom in the current and next financial years: to give it the financial resources to do precisely that.

My officials have just confirmed that my recollection of the Ofcom evidence session on the morning of Tuesday 24 May was correct: Ofcom confirmed to the Committee that it will publish, before the summer, what it described as a “road map” providing details on the timing of when and how those codes of practice will be created. I am sure that Ofcom is listening to our proceedings and will hear the views of the Committee and of the Government. We would like those codes of practice to be prepared and introduced as quickly as possible, and we certainly provided Ofcom with the resources to do precisely that.

There was a question about the Scottish offences and, I suppose, about the Northern Irish offences as well—we do not want to forget any part of the United Kingdom.

Chris Philp

We are in agreement on that. I can confirm that the Government have tabled amendments 116 to 126—the Committee will consider them in due course—to place equivalent Scottish offences, which the hon. Member for Aberdeen North asked about, in the Bill. We have done that in close consultation with the Scottish Government to ensure that the relevant Scottish offences equivalent to the England and Wales offences are inserted into the Bill. If the Scottish Parliament creates any new Scottish offences that should be inserted into the legislation, that can be done under schedule 7 by way of statutory instrument. I hope that answers the question.

The other question to which I will briefly reply was about parliamentary scrutiny. The Bill already contains a standard mechanism that provides for the Bill to be reviewed after a two to five-year period. That provision appears at the end of the Bill, as we would expect. Of course, there are the usual parliamentary mechanisms—Backbench Business debates, Westminster Hall debates and so on—as well as the DCMS Committee.

I heard the points about a standing Joint Committee. Obviously, I am mindful of the excellent prelegislative scrutiny work done by the previous Joint Committee of the Commons and the Lords. Equally, I am mindful that standing Joint Committees, outside the regular Select Committee structure, are unusual. The only two that spring immediately to mind are the Intelligence and Security Committee, which is established by statute, and the Joint Committee on Human Rights, chaired by the right hon. and learned Member for Camberwell and Peckham (Ms Harman), which is established by Standing Orders of the House. I am afraid I am not in a position to make a definitive statement about the Government’s position on this. It is of course always open to the House to regulate its own business. There is nothing I can say today from a Government point of view, but I know that hon. Members’ points have been heard by my colleagues in Government.

We have gone somewhat beyond the scope of clause 5. You have been extremely generous, Sir Roger, in allowing me to respond to such a wide range of points. I commend clause 5 to the Committee.

Question put and agreed to.

Clause 5 accordingly ordered to stand part of the Bill.

Clause 6

Providers of user-to-user services: duties of care

The Chair

Before we proceed, perhaps this is the moment to explain what should happen and what is probably going to happen. Ordinarily, a clause is taken with amendments. This Chairman takes a fairly relaxed view of stand part debates. Sometimes it is convenient to have a very broad-ranging debate on the first group of amendments because it covers matters relating to the whole clause. The Chairman would then normally say, “Well, you’ve already had your stand part debate, so I’m not going to allow a further stand part debate.” It is up to hon. Members to decide whether to confine themselves to the amendment under discussion and then have a further stand part debate, or whether to go free range, in which case the Chairman would almost certainly say, “You can’t have a stand part debate as well. You can’t have two bites of the cherry.”

This is slightly more complex. It is a very complex Bill, and I think I am right in saying that it is the first time in my experience that we are taking other clause stand parts as part of the groups of amendments, because there is an enormous amount of crossover between the clauses. That will make it, for all of us, slightly harder to regulate. It is for that reason—the Minister was kind enough to say that I was reasonably generous in allowing a broad-ranging debate—that I think we are going to have to do that with this group.

I, and I am sure Ms Rees, will not wish to be draconian in seeking to call Members to order if you stray slightly outside the boundaries of a particular amendment. However, we have to get on with this, so please try not to be repetitive if you can possibly avoid it, although I accept that there may well be some cases where it is necessary.

Alex Davies-Jones

I beg to move amendment 69, in clause 6, page 5, line 39, at end insert—

‘(6A) All providers of regulated user-to-user services must name an individual whom the provider considers to be a senior manager of the provider, who is designated as the provider’s illegal content safety controller, and who is responsible for the provider’s compliance with the following duties—

(a) the duties about illegal content risk assessments set out in section 8,

(b) the duties about illegal content set out in section 9.

(6B) An individual is a “senior manager” of a provider if the individual plays a significant role in—

(a) the making of decisions about how the provider’s relevant activities are to be managed or organised, or

(b) the actual managing or organising of the provider’s relevant activities.

(6C) A provider’s “relevant activities” are activities relating to the provider’s compliance with the duties of care imposed by this Act.

(6D) The Safety Controller commits an offence if the provider fails to comply with the duties set out in sections 8 and 9 which must be complied with by the provider.”

The Chair

With this it will be convenient to discuss amendment 70, in clause 96, page 83, line 7, after “section” insert “6(6D),”.

This is one of those cases where the amendment relates to a later clause. While that clause may be debated now, it will not be voted on now. If amendment 69 is negatived, amendment 70 will automatically fall later. I hope that is clear, but it will be clearer when we get to amendment 70. Having confused the issue totally, without further ado, I call Ms Davies-Jones.

Alex Davies-Jones

With your permission, Sir Roger, I would like to discuss clause 6 and our amendments 69 and 70, and then I will come back to discuss clauses 7, 21 and 22.

Chapter 2 includes a number of welcome improvements from the draft Bill that the Opposition support. It is only right that, when it comes to addressing illegal content, all platforms, regardless of size or reach, will now be required to develop suitable and sufficient risk assessments that must be renewed before any design change is applied. Those risk assessments must be linked to safety duties—something that, once again, Labour has long called for.

It was a huge oversight that, until this point, platforms have not had to perform risk assessments of that nature. During our oral evidence sessions only a few weeks ago, we heard extensive evidence about the range of harms that people face online. Yet the success of the regulatory framework relies on regulated companies carefully assessing the risk posed by their platforms and subsequently developing and implementing appropriate mitigations. Crucial to that, as we will come to later, is transparency. Platforms must be compelled to publish the risk assessments, but in the current version of the Bill, only the regulator will have access to them. Although we welcome the fact that the regulator will have the power to ensure that the risk assessments are of sufficient quality, there remain huge gaps, which I will come on to.

10:30
Companies must not be obligated to act only on risks identified in their own risk assessments; that would surely lead companies to feel compelled to play down the likelihood of current and emerging risks cropping up. Platforms have a track record of burying documents and research that point to risks of harm in their systems and processes. We only have to turn to the revelations we all heard from the incredible Facebook whistleblower, Frances Haugen, about how Facebook—now known as Meta—was failing to tackle global issues such as online human trafficking despite research indicating that its policies were causing direct harm.
Despite that, the Bill should be commended for requiring platforms to document such risks. However, without making those documents public, platforms can continue to hide behind a veil of secrecy. That is why we have tabled a number of amendments to improve transparency measures in the Bill. Under the Bill as drafted, risk assessments will have to be made only to the regulator, and civil society groups, platforms and other interested participants will not have access to them. However, such groups are often at the heart of understanding and monitoring the harms that occur to users online, and they have an in-depth understanding of what mitigations may be appropriate.
We broadly welcome the Government’s inclusion of functionality in the risk assessments, which will look at not just content but how it spreads. There remains room for improvement, much of which will be discussed as we delve further into chapter 2.
Our amendment 69 would require regulated companies to designate a senior manager as a safety controller who is legally responsible for ensuring that the service meets its illegality risk assessment and content safety duties and is criminally liable for significant and egregious failures to protect users from harms. Typically, senior executives in technology companies have not taken their safeguarding responsibilities seriously, and Ofcom’s enforcement powers remain poorly targeted towards delivering child safety outcomes. The Bill is an opportunity to promote cultural change within companies and to embed compliance with online safety regulations at board level but, as it stands, it completely fails to do so.
Kirsty Blackman

I do not intend to speak to this specific point, but I wholeheartedly agree and will be happy to back amendment 69, should the hon. Lady press it to a vote.

Alex Davies-Jones

I am grateful to the hon. Lady and for SNP support for amendment 69.

The Bill introduces criminal liability for senior managers who fail to comply with information notice provisions, but not for actual failure to fulfil their statutory duties with regard to safety, including child safety, and yet such failures lead to the most seriously harmful outcomes. Legislation should focus the minds of those in leadership positions in services that operate online platforms.

A robust corporate and senior management liability scheme is needed to impose personal liability on directors whose actions consistently and significantly put children at risk. The Bill must learn lessons from other regulated sectors, principally financial services, where regulation imposes specific duties on directors and senior management of financial institutions. Those responsible individuals face regulatory enforcement if they act in breach of such duties. Are we really saying that the financial services sector is more important than child safety online?

The Government rejected the Joint Committee’s recommendation that each company appoint a safety controller at, or reporting to, board level. As a result, there is no direct relationship in the Bill between senior management liability and the discharge by a platform of its safety duties. Under the Bill as drafted, a platform could be wholly negligent in its approach to child safety and put children at significant risk of exposure to illegal activity, but as long as the senior manager co-operated with the regulator’s investigation, senior managers would not be held personally liable. That is a disgrace.

The Joint Committee on the draft Bill recommended that

“a senior manager at board level or reporting to the board should be designated the ‘Safety Controller’ and made liable for a new offence: the failure to comply with their obligations as regulated service providers when there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users. We believe that this would be a proportionate last resort for the Regulator. Like any offence, it should only be initiated and provable at the end of an exhaustive legal process.”

Amendment 69 would make provision for regulated companies to appoint an illegal content safety controller, who has responsibility and accountability for protecting children from illegal content and activity. We believe this measure would drive a more effective culture of online safety awareness within regulated firms by making senior management accountable for harms caused through their platforms and embedding safety within governance structures. The amendment would require consequential amendments setting out the nature of the offences for which the safety officer may be liable and the penalties associated with them.

In financial services regulation, the Financial Conduct Authority uses a range of personal accountability regimes to deter individuals who may exhibit unwanted and harmful behaviour and as mechanisms for bringing about cultural change. The senior managers and certification regime is an overarching framework for staff across the financial services industry. It aims to

“encourage a culture of staff at all levels taking personal responsibility for their actions”,

and to

“make sure firms and staff clearly understand and can demonstrate where responsibility lies.”

Dan Carden

One of the challenges for this legislation will be the way it is enforced. Have my hon. Friend and her Front-Bench colleagues given consideration to the funding that Ofcom and the regulatory services may need?

Alex Davies-Jones

That is a huge concern for us. As was brought up in our evidence sessions with Ofcom, it is recruiting, effectively, a fundraising officer for the regulator. That throws into question the potential longevity of the regulator’s funding and whether it is resourced effectively to properly scrutinise and regulate the online platforms. If that long-term resource is not available, how can the regulator effectively scrutinise and bring enforcement to bear against companies for enabling illegal activity?

Chris Philp

Just to reassure the shadow Minister and her hon. Friend the Member for Liverpool, Walton, the Bill confers powers on Ofcom to levy fees and charges on the sector that it is regulating—so, on social media firms—to recoup its costs. We will debate that in due course—I think it is in clause 71, but that power is in the Bill.

Alex Davies-Jones

I am grateful to the Minister for that clarification and I look forward to debating that further as the Bill progresses.

Returning to the senior managers and certification regime in the financial services industry, under that regime senior managers must be pre-approved by the regulator, have their responsibilities set out in a statement of responsibilities and be subject to enhanced conduct standards. Those in banks are also subject to regulatory requirements on their remuneration. Again, it baffles me that we are not asking the same for child safety from online platforms and companies.

The money laundering regulations also use the threat of criminal offences to drive culture change. Individuals can be culpable for failure of processes, as well as for intent. I therefore hope that the Minister will carefully consider the need for the same to apply to our online space to make children safe.

Amendment 70 is a technical amendment that we will be discussing later on in the Bill. However, I am happy to move it in the name of the official Opposition.

The Chair

The Committee will note that, at the moment, the hon. Lady is not moving amendment 70; she is only moving amendment 69. So the Question is, That that amendment be made.

Dan Carden

I congratulate my own Front Bench on this important amendment. I would like the Minister to respond to the issue of transparency and the reason why only the regulator would have sight of these risk assessments. It is fundamental that civil society groups and academics have access to them. Her Majesty’s Revenue and Customs is an example of where that works very well. HMRC publishes a lot of its data, which is then used by academics and researchers to produce reports and documents that feed back into the policy making processes and HMRC’s work. It would be a missed opportunity if the information and data gathered by Ofcom were not widely available for public scrutiny.

I would reinforce the earlier points about accountability. There are too many examples—whether in the financial crash or the collapse of companies such as Carillion—where accountability was never there. Without this amendment and the ability to hold individuals to account for the failures of companies that are faceless to many people, the legislation risks being absolutely impotent.

Finally, I know that we will get back to the issue of funding in a later clause but I hope that the Minister can reassure the Committee that funding for the enforcement of these regulations will be properly considered.

Chris Philp

Let me start by speaking to clauses 6, 7, 21 and 22 stand part. I will then address the amendments moved by the shadow Minister.

The Chair

Order. I apologise for interrupting, Minister, but the stand part debates on clauses 7, 21 and 22 are part of the next grouping, not this one. I am fairly relaxed about it, but just be aware that you cannot have two debates on this.

Chris Philp

The grouping sheet I have here suggests that clause 7 stand part and clauses 21 and 22 stand part are in this grouping, but if I have misunderstood—

The Chair

No, there are two groups. Let me clarify this for everyone, because it is not as straightforward as it normally is. At the moment we are dealing with amendments 69 and 70. The next grouping, underneath this one on your selection paper, is the clause stand part debates—which is peculiar, as effectively we are having the stand part debate on clause 6 now. For the convenience of the Committee, and if the shadow Minister is happy, I am relaxed about taking all this together.

Alex Davies-Jones

I am happy to come back in and discuss clauses 7, 21 and 22 stand part afterwards.

The Chair

The hon. Lady can be called again. The Minister is not winding up at this point.

Chris Philp

In the interests of simplicity, I will stick to the selection list and adapt my notes accordingly to confine my comments to amendments 69 and 70, and then we will come to the stand part debates in due course. I am happy to comply, Sir Roger.

Speaking of compliance, that brings us to the topic of amendments 69 and 70. It is worth reminding ourselves of the current enforcement provisions in the Bill, which are pretty strong. I can reassure the hon. Member for Liverpool, Walton that the enforcement powers here are far from impotent. They are very potent. As the shadow Minister acknowledged in her remarks, we are for the first time ever introducing senior management liability, which relates to non-compliance with information notices and offences of falsifying, encrypting or destroying information. It will be punishable by a prison sentence of up to two years. That is critical, because without that information, Ofcom is unable to enforce.

We have had examples of large social media firms withholding information and simply paying a large fine. There was a Competition and Markets Authority case a year or two ago where a large social media firm did not provide information repeatedly requested over an extended period and ended up paying a £50 million fine rather than providing the information. Let me put on record now that that behaviour is completely unacceptable. We condemn it unreservedly. It is because we do not want to see that happen again that there will be senior manager criminal liability in relation to providing information, with up to two years in prison.

In addition, for the other duties in the Bill there are penalties that Ofcom can apply for non-compliance. First, there are fines of up to 10% of global revenue. For the very big American social media firms, the UK market is somewhere just below 10% of their global revenue, so 10% of their global revenue is getting on for 100% of their UK revenue. That is a very significant financial penalty, running in some cases into billions of pounds.

In extreme circumstances—if those measures are not enough to ensure compliance—there are what amount to denial of service powers in the Bill, where essentially Ofcom can require internet service providers and others, such as payment providers, to disconnect the companies in the UK so that they cannot operate here. Again, that is a very substantial measure. I hope the hon. Member for Liverpool, Walton would agree that those measures, which are in the Bill already, are all extremely potent.

The question prompted by the amendment is whether we should go further. I have considered that issue as we have been thinking about updating the Bill—as hon. Members can imagine, it is a question that I have been debating internally. The question is whether we should go further and say there is personal criminal liability for breaches of the duties that go beyond information provision. There are arguments in favour, which we have heard, but there are arguments against as well. One is that if we introduce criminal liability for those other duties, that introduces a risk that the social media firms, fearing criminal prosecution, will become over-zealous and just take everything down because they are concerned about being personally liable. That could end up having a chilling effect on content available online and goes beyond what we in Parliament would intend.

10:45
Secondly, providing information is pretty cut and dried. We say, “Give us that information. Have you provided it—yes or no? Is that information accurate—yes or no?” It is pretty obvious what the individual executive must do to meet that duty. When it comes to some of the other duties, the clarity that comes with information provision is sometimes lacking, which makes it harder to justify expanding criminal liability to those circumstances.
Kirsty Blackman

Will the Minister give way?

Chris Philp

In a moment.

For those reasons, I think we have drawn the line in the right place. There is personal criminal liability for information provision, with fines of 10% of global revenue and service disruption—unplugging powers—as well. Having thought about it quite carefully, I think we have struck the balance in the right place. We do not want to deter people from offering services in the UK. If they worried that they might go to prison too readily, it might deter people from locating here. I fully recognise that there is a balance to strike. I feel that the balance is being struck in the right place.

I will go on to comment on a couple of examples we heard about Carillion and the financial crisis, but before I do so, I will give way as promised.

Kirsty Blackman

I appreciate that the Minister says he has been swithering on this point—he has been trying to work out the correct place to draw the line. Given that we do not yet have a commitment for a standing committee—again, that is potentially being considered—we do not know how the legislation is going to work. Will the Minister, rather than accepting the amendment, give consideration to including the ability to make changes via secondary legislation so that there is individual criminal liability for different breaches? That would allow him the flexibility in the future, if the regime is not working appropriately, to add through secondary legislation individual criminal liability for breaches beyond those that are currently covered.

Chris Philp

I have not heard that idea suggested. I will think about it. I do not want to respond off the cuff, but I will give consideration to the proposal. Henry VIII powers, which are essentially what the hon. Lady is describing—an ability through secondary legislation effectively to change primary legislation—are obviously viewed askance by some colleagues if too wide in scope. We do use them, of course, but normally in relatively limited circumstances. Creating a brand new criminal offence via what amounts to a Henry VIII power would be quite a wide application of the power, but it is an idea that I am perfectly happy to go away and reflect on. I thank her for mentioning the idea.

A couple of examples were given about companies that have failed in the past. Carillion was not a financial services company and there was no regulatory oversight of the company at all. In relation to financial services regulation, despite the much stricter regulation that existed in the run-up to the 2008 financial crisis, that crisis occurred none the less. [Interruption.] We were not in government at the time. We should be clear-eyed about the limits of what regulation alone can deliver, but that does not deter us from taking the steps we are taking here, which I think are extremely potent, for all the reasons that I mentioned and will not repeat.

Question put, That the amendment be made.

Division 1

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

Question proposed, That the clause stand part of the Bill.
The Chair

With this it will be convenient to discuss the following:

Clause 7 stand part.

Clauses 21 and 22 stand part.

My view is that the stand part debate on clause 6 has effectively already been had, but I will not be too heavy-handed about that at the moment.

Alex Davies-Jones

On clause 7, as I have previously mentioned, we were all pleased to see the Government bring in more provisions to tackle pornographic content online, much of which is easily accessible and can cause harm to those viewing it and potentially to those involved in it.

As we have previously outlined, a statutory duty of care for social platforms online has been missing for far too long, but we made it clear on Second Reading that such a duty will only be effective if we consider the systems, business models and design choices behind how platforms operate. For too long, platforms have been abuse-enabling environments, but it does not have to be this way. The amendments that we will shortly consider are largely focused on transparency, as we all know that the duties of care will only be effective if platforms are compelled to proactively supply their assessments to Ofcom.

On clause 21, the duty of care approach is one that the Opposition support and it is fundamentally right that search services are subject to duties including illegal content risk assessments, illegal content assessments more widely, content reporting, complaints procedures, duties about freedom of expression and privacy, and duties around record keeping. Labour has long held the view that search services, while not direct hosts of potentially damaging content, should have responsibilities that see them put a duty of care towards users first, as we heard in our evidence sessions from HOPE not hate and the Antisemitism Policy Trust.

It is also welcome that the Government have committed to introducing specific measures for regulated search services that are likely to be accessed by children. However, those measures can and must go further, so we will be putting forward some important amendments as we proceed.

Labour does not oppose clause 22, either, but I would like to raise some important points with the Minister. We do not want to be in a position whereby those designing, operating and using a search engine in the United Kingdom are subject to a second-rate internet experience. We also do not want to be in a position where we are forcing search services to choose what is an appropriate design for people in the UK. It would be worrying indeed if our online experience vastly differed from that of, let us say, our friends in the European Union. How exactly will clause 22 ensure parity? I would be grateful if the Minister could confirm that before we proceed.

Chris Philp

The shadow Minister has already touched on the effect of these clauses: clause 6 sets out duties applying to user-to-user services in a proportionate and risk-based way; clause 7 sets out the scope of the various duties of care; and clauses 21 and 22 do the same in relation to search services.

In response to the point about whether the duties on search will end up providing a second-rate service in the United Kingdom, I do not think that they will. The duties have been designed to be proportionate and reasonable. Throughout the Bill, Members will see that there are separate duties for search and for user-to-user services. That is reflected in the symmetry—which appears elsewhere, too—of clauses 6 and 7, and clauses 21 and 22. We have done that because we recognise that search is different. It indexes the internet; it does not provide a user-to-user service. We have tried to structure these duties in a way that is reasonable and proportionate, and that will not adversely impair the experience of people in the UK.

I believe that we are ahead of the European Union in bringing forward this legislation and debating it in detail, but the European Union is working on its Digital Services Act. I am confident that there will be no disadvantage to people conducting searches in United Kingdom territory.

Question put and agreed to.

Clause 6 accordingly ordered to stand part of the Bill.

Clause 7 ordered to stand part of the Bill.

Clause 8

Illegal content risk assessment duties

Alex Davies-Jones

I beg to move amendment 10, in clause 8, page 6, line 33, at end insert—

“(4A) A duty to publish the illegal content risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish an illegal content risk assessment and supply it to Ofcom.

The Chair

With this it will be convenient to discuss the following:

Amendment 14, in clause 8, page 6, line 33, at end insert—

“(4A) A duty for the illegal content risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the illegal content risk assessment duties, and reports directly into the most senior employee of the entity.”

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for illegal content risk assessments.

Amendment 25, in clause 8, page 7, line 3, after the third “the” insert “production,”.

This amendment requires the risk assessment to take into account the risk of the production of illegal content, as well as the risk of its presence and dissemination.

Amendment 19, in clause 8, page 7, line 14, at end insert—

“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—

(i) enable users to encounter illegal content on other regulated user-to-user services, and

(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”

This amendment would incorporate into the duties a requirement to consider cross-platform risk.

Clause stand part.

Amendment 20, in clause 9, page 7, line 30, at end insert

“, including by being directed while on the service towards priority illegal content hosted by a different service;”.

This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.

Amendment 26, in clause 9, page 7, line 30, at end insert—

“(aa) prevent the production of illegal content by means of the service;”.

This amendment incorporates a requirement to prevent the production of illegal content within the safety duties.

Amendment 18, in clause 9, page 7, line 35, at end insert—

“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”

This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.

Amendment 21, in clause 9, page 7, line 35, at end insert—

“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content,”.

This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.

Clause 9 stand part.

Amendment 30, in clause 23, page 23, line 24, after “facilitating” insert

“the production of illegal content and”.

This amendment requires the illegal content risk assessment to consider the production of illegal content.

Clause 23 stand part.

Amendment 31, in clause 24, page 24, line 2, after “individuals” insert “producing or”.

This amendment expands the safety duty to include the need to minimise the risk of individuals producing certain types of search content.

Clause 24 stand part.

Members will note that amendments 17 and 28 form part of a separate group. I hope that is clear.

Alex Davies-Jones

At this stage, I will speak to clause 8 and our amendments 10, 14, 25, 19 and 17.

The Chair

Order. This is confusing. The hon. Lady said “and 17”. Amendment 17 is part of the next group of amendments.

Alex Davies-Jones

Apologies, Sir Roger; I will speak to amendments 10, 14, 25 and 19.

The Chair

It’s all right, we’ll get there.

Alex Davies-Jones

The Opposition welcome the moves to ensure that all user-to-user services are compelled to provide risk assessments in relation to illegal content, but there are gaps, ranging from breadcrumbing to provisions for the production or livestreaming of otherwise illegal content.

Labour is extremely concerned by the lack of transparency around the all-important illegal content risk assessments, which is why we have tabled amendment 10. The effectiveness of the entire Bill is undermined unless the Government commit to a more transparent approach more widely. As we all know, in the Bill currently, the vital risk assessments will only be made available to the regulator, rather than for public scrutiny. There is a real risk—for want of a better word—in that approach, as companies could easily play down or undermine the risks. They could see the provision of the risk assessments to Ofcom as a simple tick-box exercise to satisfy the requirements placed on them, rather than using these important assessments as an opportunity truly to assess the likelihood of current and emerging risks.

As my hon. Friend the Member for Worsley and Eccles South will touch on in her later remarks, the current approach runs the risk of allowing businesses to shield themselves from true transparency. The Minister knows that this is a major issue, and that until service providers and platforms are legally compelled to provide data, we will be shielded from the truth, because there is no statutory requirement for them to be transparent. That is fundamentally wrong and should not be allowed to continue. If the Government are serious about their commitment to transparency, and to the protection of adults and children online, they should make this small concession and see it as a positive step forward.

Amendment 14 would ensure that regulated companies’ boards or senior staff have appropriate oversight of their illegal content risk assessments. An obligation on boards or senior managers to approve risk assessments would hardwire the safety duties and create a culture of compliance in the regulated firms. The success of the regulatory framework relies on regulated companies carefully risk assessing their platforms. Once risks have been identified, the platform can concentrate on developing and implementing appropriate mitigations.

To date, boards and top executives of the regulated companies have not taken the risks to children seriously enough. Platforms either have not considered producing risk assessments or, if they have done so, they have been of limited efficacy and have demonstrably failed to adequately identify and respond to harms to children. Need I remind the Minister that the Joint Committee on the draft Bill recommended that risk assessments should be approved at board level?

Introducing a requirement on regulated companies to have the board or a senior manager approve the risk assessment will hardwire the safety duties into decision making, and create accountability and responsibility at the most senior level of the organisation. That will trickle down the organisation and help embed a culture of compliance across the company. We need to see safety online as a key focus for these platforms, and putting the onus on senior managers to take responsibility is a positive step forward in that battle.

10:59
On amendment 25, the Opposition fully support the Bill’s ambition to hold regulated services accountable for online sexual exploitation of children occurring on their platforms. The implications of the duties of care introduced by the Bill will be felt around the world in the prevention, disruption and detection of online sexual exploitation of children.
We are encouraged by the prioritisation of tackling the dissemination of child sexual exploitation and abuse. However, there is room for the Bill to go even further in strengthening child protection online, particularly in relation to the use of online platforms to generate new child sexual exploitation and abuse content. While it is a welcome step forward that the Bill is essentially encouraging a safety-by-design approach, clause 8 does not go far enough to tackle newly produced content or livestreamed content.
The Minister will be aware of the huge problems with online sexual exploitation of children. I pay tribute to the hard work of my hon. Friend the Member for Rotherham (Sarah Champion), alongside the International Justice Mission, which has been a particularly vocal champion of vulnerable young children at home and abroad.
The Philippines is a source country for livestreamed sexual exploitation of children. In its recent white paper, the IJM found that traffickers often use cheap Android smartphones with prepaid cellular data services to communicate with customers to produce and distribute explicit material. In order to reach the largest possible customer base, they often connect with sexually motivated offenders through everyday technology—the same platforms that the rest of us use to communicate with friends, family and co-workers.
One key issue with assessing the extent of online sexual exploitation of children is that we are entirely dependent on detection of the crime. Sadly, most current technologies widely used to detect various forms of online sexual exploitation of children are not designed to recognise livestreaming. Clearly, the implications of that are huge for both child sexual exploitation and human trafficking more widely. The International Justice Mission reports that file hashing and PhotoDNA, which are widely used to great effect in enabling the detection and reporting of millions of known child sexual exploitation files, do not and cannot detect newly produced child sexual exploitation material.
The livestreaming of CSEM involves an ephemeral video stream, not a stored still or a video file. It is also therefore not usually subject to screening or content review. We must consider how easy it is for platforms to host live content and how ready they are to screen that content. I need only point the Minister to the devastating mass shooting that took place in Buffalo last month. The perpetrator livestreamed the racist attack online, using a GoPro camera attached to a military-style helmet. The shooter streamed live on the site Twitch for around two minutes before the site took the livestream down, but since then the video has been posted elsewhere on the internet and on smaller platforms.
Other white supremacists have used social media to publicise gruesome attacks, including the mass shooting in Christchurch, New Zealand, in 2019. Since that shooting, social media companies have got better in some ways at combating videos of atrocities online, including stopping livestreams of attacks faster, but violent videos, such as those of mass shootings, are saved by users and then reappear across the internet on Facebook, Instagram, Twitter, TikTok and other high-harm, smaller platforms. These reuploaded videos are harder for companies to take down. Ultimately, more needs to be done at the back end in terms of design features if we are to truly make people safe.
When it comes to exploitation being livestreamed online—unlike publicised terror attacks—crimes that are not detected are not reported. Therefore, livestreaming of child sexual exploitation is a severely under-reported crime and reliable figures for its prevalence do not exist. Anecdotally, the problem in the Philippines is overwhelming, but it is not limited to the Philippines. The IJM is aware of similar child trafficking originating from other source countries in south-east Asia, south Asia, Africa and Europe. Therefore, it is essential that technology companies and online platforms are compelled to specifically consider the production of illegal content when drawing up their risk assessments.
I turn to amendment 19, which we tabled to probe the Minister on how well he believes the clause encapsulates the cross-platform risk that children may face online. Organisations such as the National Society for the Prevention of Cruelty to Children and 5Rights have raised concerns that, as the Bill is drafted, there is a gap where children are groomed on one platform, where no abuse takes place, but are then directed to another platform, where they are harmed.
Well-established grooming pathways see abusers exploit the design features of social networks to contact children before moving communication across to other platforms, including livestreaming sites and encrypted messaging services. Perpetrators manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with large numbers of children, whom they can then groom through direct messages before coercing them into sending sexual images via WhatsApp.
Similarly, an abuser might groom a child through playing video games and simultaneously building that relationship further via a separate chat platform such as Discord. I want to point colleagues to Frida. Frida was groomed at the age of 13, and Frida’s story sadly highlights the subtle ways in which abusers can groom children on social networks before migrating them to other, more harmful apps and sites.
This is Frida’s experience in her own words:
“When I was 13, a man in his 30s contacted me on Facebook. I added him because you just used to add anyone on Facebook. He started messaging me and I liked the attention. We’d speak every day, usually late at night for hours at a time. We started using WhatsApp to message. He started asking for photos so I sent some. Then he asked for some explicit photos so I did that too, and he reciprocated. He told me he’d spoken to other girls online and lied about his age to them, but he didn’t lie to me so I felt like I could trust him.”
Frida was 13 years old. How many other Fridas are there?
We recognise that no online service can assemble every piece of the jigsaw. However, the Bill does not place requirements on services to consider how abuse spreads from their platform to others or vice versa, to risk-assess accordingly or to co-operate with other platforms proactively to address harm. Amendment 19 would require companies to understand when discharging their risk assessment duties how abuse spreads from their platform to others or vice versa. For example, companies should understand how their platforms are situated on abuse pathways whereby the grooming and other online sexual abuse risks start on their site before migrating to other services, or whether they inherit risks from other sites.
Companies should also know whether the cross-platform abuse risks they are dealing with occur sequentially, as tends to be the case for grooming initiated on social networks, or simultaneously, as tends to be the case on gaming services. Lastly, they should understand which functionalities and design features allowed child sexual exploitation offences to be committed and transferred across platforms.
The NSPCC research found that four UK adults in five think that social media companies should have a legal duty to work with each other to prevent online grooming from happening across multiple platforms, so that is an area in which the Minister has widespread support, both in the House and in the public realm.
This matter is not addressed explicitly. We are concerned that companies might be able to cite competition worries to avoid considering that aspect of online abuse. That is unacceptable. We are also concerned that forthcoming changes to the online environment, such as the metaverse, will create new risks, such as the more seamless movement of abuse between different platforms.
Kirsty Blackman

I want to talk about a few different things relating to the amendments. Speaking from the Opposition Front Bench, the hon. Member for Pontypridd covered in depth amendment 20, which relates to being directed to other content. Although this seems like a small amendment, it would apply in a significant number of different situations. Particular mention was made of Discord for gaming, but also of things such as moving from Facebook to Messenger—all those different directions that can happen. A huge number of those routes matter to those who would seek to abuse children online by moving from the higher-regulation services, or ones with more foot traffic, to areas with perhaps less moderation, so as to attack children in more extreme ways.

I grew up on the internet and spent a huge amount of time speaking to people, so I am well aware that people can be anyone they want to be on the internet, and people do pretend to be lots of different people. If someone tells us their age on the internet, we cannot assume that that is in any way accurate. I am doing what I can to imprint that knowledge on my children in relation to any actions they are taking online. In terms of media literacy, which we will come on to discuss in more depth later, I hope that one of the key things that is being told to both children and adults is that it does not matter if people have pictures on their profile—they can be anybody that they want to online and could have taken those pictures from wherever.

In relation to amendment 21 on collaboration, the only reasonable concern that I have heard relates to an action taken by Facebook, which employed an outside company in the US to place stories in local newspapers about vile things that were supposedly happening on TikTok. Those stories were invented—they were made up—specifically to harm TikTok’s reputation. I am not saying for a second that collaboration is bad, but the argument that some companies may make—that it is bad because it causes them problems and their opponents may use it against them—proves the need to have a regulator. The point of having a regulator is to ensure that any information sharing or collaboration that is required is done in a way that allows the regulator to come down on a company should it decide to act with malicious intent. The regulator ensures that the collaboration we need, so that emergent issues are dealt with as quickly as possible, is done in a way that does not harm people. If it does harm people, the regulator is there to take action.

I want to talk about amendments 25 and 30 on the production of images and child sexual abuse content. Amendment 30 should potentially have an “or” at the end rather than an “and”. However, I am very keen to support both of those amendments, and all the amendments relating to the production of child sexual abuse content. On the issues raised by the Opposition about livestreaming, for example, we heard two weeks ago about the percentage of self-generated child sexual abuse content. The fact is that 75% of that content is self-generated. That is absolutely huge.

If the Bill does not adequately cover production of the content, whether it is by children and young people who have been coerced into producing the content and using their cameras in that way, or whether it is in some other way, then the Bill fails to adequately protect our children. Purely on the basis of that 75% stat, which is so incredibly stark, it is completely reasonable that production is included. I would be happy to support the amendments in that regard; I think they are eminently sensible. Potentially, when the Bill was first written, production was not nearly so much of an issue. However, as it has moved on, it has become a huge issue and something that needs tackling. Like Opposition Members, I do not feel like the Bill covers production in as much detail as it should, in order to provide protection for children.

Dan Carden

Amendment 10 would create a duty to publish the illegal content risk assessment, and proactively supply that to Ofcom. This is new legislation that is really a trial that will set international precedent, and a lot of the more prescriptive elements—which are necessary—are perhaps the most challenging parts of the Bill. The Minister has been very thoughtful on some of the issues, so I want to ask him, when we look at the landscape of how we seek to regulate companies, where does he stand on transparency and accountability? How far is he willing to go, and how far does the Bill go, on issues of transparency? It is my feeling that the more companies are forced to publish and open up, the better. As we saw with the case of the Facebook whistleblower Frances Haugen, there is a lot to uncover. I therefore take this opportunity to ask the Minister how far the Bill goes on transparency and what his thoughts are on that.

11:15
Chris Philp

Clause 8 sets out the risk assessment duties for illegal content, as already discussed, that apply to user-to-user services. Ofcom will issue guidance on how companies can undertake those. To comply with those duties, companies will need to take proportionate measures to mitigate the risks identified in those assessments. The clause lists a number of potential risk factors the providers must assess, including how likely it is that users will encounter illegal content, as defined later in the Bill,

“by means of the service”.

That phrase is quite important, and I will come back to it when discussing some of the amendments, because it does not necessarily mean just on the service itself; in a cross-platform sense, it also covers other sites where users might find themselves via the service. That phrase is important in the context of some of the reasonable queries about cross-platform risks.

Moving on, companies will also need to consider how the design and operation of their service may reduce or increase the risks identified. Under schedule 3, which we will vote on, or at least consider, later on, companies will have three months to carry out risk assessments, which must be kept up to date so that fresh risks that may arise from time to time can be accommodated. Therefore, if changes are made to the service, the risks can be considered on an ongoing basis.

Amendment 10 relates to the broader question that the hon. Member for Liverpool, Walton posed about transparency. The Bill already contains obligations to publish summary risk assessments on legal but harmful content. That refers to some of the potentially contentious or ambiguous types of content for which public risk assessments would be helpful. The companies are also required to make available those risk assessments to Ofcom on request. That raises a couple of questions, as both the hon. Member for Liverpool, Walton and some of the amendments highlighted. Should companies be required to proactively serve up their risk assessments to Ofcom, rather than waiting to be asked? Also, should those risk assessments all be published—probably online?

In considering those two questions, there are a couple of things to think about. The first is Ofcom’s capacity. As we have discussed, 25,000 services are in scope. If all those services proactively delivered a copy of their risk assessment, even if they are very low risk and of no concern to Ofcom or, indeed, any of us, they would be in danger of overwhelming Ofcom. The approach contemplated in the Bill is that, where Ofcom has a concern or the platform is risk assessed as being significant—to be clear, that would apply to all the big platforms—it will proactively make a request, which the platform will be duty bound to meet. If the platform does not do that, the senior manager liability and the two years in prison that we discussed earlier will apply.

Alex Davies-Jones

The Minister mentioned earlier that Ofcom would be adequately resourced and funded to cope with the regulatory duty set out in the Bill. If Ofcom is not able to receive risk assessments for all the platforms potentially within scope, even if those platforms are not deemed to be high risk, does that not call into question whether Ofcom has the resource needed to actively carry out its duties in relation to the Bill?

Chris Philp

Of course, Ofcom is able to request any of them if it wants to—if it feels that to be necessary—but receiving 25,000 risk assessments, including from tiny companies that basically pose pretty much no risk at all and hardly anyone uses, would, I think, be an unreasonable and disproportionate requirement to impose. I do not think it is a question of the resources being inadequate; it is a question of being proportionate and reasonable.

Dan Carden

The point I was trying to get the Minister to think about was the action of companies in going through the process of these assessments and then making that information publicly available to civil society groups; it is about transparency. It is what the sector needs; it is the way we will find and root out the problems, and it is a great missed opportunity in this Bill.

Chris Philp

To reassure the hon. Member on the point about doing the risk assessment, all the companies have to do the risk assessment. That obligation is there. Ofcom can request any risk assessment. I would expect, and I think Parliament would expect, it to request risk assessments either where it is concerned about risk or where the platform is particularly large and has a very high reach—I am thinking of Facebook and companies like that. But hon. Members are talking here about requiring Ofcom to receive and, one therefore assumes, to consider, because what is the point of receiving an assessment unless it considers it? Receiving it and just putting it on a shelf without looking at it would be pointless, obviously. Requiring Ofcom to receive and look at potentially 25,000 risk assessments strikes me as a disproportionate burden. We should be concentrating Ofcom’s resources—and it should concentrate its activity, I submit—on those companies that pose a significant risk and those companies that have a very high reach and large numbers of users. I suggest that, if we imposed an obligation on it to receive and to consider risk assessments for tiny companies that pose no risk, that would not be the best use of its resources, and it would take away resources that could otherwise be used on those companies that do pose risk and that have larger numbers of users.

Kim Leadbeater

Just to be clear, we are saying that the only reason why we should not be encouraging the companies to do the risk assessment is that Ofcom might not be able to cope with dealing with all the risk assessments. But surely that is not a reason not to do it. The risk assessment is a fundamental part of this legislation. We have to be clear that there is no point in the companies having those risk assessments if they are not visible and transparent.

Chris Philp

All the companies have to do the risk assessment, for example for the “illegal” duties, where they are required to by the Bill. For the “illegal” duties, that is all of them; they have to do those risk assessments. The question is whether they have to send them to Ofcom—all of them—even if they are very low risk or have very low user numbers, and whether Ofcom, by implication, then has to consider them, because it would be pointless to require them to be sent if they were not then looked at. We want to ensure that Ofcom’s resources are pointed at the areas where the risks arise. Ofcom can request any of these. If Ofcom is concerned—even a bit concerned—it can request them.

Hon. Members are then making a slightly adjacent point about transparency—about whether the risk assessments should be made, essentially, publicly available. There are legitimate questions about comprehensive public disclosure and about getting to the heart of what is going on in these companies in the way in which Frances Haugen’s whistleblower disclosures did. But we also need to be mindful of what we might call malign actors—people who are trying to circumvent the provisions of the Bill—in relation to some of the “illegal” provisions, for example. We do not want to give them so much information that they know how they can circumvent the rules. Again, there is a balance to strike between ensuring that the rules are properly enforced and having such a high level of disclosure that people seeking to circumvent the rules are able to work out how to do so.

Kirsty Blackman

If the rules are so bad that people can circumvent them, they are not good enough anyway and they need to be updated, but I have a specific question on this. The Minister says that Ofcom will be taking in the biggest risk assessments, looking at them and ensuring that they are adequate. Will he please give consideration to asking Ofcom to publish the risk assessments from the very biggest platforms? Then they will all be in one place. They will be easy for people to find and people will not have to rake about in the bottom sections of a website. And it will apply only in the case of the very biggest, most at risk platforms, which should be regularly updating their risk assessments and changing their processes on a very regular basis in order to ensure that people are kept safe.

Chris Philp

I thank the hon. Lady for her intervention and for the—

The Chair

Order. I am sorry to interrupt the Minister, but I now have to adjourn the sitting until this afternoon, when the Committee will meet again, in Room 9 and with Ms Rees in the Chair.

11:25
The Chair adjourned the Committee without Question put (Standing Order No. 88).
Adjourned till this day at Two o’clock.

Online Safety Bill (Sixth sitting)

Committee stage
Tuesday 7th June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 7 June 2022
The Committee consisted of the following Members:
Chairs: Sir Roger Gale, † Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
† Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 7 June 2022
(Afternoon)
[Christina Rees in the Chair]
Online Safety Bill
14:00
The Chair

Welcome back. I have a few announcements. I have been reassured that we will have no transmission problems this afternoon, and apparently the audio of this morning’s sitting is available if Members want to listen to it. I have no objections to Members taking their jackets off, because it is rather warm this afternoon. We are expecting a Division in the main Chamber at about 4 o’clock, so we will suspend for 15 minutes if that happens.

Kirsty Blackman (Aberdeen North) (SNP)

I am sorry, Ms Rees, but I am afraid that I cannot hear you very well.

The Chair

I will shout a bit in that case.

Clause 8

Illegal content risk assessment duties

Amendment proposed (this day): 10, in clause 8, page 6, line 33, at end insert—

“(4A) A duty to publish the illegal content risk assessment and proactively supply this to OFCOM.”—(Alex Davies-Jones.)

This amendment creates a duty to publish an illegal content risk assessment and supply it to Ofcom.

Question again proposed, That the amendment be made.

The Chair

I remind the Committee that with this we are discussing the following:

Amendment 14, in clause 8, page 6, line 33, at end insert—

“(4A) A duty for the illegal content risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the illegal content risk assessment duties, and reports directly into the most senior employee of the entity.”

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for illegal content risk assessments.

Amendment 25, in clause 8, page 7, line 3, after the third “the” insert “production,”.

This amendment requires the risk assessment to take into account the risk of the production of illegal content, as well as the risk of its presence and dissemination.

Amendment 19, in clause 8, page 7, line 14, at end insert—

“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—

(i) enable users to encounter illegal content on other regulated user-to-user services, and

(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”

This amendment would incorporate into the duties a requirement to consider cross-platform risk.

Clause stand part.

Amendment 20, in clause 9, page 7, line 30, at end insert—

“, including by being directed while on the service towards priority illegal content hosted by a different service;”.

This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.

Amendment 26, in clause 9, page 7, line 30, at end insert—

“(aa) prevent the production of illegal content by means of the service;”.

This amendment incorporates a requirement to prevent the production of illegal content within the safety duties.

Amendment 18, in clause 9, page 7, line 35, at end insert—

“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”

This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.

Amendment 21, in clause 9, page 7, line 35, at end insert—

“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content,”.

This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.

Clause 9 stand part.

Amendment 30, in clause 23, page 23, line 24, after “facilitating” insert—

“the production of illegal content and”.

This amendment requires the illegal content risk assessment to consider the production of illegal content.

Clause 23 stand part.

Amendment 31, in clause 24, page 24, line 2, after “individuals” insert “producing or”.

This amendment expands the safety duty to include the need to minimise the risk of individuals producing certain types of search content.

Clause 24 stand part.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a great pleasure to serve under your chairmanship, Ms Rees, and I am glad that this afternoon’s Committee proceedings are being broadcast to the world.

Before we adjourned this morning, I was in the process of saying that one of the challenges with public publication of the full risk assessment, even for larger companies, is that the vulnerabilities in their systems, or the potential opportunities to exploit those systems for criminal purposes, would then be publicly exposed in a way that may not serve the public interest, and that is a reason for not requiring complete disclosure of everything.

However, I draw the Committee’s attention to the existing transparency provisions in clause 64. We will come on to them later, but I want to mention them now, given that they are relevant to amendment 10. The transparency duties state that, once a year, Ofcom must serve notice on the larger companies—those in categories 1, 2A and 2B—requiring them to produce a transparency report. That is not a power for Ofcom—it is a requirement. Clause 64(1) states that Ofcom

“must give every provider…a notice which requires the provider to produce…(a ‘transparency report’).”

The content of the transparency report is specified by Ofcom, as set out in subsection (3). As Members will see, Ofcom has wide powers to specify what must be included in the report. On page 186, schedule 8—I know that we will debate it later, but it is relevant to the amendment—sets out the scope of what Ofcom can require. It is an extremely long list that covers everything we would wish to see. Paragraph 1, for instance, states:

“The incidence of illegal content, content that is harmful to children and priority content that is harmful to adults on a service.”

Therefore, the transparency reporting requirement—it is not an option but a requirement—in clause 64 addresses the transparency point that was raised earlier.

Amendment 14 would require a provider’s board members or senior manager to take responsibility for the illegal content risk assessment. We agree with the Opposition’s point. Indeed, we agree with what the Opposition are trying to achieve in a lot of their amendments.

Chris Philp

There is a “but” coming. We think that, in all cases apart from one, the Bill as drafted already addresses the matter. In the case of amendment 14, the risk assessment duties as drafted already explicitly require companies to consider how their governance structures may affect the risk of harm to users arising from illegal content. Ofcom will provide guidance to companies about how they can comply with those duties, which is very likely to include measures relating to senior-level engagement. In addition, Ofcom can issue confirmation decisions requiring companies to take specific steps to come into compliance. To put that simply, if Ofcom thinks that there is inadequate engagement by senior managers in relation to the risk assessment duties, it can require—it has the power to compel—a change of behaviour by the company.

I come now to clause 9—I think this group includes clause 9 stand part as well. The shadow Minister has touched on this. Clause 9 contains safety duties in relation to—

The Chair

Order. Minister, I do not think we are doing clause 9. We are on clause 8.

Chris Philp

I think the group includes clause 9 stand part, but I will of course be guided by you, Ms Rees.

The Chair

No, clause 9 is separate.

Chris Philp

Very well; we will debate clause 9 separately. In that case, I will move on to amendments 19 and 20, which seek to address cross-platform risk. Again, we completely agree with the Opposition that cross-platform risk is a critical issue. We heard about it in evidence. It definitely needs to be addressed and covered by the Bill. We believe that it is covered by the Bill, and our legal advice is that it is covered by the Bill, because in clause 8 as drafted—[Interruption.] Bless you—or rather, I bless the shadow Minister, following Sir Roger’s guidance earlier, lest I inadvertently bless the wrong person.

Clause 8 already includes the phrase to which I alluded previously. I am talking about the requirement that platforms risk-assess illegal content that might be encountered

“by means of the service”.

That is a critical phrase, because it means not just on that service itself; it also means, potentially, via that service if, for example, that service directs users onward to illegal content on another site. By virtue of the words,

“by means of the service”,

appearing in clause 8 as drafted, the cross-platform risk that the Opposition and witnesses have rightly referred to is covered. Of course, Ofcom will set out further steps in the code of practice as well.

Dame Maria Miller (Basingstoke) (Con)

I was listening very closely to what the Minister was saying and I was hoping that he might be able to comment on some of the evidence that was given, particularly by Professor Lorna Woods, who talked about the importance of risk assessments being about systems, not content. Would the Minister pick up on that point? He was touching on it in his comments, and I was not sure whether this was the appropriate point in the Bill at which to bring it up.

Chris Philp

I thank my right hon. Friend for raising that. The risk assessments and, indeed, the duties arising under this Bill all apply to systems and processes—setting up systems and processes that are designed to protect people and to prevent harmful and illegal content from being encountered. We cannot specify in legislation every type of harmful content that might be encountered. This is about systems and processes. We heard the Chairman of the Joint Committee on the draft Online Safety Bill, our hon. Friend the Member for Folkestone and Hythe (Damian Collins), confirm to the House on Second Reading his belief—his accurate belief—that the Bill takes a systems-and-processes approach. We heard some witnesses saying that as well. The whole point of this Bill is that it is tech-agnostic—to future-proof it, as hon. Members mentioned this morning—and it is based on systems and processes. That is the core architecture of the legislation that we are debating.

Amendments 25 and 26 seek to ensure that user-to-user services assess and mitigate the risk of illegal content being produced via functions of the service. That is covered, as it should be—the Opposition are quite right to raise the point—by the illegal content risk assessment and safety duties in clauses 8 and 9. Specifically, clause 8(5)(d), on page 7 of the Bill—goodness, we are only on page 7 and we have been going for over half a day already—requires services to risk-assess functionalities of their service being used to facilitate the presence of illegal content. I stress the word “presence” in clause 8(5)(d). Where illegal content is produced by a functionality of the service—for example, by being livestreamed—that content will be present on the service and companies must mitigate that risk. The objective that the Opposition are seeking to achieve, and with which we completely agree, is covered in clause 8(5)(d) by the word “presence”. If the content is present, it is covered by that section.

Kirsty Blackman

Specifically on that, I understand the point the hon. Gentleman is making and appreciate his clarification. However, on something such as Snapchat, if somebody takes a photo, it is sent to somebody else, then disappears immediately, because that is what Snapchat does—the photo is no longer present. It has been produced and created there, but it is not present on the platform. Can the Minister consider whether the Bill adequately covers all the instances he hopes are covered?

Chris Philp

The hon. Lady raises an interesting point about time. However, clause 8(5)(d) uses the wording,

“the level of risk of functionalities of the service facilitating the presence or dissemination of illegal content”

and so on. That presence can happen at any time, even fleetingly, as with Snapchat. Even when the image self-deletes after a certain period—so I am told, I have not actually used Snapchat—the presence has occurred. Therefore, that would be covered by clause 8(5)(d).

Alex Davies-Jones

Will the Minister explain how we would be able to prove, once the image is deleted, that it was present on the platform?

Chris Philp

The question of proof is a separate one, and that would apply however we drafted the clause. The point is that the clause provides that any presence of a prohibited image would fall foul of the clause. There are also duties on the platforms to take reasonable steps. In the case of matters such as child sexual exploitation and abuse images, there are extra-onerous duties that we have discussed before, for obvious and quite correct reasons.

Kirsty Blackman

Will the Minister stress again that in this clause specifically he is talking about facilitating any presence? That is the wording that he has just used. Can he clarify exactly what he means? If the Minister were to do so, it would be an important point for the Bill as it proceeds.

The Chair

Order. Minister, before you continue, before the Committee rose earlier today, there was a conversation about clause 9 being in, and then I was told it was out. This is like the hokey cokey; it is back in again, just to confuse matters further. I was confused enough, so that point needs to be clarified.

Alex Davies-Jones

It is grouped, Chair. We were discussing clause 8 and the relevant amendments, then we were going to come back to clause 9 and the relevant amendments.

The Chair

Is that as clear as mud?

Chris Philp

I am happy to follow your direction, Ms Rees. I find that that is usually the wisest course of action.

I will speak to amendment 18, which is definitely on the agenda for this grouping and which the shadow Minister addressed earlier. It would oblige service providers to put in place systems and processes

“to minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”

The Government completely support that objective, quite rightly promoted by the Opposition, but it is set out in the Bill as drafted. The companies in scope are obliged to take comprehensive measures to tackle CSEA content, including where one service directs its users onward to a second service.

Amendment 21, in a similar spirit, talks about cross-platform collaboration. I have already mentioned the way in which the referral of a user from one platform to another is within the scope of the Bill. Again, under its provisions, service providers must put in place proportionate systems and processes to mitigate identified cross-platform harms and, where appropriate to achieve that objective, service providers would be expected to collaborate and communicate with one another. If Ofcom finds that they are not engaging in appropriate collaborative behaviour, which means they are not discharging their duty to protect people and children, it can intervene. While we agree completely with the objective sought, the Bill already addresses it.

14:15
Amendments 30 and 31 are slightly different, as they try to put a duty on search services not to facilitate
“the production of illegal content”.
Search services cannot produce or facilitate illegal content; all they can do is facilitate searches. Searching for illegal content using a search service is already covered by the Bill, and the end company that might be providing the illegal content would be covered as well if it is a user-to-user service. Everything that search services could reasonably be expected to do in this area is already covered by the duties imposed upon them.
Ms Rees, are we dealing with clauses 23 and 24 now, or later?
The Chair

They are in this group, so you may deal with them now.

Chris Philp

Obviously, I encourage the Committee to support those clauses standing part of the Bill. They impose duties on search services—we touched on search a moment ago—to assess the nature and risk to individuals of accessing illegal content via their services, and to minimise the risk of users encountering that illegal content. They are very similar duties to those we discussed for user-to-user services, but applied in the search context. I hope that that addresses all the relevant provisions in the group that we are debating.

Alex Davies-Jones

I am grateful for the opportunity to speak to amendments to clause 9 and to clauses 23 and 24, which I did not speak on earlier. I am also very grateful that we are being broadcast live to the world and welcome that transparency for all who might be listening.

On clause 9, it is right that the user-to-user services will be required to have specific duties and to take appropriate measures to mitigate and manage the risk of harm to individuals and their likelihood of encountering priority illegal content. Again, however, the Bill does not go far enough, which is why we are seeking to make these important amendments. On amendment 18, it is important to stress that the current scope of the Bill does not capture the range of ways in which child abusers use social networks to organise abuse, including to form offender networks. They post digital breadcrumbs that signpost to illegal content on third-party messaging apps and the dark web, and they share child abuse videos that are carefully edited to fall within content moderation guidelines. This range of techniques, known as child abuse breadcrumbing, is a significant enabler of online child abuse.

Our amendment would give the regulator powers to tackle breadcrumbing and ensure a proactive upstream response. The amendment would ensure that tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material will be brought into regulatory scope. It will not leave that as ambiguous. The amendment will also ensure that companies must tackle child abuse at the earliest possible stage. As it stands, the Bill would reinforce companies’ current focus only on material that explicitly reaches the criminal threshold. Because companies do not focus their approach on other child abuse material, abusers can exploit this knowledge to post carefully edited child abuse images and content that enables them to connect and form networks with other abusers. Offenders understand and can anticipate that breadcrumbing material will not be proactively identified or removed by the host site, so they are able to organise and link to child abuse in plain sight.

We all know that child abuse breadcrumbing takes many forms, but techniques include tribute sites where users create social media profiles using misappropriated identities of known child abuse survivors. These are used by offenders to connect with likeminded perpetrators to exchange contact information, form offender networks and signpost child abuse material elsewhere online. In the first quarter of 2021, there were 6 million interactions with such accounts.

Abusers may also use Facebook groups to build offender groups and signpost to child abuse hosted on third-party sites. Those groups are thinly veiled in their intentions; for example, as we heard in evidence sessions, groups are formed for those with an interest in children celebrating their 8th, 9th or 10th birthdays. Several groups with over 50,000 members remained alive despite being reported to Meta, and algorithmic recommendations quickly suggested additional groups for those members to join.

Lastly, abusers can signpost to content on third-party sites. Abusers are increasingly using novel forms of technology to signpost to online child abuse, including QR codes, immersive technologies such as the metaverse, and links to child abuse hosted on the blockchain. Given the highly agile nature of the child abuse threat and the demonstrable ability of sophisticated offenders to exploit new forms of technology, this amendment will ensure that the legislation is effectively futureproofed. Technological change makes it increasingly important that the ability of child abusers to connect and form offender networks can be disrupted at the earliest possible stage.

Turning to amendment 21, we know that child abuse is rarely siloed on a single platform or app. Well-established grooming pathways see abusers exploit the design features of social networks to contact children before they move communication across to other platforms, including livestreaming sites, as we have already heard, and encrypted messaging services. Offenders manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with a large number of children. They can then use direct messages to groom them and coerce children into sending sexual images via WhatsApp. Similarly, as we heard earlier, abusers can groom children through playing videogames and then bringing them on to another ancillary platform, such as Discord.

The National Society for the Prevention of Cruelty to Children has shared details of an individual whose name has been changed, and whose case particularly highlights the problems that children are facing in the online space. Ben was 14 when he was tricked on Facebook into thinking he was speaking to a female friend of a friend, who turned out to be a man. Using threats and blackmail, he coerced Ben into sending abuse images and performing sex acts live on Skype. Those images and videos were shared with five other men, who then bombarded Ben with further demands. His mum, Rachel, said:

“The abuse Ben suffered had a devastating impact on our family. It lasted two long years, leaving him suicidal.

It should not be so easy for an adult to meet and groom a child on one site then trick them into livestreaming their own abuse on another app, before sharing the images with like-minded criminals at the click of a button.

Social media sites should have to work together to stop this abuse happening in the first place, so other children do not have to go through what Ben did.”

The current drafting of the Bill does not place sufficiently clear obligations on platforms to co-operate on the cross-platform nature of child abuse. Amendment 21 would require companies to take reasonable and proportionate steps to share threat assessments, develop proportionate mechanisms to share offender intelligence, and create a rapid response arrangement to ensure that platforms develop a coherent, systemic approach to new and emerging threats. Although the industry has developed a systemic response to the removal of known child abuse images, these are largely ad hoc arrangements that share information on highly agile risk profiles. The cross-platform nature of grooming and the interplay of harms across multiple services need to be taken into account. If it is not addressed explicitly in the Bill, we are concerned that companies may be able to cite competition concerns to avoid taking action.

Kirsty Blackman

On the topic of child abuse images, the hon. Member spoke earlier about livestreaming and those images not being captured. I assume that she would make the same point in relation to this issue: these live images may not be captured by AI scraping for them, so it is really important that they are included in the Bill in some way as well.

Alex Davies-Jones

I completely agree with the hon. Member, and appreciate her intervention. It is fundamental for this point to be captured in the Bill because, as we are seeing, this is happening more and more. More and more victims are coming forward who have been subject to livestreaming that is not picked up by the technology available, and is then recorded and posted elsewhere on smaller platforms.

Legal advice suggests that cross-platform co-operation is likely to be significantly impeded by the negative interplay with competition law unless there is a clear statutory basis for enabling or requiring collaboration. Companies may legitimately have different risk and compliance appetites, or may simply choose to hide behind competition law to avoid taking a more robust form of action.

New and emerging technologies are likely to produce an intensification of cross-platform risks in the years ahead, and we are particularly concerned about the child abuse impacts in immersive virtual reality and alternative-reality environments, including the metaverse. A number of high-risk immersive products are already designed to be platform-agnostic, meaning that in-product communication takes place between users across multiple products and environments. There is a growing expectation that these environments will be built along such lines, with an incentive for companies to design products in this way in the hope of blunting the ability of Governments to pursue user safety objectives.

Separately, regulatory measures that are being developed in the EU, but are highly likely to impact service users in the UK, could result in significant unintended safety consequences. Although the interoperability provisions in the Digital Markets Act are strongly beneficial when viewed through a competition lens—they will allow the competition and communication of multiple platforms—they could, without appropriate safety mitigations, provide new means for abusers to contact children across multiple platforms, significantly increase the overall profile of cross-platform risk, and actively frustrate a broad number of current online safety responses. Amendment 21 will provide corresponding safety requirements that can mitigate the otherwise significant potential for unintended consequences.

The Minister referred to clauses 23 and 24 in relation to amendments 30 and 31. We think a similar consideration should apply for search services as well as for user-to-user services. We implore that the amendments be made, in order to prevent those harms from occurring.

Chris Philp

I have already commented on most of those amendments, but one point that the shadow Minister made that I have not addressed was about acts that are essentially preparatory to acts of child abuse or the exchange of child sexual exploitation and abuse images. She was quite right to raise that issue as a matter of serious concern that we would expect the Bill to prevent, and I offer the Committee the reassurance that the Bill, as drafted, does so.

Schedule 6 sets out the various forms of child sexual exploitation and abuse that are designated as priority offences and that platforms have to take proactive steps to prevent. On the cross-platform point, that includes, as we have discussed, things that happen through a service as well as on a service. Critically, paragraph 9 of schedule 6 includes “inchoate offences”, which means someone not just committing the offence but engaging in acts that are preparatory to committing the offence, conspiring to commit the offence, or procuring, aiding or abetting the commission of the offence. The preparatory activities that the shadow Minister referred to are covered under schedule 6, particularly paragraph 9.

Alex Davies-Jones

I thank the Minister for giving way. I notice that schedule 6 includes provision on the possession of indecent photographs of children. Can he confirm that that provision encapsulates the livestreaming of sexual exploitation?

Chris Philp

Yes, I can.

Question put, That the amendment be made.

Division 2
Ayes: 5 (Labour: 5)
Noes: 9 (Conservative: 9)

14:29
Amendment proposed: 14, in clause 8, page 6, line 33, at end insert:
“(4A) A duty for the illegal content risk assessment to be approved by either—
(a) the board of the entity; or, if the organisation does not have a board structure,
(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the illegal content risk assessment duties, and reports directly into the most senior employee of the entity.”—(Alex Davies-Jones.)
This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for illegal content risk assessments.
Question put, That the amendment be made.

Division 3
Ayes: 7 (Labour: 5, Scottish National Party: 2)
Noes: 9 (Conservative: 9)

Amendment proposed: 25, in clause 8, page 7, line 3, after the third “the” insert “production,”.—(Alex Davies-Jones.)
This amendment requires the risk assessment to take into account the risk of the production of illegal content, as well as the risk of its presence and dissemination.
Question put, That the amendment be made.

Division 4
Ayes: 7 (Labour: 5, Scottish National Party: 2)
Noes: 9 (Conservative: 9)

Amendment proposed: 19, in clause 8, page 7, line 14, at end insert—
“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—
(i) enable users to encounter illegal content on other regulated user-to-user services, and
(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”—(Alex Davies-Jones.)
This amendment would incorporate into the duties a requirement to consider cross-platform risk.
Question put, That the amendment be made.

Division 5
Ayes: 7 (Labour: 5, Scottish National Party: 2)
Noes: 9 (Conservative: 9)

Amendment proposed: 17, in clause 8, page 7, line 14, at end insert—
“(5A) The duties set out in this section apply in respect of content which reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”—(Alex Davies-Jones.)
This amendment extends the illegal content risk assessment duties to cover content which could be foreseen to facilitate or aid the discovery or dissemination of CSEA content.
The Chair

With this it will be convenient to discuss amendment 28, in clause 10, page 9, line 18, at end insert—

“(ba) matters relating to CSEA content including—

(i) the level of illegal images blocked at the upload stage and number and rates of livestreams of CSEA in public and private channels terminated; and

(ii) the number and rates of images and videos detected and removed by different tools, strategies and/or interventions.”

This amendment requires the children’s risk assessment to consider matters relating to CSEA content.

Barbara Keeley (Worsley and Eccles South) (Lab)

As this is the first time I have spoken in the Committee, may I say that it is a pleasure to serve with you in the Chair, Ms Rees? I agree with my hon. Friend the Member for Pontypridd that we are committed to improving the Bill, despite the fact that we have some reservations, which we share with many organisations, about some of the structure of the Bill and some of its provisions. As my hon. Friend has detailed, there are particular improvements to be made to strengthen the protection of children online, and I think the Committee’s debate on this section is proving fruitful.

Amendment 28 is a good example of where we must go further if we are to achieve the goal of the Bill and protect children from harm online. The amendment seeks to require regulated services to assess their level of risk based, in part, on the frequency with which they are blocking, detecting and removing child sexual exploitation and abuse content from their platforms. By doing so, we will be able to ascertain the reality of their overall risk and the effectiveness of their existing response.

The addition of livestreamed child sexual exploitation and abuse content not only acknowledges first-generation CSEA content, but recognises that livestreamed CSEA content happens on both public and private channels, and that they require different methods of detection.

Furthermore, amendment 28 details the practical information needed to assess whether the action being taken by a regulated service is adequate in countering the production and dissemination of CSEA content, in particular first-generation CSEA content. Separating the rates of terminated livestreams of CSEA in public and private channels is important, because those rates may vary widely depending on how CSEA content is generated. By specifying tools, strategies and interventions, the amendment would ensure that the systems in place to detect and report CSEA are adequate, and that is why we would like it to be part of the Bill.

Chris Philp

The Government support the spirit of amendments 17 and 28, which seek to achieve critical objectives, but the Bill as drafted delivers those objectives. In relation to amendment 17 and cross-platform risk, clause 8 already sets out harms and risks—including CSEA risks—that arise by means of the service. That means through the service to other services, as well as on the service itself, so that is covered.

Amendment 28 calls for the risk assessments expressly to cover illegal child sexual exploitation content, but clause 8 already requires that to happen. Clause 8(5) states that the risk assessment must cover the

“risk of individuals who are users of the service encountering…each kind of priority illegal content”.

If we follow through the definition of priority illegal content, we find all those CSEA offences listed in schedule 6. The objective of amendment 28 is categorically delivered by clause 8(5)(b), referencing onwards to schedule 6.

Kirsty Blackman

The amendment specifically mentions the level and rates of those images. I did not quite manage to follow through all the things that the Minister just spoke about, but does the clause specifically talk about the level of those things, rather than individual incidents, the possibility of incidents or some sort of threshold for incidents, as in some parts of the Bill?

Chris Philp

The risk assessments that clause 8 requires have to be suitable and sufficient; they cannot be perfunctory and inadequate in nature. I would say that suitable and sufficient means they must go into the kind of detail that the hon. Lady requests. More details, most of which relate to timing, are set out in schedule 3. Ofcom will be making sure that these risk assessments are not perfunctory.

Importantly, in relation to CSEA reporting, clause 59, which we will come to, places a mandatory requirement on in-scope companies to report to the National Crime Agency all CSEA content that they detect on their platforms, if it has not already been reported. Not only is that covered by the risk assessments, but there is a criminal reporting requirement here. Although the objectives of amendments 17 and 28 are very important, I submit to the Committee that the Bill delivers the intention behind them already, so I ask the shadow Minister to withdraw them.

Question put, That the amendment be made.

Division 6
Ayes: 7 (Labour: 5, Scottish National Party: 2)
Noes: 9 (Conservative: 9)

Clause 8 ordered to stand part of the Bill.
Clause 9
Safety duties about illegal content
The Chair

Amendments 20, 26, 18 and 21 to clause 9 have already been debated. Does the shadow Minister wish to press any of them to a vote?

Alex Davies-Jones

Amendments 20, 18 and 21.

Amendment proposed: 20, in clause 9, page 7, line 30, at end insert

“, including by being directed while on the service towards priority illegal content hosted by a different service;”—(Alex Davies-Jones.)

This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.

Question put, That the amendment be made.

Division 7
Ayes: 7 (Labour: 5, Scottish National Party: 2)
Noes: 9 (Conservative: 9)

Amendment proposed: 18, in clause 9, page 7, line 35, at end insert—
“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”—(Alex Davies-Jones.)
This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.
Question put, That the amendment be made.

Division 8
Ayes: 7 (Labour: 5, Scottish National Party: 2)
Noes: 9 (Conservative: 9)

14:45
Amendment proposed: 21, in clause 9, page 7, line 35, at end insert—
“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content,”—(Alex Davies-Jones.)
This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.
Question put, That the amendment be made.

Division 9
Ayes: 7 (Labour: 5, Scottish National Party: 2)
Noes: 9 (Conservative: 9)

Clause 9 ordered to stand part of the Bill.
Clause 10
Children’s risk assessment duties
Barbara Keeley

I beg to move amendment 15, in clause 10, page 8, line 41, at end insert—

“(4A) A duty for the children’s risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties, and reports directly into the most senior employee of the entity.”

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for children’s risk assessments.

The Chair

With this it will be convenient to discuss the following:

Amendment 11, in clause 10, page 9, line 2, at end insert—

“(5A) A duty to publish the children’s risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish the children’s risk assessment and supply it to Ofcom.

Amendment 27, in clause 10, page 9, line 25, after “facilitating” insert “the production of illegal content and”

This amendment requires the children’s risk assessment to consider the production of illegal content.

Clause 10 stand part.

Amendment 16, in clause 25, page 25, line 10, at end insert—

“(3A) A duty for the children’s risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties, and reports directly into the most senior employee of the entity.”

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for children’s risk assessments.

Amendment 13, in clause 25, page 25, line 13, at end insert—

“(4A) A duty to publish the children’s risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish the children’s risk assessment and supply it to Ofcom.

Amendment 32, in clause 25, page 25, line 31, after “facilitating” insert “the production of illegal content and”

This amendment requires the children’s risk assessment to consider risks relating to the production of illegal content.

Clause 25 stand part.

Barbara Keeley

I will speak to other amendments in this group as well as amendment 15. The success of the Bill’s regulatory framework relies on regulated companies carefully risk-assessing their platforms. Once risks have been identified, the platform can concentrate on developing and implementing appropriate mitigations. However, up to now, boards and top executives have not taken the risk to children seriously. Services have either not considered producing risk assessments or, if they have done so, they have been of limited efficacy and failed to identify and respond to harms to children.

In evidence to the Joint Committee, Frances Haugen explained that many of the corporate structures involved are flat, and accountability for decision making can be obscure. At Meta, that means teams will focus only on delivering against key commercial metrics, not on safety. Children’s charities have also noted that corporate structures in the large technology platforms reward employees who move fast and break things. Those companies place incentives on increasing return on investment rather than child safety. An effective risk assessment and risk mitigation plan can impact on profit, which is why we have seen so little movement from companies to take the measures themselves without the duty being placed on them by legislation.

It is welcome that clause 10 introduces a duty to risk-assess user-to-user services that are likely to be accessed by children. But, as my hon. Friend the Member for Pontypridd said this morning, it will become an empty, tick-box exercise if the Bill does not also introduce the requirement for boards to review and approve the risk assessments.

The Joint Committee scrutinising the draft Bill recommended that the risk assessment be approved at board level. The Government rejected that recommendation on the grounds that Ofcom could include that in its guidance on producing risk assessments. As with much of the Bill, it is difficult to blindly accept promised safeguards when we have not seen the various codes of practice and guidance materials. The amendments would make sure that decisions about and awareness of child safety went right to the top of regulated companies. The requirement to have the board or a senior manager approve the risk assessment will hardwire the safety duties into decision making and create accountability and responsibility at the most senior level of the organisation. That should trickle down the organisation and help embed a culture of compliance across it. Unless there is a commitment to child safety at the highest level of the organisation, we will not see the shift in attitude that is urgently needed to keep children safe, and which I believe every member of the Committee subscribes to.

On amendments 11 and 13, it is welcome that we have risk assessments for children included in the Bill, but the effectiveness of that duty will be undermined unless the risk assessments can be available for scrutiny by the public and charities. In the current version of the Bill, risk assessments will only be made available to the regulator, which we debated on an earlier clause. Companies will be incentivised to play down the likelihood of currently emerging risks because of the implications of having to mitigate against them, which may run counter to their business interests. Unless the risk assessments are published, there will be no way to hold regulated companies to account, nor will there be any way for companies to learn from one another’s best practice, which is a very desirable aim.

The current situation shows that companies are unwilling to share risk assessments even when requested. In October 2021, following the whistleblower disclosures made by Frances Haugen, the National Society for the Prevention of Cruelty to Children led a global coalition of 60 child protection organisations that urged Meta to publish its risk assessments, including its data privacy impact assessments, which are a legal requirement under data protection law. Meta refused to share any of its risk assessments, even in relation to child sexual abuse and grooming. The company argued that risk assessments were live documents and it would not be appropriate for it to share them with any organisation other than the Information Commissioner’s Office, to whom it has a legal duty to disclose. As a result, civil society organisations and the charities that I talked about continue to be in the dark about whether and how Meta has appropriately identified online risk to children.

Making risk assessments public would support the smooth running of the regime and ensure its broader effectiveness. Civil society and other interested groups would be able to assess and identify any areas where a company might not be meeting its safety duties and make full, effective use of the proposed super-complaints mechanism. It will also help civil society organisations to hold the regulated companies and the regulator, Ofcom, to account.

As we have seen from evidence sessions, civil society organisations are often at the forefront of understanding and monitoring the harms that are occurring to users. They have an in depth understanding of what mitigations may be appropriate and they may be able to support the regulator to identify any obvious omissions. The success of the systemic risk assessment process will be significantly underpinned by and reliant upon the regulator’s being able to rapidly and effectively identify new and emerging harms, and it is highly likely that the regulator will want to draw on civil society expertise to ensure that it has highly effective early warning functions in place.

However, civil society organisations will be hampered in that role if they remain unable to determine what, if anything, companies are doing to respond to online threats. If Ofcom is unable to rapidly identify new and emerging harms, the resulting delays could mean entire regulatory cycles where harms were not captured in risk profiles or company risk assessments, and an inevitable lag between harms being identified and companies being required to act upon them. It is therefore clear that there is a significant public value to publishing risk assessments.

Amendments 27 and 32 are almost identical to the suggested amendments to clause 8 that we discussed earlier. As my hon. Friend the Member for Pontypridd said in our discussion about amendments 25, 26 and 30, the duty to carry out a suitable and sufficient risk assessment could be significantly strengthened by preventing the creation of illegal content, not only preventing individuals from encountering it. I know the Minister responded to that point, but the Opposition did not think that response was fully satisfactory. This is just as important for children’s risk assessments as it is for illegal content risk assessments.

Online platforms are not just where abusive material is published. Sex offenders use mainstream web platforms and services as tools to commit child sexual abuse. This can be seen particularly in the livestreaming of child sexual exploitation. Sex offenders pay to direct and watch child sexual abuse in real time. The Philippines is a known hotspot for such abuse and the UK has been identified by police leads as the third-largest consumer of livestreamed abuse in the world. What a very sad statistic that our society is the third-largest consumer of livestreamed abuse in the world.

Ruby is a survivor of online sexual exploitation in the Philippines, although Ruby is not her real name; she recently addressed a group of MPs about her experiences. She told Members how she was trafficked into sexual exploitation aged 16 after being tricked and lied to about the employment opportunities she thought she would be getting. She was forced to perform for paying customers online. Her story is harrowing. She said:

“I blamed myself for being trapped. I felt disgusted by every action I was forced to do, just to satisfy customers online. I lost my self-esteem and I felt very weak. I became so desperate to escape that I would shout whenever I heard a police siren go by, hoping somebody would hear me. One time after I did this, a woman in the house threatened me with a knife.”

Eventually, Ruby was found by the Philippine authorities and, after a four-year trial, the people who imprisoned her and five other girls were convicted. She said it took many years to heal from the experience, and at one point she nearly took her own life.

It should be obvious that if we are to truly improve child protection online we need to address the production of new child abuse material. In the Bill, we have a chance to address not only what illegal content is seen online, but how online platforms are used to perpetrate abuse. It should not be a case of waiting until the harm is done before taking action.

Chris Philp

As the hon. Lady said, we discussed in the groupings for clauses 8 and 9 quite a few of the broad principles relating to children, but I will none the less touch on some of those points again because they are important.

On amendment 27, under clause 8 there is already an obligation on platforms to put in place systems and processes to reduce the risk that their services will be used to facilitate the presence of illegal content. As that includes the risk of illegal content being present, including that produced via the service’s functionality, the terrible example that the hon. Lady gave is already covered by the Bill. She is quite right to raise that example, because it is terrible when such content involving children is produced, but such cases are expressly covered in the Bill as drafted, particularly in clause 8.

Amendment 31 covers a similar point in relation to search. As I said for the previous grouping, search does not facilitate the production of content; it helps people to find it. Clearly, there is already an obligation on search firms to stop people using search engines to find illegal content, so the relevant functionality in search is already covered by the Bill.

Amendments 15 and 16 would expressly require board member sign-off for risk assessments. I have two points to make on that. First, the duties set out in clause 10(6)(h) in relation to children’s risk assessments already require the governance structures to be properly considered, so governance is directly addressed. Secondly, subsection (2) states that the risk assessment has to be “suitable and sufficient”, so it cannot be done in a perfunctory or slipshod way. Again, Ofcom must be satisfied that those governance arrangements are appropriate. We could invent all the governance arrangements in the world, but what matters is that the outcome is delivered, and in this case that outcome is protecting children.

Beyond governance, the most important things are the sanctions and enforcement powers that Ofcom can use if those companies do not protect children. As the hon. Lady said in her speech, we know that those companies are not doing enough to protect children and are allowing all kinds of terrible things to happen. If those companies continue to allow those things to happen, the enforcement powers will be engaged, and they will be fined up to 10% of their global revenue. If they do not sort it out, they will find that their services are disconnected. Those are the real teeth that will ensure that those companies comply.

Barbara Keeley

I know that the Minister listened to Frances Haugen and to the members of charities. The charities and civil society organisations that are so concerned about this point do not accept that the Bill addresses it. I cannot see how his point addresses what I said about board-level acceptance of that role in children’s risk assessments. We need to change the culture of those organisations so that they become different from how they were described to us. He, like us, was sat there when we heard from the big platform providers, and they are not doing enough. He has had meetings with Frances Haugen; he knows what they are doing. It is good and welcome that the regulator will have the powers that he mentions, but that is just not enough.

15:19
Chris Philp

I agree with the hon. Lady that, as I said a second ago, those platforms are not doing enough to protect children. There is no question about that at all, and I think there is unanimity across the House that they are not doing enough to protect children.

I do not think the governance point is a panacea. Frankly, I think the boards of these companies are aware of what is going on. When these big questions arise, they go all the way up to Mark Zuckerberg. It is not as if Mark Zuckerberg and the directors of companies such as Meta are unaware of these risks; they are extremely aware of them, as Frances Haugen’s testimony made clear.

We do address the governance point. As I say, the risk assessments do need to explain how governance matters are deployed to consider these things—that is in clause 10(6)(h). But for me, it is the sanctions—the powers that Ofcom will have to fine these companies billions of pounds and ultimately to disconnect their service if they do not protect our children—that will deliver the result that we need.

Barbara Keeley

The Minister is talking about companies of such scale that even fines of billions will not hurt them. I refer him to the following wording in the amendments:

“a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties”.

That is the minimum we should be asking. We should be asking these platforms, which are doing so much damage and have had to be dragged to the table to do anything at all, to be prepared to appoint somebody who is responsible. The Minister tries to gloss over things by saying, “Oh well, they must be aware of it.” The named individual would have to be aware of it. I hope he understands the importance of his role and the Committee’s role in making this happen. We could make this happen.

Chris Philp

As I say, clause 10 already references the governance arrangements, but my strong view is that the only thing that will make these companies sit up and take notice—the only thing that will make them actually protect children in a way they are currently not doing—is the threat of billions of pounds of fines and, if they do not comply even after being fined at that level, the threat of their service being disconnected. Ultimately, that is the sanction that will make these companies protect our children.

Kim Leadbeater (Batley and Spen) (Lab)

As my hon. Friend the Member for Worsley and Eccles South has said, the point here is about cultural change, and the way to do that is through leadership. It is not about shutting the gate after the horse has bolted. Fining the companies might achieve something, but it does not tackle the root of the problem. It is about cultural change and leadership at these organisations. We all agree across the House that they are not doing enough, so how do we change that culture? It has to come from leadership.

Chris Philp

Yes, and that is why governance is addressed in the clause as drafted. But the one thing that will really change the way the leadership of these companies thinks about this issue is the one thing they ultimately care about—money. The reason they allow unsafe content to circulate and do not rein in or temper their algorithms, and the reason we are in this situation, which has arisen over the last 10 years or so, is that these companies have consistently prioritised profit over protection. Ultimately, that is the only language they understand—it is that and legal compulsion.

While the Bill rightly addresses governance in clause 10 and in other clauses, as I have said a few times, what has to happen to make this change occur is the compulsion that is inherent in the powers to fine and to deny service—to pull the plug—that the Bill also contains. The thing that will give reassurance to our constituents, and to me as a parent, is knowing that for the first time ever these companies can properly be held to account. They can be fined. They can have their connection pulled out of the wall. Those are the measures that will protect our children.

Alex Davies-Jones

The Minister is being very generous with his time, but I do not think he appreciates the nature of the issue. Mark Zuckerberg’s net worth is $71.5 billion. Elon Musk, who is reported to be purchasing Twitter, is worth $218 billion. Bill Gates is worth $125 billion. Money does not matter to these people.

The Minister discusses huge fines for the companies and the potential sanction of bringing down their platforms. They will just set up another one. That is what we are seeing with the smaller platforms: they are closing down and setting up new platforms. These measures do not matter. What matters and will actually make a difference to the safety of children and adults online is personal liability—holding people personally responsible for the direct harm they are causing to people here in the United Kingdom. That is what these amendments seek to do, and that is why we are pushing them so heavily. I urge the Minister to respond to that.

Chris Philp

We discussed personal liability extensively this morning. As we discussed, there is personal liability in relation to providing information, with a criminal penalty of up to two years’ imprisonment, to avoid situations like the one we saw a year or two ago, where one of these companies failed to provide the Competition and Markets Authority with the information that it required.

The shadow Minister pointed out the very high levels of global turnover—$71.5 billion—that these companies have. That means that ultimately they can be fined up to $7 billion for each set of breaches. That is a vast amount of money, particularly if those breaches happen repeatedly. She said that such companies will just set up again if we deny their service. Clearly, small companies can close down and set up again the next day, but gigantic companies, such as Meta—Facebook—cannot do that. That is why I think the sanctions I have pointed to are where the teeth really lie.

I accept the point about governance being important as well; I am not dismissing that. That is why we have personal criminal liability for information provision, with up to two years in prison, and it is why governance is referenced in clause 10. I accept the spirit of the points that have been made, but I think the Bill delivers these objectives as drafted.

Dame Maria Miller

Will my hon. Friend give way?

Chris Philp

One last time, because I am conscious that we need to make some progress this afternoon.

Dame Maria Miller

I have huge sympathy with the point that the Minister is making on this issue, but the hon. Member for Pontypridd is right to drive the point home. The Minister says there will be huge fines, but I think there will also be huge court bills. There will be an awful lot of litigation about how things are interpreted, because so much money will come into play. I just reiterate the importance of the guidance and the codes of practice, because if we do not get those right then the whole framework will be incredibly fragile. We will need ongoing scrutiny of how the Bill works or there will be a very difficult situation.

Chris Philp

My right hon. Friend, as always, makes a very good point. The codes of practice will be important, particularly to enable Ofcom to levy fines where appropriate and then successfully defend them. This is an area that may get litigated. I hope that, should lawyers litigating these cases look at our transcripts in the future, they will see how strongly those on both sides of the House feel about this point. I know that Ofcom will ensure that the codes of practice are properly drafted. We touched this morning on the point about timing; we will follow up with Ofcom to make sure that the promise it made us during the evidence session about the road map is followed through and that those get published in good time.

On the point about the Joint Committee, I commend my right hon. Friend for her persistence—[Interruption.] Her tenacity—that is the right word. I commend her for her tenacity in raising that point. I mentioned it to the Secretary of State when I saw her at lunchtime, so the point that my right hon. Friend made this morning has been conveyed to the highest levels in the Department.

I must move on to the final two amendments, 11 and 13, which relate to transparency. Again, we had a debate about transparency earlier, when I made the point about the duties in clause 64, which I think cover the issue. Obviously, we are not debating clause 64 now but it is relevant because it requires Ofcom—it is not an option but an obligation; Ofcom must do so—to require providers to produce a transparency report every year. Ofcom can say what is supposed to be in the report, but the relevant schedule lists all the things that can be in it, and covers absolutely everything that the shadow Minister and the hon. Member for Worsley and Eccles South want to see in there.

That requirement to publish transparently and publicly is in the Bill, but it is to be found in clause 64. While I agree with the Opposition’s objectives on this point, I respectfully say that those objectives are delivered by the Bill as drafted, so I politely and gently request that the amendments be withdrawn.

Kirsty Blackman

I have a couple of comments, particularly about amendments 15 and 16, which the Minister has just spoken about at some length. I do not agree with the Government’s assessment that the governance subsection is adequate. It states that the risk assessment must take into account

“how the design and operation of the service (including the business model, governance, use of proactive technology…) may reduce or increase the risks identified.”

It is actually an assessment of whether the governance structure has an impact on the risk assessment. It has no impact whatever on the level at which the risk assessment is approved or not approved; it is about the risks that the governance structure poses to children or adults, depending on which section of the Bill we are looking at.

The Minister should consider what is being asked in the amendment, which is about the decision-making level at which the risk assessments are approved. I know the Minister has spoken already, but some clarification would be welcome. Does he expect a junior tech support member of staff, or a junior member of the legal team, to write the risk assessment and then put it in a cupboard? Or perhaps they approve it themselves and then nothing happens with it until Ofcom asks for it. Does he think that Ofcom would look unfavourably on behaviour like that? If he was very clear with us about that, it might put our minds at rest. Does he think that someone in a managerial position or a board member, or the board itself, should take decisions, rather than a very junior member of staff? There is a big spread of people who could be taking decisions. If he could give us an indication of what Ofcom might look favourably on, it would be incredibly helpful for our deliberations.

Chris Philp

I am anxious about time, but I will respond to that point because it is an important one. The hon. Lady is right to say that clause 10(6)(h) looks to identify the risks associated with governance. That is correct—it is a risk assessment. However, in clause 11(2)(a), there is a duty to mitigate those risks, having identified what the risks are. If, as she hypothesised, a very junior person was looking at these matters from a governance point of view, that would be identified as a risk. If it was not, Ofcom would find that the risk assessment was not suitable or sufficient. That would breach clause 10(2), and the service would then be required to mitigate. If it did not mitigate the risks by having a more senior person take the decision, Ofcom would take enforcement action for its failure under clause 11(2)(a).

For the record, should Ofcom or lawyers consult the transcript to ascertain Parliament’s intention in the course of future litigation, it is absolutely the Government’s view, as I think it is the hon. Lady’s, that a suitable level of decision making for a children’s risk assessment would be a very senior level. The official Opposition clearly think that, because they have put it in their amendment. I am happy to confirm that, as a Minister, I think that. Obviously the hon. Lady, who speaks for the SNP, does too. If the transcripts of the Committee’s proceedings are examined in the future to ascertain Parliament’s intention, Parliament’s intention will be very clear.

The Chair

Barbara Keeley, do you have anything to add?

Barbara Keeley

All I have to add is the obvious point—I am sure that we are going to keep running into this—that people should not have to look to a transcript to see what the Minister’s and Parliament’s intention was. It is clear what the Opposition’s intention is—to protect children. I cannot see why the Minister will not specify who in an organisation should be responsible. It should not be a question of ploughing through transcripts of what we have talked about here in Committee; it should be obvious. We have the chance here to do something different and better. The regulator could specify a senior level.

Chris Philp

Clearly, we are legislating here to cover, as I think we said this morning, 25,000 different companies. They all have different organisational structures, different personnel and so on. To anticipate the appropriate level of decision making in each of those companies and put it in the Bill in black and white, in a very prescriptive manner, might not adequately reflect the range of people involved.

15:15
Question put, That the amendment be made.

Division 10

Ayes: 7 (Labour: 5, Scottish National Party: 2)

Noes: 9 (Conservative: 9)

Barbara Keeley

I beg to move amendment 72, in clause 10, page 9, line 24, after “characteristic” insert “or characteristics”.

The Chair

With this it will be convenient to discuss the following:

Amendment 73, in clause 10, page 9, line 24, after “group” insert “or groups”.

Amendment 85, in clause 12, page 12, line 22, leave out subsection (d) and insert—

“(d) the level of risk of harm to adults presented by priority content that is harmful to adults which particularly affects individuals with certain characteristics or members of certain groups;”.

This amendment would recognise the intersectionality of harms.

Amendment 74, in clause 12, page 12, line 24, after “characteristic” insert “or characteristics”.

Amendment 75, in clause 12, page 12, line 24, after “group” insert “or groups”.

Amendment 71, in clause 83, page 72, line 12, at end insert—

“(1A) For each of the above risks, OFCOM shall identify and assess the level of risk of harm which particularly affects people with certain characteristics or membership of a group or groups.”

This amendment requires Ofcom as part of its risk register to assess risks of harm particularly affecting people with certain characteristics or membership of a group or groups.

Barbara Keeley

May I say—this might be a point of order—how my constituency name is pronounced? I get a million different versions, but it is Worsley, as in “worse”. It is an unfortunate name for a great place.

I will speak to all the amendments in the group together, because they relate to how levels of risk are assessed in relation to certain characteristics. The amendments are important because small changes to the descriptions of risk assessment will help to close a significant gap in protection.

Clauses 10 and 12 introduce a duty on regulated companies to assess harms to adults and children who might have an innate vulnerability arising from being a member of a particular group or having a certain characteristic. However, Ofcom is not required to assess harms to people other than children who have that increased innate vulnerability. Amendment 71 would require Ofcom, as part of its risk register, to assess risks of harm particularly affecting people with certain characteristics or membership of a group or groups. That would reduce the regulatory burden, because companies would have Ofcom’s risk assessment to base their own work on.

Getting this right is important. The risk management regime introduced by the Bill should not assume that all people are at the same risk of harm—they are clearly not. Differences in innate vulnerability increase the incidence and impact of harm, such as by increasing the likelihood of encountering content or of that content being harmful, or heightening the impact of the harm.

It is right that the Bill emphasises the vulnerability of children, but there are other, larger groups with innate vulnerability to online harm. As we know, that often reflects structural inequalities in society.

For example, women will be harmed in circumstances where men might not be, and they could suffer some harms that have a more serious impact than they might for men. A similar point can be made for people with other characteristics. Vulnerability is then compounded by intersectional issues—people might belong to more than one high-risk group—and I will come to that in a moment.

The initial Ofcom risk assessment introduced by clause 83 is not required to consider the heightened risks to different groups of people, but companies are required to assess that risk in their own risk assessments for children and adults. They need to be given direction by an assessment by Ofcom, which amendment 71 would require.

Amendments 72 to 75 address the lack of recognition in these clauses of intersectionality issues. They are small amendments in the spirit of the Bill’s risk management regime. As drafted, the Bill refers to a singular “group” or “characteristic” for companies to assess for risk. However, some people are subject to increased risks of harm arising from being members of more than one group. Companies’ risk assessments for children and adults should reflect intersectionality, and not just characteristics taken individually. Including the plural of “group” and “characteristic” in appropriate places would achieve that.

Kirsty Blackman

I will first speak to our amendment 85, which, like the Labour amendment, seeks to ensure that the Bill is crystal clear in addressing intersectionality. We need only consider the abuse faced by groups of MPs to understand why that is necessary. Female MPs are attacked online much more regularly than male MPs, and the situation is compounded if they have another minority characteristic. For instance, if they are gay or black, they are even more likely to be attacked. In fact, the MP who is most likely to be attacked is black and female. There are very few black female MPs, so it is not because of sheer numbers that they are at such increased risk of attack. Those with a minority characteristic are at higher risk of online harm, but the risk facing those with more than one minority characteristic is substantially higher, and that is what the amendment seeks to address.

I have spoken specifically about people being attacked on Twitter, Facebook and other social media platforms, but people in certain groups face an additional significant risk. If a young gay woman does not have a community around her, or if a young trans person does not know anybody else who is trans, they are much more likely to use the internet to reach out, to try to find people who are like them, to try to understand. If they are not accepted by their family, school or workplace, they are much more likely to go online to find a community and support—to find what is out there in terms of assistance—but using the internet as a vulnerable, at-risk person puts them at much more significant risk. This goes back to my earlier arguments about people requiring anonymity to protect themselves when using the internet to find their way through a difficult situation in which they have no role models.

It should not be difficult for the Government to accept this amendment. They should consider it carefully and understand that all of us on the Opposition Benches are making a really reasonable proposal. This is not about saying that someone with only one protected characteristic is not at risk; it is about recognising the intersectionality of risk and the fact that the risk faced by those who fit into more than one minority group is much higher than that faced by those who fit into just one. This is not about taking anything away from the Bill; it is about strengthening it and ensuring that organisations listen.

We have heard that a number of companies are not providing the protection that Members across the House would like them to provide against child sexual abuse. The governing structures, risk assessments, rules and moderation at those sites are better at ensuring that the providers make money than they are at providing protection. When regulated providers assess risk, it is not too much to ask them to consider not just people with one protected characteristic but those with multiple protected characteristics.

As MPs, we work on that basis every day. Across Scotland and the UK, we support our constituents as individuals and as groups. When protected characteristics intersect, we find ourselves standing in Parliament, shouting strongly on behalf of those affected and giving them our strongest backing, because we know that that intersection of harms is the point at which people are most vulnerable, in both the real and the online world. Will the Minister consider widening the provision so that it takes intersectionality into account and not only covers people with one protected characteristic but includes an over and above duty? I genuinely do not think it is too much for us to ask providers, particularly the biggest ones, to make this change.

Chris Philp

Once again, the Government recognise the intent behind these amendments and support the concept that people with multiple intersecting characteristics, or those who are members of multiple groups, may experience—or probably do experience—elevated levels of harm and abuse online compared with others. We completely understand and accept that point, as clearly laid out by the hon. Member for Aberdeen North.

There is a technical legal reason why the singular forms “characteristic” and “group” are used here. Section 6(c) of the Interpretation Act 1978 sets out how words in Bills and Acts are interpreted, namely that words in the singular also cover the plural. That means that references in the singular, such as

“individuals with a certain characteristic”

in clause 10(6)(d), also cover characteristics in the plural. A reference to the singular implies a reference to the plural.

Will those compounded risks, where they exist, be taken into account? The answer is yes, because the assessments must assess the risk in front of them. Where there is evidence that multiple protected characteristics or the membership of multiple groups produce compounded risks, as the hon. Lady set out, the risk assessment has to reflect that. That includes the general sectoral risk assessment carried out by Ofcom, which is detailed in clause 83, and Ofcom will then produce guidance under clause 84.

The critical point is that, because there is evidence of high levels of compounded risk when people have more than one characteristic, that must be reflected in the risk assessment, otherwise it is inadequate. I accept the point behind the amendments, but I hope that that explains, with particular reference to the 1978 Act, why the Bill as drafted covers that valid point.

The Chair

Barbara Keeley?

Barbara Keeley

I have nothing to add. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 10 ordered to stand part of the Bill.

Clause 11

Safety duties protecting children

The Chair

We now come to amendment 95, tabled by the hon. Member for Upper Bann, who is not on the Committee. Does anyone wish to move the amendment? If not, we will move on.

Barbara Keeley

I beg to move amendment 29, in clause 11, page 10, line 20, at end insert—

“(c) prevent the sexual or physical abuse of a child by means of that service.”

This amendment establishes a duty to prevent the sexual or physical abuse of a child by means of a service.

The Chair

With this it will be convenient to discuss amendment 33, in clause 26, page 26, line 18, at end insert—

“(c) prevent the sexual or physical abuse of a child by means of that service.”

This amendment establishes a duty to prevent the sexual or physical abuse of a child by means of a service.

15:29
Barbara Keeley

The purpose of this clause is to ensure that children at risk of online harms are given protections from harmful, age-inappropriate content through specific children’s safety duties for user-to-user services likely to be accessed by children.

It is welcome that the Bill contains strong provisions to ensure that service providers act upon and mitigate the risks identified in the required risk assessment, and to introduce protective systems and processes to address what children encounter. This amendment aims to ensure that online platforms are proactive in their attempts to mitigate the opportunity for sex offenders to abuse children.

As we have argued with other amendments, there are missed opportunities in the Bill to be preventive in tackling the harm that is created. The sad reality is that online platforms create an opportunity for offenders to identify, contact and abuse children, and to do so in real time through livestreaming. We know there has been a significant increase in online sexual exploitation during the pandemic. With sex offenders unable to travel or have physical contact with children, online abuse increased significantly.

In 2021, UK law enforcement received a record 97,727 industry reports relating to online child abuse, a 29% increase on the previous year, which is shocking. An NSPCC freedom of information request to police forces in England and Wales last year showed that online grooming offences reached record levels in 2020-21, with the number of sexual communications with a child offences in England and Wales increasing by almost 70% in three years. There has been a deeply troubling trend in internet-facilitated abuse towards more serious sexual offences against children, and the average age of children in child abuse images, particularly girls, is trending to younger ages.

In-person contact abuse moved online because of the opportunity there for sex offenders to continue exploiting children. Sadly, they can do so with little fear of the consequences, because detection and disruption of livestreamed abuse is so low. The duty to protect children from sexual offenders abusing them in real time and livestreaming their exploitation cannot be limited to one part of the internet and tech sector. While much of the abuse might take place on the user-to-user services, it is vital that protections against such abuse are strengthened across the board, including in the search services, as set out in clause 26.

At the moment there is no list of harms in the Bill that must be prioritised by regulated companies. The NSPCC and others have suggested including a new schedule, similar to schedule 7, setting out what the primary priority harms should be. It would be beneficial for the purposes of parliamentary scrutiny for us to consider the types of priority harm that the Government intend the Bill to cover, rather than leaving that to secondary legislation. I hope the Minister will consider that and say why it has not yet been included.

To conclude, while we all hope the Bill will tackle the appalling abuse of children currently taking place online, this cannot be achieved without tackling the conditions in which these harms can take place. It is only by requiring that steps be taken across online platforms to limit the opportunities for sex offenders to abuse children that we can see the prevalence of this crime reduced.

Dame Maria Miller

I rise, hopefully to speak to clause 11 more generally—or will that be a separate stand part debate, Ms Rees?

The Chair

That is a separate debate.

Dame Maria Miller

My apologies. I will rise later.

Chris Philp

The Government obviously support the objective of these amendments, which is to prevent children from suffering the appalling sexual and physical abuse that the hon. Member for Worsley and Eccles South outlined in her powerful speech. It is shocking that these incidents have risen in the way that she described.

To be clear, that sort of appalling sexual abuse is covered in clause 9—which we have debated already—which covers illegal content. As Members would expect, child sexual abuse is defined as one of the items of priority illegal content, which are listed in more detail in schedule 6, where the offences that relate to sexual abuse are enumerated. As child sexual exploitation is a priority offence, services are already obliged through clause 9 to be “proactive” in preventing it from happening. As such, as Members would expect, the requirements contained in these amendments are already delivered through clause 9.

The hon. Member for Worsley and Eccles South also asked when we are going to hear what the primary priority harms to children might be. To be clear, those will not include the sexual exploitation offences, because as Members would also expect, those are already in the Bill as primary illegal offences. The primary priority harms might include material promoting eating disorders and that kind of thing, which is not covered by the criminal matters—the illegal matters. I have heard the hon. Lady’s point that if that list were to be published, or at least a draft list, that would assist Parliament in scrutinising the Bill. I will take that point away and see whether there is anything we can do in that area. I am not making a commitment; I am just registering that I have heard the point and will take it away.

Barbara Keeley

I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss clause 26 stand part.

Dame Maria Miller

I rise to speak to clause 11, because this is an important part of the Bill that deals with the safety duties protecting children. Many of us here today are spurred on by our horror at the way in which internet providers, platform providers and search engines have acted over recent years, developing their products with no regard for the safety of children, so I applaud the Government for bringing forward this groundbreaking legislation. They are literally writing the book on this, but in doing so, we have to be very careful about the language we use and the way in which we frame our requirements of these organisations. The Minister has rightly characterised these organisations as being entirely driven by finance, not the welfare of their consumers, which must make them quite unique in the world. I can only hope that that will change: presumably, over time, people will not want to use products that have no regard for the safety of those who use them.

In this particular part of the Bill, the thorny issue of age assurance comes up. I would value the Minister’s views on some of the evidence that we received during our evidence sessions about how we ensure that age assurance is effective. Some of us who have been in this place for a while would be forgiven for thinking that we had already passed a law on age assurance. Unfortunately, that law did not seem to come to anything, so let us hope that second time is lucky. The key question is: who is going to make sure that the age assurance that is in place is good enough? Clause 11(3) sets out

“a duty to operate a service using proportionate systems and processes”

that are designed to protect children, but what is a proportionate system? Who is going to judge that? Presumably it will be Ofcom in the short term, and in the long term, I am sure the courts will get involved.

In our evidence, we heard some people advocating very strongly for these sorts of systems to be provided by third parties. I have to say, in a context where we are hearing how irresponsible the providers of these services are, I can understand why people would think that a third party would be a more responsible way forward. Can the Minister help the Committee understand how Ofcom will ensure that the systems used, particularly the age assurance systems, are proportionate—I do not particularly like that word; I would like those systems to be brilliant, not proportionate—and are actually doing what we need them to do, which is safeguard children? For the record, and for the edification of judges who are looking at this matter in future—and, indeed, Ofcom—will he set out how important this measure is within the Bill?

Chris Philp

I thank my right hon. Friend for her remarks, in which she powerfully and eloquently set out how important the clause is to protecting children. She is right to point out that this is a critical area in the Bill, and it has wide support across the House. I am happy to emphasise, for the benefit of those who may study our proceedings in future, that protecting children is probably the single most important thing that the Bill does, which is why it is vital that age-gating, where necessary, is effective.

My right hon. Friend asked how Ofcom will judge whether the systems under clause 11(3) are proportionate to

“prevent children of any age from encountering”

harmful content and so on. Ultimately, the proof of the pudding is in the eating; it has to be effective. When Ofcom decides whether a particular company or service is meeting the duty set out in the clause, the simple test will be one of effectiveness: is it effective and does it work? That is the approach that I would expect Ofcom to take; that is the approach that I would expect a court to take. We have specified that age verification, which is the most hard-edged type of age assurance—people have to provide a passport or something of that nature—is one example of how the duty can be met. If another, less-intrusive means is used, it will still have to be assessed as effective by Ofcom and, if challenged, by the courts.

I think my right hon. Friend was asking the Committee to confirm to people looking at our proceedings our clear intent for the measures to be effective. That is the standard to which we expect Ofcom and the courts to hold those platforms in deciding whether they have met the duties set out in the clause.

Dame Maria Miller

For clarification, does the Minister anticipate that Ofcom might be able to insist that a third-party provider be involved if there is significant evidence that the measures put in place by a platform are ineffective?

Chris Philp

We have deliberately avoided being too prescriptive about precisely how the duty is met. We have pointed to age verification as an example of how the duty can be met without saying that that is the only way. We would not want to bind Ofcom’s hands, or indeed the hands of platforms. Clearly, using a third party is another way of delivering the outcome. If a platform were unable to demonstrate to Ofcom that it could deliver the required outcome using its own methods, Ofcom may well tell it to use a third party instead. The critical point is that the outcome must be delivered. That is the message that the social media firms, Ofcom and the courts need to hear when they look at our proceedings. That is set out clearly in the clause. Parliament is imposing a duty, and we expect all those to whom the legislation applies to comply with it.

Question put and agreed to.

Clause 11 accordingly ordered to stand part of the Bill.

Clause 12

Adults’ risk assessment duties

Alex Davies-Jones

I beg to move amendment 12, in clause 12, page 12, line 10, at end insert—

“(4A) A duty to publish the adults’ risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish the adults’ risk assessment and supply it to Ofcom.

The Chair

With this it will be convenient to discuss clause stand part.

Alex Davies-Jones

The amendment creates a duty to publish the adults’ risk assessment and supply it to Ofcom. As my hon. Friend the Member for Worsley and Eccles South remarked when addressing clause 10, transparency and scrutiny of those all-important risk assessments must be at the heart of the Online Safety Bill. We all know that the Government have had a hazy record on transparency lately but, for the sake of all in the online space, I sincerely hope that the Minister will see the value in ensuring that the risk assessments are accurate, proactively supplied and published for us all to consider.

It is only fair that all the information about risks to personal safety be made available to users of category 1 services, which we know are the most popular and, often, the most troublesome services. We all want people to feel compelled to make their own decisions about their behaviour both online and offline. That is why we are pushing for a thorough approach to risk assessments more widely. Also, without a formal duty to publicise those risk assessments, I fear there will be little change in our safety online. The Minister has referenced that the platforms will be looking back at Hansard in years to come to determine whether or not they should be doing the right thing. Unless we make that a statutory obligation within the Bill, I fear that reference will fall on deaf ears.

15:45
The Government have made some positive steps towards keeping children safe online. Sadly, the same cannot be said for adults. We need to be careful when we formally differentiate between children and adults, because age, as they say, is but a number. A 17-year-old will obviously fall short of being legally deemed an adult in this country, but an 18-year-old, who only a few months or even a day earlier was 17, should have exactly the same protections. Platforms should of course be required to protect adults too.

We have seen what years of no accountability have done to the online space. My hon. Friend referred to Frances Haugen’s experiences at Meta, which we all heard about recently in evidence sessions—none of it filled me with confidence. We know that those category 1 companies have the information, but they will not feel compelled to publish it until there is a statutory duty to do so. The Minister knows that would be an extremely welcome move; he would be commended by academics, stakeholders, parliamentarians and the public alike. Why exactly does that glaring omission still remain? If the Minister cannot answer me fully, and instead refers to platforms looking to Hansard in the future, then I am keen to press this amendment to a Division. I cannot see the benefits of withholding those risk assessments from the public and academics.

Chris Philp

Once again, I agree with the point about transparency and the need to have those matters brought into the light of day. We heard from Frances Haugen how Facebook—now Meta—actively resisted doing so. However, I point to two provisions already in the Bill that deliver precisely that objective. I know we are debating clause 12, but there is a duty in clause 13(2) for platforms to publish in their terms of service—a public document—the findings of the most recent adult risk assessment. That duty is in clause 13—the next clause we are going to debate—in addition to the obligations I have referred to twice already in clause 64, where Ofcom compels those firms to publish their transparency reports. I agree with the points that the shadow Minister made, but suggest that through clause 13(2) and clause 64, those objectives are met in the Bill as drafted.

Alex Davies-Jones

I thank the Minister for his comments, but sadly we do not feel that is appropriate or robust enough, which is why we will be pressing the amendment to a Division.

Question put, That the amendment be made.

The Committee divided.

Division 11

Ayes: 5 (Labour: 5)

Noes: 9 (Conservative: 9)

Clause 12 ordered to stand part of the Bill.

Clause 13

Safety duties protecting adults

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

While I am at risk of parroting my hon. Friend the Member for Worsley and Eccles South on clause 11, it is important that adults and the specific risks they face online are considered in the clause. The Minister knows we have wider concerns about the specific challenges of the current categorisation system. I will come on to that at great length later, but I thought it would be helpful to remind him at this relatively early stage that the commitments to safety and risk assessments for category 1 services will only work if category 1 encapsulates the most harmful platforms out there. That being said, Labour broadly supports this clause and has not sought to amend it.

Chris Philp

I am eagerly awaiting the lengthy representations that the shadow Minister just referred to, as are, I am sure, the whole Committee and indeed the millions watching our proceedings on the live broadcast. As the shadow Minister said, clause 13 sets out the safety duties in relation to adults. This is content that is legal but potentially harmful to adults, and for those topics specified in secondary legislation, it will require category 1 services to set out clearly what actions they might be taking—from the actions specified in subsection (4)—in relation to that content.

It is important to specify that the action they may choose to take is a choice for the platform. I know some people have raised issues concerning free speech and these duties, but I want to reiterate and be clear that this is a choice for the platform. They have to be publicly clear about what choices they are making, and they must apply those choices consistently. That is a significant improvement on where we are now, where some of these policies get applied in a manner that is arbitrary.

Question put and agreed to.

Clause 13 accordingly ordered to stand part of the Bill.

Clause 14

User empowerment duties

Alex Davies-Jones

I beg to move amendment 46, in clause 14, page 14, line 12, after “non-verified users” insert

“and to enable them to see whether another user is verified or non-verified.”

This amendment would make it clear that, as part of the User Empowerment Duty, users should be able to see which other users are verified and which are non-verified.

The Chair

With this it will be convenient to discuss the following:

Amendment 47, in clause 189, page 155, line 1, at end insert

“‘Identity Verification’ means a system or process designed to enable a user to prove their identity, for purposes of establishing that they are a genuine, unique, human user of the service and that the name associated with their profile is their real name.”

This amendment adds a definition of Identity Verification to the terms defined in the Bill.

New clause 8—OFCOM’s guidance about user identity verification

“(1) OFCOM must produce guidance for providers of Category 1 services on how to comply with the duty set out in section 57(1).

(2) In producing the guidance (including revised or replacement guidance), OFCOM must have regard to—

(a) ensuring providers offer forms of identity verification which are likely to be accessible to vulnerable adult users and users with protected Characteristics under the Equality Act 2010,

(b) promoting competition, user choice, and interoperability in the provision of identity verification,

(c) protection of rights, including rights to privacy, freedom of expression, safety, access to information, and the rights of children,

(d) alignment with other relevant guidance and regulation, including with regards to Age Assurance and Age Verification.

(3) In producing the guidance (including revised or replacement guidance), OFCOM must set minimum standards for the forms of identity verification which Category services must offer, addressing—

(a) effectiveness,

(b) privacy and security,

(c) accessibility,

(d) time-frames for disclosure to Law Enforcement in case of criminal investigations,

(e) transparency for the purposes of research and independent auditing,

(f) user appeal and redress mechanisms.

(4) Before producing the guidance (including revised or replacement guidance), OFCOM must consult—

(a) the Information Commissioner,

(b) the Digital Markets Unit,

(c) persons whom OFCOM consider to have technological expertise relevant to the duty set out in section 57(1),

(d) persons who appear to OFCOM to represent the interests of users including vulnerable adult users of Category 1 services, and

(e) such other persons as OFCOM considers appropriate.

(5) OFCOM must publish the guidance (and any revised or replacement guidance).”

This new clause would require Ofcom to set a framework of principles and minimum standards for the User Verification Duty.

Alex Davies-Jones

The revised Bill seeks to address the problems associated with anonymity through requiring platforms to empower users, with new options to verify their identity and filter out non-verified accounts. This is in line with the approach recommended by Clean Up The Internet and also reflects the approach proposed in the Social Media Platforms (Identity Verification) Bill, which was tabled by the hon. Member for Stroud (Siobhan Baillie) and attracted cross-party support. It has the potential to strike a better balance between tackling the clear role that anonymity can play in fuelling abuse and disinformation, while safeguarding legitimate uses of anonymity, including by vulnerable users, for whom anonymity can act as a protection. However, Labour does share the concerns of stakeholders around the revised Bill, which we have sought to amend.

Amendment 46 aims to empower people to use this information about verification when making judgments about the reliability of other accounts and the content they share. This would ensure that the user verification duty helps disrupt the use of networks of inauthentic accounts to spread disinformation. Labour welcomes the inclusion in the revised Bill of measures designed to address harm associated with misuse of anonymous social media accounts. There is considerable evidence from Clean Up The Internet and others that anonymity fuels online abuse, bullying and trolling and that it is one of the main tools used by organised disinformation networks to spread and amplify false, extremist and hateful content.

The revised Bill seeks to address the problems associated with anonymity, by requiring platforms to empower users with new options to verify their identity and to filter out non-verified accounts. In doing so, it has the potential to strike a better balance between tackling the clear role that anonymity can play in fuelling abuse and misinformation while safeguarding legitimate users of anonymity, including vulnerable users, for whom anonymity acts as a protection.

Clause 14 falls short of truly empowering people to make the most well-informed decisions about the type of content they engage with. We believe that this could be simple, and a simple change from a design perspective. Category 1 platforms are already able to verify different types of accounts, whether they be personal or business accounts, so ensuring that people are equipped with this information more broadly would be an easy step for the big platforms to make. Indeed, the Joint Committee’s prelegislative scrutiny recommended that the Government consider, as part of Ofcom’s code of practice, a requirement for the largest and highest-risk platforms to offer the choice of verified or unverified status and user options on how they interact with accounts in either category.

I know that there are concerns about verification, and there is a delicate balance between anonymity, free speech and protecting us all online. I somewhat sympathise with the Minister in being tasked with bringing forward this complex legislation, but the options for choosing what content and users we do and do not engage with are already there on most platforms. On Twitter, we are able to mute accounts—I do so regularly—or keywords that we want to avoid. Similarly, we can restrict individuals on Instagram.

In evidence to the Joint Committee, the Secretary of State said that the first priority of the draft Bill was to end all online abuse, not just that from anonymous accounts. Hopes were raised about the idea of giving people the option to limit their interaction with anonymous or non-verified accounts. Clearly, the will is there, and the amendment ensures that there is a way, too. I urge the Minister to accept the amendment, if he is serious about empowering users across the United Kingdom.

Now I move on to amendment 47. As it stands, the Bill does not adequately define “verification” or set minimum standards for how it will be carried out. There is a risk that platforms will treat this as a loophole in order to claim that their current, wholly inadequate processes count as verification. We also see entirely avoidable risks of platforms developing new verification processes that fail to protect users’ privacy and security or which serve merely to extend their market dominance to the detriment of independent providers. That is why it is vital that a statutory definition of identity verification is placed in the Bill.

I have already spoken at length today, and I appreciate that we are going somewhat slowly on the Bill, but it is complex legislation and this is an incredibly important detail that we need to get right if the Bill is to be truly world leading. Without a definition of identity verification, I fear that we are at risk of allowing technology, which can easily replicate the behaviours of a human being, to run rife, which would essentially invalidate the process of verification entirely.

I have also spoken at length about my concerns relating to AI technologies, the lack of future proofing in the Bill and the concerns that could arise in the future. I am sure that the Minister is aware that that could have devastating impacts on our democracy and our online safety more widely.

New clause 8 would ensure that the user empowerment duty and user verification work as intended by simply requiring Ofcom to set out principles and minimum standards for compliance. We note that the new clause is entirely compatible with the Government’s stated aims for the Bill and would provide a clearer framework for both regulated companies and the regulator. By its very nature, it is vital that in preparing the guidance Ofcom must ensure that the delicate balance that I touched on earlier between freedom of expression, the right to privacy and safety online is kept in mind throughout.

We also felt it important that, in drawing up the guidance, a collaborative approach should be taken. Regulating the online space is a mammoth task, and while we have concerns about Ofcom’s independence, which I will gladly touch on later, we also know that it will be best for us all if Ofcom is required to draw on the expertise of other organisations in doing so.

None Portrait The Chair
- Hansard -

There is a Division in the House, so I will suspend the sitting for 15 minutes.

15:58
Sitting suspended for a Division in the House.
16:13
On resuming—
None Portrait The Chair
- Hansard -

If no other Member would like to speak to amendment 46, I call the Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I would be delighted to speak to the amendment, which would change the existing user empowerment duty in clause 14 to require category 1 services to enable adult users to see whether other users are verified. In effect, however, that objective already follows as a natural consequence of the duty in clause 14(6). When a user decides to filter out non-verified users, by definition such users will be able to see content only from verified users, so they could see from that who was verified and who was not. The effect intended by the amendment, therefore, is already achieved through clause 14(6).

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am sorry to disagree with the Minister so vigorously, but that is a rubbish argument. It does not make any sense. There is a difference between wanting to filter out everybody who is not verified and wanting to actually see if someone who is threatening someone else online is a verified or a non-verified user. Those are two very different things. I can understand why a politician, for example, might not want to filter out unverified users but would want to check whether a person was verified before going to the police to report a threat.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

When it comes to police investigations, if something is illegal and merits a report to the police, users should report it, regardless of whether someone is verified or not—whatever the circumstances. I would encourage any internet user to do that. That effectively applies on Twitter already; some people have blue ticks and some people do not, and people should report others to the police if they do something illegal, whether or not they happen to have a blue tick.

Amendment 47 seeks to create a definition of identity verification in clause 189. In addition, it would compel the person’s real name to be displayed. I understand the spirit of the amendment, but there are two reasons why I would not want to accept it and would ask hon. Members not to press it. First, the words “identity verification” are ordinary English words with a clear meaning and we do not normally define in legislation ordinary English words with a clear meaning. Secondly, the amendment would add the new requirement that, if somebody is verified, their real name has to be displayed, but I do not think that that is the effect of the drafting as it stands. Somebody may be verified, and the company knows who they are—if the police go to the company, they will have the verified information—but there is no obligation, as the amendment is drafted, for that information to be displayed publicly. The effect of that part of the amendment would be to force users to choose between disclosing their identity to everyone or having no control over who they interact with. That may not have been the intention, but I am not sure that this would necessarily make sense.

New clause 8 would place requirements on Ofcom about how to produce guidance on user identity verification and what that guidance must contain. We already have provisions on that in clause 58, which we will no doubt come to, although probably not later on today—maybe on Thursday. Clause 58 allows Ofcom to include in its regulatory guidance the principles and standards referenced in the new clause, which can then assist service providers in complying with their duties. Of course, if they choose to ignore the guidelines and do not comply with their duties, they will be subject to enforcement action, but we want to ensure that there is flexibility for Ofcom, in writing those guidelines, and for companies, in following those guidelines or taking alternative steps to meet their duty.

This morning, a couple of Members talked about the importance of remaining flexible and being open to future changes in technology and a wide range of user needs. We want to make sure that flexibility is retained. As drafted, new clause 8 potentially undermines that flexibility. We think that the powers set out in clause 58 give Ofcom the ability to set the relevant regulatory guidance.

Clause 14 implements the proposals made by my hon. Friend the Member for Stroud in her ten-minute rule Bill and the proposals made, as the shadow Minister has said, by a number of third-party stakeholders. We should all welcome the fact that these new user empowerment duties have now been included in the Bill in response to such widespread parliamentary lobbying.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful to the Minister for giving way. I want to recount my own experience on this issue. He mentioned that anybody in receipt of anonymous abuse on social media should report it to the police, especially if it is illegal. On Thursday, I dared to tweet my opinions on the controversial Depp-Heard case in America. As a result of putting my head above the parapet, my Twitter mentions were an absolute sewer of rape threats and death threats, mainly from anonymous accounts. My Twitter profile was mocked up—I had devil horns and a Star of David on my forehead. It was vile. I blocked, deleted and moved on, but I also reported those accounts to Twitter, especially those that sent me rape threats and death threats.

That was on Thursday, and to date no action has been taken and I have not received any response from Twitter about any of the accounts I reported. The Minister said they should be reported to the police. If I reported all those accounts to the police, I would still be there now reporting them. How does he anticipate that this will be resourced so that social media companies can tackle the issue? That was the interaction resulting from just one tweet that I sent on Thursday, and anonymous accounts sent me a barrage of hate and illegal activity.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister raises a very good point. Of course, what she experienced on Twitter was despicable, and I am sure that all members of the Committee would unreservedly condemn the perpetrators who put that content on there. Once the Bill is passed, there will be legal duties on Twitter to remove illegal content. At the moment, they do not exist, and there is no legal obligation for Twitter to remove that content, even though much of it, from the sound of it, would cross one of various legal thresholds. Perhaps some messages qualify as malicious communication, and others might cross other criminal thresholds. That legal duty does not exist at the moment, but when this Bill passes, for the first time there will be that duty to protect not just the shadow Minister but users across the whole country.

Question put, That the amendment be made.

Division 12

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 8


Conservative: 8

Clause 14 ordered to stand part of the Bill.
Clause 15
Duties to protect content of democratic importance
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I beg to move amendment 105, in clause 15, page 14, line 33, after “ensure” insert “the safety of people involved in UK elections and”.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 106, in clause 37, page 25, line 31, at end insert—

‘(2A) OFCOM must prepare and issue a code of practice for providers of Category 1 and 2(a) services describing measures recommended for the purpose of compliance with duties set out in section 15 concerning the safety of people taking part in elections.”

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I rise to speak to amendments 105 and 106, in my name, on protecting democracy and democratic debate.

Within the Bill, there are significant clauses intended to prevent the spread of harm online, to protect women and girls against violence and to help prevent child sexual exploitation, while at the same time protecting the right of journalists to do their jobs. Although those clauses are not perfect, I welcome them.

The Bill is wide-ranging. The Minister talked on Second Reading about the power in clause 150 to protect another group—those with epilepsy—from being trolled with flashing images. That subject is close to my heart due to the campaign for Zach’s law—Zach is a young boy in my constituency. I know we will return to that important issue later in the Committee, and I thank the Minister for his work on it.

In protecting against online harm while preserving fundamental rights and values, we must also address the threats posed to those involved in the democratic process. Let me be clear: this is not self-serving. It is about not just MPs but all political candidates locally and nationally and those whose jobs facilitate the execution of our democratic process and political life: the people working on elections or for those elected to public office at all levels across the UK. These people must be defended from harm not only for their own protection, but to protect our democracy itself and, with it, the right of all our citizens to a political system capable of delivering on their priorities free from threats and intimidation.

Many other groups in society are also subjected to a disproportionate amount of targeted abuse, but those working in and around politics sadly receive more than almost any other people in this country, with an associated specific set of risks and harms. That does not mean messages gently, or even firmly, requesting us to vote one way or another—a staple of democratic debate—but messages of hate, abuse and threats intended to scare people in public office, grind them down, unfairly influence their voting intentions or do them physical and psychological harm. That simply cannot be an acceptable part of political life.

As I say, we are not looking for sympathy, but we have a duty to our democracy to try to stamp that out from our political discourse. Amendment 105 would not deny anybody the right to tell us firmly where we are going wrong—quite right, too—but it is an opportunity to draw the essential distinction between legitimately holding people in public life to account and illegitimate intimidation and harm.

The statistics regarding the scale of online abuse that MPs receive are shocking. In 2020, a University of Salford study found that MPs received over 7,000 abusive or hate-filled tweets a month. Seven thousand separate messages of harm a month on Twitter alone directed at MPs is far too many, but who in this room does not believe that the figure is almost certainly much higher today? Amnesty conducted a separate study in 2017 looking at the disproportionate amount of abuse that women and BAME MPs faced online, finding that my right hon. Friend the Member for Hackney North and Stoke Newington (Ms Abbott) was the recipient of almost a third of all the abusive tweets analysed, as alluded to already by the hon. Member for Edinburgh—

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Aberdeen North.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I knew that. [Laughter.]

Five years later, we continue to see significant volumes of racist, sexist and homophobic hate-filled abuse and threats online to politicians of all parties. That is unacceptable in itself, but we must ask whether this toxic environment helps to keep decent people in politics or, indeed, attracts good people into politics, so that our democracy can prosper into the future across the political spectrum. The reality we face is that our democracy is under attack online each and every day, and every day we delay acting is another day on which abuse becomes increasingly normalised or is just seen as part of the job for those who have put themselves forward for public service. This form of abuse harms society as a whole, so it deserves specific consideration in the Bill.

While elected Members and officials are not a special group of people deserving of more legal protections than anyone else, we must be honest that the abuse they face is distinct and specific to those roles and directly affects our democracy itself. It can lead to the most serious physical harm, with two Members of Parliament having been murdered in the last six years, and many others face death threats or threats of sexual or other violence on a daily basis. However, this is not just about harm to elected representatives; online threats are often seen first, and sometimes only, by their members of staff. They may not be the intended target, but they are often the people harmed most. I am sure we all agree that that is unacceptable and cannot continue.

All of us have probably reported messages and threats to social media platforms and the police, with varying degrees of success in terms of having them removed or the individuals prosecuted. Indeed, we sadly heard examples of that from my hon. Friend the shadow Minister. Often we are told that nothing can be done. Currently, the platforms look at their own rules to determine what constitutes freedom of speech or expression and what is hateful speech or harm. That fine line moves. There is no consistency across platforms, and we therefore urgently need more clarity and a legal duty in place to remove that content quickly.

Amendment 105 would explicitly include in the Bill protection and consideration for those involved in UK elections, whether candidates or staff. Amendment 106 would go further and place an obligation on Ofcom to produce a code of practice, to be issued to the platforms. It would define what steps platforms must take to protect those involved in elections and set out what content is acceptable or unacceptable to be directed at them.

16:30
While I am cautious about heaping responsibility on Ofcom and I remain nervous about the Government’s willingness to leave more and more contentious issues for it to deal with, I believe that that is a reasonable step. It would allow Ofcom to outline what steps a platform must take to protect democratic debate and to set out acceptable and unacceptable content in the context of our ever-changing political landscape. That form of nuance would need to be regularly updated, so it clearly would not be practical to put it in the Bill.
Let us be honest: will this amendment solve the issue entirely? No. However, does more need to be done to protect our democracy? Yes. I am in constant conversation with people and organisations in this sector about what else could be brought forward to assist the police and the Crown Prosecution Service in prosecuting those who wish to harm those elected to public office—both online and offline. Directly addressing the duty of platforms to review content, remove harmful speech and report those who wish to do harm would, I believe, be a positive first step towards protecting our democratic debate and defending those who work to make it effective on behalf of the people of the United Kingdom.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I want to make a few comments on the amendment. As a younger female parliamentarian, I find that I am often asked to speak to young people about becoming an MP or getting involved in politics. I find it difficult to say to young women, “Yes, you should do this,” and most of the reason for that is what people are faced with online. It is because a female MP cannot have a Twitter account without facing abuse. I am sure male MPs do as well, but it tends to be worse for women.

We cannot engage democratically and with constituents on social media platforms without receiving abuse and sometimes threats as well. It is not just an abusive place to be—that does not necessarily meet the threshold for illegality—but it is pretty foul and toxic. There have been times when I have deleted Twitter from my phone because I just need to get away from the vile abuse that is being directed towards me. I want, in good conscience, to be able to make an argument to people that this is a brilliant job, and it is brilliant to represent constituents and to make a difference on their behalf at whatever level of elected politics, but right now I do not feel that I am able to do that.

When my footballing colleague, the hon. Member for Batley and Spen, mentions “UK elections” in the amendment, I assume she means that in the widest possible way—elections at all levels.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

indicated assent.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Sometimes we miss out the fact that although MPs face abuse, we have a level of protection as currently elected Members. Even if there were an election coming up, we have a level of security protection and access that is much higher than for anybody else challenging a candidate or standing in a council or a Scottish Parliament election. As sitting MPs, we already have an additional level of protection because of the security services we have in place. We need to remember, and I assume this is why the amendment is drawn in a pretty broad way, that everybody standing for any sort of elected office faces significant risk of harm—again, whether or not that meets the threshold for illegality.

There are specific things that have been mentioned. As has been said, epilepsy is specifically mentioned as an area where specific harm occurs. Given the importance of democracy, which is absolutely vital, we need a democratic system in which people are able to stand in elections and make their case. That is why we have election addresses and a system whereby the election address gets delivered through every single person’s door. There is an understanding and acceptance by people involved in designing democratic processes that the message of all candidates needs to get out there. If the message of all candidates cannot get out there because some people are facing significant levels of abuse online, then democracy is not acting in the way that it should be. These amendments are fair and make a huge amount of sense. They protect the most important tenets of democracy and democratic engagement.

I want to say something about my own specific experiences. We have reported people to the police and have had people in court over the messages they have sent, largely by email, which would not be included in the Bill, but there have also been some pretty creepy ones on social media that have not necessarily met the threshold. As has been said, it is my staff who have had to go to court and stand in the witness box to explain the shock and terror they have felt on seeing the email or the communication that has come in, so I think any provision should include that.

Finally, we have seen situations where people working in elections—this is not an airy-fairy notion, but something that genuinely happened—have been photographed and those pictures have been shared on social media, and they have then been abused as a result. They are just doing their job, handing out ballot papers or standing up and announcing the results on the stage, and they have to abide by the processes that are in place now. In order for us to have free and fair elections that are run properly and that people want to work at and support, we need to have that additional level of protection. The hon. Member for Batley and Spen made a very reasonable argument and I hope the Minister listened to it carefully.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have listened very carefully to both the hon. Member for Batley and Spen and the hon. Member for Aberdeen North. I agree with both of them that abuse and illegal activity directed at anyone, including people running for elected office, is unacceptable. I endorse and echo the comments they made in their very powerful and moving speeches.

In relation to the technicality of these amendments, what they are asking for is in the Bill already, but in different places. This clause is about protecting content of “democratic importance” and is concerned with stopping social media firms deleting content through over-zealous takedown. What the hon. Members are talking about is different. They are talking about the abuse and illegal activities, such as rape threats, that people—particularly female MPs, as they both pointed out—receive on social media. I can point to two other places in the Bill where what they are asking for is delivered.

First, there are the duties around illegal content that we debated this morning. If there is content online that is illegal—some of the stuff that the shadow Minister referred to earlier sounds as if it would meet that threshold—then in the Bill there is a duty on social media firms to remove that content and to proactively prevent it if it is on the priority list. The route to prosecution will exist in future, as it does now, and the user-verification measures, if a user is verified, make it more likely for the police to identify the person responsible. In the context of identifying people carrying out abuse, I know the Home Office is looking at the Investigatory Powers Act 2016 as a separate piece of work that speaks to that issue.

So illegal content is dealt with in the illegal content provisions in the Bill, but later we will come to clause 150, which updates the Malicious Communications Act 1988 and creates a new harmful communications offence. Some of the communications that have been described may not count as a criminal offence under other parts of criminal law, but if they meet the test of harmful communication in clause 150, they will be criminalised and will therefore have to be taken down, and prosecution will be possible. In meeting the very reasonable requests that the hon. Members for Batley and Spen and for Aberdeen North have made, I would point to those two parts of the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

But clause 150(5) says that if a message

“is, or is intended to be, a contribution to a matter of public interest”,

people are allowed to send it, which basically gives everybody a get-out clause in relation to anything to do with elections.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

No, it does not.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I know we are not discussing that part of the Bill, and if the Minister wants to come back to this when we get to clause 150, I have no problem with that.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I will answer the point now, as it has been raised. Clause 150 categorically does not give a get-out-of-jail-free card or provide an automatic excuse. Clearly, there is no way that abusing a candidate for elected office with rape threats and so on could possibly be considered a matter of public interest. In fact, even if the abuse somehow could be considered as possibly contributing to public debate, clause 150(5) says explicitly in line 32 on page 127:

“but that does not determine the point”.

Even where there is some potentially tenuous argument about a contribution to a matter of public interest, which most definitely would not be the case for the rape threats that have been described, that is not determinative. It is a balancing exercise that gets performed, and I hope that puts the hon. Lady’s mind at rest.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

The Minister makes a really valid point and is right about the impact on the individual. The point I am trying to make with the amendments is that this is about the impact on the democratic process, which is why I think it fits in with clause 15. It is not about how individuals feel; it is about the impact that that has on behaviours, and about putting the emphasis and onus on platforms to decide what is of democratic importance. In the evidence we had two weeks ago, the witnesses certainly did not feel comfortable with putting the onus on platforms. If we were to have a code of practice, we would at least give them something to work with on the issue of what is of democratic importance. It is about the impact on democracy, not just the harm to the individual involved.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clearly, if a communication is sufficiently offensive that it meets the criminal threshold, it is covered, and that would obviously harm the democratic process as well. If a communication was sufficiently offensive that it breached the harmful communication offence in clause 150, it would also, by definition, harm the democratic process, so communications that are damaging to democracy would axiomatically be caught by one thing or the other. I find it difficult to imagine a communication that might be considered damaging to democracy but that would not meet one of those two criteria, so that it was not illegal and would not meet the definition of a harmful communication.

My main point is that the existing provisions in the Bill address the kinds of behaviours that were described in those two speeches—the illegal content provisions, and the new harmful communication offence in clause 150. On that basis, I hope the hon. Member for Batley and Spen will withdraw the amendment, safe in the knowledge that the Bill addresses the issue that she rightly and reasonably raises.

Question put, That the amendment be made.

Division 13

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

Question proposed, That the clause stand part of the Bill.
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause 16 stand part.

New clause 7—Report on duties to protect content of democratic importance and journalistic content

“(1) The Secretary of State must publish a report which—

(a) reviews the extent to which Category 1 services have fulfilled their duties under—

(i) Clause 15; and

(ii) Clause 16;

(b) analyses the effectiveness of Clauses 15 and 16 in protecting against—

(i) foreign state actors;

(ii) extremist groups and individuals; and

(iii) sources of misinformation and disinformation.

(2) The report must be laid before Parliament within one year of this Act being passed.”

This new clause would require the Secretary of State to publish a report reviewing the effectiveness of Clauses 15 and 16.

16:44
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I will speak to clauses 15 and 16 and to new clause 7. The duties outlined in the clause, alongside clause 16, require platforms to have special terms and processes for handling journalistic and democratically important content. In respect of journalistic content, platforms are also required to provide an expedited appeals process for removed posts, and terms specifying how they will define journalistic content. There are, however, widespread concerns about both those duties.

As the Bill stands, we feel that there is too much discretion for platforms. They are required to define “journalistic” content, a role that they are completely unsuited to and, from what I can gather, do not want. In addition, the current drafting leaves the online space open to abuse. Individuals intent on causing harm are likely to try to take advantage of either of those duties, masquerading as journalists or claiming democratic importance for whatever harm they are causing—and that could apply to almost anything. In the evidence sessions, we also heard concerns, expressed brilliantly by Kyle Taylor from Fair Vote and Ellen Judson from Demos, that the definitions as they stand in the Bill are broad and vague. However, we will come on to those matters later.

Ultimately, treating “journalistic” and “democratically important” content differently is unworkable, leaving platforms to make impossible judgments over, for example, when and for how long an issue becomes a matter of reasonable public debate, or in what settings a person is acting as a journalist. As the Minister knows, the duties outlined in the clause could enable a far-right activist who was standing in an election, or potentially even just supporting candidates in elections, to use all social media platforms. That might allow far-right figures to be re-platformed on to social media sites where they would be free to continue spreading hate.

The Bill indicates that content will be protected if created by a political party ahead of a vote in Parliament, an election or a referendum, or when campaigning on a live political issue—basically, anything. Can the Minister confirm whether the clause means that far-right figures who have already been de-platformed for hate speech must be reinstated if they stand in an election? Does that include far-right or even neo-Nazi political parties? Content and accounts that have been de-platformed from mainstream platforms for breaking terms of service should not be allowed to return to those platforms via this potentially dangerous loophole.

As I have said, however, I know that these matters are complex and, quite rightly, exemptions must be in place to allow for free discussion around matters of the day. What cannot be allowed to persist is hate sparked by bad actors using simple loopholes to avoid any consequences.

On clause 16, the Minister knows about the important work that Hope not Hate is doing in monitoring key far-right figures. I pay tribute to it for its excellent work. Many of them self-define as journalists and could seek to exploit this loophole in the Bill and propagate hate online. Some of the most high-profile and dangerous far-right figures in the UK, including Stephen Yaxley-Lennon, also known as Tommy Robinson, now class themselves as journalists. There are also far-right and conspiracy-theory so-called “news companies” such as Rebel Media and Urban Scoop. Both replicate mainstream news publishers but are used to spread misinformation and discriminatory content. Many of those individuals and organisations have already been de-platformed for consistently breaking the terms of service of major social media platforms, and the exemption could see them demand to return and be allowed back.

New clause 7 would require the Secretary of State to publish a report reviewing the effectiveness of clauses 15 and 16. It is a simple new clause to require parliamentary scrutiny of how the Government’s chosen means of protecting content of democratic importance and journalistic content are working.

Hacked Off provided me with a list of people it found who have claimed to be journalists and who would seek to exploit the journalistic content duty, despite being banned from social media because they are racists or bad actors. First is Charles C. Johnson, a far-right activist who describes himself as an “investigative journalist”. Already banned from Twitter for saying he would “take out” a civil rights activist, he is also alleged to be a Holocaust denier.

Secondly, we have Robert Stacy McCain. Robert has been banned from Twitter for participating in targeted abuse. He was a journalist for The Washington Post, but is alleged to have also been a member of the League of the South, a far-right group known to include racists. Then there is Richard B. Spencer, a far-right journalist and former editor, only temporarily banned for using overlapping accounts. He was pictured making the Nazi salute and has repeated Nazi propaganda. When Trump became President, he encouraged people to “party like it’s 1933”. Sadly, the list goes on and on.

Transparency is at the very heart of the Bill. The Minister knows we have concerns about clauses 15 and 16, as do many of his own Back Benchers. We have heard from my hon. Friend the Member for Batley and Spen how extremist groups and individuals and foreign state actors are having a very real impact on the online space. If the Minister is unwilling to move on tightening up those concepts, the very least he could commit to is a review that Parliament will be able to formally consider.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the shadow Minister for her comments and questions. I would like to pick up on a few points on the clauses. First, there was a question about what content of democratic importance and content of journalistic importance mean in practice. As with many concepts in the Bill, we will look to Ofcom to issue codes of practice specifying precisely how we might expect platforms to implement the various provisions in the Bill. That is set out in clause 37(10)(e) and (f), which appear at the top of page 37, for ease. Clauses 15 and 16 on content of democratic and journalistic importance are expressly referenced as areas where codes of practice will have to be published by Ofcom, which will do further work on and consult on that. It will not just publish it, but will go through a proper process.

The shadow Minister expressed some understandable concerns a moment ago about various extremely unpleasant people, such as members of the far right who might somehow seek to use the provisions in clauses 15 and 16 as a shield behind which to hide, to enable them to continue propagating hateful, vile content. I want to make it clear that the protections in the Bill are not absolute—it is not that if someone can demonstrate that what they are saying is of democratic importance, they can say whatever they like. That is not how the clauses are drafted.

I draw attention to subsection (2) of both clauses 15 and 16. At the end of the first block of text, just above paragraph (a), it says “taken into account”: the duty is to ensure that matters concerning the importance of freedom of expression relating to content of democratic importance are taken into account when making decisions. It is not an absolute prohibition on takedown or an absolute protection, but simply something that has to be taken into account.

If someone from the far right, as the shadow Minister described, was spewing out vile hatred, racism or antisemitism, and tried to use those clauses, the fact that they might be standing in an election might well be taken into account. However, in performing that balancing exercise, the social media platforms and Ofcom acting as enforcers—and the court if it ever got judicially reviewed—would weigh those things up and find that taking into account content of democratic importance would not be sufficient to outweigh considerations around vile racism, antisemitism or misogyny.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Minister mentions that it would be taken into account. How long does he anticipate it would be taken into account for, especially given the nature of an election? A short campaign could be a number of weeks, or something could be posted a day before an election, be deemed democratically important and have very serious and dangerous ramifications.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As I say, if content was racist, antisemitic or flagrantly misogynistic, the balancing exercise is performed and the democratic context may be taken into account. I do not think the scales would tip in favour of leaving the content up. Even during an election period, I think common sense dictates that.

To be clear on the timing point that the hon. Lady asked about, the definition of democratic importance is not set out in hard-edged terms. It does not say, “Well, if you are in a short election period, any candidate’s content counts as of democratic importance.” It is not set out in a manner that is as black and white as that. If, for example, somebody was a candidate but it was just racist abuse, I am not sure how even that would count as democratic importance, even during an election period, because it would just be abuse; it would not be contributing to any democratic debate. Equally, somebody might not be a candidate, or might have been a candidate historically, but might be contributing to a legitimate debate after an election. That might be seen as being of democratic importance, even though they were not actually a candidate. As I said, the concept is not quite as black and white as that. The main point is that it is only to be taken into account; it is not determinative.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I appreciate the Minister’s allowing me to come back on this. During the Committee’s evidence sessions, we heard of examples where bad-faith state actors were interfering in the Scottish referendum, hosting Facebook groups and perpetuating disinformation around the royal family to persuade voters to vote “Yes” to leave the United Kingdom. That disinformation by illegal bad-faith actors could currently come under both the democratic importance and journalistic exemptions, so would be allowed to remain for the duration of that campaign. Given the exemptions in the Bill, it could not be taken down but could have huge, serious ramifications for democracy and the security of the United Kingdom.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I understand the points that the hon. Lady is raising. However, I do not think that it would happen in that way.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

You don’t think?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

No, I don’t. First of all, as I say, it is taken into account; it is not determinative. Secondly, on the point about state-sponsored disinformation, as I think I mentioned yesterday in response to the hon. Member for Liverpool, Walton, there is, as we speak, a new criminal offence of foreign interference being created in the National Security Bill. That will criminalise the kind of foreign interference in elections that she referred to. Because that would then create a new category of illegal content, that would flow through into this Bill. That would not be overridden by the duty to protect content of democratic importance set out here. I think that the combination of the fact that this is a balancing exercise, and not determinative, and the new foreign interference offence being created in the National Security Bill, will address the issue that the hon. Lady is raising—reasonably, because it has happened in this country, as she has said.

I will briefly turn to new clause 7, which calls for a review. I understand why the shadow Minister is proposing a review, but there is already a review mechanism in the Bill; it is to be found in clause 149, and will, of course, include a review of the way that clauses 15 and 16 operate. They are important clauses; we all accept that journalistic content and content of democratic importance are critical to the functioning of our society. Case law relating to article 10 of the European convention on human rights, for example, recognises content of journalistic importance as being especially critical. These two clauses seek to ensure that social media firms, in making their decisions, and Ofcom, in enforcing against those firms, take account of that. However, it is no more than that: it is “take account”; it is not determinative.

Question put and agreed to.

Clause 15 accordingly ordered to stand part of the Bill.

Clause 16 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

16:58
Adjourned till Thursday 9 June at half-past Eleven o’clock.
Written evidence to be reported to the House
OSB39 LV=General Insurance
OSB40 Epilepsy Society
OSB41 Free Speech Union
OSB42 Graham Smith
OSB43 Center for Data Innovation
OSB44 Samaritans
OSB45 End Violence Against Women coalition, Glitch, Refuge, Carnegie UK, 5Rights, NSPCC and Professors Lorna Woods and Clare McGlynn (joint submission)
OSB46 Sky
OSB47 Peter Wright, Editor Emeritus, DMG Media
OSB48 Graham Smith (further submission)
OSB49 CARE (Christian Action Research and Education)
OSB50 Age Verification Providers Association (supplementary submission)
OSB51 Legal Advice Centre at Queen Mary, University of London and Mishcon de Reya LLP (joint submission)
OSB52 Google UK (supplementary submission)
OSB53 Refuge (supplementary submission)
OSB54 Reset (supplementary submission)
OSB55 Public Service Broadcasters (BBC, Channel 4, and Channel 5)
OSB56 Which?
OSB57 Professor Corinne Fowler, School of Museum Studies, University of Leicester
OSB58 Independent Media Association
OSB59 Hacked Off Campaign
OSB60 Center for Countering Digital Hate

Online Safety Bill (Seventh sitting)

Committee stage
Thursday 9th June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 9 June 2022 - (9 Jun 2022)
The Committee consisted of the following Members:
Chairs: Sir Roger Gale, † Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
† Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Thursday 9 June 2022
(Morning)
[Christina Rees in the Chair]
Online Safety Bill
11:30
None Portrait The Chair
- Hansard -

We are now sitting in public and proceedings are being broadcast. Please switch electronic devices to silent. Tea and coffee are not allowed during sittings. I have no objections to Members taking their jackets off—it is very warm in this room.

Clause 17

Duty about content reporting

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 27 stand part.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

Good morning, Ms Rees. It is a pleasure to serve once again under your chairmanship. I wondered whether the shadow Minister, the hon. Member for Pontypridd, wanted to speak first—I am always happy to follow her, if she would prefer that.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I do my best.

Clauses 17 and 27 have similar effects, the former applying to user-to-user services and the latter to search services. They set out an obligation on the companies to put in place effective and accessible content reporting mechanisms, so that users can report issues. The clauses will ensure that service providers are made aware of illegal and harmful content on their sites. In relation to priority illegal content, the companies must proactively prevent it in the first place, but in the other areas, they may respond reactively as well.

The clause will ensure that anyone who wants to report illegal or harmful content can do so in a quick and reasonable way. We are ensuring that everyone who needs to do that will be able to do so, so the facility will be open to those who are affected by the content but who are not themselves users of the site. For example, that might be non-users who are the subject of the content, such as a victim of revenge pornography, or non-users who are members of a specific group with certain characteristics targeted by the content, such as a member of the Jewish community reporting antisemitic content. There is also a facility for parents and other adults with caring responsibility for children, and for adults caring for another adult, to report content. Clause 27 sets out similar duties in relation to search. I commend the clauses to the Committee.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

I will talk about this later, when we come to a subsequent clause to which I have tabled some amendments—I should have tabled some to this clause, but unfortunately missed the chance to do so.

I appreciate the Minister laying out why he has designated the people covered by this clause; my concern is that “affected” is not wide enough. My logic is that, on the strength of these provisions, I might not be able to report racist content that I come across on Twitter if I am not the subject of that content—if I am not a member of a group that is the subject of the content or if I am not caring for someone who is the subject of it.

I appreciate what the Minister is trying to do, and I get the logic behind it, but I think the clause unintentionally excludes some people who would have a reasonable right to expect to be able to make reports in this instance. That is why I tabled amendments 78 and 79 to clause 28, about search functions, but those proposals would have worked reasonably for this clause as well. I do not expect a positive answer from the Minister today, but perhaps he could give consideration to my concern. My later amendments would change “affected person” to “any other person”. That would allow anyone to make a report, because if something is illegal content, it is illegal content. It does not matter who makes the report, and it should not matter that I am not a member of the group of people targeted by the content.

I report things all the time, particularly on Twitter, and a significant amount of it is nothing to do with me. It is not stuff aimed at me; it is aimed at others. I expect that a number of the platforms will continue to allow reporting for people who are outwith the affected group, but I do not want to be less able to report than I am currently, and that would be the case for many people who see concerning content on the internet.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The hon. Lady is making a really important point. One stark example that comes to my mind is when English footballers suffered horrific racist abuse following the penalty shootout at the Euros last summer. Hundreds of thousands of people reported the abuse that those players were suffering to the social media platforms on their behalf, in an outcry of solidarity and support, and it would be a shame if people were prevented from doing that.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I absolutely agree. I certainly do not think I am suggesting that the bigger platforms such as Twitter and Facebook will reduce their reporting mechanisms as a result of how the Bill is written. However, it is possible that newer or smaller platforms, or anything that starts up after this legislation comes into force, could limit the ability to report on the basis of these clauses.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

Good morning, Ms Rees.

It is important that users of online services are empowered to report harmful content, so that it can be removed. It is also important for users to have access to complaints procedures when wrong moderation decisions have been made. Reporting and complaint mechanisms are integral to ensuring that users are safe and that free speech is upheld, and we support these provisions in the Bill.

Clauses 17 and 18, and clauses 27 and 28, are two parts of the same process: content reporting by individual users, and the handling of content reported as a complaint. However, it is vital that these clauses create a system that works. That is the key point that Labour Members are trying to make, because the wild west system that we have at the moment does not work.

It is welcome that the Government have proposed a system that goes beyond the users of the platform and introduces a duty on companies. However, companies have previously failed to invest enough money in their complaints systems for the scale at which they are operating in the UK. The duties in the Bill are an important reminder to companies that they are part of a wider society that goes beyond their narrow shareholder interest.

One example of why this change is so necessary, and why Labour Members are broadly supportive of the additional duties, is the awful practice of image abuse. With no access to sites on which their intimate photographs are being circulated, victims of image abuse have very few if any routes to having the images removed. Again, the practice of image abuse has increased during the pandemic, including through revenge porn, which the Minister referred to. The revenge porn helpline reported that its case load more than doubled between 2019 and 2020.

These clauses should mean that people can easily report content that they consider to be either illegal, or harmful to children, if it is hosted on a site likely to be accessed by children, or, if it is hosted on a category 1 platform, harmful to adults. However, the Minister needs to clarify how these service complaints systems will be judged and what the performance metrics will be. For instance, how will Ofcom enforce against a complaint?

In many sectors of the economy, even with long-standing systems of regulation, companies can have tens of millions of customers reporting content, but that does not mean that any meaningful action can take place. The hon. Member for Aberdeen North has just told us how often she reports on various platforms, but what action has taken place? Many advocacy groups of people affected by crimes such as revenge porn will want to hear, in clear terms, what will happen to material that has been complained about. I hope the Minister can offer that clarity today.

Transparency in reporting will be vital to analysing trends and emerging types of harm. It is welcome that in schedule 8, which we will come to later, transparency reporting duties apply to the complaints process. It is important that as much information as possible is made public about what is going on in companies’ complaints and reporting systems. As well as the raw number of complaints, reporting should include what is being reported or complained about, as the Joint Committee on the draft Bill recommended last year. Again, what happens to the reported material will be an important metric on which to judge companies.

Finally, I will mention the lack of arrangements for children. We have tabled new clause 3, which has been grouped for discussion with other new clauses at the end of proceedings, but it is relevant to mention it now briefly. The Children’s Commissioner highlighted in her oral evidence to the Committee how children had lost faith in complaints systems. That needs to change. The National Society for the Prevention of Cruelty to Children has also warned that complaints mechanisms are not always appropriate for children and that a very low proportion of children have ever reported content. A child-specific user advocacy body could represent the interests of child users and support Ofcom’s regulatory decisions. That would represent an important strengthening of protections for users, and I hope the Government will support it when the time comes.

Jane Stevenson Portrait Jane Stevenson (Wolverhampton North East) (Con)
- Hansard - - - Excerpts

I rise briefly to talk about content reporting. I share the frustrations of the hon. Member for Aberdeen North. The way I read the Bill was that it would allow users and affected persons, rather than “or” affected persons, to report content. I hope the Minister can clarify that that means affected persons who might not be users of a platform. That is really important.

Will the Minister also clarify the use of human judgment in these decisions? Many algorithms are not taking down some content at the moment, so I would be grateful if he clarified that there is a need for platforms to provide a genuine human judgment on whether content is harmful.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I want to raise an additional point about content reporting and complaints procedures. I met with representatives of Mencap yesterday, who raised the issue of the accessibility of the procedures that are in place. I appreciate that the Bill talks about procedures being accessible, but will the Minister give us some comfort about Ofcom looking at the reporting procedures that are in place, to ensure that adults with learning disabilities in particular can access those content reporting and complaints procedures, understand them and easily find them on sites?

That is a specific concern that Mencap raised on behalf of its members. A number of its members will be users of sites such as Facebook, but may find it more difficult than others to access and understand the procedures that are in place. I appreciate that, through the Bill, the Minister is making an attempt to ensure that those procedures are accessible, but I want to make sure they are accessible not just for the general public but for children, who may need jargon-free access to content reporting and complaints procedures, and for people with learning disabilities, who may similarly need jargon-free, easy-to-understand and easy-to-find access to those procedures.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me try to address some of the questions that have been raised in this short debate, starting with the question that the hon. Member for Aberdeen North quite rightly asked at the beginning. She posed the question, “What if somebody who is not an affected person encountered some content and wanted to report it?” For example, she might encounter some racist content on Twitter or elsewhere and would want to be able to report it, even though she is not herself the target of it or necessarily a member of the group affected. I can also offer the reassurance that my hon. Friend the Member for Wolverhampton North East asked for.

The answer is to be found in clause 17(2), which refers to

“A duty to operate a service using systems and processes that allow users and”—

I stress “and”—“affected persons”. As such, the duty to offer content reporting is to users and affected persons, so if the hon. Member for Aberdeen North was a user of Twitter but was not herself an affected person, she would still be able to report content in her capacity as a user. I hope that provides clarification.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I appreciate that. That is key, and I am glad that this is wider than just users of the site. However, taking Reddit as an example, I am not signed up to that site, but I could easily stumble across content on it that was racist in nature. This clause would mean that I could not report that content unless I signed up to Reddit, because I would not be an affected person or a user of that site.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Lady for her clarificatory question. I can confirm that in order to be a user of a service, she would not necessarily have to sign up to it. The simple act of browsing that service, of looking at Reddit—not, I confess, an activity that I participate in regularly—regardless of whether or not the hon. Lady has an account with it, makes her a user of that service, and in that capacity she would be able to make a content report under clause 17(2) even if she were not an affected person. I hope that clears up the question in a definitive manner.

The hon. Lady asked in her second speech about the accessibility of the complaints procedure for children. That is strictly a matter for clause 18, which is the next clause, but I will quickly answer her question. Clause 18 contains provisions that explicitly require the complaints process to be accessible. Subsection (2)(c) states that the complaints procedure has to be

“easy to access, easy to use (including by children) and transparent”,

so the statutory obligation that she requested is there in clause 18.

11:45
Kirsty Blackman

Can the Minister explain the logic in having that phrasing for the complaints procedure but not for the content reporting procedure? Surely it would also make sense for the content reporting procedure to use the phrasing

“easy to access, easy to use (including by children) and transparent.”

Chris Philp

There is in clause 17(2)

“a duty to operate a service that allows users and affected persons to easily report content which they consider to be content of a…kind specified below”,

which, of course, includes services likely to be accessed by children, under subsection (4). The words “easily report” are present in clause 17(2).

I will move on to the question of children reporting more generally, which the shadow Minister raised as well. Clearly, a parent or anyone with responsibility for a child has the ability to make a report, but it is also worth mentioning the power in clauses 140 to 142 to make super-complaints, which the NSPCC strongly welcomed in its evidence. An organisation that represents a particular group—an obvious example is the NSPCC representing children, but it would apply to loads of other groups—has the ability to make super-complaints to Ofcom on behalf of those users, if it feels they are not being well treated by a platform. A combination of the parent or carer being able to make individual complaints, and the super-complaint facility, means that the points raised by Members are catered for. I commend the clause to the Committee.

Question put and agreed to.

Clause 17 accordingly ordered to stand part of the Bill.

Clause 18

Duties about complaints procedures

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss the following:

Amendment 78, in clause 28, page 28, line 28, leave out “affected” and replace with “any other”

This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider to be illegal.

Amendment 79, in clause 28, page 28, line 30, leave out “affected” and replace with “any other”

This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider not to comply with sections 24, 27 or 29.

Clause 28 stand part.

New clause 1—Report on redress for individual complaints

“(1) The Secretary of State must publish a report assessing options for dealing with appeals about complaints made under—

(a) section 18; and

(b) section 28

(2) The report must—

(a) provide a general update on the fulfilment of duties about complaints procedures which apply in relation to all regulated user-to-user services and regulated search services;

(b) assess which body should be responsible for a system to deal with appeals in cases where a complainant considers that a complaint has not been satisfactorily dealt with; and

(c) provide options for how the system should be funded, including consideration of whether an annual surcharge could be imposed on user-to-user services and search services.

(3) The report must be laid before Parliament within six months of the commencement of this Act.”

Barbara Keeley

I will speak to new clause 1. Although duties about complaints procedures are welcome, it has been pointed out that service providers’ user complaints processes are often obscure and difficult to navigate—that is the world we are in at the moment. The lack of any external complaints option for individuals who seek redress is worrying.

The Minister has just talked about the super-complaints mechanism—which we will come to later in proceedings—to allow eligible entities to make complaints to Ofcom about a single regulated service if that complaint is of particular importance or affects a particularly large number of service users or members of the public. Those conditions are constraints on the super-complaints process, however.

An individual who felt that they had been failed by a service’s complaints system would have no source of redress. Without redress for individual complaints once internal mechanisms have been exhausted, victims of online abuse could be left with no further options, consumer protections could be compromised, and freedom of expression could be impinged upon for people who felt that their content had been unfairly removed.

Various solutions have been proposed. The Joint Committee recommended the introduction of an online safety ombudsman to consider complaints for which recourse to internal routes of redress had not resulted in resolution and the failure to address risk had led to significant and demonstrable harm. Such a mechanism would give people an additional body through which to appeal decisions after they had come to the end of a service provider’s internal process. Of course, we as hon. Members are all familiar with the ombudsman services that we already have.

Concerns have been raised about the level of complaints such an ombudsman could receive. However, as the Joint Committee noted, complaints would be received only once the service’s internal complaints procedure had been exhausted, as is the case for complaints to Ofcom about the BBC. The new clause seeks to ensure that we find the best possible solution to the problem. There needs to be a last resort for users who have suffered serious harm on services. It is only through the introduction of an external redress mechanism that service providers can truly be held to account for their decisions as they impact on individuals.

Dame Maria Miller (Basingstoke) (Con)

I rise to contribute to the stand part debate on clauses 18 and 28. It was interesting, though, to hear the debate on clause 17, because it is right to ask how the complaints services will be judged. Will they work in practice? When we start to look at how to ensure that the legislation works in all eventualities, we need to ensure that we have some backstops for when the system does not work as it should.

It is welcome that there will be clear duties on providers to have operational complaints procedures—complaints procedures that work in practice. As we all know, many of them do not at the moment. As a result, we have a loss of faith in the system, and that is not going to be changed overnight by a piece of legislation. For years, people have been reporting things—in some cases, very serious criminal activity—that have not been acted on. Consumers—people who use these platforms—are not going to change their mind overnight and suddenly start trusting these organisations to take their complaints seriously. With that in mind, I hope that the Minister listened to the points I made on Second Reading about how to give extra support to victims of crimes or people who have experienced things that should not have happened online, and will look at putting in place the right level of support.

The hon. Member for Worsley and Eccles South talked about the idea of an ombudsman; it may well be that one should be in place to deal with situations where complaints are not dealt with through the normal processes. I am also quite taken by some of the evidence we received about third-party complaints processes by other organisations. We heard a bit about the revenge porn helpline, which was set up a few years ago when we first recognised in law that revenge pornography was a crime. The Bill creates a lot more victims of crime and recognises them as victims, but we are not yet hearing clearly how the support systems will adequately help that massively increased number of victims to get the help they need.

I will probably talk in more detail about this issue when we reach clause 70, which provides an opportunity to look at the—unfortunately—probably vast fines that Ofcom will be imposing on organisations and how we might earmark some of that money specifically for victim support, whether by funding an ombudsman or helping amazing organisations such as the revenge porn helpline to expand their services.

We must address this issue now, in this Bill. If we do not, all those fines will go immediately into the coffers of the Treasury without passing “Go”, and we will not be able to take some of that money to help those victims directly. I am sure the Government absolutely intend to use some of the money to help victims, but that decision would be at the mercy of the Treasury. Perhaps we do not want that; perhaps we want to make it cleaner and easier and have the money put straight into a fund that can be used directly for people who have been victims of crime or injustice or things that fall foul of the Bill.

I hope that the Minister will listen to that and use this opportunity, as we do in other areas, to directly passport fines for specific victim support. He will know that there are other examples of that that he can look at.

Kirsty Blackman

As the right hon. Member for Basingstoke has mentioned the revenge porn helpline, I will mention the NSPCC’s Report Remove tool for children. It does exactly the same thing, but for younger people—the revenge porn helpline is specifically only for adults. Both those tools together cover the whole gamut, which is massively helpful.

The right hon. Lady’s suggestion about the hypothecation of fines is a very good one. I was speaking to the NSPCC yesterday, and one of the issues that we were discussing was super-complaints. Although super-complaints are great and I am very glad that they are included in the Bill, the reality is that some of the third-sector organisations that are likely to be undertaking super-complaints are charitable organisations that are not particularly well funded. Given how few people work for some of those organisations and the amazing amount of work they do, if some of the money from fines could support not just victims but the initial procedure for those organisations to make super-complaints, it would be very helpful. That is, of course, if the Minister does not agree with the suggestion of creating a user advocacy panel, which would fulfil some of that role and make that support for the charitable organisations less necessary—although I am never going to argue against support for charities: if the Minister wants to hypothecate it in that way, that would be fantastic.

I tabled amendments 78 and 79, but the statement the Minister made about the definition of users gives me a significant level of comfort about the way that people will be able to access a complaints procedure. I am terribly disappointed that the Minister is not a regular Reddit user. I am not, either, but I am well aware of what Reddit entails. I have no desire to sign up to Reddit, but knowing that even browsing the site I would be considered a user and therefore able to report any illegal content I saw, is massively helpful. On that basis, I am comfortable not moving amendments 78 and 79.

On the suggestion of an ombudsman—I am looking at new clause 1—it feels like there is a significant gap here. There are ombudsman services in place for many other areas, where people can put in a complaint and then go to an ombudsman should they feel that it has not been appropriately addressed. As a parliamentarian, I find that a significant number of my constituents come to me seeking support to go to the ombudsman for whatever area it is in which they feel their complaint has not been appropriately dealt with. We see a significant number of issues caused by social media companies, in particular, not taking complaints seriously, not dealing with complaints and, in some cases, leaving illegal content up. Particularly in the initial stages of implementation—in the first few years, before companies catch up and are able to follow the rules put in place by the Bill and Ofcom—a second-tier complaints system that is removed from the social media companies would make things so much better than they are now. It would provide an additional layer of support to people who are looking to make complaints.

Dame Maria Miller

I am sure the hon. Lady will agree with me that it is not either/or—it is probably both. Ultimately, she is right that an ombudsman would be there to help deal with what I think will be a lag in implementation, but if someone is a victim of online intimate image abuse, in particular, they want the material taken down immediately, so we need to have organisations such as those that we have both mentioned there to help on the spot. It has to be both, has it not?

Kirsty Blackman

I completely agree. Both those helplines do very good work, and they are absolutely necessary. I would strongly support their continuation in addition to an ombudsman-type service. Although I am saying that the need for an ombudsman would likely be higher in the initial bedding-in years, it will not go away—we will still need one. With NHS complaints, the system has been in place for a long time, and it works pretty well in the majority of cases, but there are still cases it gets wrong. Even if the social media companies behave in a good way and have proper complaints procedures, there will still be instances of them getting it wrong. There will still be a need for a higher level. I therefore urge the Minister to consider including new clause 1 in the Bill.

Shaun Bailey (West Bromwich West) (Con)

It is a pleasure to see you in the Chair, Ms Rees, and to make my first contribution in Committee—it will be a brief one. It is great to follow the hon. Member for Aberdeen North, and I listened intently to my right hon. Friend the Member for Basingstoke, from whom I have learned so much having sat with her in numerous Committees over the past two years.

I will speak to clause 18 stand part, in particular on the requirements of the technical specifications that the companies will need to use to ensure that they fulfil the duties under the clause. The point, which has been articulated well by numerous Members, is that we can place such a duty on service providers, but we must also ensure that the technical specifications in their systems allow them to follow through and deliver on it.

I sat in horror during the previous sitting as I listened to the hon. Member for Pontypridd talking about the horrendous abuse that she has to experience on Twitter. What that goes to show is that, if the intention of this clause and the Bill is to be fulfilled, we must ensure that the companies enable themselves to have the specifications in their systems on the ground to deliver the requirements of the Bill. That might mean that the secondary legislation is slightly more prescriptive about what those systems look like.

It is all well and good us passing primary legislation in this place to try to control matters, but my fear is that if those companies do not have systems such that they can follow through, there is a real risk that what we want will not materialise. As we proceed through the Bill, there will be mechanisms to ensure that that risk is mitigated, but the point that I am trying to make to my hon. Friend the Minister is that we should ensure that we are on top of this, and that companies have the technical specifications in their complaints procedures to meet the requirements under clause 18.

We must ensure that we do not allow the excuse, “Oh, well, we’re a bit behind the times on this.” I know that later clauses seek to deal with that, but it is important that we do not simply fall back on excuses. We must embed a culture that allows the provisions of the clause to be realised. I appeal to the Minister to ensure that we deal with that and embed a culture that looks at striding forward to deal with complaints procedures, and that these companies have the technical capabilities on the ground so that they can deal with these things swiftly and in the right way. Ultimately, as my right hon. Friend the Member for Basingstoke said, it is all well and good us making these laws, but it is vital that we ensure that they can be applied.

Chris Philp

Let me address some of the issues raised in the debate. First, everyone in the House recognises the enormous problem at the moment with large social media firms receiving reports about harmful and even illegal content that they just flagrantly ignore. The purpose of the clause, and indeed of the whole Bill and its enforcement architecture, is to ensure that those large social media firms no longer ignore illegal and harmful content when they are notified about it. We agree unanimously on the importance of doing that.

The requirement for those firms to take the proper steps is set out in clause 18(2)(b), at the very top of page 18—it is rather depressing that we are on only the 18th of a couple of hundred pages. That paragraph creates a statutory duty for a social media platform to take “appropriate action”—those are the key words. If the platform is notified of a piece of illegal content, or content that is harmful to children, or of content that it should take down under its own terms and conditions if harmful to adults, then it must do so. If it fails to do so, Ofcom will have the enforcement powers available to it to compel compliance—ultimately escalating to a fine of up to 10% of global revenue or even service disconnection.

Kirsty Blackman

Will the Minister give way?

Chris Philp

Let me develop the point before I give way. Our first line of defence is Ofcom enforcing the clause, but we have a couple of layers of additional defence. One of those is the super-complaints mechanism, which I have mentioned before. If a particular group of people, represented by a body such as the NSPCC, feel that their legitimate complaints are being infringed systemically by the social media platform, and that Ofcom is failing to take the appropriate action, they can raise that as a super-complaint to ensure that the matter is dealt with.

Barbara Keeley

Will the Minister give way?

Chris Philp

I should give way to the hon. Member for Aberdeen North first, and then I will come to the shadow Minister.

Kirsty Blackman

I wanted to ask specifically about the resourcing of Ofcom, given the abilities that it will have under this clause. Will Ofcom have enough resource to be able to be that secondary line of defence?

Chris Philp

A later clause gives Ofcom the ability to levy the fees and charges it sees as necessary and appropriate to ensure that it can deliver the duties. Ofcom will have the power to set those fees at a level to enable it to do its job properly, as Parliament would wish it to do.

Barbara Keeley

This is the point about individual redress again: by talking about super-complaints, the Minister seems to be agreeing that it is not there. As I said earlier, for super-complaints to be made to Ofcom, the issue has to be of particular importance or to impact a particularly large number of users, but that does not help the individual. We know how much individuals are damaged; there must be a system of external redress. The point about internal complaints systems is that we know that they are not very good, and we require a big culture change to change them, but unless there is some mechanism thereafter, I cannot see how we are giving the individual any redress—it is certainly not through the super-complaints procedure.

Chris Philp

As I said explicitly a few moments ago, the hon. Lady is right to point out the fact that the super-complaints process is to address systemic issues. She is right to say that, and I think I made it clear a moment or two ago.

Whether there should be an external ombudsman to enforce individual complaints, rather than just Ofcom enforcing against systemic complaints, is a question worth addressing. In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast. Facebook in the UK alone has tens of millions of users—I might get this number wrong, but I think it is 30 million or 40 million users.

Dame Maria Miller

Will the Minister give way?

Chris Philp

I will in a moment. The volume of complaints that gets generated is vast. The way that we will fix this is not by having an external policeman to enforce on individual complaints, but by ensuring that the systems and processes are set up correctly to deal with problems at this large scale. [Interruption.] The shadow Minister, the hon. Member for Pontypridd, laughs, but it is a question of practicality. The way we will make the internet safe is to make sure that the systems and processes are in place and effective. Ofcom will ensure that that happens. That will protect everyone, not just those who raise individual complaints with an ombudsman.

Several hon. Members rose—

Chris Philp

I can see that there is substantial demand to comment, so I shall start by giving way to my right hon. Friend the Member for Basingstoke.

Dame Maria Miller

The Minister is doing an excellent job explaining the complex nature of the Bill. Ultimately, however, as he and I know, it is not a good argument to say that this is such an enormous problem that we cannot have a process in place to deal with it. If my hon. Friend looks back at his comments, he will see that that is exactly the point he was making. Although it is possibly not necessary with this clause, I think he needs to give some assurances that later in the Bill he will look at hypothecating some of the money to be generated from fines to address the issues of individual constituents, who on a daily basis are suffering at the hands of the social media companies. I apologise for the length of my intervention.

Chris Philp

It is categorically not the Government’s position that this problem is too big to fix. In fact, the whole purpose of this piece of groundbreaking and world-leading legislation is to fix a problem of such magnitude. The point my right hon. Friend was making about the hypothecation of fines to support user advocacy is a somewhat different one, which we will come to in due course, but there is nothing in the Bill to prevent individual groups from assisting individuals with making specific complaints to individual companies, as they are now entitled to do in law under clauses 17 and 18.

The point about an ombudsman is a slightly different one—if an individual complaint is made to a company and the individual complainant is dissatisfied with the outcome of their individual, particular and personal complaint, what should happen? In the case of financial services, if, for example, someone has been mis-sold a mortgage and they have suffered a huge loss, they can go to an ombudsman who will bindingly adjudicate that individual, single, personal case. The point that I am making is that having hundreds of thousands or potentially millions of cases being bindingly adjudicated on a case-by-case basis is not the right way to tackle a problem of this scale. The right way to tackle the problem is to force the social media companies, by law, to systemically deal with all of the problem, not just individual problems that may end up on an ombudsman’s desk.

That is the power in the Bill. It deals at a systems and processes level, it deals on an industry-wide level, and it gives Ofcom incredibly strong enforcement powers to make sure this actually happens. The hon. Member for Pontypridd has repeatedly called for a systems and processes approach. This is the embodiment of such an approach and the only way to fix a problem of such magnitude.

Kim Leadbeater

I associate myself with the comments of the right hon. Member for Basingstoke. Surely, if we are saying that this is such a huge problem, that is an argument for greater stringency and having an ombudsman. We cannot say that this is just about systems. Of course it is about systems, but online harms—we have heard some powerful examples of this—are about individuals, and we have to provide redress and support for the damage that online harms do to them. We have to look at systemic issues, as the Minister is rightly doing, but we also have to look at individual cases. The idea of an ombudsman and greater support for charities and those who can support victims of online crime, as mentioned by the hon. Member for Aberdeen North, is really important.

Chris Philp

I thank the hon. Lady for her thoughtful intervention. There are two separate questions here. One is about user advocacy groups helping individuals to make complaints to the companies. That is a fair point, and no doubt we will debate it later. The ombudsman question is different; it is about whether to have a right of appeal against decisions by social media companies. Our answer is that, rather than having a third-party body—an ombudsman—effectively acting as a court of appeal against individual decisions by the social media firms, because of the scale of the matter, the solution is to compel the firms, using the force of law, to get this right on a systemic and comprehensive basis.

Chris Philp

I give way first to the hon. Member for Aberdeen North—I think she was first on her feet—and then I will come to the hon. Member for Pontypridd.

Kirsty Blackman

Does the Minister not think this is going to work? He is creating this systems and processes approach, which he suggests will reduce the thousands of complaints—complaints will be made and complaints procedures will be followed. Surely, if it is going to work, in 10 years’ time we are going to need an ombudsman to adjudicate on the individual complaints that go wrong. If this works in the way he suggests, we will not have tens of millions of complaints, as we do now, but an ombudsman would provide individual redress. I get what he is arguing, but I do not know why he is not arguing for both things, because having both would provide the very best level of support.

Chris Philp

I will address the review clause now, since it is relevant. If, in due course, as I hope and expect, the Bill has the desired effect, perhaps that would be the moment to consider the case for an ombudsman. The critical step is to take a systemic approach, which the Bill is doing. That engages the question of new clause 1, which would create a mechanism, probably for the reason the hon. Lady just set out, to review how things are going and to see if, in due course, there is a case for an ombudsman, once we see how the Bill unfolds in practice.

Jane Stevenson

Will the Minister give way?

Chris Philp

Let me finish the point. It is not a bad idea to review it and see how it is working in practice. Clause 149 already requires a review to take place between two and four years after Royal Assent. For the reasons that have been set out, it is pretty clear from this debate that we would expect the review to include precisely that question. If we had an ombudsman on day one, before the systems and processes had had a chance to have their effect, I fear that the ombudsman would be overwhelmed with millions of individual issues. The solution lies in fixing the problem systemically.

12:15
Several hon. Members rose—

Chris Philp

I think the shadow Minister wanted to intervene, unless I have answered her point already.

Alex Davies-Jones

I wanted to reiterate the point that the hon. Member for Aberdeen North made, which the Minister has not answered. If he has such faith that the systems and processes will be changed and controlled by Ofcom as a result of the Bill, why is he so reluctant to put in an ombudsman? It will not be overwhelmed with complaints if the systems and processes work, and therefore protect victims. We have already waited far too long for the Bill, and now he says that we need to wait two to four years for a review, and even longer to implement an ombudsman to protect victims. Why will he not just put this in the Bill now to keep them safe?

Chris Philp

Because we need to give the new systems and processes time to take effect. If the hon. Lady felt so strongly that an ombudsman was required, she was entirely at liberty to table an amendment to introduce one, but she has not done so.

Jane Stevenson

I wonder whether Members would be reassured if companies were required to have a mechanism by which users could register their dissatisfaction, to enable an ombudsman, or perhaps Ofcom, to gauge the volume of dissatisfaction and bring some kind of group claim against the company. Is that a possibility?

Chris Philp

Yes. My hon. Friend hits the nail on the head. If there is a systemic problem and a platform fails to act appropriately not just in one case, but in a number of them, we have, as she has just described, the super-complaints process in clauses 140 to 142. Even under the Bill as drafted, without any changes, if a platform turns out to be systemically ignoring reasonable complaints made by the public and particular groups of users, the super-complainants will be able to do exactly as she describes. There is a mechanism to catch this—it operates not at individual level, but at the level of groups of users, via the super-complaint mechanism—so I honestly feel that the issue has been addressed.

When the numbers are so large, I think that the super-complaint mechanism is the right way to push Ofcom if it does not notice. Obviously, the first line of defence is that companies comply with the Bill. The second line of defence is that if they fail to do so, Ofcom will jump on them. The third line of defence is that if Ofcom somehow does not notice, a super-complaint group—such as the NSPCC, acting for children—will make a super-complaint to Ofcom. We have three lines of defence, and I submit to the Committee that they are entirely appropriate.

Barbara Keeley

Will the Minister give way?

Chris Philp

I was about to sit down, but of course I will give way.

Barbara Keeley

The Minister said that the Opposition had not tabled an amendment to bring in an ombudsman.

Chris Philp

On this clause.

Barbara Keeley

On this clause. What we have done, however—we are debating it now—is to table a new clause to require a report on redress for individual complaints. The Minister talks about clause 149 and a process that will kick in between two and five years away, but we have a horrendous problem at the moment. I and various others have described the situation as the wild west, and very many people—thousands, if not millions, of individuals—are being failed very badly. I do not see why he is resisting our proposal for a report within six months of the commencement of the Act, which would enable us to start to see at that stage, not two to five years down the road, how these systems—he is putting a lot of faith in them—were turning out. I think that is a very sound idea, and it would help us to move forward.

Chris Philp

The third line of defence—the super-complaint process—is available immediately, as I set out a moment ago. In relation to new clause 1, which the hon. Lady mentioned a moment ago, I think six months is very soon for a Bill of this magnitude. The two-to-five-year timetable under the existing review mechanism in clause 149 is appropriate.

Although we are not debating clause 149, I hope, Ms Rees, that you will forgive me for speaking about it for a moment. If Members turn to pages 125 and 126 and look at the matters covered by the review, they will see that they are extraordinarily comprehensive. In effect, the review covers the implementation of all aspects of the Bill, including the need to minimise the harms to individuals and the enforcement and information-gathering powers. It covers everything that Committee members would want to be reviewed. No doubt as we go through the Bill we will have, as we often do in Bill Committee proceedings, a number of occasions on which somebody tables an amendment to require a review of x, y or z. This is the second such occasion so far, I think, and there may be others. It is much better to have a comprehensive review, as the Bill does via the provisions in clause 149.

Question put and agreed to.

Clause 18 accordingly ordered to stand part of the Bill.

Clause 19

Duties about freedom of expression and privacy

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss clause 29 stand part.

Chris Philp

Clause 19, on user-to-user services, and its associated clause 29, which relates to search services, specify a number of duties in relation to freedom of expression and privacy. In carrying out their safety duties, in-scope companies will be required by clause 19(2) to have regard to the importance of protecting users’ freedom of expression and privacy.

Let me pause for a moment on this issue. There has been some external commentary about the Bill’s impact on freedom of expression. We have already seen, via our discussion of a previous clause, that there is nothing in the Bill that compels the censorship of speech that is legal and not harmful to children. I put on the record again the fact that nothing in the Bill requires the censorship of legal speech that poses no harm to children.

We are going even further than that. As far as I am aware, for the first time ever there will be a duty on social media companies, via clause 19(2), to have regard to freedom of speech. There is currently no legal duty at all on platforms to have regard to freedom of speech. The clause establishes, for the first time, an obligation to have regard to freedom of speech. It is critical that not only Committee members but others more widely who consider the Bill should bear that carefully in mind. Besides that, the clause speaks to the right to privacy. Existing laws already speak to that, but the clause puts it in this Bill as well. Both duties are extremely important.

In addition, category 1 service providers—the really big ones—will need proactively to assess the impact of their policies on freedom of expression and privacy. I hope all Committee members will strongly welcome the important provisions I have outlined.

Barbara Keeley

As the Minister says, clauses 19 and 29 are designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulation in less substantive ways. That is a fear here.

Category 1 providers will need to undertake an impact assessment to determine the impact of their product and safety decisions on freedom of expression, but it is unclear whether that applies only in respect of content that is harmful to adults. Unlike with the risk assessments for the illegal content and child safety duties set out in part 3, chapter 2, these clauses do not set expectations about whether risk assessments are of a suitable and sufficient quality. It is also not clear what powers Ofcom has at its disposal to challenge any assessments that it considers insufficient or that reach an inappropriate or unreasonable assessment of how to balance fundamental rights. I would appreciate it if the Minister could touch on that when he responds.

The assumption underlying these clauses is that privacy and free expression may need to act as a constraint on safety measures, but I believe that that is seen quite broadly as simplistic and potentially problematic. To give one example, a company could argue that end-to-end encryption is so important for free expression and privacy that it could justify any adverse impact on users’ safety. The subjects of child abuse images, which could more easily be shared because of such a decision, would see their safety and privacy rights weakened. Such an argument fails to take account of the broader nuance of the issues at stake. Impacts on privacy and freedom of expression should therefore be considered across a range of groups rather than assuming an overarching right that applies equally to all users.

Similarly, it will be important that Ofcom understands and delivers its functions in relation to these clauses in a way that reflects the complexity and nuance of the interplay of fundamental rights. It is important to recognise that positive and negative implications for privacy and freedom of expression may be associated with any compliance decision. I think the Minister implied that freedom of speech was a constant positive, but it can also have negative connotations.

Kirsty Blackman

I am pleased that the clause is in the Bill, and I think it is a good one to include. Can the Minister reaffirm what he said on Tuesday about child sexual abuse, and the fact that the right to privacy does not trump the ability—particularly with artificial intelligence—to search for child sexual abuse images?

Chris Philp

I confirm what the hon. Lady has just said. In response to the hon. Member for Worsley and Eccles South, it is important to say that the duty in clause 19 is “to have regard”, which simply means that a balancing exercise must be performed. It is not determinative; it is not as if the rights in the clause trump everything else. They simply have to be taken into account when making decisions.

To repeat what we discussed on Tuesday, I can explicitly and absolutely confirm to the hon. Member for Aberdeen North that in my view and the Government’s, concerns about freedom of expression or privacy should not trump platforms’ ability to scan for child sexual exploitation and abuse images or protect children. It is our view that there is nothing more important than protecting children from exploitation and sexual abuse.

We may discuss this further when we come to clause 103, which develops the theme a little. It is also worth saying that Ofcom will be able to look at the risk assessments and, if it feels that they are not of an adequate standard, take that up with the companies concerned. We should recognise that the duty to have regard to freedom of expression is not something that currently exists. It is a significant step forward, in my view, and I commend clauses 19 and 29 to the Committee.

The Chair

With your indulgence, Minister, Nick Fletcher would like to speak.

Nick Fletcher (Don Valley) (Con)

I have been contacted by a number of people about this clause, and they have serious concerns about the “have regard” statement. The Christian Institute said that it was

“promised ‘considerably stronger protections for free speech’, but the Bill does not deliver. Internet companies will be under ‘a duty to have regard to the importance of’ protecting free speech,”

but a “have regard” duty

“has no weight behind it. It is perfectly possible to…have regard to something…and then ignore it in practice.”

The “have regard” duty is not strong enough, and it is a real concern for a lot of people out there. Protecting children is absolutely imperative, but there are serious concerns when it comes to freedom of speech. Can the Minister address them for me?

12:30
Chris Philp

As I have said, at the moment there is nothing at all. Platforms such as Facebook can and do arbitrarily censor content with little if any regard for freedom of speech. Some platforms have effectively cancelled Donald Trump while allowing the Russian state to propagate shocking disinformation about the Russian invasion of Ukraine, so there is real inconsistency and a lack of respect for freedom of speech. This at least establishes something where currently there is nothing. We can debate whether “have regard to” is strong enough. We have heard the other point of view from the other side of the House, which expressed concern that it might be used to allow otherwise harmful content, so there are clearly arguments on both sides of the debate. The obligation to have regard does have some weight, because the issue cannot be completely ignored. I do not think it would be adequate to simply pay lip service to it and not give it any real regard, so I would not dismiss the legislation as drafted.

I would point to the clauses that we have recently discussed, such as clause 15, under which content of democratic importance—which includes debating current issues and not just stuff said by an MP or candidate—gets additional protection. Some of the content that my hon. Friend the Member for Don Valley referred to a second ago would probably also get protection under clause 14, under which content of democratic importance has to be taken into account when making decisions about taking down or removing particular accounts. I hope that provides some reassurance that this is a significant step forward compared with where the internet is today.

Alex Davies-Jones

I share the Minister’s sentiments about the Bill protecting free speech; we all want to protect that. He mentions some of the clauses we debated on Tuesday regarding democratic importance. Some would say that debating this Bill is of democratic importance. Since we started debating the Bill on Tuesday, and since I have mentioned some of the concerns raised by stakeholders and others about the journalistic exemption and, for example, Tommy Robinson, my Twitter mentions have been a complete sewer—as everyone can imagine. One tweet I received in the last two minutes states:

“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”

in this country. Does the Minister agree that that is content of democratic importance, given we are debating this Bill, and that it should remain on Twitter?

Chris Philp

That sounds like a very offensive tweet. Could the hon. Lady read it again? I didn’t quite catch it.

Alex Davies-Jones

Yes:

“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”

in this country. It goes on:

“this is a toxic combination of bloc vote grubbing and woke”

culture, and there is a lovely GIF to go with it.

Chris Philp

I do not want to give an off-the-cuff assessment of an individual piece of content—not least because I am not a lawyer. It does not sound like it meets the threshold of illegality. It most certainly is offensive, and that sort of matter is one that Ofcom will set out in its codes of practice, but there is obviously a balance between freedom of speech and content that is harmful, which the codes of practice will delve into. I would be interested if the hon. Lady could report that to Twitter and then report back to the Committee on what action it takes.

Alex Davies-Jones

Yes, I will do that right now and see what happens.

Chris Philp

At the moment, there is no legal obligation to do anything about it, which is precisely why this Bill is needed, but let us put it to the test.

Question put and agreed to.

Clause 19 accordingly ordered to stand part of the Bill.

Clause 20

Record-keeping and review duties

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss clause 30 stand part.

Barbara Keeley

Record-keeping and review duties on in-scope services make up an important function of the regulatory regime that we are discussing today. Platforms will need to report all harms identified and the action taken in response to this, in line with regulation. The requirements to keep records of the action taken in response to harm will be vital in supporting the regulator to make effective decisions about regulatory breaches and whether company responses are sufficient. That will be particularly important to monitor platforms’ responses through risk assessments—an area where some charities are concerned that we will see under-reporting of harms to evade regulation.

Evidence of under-reporting can be seen in the various transparency reports that are currently being published voluntarily by sites, where we are not presented with the full picture and scale of harm and the action taken to address that harm is thus obscured.

As with other risk assessments, the provisions in clauses 20 and 30 could be strengthened through a requirement on in-scope services to publish their risk assessments. We have made that point many times. Greater transparency would allow researchers and civil society to track harms and hold services to account.

Chris Philp

The shadow Minister has eloquently introduced the purpose and effect of the clause, so I shall not repeat what she has said. On her point about publication, I repeat the point that I made on Tuesday, which is that the transparency requirements—they are requirements, not options—set out in clause 64 oblige Ofcom to ensure the publication of appropriate information publicly in exactly the way she requests.

Question put and agreed to.

Clause 20 accordingly ordered to stand part of the Bill.

Clauses 21 to 24 ordered to stand part of the Bill.

Clause 25

Children’s risk assessment duties

Amendment proposed: 16, in clause 25, page 25, line 10, at end insert—

“(3A) A duty for the children’s risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties, and reports directly into the most senior employee of the entity.” —(Alex Davies-Jones.)

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for children’s risk assessments.

Division 14

Ayes: 7

Labour: 5
Scottish National Party: 2

Noes: 10

Conservative: 10

Clause 25 ordered to stand part of the Bill.

Clauses 26 to 30 ordered to stand part of the Bill.

Clause 31

Children’s access assessments
The Chair

I call Kirsty Blackman to move amendment 22. [Interruption.] Sorry—my bad, as they say. I call Barbara Keeley to move amendment 22.

Barbara Keeley

I beg to move amendment 22, in clause 31, page 31, line 17, leave out subsection (3).

This amendment removes the condition that applies a child use test to a service or part of a service.

The Chair

With this it will be convenient to discuss the following:

Clause stand part.

Clause 32 stand part.

That schedule 3 be the Third schedule to the Bill.

Clause 33 stand part.

Barbara Keeley

The purpose of the amendment is to remove the child use test from the children’s access assessment and to make sure that any service likely to be accessed by children is within the scope of the child safety duty. The amendment is supported by the NSPCC and other children’s charities.

Children require protection wherever they are online. I am sure that every Committee member believes that. The age-appropriate design code from the Information Commissioner’s Office requires all services that are likely to be accessed by children to provide high levels of data protection and privacy. Currently, the Bill will regulate only user-to-user and search services that have a significant number of child users or services for which children form a significant part of their user base. It will therefore not apply to all services that fall within the scope of the ICO’s code, creating a patchwork of regulation that could risk uncertainty, legal battles and unnecessary complexity. It might also create a perverse incentive for online services to stall the introduction of their child safety measures until Ofcom has the capacity to investigate and reach a determination on the categorisation of their sites.

The inclusion of a children’s access assessment in the Bill may result in lower standards of protection, with highly problematic services such as Telegram and OnlyFans able to claim that they are excluded from the child safety duties because children do not account for a significant proportion of their user base. However, evidence has shown that children have been able to access those platforms.

Other services will remain out of the scope of the Bill as currently drafted. They include harmful blogs that promote life-threatening behaviours, such as pro-anorexia sites with provider-generated rather than user-generated content; some of the most popular games among children that do not feature user-generated content but are linked to increasing gambling addiction among children, and through which some families have lost thousands of pounds; and other services with user-generated content that is harmful but does not affect an appreciable number of children. That risks dozens, hundreds or even thousands of children falling unprotected.

Parents have the reasonable expectation that, under the new regime introduced by the Bill, children will be protected wherever they are online. They cannot be expected to be aware of exemptions or distinctions between categories of service. They simply want their children to be protected and their rights upheld wherever they are.

As I say, children have the right to be protected from harmful content and activity by any platform that gives them access. That is why the child user condition in clause 31 should be deleted from the Bill. As I have said, the current drafting could leave problematic platforms out of scope if they were to claim that they did not have a significant number of child users. It should be assumed that platforms are within the scope of the child safety duties unless they can provide evidence that children cannot access their sites, for example through age verification tools.

Although clause 33 provides Ofcom with the power to determine that a platform is likely to be accessed by children, this will necessitate Ofcom acting on a company-by-company basis to bring problematic sites back into scope of the child safety duties. That will take considerable time, and it will delay children receiving protection. It would be simpler to remove the child user condition from clause 31, as I have argued.

12:45
It is welcome that schedule 3 specifies the timing of service providers’ risk assessments and children’s access assessments. Three months from the publication of Ofcom guidance to the completion of the service assessments is ample time. What is concerning, as we have heard from contributions this morning, is the long delay that children have already faced in gaining protections online. We know that the situation has become very bad.

As I understand it, the duties on Ofcom to provide the necessary guidance on risk assessments and children’s access assessments will come into force only on such a date as the Secretary of State may, by regulations, appoint, because the measure is not one of those listed in clause 193(1). That means that children and adults may continue to be exposed to harm for a significant further stretch of time. Can the Minister offer any clarification as to when Ofcom will be required to publish guidance? After the disappointing flop of part 3 of the Digital Economy Act 2017 not being implemented, what reassurances can the Minister offer that this regime will come into effect as soon as possible?
The Chair

I definitely call Kirsty Blackman this time.

Kirsty Blackman

I would have been quite happy to move the amendment, but I do not think the Opposition would have been terribly pleased with me if I had stolen it. I have got my name on it, and I am keen to support it.

As I have said, I met the NSPCC yesterday, and we discussed how clause 31(3) might work, should the Minister decide to keep it in the Bill and not accept the amendment. There are a number of issues with the clause, which states that the child user condition is met if

“a significant number of children”

are users of the service, or if the service is

“likely to attract a significant number of users who are children”.

I do not understand how that could work. For example, a significant number of people who play Fortnite are adults, but a chunk of people who play it are kids. If some sort of invisible percentage threshold is applied in such circumstances, I do not know whether that threshold will be met. If only 20% of Fortnite users are kids, and that amounts only to half a million children, will that count as enough people to meet the child access assessment threshold?

Fortnite is huge, but an appropriate definition is even more necessary for very small platforms and services. With the very far-right sites that we have mentioned, it may be that only 0.5% of their users are children, and that may amount only to 2,000 children—a very small number. Surely, because of the risk of harm if children access these incredibly damaging and dangerous sites that groom people for terrorism, they should have a duty to meet the child access requirement threshold, if only so that we can tell them that they must have an age verification process—they must be able to say, “We know that none of our users are children because we have gone through an age verification process.” I am keen for children to be able to access the internet and meet their friends online, but I am keen for them to be excluded from these most damaging sites. I appreciate the action that the Government have taken in relation to pornographic content, but I do not think that this clause allows us to go far enough in stopping children accessing the most damaging content that is outwith pornographic content.

The other thing that I want to raise is about how the number of users will be calculated. The Minister made it very clear earlier on, and I thank him for doing so, that an individual does not have to be a registered user to be counted as a user of a site. People can be members of TikTok, for example, only if they are over 13. TikTok has some hoops in place—although they are not perfect—to ensure that its users are over 13, and to be fair, it does proactively remove users that it suspects are under 13, particularly if they are reported. That is a good move.

My child is sent links to TikTok videos through WhatsApp, however. He clicks on the links and is able to watch the videos, which will pop up in the WhatsApp mini-browser thing or in the Safari browser. He can watch the videos without signing up as a registered user of TikTok and without using the platform itself—the videos come through Safari, for example, rather than through the app. Does the Minister expect that platforms will count those people as users? I suggest that the majority of people who watch TikTok by those means are doing so because they do not have a TikTok account. Some will not have accounts because they are under 13 and are not allowed to by TikTok or by the parental controls on their phones.

My concern is that, if the Minister does not provide clarity on this point, platforms will count just the number of registered users, and will say, “It’s too difficult for us to look at the number of unregistered users, so in working out whether we meet the criteria, we are not even going to consider people who do not access our specific app or who are not registered users in some way, shape or form.” I have concerns about the operation of the provisions and about companies using that “get out of jail free” card. I genuinely believe that the majority of those who access TikTok other than through its platform are children and would meet the criteria. If the Minister is determined to keep subsection (3) and not accept the amendment, I feel that he should make it clear that those users must be included in the counting by any provider assessing whether it needs to fulfil the child safety duties.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I agree with the hon. Lady’s important point, which feeds into the broader question of volume versus risk—no matter how many children see something that causes harm and damage, one is one too many—and the categorisation of service providers into categories 1, 2A and 2B. The depth of the risk is the problem, rather than the number of people who might be affected. The hon. Lady also alluded to age verification—I am sure we will come to that at some point—which is another can of worms. The important point, which she made well, is about volume versus risk. The point is not how many children see something; even if only a small number of children see something, the damage has been done.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I absolutely agree. In fact, I have tabled an amendment to widen category 1 to include sites with the highest risk of harm. The Minister has not said that he agrees with my amendment specifically, but he seems fairly amenable to increasing and widening some duties to include the sites of highest risk. I have also tabled another new clause on similar issues.

I am glad that these clauses are in the Bill—a specific duty in relation to children is important and should happen—but as the shadow Minister said, clause 31(3) is causing difficulty. It is causing difficulty for me and for organisations such as the NSPCC, which is unsure how the provisions will operate and whether they will do so in the way that the Government would like.

I hope the Minister will answer some of our questions when he responds. If he is not willing to accept the amendment, will he give consideration to how the subsection could be amended in the future—we have more stages, including Report and scrutiny in the other place—to ensure that there is clarity and that the intention behind the provision is followed through, rather than remaining an intention that is not actually translated into law?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Colleagues have spoken eloquently to the purpose and effect of the various clauses and schedule 3—the stand part component of this group. On schedule 3, the shadow Minister, the hon. Member for Worsley and Eccles South, asked about timing. The Government share her desire to get this done as quickly as possible. In its evidence a couple of weeks ago, Ofcom said it would be publishing its road map before the summer, which would set out the timetable for moving all this forward. We agree that that is extremely important.

I turn to one or two questions that arose on amendment 22. As always, the hon. Member for Aberdeen North asked a number of very good questions. The first was whether the concept of a “significant number” applied to a number in absolute terms or a percentage of the people using a particular service, and which is looked at when assessing what is significant. The answer is that it can be either—either a large number in absolute terms, by reference to the population of the whole United Kingdom, or a percentage of those using the service. That is expressed in clause 31(4)(a). Members will note the “or” there. It can be a number in proportion to the total UK population or the proportion using a service. I hope that answers the hon. Member’s very good question.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

My concern is about services that meet neither of those criteria—they do not meet the “significant number” criterion in percentage terms because, say, only 0.05% of their users are children, and they do not meet it in population terms, because they are a pretty small platform and have only, say, 1,000 child users—but whose child users are at very high risk because of the nature of the platform or the service provided. My concern is for those at highest risk where neither of the criteria is met and the service does not have to bother conducting any sort of age verification or access requirements.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am concerned to ensure that children are appropriately protected, as the hon. Lady sets out. Let me make a couple of points in that area before I address that point.

The hon. Lady asked another question earlier, about video content. She gave the example of TikTok videos being viewed or accessed not directly on TikTok but via some third-party means, such as a WhatsApp message. First, it is worth emphasising again that in order to count as a user, a person does not have to be registered and can simply be viewing the content. Secondly, if someone is viewing something through another service, such as WhatsApp—the hon. Lady used the example of browsing the internet on another site—the duty will bite at the level of WhatsApp, and it will have to consider the content that it is providing access to. As I said, someone does not have to be registered with a service in order to count as a user of that service.

On amendment 22, there is a drafting deficiency, if I may put it politely—this is a point of drafting rather than of principle. The amendment would simply delete subsection (3), but there would still be references to the “child user condition”—for example, the one that appears on the same page of the Bill at line 11. If the amendment were adopted as drafted, it would end up leaving references to “child user condition” in the Bill without defining what it meant, because we would have deleted the definition.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Is the Minister coming on to say that he is accepting what we are saying here?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

No, is the short answer. I was just mentioning in passing that there is that drafting issue.

On the principle, it is worth being very clear that, when it comes to content or matters that are illegal, that applies to all platforms, regardless of size, where children are at all at risk. In schedule 6, we set out a number of matters—child sexual exploitation and abuse, for example—as priority offences that all platforms have to protect children from proactively, regardless of scale.

13:00
Of course, anything to do with children that is illegal falls under the legal duties that we have discussed already. Anything that touches on illegality is covered, notwithstanding this clause, which deals with topics where the subject, act or content is not illegal. It is important to keep that in mind.
Other areas include gambling, which the shadow Minister mentioned. There is separate legislation—very strong legislation—that prohibits children from being involved in gambling. That stands independently of this Bill, so I hope that the Committee is assured—
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The Minister has not addressed the points I raised. I specifically raised—he has not touched on this—harmful pro-anorexia blogs, which we know are dangerous but are not in scope, and games that children access that increase gambling addiction. He says that there is separate legislation for gambling addiction, but families have lost thousands of pounds through children playing games linked to gambling addiction. There are a number of other services that do not affect an appreciable number of children, and the drafting causes them to be out of scope.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

rose—[Interruption.]

None Portrait The Chair
- Hansard -

There is no hard and fast rule about moving the Adjournment motion. It is up to the Government Whip.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have a few more things to say, but I am happy to finish here if it is convenient.

Ordered, That the debate be now adjourned.—(Steve Double.)

13:02
Adjourned till this day at Two o’clock.

Online Safety Bill (Eighth sitting)

Committee stage
Thursday 9th June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 9 June 2022 - (9 Jun 2022)
The Committee consisted of the following Members:
Chairs: Sir Roger Gale, † Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
Moore, Damien (Southport) (Con)
† Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Thursday 9 June 2022
(Afternoon)
[Christina Rees in the Chair]
Online Safety Bill
Clause 31
Children’s access assessments
Amendment proposed (this day): 22, in clause 31, page 31, line 17, leave out subsection (3).—(Barbara Keeley.)
This amendment removes the condition that applies a child use test to a service or part of a service.
14:00
Question again proposed, That the amendment be made.
None Portrait The Chair
- Hansard -

I remind the Committee that with this we are discussing the following:

Clause stand part.

Clause 32 stand part.

That schedule 3 be the Third schedule to the Bill.

Clause 33 stand part.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

When the sitting was suspended for lunch, I was concluding my remarks and saying that where children are the victims of illegal activity or illegal content, all of that is covered in other aspects of the Bill. For areas such as gambling, we have separate legislation that protects children. In relation to potentially harmful content, the reason there is a “significant number” test for the child user condition that we are debating is that, without it, platforms that either would not have any children accessing them or would have nothing of any concern on them—such as a website about corporation tax—would have an unduly burdensome and disproportionate obligation placed on them. That is why there is the test—just to ensure that there is a degree of proportionality in these duties. We find similar qualifications in other legislation; that includes the way the age-appropriate design code works. Therefore, I respectfully resist the amendment.

Question put, That the amendment be made.

Division 15

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

Clause 31 ordered to stand part of the Bill.
Clause 32 ordered to stand part of the Bill.
Schedule 3 agreed to.
Clause 33 ordered to stand part of the Bill.
Clause 34
Duties about fraudulent advertising: Category 1 services
Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

I beg to move amendment 23, in clause 34, page 33, line 41, after “service” insert “that targets users”.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 24, in clause 35, page 34, line 34, after “service” insert “that targets users”.

New clause 5—Duty to distinguish paid-for advertisements

“(1) A provider of a Category 2A service must operate the service using systems and processes designed to clearly distinguish to users of that service paid-for advertisements from all other content appearing in or via search results of the service.

(2) The systems and processes described under subsection (1)—

(a) must include clearly displaying the words “paid-for advertisement” next to any paid-for advertisement appearing in or via search results of the service, and

(b) may include measures such as but not limited to the application of colour schemes to paid-for advertisements appearing in or via search results of the service.

(3) The reference to paid-for advertisements appearing “in or via search results of a search service” does not include a reference to any advertisements appearing as a result of any subsequent interaction by a user with an internet service other than the search service.

(4) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend to the design, operation and use of a Category 2A service that hosts paid-for advertisements targeted at users of that service in the United Kingdom.

(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).

(7) For the meaning of “paid-for advertisement”, see section 189 (interpretation: general).”

New clause 6—Duty to verify advertisements

“(1) A provider of a Category 2A service must operate an advertisement verification process for any relevant advertisement appearing in or via search results of the service.

(2) In this section, “relevant advertisement” means any advertisement for a service or product to be designated in regulations made by the Secretary of State.

(3) The verification process under subsection (1) must include a requirement for advertisers to demonstrate that they are authorised by a UK regulatory body.

(4) In this section, “UK regulatory body” means a UK regulator responsible for the regulation of a particular service or product to be designated in regulations made by the Secretary of State.

(5) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.

(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).

(7) Regulations under this section shall be made by statutory instrument.

(8) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before and approved by resolution of each House of Parliament.”

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I begin by thanking my hon. Friend the Member for Washington and Sunderland West (Mrs Hodgson) for her work on drafting these amendments and others relating to this chapter, which I will speak to shortly. She has campaigned excellently over many years in her role as chair of the all-party parliamentary group on ticket abuse. I attended the most recent meeting of that group back in April to discuss what we need to see changed in the Bill to protect people from scams online. I am grateful to those who have supported the group and the anti-ticket touting campaign for their insights.

It is welcome that, after much flip-flopping, the Government have finally conceded to Labour’s calls and those of many campaign groups to include a broad duty to tackle fraudulent advertising on search engines through chapter 5 of part 3 of the Bill. We know that existing laws to protect consumers in the online world have failed to keep pace with the actors attempting to exploit them, and that is particularly true of scams and fraudulent advertisements.

Statistics show a steep increase in this type of crime in the online world, although those figures are likely to be a significant underestimate and do not capture the devastating emotional impact that scams have on their victims. The scale of the problem is large and it is growing.

The Financial Conduct Authority estimates that fraud costs the UK up to £190 billion a year, with 86% of that fraud committed online. We know those figures are increasing. The FCA more than doubled the number of scam warnings it issued between 2019 and 2020, while UK Finance data shows that there has been a significant rise in cases across all scam types as criminals adapt to targeting victims online. The pandemic, which led to a boom in internet shopping, created an environment ripe for exploitation. Reported incidents of scams and fraud have increased by 41% since before the pandemic, with one in 10 of us now victims of fraud.

Being scammed can cause serious psychological harm. Research by the Money and Mental Health Policy Institute suggests that three in 10 online scam victims felt depressed as a result of being scammed, while four in 10 said they felt stressed. Clearly, action to tackle the profound harms that result from fraudulent advertising is long overdue.

This Bill is an important opportunity but, as with other issues the Government are seeking to address, we need to see changes if it is to be successful. Amendments 23 and 24 are small and very simple, but would have a profound impact on the ability of the Bill to prevent online fraud from taking place and to protect UK users.

As currently drafted, the duties set out in clauses 34 and 35 for category 1 and 2A services extend only to the design, operation and use of a category 1 or 2A service in the United Kingdom. Our amendments would mean that the duties extended to the design, operation and use of a category 1 or 2A service that targets users in the United Kingdom. That change would make the Bill far more effective, because it would reduce the risk of a company based overseas being able to target UK consumers without any action being taken against them—being allowed to target the public fraudulently without fear of disruption.

That would be an important change, because paid-for advertisements function by the advertiser stating where in the world, by geographical location, they wish to target consumers. For instance, a company would be able to operate from Hong Kong and take out paid-for advertisements to target consumers just in one particular part of north London. The current wording of the Bill does not acknowledge the fact that internet services can operate from anywhere in the world and use international boundaries to circumvent UK legislation.

Other legislation has been successful in tackling scams across borders. I draw the Committee’s attention to the London Olympic Games and Paralympic Games Act 2006, which made it a crime to sell a ticket to the Olympics on the black market anywhere in the world, rather than simply in the UK, where the games took place. I suggest that we should learn from the action taken to regulate the Olympics back in 2012 and implement the same approach through amendments 23 and 24.

New clause 5 was also tabled by my hon. Friend the Member for Washington and Sunderland West, who will be getting a lot of mentions this afternoon.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

New clause 5 would tackle one of the reasons people become subject to fraud online by introducing a duty for search engines to ensure that all paid-for search advertisements should be made to look distinct from non-paid-for search results. When bad actors are looking to scam consumers, they often take out paid-for advertising on search results, so that they can give consumers the false impression that their websites are official and trustworthy.

Paid search results occur when companies pay a charge to have their site appear at the top of search results. This is valuable to them because it is likely to direct consumers towards their site. The new clause would stop scam websites buying their way to the top of a search result.

Let me outline some of the consequences of not distinguishing between paid-for and not-paid-for advertisements, because they can be awful. Earlier this year, anti-abortion groups targeted women who were searching online for a suitable abortion clinic. The groups paid for the women to have misleading adverts at the top of their search that directed them towards an anti-abortion centre rather than a clinic. One woman who knew that she wanted to have an abortion went on researching where she could have the procedure. Her search for a clinic on Google led her to an anti-abortion centre that she went on to contact and visit. That was because she trusted the top search results on Google, which were paid for. The fact that it was an advertisement was indicated only by the two letters “AD” appearing in very small font underneath the search headline and description.

Another example was reported by The Times last year. Google had been taking advertising money from scam websites selling premier league football tickets, even though the matches were taking place behind closed doors during lockdown. Because these advertisements appeared at the top of search results, it is entirely understandable that people looking for football tickets were deceived into believing that they would be able to attend the games, which led to them being scammed.

There have been similar problems with passport renewals. As colleagues will be very aware, people have been desperately trying to renew their passports amid long delays because of the backlog of cases. This is a target for fraudsters, who take out paid advertisements to offer people assistance with accessing passport renewal services and then scam them.

New clause 5 would end this practice by ensuring that search engines provide clear messaging to show that the user is looking at a paid-for advertisement, by stating that clearly and through other measures, such as a separate colour scheme. A duty to distinguish paid-for advertising is present in many other areas of advertising. For example, when we watch TV, there is no confusion between what is a programme and what is an advert; the same is true of radio advertising; and when someone is reading a newspaper or magazine, the line between journalism and the advertisements that fund the paper is unmistakable.

We cannot continue to have these discrepancies and be content with the internet being a wild west. Therefore, it is clear that advertising on search engines needs to be brought into line with advertising in other areas, with a requirement on search engines to distinguish clearly between paid-for and organic results.

New clause 6 is another new clause tabled by my hon. Friend the Member for Washington and Sunderland West. It would protect consumers from bad actors trying to exploit them online by placing a duty on search engines to verify adverts before they accept them. That would mean that, before their adverts were allowed to appear in a paid-for search result, companies would have to demonstrate that they were authorised by a UK regulatory body designated by the Secretary of State.

This methodology for preventing fraud is already in process for financial crime. Google only accepts financial services advertisements from companies that are authorised by the Financial Conduct Authority. This gives companies a further incentive to co-operate with regulators and it protects consumers by preventing companies that are well known for their nefarious activities from dominating search results and then misleading consumers. By extending this best practice to all advertisements, search engines would no longer be able to promote content that is fake or fraudulent after being paid to do so.

Without amending the Bill in this way, we risk missing an opportunity to tackle the many forms of scamming that people experience online, one of which is the world of online ticketing. In my role as shadow Minister for the arts and civil society, I have worked on this issue and been informed by the expertise of my hon. Friend the Member for Washington and Sunderland West.

In the meeting of the all-party parliamentary group on ticket abuse in April, we heard about the awful consequences of secondary ticket reselling practices. Ticket reselling websites, such as Viagogo, are rife with fraud. Large-scale ticket touts dominate the resale site, and Viagogo has a well-documented history of breaching consumer protection laws. Those breaches include a number of counts of fraud for selling non-existent tickets. Nevertheless, Viagogo continues to take out paid-for advertisements with Google and is continually able to take advantage of consumers by dominating search results and commanding false trust.

If new clause 6 is passed, then secondary ticketing websites such as Viagogo would have to be members of a regulatory body responsible for secondary ticketing, such as the Society of Ticket Agents and Retailers, or STAR. Viagogo would then have to comply with STAR standards for its business model to be successful.

I have used ticket touting as an example, but the repercussions of this change would be wider than that. Websites that sell holidays and flights, such as Skyscanner, would have to be a member of the relevant regulatory group, for example the Association of British Travel Agents. People would be able to go to football matches, art galleries and music festivals without fearing that they are getting ripped off or have been issued with fake tickets.

I will describe just a few examples of the poor situation we are in at the moment, to illustrate the need for change. The most heartbreaking one is of an elderly couple who bought two tickets from a secondary ticketing website to see their favourite artist, the late Leonard Cohen, to celebrate their 70th wedding anniversary. When the day came around and they arrived at the venue, they were turned away and told they had been sold fake tickets. The disappointment they must have felt would have been very hard to bear. In another instance, a British soldier serving overseas decided to buy his daughter concert tickets because he could not be with her on her birthday. When his daughter went along to the show, she was turned away at the door and told she could not enter because the tickets had been bought through a scam site and were invalid.

14:15
It is clear that the human impact of inaction is too great to ignore. Not only are victims scammed out of their money, but they go through intense stress and experience shame and humiliation. The Government have accepted the urgent need for action by following the advice of campaigners and the Joint Committee in including fraudulent advertising in the Bill, but more must be done if we are to prevent online fraud. By requiring search engines to verify advertisers before accepting their money, traders such as Viagogo will have an incentive to act responsibly and to comply with regulatory bodies.
Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

I rise to agree with all the amendments in this group that have been tabled by the Opposition. I want to highlight a couple of additional groups who are particularly at risk in relation to fraudulent advertising. One of those is pensioners and people approaching pension age. Because of the pension freedoms that are in place, we have a lot of people making uninformed decisions about how best to deal with their pensions, and sometimes they are able to withdraw a significant amount of money in one go. For an awful lot of people, withdrawing that money and paying the tax on it leads to a major financial loss—never mind the next step that they may take, which is to provide the money to fraudsters.

For pensioners in particular, requiring adverts to be clearly different from other search results would make a positive difference. The other thing that we have to remember is that pensioners generally did not grow up online, and some of them struggle more to navigate the internet than some of us who are a bit younger.

John Nicolson Portrait John Nicolson (Ochil and South Perthshire) (SNP)
- Hansard - - - Excerpts

I speak with some experience of this issue, because I had a constituent who was a pensioner and who was scammed out of £20,000—her life savings. Does my hon. Friend realise that it is sometimes possible to pressurise the banks into returning the money? In that particular case, I got the money back for my constituent by applying a great deal of pressure on the bank, and it is worth knowing that the banks are susceptible to a bit of publicity. That is perhaps worth bearing in mind, because it is a useful power that we have as Members of Parliament.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I thank my hon. Friend for his public service announcement. His constituent is incredibly lucky that my hon. Friend managed to act in that way and get the money back to her, because there are so many stories of people not managing to get their money back and losing their entire life savings as a result of scams. It is the case that not all those scams take place online—people can find scams in many other places—but we have the opportunity with the Bill to take action on scams that are found on the internet.

The other group I want to mention, and for whom highlighting advertising could make a positive difference, is people with learning disabilities. People with learning disabilities who use the internet may not understand the difference between adverts and search results, as the hon. Member for Worsley and Eccles South mentioned. They are a group who I would suggest are particularly susceptible to fraudulent advertising.

We are speaking a lot about search engines, but a lot of fraudulent advertising takes place on Facebook and so on. Compared with the majority of internet users, there is generally an older population on such sites, and the ability to tackle fraudulent advertising there is incredibly useful. We know that the sites can do it, because there are rules in place now around political advertising on Facebook, for example. We know that it is possible for them to take action; it is just that they have not yet taken proper action.

I am happy to support the amendments, but I am also glad that the Minister has put these measures in the Bill, because they will make a difference to so many of our constituents.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Aberdeen North for her latter remarks. We made an important addition to the Bill after listening to parliamentarians across the House and to the Joint Committee, which many people served on with distinction. I am delighted that we have been able to make that significant move. We have heard a lot about how fraudulent advertising can affect people terribly, particularly more vulnerable people, so that is an important addition.

Amendments 23 and 24 seek to make it clear that where the target is in the UK, people are covered. I am happy to assure the Committee that that is already covered, because the definitions at the beginning of the Bill—going back to clause 3(5)(b), on page 3—make it clear that companies are in scope, both user-to-user and search, if there is a significant number of UK users or where UK users form one of the target markets, or are the only target market. Given the reference to “target markets” in the definitions, I hope that the shadow Minister will withdraw the amendment, because the matter is already covered in the Bill.

New clause 5 raises important points about the regulation of online advertising, but that is outside the purview of what the Bill is trying to achieve. The Government are going to work through the online advertising programme to tackle these sorts of issues, which are important. The shadow Minister is right to raise them, but they will be tackled holistically by the online advertising programme, and of course there are already codes of practice that apply and are overseen by the Advertising Standards Authority. Although these matters are very important and I agree with the points that she makes, there are other places where those are best addressed.

New clause 6 is about the verification process. Given that the Bill is primary legislation, we want to have the core duty to prevent fraudulent advertising in the Bill. How that is implemented in this area, as in many others, is best left to Ofcom and its codes of practice. When Ofcom publishes the codes of practice, it might consider such a duty, but we would rather leave Ofcom, as the expert regulator, with the flexibility to implement that via the codes of practice and leave the hard-edged duty in the Bill as drafted.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

We are going to press amendments 23 and 24 to a vote because they are very important. I cited the example of earlier legislation that considered it important, in relation to selling tickets, to include the wording “anywhere in the world”. We know that ticket abuses happen with organisations in different parts of the world.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady is perfectly entitled to press to a vote whatever amendments she sees fit, but in relation to amendments 23 and 24, the words she asks for,

“where the UK is a target market”,

are already in the Bill, in clause 3(5)(b), on page 3, which set out the definitions at the start. I will allow the hon. Lady a moment to look at where it states:

“United Kingdom users form one of the target markets for the service”.

That applies to user-to-user and to search, so it is covered already.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The problem is that we are getting into the wording of the Bill. As with the child abuse clause that we discussed before lunch, there are limitations. Clause 3 states that a service has links with the United Kingdom if

“the service has a significant number of United Kingdom users”.

It does not matter if a person is one of 50, 100 or 1,000 people who get scammed by some organisation operating in another part of the world. The 2006 Act dealing with the sale of Olympic tickets recognised that that was important, and we also believe it is important. We have to find a way of dealing with ticket touting and ticket abuse.

Turning to fraudulent advertising, I have given examples and been supported very well by the hon. Member for Aberdeen North. It is not right that vulnerable people are repeatedly taken in by search results, which is the case right now. The reason we have tabled all these amendments is that we are trying to protect vulnerable people, as with every other part of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

That is of course our objective as well, but let me just return to the question of the definitions. The hon. Lady is right that clause 3(5)(a) says

“a significant number of United Kingdom users”,

but paragraph (b) just says,

“United Kingdom users form one of the target markets”.

There is no significant number qualification in paragraph (b), and to put it beyond doubt, clause 166(1) makes it clear that service providers based outside the United Kingdom are within the scope of the Bill. To reiterate the point, where the UK is a target market, there is no size qualification: the service provider is in scope, even if there is only one user.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Does the Minister want to say anything about the other points I made about advertisements?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Not beyond the points I made previously, no.

Question put, That the amendment be made.

Division 16

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

Question proposed, That the clause stand part of the Bill.
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 45, in clause 35, page 34, line 2, leave out subsection (1) and insert—

“(1) A provider of a Category 2A service must operate the service using proportionate systems and processes designed to—

(a) prevent individuals from encountering content consisting of fraudulent advertisements by means of the service;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.”

This amendment brings the fraudulent advertising provisions for Category 2A services in line with those for Category 1 services.

Government amendments 91 to 94.

Clause 35 stand part.

Amendment 44, in clause 36, page 35, line 10, at end insert—

“(4A) An offence under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008.”

This amendment adds further offences to those which apply for the purposes of the Bill’s fraudulent advertising provisions.

Clause 36 stand part.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I am aware that the Minister has reconsidered the clause and tabled a Government amendment that is also in this group, with the same purpose as our amendment 45. That is welcome, as there was previously no justifiable reason why the duties on category 1 services and category 2A services were misaligned.

All three of the duties on category 1 services introduced by clause 34 are necessary to address the harm caused by fraudulent and misleading online adverts. Service providers need to take proportionate but effective action to prevent those adverts from appearing or reappearing, and when they do appear, those service providers need to act quickly by swiftly taking them down. The duties on category 2A services were much weaker, only requiring them to minimise the risk of individuals encountering content consisting of fraudulent advertisements in or via search results of the service. There was no explicit reference to prevention, even though that is vital, or any explicit requirement to act quickly to take harmful adverts down.

That difference would have created an opportunity for fraudsters to exploit by focusing on platforms with lesser protections. It could have resulted in an increase in fraud enabled by paid-for advertising on search services, which would have undermined the aims of the Bill. I am glad that the Government have recognised this and will require the same proactive, preventative response to harmful ads from regulated search engines as is required from category 1 services.

14:29
I will now speak to amendment 44, which focuses on the loophole that exists with regard to harm resulting from exposure to fraudulent and misleading advertising for debt help and solutions. The debt advice charity StepChange told us that as many as 15% of people searching for StepChange and other debt advice charities online are routed away by deceptive adverts, resulting in a staggering 1.7 million click-throughs every year. These adverts impersonate the names and branding of the charities and make misleading claims about the services on offer. People exposed to these adverts will be people needing debt advice who will often be under intense emotional and financial pressure. They can therefore be very vulnerable to scammers who then push them towards unsuitable services for a fee.
Debt advice charities, including StepChange and the Money Advice Trust, have been working hard to tackle these impersonator ads. For instance, StepChange reported 72 adverts to the tech giants and regulators last year for misleading and harmful practices, only some of which the Advertising Standards Authority has issued rulings against. StepChange and the Money Advice Trust are keen to have the safeguards in place that are needed by the people who are most vulnerable to harm and exploitation, yet in the current drafting of the Bill harmful adverts on debt advice could slip through the net.
The conditions for an advert to be defined as fraudulent are set out in clause 34(3) for category 1 services and clause 35(3) for category 2A search services. Both clauses specify that an advert is fraudulent if it amounts to an offence set out in clause 36. Clause 36 lists a series of offences gathered from financial services legislation and the Fraud Act 2006.
Charities are concerned that fraudulent debt advice advertisements will not be captured by the offences set out in clause 36(2) contained in the Financial Services and Markets Act 2000, which relate to persons unauthorised by the Financial Conduct Authority carrying on an activity that is regulated under the Act. While providing debt counselling and debt adjusting are regulated activities, brokering debt solutions is not. Therefore the offences listed in the Bill would not seem to capture the unregulated advertisers behind misleading adverts, including those that impersonate debt advice charities.
Furthermore, the explanatory notes for the offences taken from the Financial Services Act 2012 show that these offences appear to be intended to address financial market abuse, and so seem somewhat at a distance from the harm consumers face from fraudulent online ads for debt help services.
Clause 36(3) lists offences under the Fraud Act 2006. This could capture harmful advertisements for debt help and debt solutions, but it is not completely clear that these provisions capture, or best capture, the nature of unfair practice caused by misleading online adverts for debt solutions. The Government’s announcement on 8 March outlined that fraudulent paid-for online adverts would be included in this Bill. However, they drew a distinction between “fraudulent adverts”, to be covered by the Bill, and “misleading adverts”, which will be considered in the online advertising consultation. In reality, this dividing line is not clear cut, even where the Bill seeks to define “fraudulent adverts” in terms of offences in other legislation.
Amendment 44 seeks to align the clause 36 offences better with important existing consumer protection legislation. It would insert further offences into clause 36 to include offences that are contained in part 3 of the existing Consumer Protection from Unfair Trading Regulations 2008. Those regulations are a key piece of consumer protection legislation. Part 3 of the regulations creates offences relating to misleading or aggressive practices. Most relevant here would be the regulation 9 offence of contravening the prohibition on “misleading actions”, which states that something is a misleading practice if it fulfils one of two conditions. The first is that it both contains “false information” and is likely to cause “the average consumer” to take a decision they would not otherwise have taken. The second is that it causes “confusion” with other products or trade names.
It has been pointed out that these regulations by themselves have not stopped vulnerable consumers being exposed to adverts of misleading debt solutions, despite the best efforts of regulators and charities to stop them. Adding offences under the consumer protection regulations to the Bill would finally close the net.
There should be no objection from the Government to this amendment. Through the consumer protection regulations, they have already recognised misleading commercial practices as an offence, including promotions that mislead consumers or create confusion over trade names. We therefore have a situation where harmful debt adverts meet the criteria of offence in consumer protection regulations, but might not meet the Fraud Act 2006 provisions in the Online Safety Bill. The amendment seeks to clarify and align the treatment of misleading debt adverts, which can be so harmful to people.
I admit that these amendments can get very technical, but it is important that I finish by talking about the impact of these scams on people’s lives. I want to talk about the experience of a woman who was recommended to StepChange’s debt advice services but clicked on a copycat debt ad from a firm masquerading as StepChange in the online search results. After entering her personal information into what she thought was a genuine website, the woman was pestered by phone calls into setting up an individual voluntary arrangement, or IVA, and made a series of payments worth £650 that were meant for her creditors. Sadly, it was only after contact from her bank, four months later, that the woman realised the debt firm she had clicked on was a scam.
The Bill offers a chance to establish an important principle. People should be able to have confidence that the links they click on are for reputable regulated advice services. People should not have to be constantly on their guard against scams and other misleading promotions found on social media websites and in top-of-the-page search results. Without this amendment and the others to this chapter, we cannot be sure that those outcomes will be achieved.
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As we have heard already, these clauses are very important because they protect people from online fraudulent advertisements for the first time—something that the whole House quite rightly called for. As the shadow Minister said, the Government heard Parliament’s views on Second Reading, and the fact that the duties in clause 35 were not as strongly worded as those in clause 34 was recognised. The Government heard what Members said on Second Reading and tabled Government amendments 91 to 94, which make the duties on search firms in clause 35 as strong as those on user-to-user firms in clause 34. Opposition amendment 45 would essentially do the same thing, so I hope we can adopt Government amendments 91 to 94 without needing to move amendment 45. It would do exactly the same thing—we are in happy agreement on that point.

I listened carefully to what the shadow Minister said on amendment 44. The example she gave at the end of her speech—the poor lady who was induced into sending money, which she thought was being sent to pay off creditors but was, in fact, stolen—would, of course, be covered by the Bill as drafted, because it would count as an act of fraud.

The hon. Lady also talked about some other areas that were not fraud, such as unfair practices, misleading statements or statements that were confusing, which are clearly different from fraud. The purpose of clause 35 is to tackle fraud. Those other matters are, as she says, covered by the Consumer Protection from Unfair Trading Regulations 2008, which are overseen and administered by the Competition and Markets Authority. While matters to do with unfair, misleading or confusing content are serious—I do not seek to minimise their importance—they are overseen by a different regulator and, therefore, better handled by the CMA under its existing regulations.

If we introduced this extra offence into the list in clause 36, we would end up with a bit of regulatory overlap and confusion, because there would be two regulators involved. For that reason, and because those other matters—unfair, misleading and confusing advertisements—are different from fraud, I ask that the Opposition withdraw amendment 44 and, perhaps, take it up on another occasion when the CMA’s activities are in the scope of the debate.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

No, we want to press this amendment to a vote. I have had further comment from the organisations that I quoted. They believe that we do need the amendment because it is important to stop harmful ads going up in the first place. They believe that strengthened provisions are needed for that. Guidance just puts the onus for protecting consumers on the other regulatory regimes that the Minister talked about. The view of organisations such as StepChange is that those regimes—the Advertising Standards Authority regime—are not particularly strong.

The regulatory framework for financial promotion is fragmented. FCA-regulated firms are clearly under much stronger obligations than those that fall outside FCA regulation. I believe that it would be better to accept the amendment, which would oblige search engines and social media giants to prevent harmful and deceptive ads from appearing in the first place. The Minister really needs to take on board the fact that, in this patchwork, this fragmented world of different regulatory systems, some of the existing systems are clearly failing badly, and the strong view of expert organisations is that the amendment is necessary.

Question put and agreed to.

Clause 34 accordingly ordered to stand part of the Bill.

Clause 35

Duties about fraudulent advertising: Category 2A services

Amendments made: 91, in clause 35, page 34, line 3, leave out from “to” to end of line 5 and insert—

“(a) prevent individuals from encountering content consisting of fraudulent advertisements in or via search results of the service;

(b) if any such content may be encountered in or via search results of the service, minimise the length of time that that is the case;

(c) where the provider is alerted by a person to the fact that such content may be so encountered, or becomes aware of that fact in any other way, swiftly ensure that individuals are no longer able to encounter such content in or via search results of the service.”

This amendment alters the duty imposed on providers of Category 2A services relating to content consisting of fraudulent advertisements so that it is in line with the corresponding duty imposed on providers of Category 1 services by clause 34(1).

Amendment 92, in clause 35, page 34, line 16, leave out “reference” and insert “references”.

This amendment is consequential on Amendment 91.

Amendment 93, in clause 35, page 34, line 18, leave out “is a reference” and insert “are references”.

This amendment is consequential on Amendment 91.

Amendment 94, in clause 35, page 34, line 22, leave out

“does not include a reference”

and insert “do not include references”.—(Chris Philp.)

This amendment is consequential on Amendment 91.

Clause 35, as amended, ordered to stand part of the Bill.

Clause 36

Fraud etc offences

Amendment proposed: 44, in clause 36, page 35, line 10, at end insert—

“(4A) An offence under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008.”—(Barbara Keeley.)

This amendment adds further offences to those which apply for the purposes of the Bill’s fraudulent advertising provisions.

Question put, That the amendment be made.

Division 17

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

Clause 36 ordered to stand part of the Bill.
Clause 37
Codes of practice about duties
None Portrait The Chair
- Hansard -

Amendment 96 has been tabled by Carla Lockhart, who is not on the Committee. Does anyone wish to move amendment 96? No.

14:45
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 65, in clause 37, page 36, line 27, at end insert—

“(ia) organisations that campaign for the removal of animal abuse content, and”.

This amendment would add organisations campaigning for the removal of animal abuse content to the list of bodies Ofcom must consult.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 63, in schedule 4, page 176, line 29, at end insert “and

(x) there are adequate safeguards to monitor cruelty towards humans and animals;”.

This amendment would ensure that ensuring adequate safeguards to monitor cruelty towards humans and animals is one of the online safety objectives for user-to-user services.

Amendment 64, in schedule 4, page 177, line 4, at end insert “and

(vii) the systems and process are appropriate to detect cruelty towards humans and animals;”.

This amendment would ensure that ensuring systems and processes are appropriate to detect cruelty towards humans and animals is one of the online safety objectives for search services.

Amendment 60, in clause 52, page 49, line 5, at end insert—

“(e) an offence, not within paragraph (a), (b) or (c), of which the subject is an animal.”

This amendment brings offences to which animals are subject within the definition of illegal content.

Amendment 59, in schedule 7, page 185, line 39, at end insert—

“Animal Welfare

22A An offence under any of the following provisions of the Animal Welfare Act 2006—

(a) section 4 (unnecessary suffering);

(b) section 5 (mutilation);

(c) section 7 (administration of poisons);

(d) section 8 (fighting);

(e) section 9 (duty of person responsible for animal to ensure welfare).

22B An offence under any of the following provisions of the Animal Health and Welfare (Scotland) Act 2006—

(a) section 19 (unnecessary suffering);

(b) section 20 (mutilation);

(c) section 21 (cruel operations);

(d) section 22 (administration of poisons);

(e) section 23 (fighting);

(f) section 24 (ensuring welfare of animals).

22C An offence under any of the following provisions of the Welfare of Animals Act (Northern Ireland) 2011—

(a) section 4 (unnecessary suffering);

(b) section 5 (prohibited procedures);

(c) section 7 (administration of poisons);

(d) section 8 (fighting);

(e) section 9 (ensuring welfare of animals).

22D For the purpose of paragraphs 22A, 22B or 22C of this Schedule, the above offences are deemed to have taken place regardless of whether the offending conduct took place within the United Kingdom, if the offending conduct would have constituted an offence under the provisions contained within those paragraphs.”

This amendment adds certain animal welfare offences to the list of priority offences in Schedule 7.

Amendment 66, in clause 140, page 121, line 8, at end insert—

“(d) causing harm to any human or animal.”

This amendment ensures groups are able to make complaints regarding animal abuse videos.

Amendment 67, in clause 140, page 121, line 20, at end insert

“, or a particular group that campaigns for the removal of harmful online content towards humans and animals”.

This amendment makes groups campaigning against harmful content eligible to make supercomplaints.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is, as ever, a pleasure to serve under your chairship, Ms Rees. Amendment 65 would add organisations campaigning for the removal of animal abuse content to the list of bodies that Ofcom must consult. As we all know, Ofcom must produce codes of practice that offer guidance on how regulated services can comply with their duties. Later in the Bill, clause 45 makes clear that if a company complies with the code of practice, it will be deemed to have complied with the Bill in general. In addition, the duties for regulated services come into force at the same time as the codes of practice. That all makes what the codes say extremely important.

The absence of protections relating to animal abuse content is a real omission from the Bill. Colleagues will have seen the written evidence from Action for Primates, which neatly summarised the key issues on which Labour is hoping to see agreement from the Government. Given this omission, it is clear that the current draft of the Bill is not fit for tackling animal abuse, cruelty and violence, which is all too common online.

There are no explicit references to content that can be disturbing and distressing to those who view it—both children and adults. We now know that most animal cruelty content is produced specifically for sharing on social media, often for profit through the monetisation schemes offered by platforms such as YouTube. Examples include animals being beaten, set on fire, crushed or partially drowned; the mutilation and live burial of infant monkeys; a kitten intentionally being set on by a dog and another being stepped on and crushed to death; live and conscious octopuses being eaten; and animals being pitted against each other in staged fights.

Animals being deliberately placed into frightening or dangerous situations from which they cannot escape or are harmed before being “rescued” on camera is becoming increasingly popular on social media, too. For example, kittens and puppies are “rescued” from the clutches of a python. Such fake rescues not only cause immense suffering to animals, but are fraudulent because viewers are asked to donate towards the rescue and care of the animals. This cannot be allowed to continue.

Indeed, as part of its Cancel Out Cruelty campaign, the Royal Society for the Prevention of Cruelty to Animals conducted research, which found that in 2020 there were nearly 500 reports of animal cruelty on social media. That was more than twice the figure reported for 2019. The majority of these incidents appeared on Facebook. David Allen, head of prevention and education at the RSPCA, has spoken publicly about the issue, saying:

“Sadly, we have seen an increase in recent years in the number of incidents of animal cruelty being posted and shared on social media such as Facebook, Instagram, TikTok and Snapchat.”

John Nicolson

I totally agree with the points that the hon. Lady is making. Does she agree that the way in which the Bill is structured means that illegal acts that are not designated as “priority illegal” will likely be put at the very end of companies’ to-do list and that they will focus considerably more effort on what they will call “priority illegal” content?

Alex Davies-Jones

I completely agree with and welcome the hon. Gentleman’s contribution. It is a very valid point and one that we will explore further. It shows the necessity of this harm being classed as a priority harm in order that we protect animals, as well as people.

David Allen continued:

“We’re very concerned that the use of social media has changed the landscape of abuse with videos of animal cruelty being shared for likes and kudos with this sort of content normalising—and even making light of—animal cruelty. What’s even more worrying is the level of cruelty that can be seen in these videos, particularly as so many young people are being exposed to graphic footage of animals being beaten or killed which they otherwise would never have seen.”

Although the Bill has a clear focus on protecting children, we must remember that the prevalence of cruelty to animals online has the potential to have a hugely negative impact on children who may be inadvertently seeing that content through everyday social media channels.

Jane Stevenson (Wolverhampton North East) (Con)

The hon. Lady knows that I am a great animal lover, and I obviously have concerns about children being exposed to these images. I am just wondering how she would differentiate between abusive images and the images that are there to raise awareness of certain situations that animals are in. I have seen many distressing posts about the Yulin dogmeat festival and about beagles being used in laboratory experiments. How would she differentiate between images that are there to raise awareness of the plight of animals and the abusive ones?

Alex Davies-Jones

I thank the hon. Lady for her contribution. Like me, she is a passionate campaigner for animal welfare. It was a pleasure to serve on the Committee that considered her Glue Traps (Offences) Act 2022, which I know the whole House was pleased to pass. She raises a very important point, and one that the Bill explores later in relation to other types of content, such as antisemitic and racist material used in an educational, historical or factual context. The Bill deals specifically with that later, and this content would be dealt with in the same way: content used as an educational or awareness-raising tool is treated differently from straightforward images and videos of direct abuse.

To give hon. Members a real sense of the extent of the issue, I would like to share some findings from a recent survey of the RSPCA’s frontline officers. These are pretty shocking statistics, as I am sure Members will all agree. Eighty-one per cent. of RSPCA frontline officers think that more abuse is being caught on camera. Nearly half think that more cases are appearing on social media. One in five officers said that one of the main causes of cruelty to animals is people hurting animals just to make themselves more popular on social media. Some of the recent cruelty videos posted on social media include a video of a magpie being thrown across the road on Instagram in June 2021; a woman captured kicking her dog on TikTok in March 2021; a teenager being filmed kicking a dog, which was shared on WhatsApp in May 2021; and videos posted on Instagram of cockerels being forced to fight in March 2021.

I am sure that colleagues will be aware of the most recent high-profile case, which was when disturbing footage was posted online of footballer Kurt Zouma attacking his cat. There was, quite rightly, an outpouring of public anger and demands for justice. Footage uploaded to Snapchat on 6 February showed Zouma kicking his Bengal cat across a kitchen floor in front of his seven-year-old son. Zouma also threw a pair of shoes at his pet cat and slapped its head. In another video, he was heard saying:

“I swear I’ll kill it.”

In sentencing him following his guilty plea to two offences under the Animal Welfare Act 2006, district judge Susan Holdham described the incident as “disgraceful and reprehensible”. She added:

“You must be aware that others look up to you and many young people aspire to emulate you.”

What makes that case even more sad is the way in which the video was filmed and shared, making light of such cruelty. I am pleased that the case has now resulted in tougher penalties for filming animal abuse and posting it on social media, thanks to new guidelines from the Sentencing Council. The prosecutor in the Zouma case, Hazel Stevens, told the court:

“Since this footage was put in the public domain there has been a spate of people hitting cats and posting it on various social media sites.”

There have been many other such instances. Just a few months ago, the most abhorrent trend was occurring on TikTok: people were abusing cats, dogs and other animals to music and encouraging others to do the same. Police officers discovered a shocking 182 videos with graphic animal cruelty on mobile phones seized during an investigation. This sickening phenomenon is on the rise on social media platforms, provoking a glamorisation of the behaviour. The videos uncovered during the investigation showed dogs prompted to attack other animals such as cats, or used to hunt badgers, deer, rabbits and birds. Lancashire police began the investigation after someone witnessed two teenagers encouraging a dog to attack a cat on an estate in Burnley in March of last year. The cat, a pet named Gatsby, was rushed to the vet by its owners once they discovered what was going on, but unfortunately it was too late and Gatsby’s injuries were fatal. The photos and videos found on the boys’ phones led the police to discover more teenagers in the area who were involved in such cruel activities. The views and interactions that the graphic footage was attracting made it even more visible, as the platform was increasing traffic and boosting content when it received attention.

It should not have taken such a high-profile case of a professional footballer with a viral video to get this action taken. There are countless similar instances occurring day in, day out, and yet the platforms and authorities are not taking the necessary action to protect animals and people from harm, or to protect the young people who seek to emulate this behaviour.

I pay tribute to the hard work of campaigning groups such as the RSPCA, Action for Primates, Asia for Animals Coalition and many more, because they are the ones who have fought to keep animal rights at the forefront. The amendment seeks to ensure that such groups are given a voice at the table when Ofcom consults on its all-important codes of practice. That would be a small step towards reducing animal abuse content online, and I hope the Minister can see the merits in joining the cause.

I turn to amendment 60, which would bring offences to which animals are subject within the definition of illegal content, a point raised by the hon. Member for Ochil and South Perthshire. The Minister will recall the Animal Welfare (Sentencing) Act 2021, which received Royal Assent last year. Labour was pleased to see the Government finally taking action against those who commit animal cruelty offences offline. The maximum prison sentence for animal cruelty was increased from six months to five years, and the Government billed that move as them taking a firmer approach to cases such as dog fighting, abuse of puppies and kittens, illegally cropping a dog’s ears and gross neglect of farm animals. Why, then, have the Government failed to include offences against animals within the scope of illegal content online? We want parity between the online and offline space, and that seems like a sharp omission from the Bill.

Placing obligations on service providers to remove animal cruelty content should fall within both the spirit and the scope of the Bill. We all know that the scope of the Bill is to place duties on service providers to remove illegal and harmful content, placing particular emphasis on the exposure of children. Animal cruelty content is a depiction of illegality and also causes significant harm to children and adults.

If my inbox is anything to go by, all of us here today know what so many of our constituents up and down the country feel about animal abuse. It is one of the most popular topics that constituents contact me about. Today, the Minister has a choice to make about his Government's commitment to preventing animal cruelty and keeping us all safe online. I hope he will see the merit in acknowledging the seriousness of animal abuse online.

Amendment 66 would ensure that groups were able to make complaints about animal abuse videos. Labour welcomes clause 140, as the ability to make super-complaints is a vital part of our democracy. However, as my hon. Friend the Member for Worsley and Eccles South and other Members have mentioned, the current definition of an “eligible entity” is far too loose. I have set out the reasons as to why the Government must go further to limit and prevent animal abuse content online. Amendment 66 would ensure that dangerous animal abuse content is a reasonable cause for a super-complaint to be pursued.

Chris Philp

The shadow Minister raises important issues to do with animal cruelty. The whole House and our constituents feel extremely strongly about this issue, as we know. She set out some very powerful examples of how this terrible form of abuse takes place.

To some extent, the offences are in the Bill’s scope already. It covers, for example, extreme pornography. Given that the content described by the hon. Lady would inflict psychological harm on children, it is, to that extent, in scope.

The hon. Lady mentioned the Government’s wider activities to prevent animal cruelty. That work goes back a long time and includes the last Labour Government’s Animal Welfare Act 2006. She mentioned the more recent update to the criminal sentencing laws that increased by a factor of 10 the maximum sentence for cruelty to animals. It used to be six months and has now been increased to up to five years in prison.

In addition, just last year the Department for Environment, Food and Rural Affairs announced an action plan for animal welfare, which outlines a whole suite of activities that the Government are taking to protect animals in a number of different areas—sentience, international trade, farming, pets and wild animals. That action plan will be delivered through a broad programme of legislative and non-legislative work.

15:00
I mentioned some of the ways the Bill will assist with looking after animals. We are concerned to make sure that the Bill delivers its core intent: to protect children, to protect humans from illegal activity, and to stop the priority offences. Given that that is the objective, and given everything else I have just said about the other work that is going on—much of which is effective, as demonstrated by the prosecution of Kurt Zouma just a week or two ago—we do not feel able to accept the amendments as drafted. However, it is an area that I am sure is of concern to Members across the House, and now that the shadow Minister has raised the question, we will certainly give further thought to it.
On the basis of the Government’s existing work on animal welfare, the effect that the Bill as drafted will have in this area, and the fact that we will give this issue some further thought, I hope that the shadow Minister will let the matter rest for now.
Alex Davies-Jones

I thank the Minister for agreeing to look at this issue further. However, we do see it as being within the scope of the Bill, and we have the opportunity to do something about it now, so we will be pressing these amendments to a vote. If you will allow me, Ms Rees, I would also like to pay tribute to the former Member of Parliament for Redcar, Anna Turley, who campaigned tirelessly on these issues when she was a Member of the House. We would like these amendments to be part of the Bill.

Question put, That the amendment be made.

Division 18

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 8


Conservative: 8

Question proposed, That the clause stand part of the Bill.
The Chair

With this it will be convenient to discuss the following:

Clause 38 stand part.

That schedule 4 be the Fourth schedule to the Bill.

New clause 20—Use of proactive technology in private messaging: report

“(1) OFCOM must produce a report—

(a) examining the case for the use of proactive technology in private messaging where the aim is to identify CSEA content; and

(b) making recommendations as to whether or not proactive technology should be used in such cases.

(2) The report must be produced in consultation with organisations that have expertise and experience in tackling CSEA.

(3) The report must be published and laid before both Houses of Parliament within six months of this Act being passed.”

Alex Davies-Jones

On clause 37, it is welcome that Ofcom will have to prepare and issue a code of practice for service providers with duties relating to illegal content in the form of terrorism or child sexual exploitation and abuse content. The introduction of compliance measures relating to fraudulent advertising is also very welcome. We do, however, have some important areas to amend, including the role of different expert groups in assisting Ofcom during its consultation process, which I have already outlined in relation to animal cruelty.

On clause 38, Labour supports the notion that Ofcom must have specific principles to adhere to when preparing the codes of practice, and of course, the Secretary of State must have oversight of those. However, as I will touch on as we proceed, Labour feels that far too much power is given to the Secretary of State of the day in establishing those codes.

Labour believes that schedule 4 is overwhelmingly loose in its language, and we have concerns about the ability of Ofcom—try as it might—to ensure that its codes of practice are both meaningful to service providers and in compliance with the Bill’s legislative requirements. Let me highlight the schedule’s broadness by quoting from it. Paragraph 4 states:

“The online safety objectives for regulated user-to-user services are as follows”.

I will move straight to paragraph 4(a)(iv), which says

“there are adequate systems and processes to support United Kingdom users”.

Forgive me if I am missing something here, but surely an assessment of adequacy is too subjective for these important codes of practice. Moreover, the Bill seems to have failed to consider the wide-ranging differences that exist among so-called United Kingdom users. Once again, there is no reference to future-proofing against emerging technologies. I hope that the Minister will therefore elaborate on how he sees the codes of practice and their principles, objectives and content as fit for purpose. More broadly, it is remarkable that schedule 4 is both too broad in its definitions and too limiting in some areas—we might call it a Goldilocks schedule.

I turn to new clause 20. As we have discussed, a significant majority of online child abuse takes place in private messages. Research from the NSPCC shows that 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people whom they have not met offline before. When children are contacted by someone they do not know, in nearly three quarters of cases that takes place by private message.

Schedule 4 introduces new restrictions on Ofcom’s ability to require a company to use proactive technology to identify or disrupt abuse in private messaging. That will likely restrict Ofcom’s ability to include in codes of practice widely used industry-standard tools such as PhotoDNA and CSAI Match, which detect known child abuse images, and artificial intelligence classifiers to detect self-generated images and grooming behaviour. That raises significant questions about whether the regulator can realistically produce codes of practice that respond to the nature and extent of the child abuse threat.

As it stands, the Bill will leave Ofcom unable to require companies to proactively use technology that can detect child abuse. Instead, Ofcom will be wholly reliant on the use of CSEA warning notices under clause 103, which will enable it to require the use of proactive technologies only where there is evidence that child abuse is already prevalent—in other words, where significant online harm has already occurred. That will necessitate the use of a laborious and resource-intensive process, with Ofcom having to build the evidence to issue CSEA warning notices company by company.

Those restrictions will mean that the Bill will be far less demanding than comparable international legislation in respect of the requirement on companies to proactively detect and remove online child abuse. So much for the Bill being world leading. For example, the EU child abuse legislative proposal published in May sets out clear and unambiguous requirements on companies to proactively scan for child abuse images and grooming behaviour on private messages.

If the regulator is unable to tackle online grooming sufficiently proactively, the impact will be disproportionately felt by girls. NSPCC data shows that an overwhelming majority of criminal offences target girls, with those aged 12 to 15 the most likely to be victims of online grooming. Girls were victims in 83% of offences where data was recorded. Labour recognises that once again there are difficulties between our fundamental right to privacy and the Bill’s intentions in keeping children safe. This probing new clause is designed to give the Government an opportunity to report on the effectiveness of their proposed approach.

Ultimately, the levels of grooming taking place on private messaging platforms are incredibly serious. I have two important testimonies that are worth placing on the record, both of which have been made anonymous to protect the victims but share the same sentiment. The first is from a girl aged 15. She said:

“I’m in a serious situation that I want to get out of. I’ve been chatting with this guy online who’s like twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to prove my trust to him, like doing video chats with my chest exposed.”

The second is from a boy aged 17. He said:

“I’ve got a fitness page on Instagram to document my progress but I get a lot of direct messages from weird people. One guy said he’d pay me a lot of money to do a private show for him. He now messages me almost every day asking for more explicit videos and I’m scared that if I don’t do what he says, then he will leak the footage and my life would be ruined”.

Those testimonies go to show how fundamentally important it is for an early assessment to be made of the effectiveness of the Government’s approach following the Bill gaining Royal Assent.

We all have concerns about the use of proactive technology in private messaging and its potential impact on personal privacy. End-to-end encryption offers both risks and benefits to the online environment, but the main concern is based on risk profiles. End-to-end encryption is particularly problematic on social networks because it is embedded in the broader functionality of the service, so all text, DMs, images and live chats could be encrypted. Consequently, its impact on detecting child abuse becomes even greater. There is an even greater risk with Meta threatening to bring in end-to-end encryption for all its services. If platforms cannot demonstrate that they can mitigate those risks to ensure a satisfactory risk profile, they should not be able to proceed with end-to-end encryption until satisfactory measures and mitigations are in place.

Tech companies have made significant efforts to frame this issue as a false binary: that any legislation affecting private messaging will damage end-to-end encryption and mean that encryption will not work or is broken. That argument is completely false. A variety of novel technologies are emerging that could allow for continued CSAM scanning in encrypted environments while retaining the privacy benefits afforded by end-to-end encryption.

Apple, for example, has developed its NeuralHash technology, which allows for on-device scans for CSAM before a message is sent and encrypted. That client-side implementation—rather than server-side scanning—means that Apple does not learn anything about images that do not match the known CSAM database. Apple’s servers flag accounts that exceed a threshold number of images that match a known database of CSAM image hashes, so that Apple can provide relevant information to the National Centre for Missing and Exploited Children. That process is secure and expressly designed to preserve user privacy.
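
By way of illustration only, the sketch below outlines the threshold-based, on-device hash matching described above. It is a hypothetical Python sketch, not Apple’s implementation: `perceptual_hash`, `KNOWN_IMAGE_HASHES` and `MATCH_THRESHOLD` are placeholders, and a real system would use an accredited perceptual-hashing algorithm and a privacy-preserving matching protocol rather than a plain set lookup.

```python
# Hypothetical sketch of threshold-based, on-device hash matching.
# Not a real CSAM scanner: perceptual_hash() and KNOWN_IMAGE_HASHES are
# placeholders standing in for an accredited technology (e.g. PhotoDNA
# or NeuralHash) and its vetted database of known-image hashes.
import hashlib
from typing import Iterable, List

KNOWN_IMAGE_HASHES = {"placeholder_hash_1", "placeholder_hash_2"}
MATCH_THRESHOLD = 3  # an account is flagged only above this many matches


def perceptual_hash(image_bytes: bytes) -> str:
    """Placeholder hash. A real perceptual hash stays stable under
    resizing and re-encoding; a cryptographic digest like SHA-256 does not."""
    return hashlib.sha256(image_bytes).hexdigest()


def matching_hashes(images: Iterable[bytes]) -> List[str]:
    """Return only the hashes that appear in the known database.
    Images that do not match are never inspected or reported."""
    return [h for h in (perceptual_hash(i) for i in images)
            if h in KNOWN_IMAGE_HASHES]


def should_flag_account(images: Iterable[bytes]) -> bool:
    """Flag for onward referral only once matches exceed the threshold."""
    return len(matching_hashes(images)) > MATCH_THRESHOLD
```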

Homomorphic encryption technology can perform image hashing on encrypted data without the need to decrypt the data. No identifying information can be extracted and no details about the encrypted image are revealed, but calculations can be performed on the encrypted data. Experts in hash scanning—including Professor Hany Farid of the University of California, Berkeley, who developed PhotoDNA—insist that scanning in end-to-end encrypted environments without damaging privacy will be possible if companies commit to providing the engineering resources to work on it.
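
The homomorphic principle referred to above—performing a computation on data that stays encrypted—can be sketched with the third-party `phe` (python-paillier) library. This is only an illustration of arithmetic on ciphertexts, assuming that library is installed; real encrypted image-hashing schemes are considerably more involved than this.

```python
# Minimal illustration of homomorphic computation with python-paillier.
# Paillier is additively homomorphic: encrypted values can be added (or
# scaled by a plaintext) without ever being decrypted.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# The sender encrypts values locally; the service never sees plaintexts.
enc_a = public_key.encrypt(17)
enc_b = public_key.encrypt(25)

# The service operates on ciphertexts only.
enc_sum = enc_a + enc_b      # encrypted 17 + 25
enc_scaled = enc_a * 3       # encrypted 17 * 3

# Only the key holder can recover the results.
assert private_key.decrypt(enc_sum) == 42
assert private_key.decrypt(enc_scaled) == 51
```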

To move beyond the argument that requiring proactive scanning for CSAM means breaking or damaging end-to-end encryption, amendments to the Bill could provide a powerful incentive for companies to invest in technology and engineering resources that will allow them to continue scanning while pressing ahead with end-to-end encryption, so that privacy is preserved but appropriate resources for and responses to online child sexual abuse can continue. It is highly unlikely that some companies will do that unless they have the explicit incentive to do so. Regulation can provide such an incentive, and I urge the Minister to make it possible.

Mrs Maria Miller (Basingstoke) (Con)

It is a pleasure to follow the shadow Minister, who made some important points. I will focus on clause 37 stand part. I pay tribute to the Minister for his incredible work on the Bill, with which he clearly wants to stop harm occurring in the first place. We had a great debate on the matter of victim support. The Bill requires Ofcom to produce a number of codes of practice to help to achieve that important aim.

Clause 37 is clear: it requires codes of practice on illegal content and fraudulent advertising, as well as compliance with “the relevant duties”, and it is on that point that I hope the Minister can help me. Those codes will help Ofcom to take action when platforms do things that they should not, and will, I hope, provide a way for platforms to comply in the first place rather than falling foul of the rules.

How will the codes help platforms that are harbouring material or configuring their services in a way that might be explicitly or inadvertently promoting violence against women and girls? The Minister knows that women are disproportionately the targets of online abuse on social media or other platforms. The impact, which worries me as much as I am sure it worries him, is that women and girls are told to remove themselves from social media as a way to protect themselves against extremely abusive or harassing material. My concern is that the lack of a specific code to tackle those important issues might inadvertently mean that Ofcom and the platforms overlook them.

Would a violence against women and girls code of practice help to ensure that social media platforms were monitored by Ofcom for their work to prevent tech-facilitated violence against women and girls? A number of organisations think that it would, as does the Domestic Abuse Commissioner herself. Those organisations have drafted a violence against women and girls code of practice, which has been developed by an eminent group of specialists—the End Violence Against Women Coalition, Glitch, Carnegie UK Trust, the NSPCC, 5Rights, and Professors Clare McGlynn and Lorna Woods, both of whom gave evidence to us. They believe it should be mandatory for Ofcom to adopt a violence against women and girls code to ensure that this issue is taken seriously and that action is taken to prevent the risks in the first place. Clause 37 talks about codes, but it is not specific on that point, so can the Minister help us? Like the rest of the Committee, he wants to prevent women from experiencing these appalling acts online, and a code of practice could help us deal with that better.

15:15
The Government already recognise that women disproportionately experience the impact of online abuse, and they have a track record of acting. They were the first to outlaw revenge pornography, and they have introduced more laws since. I hope the Minister will put at rest my mind and the minds of those who drew together the code that was issued late last month by setting out how this will be undertaken by Ofcom. Will a code on this issue be pulled together, or will it be incorporated into the codes that are being developed? It is incredibly important for him to do that.
Kirsty Blackman

I absolutely agree with the points that have been made about the violence against women code of conduct. It is vital, and it would be a really important addition to the Bill. I associate myself with the shadow Minister’s comments, and am happy to stand alongside her.

I want to make a few comments about new clause 20 and some of the issues it raises. The new clause is incredibly important, and we need to take seriously the concerns that have been raised with us by the groups that advocate on behalf of children. They would not raise those concerns if they did not think the Bill was deficient in this area. They do not have spare people and cannot spend lots of time doing unnecessary things, so if they are raising concerns, those are very important things that will make a big difference.

I want to go a little further than what the new clause says and ask the Minister about future-proofing the Bill and ensuring that technologies can be used as they evolve. I am pretty sure that everybody agrees that there should be no space where it is safe to share child sexual exploitation and abuse, whether physical space or online space, private messaging or a more open forum. None of those places should be safe or legal. None should enable that to happen.

My particular thought about future-proofing is about the development of technologies that are able to recognise self-generated pictures, videos, livestreams and so on that have not already been categorised, do not have a hash number and are not easy for the current technologies to find. There are lots of people out there working hard to stamp out these images and videos online, and I have faith that they are developing new technologies that are able to recognise images, videos, messages and oral communications that cannot currently be recognised.

I agree wholeheartedly with the new clause: it is important that a report be produced within six months of the Bill being passed. It would be great if the Minister would commit to thinking about whether Ofcom will be able to require companies to implement new technologies that are developed, as well as the technologies that are currently available. I am not just talking about child sexual abuse images, material or videos; I am also talking about private messaging where grooming is happening. That is a separate thing that needs to be scanned for, but it is incredibly important.

Some of the stories relayed by the shadow Minister relate to conversations and grooming that happened before the self-generated material was created. If the companies on whose platforms the direct messaging was taking place had proactively scanned for grooming behaviour, those young people would potentially have been in a safer place, because the abuse could have been stopped before the self-generated material was created. Surely, that should be the aim. It is good that we can tackle this after the event—it is good that we have something—but tackling it before it happens would be incredibly important.

Jane Stevenson

Online sexual exploitation is a horrific crime, and we all want to see it ended for good. I have concerns about whether new clause 20 is saying we should open up all messaging—where is the consideration of privacy when the scanning is taking place? Forgive me, I do not know much about the technology that is available to scan for that content. I do have concerns that responsible users’ privacy will be infringed, even when they are doing nothing of concern.

Kirsty Blackman

I do not know whether everybody draws the same distinction as me. For me the distinction is that, because it will be happening with proactive technology—technological means will be scanning those messages rather than humans—nobody will see the messages. Software will scan messages, and should there be anything that is illegal—should there be child sexual abuse material—that is what will be flagged and further action taken.

Alex Davies-Jones

I am not sure whether the hon. Member for Wolverhampton North East heard during my contribution, but this technology does exist, so it is possible. It is a false argument made by those who believe that impacting end-to-end encryption will limit people’s privacy. The technology does exist, and I named some technologies that are able to scan without preventing the encryption of the data. It simply scans for those images and checks them against existing databases. It would have no impact on anybody’s right to privacy.

Kirsty Blackman

I thank the shadow Minister for her assistance with that intervention, which was incredibly helpful. I do not have concerns that anybody will be able to access that data. The only data that will be accessible is when the proactive technology identifies something that is illegal, so nobody can see any of the messages except for the artificial intelligence. When the AI recognises that something is abuse material, at that point the Bill specifies that it will go to the National Crime Agency if it is in relation to child abuse images.

Jane Stevenson

My concern is that, at the point at which the data is sent to the National Crime Agency, it will be visible to human decision making. I am wondering whether that will stop parents sharing pictures of their babies in the bath. There are instances where people could get caught up in a very innocent situation that is deemed to be something more sinister by AI. However, I will take the advice of the hon. Member for Pontypridd and look into the technology.

Kirsty Blackman

In terms of the secondary processes that kick in after the AI has scanned the data, I assume it will be up to Ofcom and the provider to discuss what happens then. Once the AI identifies something, does it automatically get sent to the National Crime Agency, or does it go through a process of checking to ensure the AI has correctly identified something? I agree with what the Minister has reiterated on a number of occasions; if it is child sexual abuse material then I have no problem with somebody’s privacy being invaded in order for that to be taken to the relevant authorities and acted on.

I want to make one last point. The wording of new clause 20 is about a report on those proactive technologies. It is about requiring Ofcom to come up with and justify the use of those proactive technologies. To give the hon. Member for Wolverhampton North East some reassurance, it is not saying, “This will definitely happen.” I assume that Ofcom will be able to make the case—I am certain it will be able to—but it will have to justify it in order to be able to require those companies to undertake that use.

My key point is about the future-proofing of this, ensuring that it is not just a one-off, and that, if Ofcom makes a designation about the use of proactive technologies, it is able to make a re-designation or future designation, should new proactive technologies come through, so that we can require those new proactive technologies to be used to identify things that we cannot identify with the current proactive technologies.

Kim Leadbeater (Batley and Spen) (Lab)

I want to associate myself with the comments of the right hon. Member for Basingstoke and the hon. Member for Aberdeen North, and to explore the intersection between the work we are doing to protect children and the violence against women and girls strategy. There is one group, girls, who apply to both. We know that they are sadly one of the most vulnerable groups for online harm and abuse, and we must do everything we can to protect them. Having a belt and braces approach, with a code of conduct requirement for the violence against women and girls strategy, plus implementing new clause 20 on this technology that can protect girls in particular, although not exclusively, is a positive thing. Surely, the more thorough we are in the preventive approach, the better, rather than taking action after it is too late?

Kirsty Blackman

I agree 100%. The case that the shadow Minister, the hon. Member for Pontypridd, made and the stories she highlighted about the shame that is felt show that we are not just talking about a one-off impact on people’s lives, but potentially years of going through those awful situations and then many years to recover, if they ever do, from the situations they have been through.

I do not think there is too much that we could do, too many codes of practice we could require or too many compliances we should have in place. I also agree that girls are the most vulnerable group when considering this issue, and we need to ensure that this Bill is as fit for purpose as it can be and meets the Government’s aim of trying to make the internet a safe place for children and young people. Because of the additional risks that there are for girls in particular, we need additional protections in place for girls. That is why a number of us in this room are making that case.

Chris Philp

This has been an important debate. I think there is unanimity on the objectives we are seeking to achieve, particularly protecting children from the risk of child sexual exploitation and abuse. As we have discussed two or three times already, we cannot allow end-to-end encryption to frustrate or prevent the protection of children.

I will talk about two or three of the issues that have arisen in the course of the debate. The first is new clause 20, a proposal requiring Ofcom to put together a report. I do not think that is strictly necessary, because the Bill already imposes a requirement to identify, assess and mitigate CSEA. There is no optionality here and no need to think about it; there is already a demand to prevent CSEA content, and Ofcom has to produce codes of practice explaining how it will do that. I think what is requested in new clause 20 is required already.

The hon. Member for Pontypridd mentioned the concern that Ofcom had to first of all prove that the CSEA risk existed. I think that might be a hangover from the previous draft of the Bill, where there was a requirement for the evidence to be “persistent and prevalent”—I think that might have been the phrase—which implied that Ofcom had to first prove that it existed before it could take action against it. So, for exactly the reason she mentioned, that it imposed a requirement to prove CSEA is there, we have changed the wording in the new version. Clause 103(1), at the top of page 87, instead of “persistent and prevalent”, now states “necessary and proportionate”. Therefore, if Ofcom simply considers something necessary, without needing to prove that it is persistent and prevalent—just if it thinks it is necessary—it can take the actions set out in that clause. For the reason that she mentioned, the change has been made already.

15:30
That brings me on to the powers in clause 103, which are extremely relevant—I apologise for speaking to that clause, Ms Rees, which we will come to later. That clause contains powers for Ofcom to direct the use of accredited technologies to ensure that CSEA is being scanned for. I have two points to make. First, on the question of whether the technology exists to scan inside an end-to-end encrypted environment, the advice that I have received so far is that, as the shadow Minister said, although it is getting close and is likely to be accomplished in the relatively near future, as of today it is not there. That is worth saying for the record.
Secondly, on the question of the hon. Member for Aberdeen North about whether that can keep up to date with future technology moves—an important question, because this technology will change almost month to month, and certainly year to year—in that context it is worth referring to the definition of “accredited” technology. If my memory is correct, that is to be found in clause 105(9) and (10), on page 90. In essence, those two subsections state that Ofcom may update accreditation whenever it feels that to be necessary—that can be at any time; it is not one-off. Indeed, Ofcom may appoint some other person or body to do the accreditation if it feels that it does not have the expertise itself. The concept of accredited technology is live; it can be updated the whole time.
Given that we are on the topic, however, we are still thinking—this is so important, and the hon. Member for Aberdeen North has rightly raised it two or three times—about whether there are ways to strengthen clause 103 further, to provide even more clear and powerful powers to act in this area. If we can think of ways to do that, or if anyone else can suggest one, we are receptive to that thinking. The reason—as I gave in answer to the hon. Lady two or three times—is that, as far as I am concerned, there can be no compromise when scanning for CSEA content.
We then come to the question of the risk assessments and the codes of practice, to ensure that all the relevant groups get covered and that no one gets forgotten—this brings me back to clause 37, you will be pleased to hear, Ms Rees. Subsection (3), which appears towards the bottom of page 35, states on lines 31 to 33:
“OFCOM must prepare and issue one or more codes of practice for providers of Part 3 services describing measures recommended for the purpose of compliance with the relevant duties”.
What are those relevant duties? The relevant duties are, mercifully, defined at the bottom of the following page, page 36, in subsection (10), which sets out what we mean, and the most important for protecting people are paragraphs (a), (b) and (c): anything that is illegal, anything that concerns the safety of children, and matters concerning the safety of adults, respectively. There is no risk that those very important topics can somehow get forgotten.
I hope that clarifies how the Bill operates. As I said, we are giving careful thought to finding ways—which I hope we can—to strengthen those powers in clause 103.
Dame Maria Miller

I think my hon. Friend’s list goes on to page 37, which means there would be a number of different relevant duties that would presumably then be subject to the ability to issue codes of practice. However, the point I was making in my earlier contribution is that this list does not include the issue of violence against women and girls. In looking at this exhaustive list that my hon. Friend has included in the Bill, I must ask whether he might inadvertently be excluding the opportunity for Ofcom to produce a code of practice on the issue of violence against women and girls. Having heard his earlier comments, I felt that he was slightly sympathetic to that idea.

Chris Philp

Clearly, and as Members have pointed out, women and girls suffer disproportionately from abuse online; unfortunately, tragically and disgracefully, they are disproportionately victims of such abuse. The duties in the Bill obviously apply to everybody—men and women—but women will obviously disproportionately benefit, because they are disproportionately victims.

Obviously, where there are things that are particular to women, such as particular kinds of abuse that women suffer that men do not, or particular kinds of abuse that girls suffer that boys do not, then we would expect the codes of practice to address those kinds of abuse, because the Bill states that they must keep children safe, in clause 37(10)(b), and adults safe, in clause 37(10)(c). Obviously, women are adults and we would expect those particular issues that my right hon. Friend mentioned to get picked up by those measures.

Dame Maria Miller

My hon. Friend is giving me a chink of light there, in that subsection (10)(c) could actively mean that a code of practice that specifically dealt with violence against women and girls would be admissible as a result of that particular point. I had not really thought of it in that way—am I thinking about it correctly?

Chris Philp

My right hon. Friend makes an interesting point. To avoid answering a complicated question off the cuff, perhaps I should write to her. However, I certainly see no prohibition in these words in the clause that would prevent Ofcom from writing a particular code of practice. I would interpret these words in that way, but I should probably come back to her in writing, just in case I am making a mistake.

As I say, I interpret those words as giving Ofcom the latitude, if it chose to do so, to have codes of practice that were specific. I would not see this clause as prescriptive, in the sense that if Ofcom wanted to produce a number of codes of practice under the heading of “adults”, it could do so. In fact, if we track back to clause 37(3), that says:

“OFCOM must prepare and issue one or more codes of practice”.

That would appear to admit the possibility that multiple codes of practice could be produced under each of the sub-headings, including in this case for adults and in the previous case for children. [Interruption.] I have also received some indication from officials that I was right in my assessment, so hopefully that is the confirmation that my right hon. Friend was looking for.

Question put and agreed to.

Clause 37 accordingly ordered to stand part of the Bill.

Clause 38 ordered to stand part of the Bill.

Schedule 4

Codes of practice under section 37: principles, objectives, content

Amendment proposed: 63, in schedule 4, page 176, line 29, at end insert “and

(x) there are adequate safeguards to monitor cruelty towards humans and animals;”.—(Alex Davies-Jones.)

This amendment would ensure that ensuring adequate safeguards to monitor cruelty towards humans and animals is one of the online safety objectives for user-to-user services.

Question put, That the amendment be made.

Division 19

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

Amendment proposed: 64, in schedule 4, page 177, line 4, at end insert “and

(vii) the systems and process are appropriate to detect cruelty towards humans and animals;”—(Alex Davies-Jones.)

This amendment would ensure that ensuring systems and processes are appropriate to detect cruelty towards humans and animals is one of the online safety objectives for search services.

Question put, That the amendment be made.

Division 20

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

Schedule 4 agreed to.
Clause 39
Procedure for issuing codes of practice
The Chair

Before we begin the next debate, does anyone wish to speak to Carla Lockhart’s amendment 97? If so, it will be debated as part of this group; otherwise, it will not be selected. The amendment is not selected.

Alex Davies-Jones

I beg to move amendment 48, in clause 39, page 37, line 17, at beginning insert—

“(A1) OFCOM must prepare the draft codes of practice required under section 37 within the period of six months beginning with the day on which this Act is passed.”

This amendment requires Ofcom to prepare draft codes of practice within six months of the passing of the Act.

The Chair

With this it will be convenient to discuss the following:

Clause stand part.

Clauses 42 to 47 stand part.

Alex Davies-Jones

This is a mammoth part of the Bill, and I rise to speak to clause 39. Under the clause, Ofcom will submit a draft code of practice to the Secretary of State and, provided that the Secretary of State does not intend to issue a direction to Ofcom under clause 40, the Secretary of State would lay the draft code before Parliament. Labour’s main concern about the procedure for issuing codes of practice is that, without a deadline, they may not come into force for quite some time, and the online space needs addressing now. We have already waited far too long for the Government to bring forward the Bill. Parliamentary oversight is also fundamentally important, and the codes will have huge implications for the steps that service providers take, so it is vital that they are given due scrutiny at the earliest opportunity.

Amendment 48 would require Ofcom to prepare draft codes of practice within six months of the passing of the Act. This simple amendment would require Ofcom to bring forward these important codes of practice within an established time period—six months—after the Bill receives Royal Assent. Labour recognises the challenges ahead for Ofcom in both capacity and funding.

On this note, I must raise with the Minister something that I have raised previously. I find it most curious that his Department recently sought to hire an online safety regulator funding policy adviser. The job advert listed some of the key responsibilities:

“The post holder will support ministers during passage of the Online Safety Bill; secure the necessary funding for Ofcom and DCMS in order to set up the Online Safety regulator; and help implement and deliver a funding regime which is first of its kind in the UK.”

That raises worrying questions about how prepared Ofcom is for the huge task ahead. That being said, the Government have drafted the Bill in a way that brings codes of practice to its heart, so they cannot and should not be susceptible to delay.

Chris Philp

The hon. Lady is very kind in giving way—I was twitching to stand up. On the preparedness of Ofcom and its resources, Ofcom was given about £88 million in last year’s spending review to cover this and the next financial year—2022-23 and 2023-24—so that it could get ready. Thereafter, Ofcom will fund itself by raising fees, and I believe that the policy adviser will most likely advise on supporting the work on future fees. That does not imply that there will be any delay, because the funding for this year and next year has already been provided by the Government.

Alex Davies-Jones

I appreciate that intervention, but the Minister must be aware that if Ofcom has to fundraise itself, that raises questions about its future capability as a regulator and its funding and resource requirements. What will happen if it does not raise those funds?

Chris Philp

The hon. Lady’s use of the word “fundraise” implies that Ofcom will be going around with a collection tin on a voluntary basis.

Alex Davies-Jones

It is your word.

15:45
Chris Philp

I will find the relevant clause in a moment. The Bill gives Ofcom the legal power to make the regulated companies pay fees to finance Ofcom’s regulatory work. It is not voluntary; it is compulsory.

Alex Davies-Jones

I am grateful to the Minister for that clarification. Perhaps he should make that more obvious in the job requirements and responsibilities.

Chris Philp

The fees requirements are in clauses 70 to 76, in particular clause 71, “Duty to pay fees”. The regulated companies have to pay the fees to Ofcom. It is not optional.

Alex Davies-Jones

I am grateful to the Minister for that clarification.

The Government have drafted the Bill in a way that puts codes of practice at its heart, so they cannot and should not be susceptible to delay. We have heard from platforms and services that stress that the ambiguity of the requirements is causing concern. At least with a deadline for draft codes of practice, those that want to do the right thing will be able to get on with it in a timely manner.

The Age Verification Providers Association provided us with evidence in support of amendment 48 in advance of today’s sitting. The association agrees that early publication of the codes will set the pace for implementation, encouraging both the Secretary of State and Parliament to approve the codes swiftly. A case study it shared highlights delays in the system, which we fear will be replicated within the online space, too. Let me indulge Members with details of exactly how slow Ofcom’s recent record has been on delivering similar guidance required under the audio-visual media services directive.

The directive became UK law on 30 September 2020 and came into force on 1 November 2020. By 24 June 2021, Ofcom had issued a note as to which video sharing platforms were in scope. It took almost a year until, on 6 October 2021, Ofcom issued formal guidance on the measures.

In December 2021, Ofcom wrote to the verification service providers and

“signalled the beginning of a new phase of supervisory engagement”.

However, in March 2022 it announced that

“the information we collect will inform our Autumn 2022 VSP report, which intends to increase the public’s awareness of the measures platforms have in place to protect users from harm.”

There is still no indication that Ofcom intends to take enforcement action against the many VSPs that remain non-compliant with the directive. It is simply not good enough. I urge the Minister to carefully consider the aims of amendment 48 and to support it.

Labour supports the principles of clause 42. Ofcom must not drag out the process of publishing or amending the codes of practice. Labour also supports a level of transparency around the withdrawal of codes of practice, should that arise.

Labour also supports clause 43 and the principles of ensuring that Ofcom has a requirement to review its codes of practice. We do, however, have concerns over the Secretary of State’s powers in subsection (6). It is absolutely right that the Secretary of State of the day has the ability to make representations to Ofcom in order to prevent the disclosure of certain matters in the interests of national security, public safety or relations with the Government of a country outside the UK. However, I am keen to hear the Minister’s assurances about how well the Bill is drafted to prevent those powers from being used, shall we say, inappropriately. I hope he can address those concerns.

On clause 44, Ofcom should of course be able to propose minor amendments to its codes of practice. Labour does, however, have concerns about the assessment that Ofcom will have to make to ensure that the minor nature of changes will not require amendments to be laid before Parliament, as in subsection (1). As I have said previously, scrutiny must be at the heart of the Bill, so I am interested to hear from the Minister how exactly he will ensure that Ofcom is making appropriate decisions about what sorts of changes are allowed to circumvent parliamentary scrutiny. We cannot and must not get to a place where the Secretary of State, in agreeing to proposed amendments, actively prevents scrutiny from taking place. I am keen to hear assurances on that point from the Minister.

On clause 45, as I mentioned previously on amendment 65 to clause 37, as it stands, service providers would be treated as complying with their duties if they had followed the recommended measures set out in the relevant codes of practice, as set out in subsection (1). However, providers could take alternative measures to comply, as outlined in subsection (5). Labour supports the clause in principle, but we are concerned that the definition of alternative measures is too broad. I would be grateful if the Minister could elaborate on his assessment of the instances in which a service provider may seek to comply via alternative measures. Surely the codes of practice should be, for want of a better phrase, best practice. None of us want to get into a position where service providers are circumventing their duties by taking the alternative measures route.

Again, Labour supports clause 46 in principle, but we feel that the provisions in subsection (1) could go further. We know that, historically, service providers have not always been transparent and forthcoming when compelled to be so by the courts. While we understand the reasoning behind subsection (3), we have broader concerns that service providers could, in theory, lean on their codes of practice as highlighting their best practice. I would be grateful if the Minister could address our concerns.

We support clause 47, which establishes that the duties in respect of which Ofcom must issue a code of practice under clause 37 will apply only once the first code of practice for that duty has come into force. However, we are concerned that this could mean that different duties will apply at different times, depending on when the relevant code for a particular duty comes into force. Will the Minister explain his assessment of how that will work in practice? We have concerns that drip feeding this information to service providers will cause further delay and confusion. In addition, will the Minister confirm how Ofcom will prioritise its codes of practice?

Lastly, we know that violence against women and girls does not receive a single mention in the Bill, which is an alarming and stark omission. Women and girls are disproportionately likely to be affected by online abuse and harassment. The Minister knows this—we all know this—and a number of us have spoken up on the issue on quite a few occasions. He also knows that online violence against women and girls is defined as including, but not limited to, intimate image abuse, online harassment, the sending of unsolicited explicit images, coercive sexting and the creation and sharing of deepfake pornography.

The Minister will also know that Carnegie UK is working with the End Violence Against Women coalition to draw up what a code of practice to tackle violence against women and girls could look like. Why has that been left out of the redraft of the Bill? What consideration has the Minister given to including a code of this nature in the Bill? If the Minister is truly committed to tackling violence against women and girls, why will he not put that on the face of the Bill?

Kirsty Blackman

I have a quick question about timelines because I am slightly confused about the order in which everything will happen. It is unlikely that the Bill will have been through the full parliamentary process before the summer, yet Ofcom intends to publish information and guidance by the summer, even though some things, such as the codes of practice, will not come in until after the Bill has received Royal Assent. Will the Minister give a commitment that, whether or not the Bill has gone through the whole parliamentary process, Ofcom will be able to publish before the summer?

Will Ofcom be encouraged to publish everything, whether that is guidance, information on its website or the codes of practice, at the earliest point at which they are ready? That will mean that anyone who has to apply those codes of practice or those regulations—people who will have to work within those codes, for example, or charities or other organisations that might be able to make super-complaints—will have as much information as possible, as early as possible, and will be able to prepare to fully implement their work at the earliest possible time. They will need that information in order to be able to gear up to do that.

Dame Maria Miller

I have three short questions for the Minister about clause 40 and the Secretary of State’s powers of direction. Am I in order to cover that?

The Chair

We are not debating clause 40, Dame Maria, but we will come to it eventually.

Chris Philp

I will do my best to make sure that we come to it very quickly indeed, by being concise in my replies on this group of amendments.

On amendment 48, which seeks to get Ofcom to produce its codes of practice within six months, obviously we are unanimous in wanting that to be done as quickly as possible. However, Ofcom has to go through a number of steps in order to produce those codes of practice. For example, first we have to designate in secondary legislation the priority categories of content that is harmful to children and content that is harmful to adults, and then Ofcom has to go through a consultation exercise before it publishes the codes. It has in the past indicated that it expects that to be a 12-month, rather than a six-month, process. I am concerned that a hard, six-month deadline may either be impossible to meet or force Ofcom to rush and do the job badly. I accept the need to get this done quickly, for all the obvious reasons, but we also want to make sure that it is done right. For those reasons, a hard, six-month deadline would not help us very much.

Alex Davies-Jones

Why does the Minister believe that six months is out of scope? Does he think that Ofcom is not adequately resourced to meet that deadline and make it happen as soon as possible?

Chris Philp

There are a number of steps to go through. Regardless of how well resourced Ofcom is and how fast it works, first, we have to designate the priority categories by secondary legislation, and there is a lead time for that. Secondly, Ofcom has to consult. Best practice suggests that consultations need to last for a certain period, because the consultation needs to be written, then it needs to open, and then the responses need to be analysed. Then, Ofcom obviously has to write the codes of practice. It might be counterproductive to set a deadline that tight.

There are quite a few different codes of practice to publish, and the hon. Lady asked about that. The ones listed in clause 47 will not all come out at the same time; they will be staggered and prioritised. Obviously, the ones that are most germane to safety, such as those on illegal content and children’s safety, will be done first. We would expect them to be done as a matter of extreme urgency.

I hope I have partly answered some of the questions that the hon. Member for Aberdeen North asked. The document to be published before the summer, which she asked about, is a road map. I understand it to be a sort of timetable that will set out the plan for doing everything we have just been debating—when the consultations will happen and when the codes of practice will be published. I guess we will get the road map in the next few weeks, if “before the summer” means before the summer recess. We will have all that set out for us, and then the formal process follows Royal Assent. I hope that answers the hon. Lady’s question.

There were one or two other questions from the hon. Member for Pontypridd. She asked whether a Secretary of State might misuse the power in clause 43(2)—a shocking suggestion, obviously. The power is only to request a review; it is nothing more sinister or onerous than that.

On clause 44, the hon. Lady asked what would happen if Ofcom and the Secretary of State between them—it would require both—conspired to allow through a change, claiming it is minor when in fact it is not. First, it would require both of them to do that: Ofcom must propose it and the Secretary of State must agree it, so I hope the fact that it is not the Secretary of State acting alone gives her some assurance. She asked what the redress is if both the Secretary of State and Ofcom misbehave, as it were. Well, the redress is the same as with any mis-exercise of a public power—namely, judicial review, which, as a former Home Office Minister, I have experienced extremely frequently—so there is legal redress.

The hon. Lady then asked about the alternative measures. What if a service provider, rather than meeting its duties via the codes of practice, takes one of the alternative measures instead? Is it somehow wriggling out of what it is supposed to do? The thing that is legally binding, which it must do and about which there is no choice because there is a legal duty, is the duties that we have been debating over the past few days. Those are the binding requirements that cannot be circumvented. The codes of practice propose a way of meeting those. If the service provider can meet the duties in a different way and can satisfy Ofcom that it has met those duties as effectively as it would under the codes of practice, it is open to doing that. We do not want to be unduly prescriptive. The test is: have the duties been delivered? That is non-negotiable and legally binding.

I hope I have answered all the questions, while gently resisting amendment 48 and encouraging the Committee to agree that the various other clauses stand part of the Bill.

Question put, That the amendment be made.

The Committee divided:

Division 21

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

Clause 39 ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Steve Double.)
16:00
Adjourned till Tuesday 14 June at twenty-five minutes past Nine o’clock.
Written evidence reported to the House
OSB61 Badger Trust
OSB62 Lego
OSB63 End Violence Against Women Coalition (EVAW)
OSB64 Hacked Off Campaign (further submission) (re: clause 50)
OSB65 Office of the City Remembrancer, on behalf of the City of London Corporation and City of London Police
OSB66 Juul Labs
OSB67 Big Brother Watch, ARTICLE 19, Open Rights Group, Index on Censorship, and Global Partners Digital
OSB68 News Media Association (supplementary submission)

Online Safety Bill (Ninth sitting)

Committee stage
Tuesday 14th June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 14 June 2022 (14 Jun 2022)
The Committee consisted of the following Members:
Chairs: Sir Roger Gale, † Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
† Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 14 June 2022
(Morning)
[Christina Rees in the Chair]
Online Safety Bill
09:25
The Chair

We are now sitting in public and the proceedings are being broadcast. Please switch electronic devices to silent. Tea and coffee are not allowed during sittings.

Clause 40

Secretary of State’s powers of direction

John Nicolson (Ochil and South Perthshire) (SNP)

I beg to move amendment 84, in clause 40, page 38, line 5, leave out subsection (a).

This amendment would remove the ability of the Secretary of State to modify Ofcom codes of practice ‘for reasons of public policy’.

The Chair

With this it will be convenient to discuss the following:

Clause stand part.

Clause 41 stand part.

New clause 12—Secretary of State’s powers to suggest modifications to a code of practice—

“(1) The Secretary of State may on receipt of a code write within one month of that day to OFCOM with reasoned, evidence-based suggestions for modifying the code.

(2) OFCOM shall have due regard to the Secretary of State’s letter and must reply to the Secretary of State within one month of receipt.

(3) The Secretary of State may only write to OFCOM twice under this section for each code.

(4) The Secretary of State and OFCOM shall publish their letters as soon as reasonably possible after transmission, having made any reasonable redactions for public safety and national security.

(5) If the draft of a code of practice contains modifications made following changes arising from correspondence under this section, the affirmative procedure applies.”

This new clause gives the Secretary of State powers to suggest modifications to a code of practice, as opposed to the powers of direction proposed in clause 40.

John Nicolson

Amendment 84 is very simple: it removes one phrase—

“for reasons of public policy”.

Of all the correspondence that I have had on the Bill—there has been quite a lot—this is the clause that has most aggrieved the experts. A coalition of groups with a broad range of interests, including child safety, human rights, women and girls, sport and democracy, all agree that the Secretary of State is granted excessive powers in the Bill, and that it threatens the independence of the independent regulator. Businesses are also wary of this power, in part due to the uncertainty that it causes.

The reduction of Ministers’ powers under the Bill was advised by the Joint Committee on the draft Bill and by the Digital, Culture, Media and Sport Committee. I am sure that the two hon. Members on the Government Benches who sat on those Committees and added their names to their reports—the hon. Members for Watford and for Wolverhampton North East—will vote for the amendment. How could they possibly have put their names to the Select Committee report and the Joint Committee report and then just a few weeks later decide that they no longer support the very proposals that they had advanced?

Could the Minister inform us which special interest groups specifically have backed the Secretary of State’s public policy powers under the Bill? I am fascinated to know. Surely, all of us believe in public policy that is informed by expert evidence. If the Secretary of State cannot produce any experts at all who believe that the powers that she enjoys are appropriate or an advantage, or improve legislation, then we should not be proceeding in the way that we are. Now that I know that our proceedings are being broadcast live, I also renew my call to anyone watching who is in favour of these powers as they are to say so, because so far we have found no one who holds that position.

We should be clear about exactly what these powers do. Under clause 40, the Secretary of State can modify the draft codes of practice, thus allowing the Government a huge amount of power over the independent communications regulator. The Government have attempted to play down these powers by stating that they would be used only in exceptional circumstances. However, the legislation does not define what “exceptional circumstances” means, and it is far too nebulous a term for us to proceed under the current circumstances. Rather, a direction can reflect public policy. Will the Minister also clarify the difference between “public policy” and “government policy”, which was the wording in the draft Bill?

The regulator must not be politicised in this way. Regardless of the political complexion of the Government, when they have too much influence over what people can say online, the implications for freedom of speech are grave, especially when the content that they are regulating is not illegal. I ask the Minister to consider how he would feel if, rather than being a Conservative, the Culture Secretary came from among my friends on the Labour Benches. I would argue that that would be a significant improvement, but I imagine that the Minister would not. I see from his facial expression that that is the case.

There are ways to future-proof and enhance the transparency of Ofcom in the Bill that do not require the overreach of these powers. When we are allowing the Executive powers over the communications regulator, the protections must be absolute and iron-clad. As it stands, the Bill leaves leeway for abuse of these powers. No matter how slim a chance the Minister feels that there is of that, as parliamentarians we must not allow it. That is why I urge the Government to consider amendment 84.

As somebody who is new to these proceedings, I think it would be nice if, just for once, the Government listened to arguments and were prepared to accept them, rather than us going through this Gilbert and Sullivan pantomime where we advance arguments, we vote and we always lose. The Minister often says he agrees with us, but he still rejects whatever we say.

Alex Davies-Jones (Pontypridd) (Lab)

Good morning, Ms Rees; it is, as always, a pleasure to serve under your chairship.

Amendment 84 would remove the Secretary of State’s ability to modify Ofcom codes of practice

“for reasons of public policy”.

Labour agrees with the Carnegie UK Trust assessment of this: the codes are the fulcrum of the regulatory regime, and this power is a significant interference in Ofcom’s independence. Ofcom itself has noted that the “reasons of public policy” power to direct might weaken the regime. If Ofcom has undertaken a logical process, rooted in evidence, to arrive at a draft code, it is hard to see how a direction based on “reasons of public policy” is not irrational. That then creates a vulnerability to legal challenge.

On clause 40 more widely, the Secretary of State should not be able to give Ofcom specific direction on non-strategic matters. Ofcom’s independence in day-to-day decision making is paramount to preserving freedom of expression. Independence of media regulators is the norm in developed democracies. The UK has signed up to many international statements in that vein, including as recently as April 2022 at the Council of Europe. That statement says that

“media and communication governance should be independent and impartial to avoid undue influence on policy making, discriminatory treatment and preferential treatment of powerful groups, including those with significant political or economic power.”

The Bill introduces powers for the Secretary of State to direct Ofcom on internet safety codes. These provisions should immediately be removed. After all, in broadcasting regulation, Ofcom is trusted to make powerful programme codes with no interference from the Secretary of State. Labour further notes that although the draft Bill permitted this

“to ensure that the code of practice reflects government policy”,

clause 40 now specifies that any code may be required to be modified

“for reasons of public policy”.

Although that is more normal language, it is not clear what in practice the difference in meaning is between the two sets of wording. I would be grateful if the Minister could confirm what that is.

The same clause gives the Secretary of State powers to direct Ofcom, on national security or public safety grounds, in the case of terrorism or CSEA—child sexual exploitation and abuse—codes of practice. The Secretary of State might have some special knowledge of those, but the Government have not demonstrated why they need a power to direct. In the broadcasting regime, there are no equivalent powers, and the Secretary of State was able to resolve the case of Russia Today, on national security grounds, with public correspondence between the Secretary of State and Ofcom.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Good morning, Ms Rees; it is a pleasure to serve under your chairmanship again. The SNP spokesman and the shadow Minister have already explained what these provisions do, which is to provide a power for the Secretary of State to make directions to Ofcom in relation to modifying a code of conduct. I think it is important to make it clear that the measures being raised by the two Opposition parties are, as they said, envisaged to be used only in exceptional circumstances. Of course the Government accept that Ofcom, in common with other regulators, is rightly independent and there should be no interference in its day-to-day regulatory decisions. This clause does not seek to violate that principle.

However, we also recognise that although Ofcom has great expertise as a regulator, there may be situations in which a topic outside its area of expertise needs to be reflected in a code of practice, and in those situations, it may be appropriate for a direction to be given to modify a code of conduct. A recent and very real example would be in order to reflect the latest medical advice during a public health emergency. Obviously, we saw in the last couple of years, during covid, some quite dangerous medical disinformation being spread—concerning, for example, the safety of vaccines or the “prudence” of ingesting bleach as a remedy to covid. There was also the purported and entirely false connection between 5G phone masts and covid. There were issues on public policy grounds—in this case, medical grounds—and it might have been appropriate to make sure that a code of conduct was appropriately modified.

Dean Russell (Watford) (Con)

It was mentioned earlier that some of us were on previous Committees that made broader recommendations that would perhaps be in line with the amendment. Since that time, there has been lots of discussion around this topic, and I have raised it with the Minister and colleagues. I feel reassured that there is a great need to keep the clause as it is, because exceptional circumstances do arise. However, I would like reassurances that directions would be made only in exceptional circumstances and would not override the Ofcom policy or remit, as has just been discussed.

Chris Philp

I can provide my hon. Friend with that reassurance on the exceptional circumstances point. The Joint Committee report was delivered in December, approximately six months ago. It was a very long report—I think it had more than 100 recommendations. Of course, members of the Committee are perfectly entitled, in relation to one or two of those recommendations, to have further discussions, listen further and adjust their views if they individually see fit.

Chris Philp

Let me just finish this point and then I will give way. The shadow SNP spokesman, the hon. Member for Ochil and South Perthshire, asked about the Government listening and responding, and we accepted 66 of the Joint Committee’s recommendations—a Committee that he served on. We made very important changes to do with commercial pornography, for example, and fraudulent advertising. We accepted 66 recommendations, so it is fair to say we have listened a lot during the passage of this Bill. On the amendments that have been moved in Committee, often we have agreed with the amendments but the Bill has already dealt with the matter. I wanted to respond to those two points before giving way.

John Nicolson

I am intrigued, as I am sure viewers will be. What is the new information that has come forward since December that has resulted in the Minister believing that he must stick with this? He has cited new information and new evidence, and I am dying to know what it is.

Chris Philp

I am afraid it was not me that cited new information. It was my hon. Friend the Member for Watford who said he had had further discussions with Ministers. I am delighted to hear that he found those discussions enlightening, as I am sure they—I want to say they always are, but let us say they often are.

Dame Maria Miller (Basingstoke) (Con)

Before my hon. Friend moves on, can I ask a point of clarification? The hon. Member for Ochil and South Perthshire is right that this is an important point, so we need to understand it thoroughly. I think he makes a compelling argument about the exceptional circumstances. If Ofcom did not agree that a change that was being requested was in line with what my hon. Friend the Minister has said, how would it be able to discuss or, indeed, challenge that?

Chris Philp

My right hon. Friend raises a good question. In fact, I was about to come on to the safeguards that exist to address some of the concerns that have been raised this morning. Let me jump to the fourth of the safeguards, which in many ways is the most powerful and directly addresses my right hon. Friend’s question.

In fact, a change has been made. The hon. Member for Ochil and South Perthshire asked what changes had been made, and one important change—perhaps the change that my hon. Friend the Member for Watford found convincing—was the insertion of a requirement for the codes, following a direction, to go before Parliament and be voted on using the affirmative procedure. That is a change. The Bill previously did not have that in it. We inserted the use of the affirmative procedure to vote on a modified code in order to introduce extra protections that did not exist in the draft of the Bill that the Joint Committee commented on.

I hope my right hon. Friend the Member for Basingstoke will agree that if Ofcom had a concern and made it publicly known, Parliament would be aware of that concern before voting on the revised code using the affirmative procedure. The change to the affirmative procedure gives Parliament extra control. It gives parliamentarians the opportunity to respond if they have concerns, if third parties raise concerns, or if Ofcom itself raises concerns.

Kirsty Blackman (Aberdeen North) (SNP)

Before the Minister moves off the point about exceptional circumstances, it was the case previously that an amendment of the law resolution was always considered with Finance Bills. In recent years, that has stopped on the basis of it being exceptional circumstances because a general election was coming up. Then the Government changed that, and now they never table an amendment of the law resolution because they have decided that that is a minor change. Something has gone from being exceptional to being minor, in the view of this Government.

The Minister said that he envisions that this measure will be used only in exceptional circumstances. Can he commit himself to it being used only in exceptional circumstances? Can he give the commitment that he expects that it will be used only in exceptional circumstances, rather than simply envisioning that it will be used in such circumstances?

Chris Philp

I have made clear how we expect the clause to be used. I am slightly hesitant to be more categorical simply because I do not want to make comments that might unduly bind a future Secretary of State—or, indeed, a future Parliament, because the measure is subject to the affirmative procedure—even were that Secretary of State, heaven forbid, to come from a party other than mine. Circumstances might arise, such as the pandemic, in which a power such as this needs to be exercised for good public policy reasons—in that example, public health. I would not want to be too categorical, which the hon. Lady is inviting me to be, lest I inadvertently circumscribe the ability of a future Parliament or a future Secretary of State to act.

The power is also limited in the sense that, in relation to matters that are not to do with national security or terrorism or CSEA, the power to direct can be exercised only at the point at which the code is submitted to be laid before Parliament. It cannot be done at just any point. The power cannot be exercised at a time of the Secretary of State’s choosing. There is one moment, and one moment only, when that power can be exercised.

I also want to make it clear that the power will not allow the Secretary of State to direct Ofcom to require a particular regulated service to take a particular measure. The power relates to the codes of practice; it does not give the power to intrude any further, beyond the code of practice, in the arena of regulated activity.

I understand the points that have been made. We have listened to the Joint Committee, and we have made an important change, which is that to the affirmative procedure. I hope my explanation leaves the Committee feeling that, following that change, this is a reasonable place for clauses 40 and 41 to rest. I respectfully resist amendment 84 and new clause 12, and urge the Committee to allow clauses 40 and 41 to stand part of the Bill.

Question put, That the amendment be made.

Division 22

Ayes: 5

Noes: 9

Clause 40 ordered to stand part of the Bill.
Clauses 41 to 47 ordered to stand part of the Bill.
09:45
Clause 48
OFCOM’s guidance: record-keeping duties and children’s access assessments
Question proposed, That the clause stand part of the Bill.
The Chair

Barbara Keeley, do you wish to speak to the clause?

Chris Philp

Given that the clause is clearly uncontentious, I will be extremely brief.

Chris Philp

I can see that that is the most popular thing I have said during the entire session—when you say, “And finally,” in a speech and the crowd cheers, you know you are in trouble.

Regulated user-to-user and search services will have duties to keep records of their risk assessments and the measures they take to comply with their safety duties, whether or not those are the ones recommended in the codes of practice. They must also undertake a children’s access assessment to determine whether children are likely to access their service.

Clause 48 places a duty on Ofcom to produce guidance to assist service providers in complying with those duties. It will help to ensure a consistent approach from service providers, which is essential in maintaining a level playing field. Ofcom will have a duty to consult the Information Commissioner prior to preparing this guidance, as set out in clause 48(2), in order to draw on the expertise of the Information Commissioner’s Office and ensure that the guidance is aligned with wider data protection and privacy regulation.

Question put and agreed to.

Clause 48 accordingly ordered to stand part of the Bill.

Clause 49

“Regulated user-generated content”, “user-generated content”, “news publisher content”

Kirsty Blackman

I beg to move amendment 89, in clause 49, page 45, line 16, leave out subsection (e).

This amendment would remove the exemption for comments below news articles posted online.

The Chair

With this it will be convenient to discuss amendment 43, in clause 49, page 45, line 19, at end insert—

“(2A) Subsection (2)(e) does not apply in respect of a user-to-user service which is operated by an organisation which—

(a) is a relevant publisher (as defined in section 41 of the Crime and Courts Act 2013); and

(b) has an annual UK turnover in excess of £100 million.”

This amendment removes comments sections operated by news websites where the publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content.

Kirsty Blackman

Thank you, Ms Rees, for your hard work in chairing the Committee this morning; we really appreciate it. Amendment 89 relates to below-the-line comments on newspaper articles. For the avoidance of doubt, if we do not get amendment 89, I am more than happy to support the Labour party’s amendment 43, which has a similar effect but covers slightly fewer—or many fewer—organisations and places.

Below-the-line comments in newspaper articles are infamous. They are places that everybody fears to go. They are worse than Twitter. In a significant number of ways, below-the-line comments are an absolute sewer. I cannot see any reasonable excuse for them to be excluded from the Bill. We are including Twitter in the Bill; why are we not including below-the-line comments for newspapers? It does not make any sense to me; I do not see any logic.

We heard a lot of evidence relating to freedom of speech and a free press, and I absolutely, wholeheartedly agree with that. However, the amendment would not stop anyone writing a letter to the editor. It would not stop anyone engaging with newspapers in the way that they would have in the print medium. It would still allow that to happen; it would just ensure that below-the-line comments were subject to the same constraints as posts on Twitter. That is the entire point of amendment 89.

I do not think that I need to say much more, other than to add one more point about the way comments can direct people to other, more radical and extreme pieces or bits of information. It is sometimes the case that the comments on a newspaper article will direct people to even more extreme views. The newspaper article itself may be just slightly derogatory, while some of the comments may have links or references to other pieces, and other places on the internet where people can find a more radical point of view. That is exactly what happens on Twitter, and is exactly some of the stuff that we are trying to avoid—sending people down an extremist rabbit hole.

Having been contacted by a number of newspapers, I understand and accept that some newspapers have moderation policies for their comments sections, but that is not strong enough. Twitter has a moderation policy, but that does not mean that there is actually any moderation, so I do not think that subjecting below-the-line comments to the provisions of the Bill is asking too much. It is completely reasonable for us to ask for this to happen, and I am honestly baffled as to why the Minister and the Government have chosen to make this exemption.

Alex Davies-Jones

Before I address the amendments, I will speak to clause 49 more broadly.

Labour has concerns about a number of subsections of the clause, including subsections (2) and (8) to (10)—commonly known as the news publisher content exemption, which I have spoken about previously. We understand that the intention of the exemption is to shield broadcasters and traditional newspaper publishers from the Bill’s regulatory effects. Clause 50(2) defines a “recognised news publisher” as a regulated broadcaster or any other publisher that publishes news, has an office, and has a standards code and complaints process. There is no detail about the latter two requirements, thus enabling almost any news publishing enterprise to design its own code and complaints process, however irrational, and so benefit from the exemption. “News” is also defined broadly, and may include gossip. There remains a glaring omission, which amendment 43 addresses and which I will come to.

During an earlier sitting of the Committee, in response to comments made by my hon. Friend the Member for Liverpool, Walton as we discussed clause 2, the Minister claimed that

“The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic.”––[Official Report, Online Safety Public Bill Committee, 7 June 2022; c. 204.]

Clause 49 exempts one-to-one live aural communications from the scope of regulation. Given that much interaction in virtual reality is live aural communication, including between two users, it is hard to understand how that would be covered by the Bill.

There is also an issue about what counts as content. Most standard understandings would define “content” as text, video, images and audio, but one of the worries about interactions in VR is that behaviour such as physical violence will be able to be replicated virtually, with psychologically harmful effects. It is very unclear how that would be within the scope of the current Bill, as it does not clearly involve content, so could the Minister please address that point? As he knows, Labour advocates for a systems-based approach, and for risk assessments and systems to take place in a more upstream and tech-agnostic way than under the current approach. At present, the Bill would struggle to be expanded effectively enough to cover those risks.

Amendment 43 removes comments sections operated by news websites where the publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content. If the Bill is to be effective in protecting the public from harm, the least it must accomplish is a system of accountability that covers all the largest platforms used by British citizens. Yet as drafted, the Bill would exempt some of the most popular social media platforms online: those hosted on news publisher websites, which are otherwise known as comments sections. The amendment would close that loophole and ensure that the comments sections of the largest newspaper websites are subject to the regime of regulation set out in the Bill.

Newspaper comments sections are no different from the likes of Facebook and Twitter, in that they are social media platforms that allow users to interact with one another. This is done through comments under stories, comments in response to other comments, and other interactions—for example, likes and dislikes on posts. In some ways, their capacity to cause harm to the public is even greater: for example, their reach is in many cases larger than even that of the biggest social media platforms. Whereas there are estimated to be around 18 million users of Twitter in the UK, more than twice that number of British citizens access newspaper websites every month, and the harm perpetrated on those platforms is severe.

In July 2020, the rapper Wiley posted a series of antisemitic tweets, which Twitter eventually removed after an unacceptable delay of 48 hours, but under coverage of the incident in The Sun newspaper, several explicitly antisemitic comments were posted. Those comments contained holocaust denial and alleged a global Jewish conspiracy to control the world. They remained up and accessible to The Sun’s 7 million daily readers for the best part of a week. If we exempt comments sections from the Bill’s proposed regime and the duties that the Bill sets for platforms, we will send the message that that kind of vicious, damaging and harmful racism is acceptable.

Similarly, after an antisemitic attack in the German city of Halle, racist comments followed in the comments section under the coverage in The Sun. There are more examples: Chinese people being described as locusts and attacked with other racial slurs; 5G and Bill Gates conspiracy theories under articles on the Telegraph website; and of course, the most popular targets for online abuse, women in public life. Comments that described the Vice-President of the United States as a “rat” and “ho” appeared on the MailOnline. A female union leader has faced dozens of aggressive and abusive comments about her appearance, and many such comments remain accessible on newspaper comments sections to this day. Some of them have been up for months, others for years.

Last week, the Committee was sent a letter from a woman who was the victim of comments section abuse, Dr Corinne Fowler. Dr Fowler said of the comments that she received:

“These comments contained scores of suggestions about how to kill or injure me. Some were general ideas, such as hanging, but many were gender specific, saying that I should be burnt at the stake like a witch. Comments focused on physical violence, one man advising that I should be slapped hard enough to make my teeth chatter”.

She added:

“I am a mother: without me knowing, my son (then 12 years old) read these reader comments. He became afraid for my safety.”

Without the amendment, the Bill cannot do anything to protect women such as Dr Fowler and their families from this vile online abuse, because comments sections will be entirely out of scope of the Bill’s new regime and the duties designed to protect users.

As I understand it, two arguments have been made to support the exemption. First, it is argued that the complaints handlers for the press already deal with such content, but the handler for most national newspapers, the Independent Press Standards Organisation, will not act until a complaint is made. It then takes an average of six months for a complaint to be processed, and it cannot do anything if the comments have not been moderated. The Opposition do not feel that that is a satisfactory response to the seriousness of harms that we know to occur, and which I have described. IPSO does not even have a code to deal with cases of antisemitic abuse that appeared on the comments section of The Sun. IPSO’s record speaks for itself from the examples that I have given, and the many more, and it has proven to be no solution to the severity of harms that appear in newspaper comments sections.

The second argument for an exemption is that publishers are legally responsible for what appears on comments sections, but that is only relevant for illegal harms. For everything else, from disinformation to racial prejudice and abuse, regulation is needed. That is why it is so important that the Bill does the job that we were promised. To keep the public safe from harm online, comments sections must be covered under the Bill.

The amendment is a proportionate solution to the problem of comments section abuse. It would protect users’ freedom of expression and, given that it is subject to a turnover threshold, ensure that duties and other requirements do not place a disproportionate burden on smaller publishers such as locals, independents and blogs.

I have reams and reams and reams of examples from comments sections that all constitute incredibly harmful abuse and should be covered by the Bill. I could be here for hours reading them all out, and while I do not think that anybody in Committee would like me to, I urge Committee members to take a look for themselves at the types of comments under newspaper articles and ask themselves whether those comments should be covered by the terms of the Bill. I think they know the answer.

Kirsty Blackman

On a point of order, Ms Rees. Are we considering clause 49 now? I know that it is supposed to considered under the next set of amendments, but I just wondered, because I have separate comments to make on that clause that I did not make earlier because I spoke purely to the amendment.

The Chair

I did not want to stop Alex Davies-Jones in full flow. When we come to consideration of clause 49, I was going to ask for additional comments, but it is for the Committee to decide whether it is content with that, or would like the opportunity to elaborate on that clause now.

Kirsty Blackman

I am happy to speak on clause 49 now—I can see the Minister is nodding. I really appreciate it, Ms Rees, because I did not want to lose the opportunity to raise concerns about this matter. I have not tabled an amendment but I would appreciate it if the Minister gave consideration to my following comments.

My concern relates to subsection (5) of clause 49, which exempts one-to-one live aural communications in relation to user-to-user services. Specifically, my concern is about child sexual abuse and grooming. I am worried that exempting those one-to-one live aural communications gives bad actors, people who are out to attack children, a loophole through which to do so. We know that on games such as Fortnite, one-to-one aural communication happens.

I am not entirely sure how communication happens on Roblox and whether there is an opportunity for that there. However, we also know that a number of people who play online games have communication on Discord at the same time. Discord is incredibly popular, and we know that there is an opportunity for, and a prevalence of, grooming on there. I am concerned that exempting this creates a loophole for people to attack children in a way that the Minister is trying to prevent with the Bill. I understand why the clause is there but am concerned that the loophole is created.

10:00
We know—or I know, having some of my own—that children and young people cannot really be bothered to type things and much prefer to leave a voice message or something. I appreciate that voice messages do not count as live, but some conversations that will happen on platforms such as Discord are live, and those are the most harmful places where children can be encouraged to create child sexual abuse images, for example. I do not necessarily expect the Minister to have all the answers today, and I know there will be other opportunities to amend the Bill, but I would really appreciate it if he took a good look at the Bill and considered whether strengthening provisions can be put in place. If he desires to exempt one-to-one aural communications, he may still do that, while ensuring that child sexual abuse and grooming behaviour are considered illegal and within the scope of the Bill in whatever form they take place, whether in aural communications or in any other way.
Chris Philp

Let me start by addressing the substance of the two amendments and then I will answer one or two of the questions that arose in the course of the debate.

As Opposition Members have suggested, the amendments would bring the comments that appear below the line on news websites such as The Guardian, MailOnline or the BBC into the scope of the Bill’s safety duties. They are right to point out that there are occasions when the comments posted on those sites are extremely offensive.

There are two reasons why comments below BBC, Guardian or Mail articles are excluded from the scope of the Bill. First, the news media publishers—newspapers, broadcasters and their representative industry bodies—have made the case to the Government, which we are persuaded by, that the comments section below news articles is an integral part of the process of publishing news and of what it means to have a free press. The news publishers—both newspapers and broadcasters that have websites—have made that case and have suggested, and the Government have accepted, that intruding into that space through legislation and regulation would represent an intrusion into the operation of the free press.

Alex Davies-Jones

I am sorry, but I am having real trouble buying that argument. If the Minister is saying that newspaper comments sections are exempt in order to protect the free press because they are an integral part of it, why do we need the Bill in the first place? Social media platforms could argue in the same way that they are protecting free speech. They could ask, “Why should we regulate any comments on our social media platform if we are protecting free speech?” I am sorry; that argument does not wash.

Chris Philp

There is a difference between random individuals posting stuff on Facebook, as opposed to content generated by what we have defined as a “recognised news publisher”. We will debate that in a moment. We recognise that is different in the Bill. Although the Opposition are looking to make amendments to clause 50, they appear to accept that the press deserve special protection. Article 10 case law deriving from the European convention on human rights also recognises that the press have a special status. In our political discourse we often refer generally to the importance of the freedom of the press. We recognise that the press are different, and the press have made the case—both newspapers and broadcasters, all of which now have websites—that their reader engagement is an integral part of that free speech. There is a difference between that and individuals chucking stuff on Facebook outside of the context of a news article.

There is then a question about whether, despite that, those comments are still sufficiently dangerous that they merit regulation by the Bill—a point that the shadow Minister, the hon. Member for Pontypridd, raised. There is a functional difference between comments made on platforms such as Facebook, Twitter, TikTok, Snapchat or Instagram, and comments made below the line on a news website, whether it is The Guardian, the Daily Mail, the BBC—even The National. The difference is that on social media platforms, which are the principal topic of the Bill, there is an in-built concept of virality—things going viral by sharing and propagating content widely. The whole thing can spiral rapidly out of control.

Virality is an inherent design feature in social media sites. It is not an inherent design feature of the comments we get under the news website of the BBC, The Guardian or the Daily Mail. There is no way of generating virality in the same way as there is on Facebook and Twitter. Facebook and Twitter are designed to generate massive virality in a way that comments below a news website are not. The reach, and the ability for them to grow exponentially, is orders of magnitude lower on a news website comment section than on Facebook. That is an important difference, from a risk point of view.

Kim Leadbeater (Batley and Spen) (Lab)

This issue comes down to a fundamental point—are we looking at volume or risk? There is no difference between an individual—a young person in this instance—seeing something about suicide or self-harm on a Facebook post or in the comments section of a newspaper article. The volume—whether it goes viral or not—does not matter if that individual has seen that content and it has directed them to somewhere that will create serious harm and lead them towards dangerous behaviour. The volume is not the point.

Chris Philp

The hon. Lady raises an important philosophical question that underpins much of the Bill’s architecture. All the measures are intended to strike a balance. Where there are things that are at risk of leading to illegal activity, and things that are harmful to children, we are clamping down hard, but in other areas we are being more proportionate. For example, the legal but harmful to adult duties only apply to category 1 companies, and we are looking at whether that can be extended to other high-risk companies, as we debated earlier. In the earlier provisions that we debated, about “have regard to free speech”, there is a balancing exercise between the safety duties and free speech. A lot of the provisions in the Bill have a sense of balance and proportionality. In some areas, such as child sexual exploitation and abuse, there is no balance. We just want to stop that—end of story. In other areas, such as matters that are legal but harmful and touch on free speech, there is more of a balancing exercise.

In this area of news publisher content, we are again striking a balance. We are saying that the inherent harmfulness of those sites, owing to their functionality—they do not go viral in the same way—is much lower. There is also an interaction with freedom of the press, as I said earlier. Thus, we draw the balance in a slightly different way. To take the example of suicide promotion or self-harm content, there is a big difference between stumbling across something in comment No. 74 below a BBC article, versus the tragic case of Molly Russell—the 14-year-old girl whose Instagram account was actively flooded, many times a day, with awful content promoting suicide. That led her to take her own life.

I think the hon. Member for Batley and Spen would probably accept that there is a functional difference between a comment that someone has to scroll down a long way to find and probably sees only once, and being actively flooded with awful content. In having regard to those different arguments—the risk and the freedom of the press—we try to strike a balance. I accept that they are not easy balances to strike, and that there is a legitimate debate to be had on them. However, that is the reason that we have adopted this approach.

Kirsty Blackman

I have a question on anonymity. On social media there will be a requirement to verify users’ identities, so if somebody posts on Twitter that they want to lynch me, it is possible to find out who that is, provided they do not have an anonymous account. There is no such provision for newspaper comment sections, so I assume it would be much more difficult for the police to find them, or for me to avoid seeing anonymous comments that threaten my safety below the line of newspaper articles—comments that are just as harmful as those that threaten my safety on social media. Can the Minister convince me otherwise?

Chris Philp

The hon. Lady is correct in her analysis, I can confirm. Rather similar to the previous point, because of the interaction with freedom of the press—the argument that the newspapers and broadcasters have advanced—and because this is an inherently less viral environment, we have drawn the balance where we have. She is right to highlight a reasonable risk, but we have struck the balance in the way we have for that reason.

The shadow Minister, the hon. Member for Pontypridd, asked whether very harmful or illegal interactions in the metaverse would be covered or whether they have a metaphorical “get out of jail free” card owing to the exemption in clause 49(2)(d) for “one-to-one live aural communications”. In essence, she is asking whether, in the metaverse, if two users went off somewhere and interacted only with each other, that exemption would apply and they would therefore be outwith the scope of the Bill. I am pleased to tell her they would not, because the definition of live one-to-one aural communications goes from clause 49(2)(d) to clause 49(5), which defines “live aural communications”. Clause 49(5)(c) states that the exemption applies only if it

“is not accompanied by user-generated content of any other description”.

The actions of a physical avatar in the metaverse do constitute user-generated content of any other description. Owing to that fact, the exemption in clause 49(2)(d) would not apply to the metaverse.

I am happy to provide clarification on that. It is a good question and I hope I have provided an example of how, even though the metaverse was not conceived when the Bill was conceived, it does have an effect.

Kirsty Blackman

On that point, when it comes to definition of content, we have tabled an amendment about “any other content”. I am not convinced that the definition of content adequately covers what the Minister stated, because it is limited, does not include every possible scenario where it is user-generated and is not future-proofed enough. When we get to that point, I would appreciate it if the Minister would look at the amendment and ensure that what he intends is what happens.

Chris Philp

I am grateful to the hon. Lady for thinking about that so carefully. I look forward to her amendment. For my information, which clause does her amendment seek to amend?

Kirsty Blackman

I will let the Minister know in a moment.

Chris Philp

I am grateful. It is an important point.

Dean Russell

During the Joint Committee we were concerned about future-proofing. Although I appreciate it is not specifically included in the Bill because it is a House matter, I urge the setting up of a separate Online Safety Act committee that runs over time, so that it can continue to be improved upon and expanded, which would add value. We do not know what the next metaverse will be in 10 years’ time. However, I feel confident that the metaverse was included and I am glad that the Minister has confirmed that.

Chris Philp

I thank my hon. Friend for his service on the Joint Committee. I heard the representations of my right hon. Friend the Member for Basingstoke about a Joint Committee, and I have conveyed them to the higher authorities.

Kirsty Blackman

The amendment that the Minister is asking about is to clause 189, which states:

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.

It is amendment 76 that, after “including”, would insert “but not limited to”, in order that the Bill is as future-proofed as it can be.

10:15
Chris Philp

I thank the hon. Lady for her rapid description of that amendment. We will come to clause 189 in due course. The definition of “content” in that clause is,

“anything communicated by means of an internet service”,

which sounds like it is quite widely drafted. However, we will obviously debate this issue properly when we consider clause 189.

The remaining question—

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I intervene rather than making a subsequent substantive contribution because I am making a very simple point. My hon. Friend the Minister is making a really compelling case about the need for freedom of speech and the need to protect it within the context of newspapers online. However, could he help those who might be listening to this debate today to understand who is responsible if illegal comments are made on newspaper websites? I know that my constituents would be concerned about that, not so much if illegal comments were made about a Member of Parliament or somebody else in the public eye, but if they were made about another individual not in the public eye.

What redress would that individual have? Would it be to ask the newspaper to take down that comment, or would it be that they could find out the identity of the individual who made the comment, or would it be that they could take legal action? If he could provide some clarity on that, it might help Committee members to understand even further why he is taking the position that he is taking.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank my right hon. Friend for that intervention. First, clearly if something illegal is said online about someone, they would have the normal redress to go to the police and the police could seek to exercise their powers to investigate the offence, including requesting the company that hosts the comments—in this case, it would be a newspaper’s or broadcaster’s website—to provide any relevant information that might help to identify the person involved; they might have an account, and if they do not they might have a log-on or IP address. So, the normal criminal investigatory procedures would obviously apply.

Secondly, if the content was defamatory—I realise that only people like Arron Banks can sue for libel—there is obviously civil recourse for libel. I also think there are powers in the civil procedure rules that allow for court orders to be made that require organisations, such as news media websites, to disclose information that would help to identify somebody who is a respondent in a civil case.

Thirdly, there are obviously the voluntary steps that the news publisher might take to remove content. News publishers say that they do that; obviously, their implementation, as we know, is patchy. Nevertheless, there is that voluntary route.

Regarding any legal obligation that may fall on the shoulders of the news publisher itself, I am not sure that I have sufficient legal expertise to comment on that. However, I hope that those first three areas of redress that I have set out give my right hon. Friend some assurance on this point.

Finally, I turn to a question asked by the hon. Member for Aberdeen North. She asked whether the exemption for “one-to-one live aural communications”, as set out in clause 49(2)(d), could inadvertently allow grooming or child sexual exploitation to occur via voice messages that accompany games, for example. The exemption is designed to cover what are essentially phone calls, such as Skype conversations—one-to-one conversations that are low-risk.

We believe that the Bill contains other duties to ensure that services are designed to reduce the risk of grooming and to address risks to children, if those risks exist, such as on gaming sites. I would be happy to come back to the hon. Lady with a better analysis and explanation of where those duties sit in the Bill, but there are very strong duties elsewhere in the Bill that impose those obligations to conduct risk assessments and to keep children safe in general. Indeed, the very strongest provisions in the Bill are around stopping child sexual exploitation and abuse, as set out in schedule 6.

Finally, there is a power in clause 174(1) that allows us, as parliamentarians and the Government, to repeal this exemption using secondary legislation. So, if we found in the future that this exemption caused a problem, we could remove it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That is helpful for understanding the rationale, but in the light of how people communicate online these days, although exempting telephone conversations makes sense, exempting what I am talking about does not. I would appreciate it if the Minister came back to me on that, and he does not have to give me an answer now. It would also help if he explained the difference between “aural” and “oral”, which are mentioned at different points in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I will certainly come back with a more complete analysis of the point about protecting children—as parents, that clearly concerns us both. The literal definitions are that “aural” means “heard” and “oral” means “spoken”. They occur in different places in the Bill.

This is a difficult issue and legitimate questions have been raised, but as I said in response to the hon. Member for Batley and Spen, in this area as in others, there are balances to strike and different considerations at play—freedom of the press on the one hand, and the level of risk on the other. I think that the clause strikes that balance in an appropriate way.

Question put, That the amendment be made.

Division 23

Ayes: 5

Noes: 9

Amendment proposed: 43, in clause 49, page 45, line 19, at end insert—
“(2A) Subsection (2)(e) does not apply in respect of a user-to-user service which is operated by an organisation which—
(a) is a relevant publisher (as defined in section 41 of the Crime and Courts Act 2013); and
(b) has an annual UK turnover in excess of £100 million.” —(Alex Davies-Jones.)
This amendment removes comments sections operated by news websites where the publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content.
Question put, That the amendment be made.

Division 24

Ayes: 5

Noes: 9

Clause 49 ordered to stand part of the Bill.

Clause 50

“Recognised news publisher”

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I beg to move amendment 107, in clause 50, page 46, line 46, leave out from end to end of clause and insert

“is a member of an approved regulator (as defined in section 42 of the Crime and Courts Act 2013).”

This amendment expands the definition of a recognised news publisher to incorporate any entity that is a member of an approved regulator.

The primary purpose of the Bill is to protect social media users from harm, and it will have failed if it does not achieve that. Alongside that objective, the Bill must protect freedom of expression and, in particular, the freedom of the press, which I know we are all committed to upholding and defending. However, in evaluating the balance between freedom of the press and the freedom to enjoy the digital world without encountering harm, the Bill as drafted has far too many loopholes and risks granting legal protection to those who wish to spread harmful content and disinformation in the name of journalism.

Amendment 107 will address that imbalance and protect the press and us all from harm. The media exemption in the Bill is a complete exemption, which would take content posted by news publishers entirely out of the scope of platforms’ legal duties to protect their users. Such a powerful exemption must be drafted with care to ensure it is not open to abuse. However, the criteria that organisations must meet to qualify for the exemption, which are set out in clause 50, are loose and, in some cases, almost meaningless. They are open to abuse, they are ambiguous and they confer responsibility on the platforms themselves to decide which publishers meet the Bill’s criteria and which do not.

In evidence that we heard recently, it was clear that the major platforms do not believe it is a responsibility they should be expected to bear, nor do they have the confidence or feel qualified to do so. Furthermore, David Wolfe, chairman of the Press Recognition Panel, has advised that the measure represents a threat to press freedom. I agree.

Opening the gates for any organisation to declare themselves a news publisher by obtaining a UK address, jotting down a standards code on the back of an envelope and inviting readers to send an email if they have any complaints is not defending the press; it is opening the profession up to abuse and, in the long term, risks weakening its rights and protections.

Let us discuss those who may wish to exploit that loophole and receive legal protection to publish harmful content. A number of far-right websites have made white supremacist claims and praised Holocaust deniers. Those websites already meet several of the criteria for exemption and could meet the remaining criteria overnight. The internet is full of groups that describe themselves as news publishers but distribute profoundly damaging and dangerous material designed to promote extremist ideologies and stir up hatred.

We can all think of high-profile individuals who use the internet to propagate disinformation, dangerous conspiracy theories and antisemitic, Islamophobic, homophobic or other forms of abuse. They might consider themselves journalists, but the genuine professionals whose rights we want to protect beg to differ. None of those individuals should be free to publish harmful material as a result of exemptions that are designed for quite a different purpose. Is it really the Government’s intention that any organisation that meets their loose criteria, as defined in the Bill, should be afforded the sacrosanct rights and freedoms of the press that we all seek to defend?

I turn to disinformation, and to hostile state actors who wish to sow the seeds of doubt and division in our politics and our civic life. The Committee has already heard that Russia Today is among those expected to benefit from the exemption. I have a legal opinion from Tamsin Allen, a senior media lawyer at Bindmans LLP, which notes that,

“were the bill to become law in its present form, Russia Today would benefit from the media exemption. The exemption for print and online news publications is so wide that it would encompass virtually all publishers with multiple contributors, an editor and some form of complaints procedure and standards code, no matter how inadequate. I understand that RT is subject to a standards code in Russia and operates a complaints procedure. Moreover, this exemption could also apply to a publisher promoting hate or violence, providing it met the (minimal) standards set out in the bill and constituted itself as a ‘news’ or ‘gossip’ publication. The only such publications which would not be exempt are those published by organisations proscribed under the Terrorism Act.”

If hostile foreign states can exploit this loophole in the Bill to spread disinformation to social media users in the UK, that is a matter of national security and a threat to our freedom and open democracy. The requirement to have a UK address offers little by way of protection. International publishers spreading hate, disinformation or other forms of online harm could easily set up offices in the UK to qualify for this exemption and instantly make the UK the harm capital of the world. For those reasons, the criteria must change.

We heard from several individuals in evidence that the exemption should be removed entirely from the Bill, but we are committed to freedom of the press as well as providing proper protections from harm. Instead of removing the exemption, I propose a change to the qualifying criteria to ensure that credible publishers can access it while extremist and harmful publishers cannot.

My amendment would replace the convoluted list of requirements with a single and simple requirement for the platforms to follow and adhere to: that all print and online media that seeks to benefit from the exemption should be independently regulated under the royal charter provisions that this House has already legislated for. If, as the Bill already says, broadcast media should be defined in this way, why not print media too? Unlike the Government’s criteria, the likes of Russia Today, white supremacist blogs and other deeply disturbing extremist publications simply could not satisfy this requirement. If they were ever to succeed in signing up to such a regulator, they would swiftly be expelled for repeated standards breaches.

10:29
This amendment, supported by the Press Recognition Panel and the Independent Media Association, would help to rebalance the rights of the press and the right to be protected from harm in the Bill. I do not pretend that this is a perfect solution to a complex problem, and I am aware that it raises wider issues around the independence of press regulators, but I believe that if the press wish to afford themselves the protections offered in this Bill, it is for them to satisfy Parliament that the requirements of existing legislation are being met.
There is no simple, agreed definition of what constitutes a recognised news publisher, and even those who have given evidence on behalf of the press have conceded that, but we must find a way to navigate this challenge. As drafted, the Bill does not do that. I am open to working with colleagues from all parties to tweak and improve this amendment, and to find an acceptable and agreed way to secure the balance we all wish to see. However, so far I have not seen or heard a better way to tighten the definitions in the Bill so as to achieve this balance, and I believe this amendment is an important step in the right direction.
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Batley and Spen for her speech. There is agreement across the House, in this Committee and in the Joint Committee that the commitment to having a free press in this country is extremely important. That is why recognised news publishers are exempted from the provisions of the Bill, as the hon. Lady said.

The clause, as drafted, has been looked at in some detail over a number of years and debated with news publishers and others. It is the best attempt that we have so far collectively been able to come up with to provide a definition of a news publisher that does not infringe on press freedom. The Government are concerned that if the amendment were adopted, it would effectively require news publishers to register with a regulator in order to benefit from the exemption. That would constitute the imposition of a mandatory press regulator by the back door. I put on record that this Government do not support any kind of mandatory or statutory press regulation, in any form, for reasons of freedom of the press. Despite what has been said in previous debates, we think to do that would unreasonably restrict the freedom of the press in this country.

While I understand its intention, the amendment would drive news media organisations, both print and broadcast, into the arms of a regulator, because they would have to join one in order to get the exemption. We do not think it is right to create that obligation. We have reached the philosophical position that statutory or mandatory regulation of the press is incompatible with press freedom. We have been clear about that general principle and cannot accept the amendment, which would violate that principle.

In relation to hostile states, such as Russia, I do not think anyone in the UK press would have the slightest objection to us finding ways to tighten up on such matters. As I have flagged previously, thought is being given to that issue, but in terms of the freedom of the domestic press, we feel very strongly that pushing people towards a regulator is inappropriate in the context of a free press.

The characterisation of these provisions is a little unfair, because some of the requirements are not trivial. The requirement in clause 50(2)(f) is that there must be a person—I think it includes a legal person as well as an actual person—who has legal responsibility for the material published, which means that, unlike with pretty much everything that appears on the internet, there is an identified person who has legal responsibility. That is a very important requirement. Some of the other requirements, such as having a registered address and a standards code, are relatively easy to meet, but the point about legal responsibility is very important. For that reason, I respectfully resist the amendment.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I will not push the amendment to a vote, but it is important to continue this conversation, and I encourage the Minister to consider the matter as the Bill proceeds. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

I beg to move amendment 86, in clause 50, page 47, line 3, after “material” insert—

“or special interest news material”.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 87, in clause 50, page 47, line 28, leave out the first “is” and insert—

“and special interest news material are”.

Amendment 88, in clause 50, page 47, line 42, at end insert—

““special interest news material” means material consisting of news or information about a particular pastime, hobby, trade, business, industry or profession.”

John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

In its current form, the Online Safety Bill states that platforms do not have any duties relating to content from recognised media outlets and news publishers, and the outlets’ websites are also exempt from the scope of the Bill. However, the way the Bill is drafted means that hundreds of independently regulated specialist publishers’ titles will be excluded from the protections afforded to recognised media outlets and news publishers. This will have a long-lasting and damaging effect on an indispensable element of the UK’s media ecosystem.

Specialist publishers provide unparalleled insights into areas that broader news management organisations will likely not analyse, and it would surely be foolish to dismiss and damage specialist publications in a world where disinformation is becoming ever more prevalent. The former Secretary of State, the right hon. Member for Maldon (Mr Whittingdale), also raised this issue on Second Reading, where he stated that specialist publishers

“deserve the same level of protection.”—[Official Report, 19 April 2022; Vol. 712, c. 109.]

Part of the rationale for having the news publishers exemption in the Bill is that it means that the press will not be double-regulated. Special interest material is already regulated, so it should benefit from the same exemptions.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

For the sake of clarity, and for the benefit of the Committee and those who are watching, could the hon. Gentleman say a bit more about what he means by specialist publications and perhaps give one or two examples to better illustrate his point?

John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

I would be delighted to do so. I am talking about specific and occasionally niche publications. Let us take an example. Gardeners’ World is not exactly a hotbed of online harm, and nor is it a purveyor of disinformation. It explains freely which weeds to pull up and which not to, without seeking to confuse people in any way. Under the Bill, however, such publications will be needlessly subjected to rules, creating a regulatory headache for the sector. This is a minor amendment that will help many businesses, and I would be interested to hear from the Minister why the Government will not listen to the industry on this issue.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Ochil and South Perthshire for his amendment and his speech. I have a couple of points to make in reply. The first is that the exemption is about freedom of the press and freedom of speech. Clearly, that is most pertinent and relevant in the context of news, information and current affairs, which is the principal topic of the exemption. Were we to expand it to cover specialist magazines—he mentioned Gardeners’ World—I do not think that free speech would have the same currency when it comes to gardening as it would when people are discussing news, current affairs or public figures. The free speech argument that applies to newspapers, and to other people commenting on current affairs or public figures, does not apply in the same way to gardening and the like.

That brings me on to a second point. Only a few minutes ago, the hon. Member for Batley and Spen drew the Committee’s attention to the risks inherent in the clause that a bad actor could seek to exploit. It was reasonable of her to do so. Clearly, however, the more widely we draft the clause—if we include specialist publications such as Gardeners’ World, whose circulation will no doubt soar on the back of this debate—the greater the risk of bad actors exploiting the exemption.

My third point is about undue burdens being placed on publications. To the extent that such entities count as social media platforms—in-scope services—the most onerous duties under the Bill apply only to category 1 companies, or the very biggest firms such as Facebook and so on. The “legal but harmful” duties and many of the risk assessment duties would not apply to many organisations. In fact, I think I am right to say that if the only functionality on their websites is user comments, they would in any case be outside the scope of the Bill. I have to confess that I am not intimately familiar with the functionality of the Gardeners’ World website, but there is a good chance that if all it does is to provide the opportunity to post comments and similar things, it would be outside the scope of the Bill anyway, because it does not have the requisite functionality.

I understand the point made by the hon. Member for Ochil and South Perthshire, but we will, respectfully, resist the amendment for the many reasons I have given.

None Portrait The Chair
- Hansard -

John, do you wish to press the amendment to a vote?

John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

No, I will let that particular weed die in the bed. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Briefly, as with earlier clauses, the Labour party recognises the challenge in finding the balance between freedom of expression and keeping people safe online. Our debate on the amendment has illustrated powerfully that the exemptions as they stand in the Bill are hugely flawed.

First, the exemption is open to abuse. Almost any organisation could develop a standards code and complaints process to define itself as a news publisher and benefit from the exemption. Under those rules, as outlined eloquently by my hon. Friend the Member for Batley and Spen, Russia Today already qualifies, and various extremist publishers could easily join it. Organisations will be able to spread seriously harmful content with impunity—I referred to many in my earlier contributions, and I have paid for that online.

Secondly, the exemption is unjustified, as we heard loud and clear during the oral evidence sessions. I recall that Kyle from FairVote made that point particularly clearly. There are already rigorous safeguards in the Bill to protect freedom of expression. The fact that content is posted by a news provider should not itself be sufficient reason to treat such content differently from that which is posted by private citizens.

Furthermore, quality publications with high standards stand to miss out on the exemption. The Minister must also see the lack of parity in the broadcast media space. In order for broadcast media to benefit from the exemption, they must be regulated by Ofcom, and yet there is no parallel stipulation for non-broadcast media to be regulated in order to benefit. How is that fair? For broadcast media, the requirement to be regulated by Ofcom is simple, but for non-broadcast media, the series of requirements is not rational, excludes many independent publishers and leaves room for ambiguity.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a couple of questions that were probably too long for interventions. The Minister said that if comments on a site are the only user-generated content, they are not in scope. It would be really helpful if he explained what exactly he meant by that. We were talking about services that do not fall within the definition of “recognised news publishers”, because we were trying to add them to that definition. I am not suggesting that the Minister is wrong in any way, but I do not understand where the Bill states that those comments are excluded, and how this all fits together.

10:45
My hon. Friend the Member for Ochil and South Perthshire mentioned Gardeners’ World. There are also websites and specialist online publications such as the British Medical Journal that are subject to specific regulation that is separate from the Bill; if they have any user-to-user functionality—I do not know whether the BMJ does—they will also be subject to the requirements described in the Bill. Such publications are inoffensive and provide a huge amount of important information to people; that is not necessarily to say that they should not be regulated, but it does not seem that there is a level playing field. Particularly during the pandemic, peer-reviewed scientific journals were incredibly important in spreading public service information; nevertheless, the Bill includes them in its scope, but not news publications. I am not sure why the Minister is drawing the line where he is on this issue, so a little more clarity would be appreciated.
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I made general comments about clause 50 during the debate on amendment 107; I will not try the Committee’s patience by repeating them, but I believe that in them, I addressed some of the issues that the shadow Minister, the hon. Member for Pontypridd, has raised.

On the hon. Member for Aberdeen North’s question about where the Bill states that sites with limited functionality—for example, functionality limited to comments alone—are out of scope, paragraph 4(1) of schedule 1 states that

“A user-to-user service is exempt if the functionalities of the service are limited, such that users are able to communicate by means of the service only in the following ways—

(a) posting comments or reviews relating to provider content;

(b) sharing such comments or reviews on a different internet service”.

Clearly, services where a user can share freely are in scope, but if they cannot share directly—if they can only share via another service, such as Facebook—that service is out of scope. This speaks to the point that I made to the hon. Member for Batley and Spen in a previous debate about the level of virality, because the ability of content to spread, proliferate, and be forced down people’s throats is one of the main risks that we are seeking to address through the Bill. I hope that paragraph 4(1) of schedule 1 is of assistance, but I am happy to discuss the matter further if that would be helpful.

Question put and agreed to.

Clause 50 accordingly ordered to stand part of the Bill.

Clause 51

“Search content”, “search results” etc

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour does not oppose the intention of the clause. It is important to define “search content” in order to understand the responsibilities that fall within search services’ remits.

However, we have issues with the way that the Bill treats user-to-user services and search services differently when it comes to risk-assessing and addressing legal harm—an issue that we will come on to when we debate schedule 10. Although search services rightly highlight that the content returned by a search is not created or published by them, the algorithmic indexing, promotion and search prompts provided in search bars are fundamentally their responsibility. We do, however, accept that over the past 20 years, Google, for example, has developed mechanisms to provide a safer search experience for users while not curtailing access to lawful information. We also agree that search engines are critical to the proper functioning of the world wide web; they play a uniquely important role in facilitating access to the internet, and enable people to access, impart, and disseminate information.

Question put and agreed to.

Clause 51 accordingly ordered to stand part of the Bill.

Clause 52

“Illegal content” etc

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 61, in clause 52, page 49, line 5, at end insert—

“(4A) An offence referred to in subsection (4) is deemed to have occurred if it would be an offence under the law of the United Kingdom regardless of whether or not it did take place in the United Kingdom.”

This amendment brings offences committed overseas within the scope of relevant offences for the purposes of defining illegal content.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause stand part.

That schedules 5 and 6 be the Fifth and Sixth schedules to the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

With your permission, Ms Rees, I will speak to clause 52 before coming to amendment 61. Illegal content is defined in clause 52(2) as

“content that amounts to a relevant offence.”

However, as the Minister will know from representations from Carnegie UK to his Department—we share its concerns—the illegal and priority illegal regimes may not be able to operate as intended. The Bill requires companies to decide whether content “amounts to” an offence, with limited room for movement. We share concerns that that points towards decisions on an item-by-item basis; it means detecting intent for each piece of content. However, such an approach does not work at the scale on which platforms operate; it is bad regulation and poor risk management.

There seem to be two different problems relating to the definition of “illegal content” in clause 52. The first is that it is unclear whether we are talking about individual items of content or categories of content—the word “content” is ambiguous because it can be singular or plural—which is a problem for an obligation to design and run a system. Secondly, determining when an offence has taken place will be complex, especially bearing in mind mens rea and defences, so the providers are not in a position to get it right.

The use of the phrase “amounts to” in clause 52(2) seems to suggest that platforms will be required to identify accurately, in individual cases, where an offence has been committed, without any wriggle room drafted in, unlike in the draft Bill. As the definition now contains no space for error either side of the line, it could be argued that there are more incentives to avoid false negatives than false positives—providers can set higher standards than the criminal law—and that leads to a greater risk of content removal. That becomes problematic, because it seems that the obligation under clause 9(3) is then to have a system that is accurate in all cases, whereas it would be more natural to deal with categories of content. This approach seems not to be intended; support for that perspective can be drawn from clause 9(6), which recognises that there is a distinction between categories of content and individual items, and that the application of terms of service might specifically have to deal with individual instances of content. Critically, the “amounts to” approach cannot work in conjunction with a systems-based approach to harm reduction. That leaves victims highly vulnerable.

This problem is easily fixed by a combination of reverting to the draft Bill’s language, which required reasonableness, and using concepts found elsewhere in the Bill that enable a harm mitigation system to operate for illegal content. We also remind the Minister that Ofcom raised this issue in the evidence sessions. I would be grateful if the Minister confirmed whether we can expect a Government amendment to rectify this issue shortly.

More broadly, as we know, priority illegal content, which falls within illegal content, includes,

“(a) terrorism content,

(b) CSEA content, and

(c) content that amounts to an offence specified in Schedule 7”,

as set out in clause 52(7). Such content attracts a greater level of scrutiny and regulation. Situations in which user-generated content will amount to “a relevant offence” are set out in clause 52(3). Labour supports the inclusion of a definition of illegal content as outlined in the grouping; it is vital that service providers and platforms have a clear indication of the types of content that they will have a statutory duty to consider when building, or making changes to the back end of, their business models.

We have also spoken about the importance of parity between the online and offline spaces—what is illegal offline must be illegal online—so the Minister knows we have more work to do here. He also knows that we have broad concerns around the omissions in the Bill. While we welcome the inclusion of terrorism and child sexual exploitation content as priority illegal content, there remain gaps in addressing violence against women and girls content, which we all know is hugely detrimental to many online.

The UK Government stated that their intention for the Online Safety Bill was to make the UK the safest place to be online in the world, yet the Bill does not mention online gender-based violence once. More than 60,000 people have signed the Glitch and End Violence Against Women Coalition’s petition calling for women and girls to be included in the Bill, so the time to act is now. We all have a right to not just survive but thrive, engage and play online, and not have our freedom of expression curtailed or our voices silenced by perpetrators of abuse. The online space is just as real as the offline space. The Online Safety Bill is our opportunity to create safe digital spaces.

The Bill must name the problem. Violence against women and girls, particularly those who have one or multiple protected characteristics, is creating harm and inequality online. We must actively and meaningfully name this issue and take an intersectional approach to ending online abuse to ensure that the Bill brings meaningful change for all women. We also must ensure that the Bill truly covers all illegal content, whether it originated in the UK or not.

Amendment 61 brings offences committed overseas within the scope of relevant offences for the purposes of defining illegal content. The aim of the amendment is to clarify whether the Bill covers content created overseas that would be illegal if what was shown in the content took place in the UK. For example, animal abuse and cruelty content is often filmed abroad. The same can be said for dreadful human trafficking content and child sexual exploitation. The optimal protection would be if the Bill’s definition of illegal content covered matter that would be illegal in either the UK or the country it took place in, regardless of whether it originated in the UK.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I do not intend to make a speech, but I want to let the hon. Lady know that we wholeheartedly support everything that she has said on the clause and amendment 61.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful for the hon. Member’s contribution, and for her support for the amendment and our comments on the clause.

The Bill should be made clearer, and I would appreciate an update on the Minister’s assessment of the provisions in the Bill. Platforms and service providers need clarity if they are to take effective action against illegal content. Gaps in the Bill give rise to serious questions about the overwhelming practical challenges of the Bill. None of us wants a two-tier internet, in which user experience and platforms’ responsibilities in the UK differ significantly from those in the rest of the world. Clarifying the definition of illegal content and acknowledging the complexity of the situation when content originates abroad are vital if this legislation is to tackle wide-ranging, damaging content online. That is a concern I raised on Second Reading, and a number of witnesses reiterated it during the oral evidence sessions. I remind the Committee of the comments of Kevin Bakhurst from Ofcom, who said:

“We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of ‘illegal content’ is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 8, Q7.]

That has been reiterated by myriad other stakeholders, so I would be grateful for the Minister’s comments.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I rise to speak on clause 52 stand part, particularly—the Minister will not be surprised—the element in subsection (4)(c) around the offences specified in schedule 7. The debate has been very wide-ranging throughout our sittings. It is extraordinary that we need a clause defining what is illegal. Presumably, most people who provide goods and services in this country would soon go out of business if they were not knowledgeable about what is illegal. The Minister is helping the debate very much by setting out clearly what is illegal, so that people who participate in the social media world are under no illusion as to what the Government are trying to achieve through this legislation.

The truth is that the online world has unfolded without a regulatory framework. New offences have emerged, and some of them are tackled in the Bill, particularly cyber-flashing. Existing offences have taken on a new level of harm for their victims, particularly when it comes to taking, making and sharing intimate images without consent. As the Government have already widely acknowledged, because the laws on that are such a patchwork, it is difficult for the enforcement agencies in this country to adequately protect the victims of that heinous crime, who are, as the Minister knows, predominately women.

11:00
I want to further explore this element of the Bill and the Government’s intention. As the hon. Member for Pontypridd, speaking for the Opposition, set out, there are no direct references in the legislation to violence against women and girls. Of course, taking, making and sharing intimate images online without consent is a form of violence towards women. In our sittings last week, the Minister made extremely helpful comments about the ability of Ofcom to address those broader issues through a code of practice. He made it clear that it was perfectly possible for Ofcom to do that through this legislation. I am sure that it will have heard the cross-party support for that, which is incredibly loud and clear—or rather, it has heard the cross-party opposition that it would face if it did not take up that opportunity at the earliest possible convenience. I am grateful to the Minister for helping us to find a way forward on the issue. I hope that he can also help us find a way forward on taking, making and sharing intimate images without consent, because the Government are trying to keep up, and want a legal framework that is fit for the purpose of protecting women against these heinous crimes online. I was grateful to the then Lord Chancellor back in 2015 when we enacted the first revenge pornography laws, as they might colloquially be called, which are included as a priority offence in schedule 7 of this Bill.
The Government are also putting in place much needed and important laws around cyber-flashing, as many of us hoped they would, and have been campaigning very hard on, because taking pictures of male genitalia, predominantly, and sending them to, predominantly, women is a form of abuse, harm and violence towards women through intimidation. The Government are trying to keep up with this fast-moving environment, but this legislation will only be as good as the criminal laws contained within it. The Government need to continue to future-proof the legislation, and to demonstrate that they see these sorts of offences as a priority.
The Government commissioned the Law Commission to undertake a significant piece of professional evaluation of how fit for purpose the laws are on the online posting of intimate images without consent. The Law Commission found the situation wanting to the greatest degree, and is consulting on producing legal recommendations. Those are not in the Bill, which is an enormous shame; those recommendations are perhaps even now with the Government for consideration, but unfortunately they have not yet been published.
I am concerned that we are missing an opportunity to tackle an issue that is an overwhelming problem for many women in this country, and I hope that when the Minister responds to this part of the debate, he can clearly set out the Government’s intention to tackle the issue. We all know that parliamentary time is in short supply: the Government have many Bills that they have to get through in this Session, before the next general election. I am concerned that this particular issue, which the Law Commission itself sees as so important, may not get the rapid legislation that we, as elected representatives, need to see happen. The foundation of the Bill is a duty of care, but that duty of care is only as good as the criminal law. If the criminal law is wanting when it comes to the publication online of intimate images—that is, the taking, making and sharing of intimate images without consent—this legislation will not help the many people we want it to help. Will the Minister, in responding to the debate, outline in some detail, if possible, how he will handle the issue and when he hopes to make public the Law Commission recommendations, for which many people have been waiting for many years?
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank right hon. and hon. Members who have participated in the debate on this extremely important clause. It is extremely important because the Bill’s strongest provisions relate to illegal content, and the definition of illegal content set out in the clause is the starting point for those duties.

A number of important questions have been asked, and I would like to reply to them in turn. First, I want to speak directly about amendment 61, which was moved by the shadow Minister and which very reasonably and quite rightly raised the question of where in the world a criminal offence physically takes place. She rightly said that in the case of violence against children, for example, the act may happen somewhere else in the world but be transmitted on the internet here in the United Kingdom. On that, I can point to an existing provision in the Bill that does exactly what she wants. Clause 52(9), which appears about two thirds of the way down page 49 of the Bill, states:

“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom.”

What that is saying is that it does not matter whether the act of concern takes place physically in the United Kingdom or somewhere else, on the other side of the world. That does not matter in looking at whether something amounts to an offence. If it is criminal under UK law but it happens on the other side of the world, it is still in scope. Clause 52(9) makes that very clear, so I think that that provision is already doing what the shadow Minister’s amendment 61 seeks to do.

The shadow Minister asked a second question about the definition of illegal content, whether it involves a specific act and how it interacts with the “systems and processes” approach that the Bill takes. She is right to say that the definition of illegal content applies item by item. However, the legally binding duties in the Bill, which we have already debated in relation to previous clauses, apply to categories of content and to putting in place “proportionate systems and processes”—I think that that is the phrase used. Therefore, although the definition is particular, the duty is more general, and has to be met by putting in place systems and processes. I hope that my explanation provides clarification on that point.

The shadow Minister asked another question about the precise definitions of how the platforms are supposed to decide whether content meets the definition set out. She asked, in particular, questions about how to determine intent—the mens rea element of the offence. She mentioned that Ofcom had had some comments in that regard. Of course, the Government are discussing all this closely with Ofcom, as people would expect. I will say to the Committee that we are listening very carefully to the points that are being made. I hope that that gives the shadow Minister some assurance that the Government’s ears are open on this point.

The next and final point that I would like to come to was raised by all speakers in the debate, but particularly by my right hon. Friend the Member for Basingstoke, and is about violence against women and girls—an important point that we have quite rightly debated previously and come to again now. The first general point to make is that clause 52(4)(d) makes it clear that relevant offences include offences where the intended victim is an individual, so any violence towards and abuse of women and girls is obviously included in that.

As my right hon. Friend the Member for Basingstoke and others have pointed out, women suffer disproportionate abuse and are disproportionately the victims of criminal offences online. The hon. Member for Aberdeen North pointed out how a combination of protected characteristics can make the abuse particularly impactful—for example, if someone is a woman and a member of a minority. Those are important and valid points. I can reconfirm, as I did in our previous debate, that when Ofcom drafts the codes of practice on how platforms can meet their duties, it is at liberty to include such considerations. I echo the words spoken a few minutes ago by my right hon. Friend the Member for Basingstoke: the strong expectation across the House—among all parties here—is that those issues will be addressed in the codes of practice to ensure that those particular vulnerabilities and those compounded vulnerabilities are properly looked at by social media firms in discharging those duties.

My right hon. Friend also made points about intimate image abuse when the intimate images are made without the consent of the subject—the victim, I should say. I would make two points about that. The first relates to the Bill and the second looks to the future and the work of the Law Commission. On the Bill, we will come in due course to clause 150, which relates to the new harmful communications offence, and which will criminalise a communication—the sending of a message—when there is a real and substantial risk of it causing harm to the likely audience and there is intention to cause harm. The definition of “harm” in this case is psychological harm amounting to at least serious distress.

Clearly, if somebody is sending an intimate image without the consent of the subject, it is likely that that will cause harm to the likely audience. Obviously, if someone sends a naked image of somebody without their consent, that is very likely to cause serious distress, and I can think of few reasons why somebody would do that unless causing distress was their intention, meaning that the offence would be made out under clause 150.

My right hon. Friend has strong feelings, which I entirely understand, that to make the measure even stronger the test should not involve intent at all, but should simply be a question of consent. Was there consent or not? If there was no consent, an offence would have been committed, without needing to go on to establish intention, as clause 150 provides. As my right hon. Friend has said, Law Commission proposals are being developed. My understanding is that the Ministry of Justice, which is the Department responsible for this offence, expects to receive a final report over the summer. It would then clearly be open to Parliament to legislate to put the offence into law, I hope as quickly as possible.

Once that happens, through whichever legislative vehicle, it will have two implications. First, the offence will automatically and immediately be picked up by clause 52(4)(d) and brought within the scope of the Bill because it is an offence where the intended victim is an individual. Secondly, there will be a power for the Secretary of State and for Parliament, through clause 176, I think—I am speaking from memory; yes, it is clause 176, not that I have memorised every clause in the Bill—via statutory instrument not only to bring the offence into the regular illegal safety duties, but to add it to schedule 7, which contains the priority offences.

Once that intimate image abuse offence is in law, via whichever legislative vehicle, that will have that immediate effect with respect to the Bill, and by statutory instrument it could be made a priority offence. I hope that gives my right hon. Friend a clear sense of the process by which this is moving forward.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I thank the Minister for such a clear explanation of his plan. Can he confirm that the Bill is a suitable legislative vehicle? I cannot see why it would not be. I welcome his agreement about the need for additional legislation over and above the communications offence. In the light of the way that nudification software and deepfakes are advancing, and the challenges that our law enforcement agencies have in interpreting those quite complex notions, a straightforward law making it clear that publishing such images is a criminal offence would not only help law enforcement agencies, but would help the perpetrators to understand that what they are doing is a crime and they should stop.

11:15
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As always, the right hon. Lady makes an incredibly powerful point. She asked specifically about whether the Bill is a suitable legislative vehicle in which to implement any Law Commission recommendations—we do not yet have the final version of that report—and I believe that that would be in scope. A decision about legislative vehicles depends on the final form of the Law Commission report and the Ministry of Justice response to it, and on cross-Government agreement about which vehicle to use.

I hope that addresses all the questions that have been raised by the Committee. Although the shadow Minister is right to raise the question, I respectfully ask her to withdraw amendment 61 on the basis that those matters are clearly covered in clause 52(9). I commend the clause to the Committee.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful to the Minister for his comments. The Labour party has concerns that clause 52(9) does not adequately get rid of the ambiguity around potential illegal online content. We feel that amendment 61 sets that out very clearly, which is why we will press it to a vote.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Just to help the Committee, what is it in clause 52(9) that is unclear or ambiguous?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

We just feel that amendment 61 outlines matters much more explicitly and leaves no ambiguity by clearly defining any

“offences committed overseas within the scope of relevant offences for the purposes of defining illegal content.”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I think they say the same thing, but we obviously disagree.

Question put, That the amendment be made.

Division 25

Ayes: 5

Noes: 8

Clause 52 ordered to stand part of the Bill.

None Portrait The Chair
- Hansard -

Schedule 5 has already been debated, so we will proceed straight—

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

No, it hasn’t. We did not get a chance to speak to either schedule 5 or schedule 6.

None Portrait The Chair
- Hansard -

Sorry; they were in the group, so we have to carry on.

Schedules 5 and 6 agreed to.

Ordered, That further consideration be now adjourned.—(Steve Double.)

11:19
Adjourned till this day at Two o’clock.

Online Safety Bill (Tenth sitting)

Committee stage
Tuesday 14th June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 14 June 2022
The Committee consisted of the following Members:
Chairs: † Sir Roger Gale, Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
† Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 14 June 2022
(Afternoon)
[Sir Roger Gale in the Chair]
Online Safety Bill
Schedule 7
Priority offences
14:00
John Nicolson Portrait John Nicolson (Ochil and South Perthshire) (SNP)
- Hansard - - - Excerpts

I beg to move amendment 142, in schedule 7, page 183, line 11, leave out from “under” to the end of line and insert

“any of the following provisions of the Suicide Act 1961—

(a) section 2;

(b) section 3A (inserted by section Communication offence for encouraging or assisting self-harm of this Act).”

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss new clause 36—Communication offence for encouraging or assisting self-harm

‘(1) In the Suicide Act 1961, after section 3 insert—

“3A Communication offence for encouraging or assisting self-harm

(1) A person (“A”) commits an offence if—

(a) A sends a message,

(b) the message encourages or could be used to assist another person (“B”) to inflict serious physical harm upon themselves, and

(c) A’s act was intended to encourage or assist the infliction of serious physical harm.

(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, A.

(3) A may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.

(4) A person guilty of an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;

(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.

(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.

(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.

(7) If A arranges for a person (“A2”) to do an Act and A2 does that Act, A is also to be treated as having done that Act for the purposes of subsection (1).

(8) In proceedings for an offence to which this section applies, it shall be a defence for A to prove that—

(a) B had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from A;

(b) B’s intention to inflict serious physical harm upon themselves was not initiated by A; and

(c) the message was wholly motivated by compassion towards B or to promote the interests of B’s health or wellbeing.”’

John Nicolson

New clause 36 seeks to criminalise the encouragement or assistance of a suicide. Before I move on to the details of the new clause, I would like to share the experience of a Samaritans supporter, who said:

“I know that every attempt my brother considered at ending his life, from his early 20s to when he died in April, aged 40, was based on extensive online research. It was all too easy for him to find step-by-step instructions so he could evaluate the effectiveness and potential impact of various approaches and, most recently, given that he had no medical background, it was purely his ability to work out the quantities of various drugs and likely impact of taking them in combination that equipped him to end his life.”

It is so easy when discussing the minutiae of the Bill to forget its real-world impact. I have worked with Samaritans on the new clause, and I use that quote with permission. It is the leading charity in trying to create a suicide-safer internet. It is axiomatic to say that suicide and self-harm have a devastating impact on people’s lives. The Bill must ensure that the online space does not aid the spreading of content that would promote this behaviour in any way.

There has rightly been much talk about how children are affected by self-harm content online. However, it should be stressed that they are not the only ones who suffer because of that content. Between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 82% of those patients were aged over 25. It is likely that, as the Bill stands, suicide-promoting content will be covered on category 1 services, as it will be designated as harmful. Crucially, however, unless this amendment is passed, that content will not be covered on smaller sites. As Samaritans has identified, it is precisely in these smaller fora and websites that harm proliferates. The 151 patients who took their own life after visiting harmful websites may have been among only a handful of people using those sites, which would not fall under the definition of category 1, as I am sure the Minister will confirm.

Kim Leadbeater (Batley and Spen) (Lab)

The hon. Gentleman makes a very important point, which comes to the nub of a lot of the issues we face with the Bill: the issue of volume versus risk. Does he agree that one life lost to suicide is one life too many? We must do everything that we can in the Bill to prevent every single life being lost through suicide, which is the aim of his amendment.

John Nicolson

I do, of course, agree. As anyone who has lost a family member to suicide knows, it has a lifelong effect on the whole family. It is yet another amendment where I feel we should depart from the pantomime of so much parliamentary procedure, where both sides fundamentally agree on things but Ministers go through the tortuous process of trying to tell us that every single amendment that any outside body or any Opposition Member, whether from the SNP or the Labour party, comes up with has already been considered by the ministerial team and is already incorporated or covered by the Bill. They would not be human if that were the case. Would it not be refreshing if there were a slight change in tactic, and just occasionally the Minister said, “Do you know what? That is a very good point. I think I will incorporate it into the Bill”?

None of us on the Opposition Benches seeks to make political capital out of any of the things we propose. All of us, on both sides of the House, are here with the best of intentions, to try to ensure that we get the best possible Bill. We all want to be able to vote for the Bill at the end of the day. Indeed, as I said, I have worked with two friends on the Conservative Benches—with the hon. Member for Watford on the Joint Committee on the draft Bill and with the hon. Member for Wolverhampton North East on the Select Committee on Digital, Culture, Media and Sport—and, as we know, they have both voted for various proposals. It is perhaps part of the frustration of the party system here that people are forced to go through the hoops and pretend that they do not really agree with things that they actually do agree with.

Let us try to move on with this, in a way that we have not done hitherto, and see if we can agree on amendments. We will withdraw amendments if we are genuinely convinced that they have already been considered by the Government. On the Government side, let them try to accept some of our amendments—just begin to accept some—if, as with this one, they think they have some merit.

I was talking about Samaritans, and exactly what it wants to do with the Bill. It is concerned about harmful content after the Bill is passed. This feeds into potentially the most important aspect of the Bill: it does not mandate risk assessments based exclusively on risk. By adding in the qualifications of size and scope, the Bill wilfully lets some of the most harmful content slip through its fingers—wilfully, but I am sure not deliberately. Categorisation will be covered by a later amendment, tabled by my hon. Friend the Member for Aberdeen North, so I shall not dwell on it now.

In July 2021, the Law Commission for England and Wales recommended the creation of a new narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. The commission identified that there is

“currently no offence that adequately addresses the encouragement of serious self-harm.”

The recommendation followed acknowledgement that

“self-harm content online is a worrying phenomenon”

and should have a

“robust fault element that targets deliberate encouragement of serious self-harm”.

Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm.

In conclusion, I urge the Minister to listen not just to us but to the expert charities, including Samaritans, and to people with lived experience of self-harm and suicide, who are calling for regulation of these dangerous sites.

Alex Davies-Jones (Pontypridd) (Lab)

Good afternoon, Sir Roger; it is a pleasure, as ever, to serve under your chairship. I rise to speak to new clause 36, which has been grouped with amendment 142 and is tabled in the names of the hon. Members for Ochil and South Perthshire and for Aberdeen North.

I, too, pay tribute to Samaritans for all the work it has done in supporting the Bill and these amendments to it. As colleagues will be aware, new clause 36 follows a recommendation from the Law Commission dating back to July 2021. The commission recommended the creation of a new, narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. It identified that there is

“currently no offence that adequately addresses the encouragement of serious self-harm.”

The recommendation followed acknowledgement that

“self-harm content online is a worrying phenomenon”

and should have a

“robust fault element that targets deliberate encouragement of serious self-harm”.

Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm, despite the fact that other recommendations from the Law Commission report have been brought into the Bill, such as creating a new offence of cyber-flashing and prioritising tackling illegal suicide content.

We all know that harmful suicide and self-harm content is material that has the potential to cause or exacerbate self-harm and suicidal behaviours. Content relating to suicide and self-harm falls into both categories in the Bill—illegal content and legal but harmful content. Encouraging or assisting suicide is also currently a criminal offence in England and Wales under the Suicide Act 1961, as amended by the Coroners and Justice Act 2009.

Content encouraging or assisting someone to take their own life is illegal and has been included as priority illegal content in the Bill, meaning that platforms will be required to proactively and reactively prevent individuals from encountering it, and search engines will need to structure their services to minimise the risk to individuals encountering the content. Other content, including content that positions suicide as a suitable way of overcoming adversity or describes suicidal methods, is legal but harmful.

The Labour party’s Front-Bench team recognises that not all content falls neatly into the legal but harmful category. What can be helpful for one user can be extremely distressing to others. Someone may find it extremely helpful to share their personal experience of suicide, for example, and that may also be helpful to other users. However, the same material could heighten suicidal feelings and levels of distress in someone else. We recognise the complexities of the Bill and the difficulties in developing a way around this, but we should delineate harmful and helpful content relating to suicide and self-harm, and that should not detract from tackling legal but clearly harmful content.

In its current form, the Bill will continue to allow legal but clearly harmful suicide and self-harm content to be accessed by over-18s. Category 1 platforms, which have the highest reach and functionality, will be required to carry out risk assessments of, and set out in their terms and conditions their approach to, legal but harmful content in relation to over-18s. As the hon. Member for Ochil and South Perthshire outlined, however, the Bill’s impact assessment states that “less than 0.001%” of in-scope platforms

“are estimated to meet the Category 1 and 2A thresholds”,

and estimates that only 20 platforms will be required to fulfil category 1 obligations. There is no requirement on the smaller platforms, including those that actively encourage suicide, to do anything at all to protect over-18s. That simply is not good enough. That is why the Labour party supports new clause 36, and we urge the Minister to do the right thing by joining us.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is, as always, a great pleasure to serve under your chairmanship, Sir Roger. The hon. Member for Ochil and South Perthshire made an observation in passing about the Government’s willingness to listen and respond to parliamentarians about the Bill. We listened carefully to the extensive prelegislative scrutiny that the Bill received, including from the Joint Committee on which he served. As a result, we have adopted 66 of the changes that that Committee recommended, including on significant things such as commercial pornography and fraudulent advertising.

If Members have been listening to me carefully, they will know that the Government are doing further work or are carefully listening in a few areas. We may have more to say on those topics as the Bill progresses; it is always important to get the drafting of the provisions exactly right. I hope that that has indicated to the hon. Gentleman our willingness to listen, which I think we have already demonstrated well.

On new clause 36, it is important to mention that there is already a criminal offence of inciting suicide. It is a schedule 7 priority offence, so the Bill already requires companies to tackle content that amounts to the existing offence of inciting suicide. That is important. We would expect the promotion of material that encourages children to self-harm to be listed as a primary priority harm relating to children, where, again, there is a proactive duty to protect them. We have not yet published that primary priority harm list, but it would be reasonable to expect that material encouraging children to self-harm would be on it. Again, although we have not yet published the list of content that will be on the adult priority harm list—obviously, I cannot pre-empt the publication of that list—one might certainly wish for content that encourages adults to self-harm to appear on it too.

The hon. Gentleman made the point that duties relating to adults would apply only to category 1 companies. Of course, the ones that apply to children would apply to all companies where there was significant risk, but he is right that were that priority harm added to the adult legal but harmful list, it would apply only to category 1 companies.

Kim Leadbeater

Will the Minister give way?

Chris Philp

In a second, but I may be about to answer the hon. Lady’s question.

Those category 1 companies are likely to be small in number, as I think the shadow Minister said, but I would imagine—I do not have the exact number—that they cover well over 90% of all traffic. However, as I hinted on the Floor of the House on Second Reading—we may well discuss this later—we are thinking about including platforms that may not meet the category 1 size threshold but none the less pose high-level risks of harm. If that is done—I stress “if”—it will address the point raised by the hon. Member for Ochil and South Perthshire. That may answer the point that the hon. Member for Batley and Spen was going to raise, but if not, I happily give way.

14:15
Kim Leadbeater

It kind of does, but the Minister has raised some interesting points about children and adults and the risk of harm. To go back to the work of Samaritans, it is really important to talk about the fact that suicide is the biggest killer of young people aged 16 to 24, so it transcends the barrier between children and adults. With the right hon. Member for Basingstoke, the hon. Member for Aberdeen North, and the shadow Minister, my hon. Friend the Member for Pontypridd, we have rightly talked a lot about women, but it is really important to talk about the fact that men account for three quarters of all suicides. Men aged between 45 and 49 are most at risk of suicide—the rate among that group has been persistently high for years. It is important that we bring men into the discussion about suicide.

Chris Philp

I am grateful for the element of gender balance that the hon. Member has introduced, and she is right to highlight the suicide risk. Inciting suicide is already a criminal offence under section 2 of the Suicide Act 1961 and we have named it a priority offence. Indeed, it is the first priority offence listed under schedule 7—it appears a third of the way down page 183—for exactly the reason she cited, and a proactive duty is imposed on companies by paragraph 1 of schedule 7.

On amendment 142 and the attendant new clause 36, the Government agree with the sentiment behind them—namely, the creation of a new offence of encouraging or assisting serious self-harm. We agree with the substance of the proposal from the hon. Member for Ochil and South Perthshire. As he acknowledged, the matter is under final consideration by the Law Commission and our colleagues in the Ministry of Justice. The offence initially proposed by the Law Commission was wider in scope than that proposed under new clause 36. The commission’s proposed offence covered the offline world, as well as the online one. For example, the new clause as drafted would not cover assisting a person to self-harm by providing them with a bladed article because that is not an online communication. The offence that the Law Commission is looking at is broader in scope.

The Government have agreed in principle to create an offence based on the Law Commission recommendation in separate legislation, and once that is done the scope of the new offence will be wider than that proposed in the new clause. Rather than adding the new clause and the proposed limited new offence to this Bill, I ask that we implement the offence recommended by the Law Commission, the wider scope of which covers the offline world as well as the online world, in separate legislation. I would be happy to make representations to my colleagues in Government, particularly in the MOJ, to seek clarification about the relevant timing, because it is reasonable to expect it to be implemented sooner rather than later. Rather than rushing to introduce that offence with limited scope under the Bill, I ask that we do it properly as per the Law Commission recommendation.

Once the Law Commission recommendation is enacted in separate legislation, to which the Government have already agreed in principle, it will automatically flow through to be incorporated into clause 52(4)(d), which relates to illegal content, and under clause 176 the Secretary of State may, subject to parliamentary approval, designate the new offence as a priority offence under schedule 7 via a statutory instrument. The purpose of amendment 142 can therefore be achieved through an SI.

The Government publicly and entirely agree with the intention behind proposed new clause 36, but I think the way to do this is to implement the full Law Commission offence as soon as we can and then, if appropriate, add it to schedule 7 by SI. The Government agree with the spirit of the hon. Gentleman’s proposal, but I believe that we already have a plan to do a more complete job of creating the new offence.

John Nicolson

I have nothing to add and, having consulted my hon. Friend the Member for Aberdeen North, on the basis of the Minister’s assurances, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Chris Philp

I beg to move amendment 116, in schedule 7, page 183, line 11, at end insert—

“1A An offence under section 13 of the Criminal Justice Act (Northern Ireland) 1966 (c. 20 (N.I.)) (assisting suicide etc).”

This amendment adds the specified offence to Schedule 7, with the effect that content amounting to that offence counts as priority illegal content.

The Chair

With this it will be convenient to discuss Government amendments 117 to 126.

Chris Philp

These amendments pick up a question asked by the hon. Member for Aberdeen North much earlier in our proceedings. In schedule 7 we set out the priority offences that exist in English and Welsh law. We have consulted the devolved Administrations in Scotland and Northern Ireland extensively, and I believe we have agreed with them a number of offences in Scottish and Northern Irish law that are broadly equivalent to the English and Welsh offences already in schedule 7. Basically, Government amendments 116 to 126 add those devolved offences to the schedule.

In future, if new Scottish or Northern Irish offences are created, the Secretary of State will be able to consult Scottish or Northern Irish Ministers and, by regulations, amend schedule 7 to add the new offences that may be appropriate if conceived by the devolved Parliament or Assembly in due course. That, I think, answers the question asked by the hon. Lady earlier in our proceedings. As I say, we consulted the devolved Administrations extensively and I hope that the Committee will assent readily to the amendments.

Alex Davies-Jones

The amendments aim to capture all the criminal offences in other parts of the UK to be covered by the provisions of the Bill, as the Minister outlined. An offence in one part of the UK will be considered an offence elsewhere, for the purposes of the Bill.

With reference to some of the later paragraphs, I am keen for the Minister to explain briefly how this will work in the case of Scotland. We believe that the revenge porn offence in Scotland is more broadly drawn than the English version, so the level of protection for women in England and Wales will be increased. Can the Minister confirm that?

The Bill will not apply the Scottish offence to English offenders, but it means that content that falls foul of the law in Scotland, but not in England or Wales, will still be relevant regulated content for service providers, irrespective of the part of the UK in which the service users are located. That makes sense from the perspective of service providers, but I will be grateful for clarity from the Minister on this point.

Kirsty Blackman (Aberdeen North) (SNP)

I thank the Minister for tabling the amendments. In the evidence sessions, we heard about omissions from schedule 7 because Northern Irish and Scottish offences were not included. Such offences were included in schedule 6 but, at that point, not in schedule 7.

I appreciate that the Minister has worked with the devolved Administrations to table the amendments. I also appreciate the way in which amendment 126 is written, such that the Secretary of State “must consult” Scottish Ministers and the Department of Justice in Northern Ireland before making regulations that relate to legislation in either of the devolved countries. I am glad that the amendments have been drafted in this way and that the concern that we heard about in evidence no longer seems to exist, and I am pleased with the Minister’s decision about the way in which to make any future changes to legislation.

I agree with the position put forward by the hon. Member for Pontypridd. My understanding, from what we heard in evidence a few weeks ago, is that, legally, all will have to agree with the higher bar of the offences, and therefore anyone anywhere across the UK will be provided with the additional level of protection. She is right that the offence might not apply to everyone, but the service providers will be subject to the requirements elsewhere. Similarly, that is my view. Once again, I thank the Minister.

Chris Philp

Briefly, I hope that the amendments provide further evidence to the Committee of the Government’s willingness to listen and to respond. I can provide the confirmation that the hon. Members for Aberdeen North and for Pontypridd requested: the effect of the clauses is a levelling up—if I may put it that way. Any of the offences listed effectively get applied to the UK internet, so if there is a stronger offence in any one part of the United Kingdom, that will become applicable more generally via the Bill. As such, the answer to the question is in the affirmative.

Amendment 116 agreed to.

The Chair

My custom with amendments to be moved formally is to call them by number. If Members wish to vote on them, they should shout; otherwise, I will rattle through them. It is quicker that way.

Amendments made: 117, in schedule 7, page 183, line 29, at end insert—

“4A An offence under section 50A of the Criminal Law (Consolidation) (Scotland) Act 1995 (racially-aggravated harassment).”

This amendment adds the specified offence to Schedule 7, with the effect that content amounting to that offence counts as priority illegal content.

Amendment 118, in schedule 7, page 183, line 36, at end insert—

“5A An offence under any of the following provisions of the Protection from Harassment (Northern Ireland) Order 1997 (S.I. 1997/1180 (N.I. 9))—

(a) Article 4 (harassment);

(b) Article 6 (putting people in fear of violence).”

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

Amendment 119, in schedule 7, page 184, line 2, at end insert—

“6A An offence under any of the following provisions of the Criminal Justice and Licensing (Scotland) Act 2010 (asp 13)—

(a) section 38 (threatening or abusive behaviour);

(b) section 39 (stalking).”

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

Amendment 120, in schedule 7, page 184, line 38, at end insert—

“12A An offence under any of the following provisions of the Criminal Justice (Northern Ireland) Order 1996 (S.I. 1996/3160 (N.I. 24))—

(a) Article 53 (sale etc of knives);

(b) Article 54 (sale etc of knives etc to minors).”

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

Amendment 121, in schedule 7, page 184, line 42, at end insert—

“13A An offence under any of the following provisions of the Firearms (Northern Ireland) Order 2004 (S.I. 2004/702 (N.I. 3))—

(a) Article 24 (sale etc of firearms or ammunition without certificate);

(b) Article 37(1) (sale etc of firearms or ammunition to person without certificate etc);

(c) Article 45(1) and (2) (purchase, sale etc of prohibited weapons);

(d) Article 63(8) (sale etc of firearms or ammunition to people who have been in prison etc);

(e) Article 66A (supplying imitation firearms to minors).”

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

Amendment 122, in schedule 7, page 184, line 44, at end insert—

“14A An offence under any of the following provisions of the Air Weapons and Licensing (Scotland) Act 2015 (asp 10)—

(a) section 2 (requirement for air weapon certificate);

(b) section 24 (restrictions on sale etc of air weapons).”

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

Amendment 123, in schedule 7, page 185, line 8, at end insert—

“16A An offence under any of the following provisions of the Sexual Offences (Northern Ireland) Order 2008 (S.I. 2008/1769 (N.I. 2))—

(a) Article 62 (causing or inciting prostitution for gain);

(b) Article 63 (controlling prostitution for gain).”—(Chris Philp.)

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

The Chair

Amendment 148 remains unmoved, and it has been tabled by a Member who is not a member of the Committee, so unless anybody wishes to adopt it, it will not be called.

Amendments made: 124, in schedule 7, page 185, line 14, at end insert—

“18A An offence under section 2 of the Abusive Behaviour and Sexual Harm (Scotland) Act 2016 (asp 22) (disclosing, or threatening to disclose, an intimate photograph or film).”

This amendment adds the specified offence to Schedule 7, with the effect that content amounting to that offence counts as priority illegal content.

Amendment 125, in schedule 7, page 185, line 28, at end insert—

“20A An offence under section 49(3) of the Criminal Justice and Licensing (Scotland) Act 2010 (articles for use in fraud).”—(Chris Philp.)

This amendment adds the specified offence to Schedule 7, with the effect that content amounting to that offence counts as priority illegal content.

Amendment proposed: 59, in schedule 7, page 185, line 39, at end insert—

“Animal Welfare

22A An offence under any of the following provisions of the Animal Welfare Act 2006—

(a) section 4 (unnecessary suffering);

(b) section 5 (mutilation);

(c) section 7 (administration of poisons);

(d) section 8 (fighting);

(e) section 9 (duty of person responsible for animal to ensure welfare).

22B An offence under any of the following provisions of the Animal Health and Welfare (Scotland) Act 2006—

(a) section 19 (unnecessary suffering);

(b) section 20 (mutilation);

(c) section 21 (cruel operations);

(d) section 22 (administration of poisons);

(e) section 23 (fighting);

(f) section 24 (ensuring welfare of animals).

22C An offence under any of the following provisions of the Welfare of Animals Act (Northern Ireland) 2011—

(a) section 4 (unnecessary suffering);

(b) section 5 (prohibited procedures);

(c) section 7 (administration of poisons);

(d) section 8 (fighting);

(e) section 9 (ensuring welfare of animals).

22D For the purpose of paragraphs 22A, 22B or 22C of this Schedule, the above offences are deemed to have taken place regardless of whether the offending conduct took place within the United Kingdom, if the offending conduct would have constituted an offence under the provisions contained within those paragraphs.”—(Alex Davies-Jones.)

This amendment adds certain animal welfare offences to the list of priority offences in Schedule 7.

Question put, That the amendment be made.

Division 26

Ayes: 5


Labour: 3
Scottish National Party: 2

Noes: 9


Conservative: 9

John Nicolson

I beg to move amendment 90, in schedule 7, page 185, line 39, at end insert—

“Human trafficking

22A An offence under section 2 of the Modern Slavery Act 2015.”

This amendment would designate human trafficking as a priority offence.

Our amendment seeks to deal explicitly with what Meta and other companies refer to as “domestic servitude”, which we know better as human trafficking. This abhorrent practice has sadly been part of our society for hundreds if not thousands of years, and today, human traffickers are aided by various apps and platforms. The same platforms that connect us with old friends and family across the globe have been hijacked by the very worst people in our world, who are using them to create networks of criminal enterprise, none more cruel than human trafficking.

Investigations by the BBC and The Wall Street Journal have uncovered how traffickers use Instagram, Facebook and WhatsApp to advertise, sell, and co-ordinate the trafficking of young women. One would think that this issue would be of the utmost importance to Meta—Facebook, as it was at the time—yet, as the BBC reported,

“the social media giant only took ‘limited action’ until ‘Apple Inc. threatened to remove Facebook’s products from the App Store, unless it cracked down on the practice’.”

Those of us who have sat on the DCMS Committee and the Joint Committee on the draft Bill—I and my friends across the aisle, the hon. Members for Wolverhampton North East and for Watford—know exactly what it is like to have Facebook’s high heid yins before you. They will do absolutely nothing to respond to legitimate pressure. They understand only one thing: the force of law and of financial penalty. Only when its profits were in danger did Meta take the issue seriously.

The omission of human trafficking from schedule 7 is especially worrying because if it is not directly addressed as priority illegal content, we can be certain that it will not be prioritised by the platforms. We know that from their previous behaviour.

14:30
Kirsty Blackman

Can my hon. Friend see any reason—I am baffled by this—why the Government would leave out human trafficking? Can he imagine any justification that the Minister could possibly have for suggesting that it is not a priority offence, given the Conservative party’s stated aims and, to be fair, previous action in respect of, for example, the Modern Slavery Act 2015?

John Nicolson

It is an interesting question. Alas, I long ago stopped trying to put myself into the minds of Conservative Ministers—a scary place for any of us to be.

We understand that it is difficult to try to regulate in respect of human trafficking on platforms. It requires work across borders and platforms, with moderators speaking different languages. We established that Facebook does not have moderators who speak different languages. On the Joint Committee on the draft Bill, we discovered that Facebook does not moderate content in English to any adequate degree. Just look at the other languages around the world—do we think Facebook has moderators who work in Turkish, Finnish, Swedish, Icelandic or a plethora of other languages? It certainly does not. The only language that Facebook tries to moderate—deeply inadequately, as we know—is English. We know how bad the moderation is in English, so can the Committee imagine what it is like in some of the world’s other languages? The most terrifying things are allowed to happen without moderation.

Regulating in respect of human trafficking on platforms is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and the costs that will be entailed. If human trafficking is not designated a priority harm, I fear it will fall by the wayside, so I must ask the Minister: is human trafficking covered by another provision on priority illegal content? Like my hon. Friend the Member for Aberdeen North, I cannot see where in the Bill that lies. If the answer is yes, why are the human rights groups not satisfied with the explanation? What reassurance can the Minister give to the experts in the field? Why not add a direct reference to the Modern Slavery Act, as in the amendment?

If the answer to my question is no, I imagine the Minister will inform us that the Bill requires platforms to consider all illegal content. In what world is human trafficking that is facilitated online not a priority? Platforms must be forced to be proactive on this issue; if not, I fear that human trafficking, like so much that is non-priority illegal content, will not receive the attention it deserves.

Alex Davies-Jones

Schedule 7 sets out the list of criminal content that in-scope firms will be required to remove as a priority. Labour was pleased to see new additions to the most recent iteration, including criminal content relating to online drug and weapons dealing, people smuggling, revenge porn, fraud, promoting suicide and inciting or controlling prostitution for gain. The Government’s consultation response suggests that the systems and processes that services may use to minimise illegal or harmful content could include user tools, content moderation and recommendation procedures.

More widely, although we appreciate that the establishment of priority offences online is the route the Government have chosen to go down with the Bill, we believe the Bill remains weak in relation to addressing harms to adults and wider societal harms. Sadly, it has seemingly missed a number of known harms to both adults and children, which we feel is a serious omission. Three years on from the White Paper, the Government know where the gaps are, yet they have failed to address them. That is why we are pleased to support the amendment tabled by the hon. Members for Ochil and South Perthshire and for Aberdeen North.

Human trafficking offences are a serious omission from schedule 7 that must urgently be rectified. As we all know from whistleblower Frances Haugen’s revelations, Facebook stands accused, among a vast array of other social problems, of profiting from the trade and sale of human beings—often for domestic servitude—by human traffickers. We also know that, according to internal documents, the company has been aware of the problems since at least 2018. As the hon. Member for Ochil and South Perthshire said, we know that a year later, on the heels of a BBC report that documented the practice, the problem was said to be so severe that Apple itself threatened to pull Facebook and Instagram from its app store. It was only then that Facebook rushed to remove content related to human trafficking and made emergency internal policy changes to avoid commercial consequences described as “potentially severe” by the company. However, an internal company report detailed that the company did not take action prior to public disclosure and threats from Apple—profit over people.

In a complaint to the US Securities and Exchange Commission first reported by The Wall Street Journal, whistleblower Haugen wrote:

“Investors would have been very interested to learn the truth about Facebook almost losing access to the Apple App Store because of its failure to stop human trafficking on its products.”

I cannot believe that the Government have failed to commit to doing more to tackle such abhorrent practices, which are happening every day. I therefore urge the Minister to do the right thing and support amendment 90.

Chris Philp

The first thing to make clear to the Committee and anyone listening is that, of course, offences under the Modern Slavery Act 2015 are brought into the scope of the illegal content duties of this Bill through clause 52(4)(d), because such offences involve an individual victim.

Turning to the priority offences set out in schedule 7—I saw this when I was a Home Office Minister—modern slavery is generally associated with various other offences that are more directly visible and identifiable. Modern slavery itself can be quite hard to identify. That is why our approach is, first, to incorporate modern slavery as a regular offence via clause 52(4)(d) and, secondly, to specify as priority offences those things that are often identifiable symptoms of it and that are feasibly identified. Those include many of the offences listed in schedule 7, such as causing, inciting or controlling prostitution for gain, as in paragraph 16 on sexual exploitation, which is often the manifestation of modern slavery; money laundering, which is often involved where modern slavery takes place; and assisting illegal immigration, because modern slavery often involves moving somebody across a border, which is covered in paragraph 15 on assisting illegal immigration, as per section 25 of the Immigration Act 1971.

Modern slavery comes into scope directly via clause 52(4)(d) and because the practicably identifiable consequences of modern slavery are listed as priority offences, I think we do have this important area covered.

Kirsty Blackman

I appreciate that the Minister thinks that there are other measures that cover this offence, but will he keep it under consideration going forward? I do not think that that is too much to ask. Part of the logic behind that is that some of the other issues, where the reasons behind them must be proved, are much more difficult to define or prove than the modern slavery offences that we are asking to be added here. Whether he accepts the amendment or not, will he commit to considering the matter and not just saying, “Absolutely no”? That would be helpful for us and the many organisations that are keen for such things to be included.

Chris Philp

I am happy to give that further consideration, but please do not interpret that as a firm commitment. I repeat that the Modern Slavery Act is brought into the scope of this Bill via clause 52(4)(d).

John Nicolson

I have nothing further to add. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Schedule 7, as amended, agreed to.

Clause 53

“Content that is harmful to children” etc

The Chair

I have had no indication that anybody wishes to move Carla Lockhart’s amendment 98—she is not a member of the Committee.

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

It is absolutely right that the Government have included a commitment to children in the form of defining primary priority content that is harmful. We all know of the dangerous harms that exist online for children, and while the Opposition support the overarching aims of the Bill, we feel the current definitions do not go far enough—that is a running theme with this Bill.

The Bill does not adequately address the risks caused by the design—the functionalities and features of services themselves—or those created by malign contact with other users, which we know to be an immense problem. Research has found that online grooming offences have soared by 60% in the last three years, and four in five victims are girls. We also know that games increasingly have addictive gambling-style features. Those without user-to-user functionalities, such as Subway Surfers, which aggressively promotes in-app purchases, are currently out of scope of the Bill.

Lastly, research by Parent Zone found that 91% of children say that loot boxes are available in the games they play and 40% have paid to open one. That is not good enough. I urge the Minister to consider his approach to tackling harmful content and the impact that it can have in all its forms. When considering how children will be kept safe under the new regime, we should consider concerns flagged by some of the civil society organisations that work with them. Organisations such as the Royal College of Psychiatrists, The Mix, YoungMinds and the Mental Health Foundation have all been instrumental in their calls for the Government to do more. While welcoming the intention to protect children, they note that it is not clear at present how some categories of harm, including material that damages people’s body image, will be regulated—or whether it will be regulated at all.

While the Bill does take steps to tackle some of the most egregious, universally damaging material that children currently see, it does not recognise the harm that can be done through the algorithmic serving of material that, through accretion, will cause harm to children with particular mental health vulnerabilities. For example, beauty or fitness-related content could be psychologically dangerous to a child recovering from an eating disorder. Research from the Mental Health Foundation shows how damaging regular exposure to material that shows conventionally perfect images of bodies, often digitally edited and unattainable, is to children and young people.

This is something that matters to children, with 84% of those questioned in a recent survey by charity The Mix saying the algorithmic serving of content was a key issue that the Bill should address. Yet in its current form it does not give children full control over the content they see. Charities also tell us about the need to ensure that children are exposed to useful content. We suggest that the Government consider a requirement for providers to push material on social media literacy to users and to provide the option to receive content that can help with recovery where it is available, curated by social media companies with the assistance of trusted non-governmental organisations and public health bodies. We also hope that the Government can clarify that material damaging to people’s body image will be considered a form of harm.

Additionally, beyond the issue of the content itself that is served to children, organisations including YoungMinds and the Royal College of Psychiatrists have raised the potential dangers to mental health inherent in the way services can be designed to be addictive.

Kim Leadbeater

My hon. Friend raises an important point about media literacy, which we have touched on a few times during this debate. We have another opportunity here to talk about that and to say how important it is to think about media literacy within the scope of the Bill. It has been removed, and I think we need to put it back into the Bill at every opportunity—I am talking about media literacy obligations for platforms to help to responsibly educate children and adults about the risks online. We need to not lose sight of that.

Alex Davies-Jones

I completely agree with my hon. Friend. She is right to talk about the lack of a social and digital media strategy within the Bill, and the need to educate children and adults about the harmful content that we see online. How to stay safe online in all its capacities is absolutely fundamental to the Bill. We cannot have an Online Safety Bill without teaching people how to be safe online. That is important for how children and young people interact online. We know that they chase likes and the self-esteem buzz they get from notifications popping up on their phone or device. That can be addictive, as has been highlighted by mental health and young persons’ charities.

I urge the Minister to address those issues and to consider how the Government can go further, whether through this legislation or further initiatives, to help to combat some of those issues.

14:45
Kirsty Blackman

I have a couple of questions for the Minister. The first is about the interaction of subsection (4)(c) and subsection (5). I am slightly confused about how that works, because subsection (4)(c) states that anything that is not within the terms of primary priority content or priority content but is harmful to

“an appreciable number of children”

is included as

“content that is harmful to children”.

That is completely reasonable. However, subsection (5) excludes illegal content and content with a “potential financial impact”. I appreciate that these provisions are drafted in quite a complicated way, but it would be useful to have an understanding of what that means. If it means that content cannot count as harmful simply because the harm is financial in nature, that is a problem, because it would explicitly exclude gambling-type sites, loot boxes and anything of that sort, which by their nature are intentionally addictive and try to get children or adults to part with significant amounts of cash. If they are excluded, that is a problem.

How will clause 53 be future-proofed? I am not suggesting that there is no future proofing, but it would be helpful to me and fellow Committee members if the Minister explained how the clause will deal with new emerging harms and things that may not necessarily fall within the definitions that we set initially. How will those definitions evolve and change as the internet evolves and changes, and as the harms with which children are presented evolve and change?

And finally—I know that the Minister mentioned earlier that saying, “And finally”, in a speech is always a concern, but I am saying it—I am slightly concerned about the wording in subsection (4)(c), which refers to

“material risk of significant harm to an appreciable number of children”,

because I am not clear what an “appreciable number” is. If content that is incredibly harmful to children is stumbled upon by only one child and causes that child significant harm, is it okay for the provider to host such content? It is not likely to be accessed by an “appreciable number of children” and might be accessed by only a small number, but if the Minister could give us an understanding of what the word “appreciable” means in that instance, that would be greatly appreciated.

Chris Philp

There are one or two points to pick up on. A question was raised about algorithms, and it is worth saying that the risk assessments that platforms must undertake will include consideration of the operation of algorithms. It is important to make it absolutely clear that that is the case.

The shadow Minister asked about the definition of harm, and whether all the harms that might concern Parliament, and many of us as parents, will be covered. It may be helpful to refer to the definition of harm provided in clause 187, at the top of page 153. Committee members will note that the definition is very wide and that subsection (2) defines it as “physical or psychological harm”, so I hope that partly answers the shadow Minister’s question.

Dean Russell (Watford) (Con)

I am jumping ahead a bit, but I know that we will discuss clause 150, Zach’s law and epilepsy in particular at some point. Given the definition that my hon. Friend has just cited, am I correct to assume that the physical harm posed to those with epilepsy who might be targeted online will be covered, and that it is not just about psychological harm?

Chris Philp

I admire my hon. Friend’s attention to the debate. The definition of harm for the harmful communications offence in clause 150 is set out in clause 150(4). In that context, harm is defined slightly differently, as

“psychological harm amounting to at least serious distress”.

The definition of harm in clause 187 that I read out is the definition of harm used elsewhere in the Bill. However, as I said before in the House and in the evidence session, the Government’s belief and intention is that epilepsy trolling would fall in the scope of clause 150, because giving someone an epileptic fit clearly does have a physical implication, as my hon. Friend said, but also causes psychological harm. Being given an epileptic fit is physically damaging, but it causes psychological harm as well.

Despite the fact that the definition of harm in clause 187 does not apply in clause 150, which has its own definition of harm, I am absolutely categoric that epilepsy trolling is caught by clause 150 because of the psychological harm it causes. I commend my hon. Friend the Member for Watford for being so attentive on the question of epilepsy, and also in this debate.

Returning to the definition of harm in clause 187, besides the wide definition covering physical and psychological harm, clause 187(4) makes it clear that harm may also arise not just directly but if the content prompts individuals to

“act in a way that results in harm to themselves or that increases the likelihood of harm to themselves”.

Clause 187(4)(b) covers content where the

“individuals do or say something to another individual that results in”

that individual suffering harm. I hope the shadow Minister is reassured that the definition of harm that applies here is extremely wide in scope.

There was a question about media literacy, which I think the hon. Member for Batley and Spen raised in an intervention. Media literacy duties on Ofcom already exist in the Communications Act 2003. The Government published a comprehensive and effective media literacy strategy about a year ago. In December—after the first version of the Bill was produced, but before the second and updated version—Ofcom updated its policy in a way that went beyond the duties contained in the previous version of the Bill. From memory, that related to the old clause 103, in the version of the Bill published in May last year, which is of course not the same clause in this version of the Bill, as it has been updated.

The hon. Member for Aberdeen North raised, as ever, some important points of detail. She asked about future proofing. The concept of harm expressed in the clause is a general concept of harm. The definition of harm is whatever is harmful to children, which includes things that we do not know about at the moment and that may arise in the future. Secondly, primary priority content and priority content that is harmful can be updated from time to time by a statutory instrument. If some new thing happens that we think deserves to be primary priority content or priority content that is harmful to children, we can update that using a statutory instrument.

The hon. Lady also asked about exclusions in clause 53(5). The first exclusion in subsection (5)(a) is illegal content, because that is covered elsewhere in the Bill—it is covered in clause 52. That is why it is excluded, because it is covered elsewhere. The second limb, subsection 5(b), covers some financial offences. Those are excluded because they are separately regulated. Financial services are separately regulated. The hon. Lady used the example of gambling. Gambling is separately regulated by the Gambling Act 2005, a review of which is imminent. There are already very strong provisions in that Act, which are enforced by the regulator, the Gambling Commission, which has a hard-edged prohibition on gambling if people are under 18.

Kirsty Blackman

However, I do not think that loot boxes even existed in 2005 when that Act was published. Loot boxes are gambling. They may not be covered by that legislation, but they are gambling. Will the Minister consider whether those harms are unintentionally excluded by clause 53?

Chris Philp

We are getting into some detail here. In the unlikely event that any member of the Committee does not know what a loot box is, it is where someone playing a game can buy extra lives or enhance the game’s functionality somehow by paying some money. There have been some cases where children have stolen their parent’s credit card and bought these things in large numbers.

Kirsty Blackman

Having played lots of games, I can clarify that people do not know what they are getting with a loot box, so they are putting money forward but do not know whether they will get a really good piece of armour or a really crap piece of armour. It is literally gambling, because children do not know what will come out of the box, as opposed to just buying a really good piece of armour with £2.99 from their parent’s credit card.

Chris Philp

However, the reward is non-monetary in nature. For that reason, the Government’s view—if I can test your patience momentarily, Sir Roger, as we are straying somewhat outside this particular debate—is that loot boxes will not be covered by the gambling review, because we do not see them as gambling. However, we do see them as an issue that needs to be addressed, and that will happen via the online advertising programme, which will be overseen by the Minister for Media, Data and Digital Infrastructure, my hon. Friend the Member for Hornchurch and Upminster (Julia Lopez). That will happen shortly and advertising legislation will follow, so loot boxes will be addressed in the online advertising programme and the subsequent legislation.

The other question raised by the hon. Member for Aberdeen North was about the definition of “an appreciable number”. I have a couple of points to make. By definition, anything that is illegal is covered already in schedule 7 or through clause 52(4)(d), which we have mentioned a few times. Content that is

“primary priority content that is harmful to children”

or

“priority content that is harmful to children”

is covered in clause 53(4)(a) and (b), so we are now left with the residue of stuff that is neither illegal nor primary priority or priority content; it is anything left over that might be harmful. By definition, we have excluded all the serious harms already, because they would be either illegal or in the priority categories. We are left with the other stuff. The reason for the qualifier “appreciable” is to make sure that we are dealing only with the residual non-priority harmful matters. We are just making sure that the duty is reasonable. What constitutes “appreciable” will ultimately be set out through Ofcom guidance, but if content reached only a tiny handful of users and was not a priority harm, and was therefore not considered by Parliament to be of the utmost priority, the duty would be unlikely to apply to such a very small number. Because it is just the residual category, that is a proportionate and reasonable approach to take.

Kirsty Blackman

Given the Government’s ability to designate priority content and primary priority content through secondary legislation, the Minister is telling me that if they decided that loot boxes were not adequately covered by the future legislation coming through, and they were to discover that something like this was a big issue, they could add that to one of the two priority content designations.

Chris Philp

The hon. Member is asking me a somewhat technical question, and I hesitate to answer without taking full advice, but I think the answer is yes. The reason that loot boxes are not considered gambling in our view is that they do not have a monetary value, so the exclusion in clause 53(5)(b)(i) does not apply. On a quick off-the-cuff reading, it does not strike me immediately that the exclusions in (5)(b)(ii) or (iii) would apply to loot boxes either, so I believe—and officials who know more about this than I do are nodding—that the hon. Lady is right to say that it would be possible for loot boxes to become primary priority content or priority content by way of a statutory instrument. Yes, my belief is that that would be possible.

Question put and agreed to.

Clause 53 accordingly ordered to stand part of the Bill.

Clause 54

“Content that is harmful to children” etc

John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

I beg to move amendment 83, in clause 54, page 50, line 39, at end insert—

“(2A) Priority content designated under subsection (2) must include content that contains health-related misinformation and disinformation, where such content is harmful to adults.”

This amendment would amend Clause 54 so that the Secretary of State’s designation of “priority content that is harmful to adults” must include a description of harmful health-related misinformation or disinformation (as well as other priority content that might be designated in regulations by the Secretary of State).

The Bill requires category 1 service providers to set out how they will tackle harmful content on their platforms. In order for this to work, certain legal but harmful content must be designated in secondary legislation as

“priority content that is harmful to adults.”

As yet, however, it is not known what will be designated as priority content or when. There have been indications from Government that health-related misinformation and disinformation will likely be included, but there is no certainty. The amendment would ensure that harmful health-related misinformation and disinformation would be designated as priority content that is harmful to adults.

15:00
Health-related misinformation and disinformation undermine public health, as we know. For example, pregnant women have received mixed messages about the safety of covid vaccinations, causing widespread confusion, fear and inaction. In October 2021, one in five of the most critically ill covid patients were unvaccinated pregnant women. It should also be stressed that health misinformation and disinformation are not limited to covid or vaccine content. They also extend to, for example, areas as broad as cancer treatment or sexual health misinformation—anything that has the potential to cause physical or psychological harm to adults and to children.
With a third of internet users unaware of the potential for inaccurate or biased information online, it is vital that this amendment on health-related misinformation and disinformation is inserted into the Bill during Committee stage. It would give Parliament the time to scrutinise what content is in scope and ensure that regulation is in place to promote proportionate and effective responses. We must make it incumbent on platforms to be proactive in reducing that pernicious form of disinformation, designed only to hurt and to harm. As we have seen from the pandemic, the consequences can be grave if the false information is believed, as, sadly, it so often is.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Again, Labour supports moves to ensure that there is some clarity about specific content that is deemed to be harmful to adults, but of course the Opposition have concerns about the overall aim of defining harm.

The Government’s chosen approach to regulating the online space has left too much up to secondary legislation. We are also concerned that health misinformation and disinformation—a key harm, as we have all learned from the coronavirus pandemic—is missing from the Bill. That is why we too support amendment 83. The impact of health misinformation and disinformation is very real. Estimates suggest that the number of social media accounts posting misinformation about vaccines, and the number of users following those accounts, increased during the pandemic. Research by the Centre for Countering Digital Hate, published in November 2020, suggested that the number of followers of the largest anti-vaccination social media accounts had increased by 25% since 2019. At the height of the pandemic, it was also estimated that there were 5.4 million UK-based followers of anti-vaccine Twitter accounts.

Interestingly, an Ofcom survey of around 200 respondents carried out between 12 and 14 March 2021 found that 28% of respondents had come across information about covid-19 that could be considered false or misleading. Of those who had encountered such information, respondents from minority ethnic backgrounds were twice as likely as white respondents to say that the claim made them think twice about the issue. The survey found that of those people who were getting news and information about the coronavirus within the preceding week, 15% had come across claims that the coronavirus vaccines would alter human DNA; 18% had encountered claims that the coronavirus vaccines were a cover for the implant of trackable microchips; and 10% had encountered claims that the vaccines contained animal products.

Public health authorities, the UK Government, social media companies and other organisations all attempted to address the spread of vaccine misinformation through various strategies, including moderation of vaccine misinformation on social media platforms, ensuring the public had access to accurate and reliable information and providing education and guidance to people on how to address misinformation when they came across it.

Although studies do not show strong links between susceptibility to misinformation and ethnicity in the UK, some practitioners and other groups have raised concerns about the spread and impact of covid-19 vaccine misinformation among certain minority ethnic groups. Those concerns stem from research that shows historically lower levels of vaccine confidence and uptake among those groups. Some recent evidence from the UK’s vaccine roll-out suggests that that trend has continued for the covid-19 vaccine.

Data from the OpenSAFELY platform, which includes data from 40% of GP practices in England, covering more than 24 million patients, found that up to 7 April 2021, 96% of white people aged over 60 had received a vaccination compared with only 77% of people from a Pakistani background, 76% from a Chinese background and 69% of black people within the same age group. A 2021 survey of more than 172,000 adults in England on attitudes to the vaccine also found that confidence in covid-19 vaccines was highest in those of white ethnicity, with some 92.6% saying that they had accepted or would accept the vaccine. The lowest confidence was found in those of black ethnicity, at 72.5%. Some of the initiatives to tackle vaccine misinformation and encourage vaccine take-up were aimed at specific minority ethnic groups, and experts have emphasised the importance of ensuring that factual information about covid-19 vaccines is available in multiple different languages.

Social media companies have taken various steps to tackle misinformation on their platforms during the covid-19 pandemic, including removing or demoting misinformation, directing users to information from official sources and banning certain adverts. So, they can do it when they want to—they just need to be compelled to do it by a Bill. However, we need to go further. Some of the broad approaches to content moderation that digital platforms have taken to address misinformation during the pandemic are discussed in the Parliamentary Office of Science and Technology’s previous rapid response on covid-19 and misinformation.

More recently, some social media companies have taken specific action to counter vaccine misinformation. In February 2021, as part of its wider policies on coronavirus misinformation, Facebook announced that it would expand its efforts to remove false information about covid-19 vaccines, and other vaccines more broadly. The company said it would label posts that discuss covid-19 vaccines with additional information from the World Health Organisation. It also said it would signpost its users to information on where and when they could get vaccinated. Facebook is now applying similar measures to Instagram.

In March 2021, Twitter began applying labels to tweets that could contain misinformation about covid-19 vaccines. It also introduced a strike policy, under which users who violate its covid-19 misinformation policy five or more times would have their accounts permanently suspended.

YouTube announced a specific ban on covid-19 anti-vaccination videos in October 2020. It committed to removing any videos that contradict official information about the vaccine from the World Health Organisation. In March, the company said it had removed more than 30,000 misleading videos about the covid-19 vaccine since the ban was introduced. However, as with most issues, until the legislation changes, service providers will not feel truly compelled to do the right thing, which is why we must legislate and push forward with amendment 83.

Nick Fletcher Portrait Nick Fletcher (Don Valley) (Con)
- Hansard - - - Excerpts

I would like to speak to the clause rather than the amendment, Sir Roger. Is now the right time to do so, or are we only allowed to speak to the amendment?

None Portrait The Chair
- Hansard -

It can be, in the sense that I am minded not to have a clause stand part debate.

Nick Fletcher Portrait Nick Fletcher
- Hansard - - - Excerpts

Thank you, Sir Roger. I think that the Minister would agree that this is probably one of the most contentious parts of the Bill. It concerns legal but harmful content, which is causing an awful lot of concern out there. The clause says that the Secretary of State may in regulations define as

“priority content that is harmful to adults”

content that he or she considers to present

“a material risk of significant harm to an appreciable number of adults”.

We have discussed this issue in other places before, but I am deeply concerned about freedom of speech and people being able to say what they think. What is harmful to me may not be harmful to any other colleagues in this place. We would be leaving it to the Secretary of State to make that decision. I would like to hear the Minister’s thoughts on that.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am very happy to reply to the various queries that have been made. I will start with the points on vaccine disinformation raised by the hon. Members for Ochil and South Perthshire and for Pontypridd. The Government strongly agree with the points they made about the damaging effects of vaccine misinformation and the fact that many of our fellow citizens have probably died as a result of being misled into refusing the vaccine when it is, of course, perfectly safe. We strongly share the concerns they have articulated.

Over the past two years, the Department for Digital, Culture, Media and Sport has worked together with other Departments to develop a strong operational response to this issue. We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.

In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.

Obviously, we agree with the intention behind the amendment. However, the way to handle it is not to randomly drop an item into the Bill and leave the rest to a statutory instrument. Important and worthy though it may be to deal with disinformation, and specifically harmful health-related disinformation, there are plenty of other important things that one might add that are legal but harmful to adults, so we will not accept the amendment. Instead, we will proceed as planned by designating the list via a statutory instrument. I know that a number of Members of Parliament, probably including members of this Committee, would find it helpful to see a draft list of what those items might be, not least to get assurance that health-related misinformation and disinformation is on that list. That is something that we are considering very carefully, and more news might be forthcoming as the Bill proceeds through Parliament.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

My hon. Friend has talked about the Department’s counter-disinformation unit. Do the Government anticipate that that function will continue, or will they expect Ofcom to do it?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The work of the counter-disinformation unit is valuable. We look at these things on a spending review by spending review basis, and as far as I am aware we intend to continue with the counter-disinformation unit over the current spending review period. Clearly, I cannot commit future Ministers in perpetuity, but my personal view—if I am allowed to express it—is that that unit performs a useful function and could valuably be continued into the future. I think it is useful for the Government, as well as Ofcom, to directly have eyes on this issue, but I cannot speak for future Ministers. I can only give my right hon. Friend my own view.

I hope that I have set out my approach. We have heard the calls to publish the list so that parliamentarians can scrutinise it, and we also heard them on Second Reading.

I will now turn to the question raised by my hon. Friend the Member for Don Valley regarding freedom of expression. Those on one side of the debate are asking us to go further and to be clearer, while those on the other side have concerns about freedom of expression. As I have said, I honestly do not think that these legal but harmful provisions infringe on freedom of speech, for three reasons. First, even when the Secretary of State decides to designate content and Parliament approves of that decision through the affirmative procedure—Parliament gets to approve, so the Secretary of State is not acting alone—that content is not being banned. The Bill does not say that content designated as legal but harmful should immediately be struck from every corner of the internet. It simply says that category 1 companies—the big ones—have to do a proper risk assessment of that content and think about it properly.

Secondly, those companies have to have a policy to deal with that content, but that policy is up to them. They could have a policy that says, “It is absolutely fine.” Let us say that health disinformation is on the list, as one would expect it to be. A particular social media firm could have a policy that says, “We have considered this. We know it is risky, but we are going to let it happen anyway.” Some people might say that that is a weakness in the Bill, while others might say that it protects freedom of expression. It depends on one’s point of view, but that is how it works. It is for the company to choose and set out its policy, and the Bill requires it to enforce it consistently. I do not think that the requirements I have laid out amount to censorship or an unreasonable repression of free speech, because the platforms can still set their own terms and conditions.

There is also the general duty to have regard to free speech, which is introduced in clause 19(2). At the moment, no such duty exists. One might argue that the duty could be stronger, as my hon. Friend suggested previously, but it is unarguable that, for the first time ever, there is a duty on the platforms to have regard to free speech.

15:15
Thirdly, and finally, let us think about how big platforms such as Facebook and Twitter confront such issues. The truth is that they behave in an arbitrary manner; they are not consistent in how they apply their own terms and conditions. They sometimes apply biases—a matter on which my right hon. Friend the Secretary of State commented recently. No requirement is placed on them to be consistent or to have regard to freedom of speech. So they do things such as cancel Donald Trump—people have their own views on that—while allowing Vladimir Putin’s propaganda to be spread. That is obviously inconsistent. They have taken down a video of my hon. Friend the Member for Christchurch (Sir Christopher Chope) speaking in the House of Commons Chamber. That would be difficult once the Bill is passed because clause 15 introduces protection for content of democratic importance. So I do not think that the legal but harmful duties infringe free speech. To the contrary, once the Bill is passed, as I hope it will be, it will improve freedom of speech on the internet. It will not make it perfect, and I do not pretend that it will, but it will make some modest improvements.
Nick Fletcher Portrait Nick Fletcher
- Hansard - - - Excerpts

The argument has been made that the social media companies are doing this anyway, but two wrongs don’t make a right. We need to stop them doing it. I understand what we are trying to do here. We can see straight away that the Opposition want to be tighter on this. At a later date, if the Bill goes through as it is, freedom of speech will be gradually suppressed, and I am really concerned about that. My hon. Friend said that it would come back to Parliament, which I am pleased about. Are the priorities going to be written into the Bill? Will we be able to vote on them? If the scope is extended at any point in time, will we be able to vote on that, or will the Secretary of State just say, “We can’t have that so we’re just going to ban it”?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I will answer the questions in reverse order. The list of harms will not be in the Bill. The amendment seeks to put one of the harms in the Bill but not the others. So no, it will not be in the Bill. The harms—either the initial list or any addition to or subtraction from the list—will be listed in an affirmative statutory instrument, which means that the House will be able to look at it and, if it wants, to vote on it. So Parliament will get a chance to look at the initial list, when it is published in an SI. If anything is to be added in one, two or three years’ time, the same will apply.

Nick Fletcher Portrait Nick Fletcher
- Hansard - - - Excerpts

So will we be able to vote on any extension of the scope of the Bill at any time? Will that go out to public consultation as well?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes. There is an obligation on the Secretary of State to consult—[Interruption.] Did I hear someone laugh?—before proposing a statutory instrument to add things. There is a consultation first and then, if extra things are going to be added—in my hon. Friend’s language, if the scope is increased—that would be votable by Parliament because it is an affirmative SI. So the answer is yes to both questions. Yes there will be consultation in advance, and yes, if this Government or a future Government wanted to add anything, Parliament could vote on it if it wanted to because it will be an affirmative SI. That is a really important point.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Will the Minister give way?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

In a moment; I want to answer the other point made by my hon. Friend the Member for Don Valley first. He said that two wrongs don’t make a right. I am not defending the fact that social media firms act in a manner that is arbitrary and censorious at the moment. I am not saying that it is okay for them to carry on. The point that I was making was a different one. I was saying that they act censoriously and arbitrarily at times at the moment. The Bill will diminish their ability to do that in a couple of ways. First, for the legal but harmful stuff, which he is worried about, they will have a duty to act consistently. If they do not, Ofcom will be able to enforce against them. So their liberty to behave arbitrarily, for this category of content at least, will be circumscribed. They will now have to be consistent. For other content that is outside the scope of this clause—which I guess therefore does not worry my hon. Friend—they can still be arbitrary, but for this they have got to be consistent.

There is also the duty to have regard to freedom of expression, and there is a protection of democratic and journalistic importance in clauses 15 and 16. Although those clauses are not perfect and some people say they should be stronger, they are at least better than what we have now. When I say that this is good for freedom of speech, I mean that nothing here infringes on freedom of speech, and to the extent that it moves one way or the other, it moves us somewhat in the direction of protecting free speech more than is the case at the moment, for the reasons I have set out. I will be happy to debate the issue in more detail either in this Committee or outside, if that is helpful and to avoid trying the patience of colleagues.

None Portrait The Chair
- Hansard -

Order. Before we go any further, I know it is tempting to turn around and talk to Back Benchers, but that makes life difficult for Hansard because you tend to miss the microphone. It is also rather discourteous to the Chair, so in future I ask the Minister to please address the Chair. I call the shadow Minister.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I thank the Minister for giving way; I think that is what he was doing as he sat down.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

indicated assent.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Just for clarity, the hon. Member for Don Valley and the Minister have said that Labour Members are seeking to curtail or tighten freedom of expression and freedom of speech, but that is not the case. We fundamentally support free speech, as we always have. The Bill addresses systems and processes, and that is what it should do—the Minister, the Labour party and I are in full alignment on that. We do not think that the Bill should restrict freedom of speech. I would just like to put that on the record.

We also share the concerns expressed by the hon. Member for Don Valley about the Secretary of State’s potential powers, the limited scope and the extra scrutiny that Parliament might have to undertake on priority harms, so I hope he will support some of our later amendments.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am grateful to the shadow Minister for confirming her support for free speech. Perhaps I could take this opportunity to apologise to you, Sir Roger, and to Hansard for turning round. I will try to behave better in future.

John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

I find myself not entirely reassured, so I think we should press the amendment to a vote.

Question put, That the amendment be made.

Division 27

Ayes: 5


Labour: 3
Scottish National Party: 2

Noes: 8


Conservative: 8

None Portrait The Chair
- Hansard -

As I have indicated already, I do not propose that we have a clause stand part debate. It has been exhaustively debated, if I may say so.

Clause 54 ordered to stand part of the Bill.

Clause 55

Regulations under sections 53 and 54

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 62, in clause 55, page 52, line 4, after “OFCOM” insert

“and other stakeholders, including organisations that campaign for the removal of harmful content online”.

This amendment requires the Secretary of State to consult other stakeholders before making regulations under clause 53 or 54.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause stand part.

Clause 56 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

We all know that managing harmful content, unlike illegal content, is more about implementing systems that prevent people from encountering it than about removing it entirely. At the moment, there are no duties on the Secretary of State to consult anyone other than Ofcom ahead of making regulations under clauses 53 and 54. We have discussed at length the importance of transparency, and surely the Minister can agree that the process should be widened, as we have heard from those on the Government Back Benches.

Labour has said time and again that it should not be for the Secretary of State of the day to determine what constitutes harmful content for children or adults. Without the important consultation process outlined in amendment 62, there are genuine concerns that that could lead to a damaging precedent whereby a Secretary of State, not Parliament, has the ability to determine what information is harmful. We all know that the world is watching as we seek to work together on this important Bill, and Labour has genuine concerns that without a responsible consultation process, as outlined in amendment 62, we could inadvertently be suggesting to the world that this fairly dogmatic approach is the best way forward.

Amendment 62 would require the Secretary of State to consult other stakeholders before making regulations under clauses 53 and 54. As has been mentioned, we risk a potentially dangerous course of events if there is no statutory duty on the Secretary of State to consult others when determining the definition of harmful content. Let me draw the Minister’s attention to the overarching concerns of stakeholders across the board. Many are concerned that harmful content for adults requires the least oversight, although there are potential gaps that mean that certain content—such as animal abuse content—could completely slip through the net. The amendment is designed to ensure that sufficient consultation takes place before the Secretary of State makes important decisions in directing Ofcom.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

On that point, I agree wholeheartedly with my hon. Friend. It is important that the Secretary of State consults campaign organisations that have expertise in the relevant areas. Much as we might want the Secretary of State to be informed on every single policy issue, that is unrealistic. It is also important to acknowledge the process that we have been through with the Bill: the expertise of organisations has been vital in some of the decisions that we have had to make. My hon. Friend gave a very good example, and I am grateful to animal welfare groups for their expertise in highlighting the issue of online abuse of animals.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I completely agree with my hon. Friend. As parliamentarians we are seen as experts in an array of fields. I do not purport to be an expert in all things, as it is more a jack of all trades role, and it would be impossible for one Secretary of State to be an expert in everything from animal abuse to online scam ads, from fraud to CSAM and terrorism. That is why it is fundamental that the Secretary of State consults with experts and stakeholders in those fields, for whom these things are their bread and butter—their day job every day. I hope the Minister can see that regulation of the online space is a huge task to take on for us all. It is Labour’s view that any Secretary of State would benefit from the input of experts in specific fields. I urge him to support the amendment, especially given the wider concerns we have about transparency and power sharing in the Bill.

It is welcome that clause 56 will force Ofcom, as the regulator, to carry out important reviews that will assess the extent to which content is harmful to children and adults when broadly appearing on user-to-user services. As we have repeatedly said, transparency must be at the heart of our approach. While Labour does not formally oppose the clause, we have concerns about subsection (5), which states:

“The reports must be published not more than three years apart.”

The Minister knows that the Bill has been long awaited, and we need to see real, meaningful change and updates now. Will he tell us why it contains a three-year provision?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I thank the Minister for his clarification earlier and his explanation of how the categories of primary priority content and priority content can be updated. That was helpful.

Amendment 62 is excellent, and I am more than happy to support it.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I have a short comment on clause 56, which is an important clause because it will provide an analysis of how the legislation is working, and that is what Members want to see. To the point that the hon. Member for Pontypridd set out, it is right that Ofcom probably will not report until 2026, given the timeframe for the Bill being enacted. I would not necessarily want Ofcom to report sooner, because system changes take a long time to bed in. It does pose the question, however, of how Parliament will be able to analyse whether the legislation or its approach needs to change between now and 2026. That reiterates the need—which I and other hon. Members have pointed out—for some sort of standing committee to scrutinise the issues. I do not personally think it would be right to get Ofcom to report earlier, because it might be an incomplete report.

15:30
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have heard my right hon. Friend’s points about a standing Joint Committee for post-legislative implementation scrutiny. On the comments about the time, I agree that the Ofcom review needs to be far enough into the future that it can be meaningful, hence the three-year time period.

On the substance of amendment 62, tabled by the shadow Minister, I can confirm that the Government are already undertaking research and working with stakeholders on identifying what the priority harms will be. That consideration includes evidence from various civil society organisations, victims’ organisations and many others who represent the interests of users online. The wider consultation beyond Ofcom that the amendment would require is happening already as a matter of practicality.

We are concerned, however, that making this a formal consultation in the legal sense, as the amendment would, would introduce some delay, because a whole sequence of things has to happen after Royal Assent. First, we have to designate the priority harms by statutory instrument, and then Ofcom has to publish its risk assessments and codes of practice. If we insert into that a formal legal consultation step, it would add at least four or even six months to the process of implementing the Act. I know that that was not the hon. Lady’s intention and that she is concerned about getting the Act implemented quickly. For that reason, the Government do not want to insert a formal legal consultation step into the process, but I am happy to confirm that we are engaging in the consultation already on an informal basis and will continue to do so. I ask respectfully that amendment 62 be withdrawn.

The purpose of clauses 55 and 56 has been touched on already, and I have nothing in particular to add.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful for the Minister’s comments on the time that these things would take. I cannot see how they could not happen concurrently with the current consultation, or why it would take an additional four to six months. Could he clarify that?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

A formal statutory consultation could happen only after the passage of the Bill, whereas the informal non-statutory consultation we can do, and are doing, now.

Question put, That the amendment be made.

Division 28

Ayes: 5


Labour: 3
Scottish National Party: 2

Noes: 7


Conservative: 7

Clause 55 ordered to stand part of the Bill.
Clause 56 ordered to stand part of the Bill.
Clause 57
User identity verification
Question proposed, That the clause stand part of the Bill.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I have some brief comments on the clause. The Labour party very much welcomes the addition to user verification duties in the revised Bill. A range of groups, including Clean Up the Internet, have long campaigned for a verification requirement process, so this is a positive step forward.

We do, however, have some concerns about the exact principles and minimum standards for the user verification duty, which I will address when we consider new clause 8. We also have concerns about subsection (2), which states:

“The verification process may be of any kind (and in particular, it need not require documentation to be provided).”

I would be grateful if the Minister could clarify exactly what that process will look like in practice.

Lastly, as Clean Up the Internet has said, we need further clarification on whether users will be given a choice of how they verify and of the verification provider itself. We can all recognise that there are potential downsides to the companies that own the largest platforms—such as Meta, Google, Twitter and ByteDance—developing their own in-house verification processes and making them the only option for users wishing to verify on their platform. Indeed, some users may have reservations about sharing even more personal data with those companies. Users of multiple social media platforms can find it inconvenient and confusing, and could be required to go through multiple different verification processes on different platforms to achieve the same outcome of confirming their real name.

There is a risk of the largest platforms seeking to leverage their dominance of social media to capture the market for ID verification services, raising competition concerns. I would be grateful if the Minister could confirm his assessment of the potential issues around clause 57 as it stands.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I rise to welcome clause 57. It is an important part of the Bill and shows the Government acknowledging that anonymity can have a significant impact on the harms that affect victims. There is a catalogue of evidence of the harm done by those posting anonymously. Anonymity appears to encourage abusive behaviour, and there is evidence dating back to 2015 showing that anonymous accounts are more likely to share sexist comments and that online harassment victims are often not able to identify their perpetrators because of the way anonymity works online. The Government are doing an important thing here and I applaud them.

I underline that again by saying that recent research from Compassion in Politics showed that more than one in four people were put off posting on social media because of the fear of abuse, particularly from anonymous posters. Far from the status quo promoting freedom of speech, it actually deters freedom of speech, as we have said in other debates, and it particularly affects women. The Government are to be applauded for this measure.

In the work I was doing with the FA and the Premier League around this very issue, I particularly supported their call for a twin-track approach to verified accounts that said that they should be the default and that people should automatically be able to opt out of receiving posts from unverified accounts. The Bill does not go as far as that, and I can understand the Government’s reasons, but I gently point out that 81% of the people who took part in the Compassion in Politics research would willingly provide identification to get a verified account if it reduced unverified posts. They felt that was important. Some 72% supported the idea if it reduced the amount of anonymous posting.

I am touching on clause 58, but I will not repeat myself when we debate that clause. I hope that it will be possible in the code of practice for Ofcom to point out the clear benefits of having verified accounts by default and perhaps urge responsible providers to do the responsible thing and allow their users to automatically filter out unverified accounts. That is what users want, and it is extraordinary that large consumer organisations do not seem to want to give consumers what they want. Perhaps Ofcom can help those organisations understand what their consumers want, certainly in Britain.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The right hon. Lady’s speech inspired me to stand up and mention a couple of things. My first question is about user empowerment in relation to this clause. The clause applies only to adults. I can understand the issues that there may be with verifying the identity of children, but if that means that children are unable to block unverified accounts because they cannot verify their own account, the internet becomes a less safe place for children than for adults in this context, which concerns me.

To be honest, I do not know how children’s identities could be verified, but giving them access to the filters that would allow them to block unverified accounts, whether or not they are able to verify themselves—because they are children and therefore may not have the identity documentation they need—would be very helpful.

I appreciate the points that the right hon. Member was making, and I completely agree with her on the requirement for user verification, but I have to say that I believe there is a place for anonymity on the internet. I can understand why, for a number of people, that is the only way that they can safely access some of the community support that they need.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

Just for clarity, the twin-track approach does not outlaw anonymity. It just means that people have verified accounts by default; they do not have to opt into it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I appreciate that clarification. I just wanted to make it absolutely clear that I strongly believe that anonymity is a very good protection, not just for people who intend to do bad on the internet, but for people who are seeking out community, particularly. I think that that is important.

If you will allow me to say a couple of things about the next clause, Sir Roger, Mencap raised the issue of vulnerable users, specifically vulnerable adult users, in relation to the form of identity verification. If the Minister or Ofcom could give consideration to perhaps including travel passes or adult passes, it might make the internet a much easier place to navigate for people who do not have control of their own documentation—they may not have access to their passport, birth certificate, or any of that sort of thing—but who would be able to provide a travel pass, because that is within their ownership.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We have heard quite a lot about the merits of clause 57, and I am grateful to colleagues on both sides for pointing those out. The hon. Member for Pontypridd asked about the effectiveness of the user identity verification processes and how those might occur—whether they would be done individually by each company for their own users, or whether a whole industry would develop even further, with third parties providing verification that could then be used across a whole number of companies.

Some of those processes exist already in relation to age verification, and I think that some companies are already active in this area. I do not think that it would be appropriate for us, in Parliament, to specify those sorts of details. It is ultimately for Ofcom to issue that guidance under clause 58, and it is, in a sense, up to the market and to users to develop their own preferences. If individual users prefer to verify their identity once and then have that used across multiple platforms, that will itself drive the market. I think that there is every possibility that that will happen. [Interruption.]

None Portrait The Chair
- Hansard -

Order. There is a Division on the Floor of the House. The Committee will sit again in 15 minutes. As far as I am aware, there will only be one vote on this; if there are two, we will return 15 minutes later than that.

15:43
Sitting suspended for a Division in the House.
15:55
On resuming
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I was just concluding my remarks on clause stand part, Sir Roger. User choice and Ofcom guidance will ultimately determine the shape of this market.

The shadow Minister, the hon. Member for Pontypridd, expressed concerns about privacy. That is of course why the list of people Ofcom must consult—at clause 58(3)(a)—specifies the Information Commissioner, to ensure that Ofcom’s guidance properly protects the privacy of users, for the reasons that the shadow Minister referred to in her speech.

Finally, on competition, if anyone attempts to develop an inappropriate monopoly position in this area, the Competition and Markets Authority’s usual powers will apply. On that basis, I commend the clause to the Committee.

Question put and agreed to.

Clause 57 accordingly ordered to stand part of the Bill.

Clause 58

OFCOM’s guidance about user identity verification

Question proposed, That the clause stand part of the Bill.

15:59
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

As we have said previously, it is absolutely right that Ofcom produces guidance for providers of category 1 services to assist with their compliance with the duty. We very much welcome the inclusion and awareness of identity verification forms for vulnerable adult users in subsection (2); once again, however, we feel that that should go further, as outlined in new clause 8.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clause 58, which was touched on in our last debate, simply sets out Ofcom’s duty to publish guidance for category 1 services to assist them in complying with the user identification duty set out in clause 57. We have probably covered the main points, so I will say nothing further.

Question put and agreed to.

Clause 58 accordingly ordered to stand part of the Bill.

Clause 59

Requirement to report CSEA content to the NCA

Question proposed, That the clause stand part of the Bill.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

You are really moving us at pace, Sir Roger. It is a pleasure to serve in Committee with you in the Chair.

It is welcome that regulated services will have to report all child sexual exploitation and abuse material that they detect on their platform. The Government’s decision to move away from the approach of a regulatory code of practice to a mandatory reporting requirement is an important improvement to the draft Bill.

For companies to report child sexual exploitation and abuse material correctly to the mandatory reporting body, they will need access to accurate datasets that will determine whether something that they are intending to report is child sexual exploitation and abuse content. What guidance will be made available to companies so that they can proactively detect CSEA, and what plans are in place to assist companies to identify potential CSEA that has not previously been identified? The impact assessment mentions that, for example, BT is planning to use the Internet Watch Foundation’s hash list, which is compliant with UK law enforcement standards, to identify CSEA proactively. Hashing is a technology used to prevent access to known CSEA; a hash is a unique string of letters and numbers which is applied to an image and which can then be matched every time a user attempts to upload a known illegal image to a platform. It relies, however, on CSEA already having been detected. What plans are in place to assist companies to identify potential CSEA?
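
To illustrate the hash-matching technique described here, a minimal sketch in Python follows. The hash function, the hard-coded hash list and the file handling are illustrative assumptions only; a production system using a list such as the Internet Watch Foundation’s would typically rely on perceptual hashes (for example, PhotoDNA) that survive resizing and re-encoding, rather than exact cryptographic hashes.

```python
import hashlib
from pathlib import Path

# Illustrative stand-in for a vetted list of hashes of known illegal images.
# In practice this would be loaded from an authoritative source such as the
# IWF hash list, not hard-coded.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(path: Path) -> str:
    """Return the SHA-256 digest of an image file as a hex string."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_content(path: Path) -> bool:
    """True if the uploaded image's hash appears on the known-content list,
    in which case the platform would block the upload and report it."""
    return image_hash(path) in KNOWN_IMAGE_HASHES
```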

Finally, it is important that the introduction of mandatory reporting does not impact on existing international reporting structures. Many of the largest platforms in the scope of the Bill are US-based and required under US law to report CSEA material detected on their platform to the National Centre for Missing and Exploited Children, which ensures that information relevant to UK law enforcement is referred to it for investigation.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

To answer the shadow Minister’s question about the duty to detect CSEA proactively—because, as she says, we have to detect it before we can report it—I confirm that there are already duties in the Bill to prevent and detect CSEA proactively, because CSEA is a priority offence in the schedule 6 list of child exploitation and abuse offences, and there is a duty for companies to prevent those proactively. In preventing them proactively, they will by definition identify them. That part of her question is well covered.

The hon. Lady also asked about the technologies available to those companies, including hash matching—comparing images against a known database of child sexual exploitation images. A lot of technology is being developed that can proactively spot child sexual exploitation in new images that are not on the hash matching database. For example, some technology combines age identification with nude image identification; by putting them together, we can identify sexual exploitation of children in images that are new and are not yet in the database.
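
Purely as an illustration of the combined approach the Minister describes, here is a minimal Python sketch in which two upstream classifier signals (an estimated age and a nudity score) are combined so that only images triggering both are flagged for human review. The signal names, thresholds and data structure are assumptions made for the sketch, not features of any real detection system.

```python
from dataclasses import dataclass

@dataclass
class ImageSignals:
    estimated_age: float   # output of a hypothetical age-estimation model, in years
    nudity_score: float    # output of a hypothetical nudity-detection model, 0 to 1

def should_flag_for_review(signals: ImageSignals,
                           age_threshold: float = 18.0,
                           nudity_threshold: float = 0.9) -> bool:
    """Flag an image only when both signals trigger: the subject appears to be a
    minor and the nudity model is confident. Flagged items would then go to
    human review and, where confirmed, into the reporting pipeline."""
    return (signals.estimated_age < age_threshold
            and signals.nudity_score >= nudity_threshold)

# Example: scores produced by the two (hypothetical) upstream models.
print(should_flag_for_review(ImageSignals(estimated_age=14.0, nudity_score=0.97)))  # True
print(should_flag_for_review(ImageSignals(estimated_age=35.0, nudity_score=0.97)))  # False
```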

To ensure that such new technology can be used, we have the duties under clause 103, which gives Ofcom the power to mandate—to require—the use of certain accredited technologies in fighting not just CSEA, but terrorism. I am sure that we will discuss that more when we come to that clause. Combined, the requirement to proactively prevent CSEA and the ability to specify technology under clause 103 will mean that companies will know about the content that they now, under clause 59, have to report to the National Crime Agency. Interestingly, the hon. Member for Worsley and Eccles South mentioned that that duty already exists in the USA, so it is good that we are matching that requirement in our law via clause 59, which I hope that the Committee will agree should stand part of the Bill.

Question put and agreed to.

Clause 59 accordingly ordered to stand part of the Bill.

Clause 60

Regulations about reports to the NCA

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to consider clause 61 stand part.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The additional regulations created by the Secretary of State in connection with the reports will have a lot resting on them. It is vital that they receive the appropriate scrutiny when the time comes. For example, the regulations must ensure that referrals to the National Crime Agency made by companies are of a high quality, and that requirements are easy to comply with. Prioritising the highest risk cases will be important, particularly where there is an immediate threat to the safety and welfare of a child.

Clause 60 sets out that the Secretary of State’s regulations must include

“provision about cases of particular urgency”.

Does the Minister have an idea what that will look like? What plans are in place to ensure that law enforcement can prioritise the highest risk and harm cases?

Under the new arrangements, the National Crime Agency as the designated body, the Internet Watch Foundation as the appropriate authority for notice and takedown in the UK, and Ofcom as the regulator for online harms will all hold a vast amount of information on the scale of the threat posed by child sexual exploitation and illegal content. How will the introduction of mandatory reporting assist those three organisations in improving their understanding of how harm manifests online? How does the Minister envisage the organisations working together to share information to better protect children online?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am glad that clause 60 will be in the Bill and that there will be a duty to report to the NCA. On subsection (3), though, I would like the Minister to clarify that if the Secretary of State believes that the Scottish Ministers would be appropriate people to consult, they would consult them, and the same for the Northern Ireland Executive.

I would appreciate the Minister explaining how clause 61 will work in a Scottish context, because that clause talks about the Crime and Courts Act 2013. Does a discussion need to be had with Scottish Ministers, and perhaps Northern Ireland Ministers as well, to ensure that information sharing takes place seamlessly with devolved areas with their own legal systems, to the same level as within England and Wales? If the Minister does not have an answer today, which I understand that he may not in detail, I am happy to hear from him later; I understand that it is quite a technical question.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Member for Worsley and Eccles South asks about the prioritisation of reports made to the NCA under the new statutory provisions. The prioritisation of investigations is an operational matter for the NCA, acting as a law enforcement body. I do not think it would be right either for myself as a Minister or for Parliament as a legislative body to specify how the NCA should conduct its operational activities. I imagine that it would pursue the most serious cases as a matter of priority, and if there is evidence of any systemic abuse it would also prioritise that, but it really is a matter for the NCA, as an operationally independent police force, to decide for itself. I think it is fairly clear that the scope of matters to be contained in these regulations is fairly comprehensive, as one would expect.

On the questions raised by the hon. Member for Aberdeen North, the Secretary of State might consult Scottish Ministers under clause 63(6)(c), particularly those with responsibility for law enforcement in Scotland, and the same would apply to other jurisdictions. On whether an amendment is required to cover any matters to do with the procedures in Scotland equivalent to the matter covered in clause 61, we do not believe that any equivalent change is required to devolved Administration law. However, in order to be absolutely sure, we will get the hon. Lady written confirmation on that point.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I am not sure that the Minister has answered my question on clause 60. I think we all agree that law enforcement agencies can decide their own priorities, quite rightly, but clause 60(2)(d) sets out that the Secretary of State’s regulations must include

“provision about cases of particular urgency”.

I asked the Minister what that would look like.

Also, we think it is pretty important that the National Crime Agency, the Internet Watch Foundation and Ofcom work together on mandatory reporting. I asked him how he envisaged them working together to share information, because the better they do that, the more children are protected.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I apologise for missing those two points. On working together, the hon. Lady is right that agencies such as the Internet Watch Foundation and others should co-operate closely. There is already very good working between the Internet Watch Foundation, law enforcement and others—they seem to be well networked together and co-operating closely. It is appropriate to put on the record that Parliament, through this Committee, thinks that co-operation should continue. That communication and the sharing of information on particular images is obviously critical.

As the clause states, the regulations can set out expedited timeframes in cases of particular urgency. I understand that to mean cases where there might be an immediate risk to a child’s safety, or where somebody might be at risk in real time, as opposed to something historic—for example, an image that might have been made some time ago. In cases where it is believed abuse is happening at the present time, there is an expectation that the matter will be dealt with immediately or very close to immediately. I hope that answers the shadow Minister’s questions.

Question put and agreed to.

Clause 60 accordingly ordered to stand part of the Bill.

Clause 61 ordered to stand part of the Bill.

Clause 62

Offence in relation to CSEA reporting

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I beg to move amendment 1, in clause 62, page 55, line 14, leave out “maximum summary term for either-way offences” and insert “general limit in a magistrates’ court”.

Amendments 1 to 5 relate to the maximum term of imprisonment on summary conviction of an either-way offence in England and Wales. Amendments 1 to 4 insert a reference to the general limit in a magistrates’ court, meaning the time limit in section 224(1) of the Sentencing Code, which, currently, is 12 months.

None Portrait The Chair
- Hansard -

With this it will be convenient to consider Government amendments 4, 2, 3 and 5.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

These amendments make some technical drafting changes to the Bill in relation to sentencing penalties for either-way offences in the courts of England and Wales. They bring the Bill into line with recent changes implemented following the passage of the Judicial Review and Courts Act 2022. The change uses the new term

“general limit in a magistrates’ court”

to account for any future changes to the sentencing limit in the magistrates’ court. The 2022 Act includes a secondary power to switch, by regulations, between a 12-month and six-month maximum sentence in the magistrates’ court, so we need to use the more general language in this Bill to ensure that changes back and forth can be accommodated. If we just fix a number, it would become out of sync if switches are made under the 2022 Act.

Amendment 1 agreed to.

Question proposed, That the clause, as amended, stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to consider clause 63 stand part.

16:15
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Clause 63 sets out that the CSEA content required to be reported must have been published, generated, uploaded or shared either in the UK, by a UK citizen, or including a child in the UK. Subsection (6) requires services to provide evidence of such a link to the UK, which might be quite difficult in some circumstances. I would appreciate the Minister outlining what guidance and support will be made available to regulated services to ensure that they can fulfil their obligations. This is about how services are to provide evidence of such a link to the UK.

Takeovers, mergers and acquisitions are commonplace in the technology industry, and many companies are bought out by others based overseas, particularly in the United States. Once a regulated service has been bought out by a company based abroad, what plans are in place to ensure either that the company continues to report to the National Crime Agency or that it is enabled to transition to another mandatory reporting structure, as may be required in another country in the future? That is particularly relevant as we know that the European Union is seeking to introduce mandatory reporting functions in the coming years.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clause 62 creates an offence, as we discussed earlier, of knowingly or recklessly providing inaccurate information to the NCA in relation to CSEA reporting, the penalty for which is imprisonment, a fine or both. Where a company seeks to evade its responsibility, or disregards the importance of the requirement to report CSEA by providing inaccurate information, it will be liable for prosecution. We are backing the requirement to report CSEA with significant criminal powers.

Clause 63 provides definitions for the terms used in chapter 2 of part 4, in relation to the requirement to report CSEA. In summary, a UK provider of a regulated service is defined as a provider that is

“incorporated or formed under the law of any part of the United Kingdom”

or where it is

“individuals who are habitually resident in the United Kingdom”.

The shadow Minister asked about the test and what counts, and I hope that provides the answer. We are defining CSEA content as content containing CSEA that a company becomes aware of. A company can become aware of such content by any means, including through the use of automated systems and processes, human moderation or user reporting.

With regard to the definition of UK-linked CSEA, which the shadow Minister also asked about, that refers to content that may have been published and shared in the UK, or where the suspected offender or victim is a UK national or is located in the UK. The definition of what counts as a UK link is quite wide, because it includes not only the location of the offender or victim but where the content is published or shared.

Kirsty Blackman

I have a specific question—the Minister answered a similar question from me earlier. The Bill says that the location of the child “is” in the UK. Would it be reasonable to expect that if a company suspected the child “was” in the UK, although not currently, that would be in scope as something required to be reported? I know that is technical, but if the “was” is included in the “is” then that is much wider and more helpful than just including the current location.

Chris Philp

If the child had been in the UK when the offence was committed, that would ordinarily be subject to UK criminal law, because the crime would have been committed in the UK. The test is: where was the child or victim at the time the offence was committed? As I said a moment ago, however, the definition of “UK-linked” is particularly wide and includes

“the place where the content was published, generated, uploaded or shared.”

The word “generated”—I am reading from clause 63(6)(a), at the top of page 56—is clearly in the past tense and would include the circumstance that the hon. Lady described.

Barbara Keeley

What the Minister has said is helpful, but the question I asked was about what guidance and support will be made available to regulated services. We all want this to work, because it is one of the most important aspects of the Bill—many aspects are important. He made it clear to us that the definition is quite wide, for both the general definitions and the “UK-linked” content. The point of the question was, given the possible difficulties in some circumstances, what guidance and support will be made available?

Chris Philp

I anticipate that the National Crime Agency will issue best practice guidance. A fair amount of information about the requirements will also be set out in the regulations that the Secretary of State will issue under clause 60, which we have already debated. So it is a combination of those regulations and National Crime Agency best practice guidance. I hope that answers the question.

Finally, on companies being taken over, if a company ceases to be UK-linked, we would expect it to continue to discharge its reporting duties, which might include reporting not just in the UK but to its domestic reporting agency—we have already heard the US agency described and referenced.

I hope that my answers demonstrate that the clause is intended to be comprehensive and effective. It should ensure that the National Crime Agency gets all the information it needs to investigate and prosecute CSEA in order to keep our children safe.

Question put and agreed to.

Clause 62, as amended, accordingly ordered to stand part of the Bill.

Clause 63 ordered to stand part of the Bill.

Clause 64

Transparency reports about certain Part 3 services

Barbara Keeley

I beg to move amendment 54, in clause 64, page 56, line 29, leave out “Once” and insert “Twice”.

This amendment would change the requirement for transparency report notices from once a year to twice a year.

The Chair

With this it will be convenient to discuss the following:

Clause stand part.

Amendment 55, in schedule 8, page 188, line 42, at end insert—

“31A The notice under section 64(1) must require the provider to provide the following information about the service—

(a) the languages in which the service has safety systems or classifiers;

(b) details of how human moderators employed or engaged by the provider are trained and supported;

(c) the process by which the provider takes decisions about the design of the service;

(d) any other information that OFCOM considers relevant to ensuring the safe operation of the service.”

This amendment sets out details of information Ofcom must request be provided in a transparency report.

That schedule 8 be the Eighth schedule to the Bill.

Clause 65 stand part.

Barbara Keeley

The duties on regulated services set out in the clause are welcome. Transparency reports will be a vital tool to hold platforms to account for understanding the true drivers of online harm. However, asking platforms to submit transparency reports once a year does not reflect how rapidly we know the online world changes. As we have seen time and again, the online environment can shift significantly in a matter of months, if not weeks. We have seen that in the rise of disinformation about covid, which we have talked about, and in the accelerated growth of platforms such as TikTok.

Increasing the frequency of transparency reports from annual to biannual will ensure that platforms stay on the pulse of emergent risks, allowing Ofcom to do the same in turn. The amendment would also mean that companies focus on safety, rather than just profit. As has been touched on repeatedly, that is the culture change that we want to bring about. It would go some way towards preventing complacency about reporting harms, perhaps forcing companies to revisit the nature of harm analysis, management and reduction. In order for this regime to be world-leading and ambitious—I keep hearing the Minister using those words about the Bill—we must demand the most that we can from the highest-risk services, including on the important duty of transparency reporting.

Moving to clauses 64 and 65 stand part, transparency reporting by companies and Ofcom is important for analysing emerging harms, as we have discussed. However, charities have pointed out that platforms have a track record of burying documents and research that point to risk of harm in their systems and processes. As with other risk assessments and reports, such documents should be made public, so that platforms cannot continue to hide behind a veil of secrecy. As I will come to when I speak to amendment 55, the Bill must be ambitious and bold in what information platforms are to provide as part of the clause 64 duty.

Clause 64(3) states that, once issued with a notice by Ofcom, companies will have to produce a transparency report, which must

“be published in the manner and by the date specified in the notice.”

Can the Minister confirm that that means regulated services will have to publish transparency reports publicly, not just to Ofcom? Can he clarify that that will be done in a way that is accessible to users, similarly to the requirements on services to make their terms of service and other statements clear and accessible? Some very important information will be included in those reports that will be critical for researchers and civil society when analysing trends and harms. It is important that the data points outlined in schedule 8 capture the information needed for those organisations to make an accurate analysis.

Kim Leadbeater

The evidence we heard from Frances Haugen set out how important transparency is. If internet and service providers have nothing to hide, transparency is surely in their interests as well. From my perspective, there is little incentive for the Government not to support the amendment, if they want to help civil society, researchers, academics and so on in improving a more regulated approach to transparency generally on the internet, which I am sure we all agree is a good thing.

Barbara Keeley

I very much agree. We cannot emphasise that enough, and it is useful that my hon. Friend has set that out, adding to what I was saying.

Amendment 55 sets out, in new paragraph 31A, the details of the information that Ofcom must request be provided in a transparency report. First, transparency disclosures required by the Bill should include how large companies allocate resources to tackling harm in different languages—an issue that was rightly raised by the hon. Member for Ochil and South Perthshire. As we heard from Frances Haugen, many of Meta’s safety systems have only a subset of their detection systems available for languages other than English. Languages such as Welsh have almost no safety systems live on Facebook. That is neither fair nor safe.

When we consider that more than 250 languages are spoken in London alone, the inconsistency of safety systems becomes very concerning. Charities have warned that people accessing Facebook in different languages are being exposed to very different levels of risk, with some language versions of Facebook having few or none of the safety systems that protect versions of the site in other languages.

When giving evidence to the Committee last month, Richard Earley disclosed that Meta regulated only 70 languages. Given that around 3 billion people use Facebook on a monthly basis across the world, that is clearly inadequate.

Dean Russell

One of the things we found on the Joint Committee last year was the consistent message that we should not need to put this Bill in place. I want to put on the record my continued frustration that Meta and the other social media platforms are requiring us to put this Bill in place because they are not doing the monitoring, engaging in that way or putting users first. I hope that the process of going through the Bill has helped them to see the need for more monitoring. It is disappointing that we have had to get to this point. The UK Government are having to lead the world by putting this Bill in place—it should not be necessary. I hope that the companies do not simply follow what we are putting forward, but go much further and see that it is imperative to change the way they work and support their users around the world.

Barbara Keeley

I thank the hon. Gentleman and I agree. It is a constant frustration that we need this Bill. We do need it, though. In fact, amendment 55 would really assist with that, by requiring those services to go further in transparency reporting and to disclose

“the languages in which the service has safety systems or classifiers”.

We need to see what they are doing on this issue. It is an easily reported piece of information that will have an outsized impact on safety, even for English speakers. It will help linguistic groups in the multilingual UK and around the world.

Reporting on language would not be a big burden on companies. In her oral evidence, Frances Haugen told the Committee that large platforms can trivially produce this additional data merely by changing a single line of code when they do their transparency reports. We must not become wrapped up in the comfort of the language we all speak and ignore the gaping loophole left for other languages, which allows harms to slip through.

16:29
The second set of information on which the amendment would require companies to report is the employment, training and support of the human moderators who are employed to consider harmful content. There is chilling evidence of how the largest platforms outsource their content moderation to factories of poorly paid, ill-treated and highly vulnerable workers. These content moderators see the most disturbing, traumatising and abhorrent content. They are the frontline of defence in reducing the scale of harm for other users. Contracting out moderation is just another way for the platforms to outsource risk, to prioritise profits over safety and to shirk their responsibilities. Platforms must be transparent about who moderates their online content, how they are provided for, and what protections are in place. This is basic decency that we cannot trust the platforms to demonstrate without a legal obligation to do so, which goes back to the point that the hon. Member for Watford made a while ago. We need to lead them in this effort and change their culture so that they start to do these things.
Under questioning from my hon. Friend the Member for Pontypridd last month, Richard Earley admitted that he had no idea how many human moderators work for Facebook directly, or how many abide by a UK standard code of conduct. That is disgraceful, yet Frances Haugen said it would be a simple matter, because changing a single line of code would provide that information. In fact, she said:
“I guarantee you that they know exactly how many moderators they have hired—they have dashboards to track these numbers. The fact that they do not disclose those numbers shows why we need to pass laws to have mandatory accountability… We need to ensure that there is always enough staffing and that moderators can play an active role in this process.”––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 185, Q313.]
We therefore have a duty to keep users safe, and the Bill must ensure that platforms do the right thing.
The third additional transparency disclosure is to show how companies make decisions about service design. Preventing harm to the public is impossible unless both the regulator and civil society know what is happening inside these large tech companies. We know that if something cannot be detected, it clearly cannot be reported. Knowing how companies make decisions will allow for greater scrutiny of the information they disclose. Without it, there is a risk that Ofcom receives skewed figures and an incomplete picture. Amendment 55 would be a step in the right direction towards making the online environment more transparent, fair and safe for those working to tackle harms, and I hope the Minister will consider its merits.
Chris Philp

To start with, it is worth saying that clause 64 is extremely important. In the course of debating earlier clauses, Opposition Members rightly and repeatedly emphasised how important it is that social media platforms are compelled to publish information. The testimony that Frances Haugen gave to the Joint Committee and to this Committee a few weeks ago demonstrates how important that is. Social media platforms are secretive rather than open. They seek to disguise what is going on, even though what they are doing has a global impact. So the transparency power in clause 64 is a critical part of the Bill and will dramatically improve the insight available to parliamentarians, the wider public, civil society campaigners and academics. It will dramatically open up our understanding of what is going on inside these companies, so it is extremely important indeed.

Amendment 54 seeks to increase the frequency of transparency reporting from once a year to twice a year. To be honest, we do not want to do this unreasonably frequently, and our sense is that once a year, rather than twice a year, is the right regularity. We therefore do not support the amendment. However, Members will notice that there is an ability in clause 64(12) for the Secretary of State, by regulation, to

“amend subsection (1) so as to change the frequency of the transparency reporting process.”

If it turns out in due course that once a year is not enough and we would like to do it more frequently—for example, twice a year—there is the power for those regulations to be used so that the reporting occurs more frequently. The frequency is not set in stone.

I turn to amendment 55, which sets out a number of topics that would be included in reporting. It is important to say that, as a quick glance at schedule 8 shows, the remit of the reports is already extremely wide in scope. Hon. Members will see that paragraph 5 specifies that reports can cover

“systems and processes for users to report content which they consider to be illegal”

or “harmful”, and so on. Paragraph 6 mentions:

“The systems and processes that a provider operates to deal with illegal content, content that is harmful to children”,

and so on. Therefore, the topics that amendment 55 speaks to are already covered by the schedule, and I would expect such things to be reported on. We have given Ofcom the explicit powers to do that and, rather than prescribe such details in the Bill, we should let Ofcom do its job. It certainly has the powers to do such things—that is clearly set out in the schedule—and I would expect, and obviously the Opposition would expect, that it will do so. On that basis, I will gently resist amendments 54 and 55.

Barbara Keeley

On amendment 55, I want to come back to the Minister on two points about languages that were made by the hon. Member for Aberdeen North. I think most people would be shocked to discover how few languages safety systems operate in, so that people speaking a language other than English may not be protected. I also think that people will be shocked about what I outlined on the employment of moderators and how badly they are paid and trained. There are factories full of people doing that important task.

I recommend that the Minister think again about requiring transparency reports to include details of the human moderators who are employed or engaged and how they are trained and supported. It is a bit like when we find out about factories producing various items under appalling conditions in other parts of the world—we need transparency on these issues to make people do something about it. These platforms will not do anything about it on their own. Under questioning from my hon. Friend the Member for Pontypridd, Richard Earley admitted that he had no idea how many human moderators were working for Facebook. That is appalling and we must do something about it.

Chris Philp

I obviously have sympathy with the objectives, but the topics covered in schedule 8, which include the systems and processes for responding to illegal and harmful content and so on, give Ofcom the power to do what the hon. Member requires. On the language point, the risk assessments that companies are required to do are hard-edged duties in the Bill, and they will have to include an assessment of languages used in the UK, which is a large number of languages—obviously, it does not include languages spoken outside the UK. So the duty to risk-assess languages already exists. I hope that gives the hon. Member reassurance. She is making a reasonable point, and I would expect that, in setting out transparency requirements, Ofcom will address it. I am sure that it will look at our proceedings to hear Parliament’s expectations, and we are giving it those powers, which are clearly set out in schedule 8.

Barbara Keeley

I will just make a final point. The Bill gives Ofcom powers when it already has so much to do. We keep returning to the point of how much will ride on Ofcom’s decisions. Our amendments would make clear the requirement for transparency reporting relating to the language issue, as well as the employment of human moderators and how they are trained and supported. If we do not point that out to Ofcom, it really has enough other things to be doing, so we are asking for these points to be drawn out specifically. As in so many of our amendments, we are just asking for things to be drawn out so that they happen.

Question put, That the amendment be made.

Division 29

Ayes: 4


Labour: 3
Scottish National Party: 1

Noes: 7


Conservative: 7

Clause 64 ordered to stand part of the Bill.
Schedule 8
Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services
Amendment proposed: 55, in schedule 8, page 188, line 42, at end insert—
“31A The notice under section 64(1) must require the provider to provide the following information about the service—
(a) the languages in which the service has safety systems or classifiers;
(b) details of how human moderators employed or engaged by the provider are trained and supported;
(c) the process by which the provider takes decisions about the design of the service;
(d) any other information that OFCOM considers relevant to ensuring the safe operation of the service.”—(Barbara Keeley.)
This amendment sets out details of information Ofcom must request be provided in a transparency report.
Question put, That the amendment be made.

Division 30

Ayes: 4


Labour: 3
Scottish National Party: 1

Noes: 7


Conservative: 7

Schedule 8 agreed to.
Clause 65 ordered to stand part of the Bill.
Clause 66
“Pornographic content”, “provider content”, “regulated provider pornographic content”
Question proposed, That the clause stand part of the Bill.
The Chair

With this it will be convenient to consider the following:

Clause 67 stand part.

That schedule 9 be the Ninth schedule to the Bill.

Alex Davies-Jones

Labour welcomes the important changes that have been made to the Bill since its original draft, which applied only to user-generated pornographic content. The Bill now includes all pornography, and that is a positive step forward. It is also welcome that the provisions do not apply only to commercial pornography. We all know that some of the biggest commercial pornography sites could have switched their business models had these important changes not been made. As we have reiterated, our priority in regulating pornographic content is to keep children safe. The question that we should continue to ask each other is simple: “Is this content likely to harm children?”

We have a few concerns—which were also outlined in evidence by Professor Clare McGlynn—about the definition of “provider pornographic content” in clause 66(3). It is defined as

“pornographic content that is published or displayed on the service by the provider of the service or by a person acting on behalf of the provider (including pornographic content published or displayed…by means of software or an automated tool or algorithm”.

That definition separates provider porn from content that is uploaded or shared by users, which is outlined in clause 49(2). That separation is emphasised in clause 66(6), which states:

“Pornographic content that is user-generated content in relation to an internet service is not to be regarded as provider pornographic content in relation to that service.”

However, as Professor McGlynn emphasised, it is unclear exactly what will be covered by the words

“acting on behalf of the provider”.

I would appreciate some clarity from the Minister on that point. Could he give some clear examples?

16:45
Labour supports clause 67, which establishes important definitions related to regulated provider pornographic content. It is important to have that clarity in the Bill so that those duties are crystal clear for those who will be responsible for implementing them.
Schedule 9 is an important schedule, which outlines the providers of internet services that are not subject to the duties on regulated provider pornographic content. Those are important exemptions that Labour welcomes being clarified in the Bill. For that reason, we have tabled no amendments at present.
Kirsty Blackman

I associate myself with the comments made by the hon. Member for Pontypridd and apologise on behalf of my hon. Friend the Member for Ochil and South Perthshire, who is currently in the Chamber dealing with the Channel 4 privatisation. I am sure that, given his position on the Joint Committee, he would have liked to comment on the clause and would have welcomed its inclusion in the Bill, but, unfortunately, he cannot currently do so.

Chris Philp

It is a great shame that the hon. Member for Ochil and South Perthshire is occupied in the main Chamber, because I could have pointed to this change as one of the examples of the Government listening to the Joint Committee, on which he and many others served. However, I hope that the hon. Member for Aberdeen North will communicate my observation to him, which I am sure he will appreciate.

In seriousness, this is an example of the Government moving the Bill on in response to widespread parliamentary and public commentary. It is right that we extend the duties to cover commercial pornographic content as well as the user-to-user pornography covered previously. I thank the Opposition parties for their support for the inclusion of those measures.

Dean Russell

As a member of the Joint Committee, on which I worked with the hon. Member for Ochil and South Perthshire, I thank the Minister for including this clause on a point that was debated at length by the Joint Committee. Its inclusion is crucial to organisations in my constituency such as Dignify—a charity that works to raise awareness and campaign on this important point, to protect children but also wider society. As this is one of the 66 recommendations that the Minister took forward in the Bill, I would like to thank him; it is very welcome, and I think that it will make a huge difference to children and to society.

Chris Philp

I thank my hon. Friend for his intervention and for his work on the Joint Committee, which has had a huge impact, as we have seen. I hope that colleagues will join me in thanking the members of the Joint Committee for their work.

My final point on this important clause is in response to a question that the shadow Minister raised about clause 66(3), which makes reference to

“a person acting on behalf of the provider”.

That is just to ensure that the clause is comprehensively drafted without any loopholes. If the provider used an agent or engaged some third party to disseminate content on their behalf, rather than doing so directly, that would be covered too. We just wanted to ensure that there was absolutely no loophole—no chink of light—in the way that the clause was drafted. That is why that reference is there.

I am delighted that these clauses seem to command such widespread support. It therefore gives me great pleasure to commend them to the Committee.

Question put and agreed to.

Clause 66 accordingly ordered to stand part of the Bill.

Clause 67 ordered to stand part of the Bill.

Schedule 9 agreed to.

Clause 68

Duties about regulated provider pornographic content

Alex Davies-Jones

I beg to move amendment 114, in clause 68, page 60, line 13, at end insert—

“(2A) A duty to verify that every individual featured in regulated provider pornographic content is an adult before the content is published on the service.

(2B) A duty to verify that every individual featured in regulated provider pornographic content that is already published on the service when this Act is passed is an adult and, where that is not the case, remove such content from the service.

(2C) A duty to verify that each individual appearing in regulated provider pornographic content has given their permission for the content in which they appear to be published or made available by the internet service.

(2D) A duty to remove regulated provider pornographic content featuring an individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.”

This amendment creates a duty to verify that each individual featured in pornographic content is an adult and has agreed to the content being uploaded before it is published. It would also impose a duty to remove content if the individual withdraws consent at any time.

The Chair

With this it will be convenient to discuss the following:

Amendment 115, in clause 68, page 60, line 17, after “(2)” insert “to (2D)”.

Clause stand part.

New clause 2—Duties regarding user-generated pornographic content: regulated services

“(1) This section sets out the duties which apply to regulated services in relation to user-generated pornographic content.

(2) A duty to verify that each individual featuring in the pornographic content has given their permission for the content in which they feature to be published or made available by the service.

(3) A duty to remove pornographic content featuring a particular individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.

(4) For the meaning of ‘pornographic content’, see section 66(2).

(5) In this section, ‘user-generated pornographic content’ means any content falling within the meaning given by subsection (4) and which is also generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.

(6) For the meaning of ‘regulated service’, see section 2(4).”

Alex Davies-Jones

Clause 68 outlines the duties covering regulated provider pornographic content, and Ofcom’s guidance on those duties. Put simply, the amendments are about age verification and consent, to protect women and children who are victims of commercial sexual exploitation.

I am moving a series of targeted amendments, tabled by my right hon. Friend the Member for Kingston upon Hull North (Dame Diana Johnson), which I hope that all hon. Members will be able to support because this is an issue that goes beyond party lines. This is about children who have been sexually abused, women who have been raped, and trafficking victims who have been exploited, who have all suffered the horror of filmed footage of their abuse being published on some of the world’s biggest pornography websites. This is about basic humanity.

Currently, leading pornography websites allow members of the public to upload pornographic videos without verifying that everyone in the film is an adult, that they gave their permission for it to be uploaded to a pornography website, or even that they know the film exists. It is sadly not surprising that because of the absence of even the most basic safety measures, hugely popular and profitable pornography websites have been found hosting and profiting from filmed footage of rape, sex trafficking, image-based sexual abuse and child sexual abuse. This atrocious practice is ongoing and well documented.

In 2019, PayPal stopped processing payments for Pornhub—one of the most popular pornography websites in the world—after an investigation by The Sunday Times revealed that the site contained child abuse videos and other illegal content. That included an account on the site dedicated to posting so-called creepshots of UK schoolgirls. In 2020, The New York Times documented the presence of child abuse videos on Pornhub, prompting Mastercard, Visa and Discover to block the use of their cards for purchases on the site.

New York Times reporter Nicholas Kristof wrote of Pornhub:

“Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.”

That particular pornography website is now subject to multiple lawsuits launched against its parent company, MindGeek, by victims whose abuse was published on the site. Plaintiffs include victims of image-based sexual abuse in the UK, such as Crystal Palace footballer Leigh Nicol. Her phone was hacked, and private content was uploaded to Pornhub without her knowledge. She bravely and generously shared her experience in an interview for Sky Sports News, saying:

“The damage is done for me so this is about the next generation. I feel like prevention is better than someone having to react to this. I cannot change it alone but if I can raise awareness to stop it happening to others then that is what I want to do… The more that you dig into this, the more traumatising it is because there are 14-year-old kids on these websites and they don’t even know about it. The fact that you can publish videos that have neither party’s consent is something that has to be changed by law, for sure.”

I agree. It is grotesque that pornography website operators do not even bother to verify that everyone featured in films on their sites is an adult or even gave permission for the film to be uploaded. That cannot be allowed to continue.

These amendments, which I hope will receive the cross-party backing that they strongly deserve, would stop pornography websites publishing and profiting from videos of rape and child sexual abuse by requiring them to implement the most basic of prevention measures.

Kirsty Blackman

I support the hon. Member’s amendments. The cases that she mentions hammer home the need for women and girls to be mentioned in the Bill. I do not understand how the Government can justify not doing so when she is absolutely laying out the case for doing so.

Alex Davies-Jones

I agree with the hon. Member and welcome her intervention. We will be discussing these issues time and again during our proceedings. What is becoming even more apparent is the need to include women and girls in the Bill, call out violence against women and girls online for what it is, and demand that the Government go further to protect women and girls. This is yet another example of where action needs to happen. I hope the Minister is hearing our pleas and that this will happen at some point as we make progress through the Bill.

More needs to be done to tackle this problem. Pornography websites need to verify that every individual in pornographic videos published on their site is an adult and gave their permission for the video to be published, and enable individuals to withdraw their consent for pornography of them to remain on the site. These are rock-bottom safety measures for preventing the most appalling abuses on pornography websites.

Kim Leadbeater

I add my voice to the arguments made by my hon. Friend and the hon. Member for Aberdeen North. Violence against women and girls is a fundamental issue that the Bill needs to address. We keep coming back to that, and I too hope that the Minister hears that point. My hon. Friend has described some of the most horrific harms. Surely, this is one area where we have to be really clear. If we are to achieve anything with the Bill, this is an area that we should be working on.

Alex Davies-Jones

I wholeheartedly agree with my hon. Friend. As I have said, the amendments would put in place rock-bottom safety measures that could prevent the most appalling abuses on pornography websites, and it is a scandal that, hitherto, they have not been implemented. We have the opportunity to change that today by voting for the amendments and ensuring that these measures are in place. I urge the Minister and Conservative Members to do the right thing.

Dame Maria Miller

I thank the hon. Lady for giving way. I can understand the intent behind what she is saying and I have a huge amount of sympathy for it, but we know as a matter of fact that many of the images that are lodged on these sorts of websites were never intended to be pornographic in the first place. They may be intimate images taken by individuals of themselves—or, indeed, of somebody else—that are then posted as pornographic images. I am slightly concerned that an image such as that may not be caught by the hon. Lady’s amendments. Would she join me in urging the Government to bring forward the Law Commission’s recommendations on the taking, making and sharing of intimate images online without consent, which are far broader? They would probably do what she wants to do but not run into the problem of whether an image was meant to be pornographic in the first place.

Alex Davies-Jones

I am grateful to the right hon. Member for her intervention. She knows that I have the utmost respect for all that she has tried to achieve in this area in the House along with my right hon. Friend the Member for Kingston upon Hull North.

We feel these amendments would address the specific issue of imagery or video content for which consent has not been obtained. Many of these people do not even know that the content has been taken in the first place, and it is then uploaded to these websites. It would be the website’s duty to verify that consent had been obtained and that the people featured in the video were adults. That is why we urge hon. Members to back the amendments.

Chris Philp

The shadow Minister has laid out compellingly how awful the displaying of images of children on pornography websites and the displaying of images where the consent of the person has not been obtained are. Let me take each of those in turn, because my answers will be a bit different in the two cases.

First, all material that contains the sexual abuse of children or features children at all—any pornographic content featuring children is, by definition, sexual abuse—is already criminalised through the criminal law. Measures such as the Protection of Children Act 1978, the Criminal Justice Act 1988 and the Coroners and Justice Act 2009 provide a range of criminal offences that include the taking, making, circulating, possessing with a view to distributing, or otherwise possessing indecent photos or prohibited images of children. As we would expect, everything that the hon. Lady described is already criminalised under existing law.

This part of the Bill—part 5—covers publishers and not the user-to-user stuff we talked about previously. Because they are producing and publishing the material themselves, publishers of such material are covered by the existing criminal law. What they are doing is already illegal. If they are engaged in that activity, they should—and, I hope, will—be prosecuted for doing it.

The new clause and the amendments essentially seek to duplicate what is already set out very clearly in criminal law. While their intentions are completely correct, I do not think it is helpful to have duplicative provisions that try to do the same thing in a different piece of legislation. We have well established and effective criminal laws in these areas.

In relation to the separate question of people whose images are displayed without their consent, which is a topic that my right hon. Friend the Member for Basingstoke has raised a few times, there are existing criminal offences that are designed to tackle that, including the recent revenge pornography offences in particular, as well as the criminalisation of voyeurism, harassment, blackmail and coercive or controlling behaviour. There is then the additional question of intimate image abuse, where intimate images are produced or obtained without the consent of the subject, and are then disseminated.

Dame Maria Miller

The Minister must be careful about using the revenge pornography legislation as an example of protection. He will know well that that legislation requires relationships between the people involved. It is a very specific piece of legislation. It does not cover the sorts of examples that the shadow Minister was giving.

16:59
Chris Philp

I think it would cover some of them. If, for example, someone in a relationship had a video taken that was then made available on a commercial pornography site, that would clearly be in scope. I am not saying that the revenge pornography legislation covers all examples, but it covers some of them. We have discussed already that clause 150 will criminalise a great deal of the content referred to here if the intention of that content or the communication concerned is to cause harm—meaning

“psychological harm amounting to at least serious distress”—

to the subject. That will capture a lot of this as well.

My right hon. Friend the Member for Basingstoke has made a point about needing to remove the intent requirement. Any sharing of an intimate image without consent should be criminalised. As we have discussed previously, that is being moved forward under the auspices of the Ministry of Justice in connection with the Law Commission’s proposed offence. That work is in flight, and I would anticipate it delivering legislative results. I think that is the remaining piece of the puzzle. With the addition of that piece of legislation, I think we will cover the totality of possible harms in relation to images of people whose consent has not been given.

In relation to material featuring children, the legislative picture is complete already; it is already criminal. We do not need to do anything further to add any criminal offences; it is already illegal, as it should be. In relation to non-consensual images, the picture is largely complete. With the addition of the intimate image abuse offence that my right hon. Friend the Member for Basingstoke has been rightly campaigning for, the picture will be complete. Given that that is already in process via the Law Commission, while I again agree with what the Opposition are trying to do here, we have a process in hand that will sort this out. I hope that that makes the Government’s position on the amendments and the new clause clear.

Clause 68 is extremely important. It imposes a legally binding duty to make sure that children are not normally able to encounter pornographic content in a commercial context, and it makes it clear that one of the ways that can be achieved is by using age verification. If Ofcom, in its codes of practice, directs companies to use age verification, or if there is no other effective means of preventing children from seeing pornographic content, the clause makes it clear that age verification is expressly authorised by Parliament in primary legislation. There will be no basis upon which a porn provider could try to legally challenge Ofcom, because it is there in black and white in the Bill. It is clearly Parliament’s intention that hard-edged age verification will be legal. By putting that measure in the Bill as an example of the way that the duty can be met, we immunise the measure from legal challenge should Ofcom decide it is the only way of delivering the duty. I make that point explicitly for the avoidance of doubt, so that if this point is ever litigated, Parliament’s intention is clear.

Alex Davies-Jones

I welcome the Minister’s comments and commitment to look at this further, and the Law Commission’s review being taken forward. With that in mind, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 68 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned—(Steve Double.)

17:04
Adjourned till Thursday 16 June at half-past 11 o’clock.
Written evidence reported to the House
OSB69 Full Fact (supplementary submission)
OSB70 Care Quality Commission (CQC)
OSB71 Oxford University's Child-Centred AI initiative, Department of Computer Science
OSB72 British Retail Consortium (BRC)
OSB73 Claudine Tinsman, doctoral candidate in Cyber Security at the University of Oxford
OSB74 British Board of Film Classification (BBFC)
OSB75 Advertising Standards Authority
OSB76 YoungMinds

Online Safety Bill (Eleventh sitting)

Committee stage
Thursday 16th June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 16 June 2022
The Committee consisted of the following Members:
Chairs: † Sir Roger Gale, Christina Rees
Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Thursday 16 June 2022
(Morning)
[Sir Roger Gale in the Chair]
Online Safety Bill
11:30
The Chair

Good morning, ladies and gentlemen. I am reminded, entirely properly, that today is the anniversary of Jo Cox’s death, which is why Kim Leadbeater, the hon. Member for Batley and Spen, is not with us this morning. I am sure that our thoughts and good wishes are with her and her family.

Hon. Members

Hear, hear.

Clause 69

OFCOM’s guidance about duties set out in section 68

The Chair

We start with amendment 127 to clause 69. It is up to the Committee, but I am minded to allow this debate to go slightly broader and take the stand part debate with it.

Alex Davies-Jones (Pontypridd) (Lab)

I beg to move amendment 127, in clause 69, page 60, line 26, after “must” insert—

“within six months of this Act being passed”.

As ever, it is a pleasure to serve under your chairship, Sir Roger. The thoughts and prayers of us all are with my hon. Friend the Member for Batley and Spen and all her friends and family.

Labour welcomes the clause, which sets out Ofcom’s duties to provide guidance to providers of internet services. It is apparent, however, that we cannot afford to kick the can down the road and delay implementation of the Bill any further than necessary. With that in mind, I urge the Minister to support the amendment, which would give Ofcom an appropriate amount of time to produce this important guidance.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure, once again, to serve under your august chairmanship, Sir Roger. I associate the Government with the remarks that you and the shadow Minister made, marking the anniversary of Jo Cox’s appalling murder, which shook the entire House when it happened. She will never be forgotten.

The Government are sympathetic to the intent of the amendment, which seeks to ensure that guidance for providers on protecting children from online pornography is put in place as quickly as possible. We of course sympathise with that objective, but we feel that the Secretary of State must retain the power to determine when to bring in the provisions of part 5, including the requirement under the clause for Ofcom to produce guidance, to ensure that implementation of the framework comprehensively and effectively regulates all forms of pornography online. That is the intention of the whole House and of this Committee.

Ofcom needs appropriate time and flexibility to get the guidance exactly right. We do not want to rush it and consequently see loopholes that pornography providers or others might seek to exploit. As discussed, we will be taking a phased approach to bringing duties under the Bill into effect. We expect the most serious harms to be prioritised as quickly as possible, with the duties on illegal content focused on most urgently. We have already accelerated the timescales for the most serious harms by putting priority illegal content in the various schedules to the Bill.

Ofcom is working hard to prepare for implementation. We are all looking forward to the implementation road map, which it has committed to produce before the summer. For those reasons, I respectfully resist the amendment.

Question put, That the amendment be made.

Division 31

Ayes: 3


Labour: 2
Scottish National Party: 1

Noes: 8


Conservative: 8

Clause 69 ordered to stand part of the Bill.
Clause 70
Duty to notify OFCOM
Question proposed, That the clause stand part of the Bill.
The Chair

With this it will be convenient to discuss clauses 71 to 76 stand part.

Barbara Keeley (Worsley and Eccles South) (Lab)

It is a pleasure to serve with you in the Chair again, Sir Roger. I add my tribute to our former colleague, Jo Cox, on this sad anniversary. Our thoughts are with her family today, including our colleague and my hon. Friend, the Member for Batley and Spen.

We welcome the “polluter pays” principle on which this and the following clauses are founded. Clause 70 establishes a duty for providers to notify Ofcom if their revenue is at or above the specified threshold designated by Ofcom and approved by the Secretary of State. It also creates duties on providers to provide timely notice and evidence of meeting the threshold. The Opposition do not oppose those duties. However, I would be grateful if the Minister could clarify what might lead to a provider or groups of providers being exempt from paying the fee. Subsection (6) establishes that

“OFCOM may provide that particular descriptions of providers of regulated services are exempt”,

subject to the Secretary of State’s approval. Our question is what kinds of services the Minister has in mind for that exemption.

Turning to clauses 71 to 76, as I mentioned, it is appropriate that the cost to Ofcom of exercising its online safety functions is paid through an annual industry fee, charged to the biggest companies with the highest revenues, and that smaller companies are exempt but still regulated. It is also welcome that under clause 71, Ofcom can make reference to factors beyond the provider’s qualifying worldwide revenue when determining the fee that a company must pay. Acknowledging the importance of other factors when computing that fee can allow for a greater burden of the fees to fall on companies whose activities may disproportionately increase Ofcom’s work on improving safety.

My hon. Friend the Member for Pontypridd has already raised our concerns about the level of funding needed for Ofcom to carry out its duties under the Bill. She asked about the creation of a new role: that of an adviser on funding for the online safety regulator. The impact assessment states that the industry fee will need to average around £35 million a year for the next 10 years to pay for operating expenditure. Last week, the Minister referred to a figure of around £88 million that has been announced to cover the first two years of the regime while the industry levy is implemented, and the same figure was used on Second Reading by the Secretary of State. Last October’s autumn Budget and spending review refers on page 115 to

“over £110 million over the SR21 period for the government’s new online safety regime through the passage and implementation of the Online Safety Bill, delivering on the government’s commitment to make the UK the safest place to be online.”

There is no reference to the £88 million figure or to Ofcom in the spending review document. Could the Minister tell us a bit more about that £88 million and the rest of the £110 million announced in the spending review, as it is relevant to how Ofcom is going to be resourced and the industry levy that is introduced by these clauses?

The Opposition feel it is critical that when the Bill comes into force, there is no gap in funding that would prevent Ofcom from carrying out its duties. The most obvious problem is that the level of funding set out in the spending review was determined when the Bill was in draft form, before more harms were brought into scope. The Department for Digital, Culture, Media and Sport has also confirmed that the figure of £34.9 million a year that is needed for Ofcom to carry out its online safety duties was based on the draft Bill.

We welcome many of the additional duties included in the Bill since its drafting, such as on fraudulent advertising, but does the Minister think the same level of funding will be adequate as when the calculation was made, when the Bill was in draft form? Will he reconsider the calculations his Department has made of the level of funding that Ofcom will need for this regime to be effective in the light of the increased workload that this latest version of the Bill introduces?

In March 2021, Ofcom put out a press release stating that 150 people would be employed in the new digital and technology hub in Manchester, but that that number would be reached in 2025. Therefore, as well as the level of resource being based on an old version of the Bill, the timeframe reveals a gap of three years until all the staff are in place. Does the Minister believe that Ofcom will have everything that is needed from the start, and in subsequent years as the levy gets up and going, in order to carry out its duties?

Of course, this will depend on how long the levy might take to be put in place. My understanding of the timeframe is that, first, the Secretary of State must issue guidance to Ofcom about the principles to be included in the statement of principles that Ofcom will use to determine the fees payable under clause 71. Ofcom must consult those affected by the threshold amount to inform the final figure it recommends to the Secretary of State, and must produce a statement about what amounts comprise a provider’s qualifying worldwide revenue and the qualifying period. That figure and Ofcom’s guidance must be agreed by the Secretary of State and laid before Parliament. Based on those checks and processes, how quickly does the Minister envisage the levy coming into force?

The Minister said last week that Ofcom is resourced for this work until 2023-24. Will the levy be in place by then to fund Ofcom’s safety work into 2024-25? If not, can the Minister confirm that the Government will cover any gaps in funding? I am sure he will agree, as we all do, that the duties in the Bill must be implemented as quickly as possible, but the necessary funding must also be in place so that Ofcom as a regulator can enforce the safety duty.

Dame Maria Miller (Basingstoke) (Con)

I have just a short comment on these clauses. I very much applaud the Government’s approach to the funding of Ofcom through this mechanism. Clause 75 sets out clearly that the fees payable to Ofcom under section 71 should only be

“sufficient to meet, but…not exceed the annual cost to OFCOM”.

That is important when we start to think about victim support. While clearly Ofcom will have a duty to monitor the efficacy of the mechanisms in place on social media platforms, it is not entirely clear to me from the evidence or conversations with Ofcom whether it will see it as part of its duty to ensure that other areas of victim support are financed through those fees.

It may well be that the Minister thinks it more applicable to look at this issue when we consider the clauses on fines, and I plan to come to it at that point, but it would be helpful to understand whether he sees any role for Ofcom in ensuring that there is third-party specialist support for victims of all sorts of crime, including fraud or sexual abuse.

Chris Philp

Let me start by associating myself with the remarks by the hon. Member for Worsley and Eccles South. We are in complete concurrence with the concept that the polluter should pay. Where there are regulatory costs caused by the behaviour of the social media firms that necessitates the Bill, it is absolutely right that those costs should fall on them and not on the general taxpayer. I absolutely agree with the principles that she outlined.

The hon. Lady raised a question about clause 70(6) and the potential exemption from the obligation to pay fees. That is a broadly drawn power, and the phrasing used is where

“OFCOM consider that an exemption…is appropriate”

and where the Secretary of State agrees. The Bill is not being prescriptive; it is intentionally providing flexibility in case there are circumstances where levying the fees might be inappropriate or, indeed, unjust. It is possible to conceive of an organisation that somehow exceeds the size threshold, but so manifestly does not need regulation that it would be unfair or unjust to levy the fees. For example, if a charity were, by some accident of chance, to fall into scope, it might qualify. But we expect social media firms to pay these bills, and I would not by any means expect the exemption to be applied routinely or regularly.

On the £88 million and the £110 million that have been referenced, the latter amount is to cover the three-year spending review period, which is the current financial year—2022-23—2023-24 and 2024-25. Of that £110 million, £88 million is allocated to Ofcom in the first two financial years; the remainder is allocated to DCMS for its work over the three-year period of the spending review. The £88 million for Ofcom runs out at the end of 2023-24.

The hon. Lady then asked whether the statutory fees in these clauses will kick in when the £88 million runs out—whether they will be available in time. The answer is yes. We expect and intend that the fees we are debating will become effective in 2024-25, so they will pick up where the £88 million finishes.

Ofcom will set the fees at a level that recoups its costs, so if the Bill becomes larger in scope, for example through amendments in the Commons or the Lords—not that I wish to encourage amendments—and the duties on Ofcom expand, we would expect the fees to be increased commensurately to cover any increased cost that our legislation imposes.

11:45
Barbara Keeley

Before the Minister gets past this point—I think he has reached the point of my question—the fees do not kick in for two years. The figure is £88 million, but the point I was making is that the scope of the Bill has already increased. I asked about this during the evidence session with Ofcom. Fraudulent advertising was not included before, so there are already additional powers for Ofcom that need to be funded. I was questioning whether the original estimate will be enough for those two years.

Chris Philp

I assume that the hon. Lady is asking about the £88 million.

Barbara Keeley

indicated assent.

Chris Philp

That covers the preparatory work rather than the actual enforcement work that will follow. For the time being, we believe that it is enough, but of course we always maintain an active dialogue with Ofcom.

Finally, there was a question from my right hon. Friend the Member for Basingstoke, who asked how victims will be supported and compensated. As she said, Ofcom will always pay attention to victims in its work, but we should make it clear that the fees we are debating in these clauses are designed to cover only Ofcom’s costs and not those of third parties. I think the costs of victim support and measures to support victims are funded separately via the Ministry of Justice, which leads in this area. I believe that a victims Bill is being prepared that will significantly enhance the protections and rights that victims have—something that I am sure all of us will support.

Question put and agreed to.

Clause 70 accordingly ordered to stand part of the Bill.

Clauses 71 to 76 ordered to stand part of the Bill.

Clause 77

General duties of OFCOM under section 3 of the Communications Act

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss clauses 78 and 79 stand part.

Alex Davies-Jones

We welcome clause 77, which is an important clause that seeks to amend Ofcom’s existing general duties in the Communications Act 2003. Given the prevalence of illegal harms online, as we discussed earlier in proceedings, it is essential that the Communications Act is amended to reflect the important role that Ofcom will have as a new regulator.

As the Minister knows, and as we will discuss shortly when we reach amendments to clause 80, we have significant concerns about the Government’s approach to size versus harm when categorising service providers. Clause 77(4) amends section 3 of the Communications Act by inserting new subsection (4A). New paragraph (4A)(d) outlines measures that are proportionate to

“the size or capacity of the provider”,

and to

“the level of risk of harm presented by the service in question, and the severity of the potential harm”.

We know that harm, and the potential of accessing harmful content, is what is most important in the Bill—it says it in the name—so I am keen for my thoughts on the entire categorisation process to be known early on, although I will continue to press this issue with the Minister when we debate the appropriate clause.

Labour also supports clause 78. It is vital that Ofcom will have a duty to publish its proposals on strategic priorities within a set time period, and ensuring that that statement is published is a positive step towards transparency, which has been so crucially missing for far too long.

Similarly, Labour supports clause 79, which contains a duty to carry out impact assessments. That is vital, and it must be conveyed in the all-important Communications Act.

Chris Philp

As the shadow Minister has set out, these clauses ensure that Ofcom’s duties under the Communications Act 2003 are updated to reflect the new duties that we are asking it to undertake—I think that is fairly clear from the clauses. On the shadow Minister’s comment about size and risk, I note her views and look forward to debating that more fully in a moment.

Question put and agreed to.

Clause 77 accordingly ordered to stand part of the Bill.

Clauses 78 and 79 ordered to stand part of the Bill.

Clause 80

Meaning of threshold conditions etc

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss the following:

Amendment 80, in schedule 10, page 192, line 19, at end insert—

“(c) the assessed risk of harm arising from that part of the service.”

This amendment, together with Amendments 81 and 82, widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.

Amendment 81, in schedule 10, page 192, line 39, after “functionality” insert—

“and at least one specified condition about the assessed risk of harm”

This amendment is linked to Amendment 80.

Amendment 82, in schedule 10, page 192, line 41, at end insert—

‘(4A) At least one specified condition about the assessed risk of harm must provide for a service assessed as posing a very high risk of harm to its users to meet the Category 1 threshold.”

This amendment is linked to Amendment 80; it widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.

That schedule 10 be the Tenth schedule to the Bill.

Clause 81 stand part.

Clause 82 stand part.

Kirsty Blackman (Aberdeen North) (SNP)

Thank you for your efforts in chairing our meeting today, Sir Roger. My thoughts are with the hon. Member for Batley and Spen and her entire family on the anniversary of Jo Cox’s murder; the SNP would like to echo that sentiment.

I want to talk about my amendment, and I start with a quote from the Minister on Second Reading:

“A number of Members…have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered.”—[Official Report, 19 April 2022; Vol. 712, c. 133.]

I appreciate that the Minister may still be thinking about that. He might accept all of our amendments; that is entirely possible, although I am not sure there is any precedent for it.

Given how strong I felt that the Minister was on the issue on Second Reading, I am deeply disappointed that there are no Government amendments to this section of the Bill. I am disappointed because of the massive risk of harm caused by some very small platforms—it is not a massive number—where extreme behaviour and radicalisation are allowed to thrive. It is not just about the harm to those individuals who spend time on those platforms and who are radicalised, presented with misinformation and encouraged to go down rabbit holes and become more and more extreme in their views. It is also about the risk of harm to other people as a result of the behaviour inspired in those individuals. We are talking about Jo Cox today; she is in our memories and thoughts. Those small platforms are the ones that are most likely to encourage individuals towards extremely violent acts.

If the Bill is to fulfil its stated aims and take the action we all want to see to prevent the creation of those most heinous, awful crimes, it needs to be much stronger on small, very high-risk platforms. I will make no apologies for that. I do not care if those platforms have small amounts of profits. They are encouraging and allowing the worst behaviours to thrive on their platforms. They should be held to a higher level of accountability. It is not too much to ask to class them as category 1 platforms. It is not too much to ask them to comply with a higher level of risk assessment requirements and a higher level of oversight from Ofcom. It is not too much to ask because of the massive risk of harm they pose and the massive actual harm that they create.

Those platforms should be punished for that. It is one thing to punish and criminalise the behaviour of users on those platforms—individual users who create and propagate illegal content or radicalise other users—but the Bill does not go far enough in holding those platforms to account for allowing that to take place. They know that it is happening. Those platforms are set up as an alternative place—a place where people are allowed to be far more radical than they are on Twitter, YouTube, Twitch or Discord. None of those larger platforms has much moderation, but the smaller platforms encourage such behaviour. Links are put on other sites pointing to those platforms. For example, when people read vaccine misinformation, there are links posted to more radical, smaller platforms. I exclude Discord because, given its number of users, I think it would be included in one of the larger-platform categories anyway. It is not that there is no radical behaviour on Discord—there is—but I think the size of its membership excludes it, in my head certainly, from the category of the very smallest platforms that pose the highest risk.

We all know from our inboxes the number of people who contact us saying that 5G is the Government trying to take over their brains, or that the entire world is run by Jewish lizard people. We get those emails on a regular basis, and those theories are propagated on the smallest platforms. Fair enough—some people may not take any action as a result of the radicalisation they have experienced and the very extreme views they have developed. But some people will take action, and that action may be enough to harm their friends or family; it may be enough to exclude them and drag them away from the society or community of which they were previously members; or it might, in really tragic cases, be far more extreme. It might lead people intentionally to cause physical or mental harm to others as a result of the beliefs that have been created and fostered in them on those platforms.

That is why we have tabled the amendments. This is the area in which the Government have most significantly failed in writing this Bill: they have not ensured that small, very high-risk platforms are held to the highest level of accountability and punished for allowing these behaviours to thrive on their platforms. I give the Minister fair warning that unless he chooses to accept the amendments, I intend to push them to a vote. I would appreciate it if he gave assurances, but I do not believe that any reassurance he could give would compare to having such a measure in the Bill. As I say, for me the lack of this provision is the biggest failing of the entire Bill.

Alex Davies-Jones

I echo the comments of the hon. Member for Aberdeen North. I completely agree with everything she has just said and I support the amendments that she has tabled.

The Minister knows my feelings on the Government’s approach to the categorisation of services; he has heard my concerns time and time again. However, it is not just me who believes that the Government have got their approach really wrong; stakeholders far and wide believe it too. In our evidence sessions, we heard from HOPE not hate and the Antisemitism Policy Trust specifically on this issue. In its current form, the categorisation process is based on size versus harm, which is a fundamentally flawed approach.

The Government’s response to the Joint Committee that scrutinised the draft Bill makes it clear that they consider that reach is a key and proportional consideration when assigning categories and that they believe that the Secretary of State’s powers to amend those categories are sufficient to protect people. Unfortunately, that leaves many alternative platforms out of category 1, even if they host large volumes of harmful material.

The duty of care approach that essentially governs the Bill is predicated on risk assessment. If size allows platforms to dodge the entry criteria for managing high risk, there is a massive hole in the regime. Some platforms have already been mentioned, including BitChute, Gab and 4chan, which host extreme racist, misogynistic, homophobic and other extreme content that radicalises people and incites harm. And the Minister knows that.

I take this opportunity to pay tribute to my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), who has campaigned heavily on the issue since the horrendous and tragic shooting in Keyham in his constituency. One of my big concerns about the lack of focus on violence against women and girls in the Bill, which we have mentioned time and time again, is the potential for the rise of incel culture online, which is very heavily reported on these alternative platforms—these high-harm, high-risk platforms.

I will just give one example. A teacher contacted me about the Bill. She talked about the rise of misogyny and trying to educate her class on what was happening. At the end of the class, a 15-year-old boy—I appreciate that he is under 18 and is a child, so would come under a different category within the Bill, but I will still give the example—came up to her and said: “Miss, I need to chat to you. This is something I’m really concerned about. All I did was google, ‘Why can’t I get a girlfriend?’” He had been led down a rabbit hole into a warren of alternative platforms that tried to radicalise him with the most extreme content of incel culture: women are evil; women are the ones who are wrong; it is women he should hate; it is his birthright to have a girlfriend, and he should have one. That is the type of content on those platforms that young, impressionable minds are being pointed towards. They are being radicalised, and it is sadly leading to incredibly tragic circumstances, so I really want to push the Minister on the subject.

We share the overarching view of many others that this crucial risk needs to be factored into the classification process that determines which companies are placed in category 1. Otherwise, the Bill risks failing to protect adults from substantial amounts of material that causes physical and psychological harm. Schedule 10 needs to be amended to reflect that.

Amendment 80 is fundamental to the creation of a safer internet and to saving lives. Organisations such as the Mental Health Foundation have repeatedly warned us of the dangers of smaller providers that host exceptionally dangerous suicide-related content, which would clearly not be considered category 1 under the Bill as it stands. Suicide forums frequently glamorise suicide and are commonly used to share information about novel methods of suicide. There is clear evidence that when a particular suicide method becomes better known, the effect is not simply that suicidal people switch from one intended method to the novel one, but that suicides occur in people who would not otherwise have taken their own lives.
There are therefore important public health reasons to minimise the discussion of dangerous and effective suicide methods and to keep such discussion out of the public domain. Addressing the most dangerous suicide-related content is an area where the Bill could really save lives. It is therefore inexplicable that a Bill intended to increase online safety does not seek to do that.

Kirsty Blackman

I appreciate the shadow Minister’s bringing that issue up. Would she agree that, given we have constraints on broadcast and newspaper reporting on suicide for these very reasons, there can be no argument against including such a measure in the Bill?

Alex Davies-Jones

I completely agree. Those safeguards are in place for that very reason. It seems a major omission that they are not also included in the Online Safety Bill if we are truly to save lives.

The Bill’s own pre-legislative scrutiny Committee recommended that the legislation should

“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model.”

The Government replied that they

“want the Bill to be targeted and proportionate for businesses and Ofcom and do not wish to impose disproportionate burdens on small companies.”

It is, though, entirely appropriate to place a major regulatory burden on small companies that facilitate the glorification of suicide and the sharing of dangerous methods through their forums. It is behaviour that is extraordinarily damaging to public health and makes no meaningful economic or social contribution.

Amendment 82 is vital to our overarching aim of having an assessed risk of harm at the heart of the Bill. The categorisation system is not fit for purpose and will fail to capture so many of the extremely harmful services that many of us have already spoken about.

Barbara Keeley

I want to remind Committee members of what my hon. Friend is talking about. I refer to the oral evidence we heard from Danny Stone, from the Antisemitism Policy Trust, on these small, high-harm platforms. He laid out examples drawn from the work of the Community Security Trust, which released a report called “Hate Fuel”. The report looked at

“various small platforms and highlighted that, in the wake of the Pittsburgh antisemitic murders, there had been 26 threads…with explicit calls for Jews to be killed. One month prior to that, in May 2020, a man called Payton Gendron found footage of the Christchurch attacks. Among this was legal but harmful content, which included the “great replacement” theory, GIFs and memes, and he went on a two-year journey of incitement.”

A week or so before the evidence sitting,

“he targeted and killed 10 people in Buffalo. One of the things that he posted was:

‘Every Time I think maybe I shouldn’t commit to an attack I spend 5 min of /pol/’—

which is a thread on the small 4chan platform—

‘then my motivation returns’.”

Danny Stone told us that the kind of material we are seeing, which is legal but harmful, is inspiring people to go out and create real-world harm. When my hon. Friend the Member for Pontypridd asked him how to amend this approach, he said:

“You would take into account other things—for example, characteristics are already defined in the Bill, and that might be an option”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 128, Q203-204.]

I do hope that, as my hon. Friend urges, the Minister will look at all these options, because this is a very serious matter.

Alex Davies-Jones

I completely agree with my hon. Friend. The evidence we heard from Danny Stone from the Antisemitism Policy Trust clearly outlined the real-world harm that legal but harmful content causes. Such content may be legal, but it causes mass casualties and harm in the real world.

There are ways that we can rectify that in the Bill. Danny Stone set them out in his evidence and the SNP amendments, which the Labour Front Bench supports wholeheartedly, outline them too. I know the Minister wants to go further; he has said as much himself to this Committee and on the Floor of the House. I urge him to support some of the amendments, because it is clear that such changes can save lives.

Schedule 10 outlines the regulations specifying threshold conditions for categories of part 3 services. Put simply, as the Minister knows, Labour has concerns about the Government’s plans to allow thresholds for each category to be set out in secondary legislation. As we have said before, the Bill has already faced significant delays at the hands of the Government and we have real concerns that a reliance on secondary legislation further kicks the can down the road.

We also have concerns that the current system of categorisation is inflexible in so far as we have no understanding of how it will work if a service is required to shift from one category to another, and how long that would take. How exactly will that work in practice? Moreover, how long would Ofcom have to preside over such decisions?

We all know that the online space moves at speed, with new technologies and ways of functioning popping up all over, and very often. Will the Minister clarify how he expects the re-categorisation process to occur in practice? The Minister must accept that his Department has been tone deaf on this point. Rather than an arbitrary size cut-off, the regulator must use risk levels to determine which category a platform should fall into, so that harmful and dangerous content does not slip through the net.

Labour welcomes clause 81, which sets out Ofcom’s duties in establishing a register of categories of certain part 3 services. As I have repeated throughout the passage of the Bill, having a level of accountability and transparency is central to its success. However, we have slight concerns that the wording in subsection (1), which stipulates that the register be established

“as soon as reasonably practicable”,

could be ambiguous and does not give us the certainty we require. Given the huge amount of responsibility the Bill places on Ofcom, will the Minister confirm exactly what he believes the stipulation means in practice?

Finally, we welcome clause 82. It clarifies that Ofcom has a duty to maintain the all-important register. However, we share the concerns I previously outlined about the timeframe in which Ofcom will be compelled to make such changes. We urge the Minister to move as quickly as he can, to urge Ofcom to do all it can, and to make these vital changes.

Chris Philp

As we have heard, the clauses set out how different platforms will be categorised with the purpose of ensuring duties are applied in a reasonable and proportionate way that avoids over-burdening smaller businesses. However, it is worth being clear that the Online Safety Bill, as drafted, requires all in-scope services, regardless of their user size, to take action against content that is illegal and where it is necessary to protect children. It is important to re-emphasise the fact that there is no size qualification for the illegal content duties and the duties on the protection of children.

It is also important to stress that under schedule 10 as drafted there is flexibility, as the shadow Minister said, for the Secretary of State to change the various thresholds, including the size threshold, so there is an ability, if it is considered appropriate, to lower the size thresholds in such a way that more companies come into scope, if that is considered necessary.

It is worth saying in passing that we want these processes to happen quickly. Clearly, it is a matter for Ofcom to work through the operations of that, but our intention is that this will work quickly. In that spirit, in order to limit any delays to the process, Ofcom can rely on existing research, if that research is fit for purpose under schedule 10 requirements, rather than having to do new research. That will greatly assist moving quickly, because the existing research is available off the shelf immediately, whereas commissioning new research may take some time. For the benefit of Hansard and people who look at this debate for the application of the Bill, it is important to understand that that is Parliament’s intention.

I will turn to the points raised by the hon. Member for Aberdeen North and the shadow Minister about platforms that may be small and fall below the category 1 size threshold but that are none the less extremely toxic, owing to the way that they are set up, their rules and their user base. The shadow Minister mentioned several such platforms. I have had meetings with the stakeholders that she mentioned, and we heard their evidence. Other Members raised this point on Second Reading, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy). As the hon. Member for Aberdeen North said, I signalled on Second Reading that the Government are listening carefully, and our further work in that area continues at pace.

I am not sure that amendment 80 as drafted would necessarily have the intended effect. Proposed new sub-paragraph (c) to schedule 10(1) would add a risk condition, but the conditions in paragraph (1) are applied with “and”, so they must all be met. My concern is that the size threshold would still apply, and that this specific drafting of the amendment would not have the intended effect.

We will not accept the amendments as drafted, but as I said on Second Reading, we have heard the representations—the shadow Minister and the hon. Member for Aberdeen North have made theirs powerfully and eloquently—and we are looking carefully at those matters. I hope that provides some indication of the Government’s thinking. I thank the stakeholders who engaged and provided extremely valuable insight on those issues. I commend the clause to the Committee.

Kirsty Blackman

I thank the Minister for his comments. I still think that such platforms are too dangerous not to be subject to more stringent legislation than similar-sized platforms. For the Chair’s information, I would like to press amendment 80 to a vote. If it falls, I will move straight to pressing amendment 82 to a vote, missing out amendment 81. Does that make sense, Chair, and is it possible?

The Chair

No, I am afraid it is not. We will deal with the amendments in order.

Question put and agreed to.

Clause 80 accordingly ordered to stand part of the Bill.

Schedule 10

Categories of regulated user-to-user services and regulated search services: regulations

The Chair

Now we come to those amendments, which have not yet been moved. The problem is that amendment 82 is linked to amendment 80. I think I am right in saying that if amendment 80 falls, amendment 82 will fall. Does the hon. Lady want to move just amendment 82?

Kirsty Blackman

Thank you for your advice, Chair. I will move amendment 80. Should it be accepted, I would be keen to move the other two.

Amendment proposed: 80, in schedule 10, page 192, line 19, at end insert—

“(c) the assessed risk of harm arising from that part of the service.”—(Kirsty Blackman.)

This amendment, together with Amendments 81 and 82, widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.

Division 32

Ayes: 3


Labour: 2
Scottish National Party: 1

Noes: 8


Conservative: 8

The Chair

As I indicated, that means that amendments 81 and 82 now fall. Just for the hon. Lady’s information, ordinarily, where an amendment has been moved in Committee, it would not be selected to be moved on the Floor of the House on Report. However, the Minister has indicated that he is minded to look at this again. If, of course, the Government choose to move an amendment on Report, that then would be put to the House.

12:15
Kirsty Blackman

On a point of order, Sir Roger. My understanding was that it was previously the case that amendments could not be moved again on Report, but that modern practice in the House in the past few years has been that amendments that have been pushed to a vote in Committee are allowed to be resubmitted on Report, whether or not the Minister has indicated that this is the case.

The Chair

The hon. Lady is correct. I am advised that, actually, the ruling has changed, so it can be. We will see—well, I won’t, but the hon. Lady will see what the Minister does on Report.

Schedule 10 agreed to.  

Clauses 81 and 82 ordered to stand part of the Bill.  

Clause 83

OFCOM’s register of risks, and risk profiles, of Part 3 services

Alex Davies-Jones

I beg to move amendment 34, in clause 83, page 72, line 12, at end insert—

“(d) the risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”

This amendment requires Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content.

Labour welcomes clause 83, which places a duty on Ofcom to carry out risk assessments to identify and assess a range of potential risks of harm presented by part 3 services. However, we are concerned about subsection (9), which says:

“OFCOM must from time to time review and revise the risk assessments and risk profiles so as to keep them up to date”

That seems a fairly woolly concept even for the Minister to try to defend, so I would be grateful if he clarified exactly what demands will be placed on Ofcom to review those risk assessments and risk profiles. He will know that those are absolutely central to the Bill, so some clarification is required here. Despite that, Labour agrees that it will be a significant advantage for Ofcom to oversee the risk of harm presented by the regulated services.

However, harm should not be limited to those in the UK. Amendment 34 would therefore require Ofcom’s risk assessment to consider risks to adults and children throughout the production, publication and dissemination of illegal content. I have already spoken on this issue, in the debate on amendment 25 to clause 8, so I will keep my comments brief. As the Minister knows, online harms are global in nature, and amendment 34 seeks to ensure that the risk of harm presented by regulated services is not just limited to those in the UK. As we have mentioned previously, research shows us that there is some very damaging, often sexually violent, content being streamed abroad. Labour fears that the current provisions in the legislation will not be far-reaching enough to capture the true essence of the risk of harm that people may face when online.

Labour supports the intentions of clause 84, which outlines that Ofcom must produce guidance to assist providers in complying with their duties to carry out illegal content risk assessments

“As soon as reasonably practicable”.

Of course, the Minister will not be surprised that Labour has slight reservations about the timing around those important duties, so I would appreciate an update from the Minister on the conversations he has had with Ofcom about the practicalities of its duties.

The Chair

I did not indicate at the start of the debate that I would take the clause stand part and clause 84 stand part together, but I am perfectly relaxed about it and very happy to do so, as the hon. Lady has spoken to them. If any other colleague wishes to speak to them, that is fine by me.

Chris Philp

Perhaps I might start with amendment 34, which the shadow Minister just spoke to. We agree that it is very important to consider the risks posed to victims who are outside of the territory of the United Kingdom. However, for the reasons I will elaborate on, we believe that the Bill as drafted achieves that objective already.

First, just to remind the Committee, the Bill already requires companies to put in place proportionate systems and processes to prevent UK users from encountering illegal content. Critically, that includes where a UK user creates illegal content via an in-scope platform, but where the victim is overseas. Let me go further and remind the Committee that clause 9 requires platforms to prevent UK users from encountering illegal content no matter where that content is produced or published. The word “encounter” is very broadly defined in clause 189 as meaning

“read, view, hear or otherwise experience content”.

As such, it will cover a user’s contact with any content that they themselves generate or upload to a service.

Critically, there is another clause, which we have discussed previously, that is very important in the context of overseas victims, which the shadow Minister quite rightly raises. The Committee will recall that subsection (9) of clause 52, which is the important clause that defines illegal content, makes it clear that that content does not have to be generated, uploaded or accessed in the UK, or indeed to have anything to do with the UK, in order to count as illegal content towards which the company has duties, including risk assessment duties. Even if the illegal act—for example, sexually abusing a child—happens in some other country, not the UK, it still counts as illegal content under the definitions in the Bill because of clause 52(9). It is very important that those duties will apply to that circumstance. To be completely clear, if an offender in the UK uses an in-scope platform to produce content where the victim is overseas, or to share abuse produced overseas with other UK users, the platform must tackle that, both through its risk assessment duties and its other duties.

As such, the entirely proper intent behind amendment 34 is already covered by the Bill as drafted. The shadow Minister, the hon. Member for Pontypridd, has already referred to the underlying purpose of clauses 83 and 84. As we discussed before, the risk assessments are central to the duties in the Bill. It is essential that Ofcom has a proper picture of the risks that will inform its various regulatory activities, which is why these clauses are so important. Clause 84 requires Ofcom to produce guidance to services to make sure they are carrying out those risk assessments properly, because it is no good having a token risk assessment or one that does not properly deal with the risks. The guidance published under clause 84 will ensure that happens. As such, I will respectfully resist amendment 34, on the grounds that its contents are already covered by the Bill.

Alex Davies-Jones

I am grateful for the Minister’s clarification. Given his assurances that its contents are already covered by the Bill, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 83 ordered to stand part of the Bill.

Clause 84 ordered to stand part of the Bill.

Clause 85

Power to require information

The Chair

With this it will be convenient to discuss the following:

Clauses 86 to 91 stand part.

Schedule 11 stand part.

Alex Davies-Jones

Labour supports clause 85, which gives Ofcom the power to require the provision of any information it requires in order to discharge its online safety functions. We strongly believe that, in the interests of transparency, Ofcom as the regulator must have sufficient power to require a service provider to share its risk assessment in order to understand how that service provider is identifying risks. As the Minister knows, we feel that that transparency should go further, and that the risk assessments should be made public. However, we have already had that argument during a previous debate, so I will not repeat those arguments—on this occasion, at least.

Labour also supports clause 86, and we particularly welcome the clarification that Ofcom may require the provision of information in any form. If we are to truly give Ofcom the power to regulate and, where necessary, investigate service providers, we must ensure that it has sufficient legislative tools to rely on.

The Bill gives some strong powers to Ofcom. We support the requirement in clause 87 to name a senior manager, but again, we feel those provisions should go further. Both users and Ofcom must have access to the full range of tools they need to hold the tech giants to account. As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator, and even then, those measures might not come in until two years after the Bill is in place. Surely the top bosses at social media companies should be held criminally liable for systemic and repeated failures to ensure online safety as soon as the Bill comes into force, so can the Minister explain the reasons for the delay?

The Minister will be happy to hear that Labour supports clause 88. It is important to have an outline on the face of the Bill of the circumstances in which Ofcom can require a report from a skilled person. It is also important that Ofcom has the power to appoint, or give notice to a provider requiring them to appoint, a skilled person, as Labour fears that without those provisions in subsections (3) and (4), the ambiguity around defining a so-called skilled person could be detrimental. We therefore support the clause, and have not sought to amend it at this stage.

Again, Labour supports all the intentions of clause 89 in the interests of online safety more widely. Of course, Ofcom must have the power to force a company to co-operate with an investigation.

Again, we support the need for clause 90, which gives Ofcom the power to require an individual to attend an interview. That is particularly important in the instances outlined in subsection (1), whereby Ofcom is carrying out an investigation into the failure or possible failure of a provider of a regulated service to comply with a relevant requirement. Labour has repeatedly called for such personal responsibility, so we are pleased that the Government are ensuring that the Bill includes sufficient powers for Ofcom to allow proper scrutiny.

Labour supports clause 91 and schedule 11, which outlines in detail Ofcom’s powers of entry, inspection and audit. I did not think we would support this much, but clearly we do. We want to work with the Government to get this right, and we see ensuring Ofcom has those important authorisation powers as central to it establishing itself as a viable regulator of the online space, both now and for generations to come. We will support and have not sought to amend the clauses or schedule 11 for the reasons set out.

Kirsty Blackman

I want to make a brief comment echoing the shadow Minister’s welcome for the inclusion of senior managers and named people in the Bill. I agree that that level of personal liability and responsibility is the only way that we will be able to hold some of these incredibly large, unwieldy organisations to account. If they could wriggle out of this by saying, “It’s somebody else’s responsibility,” and if everyone then disagreed about whose responsibility it was, we would be in a much worse place, so I also support the inclusion of these clauses and schedule 11.

Chris Philp

I am delighted by the strong support that these clauses have received from across the aisle. I hope that proves to be a habit-forming development.

On the shadow Minister’s point about publishing the risk assessments, to repeat the point I made a few days ago, under clause 64, which we have already debated, Ofcom has the power—indeed, the obligation—to compel publication of transparency reports that will make sure that the relevant information sees the light of day. I accept that publication is important, but we believe that objective is achieved via the transparency measures in clause 64.

On the point about senior management liability, which again we debated near the beginning of the Bill, we believe—I think we all agree—that this is particularly important for information disclosure. We had the example, as I mentioned at the time, of one of the very large companies refusing to disclose information to the Competition and Markets Authority in relation to a competition matter and simply paying a £50 million fine rather than complying with the duties. That is why criminal liability is so important here in relation to information disclosure.

To reassure the shadow Minister, on the point about when that kicks in, it was in the old version of the Bill, but potentially did not commence for two years. In this new version, updated following our extensive and very responsive listening exercise—I am going to get that in every time—the commencement of this particular liability is automatic and takes place very shortly after Royal Assent. The delay and review have been removed, for the reason the hon. Lady mentioned, so I am pleased to confirm that to the Committee.

The shadow Minister described many of the provisions. Clause 85 gives Ofcom powers to require information, clause 86 gives the power to issue notices and clause 87 the important power to require an entity to name that relevant senior manager, so they cannot wriggle out of their duty by not providing the name. Clause 88 gives the power to require companies to undergo a report from a so-called skilled person. Clause 89 requires full co-operation with Ofcom when it opens an investigation, where co-operation has been sadly lacking in many cases to date. Clause 90 requires people to attend an interview, and the introduction to schedule 11 allows Ofcom to enter premises to inspect or audit the provider. These are very powerful clauses and will mean that social media companies can no longer hide in the shadows from the scrutiny they so richly deserve.

Question put and agreed to.

Clause 85 accordingly ordered to stand part of the Bill.

Clauses 86 to 91 ordered to stand part of the Bill.

Schedule 11

OFCOM’s powers of entry, inspection and audit

Amendment made: 4, in schedule 11, page 202, line 17, leave out

“maximum summary term for either-way offences”

and insert

“general limit in a magistrates’ court”.—(Chris Philp.)

Schedule 11, as amended, agreed to.

Clause 92

Offences in connection with information notices

Question proposed, That the clause stand part of the Bill.

12:30
The Chair

With this it will be convenient to discuss clauses 93 to 96 stand part.

Alex Davies-Jones

The Minister will be pleased to hear that we, again, support these clauses. We absolutely support the Bill’s aims to ensure that information offences and penalties are strong enough to dissuade non-compliance. However, as we have said repeatedly, we feel that the current provisions are lacking.

As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator. I am grateful that the Minister has confirmed that the measures will come into force with immediate effect following Royal Assent, rather than waiting two years. That is welcome news. The Government should require that top bosses at social media companies be criminally liable for systemic and repeated failures on online safety, and I am grateful for the Minister’s confirmation on that point.

As these harms are allowed to persist, tech companies cannot continue to get away without penalty. Will the Minister confirm why the Bill does not include further penalties, in the form of criminal offences, should a case of systemic and repeated failures arise? Labour has concerns that, without stronger powers, Ofcom may not feel compelled or equipped to sanction those companies that are treading the fine line of doing just enough to satisfy the requirements outlined in the Bill as it stands.

Labour also welcomes clause 93, which sets out the criminal offences that can be committed by named senior managers in relation to their entity’s information obligations. It establishes that senior managers who are named in a response to an information notice can be held criminally liable for failing to prevent the relevant service provider from committing an information offence. Senior managers can only be prosecuted under the clause where the regulated provider has already been found liable for failing to comply with Ofcom’s information request. As I have already stated, we feel that this power needs to go further if we are truly to tackle online harm. For far too long, those at the very top have known about the harm that exists on their platforms, but they have failed to take action.

Labour supports clause 94, and we have not sought to amend it at this stage. It is vital that provisions are laid in the Bill, such as those in subsection (3), which specify actions that a person may take to commit an offence of this nature. We all want to see the Bill keep people safe online, and at the heart of doing so is demanding a more transparent approach from those in Silicon Valley. My hon. Friend the Member for Worsley and Eccles South made an excellent case for the importance of transparency earlier in the debate but, as the Minister knows, and as I have said time and again, the offences must go further than just applying to simple failures to provide information. We must consider a systemic approach to harm more widely, and that goes far beyond simple information offences.

There is no need to repeat myself. Labour supports the need for clause 95 as it stands and we support clause 96, which is in line with penalties for other information offences that already exist.

Chris Philp

I am delighted to discover that agreement with the Government’s clauses continues to provoke a tsunami of unanimity across the Committee. I sense a gathering momentum behind these clauses.

As the shadow Minister mentioned, the criminal offences here are limited to information provision and disclosure. We have debated the point before. The Government’s feeling is that going beyond the information provision into other duties for criminal liability would potentially go a little far and have a chilling effect on the companies concerned.

Also, the fines that can be levied—10% of global revenue—run into billions of pounds, and there are the denial of service provisions, where a company can essentially be disconnected from the internet in extreme cases; these do provide more than adequate enforcement powers for the other duties in the Bill. The information duties are so fundamental—that is why personal criminal liability is needed. Without the information, we cannot really make any further assessment of whether the duties are being met.

The shadow Minister has set out what the other clauses do: clause 92 creates offences; clause 93 introduces senior managers’ liability; clause 94 sets out the offences that can be committed in relation to audit notices issued by Ofcom; clause 95 creates offences for intentionally obstructing or delaying a person exercising Ofcom’s power; and clause 96 sets out the penalties for the information offences set out in the Bill, which of course include a term of imprisonment of up to two years. Those are significant criminal offences, which I hope will make sure that executives working for social media firms properly discharge those important duties.

Question put and agreed to.

Clause 92 accordingly ordered to stand part of the Bill.

Clauses 93 to 95 ordered to stand part of the Bill.

Clause 96

Penalties for information offences

Amendment made: 2, in clause 96, page 83, line 15, leave out

“maximum summary term for either-way offences”

and insert

“general limit in a magistrates’ court”—(Chris Philp.)

Clause 96, as amended, ordered to stand part of the Bill.

Clause 97

Co-operation and disclosure of information: overseas regulators

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to consider clauses 98 to 102 stand part.

Alex Davies-Jones

Again, Labour supports the intentions of clause 97—the collegiality continues. We know that the Bill’s aims are to protect people across the UK, but we know that online harms often originate elsewhere. That is why it is vital that Ofcom has powers to co-operate with an overseas regulator, as outlined in subsection (1).

However, we do have concerns about subsection (2), which states:

“The power conferred by subsection (1) applies only in relation to an overseas regulator for the time being specified in regulations made by the Secretary of State.”

Can the Minister confirm exactly how that will work in practice? He knows that Labour Members have tabled important amendments to clause 123. Amendments 50 and 51, which we will consider later, aim to ensure that Ofcom has the power to co-operate and take action through the courts where necessary. The same issue applies here: Ofcom must be compelled and have the tools available at its disposal to work internationally where required.

Labour supports clause 98, which amends section 393 of the Communications Act 2003 to include new provisions. That is obviously a vital step, and we particularly welcome subsection (2), which outlines that, subject to the specific exceptions in section 393 of the 2003 Act, Ofcom cannot disclose information with respect to a business that it has obtained by exercising its powers under this Bill without the consent of the business in question. This is once again an important step in encouraging transparency across the board.

We support clause 99, which places a duty on Ofcom to consult the relevant intelligence service before Ofcom discloses or publishes any information that it has received from that intelligence service. For reasons of national security, it is vital that the relevant intelligence service is included in Ofcom’s reasoning and approach to the Bill more widely.

We broadly support the intentions of clause 100. It is vital that Ofcom is encouraged to provide information to the Secretary of State of the day, but I would be grateful if the Minister could confirm exactly how the power will function in reality. Provision of information to assist in the formulation of policy, as we know, covers a very broad spectrum under the Communications Act. We want to make sure the powers are not abused—I know that is a concern shared on his own Back Benches—so I would be grateful for the Minister’s honest assessment of the situation.

We welcome clause 101, which amends section 26 of the Communications Act and provides for publication of information and advice for various persons, such as consumers. Labour supports the clause as it stands. We also welcome clause 102, which, importantly, sets out the circumstances in which a statement given to Ofcom can be used in evidence against that person. Again, this is an important clause in ensuring that Ofcom has the powers it needs to truly act as a world-leading regulator, which we all want it to be. Labour supports it and has chosen not to table any amendments.

Chris Philp

I am delighted that support for the Government’s position on the clauses continues and that cross-party unanimity is taking an ever stronger hold. I am sure the Whips Office will find that particularly reassuring.

The shadow Minister asked a question about clause 100. Clause 100 amends section 24B of the Communications Act 2003, which allows Ofcom to provide information to the Secretary of State to assist with the formulation of policy. She asked me to clarify what that means, which I am happy to do. In most circumstances, Ofcom will be required to obtain the consent of providers in order to share information relating to their business. This clause sets out two exceptions to that principle. If the information required by the Secretary of State was obtained by Ofcom to determine the proposed fees threshold, or in response to potential threats to national security or to the health or safety of the public, the consent of the business is not required. In those instances, it would obviously not be appropriate to require the provider’s consent.

It is important that users of regulated services are kept informed of developments around online safety and the operation of the regulatory framework.

Kirsty Blackman

This specifically relates to the Secretary of State, but would the Minister expect both Ofcom and his Department to be working with the Scottish Government and the Northern Ireland Executive? I am not necessarily talking about sharing all the information, but where there are concerns that it is very important for those jurisdictions to be aware of, will he try to ensure that he has a productive relationship with both devolved Administrations?

Chris Philp

I thank the hon. Member for her question. Where the matter being raised or disclosed touches on matters of devolved competence—devolved authority—then yes, I would expect that consultation to take place. Matters concerning the health and safety of the public are entirely devolved, I think, so I can confirm that in those circumstances it would be appropriate for the Secretary of State to share information with devolved Administration colleagues.

The shadow Minister has eloquently, as always, touched on the purpose of the various other clauses in this group. I do not wish to try the patience of the Committee, particularly as lunchtime approaches, by repeating what she has ably said already, so I will rest here and simply urge that these clauses stand part of the Bill.

Question put and agreed to.

Clause 97 accordingly ordered to stand part of the Bill.

Clauses 98 to 102 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

12:42
Adjourned till this day at Two o’clock.

Online Safety Bill (Twelfth sitting)

Committee stage
Thursday 16th June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 16 June 2022
The Committee consisted of the following Members:
Chairs: † Sir Roger Gale, Christina Rees
Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
Mishra, Navendu (Stockport) (Lab)
Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Thursday 16 June 2022
(Afternoon)
[Sir Roger Gale in the Chair]
Online Safety Bill
Clause 103
Notices to deal with terrorism content or CSEA content (or both)
14:00
None Portrait The Chair
- Hansard -

There are amendments to clause 103 that are not owned by any member of the Committee. Nobody has indicated that they wish to take them up, and therefore they fall.

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to consider clauses 105 and 106 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

Under this chapter, Ofcom will have the power to direct companies to use accredited technology to identify child sexual exploitation and abuse content, whether communicated publicly or privately by means of a service, and to remove that content quickly. Colleagues will be aware that the Internet Watch Foundation is one group that assists companies in doing that by providing them with “hashes” of previously identified child sexual abuse material in order to prevent the upload of such material to their platforms. That helps stop the images of victims being recirculated again and again. Tech companies can then notify law enforcement of the details of who has uploaded the content, and an investigation can be conducted and offenders sharing the content held to account.

Those technologies are extremely accurate and, thanks to the quality of the underlying datasets, ensure that companies are detecting only imagery that is illegal. There are a number of types of technology that Ofcom could consider accrediting, including image hashing. A hash is a unique string of letters and numbers that can be applied to an image and matched every time a user attempts to upload a known illegal image to a platform.
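
By way of illustration only, and not drawn from the Bill or from any platform's actual system, the exact-match hashing described above can be sketched in Python as follows. The hash list, file paths and function names are hypothetical placeholders; real deployments use vetted hash lists supplied by bodies such as the IWF.

```python
import hashlib

# Hypothetical placeholder set of SHA-256 digests of previously identified
# illegal images. A real deployment would load a vetted list from a body
# such as the IWF rather than hard-coding values.
KNOWN_ILLEGAL_HASHES = {
    "0" * 64,  # placeholder digest, not a real entry
}

def sha256_of_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_illegal_image(path: str) -> bool:
    """True if the uploaded file exactly matches a hash on the known list."""
    return sha256_of_file(path) in KNOWN_ILLEGAL_HASHES
```

Note that exact cryptographic matching of this kind only catches byte-identical copies; the perceptual approach described next is what allows altered copies to be found.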

PhotoDNA is another type, created in 2009 in a collaboration between Microsoft and Professor Hany Farid of the University of California, Berkeley. PhotoDNA is a vital tool in the detection of CSEA online. It enables law enforcement, charities, non-governmental organisations and the internet industry to find copies of an image even when it has been digitally altered. It is one of the most important technical developments in online child protection. It is extremely accurate, with a failure rate of between one in 50 billion and one in 100 billion. That gives companies a high degree of certainty that what they are removing is illegal, and a firm basis for law enforcement to pursue offenders.
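
PhotoDNA itself is proprietary, so the sketch below uses a generic perceptual "average hash" purely as a stand-in to show how a slightly altered copy of an image can still be matched. The function names and the Pillow dependency are assumptions for illustration, not part of the Bill or of PhotoDNA.

```python
from PIL import Image  # Pillow library, assumed here purely for illustration

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a small greyscale image and set one bit per pixel,
    depending on whether it is brighter than the mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_probable_copy(hash_a: int, hash_b: int, threshold: int = 5) -> bool:
    """Treat hashes within a small Hamming distance as likely copies,
    even if one image has been resized or lightly edited."""
    return hamming_distance(hash_a, hash_b) <= threshold
```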

Lastly, there is webpage blocking. Most of the imagery that the Internet Watch Foundation removes from the internet is hosted outside the UK. While content is awaiting removal, the IWF can disable public access to an image or webpage by adding it to its webpage blocking list. That list can be utilised by search providers to de-index known webpages containing CSAM. I therefore ask the Minister, as we continue to explore this chapter, to confirm exactly how such technologies can be utilised once the Bill receives Royal Assent.
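
Again purely as an illustrative sketch, and not a description of the IWF's or any search provider's actual implementation, a blocking list can be applied by normalising candidate URLs and excluding matches from a search index. The URLs and helper names below are hypothetical.

```python
from urllib.parse import urlparse

# Hypothetical blocking-list entries; the real list is maintained by the IWF
# and distributed to participating providers under licence.
BLOCKED_PAGES = {
    "example.org/blocked-page",
}

def normalise(url: str) -> str:
    """Reduce a URL to host plus path so equivalent forms compare equal."""
    parsed = urlparse(url)
    return (parsed.netloc.lower() + parsed.path).rstrip("/")

def should_deindex(url: str) -> bool:
    """True if the URL is on the blocking list and must not appear in results."""
    return normalise(url) in BLOCKED_PAGES

# Example: filter candidate search results before indexing.
candidates = ["https://example.org/blocked-page/", "https://example.com/ordinary-page"]
indexable = [u for u in candidates if not should_deindex(u)]
```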

Labour welcomes clause 105, which confirms, in subsection (2), that where a service provider is already using technology on a voluntary basis but it is ineffective, Ofcom can still intervene and require a service provider to use a more effective technology, or the same technology in a more effective way. It is vital that Ofcom is given the power and opportunity to intervene in the strongest possible sense to ensure that safety online is kept at the forefront.

However, we do require some clarification, particularly on subsections (9) and (10), which explain that Ofcom will only be able to require the use of tools that meet the minimum standards for accuracy for detecting terrorism and/or CSEA content, as set out by the Secretary of State. Although minimum standards are of course a good thing, can the Minister clarify the exact role that the Secretary of State will have in imposing these minimum standards? How will this work in practice?

Once again, Labour does not oppose clause 106 and we have not sought to amend it at this stage. It is vital that Ofcom has the power to revoke a notice under clause 103(1) if there are reasonable grounds to believe that the provider is not complying with it. Only with these powers can we be assured that service providers will be implored to take their responsibilities and statutory duties, as outlined in the Bill, seriously.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

I have a few questions, concerns and suggestions relating to these clauses. I think it was the hon. Member for Don Valley who asked me last week about the reports to the National Crime Agency and how that would work—about how, if a human was not checking those things, there would be an assurance that proper reports were being made, and that scanning was not happening and reports were not being made when images were totally legal and there was no problem with them. [Interruption.] I thought it was the hon. Member for Don Valley, although it may not have been. Apologies—it was a Conservative Member. I am sorry for misnaming the hon. Member.

The hon. Member for Pontypridd made a point about the high level of accuracy of the technologies. That should give everybody a level of reassurance that the reports that are and should be made to the National Crime Agency on child sexual abuse images will be made on a highly accurate basis, rather than a potentially inaccurate one. Actually, some computer technology—particularly for scanning for images, rather than text—is more accurate than human beings. I am pleased to hear those particular statistics.

Queries have been raised on this matter by external organisations—I am particularly thinking about the NSPCC, which we spoke about earlier. The Minister has thankfully given a number of significant reassurances about the ability to proactively scan. External organisations such as the NSPCC are still concerned that there is not enough on the face of the Bill about proactive scanning and ensuring that the current level of proactive scanning is able—or required—to be replicated when the Bill comes into action.

During an exchange in an earlier Committee sitting, the Minister gave a commitment—I am afraid I do not have the quote—to being open to looking at amending clause 103. I am slightly disappointed that there are no Government amendments, but I understand that there has been only a fairly short period; I am far less disappointed than I was previously, when the Minister had much more time to consider the actions he might have been willing to take.

The suggestion I received from the NSPCC is about the gap in the Bill regarding the ability of Ofcom to take action. These clauses allow Ofcom to take action against individual providers about which it has concerns; those providers will have to undertake duties set out by Ofcom. The NSPCC suggests that there could be a risk register, or that a notice could be served on a number of companies at one time, rather than Ofcom simply having to pick one company, or to repeatedly pick single companies and serve notices on them. Clause 83 outlines a register of risk profiles that must be created by Ofcom. It could therefore serve notice on all the companies that fall within a certain risk profile or all the providers that have common functionalities.

If there were a new, emerging concern, that would make sense. Rather than Ofcom having to go through the individual process with all the individual providers when it knows that there is common functionality—because of the risk assessments that have been done and Ofcom’s oversight of the different providers—it could serve notice on all of them in one go. It could not then accidentally miss one out and allow people to move to a different platform that had not been mentioned. I appreciate the conversation we had around this issue earlier, and the opportunity to provide context in relation to the NSPCC’s suggestions, but it would be great if the Minister would be willing to consider them.

I have another question, to which I think the Minister will be able to reply in the affirmative, which is on the uses of the technology as it evolves. We spoke about that in an earlier meeting. The technology that we have may not be what we use in the future to scan for terrorist-related activity or child sexual abuse material. It is important that the Bill adequately covers future conditions. I think that it does, but will the Minister confirm that, as technology advances and changes, these clauses will adequately capture the scanning technologies that are required, and any updates in the way in which platforms work and we interact with each other on the internet?

I have fewer concerns about future-proofing with regard to these provisions, because I genuinely think they cover future conditions, but it would be incredibly helpful and provide me with a bit of reassurance if the Minister could confirm that. I very much look forward to hearing his comments on clause 103.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

Let me start by addressing some questions raised by hon. Members, beginning with the last point made by the hon. Member for Aberdeen North. She sought reconfirmation that the Bill will keep up with future developments in accredited technology that are not currently contemplated. The answer to her question can be found in clause 105(9), in which the definition of accredited technology is clearly set out, as technology that is

“accredited (by OFCOM or another person appointed by OFCOM) as meeting minimum standards of accuracy”.

That is not a one-off determination; it is a determination, or an accreditation, that can happen from time to time, periodically or at any point in the future. As and when new technologies emerge that meet the minimum standards of accuracy, they can be accredited, and the power in clause 103 can be used to compel platforms to use those technologies. I hope that provides the reassurance that the hon. Member was quite rightly asking for.

The shadow Minister, the hon. Member for Pontypridd, asked a related question about the process for publishing those minimum standards. The process is set out in clause 105(10), which says that Ofcom will give advice to the Secretary of State on the appropriate minimum standards, and the minimum standards will then be

“approved…by the Secretary of State, following advice from OFCOM.”

We are currently working with Ofcom to finalise the process for setting those standards, which of course will need to take a wide range of factors into account.

Let me turn to the substantive clauses. Clause 103 is extremely important, because as we heard in the evidence sessions and as Members of the Committee have said, scanning messages using technology such as hash matching, to which the shadow Minister referred, is an extremely powerful way of detecting CSEA content and providing information for law enforcement agencies to arrest suspected paedophiles. I think it was in the European Union that Meta—particularly Facebook and Facebook Messenger—stopped using this scanner for a short period of time due to misplaced concerns about privacy laws, and the number of referrals of CSEA images and the number of potential paedophiles who were referred to law enforcement dropped dramatically.

A point that the hon. Member for Aberdeen North and I have discussed previously is that it would be completely unacceptable if a situation arose whereby these messages—I am thinking particularly about Facebook Messenger—did not get scanned for CSEA content in a way that they do get scanned today. When it comes to preventing child sexual exploitation and abuse, in my view there is no scope for compromise or ambiguity. That scanning is happening at the moment; it is protecting children on a very large scale and detecting paedophiles on quite a large scale. In my view, under no circumstances should that scanning be allowed to stop. That is the motivation behind clause 103, which provides Ofcom with the power to make directions to require the use of accredited technology.

As the hon. Member for Aberdeen North signalled in her remarks, given the importance of this issue the Government are of course open to thinking about ways in which the Bill can be strengthened if necessary, because we do not want to leave any loopholes. I urge any social media firms watching our proceedings never to take any steps that degrade or reduce the ability to scan for CSEA content. I thank the hon. Member for sending through the note from the NSPCC, which I have received and will look at internally.

14:15
The proactive scanning that we have talked about is critical. To give one or two examples, this is not just about CSEA, but terrorism as well. Every terrorist attack in 2017 had an online element, and many counter-terrorism prosecutions have involved online activity, because terrorists and their supporters continue to use a wide range of online platforms to further their aims. Similarly, in the context of child sexual abuse material, the Internet Watch Foundation confirmed that in 2020, 153,383 reports of webpages containing child sexual abuse imagery, or UK-hosted non-photographic child sexual abuse imagery, were detected. The importance of the scanning technology is clear, as is the importance of ensuring the clause is as strong as possible.
As the shadow Minister has said, clause 105 provides supporting provisions for clause 103, setting out—for example—the particulars of what must appear in the notice, and clause 106 sets out the process for reviewing a notice to deal with terrorism or CSEA content. I hope I have addressed hon. Members’ questions, and I commend this important clause to the Committee.
Question put and agreed to.
Clause 103 accordingly ordered to stand part of the Bill.
Clause 104
Matters relevant to a decision to give a notice under section 103(1)
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 35, in clause 104, page 88, line 39, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 36, in clause 104, page 88, line 43, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

Amendment 37, in clause 104, page 89, line 13, at end insert—

“(k) risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”

This amendment requires Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content.

Amendment 39, in clause 116, page 98, line 37, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

Amendment 40, in clause 116, page 98, line 39, leave out “prevalence” and insert “presence”.

This amendment requires that Ofcom considers the presence of relevant content, rather than its prevalence.

Amendment 38, in clause 116, page 99, line 12, at end insert—

“(j) the risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”

This amendment requires Ofcom to consider risks to adults and children through the production, publication and dissemination of illegal content before imposing a proactive technology requirement.

Government amendment 6.

Clause stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

We welcome clause 104, but have tabled some important amendments that the Minister should closely consider. More broadly, the move away from requiring child sexual exploitation and abuse content to be prevalent and persistent before enforcement action can be taken is a positive one. It is welcome that Ofcom will have the opportunity to consider a range of factors.

Despite this, Labour—alongside the International Justice Mission—is still concerned about the inclusion of prevalence as a factor, owing to the difficulty in detecting newly produced CSEA content, especially livestreamed abuse. Amendments 35, 36, 39 and 40 seek to address that gap. Broadly, the amendments aim to capture the concern about the Bill’s current approach, which we feel limits its focus to the risk of harm faced by individuals in the UK. Rather, as we have discussed previously, the Bill should recognise the harm that UK nationals cause to people around the world, including children in the Philippines. The amendments specifically require Ofcom to consider the presence of relevant content, rather than its prevalence.

Amendment 37 would require Ofcom’s risk assessments to consider risks to adults and children through the production, publication and dissemination of illegal content—an issue that Labour has repeatedly raised. I believe we last mentioned it when we spoke to amendments to clause 8, so I will do my best to not repeat myself. That being said, we firmly believe it is important that video content, including livestreaming, is captured by the Bill. I remain unconvinced that the Bill as it stands goes far enough, so I urge the Minister to closely consider and support these amendments. The arguments that we and so many stakeholders have already made still stand.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I echo the sentiments that have been expressed by the shadow Minister, and thank her and her colleagues for tabling this amendment and giving voice to the numerous organisations that have been in touch with us about this matter. The Scottish National party is more than happy to support the amendment, which would make the Bill stronger and better, and would better enable Ofcom to take action when necessary.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I understand the spirit behind these amendments, focusing on the word “presence” rather than “prevalence” in various places. It is worth keeping in mind that throughout the Bill we are requiring companies to implement proportionate systems and processes to protect their users from harm. Even in the case of the most harmful illegal content, we are not placing the duty on companies to remove every single piece of illegal content that has ever appeared online, because that is requesting the impossible. We are asking them to take reasonable and proportionate steps to create systems and processes to do so. It is important to frame the legally binding duties in a way that makes them realistically achievable.

As the shadow Minister said, amendments 35, 36, 39 and 40 would replace the word “prevalence” with “presence”. That would change Ofcom’s duty to enforce not just against content that was present in significant numbers—prevalent—but against a single instance, which would be enough to engage the clause.

We mutually understand the intention behind these amendments, but we think the significant powers in section 103 to compel companies to adopt certain technology should be engaged only where there is a reasonable level of risk. For example, if a single piece of content was present on a platform, it may not be reasonable or proportionate to force the company to adopt certain new technologies, where indeed they do not do so at the moment. The use of “prevalence” ensures that the powers are used where necessary.

It is clear—there is no debate—that in the circumstances where scanning technology is currently used, which includes on Facebook Messenger, there is enormous prevalence of material. To elaborate on a point I made in a previous discussion, anything that stops that detection happening would be unacceptable and, in the Government’s view, it would not be reasonable to lose the ability to detect huge numbers of images in the service of implementing encryption, because there is nothing more important than scanning for child sexual exploitation images.

However, we think adopting the amendment and replacing the word “prevalence” with “presence” would create an extremely sensitive trigger that would be engaged on almost every site, even tiny ones or where there was no significant risk, because a single example would be enough to trigger the amendment, as drafted. Although I understand the spirit of the amendment, it moves away from the concepts of proportionality and reasonableness in the systems and processes that the Bill seeks to deliver.

Amendment 37 seeks to widen the criteria that Ofcom must consider when deciding to use section 103 powers. It is important to ensure that Ofcom considers a wide range of factors, taking into account the harm occurring, but clause 104(2)(f) already requires Ofcom to consider

“the level of risk of harm to individuals in the United Kingdom presented by relevant content, and the severity of that harm”.

Therefore, the Bill already contains provision requiring Ofcom to take those matters into account, as it should, but the shadow Minister is right to draw attention to the issue.

Finally, amendment 38 seeks to amend clause 116 to require Ofcom to consider the risk of harm posed by individuals in the United Kingdom, in relation to adults and children in the UK or elsewhere, through the production, publication and dissemination of illegal content. In deciding whether to make a confirmation decision requiring the use of technology, it is important that Ofcom considers a wide range of factors. However, clause 116(6)(e) already proposes to require Ofcom to consider, in particular, the risk and severity of harm to individuals in the UK. That is clearly already in the Bill.

I hope that this analysis provides a basis for the shadow Minister to accept that the Bill, in this area, functions as required. I gently request that she withdraw her amendment.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I welcome the Minister’s comments, but if we truly want the Bill to be world-leading, as the Government and the Minister insist it will be, and if it is truly to keep children safe, surely one image of child sexual exploitation and abuse on a platform is one too many. We do not need to consider prevalence over presence. I do not buy that argument. I believe we need to do all we can to make this Bill as strong as possible. I believe the amendments would do that.

Question put, That the amendment be made.

Division 33

Ayes: 3


Labour: 2
Scottish National Party: 1

Noes: 5


Conservative: 5

Amendment proposed: 37, in clause 104, page 89, line 13, at end insert—
“(k) risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”—(Alex Davies-Jones.)
This amendment requires Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content.
Question put, That the amendment be made.

Division 34

Ayes: 3


Labour: 2
Scottish National Party: 1

Noes: 5


Conservative: 5

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I beg to move amendment 6, in clause 104, page 89, line 14, after “(2)(f)” insert “, (g)”

This amendment ensures that subsection (3) of this clause (which clarifies what “relevant content” in particular paragraphs of subsection (2) refers to in relation to different kinds of services) applies to the reference to “relevant content” in subsection (2)(g) of this clause.

This technical amendment will ensure that the same definition of “relevant content” used in subsection (2) is used in subsection (3).

Amendment 6 agreed to.

Clause 104, as amended, ordered to stand part of the Bill.

Clauses 105 and 106 ordered to stand part of the Bill.

Clause 107

OFCOM’s guidance about functions under this Chapter

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clauses 108 and 109 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour welcomes clause 107, which requires Ofcom to issue guidance setting out the circumstances in which it could require a service provider in scope of the power to use technology to identify CSEA and/or terrorism content. It is undeniably important that Ofcom will have the discretion to decide on the exact content of the guidance, which it must keep under review and publish.

We also welcome the fact that Ofcom must have regard to its guidance when exercising these powers. Of course, it is also important that the Information Commissioner is included and consulted in the process. Ofcom has a duty to continually review its guidance, which is fundamental to the Bill’s success.

We also welcome clause 108. Indeed, the reporting of Ofcom is an area that my hon. Friend the Member for Batley and Spen will touch on when we come to new clause 25. It is right that Ofcom will have a statutory duty to lay an annual report in this place, but we feel it should ultimately go further. That is a conversation for another day, however, so we broadly welcome clause 108 and have not sought to amend it directly at this stage.

Clause 109 ensures that the definitions of “terrorism content” and “child sexual exploitation and abuse content” used in chapter 5 are the same as those used in part 3. Labour supports the clause and we have not sought to amend it.

14:30
None Portrait The Chair
- Hansard -

The Question is—

None Portrait The Chair
- Hansard -

I beg your pardon; I am trying to do too many things at once. I call Kirsty Blackman.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Thank you very much, Sir Roger. I do not envy you in this role, which cannot be easy, particularly with a Bill that is 190-odd clauses long.

None Portrait The Chair
- Hansard -

It goes with the job.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a quick question for the Minister about the timelines in relation to the guidance and the commitment that Ofcom gave to producing a road map before this coming summer. When is that guidance likely to be produced? Does that road map relate to the guidance in this clause, as well as the guidance in other clauses? If the Minister does not know the answer, I have no problem with receiving an answer at a later time. Does the road map include this guidance as well as other guidance that Ofcom may or may not be publishing at some point in the future?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I welcome the cross-party support for the provisions set out in these important clauses. Clause 107 points out the requirement for Ofcom to publish guidance, which is extremely important. Clause 108 makes sure that it publishes an annual report. Clause 109 covers the interpretations.

The hon. Member for Aberdeen North asked the only question, about the contents of the Ofcom road map, which in evidence it committed to publishing before the summer. I cannot entirely speak for Ofcom, which is of course an independent body. In order to avoid me giving the Committee misleading information, the best thing is for officials at the Department for Digital, Culture, Media and Sport to liaise with Ofcom and ascertain what the exact contents of the road map will be, and we can report that back to the Committee by letter.

It is fair to say that the Committee’s feeling—I invite hon. Members to intervene if I have got this wrong—is that the road map should be as comprehensive as possible. Ideally, it would lay out the intended plan to cover all the activities that Ofcom would have to undertake in order to make the Bill operational, and the more detail there is, and the more comprehensive the road map can be, the happier the Committee will be.

Officials will take that away, discuss it with Ofcom and we can revert with fuller information. Given that the timetable was to publish the road map prior to the summer, I hope that we are not going to have to wait very long before we see it. If Ofcom is not preparing it now, it will hopefully hear this discussion and, if necessary, expand the scope of the road map a little bit accordingly.

Question put and agreed to.

Clause 107 accordingly ordered to stand part of the Bill.

Clauses 108 and 109 ordered to stand part of the Bill.

Clause 110

Provisional notice of contravention

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I will be brief. Labour welcomes clause 110, which addresses the process of starting enforcement. We support the process, particularly the point that ensures that Ofcom must first issue a “provisional notice of contravention” to an entity before it reaches its final decision.

The clause ultimately ensures that the process for Ofcom issuing a provisional notice of contravention can take place only after a full explanation and deadline have been provided for those involved. Thankfully, this process means that Ofcom can reach a decision only after allowing the recipient a fair opportunity to make relevant representations too. The process must be fair for all involved and that is why we welcome the provisions outlined in the clause.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I hope that I am speaking at the right stage of the Bill, and I promise not to intervene at any further stages where this argument could be put forward.

Much of the meat of the Bill is within chapter 6. It establishes what many have called the “polluter pays” principle, where an organisation that contravenes can then be fined—a very important part of the Bill. We are talking about how Ofcom is going to be able to make the provisions that we have set out work in practice. A regulated organisation that fails to stop harm contravenes and will be fined, and fined heavily.

I speak at this point in the debate with slight trepidation, because these issues are also covered in clause 117 and schedule 12, but it is just as relevant to debate the point at this stage. It is difficult to understand where in the Bill the Government set out how the penalties that they can levy as a result of the powers under this clause will be used. Yes, they will be a huge deterrent, and that is good in its own right and important, but surely the real opportunity is to make the person who does the harm pay for righting the wrong that they have created.

That is not a new concept. Indeed, it is one of the objectives that the Government set out in the intentions behind their approach to the draft victims Bill. It is a concept used in the Investigatory Powers Act 2016. It is the concept behind the victims surcharge. So how does this Bill make those who cause harm take greater responsibility for the cost of supporting victims to recover from what they have suffered? That is exactly what the Justice Ministers set out as being so important in their approach to victims. In the Bill, that is not clear to me.

At clause 70, the Minister helpfully set out that there was absolutely no intention for Ofcom to have a role in supporting victims individually. In reply to the point that I made at that stage, he said that the victims Bill would address some of the issues—I am sure that he did not say all the issues, but some of them at least. I do not believe that it will. The victims Bill establishes a code and a duty to provide victim support, but it makes absolutely no reference to how financial penalties on those who cause harm—as set out so clearly in this Bill—will be used to support victims. How will they support victims’ organisations, which do so much to help in particular those who do not end up in court, before a judge, because what they have suffered does not warrant that sort of intervention?

I believe that there is a gap. We heard that in our evidence session, including from Ofcom itself, which identified the need for law enforcement, victim-support organisations and platforms themselves to find what the witnesses described as an effective way for the new “ecosystem” to work. Victim-support organisations went further and argued strongly for the need for victims’ voices to be heard independently. The NSPCC in particular made a very powerful argument for children’s voices needing to be heard and for having independent advocacy. There would be a significant issue with trust levels if we were to rely solely on the platforms themselves to provide such victim support.

There are a couple of other reasons why we need the Government to tease the issue out. We are talking about the most significant culture change imaginable for the online platforms to go through. There will be a lot of good will, I am sure, to achieve that culture change, but there will also be problems along the way. Again referring back to our evidence sessions, the charity Refuge said that reporting systems are “not up to scratch” currently. There is a lot of room for change. We know that Revenge Porn Helpline has seen a continual increase in demand for its services in support of victims, in particular following the pandemic. It also finds its revenue and funding a little hand to mouth.

Victim support organisations will have a crucial role in assisting Ofcom with the elements outlined in chapter 6, of which clause 110 is the start, in terms of monitoring the reality for users of how the platforms are performing. The “polluter pays” principle is not working quite as the Government might want it to in the Bill. My solution is for the Minister to consider talking to his colleagues in the Treasury about whether this circle could be squared—whether we could complete the circle—by having some sort of hypothecation of the financial penalties, so that some of the huge amount that will be levied in penalties can be put into a fund that can be used directly to support victims’ organisations. I know that that requires the Department for Digital, Culture, Media and Sport and the Ministry of Justice to work together, but my hon. Friend is incredibly good at collaborative working, and I am sure he will be able to achieve that.

This is not an easy thing. I know that the Treasury would not welcome Committees such as this deciding how financial penalties are to be used, but this is not typical legislation. We are talking about enormous amounts of money and enormous numbers of victims, as the Minister himself has set out when we have tried to debate some of these issues. He could perhaps undertake to raise this issue directly with the Treasury, and perhaps get it to look at how much money is currently going to organisations to support victims of online abuse and online fraud—the list goes on—and to see whether we will have to take a different approach to ensure that the victims we are now recognising get the support he and his ministerial colleagues want to see.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

First, on the substance of the clause, as the shadow Minister said, the process of providing a provisional notice of contravention gives the subject company a fair chance to respond and put its case, before the full enforcement powers are brought down on its head, and that is of course only reasonable, given how strong and severe these powers are. I am glad there is once again agreement between the two parties.

I would like to turn now to the points raised by my right hon. Friend the Member for Basingstoke, who, as ever, has made a very thoughtful contribution to our proceedings. Let me start by answering her question as to what the Bill says about where fines that are levied will go. We can discover the answer to that question in paragraph 8 of schedule 12, which appears at the bottom of page 206 and the top of page 207—in the unlikely event that Members had not memorised that. If they look at that provision, they will see that the Bill as drafted provides that fines that are levied under the powers provided in it and that are paid to Ofcom get paid over to the Consolidated Fund, which is essentially general Treasury resources. That is where the money goes under the Bill as drafted.

My right hon. Friend asks whether some of the funds could be, essentially, hypothecated and diverted directly to pay victims. At the moment, the Government deal with victims, and pay for services supporting victims, not just via legislation—the victims Bill—but via expenditure that, I think, is managed by the Ministry of Justice to support victims and organisations working with victims in a number of ways. I believe that the amount earmarked for this financial year is in excess of £300 million, which is funded just via the general spending review. That is the situation as it is today.

I am happy to ask colleagues in Government the question that my right hon. Friend raises. It is really a matter for the Treasury, so I am happy to pass her idea on to it. But I anticipate a couple of responses coming from the Treasury in return. I would anticipate it first saying that allocating money to a particular purpose, including victims, is something that it likes to do via spending reviews, where it can balance all the demands on Government revenue, viewed in the round.

Secondly, it might say that the fine income is very uncertain; we do not know what it will be. One year it could be nothing; the next year it could be billions and billions of pounds. It depends on the behaviour of these social media firms. In fact, if the Bill does its job and they comply with the duties as we want and expect them to, the fines could be zero, because the firms do what they are supposed to. Conversely, if they misbehave, as they have been doing until now, the fines could be enormous. If we rely on hypothecation of these fines as a source for funding victim services, it might be that, in a particular year, we discover that there is no income, because no fines have been levied.

14:45
I was anticipating the Treasury’s response as I made those points to the Committee, but since my right hon. Friend spoke with such eloquence, and given her great experience in Government, I shall put her idea to Treasury colleagues. I will happily revert to her when its response is forthcoming, although I have tried to anticipate a couple of points that the Treasury might make.
Question put and agreed to.
Clause 110 accordingly ordered to stand part of the Bill.
Clause 111
Requirements enforceable by OFCOM against providers of regulated services
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 53, in clause 111, page 94, line 24, at end insert—

“Section 136(7C)

Code of practice on access to data”



This amendment is linked to Amendment 52.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 52, in clause 136, page 118, line 6, at end insert—

“(7A) Following the publication of the report, OFCOM must produce a code of practice on access to data setting out measures with which regulated services are required to comply.

“(7B) The code of practice must set out steps regulated services are required to take to facilitate access to data by persons carrying out independent research.

(7C) Regulated services must comply with any measures in the code of practice.”

This amendment would require Ofcom to produce a code of practice on access to data.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour welcomes this important clause, which lists the enforceable requirements. Failure to comply with those requirements can trigger enforcement action. However, the provisions could go further, so we urge the Minister to consider our important amendments.

Amendments 52 and 53 make it abundantly clear that more access to, and availability of, data and information about systems and processes would improve understanding of the online environment. We cannot rely solely on Ofcom to act as problems arise, when new issues could be spotted early by experts elsewhere. The entire regime depends on how bright a light we can shine into the black box of the tech companies, but only minimal data can be accessed.

The amendments would require Ofcom simply to produce a code of practice on access to data. We have already heard that without independent researchers accessing data on relevant harm, the platforms have no real accountability for how they tackle online harms. Civil society and researchers work hard to identify online harms from limited data sources, which can be taken away by the platforms if they choose. Labour feels that the Bill must require platforms, in a timely manner, to share data with pre-vetted independent researchers and academics. The EU’s Digital Services Act does that, so will the Minister confirm why such a provision is missing from this supposedly world-leading Bill?

Clause 136 gives Ofcom two years to assess whether access to data is required, and it “may”, but not “must”, publish guidance on how its approach to data access might work. The process is far too slow and, ultimately, puts the UK behind the EU, whose legislation makes data access requests possible immediately. Amendment 52 would change the “may” to “must”, and would ultimately require Ofcom to explore how access to data works, not if it should happen in the first place.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Frances Haugen’s evidence highlighted quite how shadowy a significant number of the platforms are. Does the hon. Member agree that that hammers home the need for independent researchers to access as much detail as possible so that we can ensure that the Bill is working?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I agree 100%. The testimony of Frances Haugen, the Facebook whistleblower, highlighted the fact that expert researchers and academics will need to examine the data and look at what is happening behind social media platforms if we are to ensure that the Bill is truly fit for purpose and world leading. That process should be carried out as quickly as possible, and Ofcom must also be encouraged to publish guidance on how access to data will work.

Ultimately, the amendments make a simple point: civil society and researchers should be able to access data, so why will the Minister not let them? The Bill should empower independently verified researchers and civil society to request tech companies’ data. Ofcom should be required to publish guidance as soon as possible—within months, not years—on how data may be accessed. That safety check would hold companies to account and make the internet a safer and less divisive space for everyone.

The process would not be hard or commercially ruinous, as the platforms claim. The EU has already implemented it through its Digital Services Act, which opens up the secrets of tech companies’ data to Governments, academia and civil society in order to protect internet users. If we do not have that data, researchers based in the EU will be ahead of those in the UK. Without more insight to enable policymaking, quality research and harm analysis, regulatory intervention in the UK will stagnate. What is more, without such data, we will not know Instagram’s true impact on teen mental health, nor the reality of violence against women and girls online or the risks to our national security.

We propose amending the Bill to accelerate data sharing provisions while mandating Ofcom to produce guidance on how civil society and researchers can access data, not just on whether they should. As I said, that should happen within months, not years. The provisions should be followed by a code of practice, as outlined in the amendment, to ensure that platforms do not duck and dive in their adherence to transparency requirements. A code of practice would help to standardise data sharing in a way that serves platforms and researchers.

The changes would mean that tech companies can no longer hide in the shadows. As Frances Haugen said of the platforms in her evidence a few weeks ago:

“The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 188, Q320.]

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I understand the shadow Minister’s point. We all heard from Frances Haugen about the social media firms’ well-documented reluctance—to put it politely—to open themselves up to external scrutiny. Making that happen is a shared objective. We have already discussed several times the transparency obligations enshrined in clause 64. Those will have a huge impact in ensuring that the social media firms open up a lot more and become more transparent. That will not be an option; they will be compelled to do that. Ofcom is obliged under clause 64 to publish the guidance around those transparency reports. That is all set in train already, and it will be extremely welcome.

Researchers’ access to information is covered in clause 136, which the amendments seek to amend. As the shadow Minister said, our approach is first to get Ofcom to prepare a report into how that can best be done. There are some non-trivial considerations to do with personal privacy and protecting people’s personal information, and there are questions about who counts as a valid researcher. When just talking about it casually, it might appear obvious who is or is not a valid researcher, but we will need to come up with a proper definition of “valid researcher” and what confidentiality obligations may apply to them.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

This is all sorted in the health environment because of the personal data involved—there is no data more personal than health data—and a trusted and safe environment has been created for researchers to access personal data.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

This data is a little different—the two domains do not directly correspond. In the health area, there has been litigation—an artificial intelligence company is currently engaged in litigation with an NHS hospital trust about a purported breach of patient data rules—so even in that long-established area, there is uncertainty and recent, or perhaps even current, litigation.

We are asking for the report to be done to ensure that those important issues are properly thought through. Once they are, Ofcom has the power under clause 136 to lay down guidance on providing access for independent researchers to do their work.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The Minister has committed to Ofcom being fully resourced to do what it needs to do under the Bill, but he has spoken about time constraints. If Ofcom were to receive 25,000 risk assessments, for example, there simply would not be enough people to go through them. Does he agree that, in cases in which Ofcom is struggling to manage the volume of data and to do the level of assessment required, it may be helpful to augment that work with the use of independent researchers? I am not asking him to commit to that, but to consider the benefits.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes, I would agree that bona fide academic independent researchers do have something to offer and to add in this area. The more we have highly intelligent, experienced and creative people looking at a particular problem or issue, the more likely we are to get a good and well-informed result. They may have perspectives that Ofcom does not. I agree that, in principle, independent researchers can add a great deal, but we need to ensure that we get that set up in a thoughtful and proper way. I understand the desire to get it done quickly, but it is important to take the time to do it not just quickly, but right. It is an area that does not exist already—at the moment, there is no concept of independent researchers getting access to the innards of social media companies’ data vaults—so we need to make sure that it is done in the right way, which is why it is structured as it is. I ask the Committee to stick with the drafting, whereby there will be a report and then Ofcom will have the power. I hope we end up in the same place—well, the same place, but a better place. The process may be slightly slower, but we may also end up in a better place for the consideration and thought that will have to be given.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I appreciate where the Minister is coming from. It seems that he wants to back the amendment, so I am struggling to see why he will not, especially given that the DSA—the EU’s new legislation—is already doing this. We know that the current wording in the Bill is far too woolly. If providers can get away with it, they will, which is why we need to compel them, so that we are able to access this data. We need to put that on the face of the Bill. I wish that we did not have to do so, but we all wish that we did not have to have this legislation in the first place. Unless we put it in the Bill, however, the social media platforms will carry on regardless, and the internet will not be a safe place for children and adults in the UK. That is why I will push amendment 53 to a vote.

Question put, That the amendment be made.

Division 35

Ayes: 3


Labour: 2
Scottish National Party: 1

Noes: 5


Conservative: 5

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 56, in clause 111, page 94, line 24, at end insert—

“Section [Supply chain risk assessment duties]

Supply chain risk assessments”



This amendment is linked to NC11.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss new clause 11—Supply chain risk assessment duties—

“(1) This section sets out duties to assess risks arising in a provider’s supply chain, which apply to all Part 3 services.

(2) A duty to carry out a suitable and sufficient assessment of the risk of harm arising to persons employed by contractors of the provider, where the role of such persons is to moderate content on the service.

(3) A duty to keep the risk assessment up to date.

(4) Where any change is proposed to any contract for the moderation of content on the service, a duty to carry out a further suitable and sufficient risk assessment.

(5) In this section, the ‘risk of harm’ includes any risks arising from—

(a) exposure to harmful content; and

(b) a lack of training, counselling or support.”

This new clause introduces a duty to assess the risk of harm in the supply chain.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

We know that human content moderation is the foundation of all content moderation for major platforms. It is the most important resource for making platforms safe. Relying on AI alone is an ineffective and risky way to moderate content, so platforms have to rely on humans to make judgment calls about context and nuance. I pay tribute to all human moderators for keeping us all safe by having to look at some of the most horrendous and graphic content.

The content moderation reviews carried out by humans, often at impossible speeds, are used to classify content to train algorithms that are then used to automatically moderate exponentially more content. Human moderators can be, and often are, exploited by human resource processes that do not disclose the trauma inherent in the work or properly support them in their dangerous tasks. There is little oversight of this work, as it is done largely through a network of contracted companies that do not disclose their expectations for staff or the support and training provided to them. The contractors are “off book” from the platforms and operate at arm’s length from the services they are supporting, and they are hidden by a chain of unaccountable companies. This creates a hazardous supply chain for the safety processes that platforms claim will protect users in the UK and around the world.

Not all online abuse in the UK happens in English, and women of many cultures and backgrounds in the UK are subject to horrific abuse that is not in the English language. The amendment would make all victim groups in the UK much safer.

To make the internet safer it is imperative to better support human content moderators and regulate the supply chain for their work. It is an obvious but overlooked point that content moderators are users of a platform, but they are also the most vulnerable group of users, as they are the frontline of defence in sifting out harmful content. Their sole job is to watch gruesome, traumatising and harmful content so that we do not have to. The Bill has a duty to protect the most vulnerable users, but it cannot do so if their existence is not even acknowledged.

Many reports in the media have described the lack of clarity about, and the exploitative nature of, the hiring process. Just yesterday, I had the immense privilege of meeting Daniel Motaung, the Facebook whistleblower from Kenya who has described the graphic and horrendous content that he was required to watch to keep us all safe, including live beheadings and children being sexually exploited. Members of the Committee cannot even imagine what that man has had to endure, and I commend him for his bravery in speaking out and standing up for his rights. He has also been extremely exploited by Facebook and the third party company by which he was employed. He was paid the equivalent of $2 an hour for doing that work, whereas human moderators in the US were paid roughly $18 an hour—again, nowhere near enough for what they had to endure.

15:00
In one instance, a Meta content moderator working for a contractor was not informed during his interview that the job would require regular viewing of disturbing content that could lead to mental health problems. After he accepted the role, the contractor asked him to sign a non-disclosure agreement, and only then did they reveal to him the exact type of content that he would be working with daily. That moderator—similar to many moderators in the US, Ireland and other locations—was diagnosed with post-traumatic stress disorder due to his work.
One former counsellor for a content moderator contractor
“witnessed managers repeatedly rejecting content moderators’ requests for breaks, citing productivity pressures.”
They also reported that managers
“regularly rejected counsellors’ requests to let content moderators take ‘wellness breaks’ during the day, because of the impact it would have on productivity.”
Other moderators in the US were allocated just nine minutes a day of “wellness time”, which many needed to use to go to the bathroom. In some cases, the wellness coaches that the contractors provide do not have any clinical psychological counselling credentials, and would recommend “karaoke or painting” after shifts of watching suicides and other traumatic content.
Oversight is required to ensure that human resources processes clearly identify the role and provide content descriptions, as well as information on possible occupational hazards. Currently, the conditions of the work are unregulated and rely on the business relationship between two parties focused on the bottom line. Platforms do not release any due diligence on the employment conditions of those contractors, if they conduct it at all. If there is to be any meaningful oversight of the risks inherent in the content moderation supply chain, it is imperative to mandate transparency around the conditions for content moderators in contracted entities. As long as that relationship is self-regulated, the wellness of human moderators will be at risk. That is why we urge the Minister to support this important amendment and new clause: there is a human element to all this. We urge him to do the right thing.
Kirsty Blackman

I thank the hon. Member for Pontypridd for laying out her case in some detail, though nowhere near the level of detail that these people have to experience while providing moderation. She has given a very good explanation of why she is asking for the amendment and new clause to be included in the Bill. Concerns are consistently being raised, particularly by the Labour party, about the impact on the staff members who have to deal with this content. I do not think the significance of this issue for those individuals can be overstated. If we intend the Bill to have the maximum potential impact and reduce harm to the highest number of people possible, it makes eminent sense to accept this amendment and new clause.

There is a comparison with other areas in which we place similar requirements on other companies. The Government require companies that provide annual reports to undertake an assessment in those reports of whether their supply chain uses child labour or unpaid labour, or whether their factories are safe for people to work in—if they are making clothes, for example. It would not be an overly onerous request if we were to widen those requirements to take account of the fact that so many of these social media companies are subjecting individuals to trauma that results in them experiencing PTSD and having to go through a lengthy recovery process, if they ever recover. We have comparable legislation, and that is not too much for us to ask. Unpaid labour, or people being paid very little in other countries, is not that different from what social media companies are requiring of their moderators, particularly those working outside the UK and the US in countries where there are less stringent rules on working conditions. I cannot see a reason for the Minister to reject the provision of this additional safety for employees who are doing an incredibly important job that we need them to be doing, in circumstances where their employer is not taking any account of their wellbeing.

Barbara Keeley

As my hon. Friend the Member for Pontypridd has pointed out, there is little or no transparency about one of the most critical ways in which platforms tackle harms. Human moderators are on the frontline of protecting children and adults from harmful content. They must be well resourced, trained and supported in order to fulfil that function, or the success of the Bill’s aims will be severely undermined.

I find it shocking that platforms offer so little data on human moderation, either because they refuse to publish it or because they do not know it. For example, in evidence to the Home Affairs Committee, William McCants from YouTube could not give precise statistics for its moderator team after being given six days’ notice to find the figure, because many moderators were employed or operated under third-party auspices. For YouTube’s global counter-terrorism lead to be unaware of the detail of how the platform is protecting its users from illegal content is shocking, but it is not uncommon.

In evidence to this Committee, Meta’s Richard Earley was asked how many of Meta’s 40,000 human moderators were outsourced to remove illegal content and disinformation from the platform. My hon. Friend the Member for Pontypridd said:

“You do not have the figures, so you cannot tell me.”

Richard Earley replied:

“I haven’t, no, but I will be happy to let you know afterwards in our written submission.”

Today, Meta submitted its written evidence to the Committee. It included no reference to human content moderators, despite its promise.

The account that my hon. Friend gave just now shows why new clause 11 is so necessary. Meta’s representative told this Committee in evidence:

“Everyone who is involved in reviewing content at Meta goes through an extremely lengthy training process that lasts multiple weeks, covering not just our community standards in total but also the specific area they are focusing on, such as violence and incitement.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 45, Q76.]

But now we know from whistleblowers such as Daniel, whose case my hon. Friend described, that that is untrue. What is happening to Daniel and the other human moderators is deeply concerning. There are powerful examples of the devastating emotional impact that can occur because human moderators are not monitored, trained and supported.

There are risks of platforms shirking responsibility when they outsource moderation to third parties. Stakeholders have raised concerns that a regulated company could argue that an element of its service is not in the scope of the regulator because it is part of a supply chain. We will return to that issue when we debate new clause 13, which seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties.

Platforms, in particular those supporting user-to-user generated content, employ those services from third parties. Yesterday, I met Danny Stone, the chief executive of the Antisemitism Policy Trust, who described the problem of antisemitic GIFs. Twitter would say, “We don’t supply GIFs. The responsibility is with GIPHY.” GIPHY, as part of the supply chain, would say, “We are not a user-to-user platform.” If someone searched Google for antisemitic GIFs, the results would contain multiple entries saying, “Antisemitic GIFs—get the best GIFs on GIPHY. Explore and share the best antisemitic GIFs.”

One can well imagine a scenario in which a company captured by the regulatory regime established by the Bill argues that an element of its service is outside the regulator's ambit because it forms part of a supply chain—presented by, but not necessarily the responsibility of, the regulated service. The contracted element, which I have just described by reference to Twitter and GIPHY and which is supported by an entirely separate company, would argue that it is providing a business-to-business service: not user-generated content, but content designed and delivered at arm's length and supplied to the user-to-user service to deploy for its users.

I suggest that dealing with this issue would involve a lengthy, costly and unhelpful legal process, during which complex contract law was invoked and systems were not being effectively regulated. The same may apply in relation to moderators and what my hon. Friend the Member for Pontypridd described; there are a number of lawsuits involved in Daniel's case.

We recognise in UK legislation that there are concerns and issues surrounding supply chains. Under the Bribery Act 2010, for example, a company is liable if anyone performing services for or on the company’s behalf is found culpable for specific actions. These issues on supply chain liability must be resolved if the Bill is to fulfil its aim of protecting adults and children from harm.

Chris Philp

May I first say a brief word about clause stand part, Sir Roger?

The Chair

Yes.

Chris Philp

Thank you. Clause 111 sets out and defines the “enforceable requirements” in this chapter—the duties that Ofcom is able to enforce against. Those are set out clearly in the table at subsection (2) and the requirements listed in subsection (3).

The amendment speaks to a different topic. It seeks to impose or police standards for people employed as subcontractors of the various companies that are in scope of the Bill, for example people that Facebook contracts; the shadow Minister, the hon. Member for Pontypridd, gave the example of the gentleman from Kenya she met yesterday. I understand the point she makes and I accept that there are people in those supply chains who are not well treated, who suffer PTSD and who have to do extraordinarily difficult tasks. I do not dispute at all the problems she has referenced. However, the Government do not feel that the Bill is the right place to address those issues, for a couple of reasons.

First, in relation to people who are employed in the UK, we have existing UK employment and health and safety laws. We do not want to duplicate or cut across those. I realise that they relate only to people employed in the UK, but if we passed the amendment as drafted, it would apply to people in the UK as much as it would apply to people in Kenya.

Secondly, the amendment would effectively require Ofcom to start paying regard to employment conditions in Kenya, among other places—indeed, potentially any country in the world—and it is fair to say that that sits substantially outside Ofcom’s area of expertise as a telecoms and communications regulator. That is the second reason why the amendment is problematic.

The third reason is more one of principle. The purpose of the Bill is to keep users safe online. While I understand the reasonable premise for the amendment, it seeks essentially to regulate working conditions in potentially any country in the world. I am just not sure that it is appropriate for an online safety Bill to seek to regulate global working conditions. Facebook, a US company, was referenced, but only 10% of its activity—very roughly speaking—is in the UK. The shadow Minister gave the example of Kenyan subcontractors. Compelling though her case was, I am not sure it is appropriate that UK legislation on online safety should seek to regulate the Kenyan subcontractor of a United States company.

The Government of Kenya can set their own employment regulations and President Biden’s Government can impose obligations on American companies. For us, via a UK online safety Bill, to seek to regulate working conditions in Kenya goes a long way beyond the bounds of what we are trying to do, particularly when we take into account that Ofcom is a telecommunications and communications regulator. To expect it to regulate working conditions anywhere in the world is asking quite a lot.

I accept that a real issue is being raised. There is definitely a problem, and the shadow Minister and the hon. Member for Aberdeen North are right to raise it, but for the three principal reasons that I set out, I suggest that the Bill is not the place to address these important issues.

Alex Davies-Jones

The Minister mentions workers in the UK. I am a proud member of the Labour party and a proud trade unionist; we have strong protections for workers in the UK. There is a reason why Facebook and some of these other platforms, which are incredibly exploitative, will not have human moderators in the UK looking at this content: because they know they would be compelled to treat them a hell of a lot better than they do the workers around the world that they are exploiting, as they do in Kenya, Dublin and the US.

To me, the amendment speaks to the heart of the Bill. This is an online safety Bill that aims to keep the most vulnerable users safe online. People around the world are looking at content that is created here in the UK and having to moderate it; we are effectively shipping our trash to other countries and other people to deal with it. That is not acceptable. We have the opportunity here to keep everybody safe from looking at this incredibly harmful content. We have a duty to protect those who are looking at content created in the UK in order to keep us safe. We cannot let those people down. The amendment and new clause 11 give us the opportunity to do that. We want to make the Bill world leading. We want the UK to stand up for those people. I urge the Minister to do the right thing and back the amendment.

15:14
Barbara Keeley

The Minister has not commented on the problem I raised of the contracted firm in the supply chain not being covered by the regulations under the Bill—the problem of Twitter and the GIFs, whereby the GIFs exist and are used on Twitter, but Twitter says, “We’re not responsible for them; it’s that firm over there.” That is the same thing, and new clause 11 would cover both.

Chris Philp

I am answering slightly off the cuff, but I think the point the hon. Lady is raising—about where some potentially offensive or illegal content is produced on one service and then propagated or made available by another—is one we debated a few days ago. I think the hon. Member for Aberdeen North raised that question, last week or possibly the week before. I cannot immediately turn to the relevant clause—it will be found in Hansard, in our earlier discussions on the opening clauses of the Bill—but I think the Bill makes it clear that where content is accessed through another platform, which is the example that the hon. Member for Worsley and Eccles South just gave, the platform through which the content is made available is within the scope of the Bill.

Question put, That the amendment be made.

Division 36

Ayes: 3


Labour: 2
Scottish National Party: 1

Noes: 4


Conservative: 4

Clause 111 ordered to stand part of the Bill.
Clause 112
Confirmation decisions
Question proposed, That the clause stand part of the Bill.
The Chair

With this it will be convenient to discuss clauses 113 to 117 stand part.

Alex Davies-Jones

We support clause 112, which gives Ofcom the power to issue a confirmation decision if, having followed the required process—for example, in clause 110—its final decision is that a regulated service has breached an enforceable requirement. As we know, this will set out Ofcom’s final decision and explain whether Ofcom requires the recipient of the notice to take any specific steps and/or pay a financial penalty. Labour believes that this level of scrutiny and accountability is vital to an Online Safety Bill that is truly fit for purpose, and we support clause 112 in its entirety.

We also support the principles of clause 113, which outlines the steps that a person may be required to take either to come into compliance or to remedy the breach that has been committed. Subsection (5) in particular is vital, as it outlines how Ofcom can require immediate action when the breach has involved an information duty. We hope this will be a positive step forward in ensuring true accountability of big tech companies, so we are happy to support the clause unamended.

It is right and proper that Ofcom has powers when a regulated provider has failed to carry out an illegal content or children’s risk assessment properly or at all, and when it has identified a risk of serious harm that the regulated provider is not effectively mitigating or managing. As we have repeatedly heard, risk assessments are the very backbone of the Bill, so it is right and proper that Ofcom is able to force a company to take measures to comply in the event of previously failing to act.

Children’s access assessments, which are covered by clause 115, are a crucial component of the Bill. Where Ofcom finds that a regulated provider has failed to properly carry out an assessment, it is vital that it has the power and legislative standing to force the company to do more. We also appreciate the inclusion of a three-month timeframe, which would ensure that, in the event of a provider re-doing the assessment, it would at least be completed within a specific—and small—timeframe.

While we recognise that the use of proactive technologies may come with small issues, Labour ultimately feels that clause 116 is balanced and fair, as it establishes that Ofcom may require the use of proactive technology only on content that is communicated publicly. It is fair that content in the public domain is subject to those important safety checks. It is also right that under subsection (7), Ofcom may set a requirement forcing services to review the kind of technology being used. That is a welcome step that will ensure that platforms face a level of scrutiny that has certainly been missing so far.

Labour welcomes and is pleased to support clause 117, which allows Ofcom to impose financial penalties in its confirmation decision. That is something that Labour has long called for, as we believe that financial penalties of this nature will go some way towards improving best practice in the online space and deterring bad actors more widely.

Chris Philp

The shadow Minister has set out the provisions in the clauses, and I am grateful for her support. In essence, clauses 112 to 117 set out the processes around confirmation decisions and make provisions to ensure that those are effective and can be operated in a reasonable and fair way. The clauses speak largely for themselves, so I am not sure that I have anything substantive to add.

Question put and agreed to.

Clause 112 accordingly ordered to stand part of the Bill.

Clauses 113 to 117 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Dean Russell.)

15:20
Adjourned till Tuesday 21 June at twenty-five minutes past Nine o’clock.
Written evidence reported to the House
OSB77 Twitter

Online Safety Bill (Thirteenth sitting)

Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022
The Committee consisted of the following Members:
Chairs: Sir Roger Gale, †Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 21 June 2022
(Morning).
[Christina Rees in the Chair]
Online Safety Bill
09:25
The Chair

We are now sitting in public and proceedings are being broadcast. Please switch electronic devices to silent. Tea and coffee are not allowed during the sitting. I understand the Government wish to move a motion to amend the programme order agreed by the Committee, so that the Committee’s session at 2pm on Thursday will not take place.

Steve Double (St Austell and Newquay) (Con)

I beg to move,

That the Order of the Committee of 24 May 2022, as amended on 26 May 2022, be further amended, in paragraph (1)(h), by leaving out “and 2.00pm”.

In the light of the rail strike on Thursday, I am grateful to the Opposition Front Bench for agreeing to the suggestion that the Committee does not sit that afternoon.

The Chair

Because this motion has not been agreed by the programming sub-committee, it may only be proceeded with if everyone is content. Does anyone object to the motion?

Question put and agreed to.

Clause 118

Penalty for failure to comply with confirmation decision

The Chair

We now come to amendment 135 to clause 118, with which it will be convenient to discuss amendments 136 to 138. All these amendments have been tabled by Carla Lockhart, who is not a member of the Committee. Would any Member like to move the amendment? I see no Member wishing to do that.

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss the following:

Clause 119 stand part.

Government amendments 154 to 157.

Clauses 120 and 121 stand part.

Alex Davies-Jones (Pontypridd) (Lab)

Bore da, Ms Rees. It is, as ever, a pleasure to serve under your chairship. I rise to speak to clauses 118 to 121 and Government amendments 154 to 157.

As we all know, clause 118 is important and allows Ofcom to impose a financial penalty on a person who fails to complete steps that have been required by Ofcom in a confirmation decision. This is absolutely vital if we are to guarantee that regulated platforms take seriously their responsibilities in keeping us all safe online. We support the use of fines. They are key to overall behavioural change, particularly in the context of personal liability. We welcome clause 118, which outlines the steps Ofcom can take in what we hope will become a powerful deterrent.

Labour also welcomes clause 119. It is vital that Ofcom has these important powers to impose a financial penalty on a person who fails to comply with a notice that requires technology to be implemented to identify and deal with content relating to terrorism and child sexual exploitation and abuse on their service. These are priority harms and the more that can be done to protect us on these two points the better.

Government amendments 155 and 157 ensure that Ofcom has the power to impose a monetary penalty on a provider of a service who fails to pay a fee that it is required to pay under new schedule 2. We see these amendments as crucial in giving Ofcom the important powers it needs to be an effective regulator, which is something we all require. We have some specific observations around new schedule 2, but I will save those until we consider that schedule. For now, we support these amendments and I look forward to outlining our thoughts shortly.

We support clause 120, which allows Ofcom to give a penalty notice to a provider of a regulated service who does not pay the fee due to Ofcom in full. This is a vital provision that also ensures that Ofcom's process to impose a penalty can progress only when it has given due notice to the provider and once the provider has had a fair opportunity to make representations to Ofcom. This is a fair approach and is central to the Bill, which is why we have not sought to amend the clause.

Finally, we support clause 121, which ensures that Ofcom must state the reasons why it is imposing a penalty, the amount of the penalty and any aggravating or mitigating factors. Ofcom must also state when the penalty must be paid. It is imperative that when issuing a notice Ofcom is incentivised to publish information about the amount, aggravating or mitigating factors and when the penalty must be paid. We support this important clause and have not sought to amend.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure to serve under your chairmanship once again, Ms Rees, and I congratulate Committee members on evading this morning’s strike action.

I am delighted that the shadow Minister supports the intent behind these clauses, and I will not speak at great length given the unanimity on this topic. As she said, clause 118 allows Ofcom to impose a financial penalty for failure to take specified steps by a deadline set by Ofcom. The maximum penalty that can be imposed is the greater of £18 million or 10% of qualifying worldwide revenue. In the case of large companies, it is likely to be a much larger amount than £18 million.
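
Purely as an illustration—the revenue figures below are hypothetical, and "qualifying worldwide revenue" has its own definition in the Bill—the maximum penalty described above amounts to a simple "greater of" calculation:

```python
# Illustrative sketch only: the maximum penalty is the greater of a fixed
# £18 million or 10% of the provider's qualifying worldwide revenue.
FIXED_CAP_GBP = 18_000_000
REVENUE_SHARE = 0.10

def maximum_penalty(qualifying_worldwide_revenue_gbp: float) -> float:
    """Return the ceiling on a penalty for a given (hypothetical) revenue figure."""
    return max(FIXED_CAP_GBP, REVENUE_SHARE * qualifying_worldwide_revenue_gbp)

print(maximum_penalty(50_000_000))      # 18,000,000 – the £18m floor applies
print(maximum_penalty(5_000_000_000))   # 500,000,000 – 10% of revenue dominates
```

For a smaller provider the fixed £18 million figure is the binding cap; for the largest firms the 10% figure is, as the Minister notes, considerably higher.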

Clause 119 enables Ofcom to impose financial penalties if the recipient of a section 103 notice does not comply by the deadline. It is very important to ensure that section 103 has proper teeth. Government amendments 154 to 157 make changes that allow Ofcom to recover not only the cost of running the service once the Bill comes into force and into the future but also the preparatory cost of setting up for the Bill to come into force.

As previously discussed, £88 million of funding is being provided to Ofcom in this financial year and next. We believe that something like £20 million of costs that predate these financial years have been funded as well. That adds up to around £108 million. However, the amount that Ofcom recovers will be the actual cost incurred. The figure I provided is simply an indicative estimate. The actual figure would be based on the real costs, which Ofcom would be able to recoup under these measures. That means that the taxpayer—our constituents—will not bear any of the costs, including the set-up and preparatory cost. This is an equitable and fair change to the Bill.
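
Again, as an indicative sketch only: the roughly £108 million mentioned above is simply the sum of the funding figures quoted, and the amount actually recouped would be whatever costs Ofcom really incurs.

```python
# Indicative figures quoted in the debate; the recoverable amount would be
# based on Ofcom's actual costs, not on this estimate.
funding_this_and_next_year_gbp = 88_000_000   # funding provided this financial year and next
earlier_preparatory_costs_gbp = 20_000_000    # estimated costs predating those years

indicative_recoverable_gbp = funding_this_and_next_year_gbp + earlier_preparatory_costs_gbp
print(f"{indicative_recoverable_gbp:,}")      # 108,000,000
```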

Clause 120 sets out that some regulated providers will be required to pay a regulatory fee to Ofcom, as set out in clause 71. Clause 120 allows Ofcom to impose a financial penalty if a regulated provider does not pay its fee by the deadline it sets. Finally, clause 121 sets out the information that needs to be included in these penalty notices issued by Ofcom.

Kirsty Blackman (Aberdeen North) (SNP)

I have questions about the management of the fees and the recovery of the preparatory cost. Does the Minister expect that the initial fees will be higher as a result of having to recoup the preparatory cost and will then reduce? How quickly will the preparatory cost be recovered? Will Ofcom recover it quickly or over a longer period of time?

Chris Philp

The Bill provides a power for Ofcom to recover those costs. It does not specify over what time period. I do not think they will be recouped over a period of years. Ofcom can simply recoup the costs in a single hit. I would imagine that Ofcom would seek to recover these costs pretty quickly after receiving these powers. The £108 million is an estimate. The actual figure may be different once the reconciliation and accounting is done. It sounds like a lot of money, but it is spread among a number of very large social media firms. It is not a large amount of money for them in the context of their income, so I would expect that recouping to be done on an expeditious basis—not spread over a number of years. That is my expectation.

Question put and agreed to.

Clause 118 accordingly ordered to stand part of the Bill.

Clause 119 ordered to stand part of the Bill.

Clause 120

Non-payment of fee

Amendments made: 154, in clause 120, page 102, line 20, after “71” insert:

“or Schedule (Recovery of OFCOM’s initial costs)”.

This amendment, and Amendments 155 to 157, ensure that Ofcom have the power to impose a monetary penalty on a provider of a service who fails to pay a fee that they are required to pay under NS2.

Amendment 155, in clause 120, page 102, line 21, leave out “that section” and insert “Part 6”.

Amendment 156, in clause 120, page 102, line 26, after “71” insert—

“or Schedule (Recovery of OFCOM’s initial costs)”

Amendment 157, in clause 120, page 103, line 12, at end insert—

“or Schedule (Recovery of OFCOM’s initial costs)”.—(Chris Philp.)

Clause 120, as amended, ordered to stand part of the Bill.

Clause 121 ordered to stand part of the Bill.

Clause 122

Amount of penalties etc

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss:

Government amendment 158.

That schedule 12 be the Twelfth schedule to the Bill.

Alex Davies-Jones

Labour supports clause 122 and schedule 12, which set out in detail the financial penalties that Ofcom may impose, including the maximum penalty that can be imposed. Labour has long supported financial penalties for those failing to comply with the duties in the Bill. We firmly believe that tough action is needed on online safety, but we feel the sanctions should go further and that there should be criminal liability for offences beyond just information-related failures. We welcome clause 122 and schedule 12. It is vital that Ofcom is also required to produce guidelines around how it will determine penalty amounts. Consistency across the board is vital, so we feel this is a positive step forward and have not sought to amend the clause.

Paragraph 8 of schedule 12 requires monetary penalties to be paid into the consolidated fund. There is no change to that requirement, but it now appears in new clause 43, together with the requirement to pay fees charged under new schedule 2 into the consolidated fund. We therefore support the amendments.

Chris Philp

I have nothing further to add on these amendments. The shadow Minister has covered them, so I will not detain the Committee further.

Question put and agreed to.

Clause 122 accordingly ordered to stand part of the Bill.

Schedule 12

Penalties imposed by OFCOM under Chapter 6 of Part 7

Amendment made: 158, in schedule 12, page 206, line 43, leave out paragraph 8.—(Chris Philp.)

Paragraph 8 of Schedule 12 requires monetary penalties to be paid into the Consolidated Fund. There is no change to that requirement, but it now appears in NC43 together with the requirement to pay fees charged under NS2 into the Consolidated Fund.

Schedule 12, as amended, agreed to.

Clause 123

Service restriction orders

Alex Davies-Jones

I beg to move amendment 50, in clause 123, page 106, line 36, at end insert—

“(9A) OFCOM may apply to the court for service restriction orders against multiple regulated services with one application, through the use of a schedule of relevant services which includes all the information required by subsection (5).”

This amendment would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for, and/or appeal through the courts against any, orders to block access or support services.

The Chair

With this it will be convenient to discuss amendment 51, in clause 125, page 110, line 20, at end insert—

“(7A) OFCOM may apply to the court for service restriction orders against multiple regulated services with one application, through the use of a schedule of relevant services which includes all the information required by subsection (6).”

This amendment would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for, and/or appeal through the courts against any, orders to block access or support services.

Alex Davies-Jones

With your permission, Ms Rees, I will speak to clause stand part and clauses 124 to 127 at the same time. Labour supports clause 123, which outlines the powers that Ofcom will have when applying to the court for business disruption measures. Business disruption measures are court orders that require third parties to withdraw services or block access to non-compliant regulated services. It is right that Ofcom has these tools at its disposal, particularly if it is going to be able to regulate effectively against the most serious instances of user harm. However, the Bill will be an ineffective regime if Ofcom is forced to apply for separate court orders when trying to protect people across the board from the same harms. We have already waited too long for change. Labour is committed to giving Ofcom the powers to take action, where necessary, as quickly as possible. That is why we have tabled amendments 50 and 51, which we feel will go some way in tackling these issues.

Amendment 50 would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for—and/or appeal through the courts against any—orders to block access or support services. The Bill currently requires Ofcom to seek a separate court order for each service against which it wishes to take enforcement action in the form of blocking access or services. That is the only effective mechanism for overseas websites. UK-based services will be subject to enforcement notices and financial penalties that can be enforced without having to go to court. That creates a disadvantage for UK sites, which can be more easily enforced against.

Given that there are 4 million to 5 million pornographic websites, for example, the requirement for separate court orders will prevent Ofcom from taking action at scale and creating a level playing field for all adult sites. Under the Bill, Ofcom must take action against each offending website or social media company individually. While we acknowledge that the Government have stated that enforcement action can be taken against multiple offending content providers, in our opinion that is not made clear in the Bill.

Moreover, we are concerned that some pornography websites would seek to avoid the Bill’s requirements by changing their domain name—domain hopping. That was threatened last year when Germany moved to issue a blocking order against major providers of internet pornography. That is why Ofcom must be granted clear enforcement powers to take swift action against multiple websites and content providers in one court action or order.

This group of amendments would also provide clarity and ease of enforcement for internet service providers, which will be expected to enforce court orders. Labour wants the Bill to be genuinely effective, and amendments 50 and 51 could ensure that Ofcom has the tools available to it to take action at pace. We urge the Minister to accept these small concessions, which could have a hugely positive impact.

Amendment 51 would give Ofcom the ability to take action against a schedule of non-compliant sites, while preserving the right of those sites to oppose an application for an order to block access or support services, or to appeal through the courts against any such order.

It will come as no surprise that Labour supports clause 124, which sets out the circumstances in which Ofcom may apply to the courts for an interim service restriction order. We particularly support the need for Ofcom to be able to take action when time is not on its side, or where, put plainly, the level of harm being caused means that it would be inappropriate to wait for a definite failure before taking action.

However, we hope that caution is exercised if Ofcom ever needs to consider such an interim order; we must, of course, get the balance right in our approach to internet regulation more widely. I would therefore be grateful if the Minister could outline his understanding of the specifics of when these orders may be applied. More broadly, Labour agrees that Ofcom should be given the power to act when time demands it, so we have not sought to amend clause 124 at this stage.

Labour also supports the need for Ofcom to have the power to apply to the courts for an access restriction order, as outlined in clause 125. It is vital that Ofcom is given the power to prevent, restrict or deter individuals in the UK from accessing a service from a non-compliant provider. We welcome the specific provisions on access via internet service providers and app stores. We all know from Frances Haugen’s testimony that harmful material can often be easily buried, so it is right and proper that those are considered as “access facilities” under the clause. Ultimately, we support the intentions of clause 125 and, again, have not sought to amend it at this stage.

We also support clause 126, which sets out the circumstances in which Ofcom may apply to the courts for an interim access restriction order. I will not repeat myself: for the reasons I have already outlined, it is key that Ofcom has sufficient powers to act, particularly on occasions when it is inappropriate to wait for a failure to be established.

We welcome clause 127, which clarifies how Ofcom’s enforcement powers can interact. We particularly welcome clarification that, where Ofcom exercises its power to apply to the courts for a business disruption order under clauses 123 to 126, it is not precluded from taking action under its other enforcement powers. As we have repeatedly reiterated, we welcome Ofcom’s having sufficient power to reasonably bring about positive change and increase safety measures online. That is why we have not sought to amend clause 127.

Kirsty Blackman

Thank you for chairing this morning’s sitting, Ms Rees.

I agree with the hon. Member for Pontypridd that these clauses are necessary and important, but I also agree that the amendments are important. It seems like this is a kind of tidying-up exercise, to give Ofcom the ability to act in a way that will make its operation smoother. We all want this legislation to work. This is not an attempt to break this legislation—to be fair, none of our amendments have been—but an attempt to make things work better.

Amendments 50 and 51 are fairly similar to the one that the National Society for the Prevention of Cruelty to Children proposed to clause 103. They would ensure that Ofcom could take action against a group of sites, particularly if they were facing the same kind of issues, they had the same kind of functionality, or the same kind of concerns were being raised about them.

09:45
If the Minister does not intend to accept amendments 50 and 51, will he at least ensure that if Ofcom comes to the Secretary of State and says, “Look, we’re really struggling because we’ve got to do all of these applications individually,” there is some power or ability for the Secretary of State or Parliament to amend this legislation to ensure that Ofcom’s ability to act is not hampered? This is not about Ofcom bringing cases against people who should not have cases brought against them; it is just about making the paperwork easier for Ofcom. These clauses may not allow delegated powers, but will the Minister commit to considering the issue at a future stage? Obviously, the Bill will go to the other place afterwards. If the Minister were to consider including the provision at a future point, that would make the legislation better, and it would make it easier for Ofcom to operate. We do not want Ofcom to spend money and time unnecessarily; we want it to focus on making a big difference. If it is mired in unnecessary extra paperwork, its ability to do so will be hampered.
The Chair

If no other Members wish to speak to amendments 50 and 51 and clauses 123 to 127, I will call the Minister to respond.

Chris Philp

Let me start with amendments 50 and 51, which were introduced by the shadow Minister and supported by the SNP spokesperson. The Government recognise the valid intent behind the amendments, namely to make sure that applications can be streamlined and done quickly, and that Ofcom can make bulk applications if large numbers of service providers violate the new duties to the extent that interim service restriction orders or access restriction orders become necessary.

We want a streamlined process, and we want Ofcom to deal efficiently with it, including, if necessary, by making bulk applications to the court. Thankfully, however, procedures under the existing civil procedure rules already allow so-called multi-party claims to be made. Those claims permit any number of claimants, any number of defendants or respondents and any number of claims to be covered in a single form. The overriding objective of the CPR is that cases are dealt with justly and proportionately. Under the existing civil procedure rules, Ofcom can already make bulk applications to deal with very large numbers of non-compliant websites and service providers in one go. We completely agree with the intent behind the amendments, but their content is already covered by the CPR.

It is worth saying that the business disruption measures—the access restriction orders and the service restriction orders—are intended to be a last resort. They effectively amount to unplugging the websites from the internet so that people in the United Kingdom cannot access them and so that supporting services, such as payment services, do not support them. The measures are quite drastic, although necessary and important, because we do not want companies and social media firms ignoring our legislation. It is important that we have strong measures, but they are last resorts. We would expect Ofcom to use them only when it has taken reasonable steps to enforce compliance using other means.

If a provider outside the UK ignores letters and fines, these measures are the only option available. As the shadow Minister, the hon. Member for Pontypridd, mentioned, some pornography providers probably have no intention of even attempting to comply with our regulations; they are probably not based in the UK, they are never going to pay the fine and they are probably incorporated in some obscure, offshore jurisdiction. Ofcom will need to use these powers in such circumstances, possibly on a bulk scale—I am interested in her comment that that is what the German authorities had to do—but the powers already exist in the CPR.

It is also worth saying that in its application to the courts, Ofcom must set out the information required in clauses 123(5) and 125(3), so evidence that backs up the claim can be submitted, but that does not stop Ofcom doing this on a bulk basis and hitting multiple different companies in one go. Because the matter is already covered in the CPR, I ask the shadow Minister to withdraw the amendment.

Alex Davies-Jones

I am interested to know whether the Minister has anything to add about the other clauses. I am happy to give way to him.

Chris Philp

I thank the shadow Minister for giving way. I do not have too much to say on the other clauses, because she has introduced them, but in my enthusiasm for explaining the civil procedure rules I neglected to respond to her question about the interim orders in clauses 124 and 126.

The hon. Lady asked what criteria have to be met for these interim orders to be made. The conditions for clause 124 are set out in subsections (3) and (4) of that clause, which states, first, that it has to be

“likely that the…service is failing to comply with an enforceable requirement”—

so it is likely that there has been a breach—and, secondly, that

“the level of risk of harm to individuals in the United Kingdom…and the nature and severity of that harm, are such that it would not be appropriate to wait to establish the failure before applying for the order.”

Similar language in clause 124(4) applies to breaches of section 103.

Essentially, if it is likely that there has been a breach, and if the resulting harm is urgent and severe—for example, if children are at risk—we would expect these interim orders to be used as emergency measures to prevent very severe harm. I hope that answers the shadow Minister’s question. She is very kind, as is the Chair, to allow such a long intervention.

The Chair

In a Bill Committee, a Member can speak more than once. However, your intervention resolved the situation amicably, Minister.

Alex Davies-Jones

I welcome the Minister’s comments about clauses 124 and 126 in answer to my questions, and also his comments about amendments 50 and 51, clarifying the CPR. If the legislation is truly to have any impact, it must fundamentally give clarity to service users, providers and regulators. That is why we seek to remove any ambiguity and to put these important measures in the Bill, and it is why I will press amendment 50 to a Division.

Question put, That the amendment be made.

Division 37

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 9


Conservative: 9

Clause 123 ordered to stand part of the Bill.
Clause 124 ordered to stand part of the Bill.
Clause 125
Access restriction orders
Amendment proposed: 51, in clause 125, page 110, line 20, at end insert—
“(7A) OFCOM may apply to the court for service restriction orders against multiple regulated services with one application, through the use of a schedule of relevant services which includes all the information required by subsection (6).”—(Alex Davies-Jones.)
This amendment would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for, and/or appeal through the courts against any, orders to block access or support services.
Question put, That the amendment be made.

Division 38

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 9


Conservative: 9

Clause 125 ordered to stand part of the Bill.
Clauses 126 and 127 ordered to stand part of the Bill.
Clause 128
Publication of details of enforcement action
Question proposed, That the clause stand part of the Bill.
Alex Davies-Jones

The Minister and his Back Benchers will, I am sure, be tired of our calls for more transparency, but I will be kind to him and confirm that Labour welcomes the provisions in clause 128.

We believe that it is vital that, once Ofcom has followed the process outlined in clause 110 when issuing a confirmation decision outlining its final decision, that is made public. We particularly welcome provisions to ensure that when a confirmation decision is issued, Ofcom will be obliged to publish the identity of the person to whom the decision was sent, details of the failure to which the decision relates, and details relating to Ofcom’s response.

Indeed, the transparency goes further, as Ofcom will be obliged to publish details of when a penalty notice has been issued in many more areas: when a person fails to comply with a confirmation decision; when a person fails to comply with a notice to deal with terrorism content or child sexual exploitation and abuse content, or both; and when there has been a failure to pay a fee in full. That is welcome indeed. Labour just wishes that the Minister had committed to the same level of transparency on the duties in the Bill to keep us safe in the first place. That said, transparency on enforcement is a positive step forward, so we have not sought to amend the clause at this stage.

Chris Philp

I am grateful for the shadow Minister’s support. I have nothing substantive to add, other than to point to the transparency reporting obligation in clause 64, which we have debated.

Question put and agreed to.

Clause 128 accordingly ordered to stand part of the Bill.

Clause 129

OFCOM’s guidance about enforcement action

Chris Philp

I beg to move amendment 7, in clause 129, page 114, line 3, at end insert—

“(aa) the Information Commissioner, and”.

This amendment ensures that before Ofcom produce guidance about their exercise of their enforcement powers, they must consult the Information Commissioner.

If I may, in the interest of speed and convenience, I will speak to clause stand part as well.

The clause requires Ofcom to issue guidance setting out how it will use its enforcement powers in the round. That guidance will ensure that the enforcement process is transparent, it will cover the general principles and processes of the enforcement regime, and it is intended to help regulated providers and other stakeholders to understand how Ofcom will exercise its powers.

10:00
Government amendment 7 seeks to make it mandatory for Ofcom to consult the Information Commissioner’s Office before producing guidance on how Ofcom will exercise its enforcement powers in relation to the enforceable requirements in the Bill. That is important because the Information Commissioner’s Office has a significant interest in matters of data protection and privacy, and we want to make sure its opinion is properly taken into account before changes are made. We therefore think it is appropriate that the Information Commissioner’s Office is consulted in such circumstances.
Dan Carden (Liverpool, Walton) (Lab)

Clause 129(4) states that the Secretary of State will be consulted in the process. What would be the Secretary of State’s powers in relation to that? Would she be able to overrule Ofcom in the writing of its guidance?

Chris Philp

The hon. Member asks for my assistance in interpreting legislative language. Generally speaking, “consult” means what it suggests. Ofcom will consult the Secretary of State, as it will consult the ICO, to ascertain the Secretary of State’s opinion, but Ofcom is not bound by that opinion. Unlike the power in a previous clause—I believe it was clause 40—where the Secretary of State could issue a direct instruction to Ofcom on certain matters, here we are talking simply about consulting. When the Secretary of State expresses an opinion in response to the consultation, it is just that—an opinion. I would not expect it to be binding on Ofcom, but I would expect Ofcom to pay proper attention to the views of important stakeholders, which in this case include both the Secretary of State and the ICO. I hope that gives the hon. Member the clarification he was seeking.

Alex Davies-Jones

As we know, clause 129 requires Ofcom to publish guidance about how it will use its enforcement powers. It is right that regulated providers and other stakeholders have a full understanding of how, and in what circumstances, Ofcom will have the legislative power to exercise this suite of enforcement powers. We also welcome Government amendment 7, which will ensure that the Information Commissioner—a key and, importantly, independent authority—is included in the consultation before guidance is produced.

As we have just heard, however, the clause sets out that Secretary of State must be consulted before Ofcom produces guidance, including revised or replacement guidance, about how it will use its enforcement powers. We feel that that involves the Secretary of State far too closely in the enforcement of the regime. The Government should be several steps away from being involved, and the clause seriously undermines Ofcom’s independence—the importance of which we have been keen to stress as the Bill progresses, and on which Conservative Back Benchers have shared our view—so we cannot support the clause.

Chris Philp

I repeat the point I made to the hon. Member for Liverpool, Walton a moment ago. This is simply an obligation to consult. The clause gives the Secretary of State an opportunity to offer an opinion, but it is just that—an opinion. It is not binding on Ofcom, which may take that opinion into account or not at its discretion. This provision sits alongside the requirement to consult the Information Commissioner’s Office. I respectfully disagree with the suggestion that it represents unwarranted and inappropriate interference in the operation of a regulator. Consultation between organs of state is appropriate and sensible, but in this case it does not fetter Ofcom’s ability to act at its own discretion. I respectfully do not agree with the shadow Minister’s analysis.

Kirsty Blackman

Apologies, Ms Rees, for coming in a bit late on this, but I was not aware of the intention to vote against the clause. I want to make clear what the Scottish National party intends to do, and the logic behind it. The inclusion of Government amendment 7 is sensible, and I am glad that the Minister has tabled it. Clause 129 is incredibly important, and the requirement to publish guidance will ensure that there is a level of transparency, which we and the Labour Front Benchers have been asking for.

The Minister has been clear about the requirement for Ofcom to consult the Secretary of State, rather than to be directed by them. As a whole, this Bill gives the Secretary of State far too much power, and far too much ability to intervene in the workings of Ofcom. In this case, however, I do not have an issue with the Secretary of State being consulted, so I intend to support the inclusion of this clause, as amended by Government amendment 7.



Question put, That the amendment be made.

Division 39

Ayes: 10


Conservative: 9
Scottish National Party: 1

Noes: 4


Labour: 4

Amendment 7 agreed to.
Clause 129, as amended, ordered to stand part of the Bill.
Clause 130
Advisory committee on disinformation and misinformation
Alex Davies-Jones

I beg to move amendment 57, in clause 130, page 115, line 4, leave out “18” and insert “6”

This amendment changes the period by which the advisory committee must report from 18 months to 6.

The Chair

With this, it will be convenient to discuss the following: amendment 58, in clause 130, page 115, line 5, at end insert—

‘(6) Following the publication of the report, OFCOM must produce a code of practice setting out the steps services should take to reduce disinformation across their systems.”

This amendment requires Ofcom to produce a code of practice on system-level disinformation.

Clause stand part.

Alex Davies-Jones

Clause 130 sets up a committee to advise Ofcom on misinformation and disinformation—the only direct reference to misinformation and disinformation in the entire Online Safety Bill. However, the Bill gives the committee no identifiable powers or active role in tackling harmful misinformation and disinformation, meaning that it has limited practical purpose. It is also unclear how the advisory committee will fit with Ofcom's wider regulatory functions.

The remaining provisions in the Bill are limited and do not properly address harmful misinformation and disinformation. If tackling harmful misinformation and disinformation is left to this clause, the Bill will fail both to tackle harm properly, and to keep children and adults safe.

The clause risks giving a misleading impression that action is being taken. If the Government and Ofcom proceed with creating the committee, we need to see that its remit is strengthened and clarified, so that it more effectively tackles harmful disinformation and misinformation. That should include advising on Ofcom’s research, reporting on drivers of harmful misinformation and disinformation, and proportionate responses to them. There should also be a duty on Ofcom to consult the committee when drafting relevant codes of practice.

That is why we have tabled amendment 57. It would change the period by which the advisory committee must report from 18 months to six. This is a simple amendment that encourages scrutiny. Once again, the Minister surely has little reason not to accept it, especially as we have discussed at length the importance of the advisory committee having the tools that it needs to succeed.

Increasing the regularity of these reports from the advisory committee is vital, particularly given the ever-changing nature of the internet. Labour has already raised concerns about the lack of futureproofing in the Bill more widely, and we feel that the advisory committee has an important role and function to play in areas where the Bill itself is lacking. We are not alone in this view; the Minister has heard from his Back Benchers about just how important this committee is.

Amendment 58 would require Ofcom to produce a code of practice on system-level disinformation. Again, this amendment will come as no surprise to the Minister, given the concerns that Labour has repeatedly raised about the lack of provisions relating to disinformation in the Bill. It seems like an obvious omission that the Bill has failed to consider a specific code of practice around reducing disinformation, and the amendment would be a simple way to ensure that Ofcom actively encourages services to reduce disinformation across their platforms. The Minister knows that this would be a welcome step, and I urge him to consider supporting the amendment.

Kirsty Blackman

I want to briefly agree with the sentiments of the Opposition Front Bench, especially about the strength of the committee and the lack of teeth that it currently has. Given that the Government have been clear that they are very concerned about misinformation and disinformation, it seems odd that they are covered in the Bill in such a wishy-washy way.

The reduction of the time from 18 months to six months would also make sense. We would expect the initial report that the committee publishes at six months not to be as full as those it publishes after that. I do not see any issue with requiring it to produce a report as soon as possible to assess how the Act is bedding in and beginning to work, rather than having to wait until the Act is potentially already working properly. We want to be able to pick up any teething problems that the Act might have.

We want the committee to be able to say, "Actually, this is not working quite as we expected. We suggest that Ofcom operates in a slightly different way or that the interaction with providers happens in a slightly different way." I would rather that problems with the Act were tackled as early as possible. Otherwise, we will not know about problems with the Act, because there is no proper review mechanism; there is no agreement, for example, on a committee to look at how the Act is operating. This is one of the few parts of the Bill where we have an agreement to a review, and it would make sense for that review to happen as early as possible.

We agree that misinformation and disinformation are very important matters that really need to be tackled, but there is just not enough clout in the Bill to allow Ofcom to properly tackle these issues that are causing untold harm.

Dan Carden

When I spoke at the very beginning of the Committee’s proceedings, I said that the legislation was necessary, that it was a starting point and that it would no doubt change and develop over time. However, I have been surprised at how little, considering all of the rhetoric we have heard from the Secretary of State and other Ministers, the Bill actually deals with the general societal harm that comes from the internet. This is perhaps the only place in the Bill where it is covered.

I am thinking of the echo chambers that are created around disinformation and the algorithms that companies use. I really want to hear from the Minister where he sees this developing and why it is so weak and wishy-washy. While I welcome that much of the Bill seeks to deal with the criminality of individuals and the harm and abuse that can be carried out over the internet, overall it misses a great opportunity to deal with the harmful impact the internet can have on society.

Chris Philp

Let me start by speaking on the issue of disinformation more widely, which clearly is the target of the two amendments and the topic of clause 130. First, it is worth reminding the Committee that non-legislatively—operationally—the Government are taking action on the disinformation problem via the counter-disinformation unit of the Department for Digital, Culture, Media and Sport, which we have discussed previously.

The unit has been established to monitor social media firms and sites for disinformation and then to take action and work with social media firms to take it down. For the first couple of years of its operation, it understandably focused on disinformation connected to covid. In the last two or three months, it has focused on disinformation relating to the Russia-Ukraine conflict—in particular propaganda being spread by the Russian Government, which, disgracefully, has included denying responsibility for various atrocities, including those committed at Bucha. In fact, in cases in which the counter-disinformation unit has not got an appropriate response from social media firms, those issues have been escalated to me, and I have raised them directly with those firms, including Twitter, which has tolerated all kinds of disinformation from overt Russian state outlets and channels, including from Russian embassy Twitter accounts, which are of particular concern to me. Non-legislative action is being taken via the CDU.

10:15
I would also point to the legislative action that is currently in train. The Committee will be aware that the National Security Bill had its Second Reading a week or two ago. Colleagues who have studied that Bill—as I am sure they have—will have noticed that clause 13 creates a new foreign interference offence, and that cross-refers to clause 24 in that Bill. I may be over-reaching by trying to memorise two Bills rather than one, but I think those references are right.

That new foreign interference offence, which is being criminalised separately from this Bill, makes it a criminal offence for a foreign state-backed organisation to propagate disinformation, and it specifies the circumstances or conditions that have to be met. I observe in passing that once the National Security Bill has received Royal Assent, it will be possible to add that offence to the Online Safety Bill as a priority offence under schedule 7, so levers will be available.

In addition, for certain kinds of disinformation and misinformation that cause adults harm, it will be possible for that harm to be designated in secondary legislation as a priority category of harm. We may discuss that further in due course.

Kim Leadbeater (Batley and Spen) (Lab)

It is fantastic to hear that those other things are happening—that is all well and good—but surely we should explicitly call out disinformation and misinformation in the Online Safety Bill. The package of other measures that the Minister mentions is fantastic, but I think they have to be in the Bill.

Chris Philp

The hon. Lady says that those measures should be in the Bill—more than they already are—but as I have pointed out, the way in which the legal architecture of the Bill works means that the mechanisms to do that would be adding a criminal offence to schedule 7 as a priority offence, for example, or using a statutory instrument to designate the relevant kind of harm as a priority harm, which we plan to do in due course for a number of harms. The Bill can cover disinformation with the use of those mechanisms.

We have not put the content that is harmful to adults in the Bill; it will be set out in statutory instruments. The National Security Bill is still progressing through Parliament, and we cannot have in schedule 7 of this Bill an offence that has not yet been passed by Parliament. I hope that that explains the legal architecture and mechanisms that could be used under the Bill to give force to those matters.

On amendment 57, the Government feel that six months is a very short time within which to reach clear conclusions, and that 18 months is a more appropriate timeframe in which to understand how the Bill is bedding in and operating. Amendment 58 would require Ofcom to produce a code of practice on system-level disinformation. To be clear, the Bill already requires Ofcom to produce codes of practice that set out the steps that providers will take to tackle illegal content—I mentioned the new National Security Bill, which is going through Parliament—and harmful content, which may, in some circumstances, include disinformation.

Disinformation that is illegal or harmful to individuals is in scope of the duties set out in the Bill. Ofcom’s codes of practice will, as part of those duties, have to set out the steps that providers should take to reduce harm to users that arises from such disinformation. Those steps could include content-neutral design choices or interventions of other kinds. We would like Ofcom to have a certain amount of flexibility in how it develops those codes of practice, including by being able to combine or disaggregate those codes in ways that are most helpful to the general public and the services that have to pay regard to them. That is why we have constructed them in the way we have. I hope that provides clarity about the way that disinformation can be brought into the scope of the Bill and how that measure then flows through to the codes of practice. I gently resist amendments 57 and 58 while supporting the clause standing part of the Bill.

Question put, That the amendment be made.

Division 40

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 9


Conservative: 9

Amendment proposed: 58, in clause 130, page 115, line 5, at end insert—

'(6) Following the publication of the report, OFCOM must produce a code of practice setting out the steps services should take to reduce disinformation across their systems."—(Alex Davies-Jones.)

This amendment requires Ofcom to produce a code of practice on system-level disinformation.

Question put, That the amendment be made.

Division 41

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 9


Conservative: 9

Clause 130 ordered to stand part of the Bill.

Clause 131

Functions of the Content Board

Question proposed, That the clause stand part of the Bill.
Chris Philp

The clause allows Ofcom to confer functions on the content board in relation to content-related functions under the Bill, but does not require it to do so. We take the view that how Ofcom manages its responsibilities internally is a matter for Ofcom. That may change over time. The clause simply provides that Ofcom may, if Ofcom wishes, ask its content board to consider online safety matters alongside its existing responsibilities. I trust that the Committee considers that a reasonable measure.

Alex Davies-Jones

Labour welcomes the clause, which, as the Minister has said, sets out some important clarifications with respect to the Communications Act 2003. We welcome the clarification that the content board will have delegated and advisory responsibilities, and look forward to the Minister’s confirmation of exactly what those are and how this will work in practice. It is important that the content board and the advisory committee on disinformation and misinformation are compelled to communicate, too, so we look forward to an update from the Minister on what provisions in the Bill will ensure that that happens.

Chris Philp

The shadow Minister has asked how this will work in practice, but as I said, the internal operation of Ofcom obviously is a matter for Ofcom. As Members have said in the recent past—indeed, in the last hour—they do not welcome undue Government interference in the operation of Ofcom, so it is right that we leave this as a matter for Ofcom. We are providing Ofcom with the power, but we are not compelling it to use that power. We are respecting Ofcom’s operational independence—a point that shadow Ministers and Opposition Members have made very recently.

Question put and agreed to.

Clause 131 accordingly ordered to stand part of the Bill.

Clause 132

Research about users’ experiences of regulated services

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss clause 133 stand part.

Alex Davies-Jones

We support clause 132, which ensures that Ofcom is required to understand and measure public opinion concerning providers of regulated services, as well as the experiences and interests of those using the regulated services in question. The Bill in its entirety is very much a learning curve for us all, and I am sure we all agree that, as previously maintained, the world really is watching as we seek to develop and implement the legislation. That is why it is vital that Ofcom is compelled to conduct and arrange its own research to ensure that we are getting an accurate picture of how our regulatory framework is affecting people. I stress to the Minister that it is imperative that Ofcom consults all service providers—big and small—which the CBI stressed to me in recent meetings.

We also welcome the provisions outlined in subsection (2) that confirm that Ofcom must include a statement of its research in its annual report to the Secretary of State and the devolved Administrations. It is important that Ofcom, as a regulator, takes a research-led approach, and Labour is pleased to see these provisions included in the Bill.

We welcome the inclusion of clause 133, which extends the communication panel’s remit to include online safety. This will mean that the panel is able to give advice on matters relating to different types of online content under the Bill, and on the impacts of online content on UK users of regulated services. It is a welcome step forward, so we have not sought to amend the clause.

Dame Maria Miller (Basingstoke) (Con)

I want to make one short comment about clauses 132 and 133, which are really important. There is no intention to interfere with or fetter the way that Ofcom operates, but there is an obligation on this Committee, and on Parliament, to indicate what we would expect to see from Ofcom by way of the clauses, because they are an essential part of the transparency that we are trying to inject into the sector.

Research about users’ experiences is hugely important, and such reports contain important insights into how platforms are used, and the levels of misinformation and disinformation that people are exposed to. Ofcom already produces highly authoritative reports on various aspects of the online world, including the finding that three in four adults do not think about whether the online information that they see is truthful. Indeed, one in three adults believes that all or most information that they find online is truthful. We know that there is a significant gap between consumers’ perceptions and reality, so it is important to ensure that research has good exposure among those using the internet.

We do not often hear about the problems of how the online world works, and the level of disinformation and inaccuracy is not well known, so will the Minister elaborate on how he expects Ofcom to ensure that people are aware of the reality of the online world? Platforms will presumably be required to have regard to the content of Ofcom reports, but will Ofcom be required to publicise its reports? It is not clear that such a duty is in the Bill at the moment, so does the Minister expect Ofcom to have a role in educating people, especially children, about the problem of inaccurate data or other aspects of the online world?

We know that a number of platforms spend a great deal of money on going into schools and talking about their products, which may or may not entail accurate information. Does Ofcom not have an important role to play in this area? Educating users about the changes in the Bill would be another potential role for Ofcom in order to recalibrate users’ expectations as to what they might reasonably expect platforms to offer as a result of the legislation. It is important that we have robust regulatory frameworks in place, and this Bill clearly does that. However, it also requires users to be aware of the changes that have been made so that they can report the problems they experience in a timely manner.

10:30
Dan Carden

I agree with the right hon. Member for Basingstoke that these are important clauses. I want to put them into the context of what we heard from Frances Haugen, who, when she spoke to Congress, said that Facebook consistently chose to maximise its growth rather than implement safeguards on its platforms. She said:

“During my time at Facebook, I came to realise a devastating truth: Almost no one outside of Facebook knows what happens inside Facebook. The company intentionally hides vital information from the public, from the U.S. government, and from governments around the world.”

When we consider users’ experiences, I do not think it is good enough just to look at how the user engages with information. We need far more transparency about how the companies themselves are run. I would like to hear the Minister’s views on how this clause, which looks at users’ experiences, can go further in dealing with the harms at source, with the companies, and making sure a light is shone on their practices.

Chris Philp

I welcome the support of the hon. Member for Pontypridd for these clauses. I will turn to the questions raised by my right hon. Friend the Member for Basingstoke. First, she asked whether Ofcom has to publish these reports so that the public, media and Parliament can see what they say. I am pleased to confirm that Ofcom does have to publish the reports; section 15 of the Communications Act 2003 imposes a duty on Ofcom to publish reports of this kind.

Secondly, my right hon. Friend asked about educating the public on issues pertinent to these reports, which is what we would call a media literacy duty. Again, I confirm that, under the Communications Act, Ofcom has a statutory duty to promote media literacy, which would include matters that flow from these reports. In fact, Ofcom published an expanded and updated set of policies in that area at the end of last year, which is why the old clause 103 in the original version of this Bill was removed—Ofcom had already gone further than that clause required.

Thirdly, my right hon. Friend asked about the changes that might happen in response to the findings of these reports. Of course, it is open to Ofcom—indeed, I think this Committee would expect it—to update its codes of practice, which it can do from time to time, in response to the findings of these reports. That is a good example of why it is important for those codes of practice to be written by Ofcom, rather than being set out in primary legislation. It means that when some new fact or circumstance arises or some new bit of research, such as the information required in this clause, comes out, those codes of practice can be changed. I hope that addresses the questions my right hon. Friend asked.

The hon. Member for Liverpool, Walton asked about transparency, referring to Frances Haugen’s testimony to the US Senate and her disclosures to The Wall Street Journal, as well as the evidence she gave this House, both to the Joint Committee and to this Committee just before the Whitsun recess. I have also met her bilaterally to discuss these issues. The hon. Gentleman is quite right to point out that these social media firms—he used Facebook as an example, although there are others—are extremely secretive about what they say in public, to the media and even to representative bodies such as the United States Congress. That is why, as he says, it is extremely important that they are compelled to be a lot more transparent.

The Bill contains a large number of provisions compelling or requiring social media firms to make disclosures to Ofcom as the regulator. However, it is important to have public disclosure as well. It is possible that the hon. Member for Liverpool, Walton was not in his place when we came to the clause in question, but if he turns to clause 64 on page 56, he will see that it includes a requirement for Ofcom to give every provider of a relevant service a notice compelling them to publish a transparency report. I hope he will see that the transparency obligation that he quite rightly refers to—it is necessary—is set out in clause 64(1). I hope that answers the points that Committee members have raised.

Question put and agreed to.

Clause 132 accordingly ordered to stand part of the Bill.

Clause 133 ordered to stand part of the Bill.

Clause 134

OFCOM’s statement about freedom of expression and privacy

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

As we all know, the clause requires Ofcom to publish annual reports on the steps it has taken, when carrying out online safety functions, to uphold users’ rights under articles 8 and 10 of the convention, as required by section 6 of the Human Rights Act 1998. It will come as no surprise to the Minister that Labour entirely supports this clause.

Upholding users’ rights is a central part of this Bill, and it is a topic we have debated repeatedly in our proceedings. I know that the Minister faces challenges of his own, as the Opposition do, regarding the complicated balance between freedom of speech and safety online. It is only right and proper, therefore, for Ofcom to have a specific duty to publish reports about what steps it is taking to ensure that the online space is fair and equal for all.

That being said, we know that we can and should go further. My hon. Friend the Member for Batley and Spen will shortly address an important new clause tabled in her name—I believe it is new clause 25—so I will do my best not to repeat her comments, but it is important to say that Ofcom must be compelled to publish reports on how its overall regulatory operating function is working. Although Labour welcomes clause 134 and especially its commitment to upholding users’ rights, we believe that when many feel excluded in the existing online space, Ofcom can do more in its annual reporting. For now, however, we support clause 134.

Chris Philp

I welcome the shadow Minister’s continuing support for these clauses. Clause 134 sets out the requirement on Ofcom to publish reports setting out how it has complied with articles 8 and 10 of the European convention on human rights.

I will pause for a second, because my hon. Friend the Member for Don Valley and others have raised concerns about the implications of the Bill for freedom of speech. In response to a question he asked last week, I set out in some detail the reasons why I think the Bill improves the position for free speech online compared with the very unsatisfactory status quo. This clause further strengthens that case, because it requires this report and reminds us that Ofcom must discharge its duties in a manner compatible with articles 8 and 10 of the ECHR.

From memory, article 8 enshrines the right to a family life, and article 10 enshrines the right to free speech, backed up by quite an extensive body of case law. The clause reminds us that the powers that the Bill confers on Ofcom must be exercised—indeed, can only be exercised—in conformity with the article 10 duties on free speech. I hope that that gives my hon. Friend additional assurance about the strength of free speech protection inherent in the Bill. I apologise for speaking at a little length on a short clause, but I think that was an important point to make.

Question put and agreed to.

Clause 134 accordingly ordered to stand part of the Bill.

Clause 135

OFCOM’s transparency reports

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

Again, Labour welcomes clause 135, which places a duty on Ofcom to produce its own reports based on information from the transparency reports that providers are required to publish. However, the Minister will know that Labour feels the Bill has much more work to do on transparency more widely, as we have repeatedly outlined through our debates. The Minister rejected our calls for increased transparency when we were addressing, I believe, clause 61. We are not alone in feeling that transparency reports should go further. The sector and his own Back Benchers are calling for it, yet so far his Department has failed to act.

It is a welcome step that Ofcom must produce its own reports based on information from the providers’ transparency reports, but if those reports are to provide a truly accurate depiction of the situation online, they ultimately need to be made public. I know the Minister has concerns around security, but of course no one wants to see users put in harm’s way unnecessarily. That is not what we are asking for here. I will refrain from repeating debates we have already had at length, but I wish to again put on the record our concerns around the transparency reporting process as it stands.

That being said, we support clause 135. It is right that Ofcom is compelled to produce its own reports; we just wish they were made public. With the transparency reports coming from the providers, we only wish they would go further.

Chris Philp

I have spoken to these points previously, so I do not want to tax the Committee’s patience by repeating what I have said.

Question put and agreed to.

Clause 135 accordingly ordered to stand part of the Bill.

Clause 136

OFCOM’s report about researchers’ access to information

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

Again, Labour welcomes clause 136, which is a positive step towards a transparent approach to online safety, given that it requires Ofcom to publish a report about the access that independent researchers have, or could have, to matters relating to the online safety of regulated services. As my hon. Friend the Member for Worsley and Eccles South rightly outlined in an earlier sitting, Labour strongly believes that the transparency measures in the Bill do not go far enough.

Independent researchers already play a vital role in regulating online safety. Indeed, there are far too many to list, but many have supported me, and I am sure the Minister, in our research on the Bill. That is why we have tabled a number of amendments on this point, as we sincerely feel there is more work to be done. I know the Minister says he understands and is taking on board our comments, but thus far we have seen little movement on transparency.

Chris Philp

In this clause we are specifically talking about access to information for researchers. Obviously, the transparency matters were covered in clauses 64 and 135. There is consensus across both parties that access to information for bona fide academic researchers is important. The clause lays out a path to take us in the direction of providing that access by requiring Ofcom to produce a report. We debated the matter earlier. The hon. Member for Worsley and Eccles South—I hope I got the pronunciation right this time—

Chris Philp

The hon. Lady made some points about the matter in an earlier sitting, as the shadow Minister just said. It is an area we are giving some careful thought to, because it is important that it is properly academically researched. Although Ofcom is being well resourced, as we have discussed, with lots of money and the ability to levy fees, we understand that it does not have a monopoly on wisdom—as good a regulator as it is. It may well be that a number of academics could add a great deal to the debate by looking at some of the material held inside social media firms. The Government recognise the importance of the matter, and some thought is being given to these questions, but at least we can agree that clause 136 as drafted sets out a path that leads us in this important direction.

Question put and agreed to.

Clause 136 accordingly ordered to stand part of the Bill.

Clause 137

OFCOM’s reports

Alex Davies-Jones

Briefly, before I hand over to my hon. Friend the Member for Worsley and Eccles South, I should say that Labour welcomes clause 137, which gives Ofcom a discretionary power to publish reports about certain online safety measures and matters. Clearly, it is important to give Ofcom the power to redact or exclude confidential matters where need be, and I hope that there will be a certain level of common sense and public awareness, should information of this nature be excluded. As I have previously mentioned—I sound a bit like a broken record—Labour echoes the calls for more transparency, which my hon. Friend the Member for Batley and Spen will come on to in her new clause. However, broadly, we support this important clause.

I would like to press the Minister briefly on how exactly the exclusion of material from Ofcom reports will work in practice. Can he outline any specific contexts or examples, beyond commercial sensitivity and perhaps matters of national security, where he can envision this power being used?

10:45
Chris Philp

I welcome the shadow Minister’s support for the clause, once again. The clause provides Ofcom with the power to publish relevant reports about online safety matters to keep users, the public and Parliament well informed. Again, clearly, it is up to Ofcom to decide how it publishes those reports; we will not compel it.

On the question about confidential material that might be withheld, the relevant language in clause 137 looks, to me, to precisely echo the language we saw previously in clause—where was it? Anyway, we have come across this in a previous clause. When it comes to publishing material that can be excluded, the language is just the same.

I would like to make it clear that, while, obviously, this decision is a matter for Ofcom, I would expect that exclusion to be used on a pretty rare basis. Obviously, one would expect matters that are acutely commercially sensitive to be excluded—or redacted—to address that. If there was very sensitive intellectual property, where it would prejudice a company’s commercial interest to have all of that intellectual property exposed, I would expect Ofcom to exercise the exclusion or at least redact what it publishes.

However, because transparency is so important—it is a point that the Committee has made repeatedly—I would expect these exclusions to be used sparingly, and only where absolutely necessary to protect matters such as commercial confidentiality or intellectual property. Even then, they should be used to the minimum extent necessary, because I think that this Committee thinks, and Parliament thinks, that the exclusion of material from these reports and from the reports about breaches—mentioned in the clause I was trying to reach for previously, which was clause 128(4)(b) and (5)(b); perhaps Hansard would be kind enough to clarify that point to make me look slightly more articulate than I in fact am—should be used only very carefully and very rarely. The Committee should be clear on that, and that the bias, as it were—the assumption—should be on the side of disclosure rather than withholding information.

Question put and agreed to.

Clause 137 accordingly ordered to stand part of the Bill.

Clause 138

Appeals against OFCOM decisions relating to the register under section 81

Question proposed, That the clause stand part of the Bill.

The Chair

With this, it will be convenient to consider clause 139 stand part.

Barbara Keeley

Good morning, Ms Rees. It is a pleasure to serve on the Committee with you in the Chair. Clause 138 allows companies to make appeals against Ofcom’s decisions regarding the categorisation of services within categories 1, 2A or 2B.

We have argued, many times, that we believe the Government’s size-based approach to categorisation is flawed. Our preference for an approach based on risk is backed up by the views of multiple stakeholders and the Joint Committee. It was encouraging to hear last week of the Minister’s intention to look again at the issues of categorisation, and I hope we will see movement on that on Report.

Clause 138 sets out that where a regulated provider has filed an appeal, they are exempt from carrying out the duties in the Bill that normally apply to services designated as category 1, 2A or 2B. That is concerning, given that there is no timeframe in which the appeals process must be concluded.

While the right to appeal is important, it is feasible that many platforms will raise appeals about their categorisation to delay the start of their duties under the Bill. I understand that the platforms will still have to comply with the duties that apply to all regulated services, but for a service that has been classified by Ofcom as high risk, it is potentially dangerous that none of the risk assessments or measures to address harm will be completed while the appeal is taking place. Does the Minister agree that the appeals process must be concluded as quickly as possible to minimise the risk? Will he consider putting a timeframe on that?

Clause 139 allows for appeals against decisions by Ofcom to issue notices about dealing with terrorism and child sexual abuse material, as well as a confirmation decision or a penalty notice. As I have said, in general the right to appeal is important. However, would an appeals system work if, for example, a company were appealing against a notice under clause 103? In what circumstances does the Minister imagine that a platform would appeal against a notice by Ofcom requiring the platform to use accredited technology to identify child sexual abuse content and swiftly take down that content? It is vital that appeals processes are concluded as rapidly as possible, so that we do not risk people being exposed to harmful or dangerous content.

Chris Philp

The shadow Minister has set out the purpose of the clauses, which provide for, in clause 138 appeal rights for decisions relating to registration under clause 81, and in clause 139 appeals against Ofcom notices.

I agree that it is important that judicial decisions in this area get made quickly. I note that the appeals are directly to the relevant upper tribunal, which is a higher tier of the tribunal system and tends to be a little less congested than the first-tier tribunal, which often gets used for some first-instance matters. I hope that appeals going to the upper tribunal, directly to that more senior level, provides some comfort.

On putting in a time limit, the general principle is that matters concerning listing are reserved to the judiciary. I recall from my time as a Minister in the Ministry of Justice that the judiciary guards its independence fiercely. Whether it is the Senior President of Tribunals or the Lord Chief Justice, they consider listing matters to be the preserve of the judiciary, not the Executive or the legislature. Compelling the judiciary to hear a case in a certain time might well be considered to infringe on such principles.

We can agree, however, that it is important to do this quickly, in particular where there is a risk of harm to individuals—and I hope that the people making those listing decisions hear that we believe, and that Parliament believes, that. Where there is risk to individuals, especially children, but more widely as well, those cases should be heard very expeditiously indeed.

The hon. Member for Worsley and Eccles South also asked about the basis on which appeals might be made and decided. I think that is made fairly clear. For example, clause 139(3) makes it clear that, in deciding an appeal, the upper tribunal will use the same principles as would be applied by the High Court to an application for judicial review—so, standard JR terms—which in the context of notices served or decisions made under clause 103 might include whether the power had been exercised in conformity with statute. If the power were exercised or purported to be exercised in a manner not authorised by statute, that would be one ground for appeal; if a decision were considered so grossly unreasonable that no reasonable decision maker could make it, that might be grounds for appeal as well.

I caution the Committee, however: I am not a lawyer and my interpretation of judicial review principles should not be taken as definitive. Lawyers will advise their clients when they come to apply the clause in practice and they will not take my words in Committee as definitive when it comes to determining “standard judicial review principles”—those are well established in law, regardless of my words just now.

Barbara Keeley

There is a concern that platforms might raise appeals about their categorisation in order to delay the start of their duties under the Bill. How would the Minister act if that happened—if a large number of appeals were pending and the duties under the Bill therefore did not commence?

Chris Philp

Clearly, resourcing of the upper tribunal is a matter decided jointly by the Lord Chancellor and the Secretary of State for Justice, in consultation with the Lord Chief Justice, and, in this case, the Senior President of Tribunals. Parliament would expect the resourcing of that part of the upper tribunal to be such that cases could be heard in an expedited manner. Particularly where cases concern the safety of the public—and particularly of children—we expect that to be done as quickly as it can.

Question put and agreed to.

Clause 138 accordingly ordered to stand part of the Bill.

Clause 139 ordered to stand part of the Bill.

Clause 140

Power to make super-complaints

Kirsty Blackman

I beg to move amendment 143, in clause 140, page 121, line 1, after “services” insert “, consumers”.

The Chair

With this it will be convenient to discuss the following:

Amendment 144, in clause 140, page 121, line 2, after “users” insert “, consumers”.

Amendment 145, in clause 140, page 121, line 4, after “services” insert “, consumers”.

Amendment 146, in clause 140, page 121, line 5, after “users” insert “, consumers”.

Amendment 147, in clause 140, page 121, line 6, at end insert “, consumers”.

Amendment 148, in clause 140, page 121, line 7, after “users” insert “, consumers”.

Amendment 149, in clause 140, page 121, line 14, after “service” insert “, consumers”.

Amendment 150, in clause 140, page 121, line 18, at end insert “, consumers”.

Amendment 151, in clause 140, page 121, line 19, after “users” insert “, consumers”.

Amendment 152, in clause 140, page 121, line 25, at end insert—

“‘consumers’ means individuals in the United Kingdom acting for purposes that are wholly or mainly outside the trade, business, craft or profession of the individuals concerned.”

Kirsty Blackman

The Committee has been flexible about grouping clauses should it make sense to do so. I ask that the Committee allow me to speak to this set of amendments alone. It does not make sense for me to discuss these amendments and amendment 77 at the same time. If I could separately discuss amendment 77, as it says on the Order Paper, then I would appreciate that.

This group of amendments specifically relates to consumer protection. Online fraud facilitated through social media platforms and search engines is one of the most prevalent forms of crime today. Reported incidents increased significantly during the pandemic, and often resulted in victims losing life-changing amounts of money. In addition to the financial impact of being scammed, there is the emotional and physical impact; we know it has a significant effect on people’s mental health. I am glad that the Government listened to the Joint Committee and the Digital, Culture, Media and Sport Committee, and changed the legislation to include fraud.

Amendment 143 is about expanding who can make super-complaints, in order to reflect the expansion of the Bill to include fraud. The Bill does not leave a lot of the detail around super-complaints to secondary legislation. These amendments specifically allow groups that are acting on behalf of consumers, or making requests on behalf of consumers, to make super-complaints. I am not sure that somebody acting on behalf of consumers fits within the definitions of users of the service or people representing users of the service; perhaps the Minister can convince me otherwise. If consumers are losing significant amounts of money, or there is a risk of significant numbers of people losing significant amounts of money—for example, where a search engine allows fraudulent advertising to be the top result—including “consumers” in the Bill will allow organisations acting on behalf of consumers to take action. It may be that the Minister can give me some comfort on this and let us know that organisations acting on behalf of consumers would potentially—if they meet the other criteria—be able to put forward a super-complaint.

I understand that there are other methods of complaining—it is possible for other complaints to be made. However, given the significant increase in the risk to consumers in the past few years, it would seem sensible that the Minister give some consideration to whether this is adequately covered in the Bill, and whether consumers are adequately protected in this section of the Bill, as well as in the additional flawed clauses that the Minister added between publication of the original draft Bill and the Bill that we have before us today.

10:59
Barbara Keeley

The Bill currently specifies that super-complaints can be made to Ofcom by bodies representing users or members of the public. The addition of consumer representatives through the amendments is important. Consumer representatives are a key source of information about the widespread harms to users of the online services that would be regulated by this legislation. We support the amendments, which would add consumers to the list of those on whose behalf an eligible entity can make super-complaints.

Chris Philp

Clearly, we want the super-complaint function to be as effective as possible and for groups of relevant people, users or members of the public to be able to be represented by an eligible entity to raise super-complaints. I believe we are all on the same page in wanting to do that. If I am honest, I am a little confused as to what the addition of the term “consumers” will add. The term “users” is defined quite widely, via clause 140(6), which then refers to clause 181, where, as debated previously, a “user” is defined widely to include anyone using a service, whether registered or not. So if somebody stumbles across a website, they count as a user, but the definition being used in clause 140 about bringing super-complaints also includes “members of the public”—that is, regular citizens. Even if they are not a user of that particular service, they could still be represented in bringing a complaint.

Given that, by definition, “users” and “members of the public” already cover everybody in the United Kingdom, I am not quite sure what the addition of the term “consumers” adds. By definition, consumers are a subset of the group “users” or “members of the public”. It follows that in seeking to become an eligible entity, no eligible entity will purport to act for everybody in the United Kingdom; they will always be seeking to define some kind of subset of people. That might be children, people with a particular vulnerability or, indeed, consumers, who are one such subset of “members of the public” or “users”. I do not honestly understand what the addition of the word “consumers” adds here when everything is covered already.

Kirsty Blackman

Will the Minister explicitly say that he thinks that an eligible entity, acting on behalf of consumers, could, if it fulfils the other criteria, bring a super-complaint?

Chris Philp

Yes, definitely. That is the idea of an eligible entity, which could seek to represent a particular demographic, such as children or people from a particular marginalised group, or it could represent people who have a particular interest, which would potentially include consumers. So I can confirm that that is the intention behind the drafting of the Bill. Having offered that clarification and made clear that the definition is already as wide as it conceivably can be—we cannot get wider than “members of the public”—I ask the hon. Member for Aberdeen North to consider withdrawing the amendments, particularly as there are so many. It will take a long time to vote on them.

Kirsty Blackman

I thank the Minister for the clarification. Given that he has explicitly said that he expects that groups acting on behalf of consumers could, if they fulfil the other criteria, be considered as eligible entities for making super-complaints, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendment proposed: 66, in clause 140, page 121, line 8, at end insert—

“(d) causing harm to any human or animal.”

This amendment ensures groups are able to make complaints regarding animal abuse videos.—(Alex Davies-Jones.)

Division 42

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 9


Conservative: 9

Kirsty Blackman

I beg to move amendment 77, in clause 140, page 121, line 9, leave out subsection (2).

This amendment removes the tests that complaints have to be of particular importance in order to be admissible.

When I first read clause 140, subsection (2) raised a significant number of red flags for me. The subsection might be reasonable if we did not have giant companies—social media platforms particularly—that significant numbers of people across the UK use regularly. Facebook might be counted as a “single regulated service”, but 85% of UK residents—57.1 million people—had a Facebook account earlier this year. Twitter is used by 28% of people living in the UK, which is 19 million users. TikTok is at 19%, which is significantly less, but still a very high number of people—13 million users. I can understand the argument that a super-complaint picking on one particular company might be a bit extreme, but that does not make sense when we are considering the Facebooks of this world.

If someone is making a complaint about a single regulated service and that service is Facebook, Twitter, TikTok or another large platform—or a new, yet-to-be-created platform—that significant numbers of people use, there is no justification for treating that complaint differently just because it is against a single entity. When a complaint is made against Facebook—I am picking on Facebook because 85% of the UK public are members of it; it is an absolute behemoth—I would like there to be no delay in its being taken to Ofcom. I would like Ofcom not to have to check and justify that the complaint is “of particular importance”.

Subsection (2)(a) states that one of the tests of the complaint should be that it “is of particular importance” or, as subsection (2)(b) notes, that it

“relates to the impacts on a particularly large number of users of the service or members of the public.”

I do not understand what

“large number of users of the service”

would mean. Does a large number of the users of Facebook mean 50% of its users? Does it mean 10%? What is a large number? Is that in percentage terms, or is it something that is likely to impact 1 million people? Is that a large number? The second part—

“large number…of members of the public”—

is again difficult to define. I do not think there is justification for this additional hoop just because the complaint relates to a single regulated service.

Where a complaint relates to a very small platform that is not causing significant illegal harm, I understand that Ofcom may want to consider whether it will accept, investigate and give primacy and precedence to that. If the reality is that the effect is non-illegal, fairly minor and impacts a fairly small number of people, in the order of hundreds instead of millions, I can understand why Ofcom might not want to give that super-complaint status and might not want to carry out the level of investigation and response necessary for a super-complaint. But I do not see any circumstances in which Ofcom could justify rejecting a complaint against Facebook simply because it is a complaint against a single entity. The reality is that if something affects one person on Facebook, it will affect significantly more than one person on Facebook because of Facebook’s absolutely massive user base. Therefore this additional hoop is unrealistic.

Paragraph (a), about the complaint being “of particular importance”, is too woolly. Does it relate only to complaints about things that are illegal? Does it relate only to things that are particularly urgent—something that is happening now and that is having an impact today? Or is there some other criterion that we do not yet know about?

I would very much appreciate it if the Minister could give some consideration to amendment 77, which would simply remove subsection (2). If he is unwilling to remove that subsection, I wonder whether we could meet halfway and whether, let us say, category 1 providers could all be excluded from the “single provider” exemption, because they have already been assessed by Ofcom to have particular risks on their platforms. That group is wider than the three names that I have mentioned, and I think that that would be a reasonable and realistic decision for the Government—and direction for Ofcom—to take. It would be sensible.

If the Government believe that there is more information—more direction—that they could add to the clause, it would be great if the Minister could lay some of that out here and let us know how he intends subsection (2) to operate in practice and how he expects Ofcom to use it. I get that people might want it there as an additional layer of protection, but I genuinely do not imagine that it can be justified in the case of the particularly large providers, where there is significant risk of harm happening.

I will illustrate that with one last point. The Government specifically referred earlier to when Facebook—Meta—stopped proactively scanning for child sexual abuse images because of an issue in Europe. The Minister mentioned the significant amount of harm and the issues that were caused in a very small period. And that was one provider—the largest provider that people use and access. That massive amount of harm can be caused in a very small period. I do not support allowing Meta or any other significantly large platform to have a “get out of jail” card. I do not want them to be able to go to Ofcom and say, “Hey, Ofcom, we’re challenging you on the basis that we don’t think this complaint is of particular importance” or “We don’t think the complaint relates to the impacts on a particularly large number of users of the service or members of the public.” I do not want them to have that ability to wriggle out of things because this subsection is in the Bill, so any consideration that the Minister could give to improving clause 140 and subsection (2) would be very much appreciated.

Barbara Keeley

We support the SNP’s amendment 77, moved by the hon. Member for Aberdeen North. The super-complaints mechanism introduced by clause 140 is a useful device for reporting numerous, widespread concerns about the harm caused by multiple or single services or providers. Subsection (1) includes the conditions on the subjects of super-complaints, which can relate to one or more services. However, as the hon. Member has pointed out, that is caveated by subsection (2), under which a super-complaint that refers to a single service or provider must prove, as she has just outlined, that it is “of particular importance” or

“relates to the impacts on a particularly large number of users of the service or members of the public.”

Given the various hoops through which a super-complaint already has to jump, it is not clear why the additional conditions are needed. Subsection (2) significantly muddies the waters and complicates the provisions for super-complaints. For instance, how does the Minister expect Ofcom to decide whether the complaint is of particular importance? What criteria does he expect the regulator to use? Why include it as a metric in the first place when the super-complaint has already met the standards set out in subsection (1)?

11:15
There must be no loopholes in the complaints procedures, including as regards holding individual services and providers to account. Amendment 77 both strengthens and simplifies the super-complaint provisions, and we support it.
Chris Philp

I think the Committee, and the House, are pretty unanimous in agreeing that the power to make super-complaints is important. As we have discussed, there are all kinds of groups, such as children, under-represented groups and consumers, that would benefit from being represented where systemic issues are not being addressed—issues that Ofcom may have somehow overlooked or missed in the discharge of its enforcement powers.

I would observe in passing that one of the bases on which super-complaints can be made—this may be of interest to my hon. Friend the Member for Don Valley—is where there is a material risk under clause 140(1)(b) of

“significantly adversely affecting the right to freedom of expression within the law of users of the services or members of the public”.

That clause is another place in the Bill where freedom of expression is expressly picked out and supported. If freedom of expression is ever threatened in a way that we have not anticipated and that the Bill does not provide for, there is a particular power here for a particular free speech group, such as the Free Speech Union, to make a super-complaint. I hope that my hon. Friend finds the fact that freedom of expression is expressly laid out there reassuring.

Let me now speak to the substance of amendment 77, tabled by the hon. Member for Aberdeen North. It is important to first keep in mind the purpose of the super-complaints, which, as I said a moment ago, is to provide a basis for raising issues of widespread and systemic importance. That is the reason for some of the criteria in subsection (1)(a), (b) and (c), and why we have subsection (2)—because we want to ensure that super-complaints are raised only if they are of a very large scale or have a profound impact on freedom of speech or some other matter of particular importance. That is why the tests, hurdles and thresholds set out in clause 140(2) have to be met.

If we were to remove subsection (2), as amendment 77 seeks to, that would significantly lower the threshold. We would end up having super-complaints that were almost individual in nature. We set out previously why we think an ombudsman-type system or having super-complaints used for near-individual matters would not be appropriate. That is why the clause is there, and I think it is reasonable that it is.

The hon. Lady asked a couple of questions about how this arrangement might operate in practice. She asked whether a company such as Facebook would be caught if it alone were doing something inappropriate. The answer is categorically yes, because the condition in clause 140(2)(b)—

“impacts on a particularly large number of users”,

which would be a large percentage of Facebook’s users,

“or members of the public”—

would be met. Facebook and—I would argue—any category 1 company would, by definition, be affecting large numbers of people. The very definition of category 1 includes the concept of reach—the number of people being affected. That means that, axiomatically, clause 140(2)(b) would be met by any category 1 company.

The hon. Lady also raised the question of Facebook, for a period of time in Europe, unilaterally ceasing to scan for child sexual exploitation and abuse images, which, as mentioned, led to huge numbers of child sex abuse images and, consequently, huge numbers of paedophiles not being detected. She asks how these things would be handled under the clause if somebody wanted to raise a super-complaint about that. Hopefully, Ofcom would stop them happening in the first place, but if it did not the super-complaint redress mechanism would be the right one. These things would categorically be caught by clause 140(2)(a), because they are clearly of particular importance.

In any reasonable interpretation of the words, the test of “particular importance” is manifestly met when it comes to stopping child sexual exploitation and abuse and the detection of those images. That example would categorically qualify under the clause, and a super-complaint could, if necessary, be brought. I hope it would never be necessary, because that is the kind of thing I would expect Ofcom to catch.

Having talked through the examples from the hon. Lady, I hope I have illustrated how the clause will ensure that either large-scale issues affecting large numbers of people or issues that are particularly serious will still qualify for super-complaint status with subsection (2) left in the Bill. Given those assurances, I urge the hon. Member to consider withdrawing her amendment.

Kirsty Blackman

I welcome the Minister’s fairly explicit explanation that he believes that every category 1 company would be in scope, even if there were a complaint against one single provider. I would like to push the amendment to a vote on the basis of the comments I made earlier and the fact that each of these platforms is different. We have heard concerns about, for example, Facebook groups being interested in celebrating eight-year-olds’ birthdays. We have heard about the amount of porn on Twitter, which Facebook does not have in the same way. We have heard about the kind of algorithmic stuff that takes people down a certain path on TikTok. We have heard all these concerns, but they are all specific to that one provider. They are not generic complaints that could be brought against a group of providers.

Chris Philp

Would the hon. Lady not agree that in all those examples—including TikTok and leading people down dark paths—the conditions in subsection (2) would be met? The examples she has just referred to are, I would say, certainly matters of particular importance. Because the platforms she mentions are big in scale, they would also meet the test of scale in paragraph (b). In fact, only one of the tests has to be met—it is one or the other. In all the examples she has just given, not just one test—paragraph (a) or (b)— would be met, but both. So all the issues she has just raised would make a super-complaint eligible to be made.

Kirsty Blackman

I am glad the Minister confirms that he expects that that would be the case. I am clearer now that he has explained it, but on my reading of the clause, the definitions of “particular importance” or

“a particularly large number of users…or members of the public”

are not clear. I wanted to ensure that this was put on the record. While I do welcome the Minister’s clarification, I would like to push amendment 77 to a vote.

Question put, That the amendment be made.

Division 43

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 9


Conservative: 9

Ordered, That further consideration be now adjourned. —(Steve Double.)
11:24
Adjourned till this day at Two o’clock.

Online Safety Bill (Fourteenth sitting)

Committee stage
Tuesday 21st June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022
The Committee consisted of the following Members:
Chairs: Sir Roger Gale, †Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 21 June 2022
(Afternoon)
[Christina Rees in the Chair]
Online Safety Bill
Clause 140
Power to make super-complaints
14:00
Amendment proposed: 67, in clause 140, page 121, line 20, at end insert
“, or a particular group that campaigns for the removal of harmful online content towards humans and animals”.—(Alex Davies-Jones.)
This amendment makes groups campaigning against harmful content eligible to make super-complaints.

Division 44

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 9


Conservative: 9

Question proposed, That the clause stand part of the Bill.
The Chair

With this it will be convenient to discuss the following:

Amendment 153, in clause 141, page 121, line 32, after “140” insert

“, which must include the requirement that OFCOM must respond to such complaints within 90 days”

Clauses 141 and 142 stand part.

Barbara Keeley (Worsley and Eccles South) (Lab)

Good afternoon, Ms Rees. The importance of an effective complaints procedure has been argued strongly by many people who have given oral and written evidence to this Committee and indeed by Committee members. It is welcome that clause 140 introduces a super-complaints mechanism to report multiple, widespread concerns about the harm caused by services, but the lack of redress for individuals has been raised repeatedly.

This is a David and Goliath situation, with platforms holding all the power, while individuals are left to navigate the often complex and underfunded internal complaints systems provided by the platforms. This is what the London School of Economics and Political Science has called the

“current imbalance between democratic, ‘people’ power and the power of platforms.”

As we argued on new clause 1, there is a clear need to consider a route for redress at an individual level. The current situation is unsatisfactory for people who feel they have been failed by a service’s complaints system and who find themselves with no source of redress.

The current situation is also unsatisfactory for the regulator. Kevin Bakhurst from Ofcom told the right hon. Member for Basingstoke during our evidence sessions:

“Those individual complaints, although we are not going to be very specific in looking at individual pieces of material per se, are very useful to alert us where there are issues around particular types of offence or harm that the platforms are not seen to be dealing with properly.”––[Official Report, Online Safety Public Bill Committee, 24 May; c.9-10, Q9.]

An external redress process was recommended by the Joint Committee on the draft Bill and has been suggested by multiple stakeholders. Our new clause would make sure that we find the best possible solution to the problem. I hope the Minister reconsiders these points and supports new clause 1 when the time comes to vote on it.

As I have argued previously, organisations will not be able to make full and effective use of the super-complaints system unless the platforms’ risk assessments are published in full. The Opposition’s amendments 11 and 13 sought to address that issue, and I am disappointed that the Government failed to grasp their importance. There is now a real risk that civil society and other groups will not be able to assess and identify the areas where a company may not be meeting its safety duties. How does the Minister expect organisations making super-complaints to identify and argue that a service is causing harm to its users if they have no access to the company’s own analysis and mitigation strategy? Not including a duty to publish risk assessments leaves a gaping hole in the Bill and risks undermining the super-complaints mechanism. I hope that the Minister will reconsider his opposition to this important transparency mechanism in future stages of the Bill.

For powers about super-complaints to be meaningful, there must be a strict deadline for Ofcom to respond to them, and we will support the SNP amendment if it is pushed to a vote. The Enterprise Act 2002 gives a 90-day deadline for the Competition and Markets Authority to respond. Stakeholders have suggested a similar deadline to respond for super-complaints as an effective mechanism to ensure action from the regulator. I urge the Minister to consider this addition, either in the Bill with this amendment, or in the secondary legislation that the clause requires.

Clauses 141 and 142 relate to the structures around super-complaints. Clause 141 appears to be more about handing over powers to the Secretary of State than ensuring a fair system of redress. The Opposition have said repeatedly how we feel about the powers being handed over to the Secretary of State. Clause 142 includes necessary provisions on the creation and publication of guidance by Ofcom, which we do not oppose. Under clause 141, Ofcom will have to provide evidence of the validity of the super-complaint and the super-complainant within a stipulated timeframe. However, there is little in the Bill about what will happen when a super-complaint is made, and much of the detail on how that process will work has been left to secondary legislation.

Does the Minister not think that it is strange to leave it up to the Secretary of State to determine how Ofcom is to deal with super-complaints? How does he envisage the system working, and what powers does he think Ofcom will need to be able to assert itself in relation to super-complaints? It seems odd to leave the answers to those important questions out of the Bill.

Kirsty Blackman (Aberdeen North) (SNP)

I appreciate the support from the Opposition in relation to amendment 153. I want to talk about amendment 153, but also about some of the issues with clauses 140 and 141—not so much 142. Clause 140(3) allows the Secretary of State to make regulations for working out who is an eligible entity for making super-complaints. The Minister has helpfully been very clear that the definition of groups working on behalf of consumers is likely to be pretty wide. The regulations made under that subsection will be made under the draft affirmative procedure. Although secondary legislation is not brilliant, the affirmative procedure will allow more scrutiny than the negative procedure. I appreciate that the Minister has chosen—or the people drafting the Bill have chosen—that way forward for deciding on the eligible entity.

I am concerned that when it comes to clause 141(1), the regulations setting out how the complaints process will work will be made under the negative procedure rather than under the draft affirmative procedure. I have the Delegated Powers and Regulatory Reform Committee memorandum, which tells us about each of the delegated powers in the Bill and the justification for them. I understand that the Department is referring to the Police Super-complaints (Designation and Procedure) Regulations 2018, which were made under the negative procedure. However, I am not convinced that the Policing and Crime Act 2017 left quite so little information about what would be included in those complaints. I think the justification for the negative procedure is not great, especially given the concerns raised about the over-reach of the Secretary of State’s powers and the amount of influence they have over Ofcom.

I think clause 142 is fine; it makes sense that Ofcom is able to make guidance. I would have liked to see the regulation part involve more input from parliamentarians. If there is not going to be more input from parliamentarians, there should at least be more in the Bill about how the complaints procedure would work. The reason we have tabled amendment 153 is to ensure that Ofcom provides a response. That response does not have to be a final response saying, “We have investigated everything and these are the findings.” I understand that that may take some time. However, Ofcom must provide a response to super-complainants within 90 days. Even if it were only to provide the information set out in clause 141(2)(d)—whether a complaint is within clause 140, whether it is admissible under clause 140, or whether an entity is an eligible entity—and we were to commit Ofcom to providing that information within 90 days, that would be better than the current drafting, which contains no time limit at all. It is not specified; the Bill does not say that Ofcom has to deal with the complaint within a certain length of time.

A quick response from Ofcom is important for a number of reasons. I expect that those people who are bringing super-complaints are likely to be third sector organisations. Such organisations do not have significant or excessive budgets. They will be making difficult choices about where to spend their money. If they are bringing forward a super-complaint, they will be doing it on the basis that they think it is incredibly important and it is worth spending their finite funding on legal advice in order to bring forward that super-complaint. If there is an unnecessary delay before Ofcom even recognises whether the complaint is eligible, charities may spend money unnecessarily on building up a further case for the next stages of the super-complaint. They should be told very quickly, “No, we are not accepting this” or “Yes, we are accepting this”.

Ofcom has the ability to levy fees so that it can provide the service that we expect it to provide as a result of the Bill. It will have a huge amount of extra work compared with its current work. It needs to be able to levy fees in order to fulfil its functions. If there is no timeline and it says, “We want to levy fees because we want to be able to respond on a 90-day basis”, it would not be beyond companies to come back and say, “That is unrealistic—you should not be charging us extra fees in order for you to have enough people to respond within a 90-day period to super-complaints.”

If Ofcom is to be able to levy fees effectively and provide the level of service that we would all—including, I am sure, the Minister—like to see for super-complainants, who are making very important cases on behalf of members of the public and of people who are being harmed by content online, and if Ofcom is to have that backing when it is setting the structures and levying the fees, it would be sensible for the Minister to make some commitments about the timelines for super-complaints.

In earlier clauses of the Bill, primacy is given to complaints to social media platforms, for example—to regulated providers—about freedom of speech. The Bill says that they are to give such complaints precedence. They are to deal with them as important and, where some content has been taken down, quickly. That precedence is written into the Bill. Such urgency is not included in these three clauses on super-complaints in the way I would like to see. The Bill should say that Ofcom has to deal with super-complaints quickly. I do not mean it should do that by doing a bad job. I mean that it should begin to investigate quickly, work out whether it is appropriate to investigate it under the super-complaints procedure, and then begin the investigation.

In some cases, stuff will be really urgent and will need to be dealt with very quickly, especially if, for example, it includes child sexual abuse images. That would need to be dealt with in a matter of hours or days, rather than any longer period.

I would like to see some sort of indication given to Ofcom about the timelines that we are expecting it to work to. Given the amount of work that third sector organisations have put in to support this Bill and try to make it better, this is a fairly easy amendment for the Minister to accede to—an initial response by Ofcom within a 90-day period; we are not saying overnight—so that everyone can be assured that the internet is, as the Minister wishes, a much safer place.

14:14
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

As we have heard, the super-complaint process is extremely important for enabling eligible entities representing the interests of users or members of the public to make representations where there are systemic problems that need to be addressed. I think we all agree that is an important approach.

Clauses 140 to 142 set out the power to make super-complaints, the procedure for making them and the guidance that Ofcom will publish in relation to them. The shadow Minister raised a few questions first, some of which we have touched on previously. In relation to transparency, which we have debated before, as I said previously, there are transparency provisions in clause 64 that I think will achieve the objectives that she set out.

The shadow Minister also touched on some of the questions about individual rather than systemic complaints. Again, we debated those right at the beginning, I think, when we discussed the fact that the approach taken in the Bill is to deal with systems and processes, because the scale involved here is so large. If we tried to create an architecture whereby Ofcom, or some other public body, adjudicated individual complaints, as an ombudsman would, it would simply be overwhelmed. A much better approach is to ensure that the systems and processes are fixed, and that is what the Bill does.

The hon. Member for Aberdeen North had some questions too. She touched in passing on the Secretary of State’s powers to specify by regulation who counts as an eligible entity—this is under clause 140(3). Of course, the nature of those regulations is circumscribed by the very next subsection, subsection (4), in which one of the criteria is that the entity

“must be a body representing the interests of users of regulated services, or members of the public”.

That speaks to the important point about consumers that we touched on this morning. As the hon. Lady said, this will be done by the affirmative procedure, so there is enhanced parliamentary scrutiny. I hope that makes it clear that it would be done in a reasonable way.

Kirsty Blackman

I am sorry to try the Minister’s patience. I think that we are in quite a lot of agreement about what an eligible entity looks like. I appreciate that this is being done by the affirmative procedure, but we seem to be in much less agreement about the next clause, which is being done by the negative procedure. I would like him to explain that contrast.

Chris Philp

Let me move on to clause 141 and amendment 153, which the hon. Lady spoke to a moment ago. Let us first talk about the question of time limits. As she said, the regulations that can be made under the clause include regulations on the time for various steps in the process. Rather than setting those out in the Bill, our intention is that when those regulations are moved they will include those time limits, but we want to consult Ofcom and other appropriate bodies to ensure that the deadlines set are realistic and reasonable. I cannot confirm now what those will be, because we have not yet done the consultation, but I will make a couple of points.

First, the steps set out in clause 141(2)(d)(i), (ii) and (iii), at the top of page 122, are essentially procedural steps about whether a particular complaint is in scope, whether it is admissible and whether the entity is eligible. Those should be relatively straightforward to determine. I do not want to pre-empt the consultation and the regulations, but my expectation is that those are done in a relatively short time. The regulations in clause 141(2)

“may…include provisions about the following matters”—

it then lists all the different things—and the total amount of time the complaint must take to resolve in its totality is not one of them. However, because the word “include” is used, it could include a total time limit. If the regulations were to set a total time limit, one would have to be a little careful, because clearly some matters are more complicated than others. The hon. Member for Aberdeen North acknowledged that we would not want to sacrifice quality and thoroughness for speed. If an overall time limit were set, it would have to accommodate cases that were so complicated or difficult, or that required so much additional information, that they could not be done in a period of, say, 90 days. I put on record that that is something that the consultation should carefully consider. We are proceeding in this way—with a consultation followed by regulations—rather than putting a time limit in the Bill because it is important to get this right.

The question was asked: why regulations rather than Ofcom? This is quite an important area, as the hon. Member for Aberdeen North and the shadow Minister—the hon. Member for Worsley and Eccles South—have said. This element of governmental and parliamentary oversight is important, hence our having regulations, rather than letting Ofcom write its own rules at will. We are talking about an important mechanism, and we want to make sure that it is appropriately responsive.

The question was asked: why will the regulations be subject to the negative, rather than the affirmative, procedure? Clearly that is a point of detail, albeit important detail. Our instinct was that the issue was perhaps of slightly less parliamentary interest than the eligible entity list, which will be keenly watched by many external parties. The negative procedure is obviously a little more streamlined. There is no hard-and-fast rule as to why we are using negative rather than affirmative, but that was broadly the thinking. There will be a consultation, in which Ofcom will certainly be consulted. Clause 141(3) makes it clear that others can be consulted too. That consultation will be crucial in ensuring that we get this right and that the process is as quick as it can be—that is important—but also delivers the right result. I gently resist amendment 153 and commend clauses 140 to 142.

Kirsty Blackman

Some Acts that this Parliament has passed have provided for a time limit within which something must be considered, but the time limit can be extended if the organisation concerned says to the Secretary of State, “Look, this is too complicated. We don’t believe that we can do this.” I think that was the case for the Subsidy Control Act 2022, but I have been on quite a few Bill Committees, so I may be wrong about that. That situation would be the exception, obviously, rather than the rule, and would apply only in the most complicated cases.

Chris Philp

The hon. Lady is suggesting a practical solution: a default limit that can be extended if the case is very complicated. That sort of structure can certainly be consulted on and potentially implemented in regulations. She referred to asking the Secretary of State’s permission. Opposition Members have been making points about the Secretary of State having too much power. Given that we are talking here about the regulator exercising their investigatory power, that kind of extension probably would not be something that we would want the Secretary of State’s permission for; we would find some other way of doing it. Perhaps the chief executive of Ofcom would have to sign it off, or some other body that is independent of Government.

Kirsty Blackman

Sorry, I phrased that quite badly. My point was more about having to justify things—having to say, “Look, we are sorry; we haven’t managed to do this in the time in which we were expected to. This is our justification”—rather than having to get permission. Apologies for phrasing that wrongly. I am glad that the Minister is considering including that point as something that could be suggested in the consultation.

I appreciate what the Minister says, but I still think we should have a time limit in the Bill, so I am keen to push amendment 153 to a vote.

Question put and agreed to.

Clause 140 accordingly ordered to stand part of the Bill.

Clause 141

Procedure for super-complaints

Amendment proposed: 153, in clause 141, page 121, line 32, after “140” insert

“, which must include the requirement that OFCOM must respond to such complaints within 90 days”—(Kirsty Blackman.)

Question put, That the amendment be made.

Division 45

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 10


Conservative: 10

Clauses 141 and 142 ordered to stand part of the Bill.
Clause 143
Statement of strategic priorities
Question proposed, That the clause stand part of the Bill.
The Chair

With this it will be convenient to discuss clause 144 stand part.

Alex Davies-Jones (Pontypridd) (Lab)

As we know, clause 143 introduces a power for the Secretary of State to set out a statement of the Government’s strategic priorities in relation to online safety matters. Given that the power is similar to those that already exist in the Communications Act 2003, we do not formally oppose the clause. We welcome the fact that the Secretary of State must follow a consultation and parliamentary procedure before proceeding. It is vital that transparency surrounds any targets or priorities that the Secretary of State may outline. However, we want to put on record our slight concerns around the frequency limitations on amendments that are outlined in subsections (7) and (8). This is a direct interference regime, and we would appreciate the Minister’s reassurances on the terms of how it will work in practice.

We also welcome clause 144, which sets out the consultation and parliamentary procedure requirements that must be satisfied before the Secretary of State can designate a statement of strategic priorities under clause 143. We firmly believe that parliamentary oversight must be at the heart of the Bill, and the Minister’s Back Benchers agree. We have heard compelling statements from the right hon. Member for Basingstoke and other colleagues about just how important parliamentary oversight of the Bill will be, even when it has received Royal Assent. That is why clause 144 is so important: it ensures that the Secretary of State must consult Ofcom when considering the statement of strategic priorities.

Following that, the draft statement must be laid before Parliament for proper scrutiny. As we have said before, this is central to the Bill’s chances of success, but Labour firmly believes that it would be unreasonable for us to expect the Secretary of State to always be an expert across every policy area out there, because it is not possible. That is why parliamentary scrutiny and transparency are so important. It is not about the politics; it is about all of us working together to get this right. Labour will support clause 144 because, fundamentally, it is for the Secretary of State to set out strategic priorities, but we must ensure that Parliament is not blocked from its all-important role in providing scrutiny.

Chris Philp

I thank the shadow Minister for her broad support for these two clauses. Clause 143 provides the power, but not an obligation, for the Secretary of State to set out a strategic statement on her priorities for online safety matters. As the shadow Minister said, it is similar to powers that already exist in other areas. The clause links back to clause 78, whereby Ofcom must have regard to the strategic priorities and set out how it responds to them when they are updated. On clause 144, I am glad that the shadow Minister accepts the consultation has to happen and that the 40-day period for Parliament to consider changes to the draft statement and, if it wishes to, to object to them is also a welcome opportunity for parliamentary scrutiny.

The Government have heard the wider points about parliamentary scrutiny and the functioning of the Joint Committee, which my right hon. Friend the Member for Basingstoke mentioned previously. I have conveyed them to higher authorities than me, so that transmission has occurred. I recognise the valuable work that the Joint Committee of the Commons and Lords did in scrutinising the Bill prior to its introduction, so I am glad that these clauses are broadly welcome.

Question put and agreed to.

Clause 143 accordingly ordered to stand part of the Bill.

Clause 144 ordered to stand part of the Bill.

Clause 145

Directions about advisory committees

Question proposed, That the clause stand part of the Bill.

14:30
Alex Davies-Jones

Labour supports the clause, which enables the Secretary of State to give Ofcom a direction to establish an expert committee to advise it on a specific online safety matter. As we have said repeatedly, it is vital that expert stakeholders are included as we begin the challenging process of regulating the internet. With that in mind, we need to ensure that the committee truly is expert and that it remains independent.

The Minister knows that I have concerns about Ofcom’s ability to remain truly independent, particularly given the recent decision to appoint a Tory peer to chair the organisation. I do not want to use our time today to make pointed criticisms about that decision—much as I would like to—but it is important that the Minister addresses these concerns. Ofcom must be independent—it really is quite important for the future success of the Bill. The expert committee’s chair, and its other members, must be empowered to report freely and without influence. How can the Minister ensure that that will genuinely be the case?

Subsection (4) places a duty on an advisory committee established under such a direction to publish a report within 18 months of its being established. I want to push the Minister on the decision to choose 18 months. I have mentioned my concerns about that timeframe; it seems an awfully long time for the industry, stakeholders, civil society and, indeed, Parliament to wait. I cannot be clearer about how important a role I think that this committee will have, so I would be grateful if the Minister could clarify why he thinks it will take 18 months for such a committee to be established.

That said, we broadly support the principles of what the clause aims to do, so we have not sought to amend it at this stage.

Chris Philp

I thank the shadow Minister for her comments and questions. She raised two substantive points on the clause; I will address those, rather than any wider issues that may be contentious.

The first question was about whether the advisory committee would be independent, and how we can be certain that it will not be unduly interfered in by the Government. The answer lies clearly in subsection (3). Paragraphs (a) and (b) make it very clear that although the Secretary of State may direct Ofcom to establish the committee, the identity of the people on the committee is for Ofcom to determine. Subsection (3)(a) states very clearly that the chairman is “appointed by OFCOM”, and subsection (3)(b) states that members of the committee are

“appointed by OFCOM as OFCOM consider appropriate.”

It is Ofcom, not the Secretary of State, that appoints the chair and the members. I trust that that deals with the question about the independence of the members.

On the second question, about time, the 18 months is not 18 months for the committee to be established—I am looking at clause 145(4)—but 18 months for the report to be published. Subsection (4) says “within” a period of 18 months, so it does not have to be 18 months for delivery of the report; it could be less, and I am sure that in many cases it will be. I hope that answers the shadow Minister’s questions on the clause, and I agree that it should stand part of the Bill.

Question put and agreed to.

Clause 145 accordingly ordered to stand part of the Bill.

Clause 146

Directions in special circumstances

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss new clause 10—Special circumstances—

“(1) This section applies where OFCOM has reasonable grounds for believing that circumstances exist that present a threat—

(a) to the health or safety of the public, or

(b) to national security.

(2) OFCOM may, in exercising their media literacy functions, give priority for a specified period to specified objectives designed to address the threat presented by the circumstances mentioned in subsection (1).

(3) OFCOM may give a public statement notice to—

(a) a specified provider of a regulated service, or

(b) providers of regulated services generally.

(4) A ‘public statement notice’ is a notice requiring a provider of a regulated service to make a publicly available statement, by a date specified in the notice, about steps the provider is taking in response to the threat presented in the circumstances mentioned in subsection (1).

(5) OFCOM may, by a public statement notice or a subsequent notice, require a provider of a regulated service to provide OFCOM with such information as they may require for the purpose of responding to that threat.

(6) If OFCOM takes any of the steps set out in this Chapter, they must publish their reasons for doing so.

(7) In subsection (2) ‘media literacy functions’ means OFCOM’s functions under section 11 of the Communications Act (duty to promote media literacy), so far as functions under that section relate to regulated services.”

This new clause gives Ofcom the power to take particular steps where it considers that there is a threat to the health and safety of the public or to national security, without the need for a direction from the Secretary of State.

Alex Davies-Jones

As we all know, the clause as it stands enables the Secretary of State to give Ofcom directions in circumstances where it considers that there is a threat to the health or safety of the public or to national security. That includes directing Ofcom to prioritise action to respond to a specific threat when exercising its media literacy functions, and to require specified service providers, or providers of regulated services more generally, to publicly report on what steps they are taking to respond to that threat.

However, Labour shares the concerns of the Carnegie UK Trust, among others, that there is no meaningful constraint on the Secretary of State’s powers to intervene as outlined in the clause. Currently, the Secretary of State has the power to direct Ofcom where they have “reasonable grounds for believing” that there is a threat to the public’s health or safety or to national security. The UK did not need these powers before—during the cold war, for example—so we have to ask: why now?

Chris Philp

So far as I am aware, the phenomenon of social media companies, to which media literacy relates, did not exist during the cold war.

Alex Davies-Jones

It did not, but there were examples of disinformation, misinformation and the spreading of falsehoods, and none of these powers existed at the time. It seems weird—if I can use that term—that these exist now. Surely, the more appropriate method would be for the Secretary of State to write a letter to Ofcom to which it had to have regard. As it stands, this dangerous clause ensures the Secretary of State has the power to interfere with day-to-day enforcement. Ultimately, it significantly undermines Ofcom’s overall independence, which we truly believe should be at the heart of the Bill.

With that in mind, I will now speak to our crucial new clause 10, which instead would give Ofcom the power to take particular steps, where it considers that there is a threat to the health and safety of the public or national security, without the need for direction from the Secretary of State. Currently, there is no parliamentary scrutiny of the powers outlined in clause 146; it says only that the Secretary of State must publish their reasoning unless national security is involved. There is no urgency threshold or requirement in the clause. The Secretary of State is not required to take advice from an expert body, such as Public Health England or the National Crime Agency, in assessing reasonable grounds for action. The power is also not bounded by the Bill’s definition of harm.

These directions do two things. First, they direct Ofcom to use its quite weak media literacy duties to respond to the circumstances. Secondly, a direction turns on a power for Ofcom to ask a platform to produce a public statement about what the platform is doing to counter the circumstances or threats in the direction order—that is similar in some ways to the treatment of harm to adults. This is trying to shame a company into doing something without actually making it do it. The power allows the Secretary of State directly to target a given company. There is potential for the misuse of such an ability.

The explanatory notes say:

“the Secretary of State could issue a direction during a pandemic to require OFCOM to: give priority to ensuring that health misinformation and disinformation is effectively tackled when exercising its media literacy function; and to require service providers to report on the action they are taking to address this issue.”

Recent experience of the covid pandemic and the Russian invasion of Ukraine suggests that the Government can easily legislate when required in an emergency and can recall Parliament. The power in the Bill is a strong power, cutting through regulatory independence and targeting individual companies to produce quite a weak effect. It is not being justified as an emergency power where the need to move swiftly is paramount. Surely, if a heavier-duty action is required in a crisis, the Government can legislate for that and explain to Parliament why the power is required in the context of a crisis.

Kim Leadbeater (Batley and Spen) (Lab)

It is really important to make sure that the Bill does not end up being a cover for the Secretary of State of the day to significantly interfere with the online space, both now and in the future. At the moment, I am not satisfied that the Secretary of State’s powers littered through the Bill are necessary. I share other hon. Members’ concerns about what this could mean for both the user experience and online safety more broadly. I hope my hon. Friend agrees that the Minister needs to provide us—not just us here today, but civil society and others who might be listening—with more reassurance that the Secretary of State’s powers really are necessary.

Alex Davies-Jones

I completely agree with my hon. Friend. We talk time and again about this Bill being world leading, but with that comes a responsibility to show global leadership. Other countries around the world will be looking to us, and this Parliament, when they adopt their own, similar legislation, and we need to be mindful of that when looking at what powers we give to a Secretary of State—particularly in overruling any independence of Ofcom or Parliament’s sovereignty for that matter.

New clause 10 provides a viable alternative. The Minister knows that this is an area where even his Back Benchers are divided. He must closely consider new clause 10 and recognise that placing power in Ofcom’s hands is an important step forward. None of us wants to see a situation where the Secretary of State is able to influence the regulator. We feel that, without this important clause and concession, the Government could be supporting a rather dangerous precedent in terms of independence in regulatory systems more widely.

Kirsty Blackman

I want to talk about a specific example. Perhaps the Minister will be able to explain why the legislation is written this way around when I would have written it the opposite way around, much more in line with proposed new clause 10.

Snapchat brought in the Snap Map feature, which involved having geolocation on every individual’s phone; whenever anyone took a photo to put it on Snapchat, that geolocation was included. The feature was automatically turned on for all Snapchat users when it first came in, I think in 2017. No matter what age users were, when they posted their story on Snapchat, which is available to anyone on their friends list and sometimes wider, anyone could see where they were. If a child had taken a photo at their school and put it on Snapchat, anyone could see what school they went to. It was a major security concern for parents.

That very concerning situation genuinely could have put at risk children and other vulnerable people who may not even have known that the feature had been turned on by default, and who would not know how to turn on ghost mode in Snapchat so as not to post their location. The situation could have been helped if media literacy duties had kicked in that meant that the regulator had to say, “This is a thing on Snapchat: geolocation is switched on. Please be aware of this if your children or people you are responsible for are using Snapchat.”

Chris Philp

Is the hon. Member aware of a similar situation that arose more recently with Strava? People’s running routes were publicly displayed in the same way, which led to incidents of stalking.

Kirsty Blackman

I was aware that Strava did that mapping, which is why my friends list on Strava numbers about two people, but I was not aware that it had been publicly displayed. There are similar issues with routes being public on things such as Garmin, so it is important to keep a note of that. I did not know that that information was public on Strava. If Ofcom had had the duty to ensure that people were aware of that, it would have been much easier for parents and vulnerable adults to take those decisions or have them taken on their behalf.

My reading of the clause is that if Ofcom comes across a problem, it will have to go and explain to the Secretary of State that it is a problem and get the Secretary of State to instruct it to take action. I do not think that makes sense. We have talked already about the fact that the Secretary of State cannot be an expert in everything. The Secretary of State cannot necessarily know the inner workings of Snapchat, Strava, TikTok and whatever other new platforms emerge. It seems like an unnecessary hurdle to stop Ofcom taking that action on its own, when it is the expert. The Minister is likely to say that the Secretary of State will say, “Yes, this is definitely a problem and I will easily instruct you to do this”—

Kirsty Blackman

The Minister will get the chance to make a proper speech in which he can respond.

It could be that the process is different from the one I see from reading the Bill. The Minister’s clarifications will be helpful to allow everyone to understand how the process is supposed to work, what powers Ofcom is supposed to have and whether it will have to wait for an instruction from the Secretary of State, which is what it looks like. That is why proposed new clause 10 is so important, because it would allow action to be taken to alert people to safety concerns. I am focusing mostly on that.

I appreciate that national security is also very important, but I thought I would take the opportunity to highlight specific concerns with individual platforms and to say to the Minister that we need Ofcom to be able to act and to educate the public as well as it possibly can, and to do so without having to wait for an instruction.

14:44
Chris Philp

Let me start by addressing the point that was raised by the hon. Member for Aberdeen North on Ofcom’s power to issue media literacy advice of its own volition, which is the subject of new clause 10. Under section 11 of the Communications Act 2003, Ofcom already has the power to issue media literacy guidance on issues such as Snapchat geolocation, the Strava map location functionality that I mentioned, and the other example that came up. Ofcom does not need the Secretary of State’s permission to do that, as it already has the power to do so. The power that new clause 10 would confer on Ofcom already exists.

Alex Davies-Jones

The Minister says that Ofcom can already use that existing power, so why does it not do so?

Chris Philp

That is obviously an operational matter for Ofcom. We would encourage it to do as much as possible. We encouraged it through our media literacy strategy, and it published an updated policy on media literacy in December last year. If Members feel that there are areas of media literacy in which Ofcom could do more, they will have a good opportunity to raise those questions when senior Ofcom officials next appear before the Digital, Culture, Media and Sport Committee or any other parliamentary Committee.

The key point is that the measures in new clause 10 are already in legislation, so the new clause is not necessary. The Secretary of State’s powers under clause 146 do not introduce a requirement for permission—they are two separate things. In addition to Ofcom’s existing powers to act of its own volition, the clause gives the Secretary of State powers to issue directions in certain very limited circumstances. A direction may be issued where there is a present threat—I stress the word “threat”—to the health or safety of the public or to national security, and only in relation to media literacy. We are talking about extremely narrowly defined powers.

Kirsty Blackman

The Minister said “a present threat”, but the clause says “present a threat”. The two mean different things. To clarify, could he confirm that he means “present a threat”?

Chris Philp

The hon. Lady is quite right to correct me. I do mean “present a threat”, as it is written in the Bill—I apologise for inadvertently transposing the words.

Is it reasonable that the Secretary of State has those very limited and specific powers? Why should they exist at all? Does this represent an unwarranted infringement of Ofcom’s freedom? I suppose those are the questions that the Opposition and others might ask. The Government say that, yes, it is reasonable and important, because in those particular areas—health and safety, and national security—there is information to which only the Government have access. In relation to national security, for example, information gathered by the UK intelligence community—GCHQ, the Secret Intelligence Service and MI5—is made available to the Government but not more widely. It is certainly not information that Ofcom would have access to. That is why the Secretary of State has the power to direct in those very limited circumstances.

I hope that, following that explanation, the Committee will see that new clause 10 is not necessary because it replicates an existing power, and that clause 146 is a reasonable provision.

Alex Davies-Jones

I welcome the Minister’s comments, but I am not convinced by his arguments on the powers given to the Secretary of State on issues of national security or public health and safety. Parliament can be recalled and consulted, and Members of Parliament can have their say in the Chamber on such issues. It should not be up to the Secretary of State alone to direct Ofcom and challenge its independence.

Chris Philp

I understand the shadow Minister’s point, but recalling Parliament during a recess is extremely unusual. I am trying to remember how many times it has happened in the seven years that I have been here, and I can immediately recall only one occasion. Does she think that it would be reasonable and proportionate to recall 650 MPs in recess for the purpose of issuing a media literacy directive to Ofcom?

Alex Davies-Jones

I think the Minister has just made my point for me. If he does not see this happening only in extreme circumstances where a threat is presented or there is an immediate risk to public health and safety, how many times does he envisage the power being used? How many times will the Secretary of State have the power to overrule Ofcom if the power is not to be used only in those unique situations where it would be deemed appropriate for Parliament to be recalled?

Chris Philp

It is not overruling Ofcom; it is offering a direction to Ofcom.

Alex Davies-Jones

Yes—having direct influence on a regulator, overruling its independence and taking the stance directly themselves. The Minister has made my point for me: if he does not envisage the power being used only in unique circumstances where Parliament would need to be recalled to have a say, it will be used a lot more often than he suggests.

With that in mind, the Opposition will withhold our support for clause 146, in order to progress with new clause 10. I place on record the Labour party’s distinct concerns with the clause, which we will seek to amend on Report.

Dan Carden (Liverpool, Walton) (Lab)

I add my voice to the concerns that have been raised about the clause, and about the powers for the Secretary of State that are littered throughout the Bill. This comes on top of the scandals around the public appointments process that we have seen under this Government—even around the role of chair of Ofcom, which they tried to hand to a former editor of the Daily Mail, Paul Dacre. Earlier this year, Lord Grade was appointed for a four-year term. He is on £140,000-odd a year. The Secretary of State is responsible for appointing the whole board of Ofcom. I really do wonder why, on top of the power that the Government hold in the appointments process, they need the Secretary of State to have the powers of intervention that the Bill affords her.

The Chair

Minister, do you wish to respond?

Chris Philp

I have nothing further to add.

Question put and agreed to.

Clause 146 accordingly ordered to stand part of the Bill.

Clause 147

Secretary of State’s guidance

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

It seems that our support for the clauses has run out. Clause 147 enables the Secretary of State to give guidance to Ofcom relating to its exercise of its statutory powers and functions under the Bill. It also allows the Secretary of State to give guidance to Ofcom around its functions and general powers under certain provisions of the Communications Act 2003. While we appreciate that the Secretary of State must consult Ofcom before issuing, revising or replacing guidance, we feel that this level of interference is unnecessary.

The Minister must recognise that the clause allows for an incredibly granular level of interference by the Secretary of State in the day-to-day functioning of a supposedly independent regulator. It profoundly interferes with enforcement and once again broadly undermines Ofcom’s independence. Civil society and stakeholders alike share our concerns. I must press the Minister on why this level of interference is included in the Bill—what is the precedent? We have genuine concerns that the fundamental aims of the Bill—to keep us all safe online—could easily be shifted according to the priorities of the Secretary of State of the day. We also need to ensure there is consistency in our overall approach to the Bill. Labour feels that this level of interference will cause the Bill to lack focus.

Ultimately, Ofcom, as the independent regulator, should be trusted to do what is right. The Minister must recognise how unpopular the Bill’s current approach of giving overarching powers to the Secretary of State is. I hope he will go some way to addressing our concerns, which, as I have already said, we are not alone in approaching him with. For those reasons, we cannot support clause 147 as it stands.

Chris Philp

We are introducing a new, groundbreaking regime, and we are trying to strike a balance between the need for regulatory independence of Ofcom and appropriate roles for Parliament and Government. There is a balance to strike there, particularly in an area such as this, which has not been regulated previously. It is a brand-new area, so we do not have decades of accumulated custom and practice to draw on. We are creating this from the ground up—from a blank sheet of paper.

That is why, in establishing this regime, we want to provide a facility for high-level strategic guidance to be given to Ofcom. Of course, that does not infringe on Ofcom’s day-to-day operations; it will continue to do those things itself, in taking decisions on individual enforcement matters and on the details around codes of practice. All those things, of course, remain for Ofcom.

We are very clear that guidance issued under clause 147 is strategic in nature and will not stray into the operational or organisational matters that should properly fall into the exclusive ambit of the independent regulator. There are a number of safeguards in the clause to ensure that the power is exercised in the way that I have just described and does not go too far.

First, I point to the fact that clause 147(8) simply says that

“OFCOM must have regard to the guidance”.

That is obviously different from a hard-edged statutory obligation for it to follow the guidance in full. Of course, it does mean that Ofcom cannot ignore it completely—I should be clear about that—but it is different from a hard-edged statutory obligation.

There is also the requirement for Ofcom to be consulted, so that its opinions can be known. Of course, being consulted does not mean that the opinions will be followed, but it means that they will be sought and listened to. There are also some constraints on how frequently this strategic guidance can be revised, to ensure that it does not create regulatory uncertainty by being chopped and changed on an unduly frequent basis, which would cause confusion.

Kirsty Blackman

I have a question about subsection (4)(b), which says that the guidance can be replaced more frequently than once every three years. I understand subsection (4)(a)—that is fine—but subsection (4)(b) says that the guidance can be changed if

“the revision or replacement is by agreement between the Secretary of State and OFCOM.”

How will those of us who are not the Secretary of State or Ofcom know that there has been an agreement that the guidance can be changed and that the Secretary of State is not just acting on their own? If the guidance is changed because of an agreement, will there be a line in the guidance that says, “The Secretary of State has agreed with Ofcom to publish this only 1.5 years after the last guidance was put out, because of these reasons”? In the interests of transparency, it would be helpful for something like that to be included in the guidance, if it was being changed outside the normal three-year structure.

Chris Philp

It is better than being in the guidance, which is non-statutory, because it is in the Bill—it is right here in front of us in the measure that the hon. Lady just referred to, clause 147(4)(b). If the Secretary of State decided to issue updated guidance in less than three years without Ofcom’s consent, that would be unlawful; that would be in breach of this statute, and it would be a very straightforward matter to get that struck down. It would be completely illegal to do that.

My expectation would be that if updated guidance was issued in less than three years, it would be accompanied by written confirmation that Ofcom had agreed. I imagine that if a future Secretary of State—I cannot imagine the current Secretary of State doing it—published guidance in less than three years without Ofcom’s consent, Ofcom would not be shy in pointing that out, but to do that would be illegal. It would be unlawful; it would be a breach of this measure in the Bill.

I hope that the points that I have just made about the safeguards in clause 147, and the assurance and clarity that I have given the Committee about the intent that guidance will be at the strategic level rather than the operational level, give Members the assurance they need to support the clause.

Question put, That the clause stand part of the Bill.

Division 46

Ayes: 10


Conservative: 10

Noes: 5


Labour: 4
Scottish National Party: 1

Clause 147 ordered to stand part of the Bill.

Clause 148

Annual report on the Secretary of State’s functions

Question proposed, That the clause stand part of the Bill.

15:00
Alex Davies-Jones

I will be brief. The clause is incredibly important. It requires the Secretary of State to prepare and lay before Parliament annual reports about their performance in relation to online safety. We fully support such transparency. That is all we want—we want it to go further. That is what we have been trying to say in Committee all day. We agree in principle and therefore have not sought to amend the clause.

Chris Philp

I could not possibly add to that exceptionally eloquent description.

Question put and agreed to.

Clause 148 accordingly ordered to stand part of the Bill.

Clause 149

Review

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

As we know, the clause compels the Secretary of State to undertake a review to assess the effectiveness of the regulatory framework. The review will have to be published and laid before Parliament, which we welcome. However, we note the broad time limits on this duty. We have heard repeatedly about the challenges that delays to the Bill’s full implementation will cause, so I urge the Minister to consider that point closely. By and large, though, we absolutely support the clause, especially as the Secretary of State will be compelled to consult Ofcom and other appropriate persons when carrying out its review—something that we have called for throughout scrutiny of the Bill. We only wish that that level of collaboration had been accepted by the Minister on the other clauses. I will not waste time repeating points that I have already made. We support the clause.

Chris Philp

I welcome the shadow Minister’s support for this review clause, which is important. I will not add to her comments.

Question put and agreed to.

Clause 149 accordingly ordered to stand part of the Bill.

Clause 150

Harmful communications offence

Kim Leadbeater

I beg to move amendment 112, in clause 150, page 127, line 28, at end insert “and;

(b) physical harm that has been acquired as a consequence of receiving the content of a message sent online.”

This amendment would expand the definition of harm for the purposes of the harmful communications offence to incorporate physical harm resulting from messages received online.

The Chair

With this it will be convenient to discuss amendment 113, in clause 150, page 127, line 28, at end insert “; or

(b) physical harm resulting from an epileptic seizure, where the seizure has been triggered by the intentional sending of flashing images to a person with epilepsy.”

Kim Leadbeater

I move the amendment in my name and will speak to amendment 113, which is in the name of the hon. Member for Blackpool North and Cleveleys (Paul Maynard).

The amendment would put into effect Zach’s law in full. Zach, as many Members know, is an amazing, energetic and bright young boy from my constituency. I had the absolute pleasure of visiting Zach and his mum Clare at their home in Hartshead a few weeks ago. We chatted about school and his forthcoming holiday, and he even invited me to the pub. However, Zach also has epilepsy.

Disgustingly, he was trolled online a few years ago and sent flashing images by bullies, designed to trigger his condition and give him an epileptic seizure, a seizure that not only would cause him and his family great distress, but can be extremely dangerous and cause Zach significant psychological and physical harm. I know that we are all united in our disgust at such despicable actions and committed to ensuring that this type of unbelievable online bullying is against the law under the Bill.

On Second Reading, I raised the matter directly with the Minister and I am glad that he pointed to clause 150 and stated very explicitly that subsection (4) will cover the type of online harm that Zach has encountered. However, we need more than just a commitment at the Dispatch Box by the Minister, or verbal reassurances, to protect Zach and the 600,000 other people in the UK with epilepsy.

The form of online harm that Zach and others with epilepsy have suffered causes more than just “serious distress”. Members know that the Bill as drafted lists

“psychological harm amounting to at least serious distress”

as a qualifying criterion of the offence. However, I believe that does not accurately and fully reflect the harm that epilepsy trolling causes, and that it leaves a significant loophole that none of us here wish to see exploited.

For many people with epilepsy, the harm caused by this vicious online trolling is not only psychological but physical too. Seizures are not benign events. They can result in broken bones, concussion, bruises and cuts, and in extreme cases can be fatal. It is simply not right to argue that physical harm is intrinsically intertwined with psychological harm. They are different harms with different symptoms. While victims may experience both, that is not always the case.

Professor Sander, medical director of the Epilepsy Society and professor of neurology at University College London Hospitals NHS Foundation Trust, who is widely considered one of the world’s leading experts on epilepsy, has said:

“Everyone experiences seizures differently. Some people may be psychologically distressed by a seizure and not physically harmed. Others may be physically harmed but not psychologically distressed. This will vary from person to person, and sometimes from seizure to seizure depending on individual circumstances.”

Amendment 112 will therefore expand the scope of clause 150 and insert on the face of the Bill that an offence will also be committed under the harmful communications clause when physical harm has occurred as a consequence of receiving a message sent online with malicious intent. In practical terms, if a person with epilepsy were to receive a harmful message online that triggers their epilepsy and they subsequently fall off their chair and hit their head, that physical harm will be proof of a harmful communication offence, without the need to prove any serious psychological distress that may have been caused.

This simple but effective amendment, supported by the Epilepsy Society, will ensure that the horrific trolling that Zach and others with epilepsy have had to endure will be covered in full by the Bill. That will mean that the total impact that such trolling has on the victims is reflected beyond solely psychological distress, so there can be no ambiguity and nowhere for those responsible for sending these images and videos to hide.

I am aware that the Minister has previously pointed to the possibility of a standalone Bill—a proposal that is under discussion in the Ministry of Justice. That is all well and good, but that should not delay our action when the Bill before us is a perfectly fit legislative vehicle to end epilepsy trolling, as the Law Commission report recommended.

I thank colleagues from across the House for the work they have done on this important issue. I sincerely hope that the amendment is one instance where we can be united in this Committee. I urge the Minister to adopt amendment 112, to implement Zach’s law in full and to provide the hundreds of thousands of people across the UK living with epilepsy the legal protections they need to keep them safe online. It would give me no greater pleasure than to call at Zach’s house next time I am in the area and tell him that this is the case.

Dean Russell (Watford) (Con)

May I praise the hon. Member for Batley and Spen for such an eloquent and heartfelt explanation of the reason why this amendment to the Bill is so important?

I have been campaigning on Zach’s law for the past nine months. I have spoken to Zach multiple times and have worked closely with my hon. Friend the Member for Stourbridge (Suzanne Webb) in engaging directly with Facebook, Twitter and the big platforms to try to get them to do something, because we should not need to have a law to stop them sending flashing images. We had got quite far a few months ago, but now that seems to have stalled, which is very frustrating.

I am stuck between my heart and my head on this amendment. My heart says we need to include the amendment right now, sort it out and get it finalised. However, my head says we have got to get it right. During the Joint Committee for Online Safety before Christmas and in the evidence sessions for this Bill, we heard that if the platforms want to use a loophole and get around things they will. I have even seen that with regard to the engagements and the promises we have had.

Kirsty Blackman

I wonder whether the hon. Gentleman would consider a belt and braces approach as the best way forward? We could have it in the Bill and have the other legislation, in order that this will definitely protect people and companies will not be able to wriggle out of it.

Dean Russell

That is an excellent point. I have yet to make up my mind which way to vote if the amendment is pressed to a vote; I do not know whether this is a probing amendment. Having spoken to the Epilepsy Society and having been very close to this issue for many months, for me to feel comfortable, I want the Minister not just to say, as he has said on the Floor of the House, to me personally, in meetings and recently here, that the clause should cover epilepsy, and does seem to, and that he is very confident of that, but to give some assurance that we will change the law in some form.

Kim Leadbeater

I am incredibly grateful for the hon. Member’s comments and contribution. I agree wholeheartedly. We need more than a belief and an intention. There is absolutely no reason why we cannot have this in black and white in the Bill. I hope he can find a way to do the right thing today and vote for the amendment.

Dean Russell

The phrase “Do the right thing” is at the heart of this. My hon. Friend the Member for Ipswich (Tom Hunt) presented the Flashing Images Bill yesterday. A big part of this is about justice. I am conscious that we have got to get the balance right; stopping this happening has an impact for the people who choose to do this. I am keen to hear what the Minister says. We have got to get this right. I am keen to get some assurances, which will very much sway my decision on the vote today.

Caroline Ansell (Eastbourne) (Con)

At the risk of following my earlier voting pattern, I am also very much with the hon. Member for Batley and Spen in spirit. I could not do the subject any more justice than she has, describing this appalling online behaviour and just how damaging it is. I am a member of the all-party parliamentary group on epilepsy and have lived experience myself.

I want to highlight the comments of the Epilepsy Society, which I am sure is following our work this afternoon. It welcomes many of the introductions to the Bill, but highlights something of a legislative no man’s land. Clause 187 mentions physical harm, but does not apply to clause 150. Clause 150 only covers psychological harm when, as we have heard described, many seizures result in physical harm and some of that is very serious. I know the Minister is equally committed to see this measure come about and recognises the points we have demonstrated. The hon. Lady is right that we are united. I suspect the only point on which there might be some difference is around timing. I will be looking to support the introduction and the honouring in full of Zach’s law before the Bill is passed. There are many other stages.

My understanding is that many others wish to contribute, not least the Ministry of Justice. My hope, and my request to the Minister, is that those expert stakeholder voices will be part of the drafting, should it not be the case that supporting the amendment presented today is the very best and strongest way forward. I want to see recognition in law.

Chris Philp

Amendment 112 is clearly very important. As my hon. Friend the Member for Watford pointed out, I have already said that I believe that clause 150 goes a long way to address the various issues that have been raised. Since my hon. Friends the Members for Eastbourne and for Watford, and the hon. Member for Batley and Spen have been raising this issue—my hon. Friends have been lobbying me on this issue persistently and frequently, behind closed doors as well as publicly, and the hon. Member for Batley and Spen has been campaigning on this publicly with great tenacity and verve—the Government and the MOJ have been further considering the Law Commission’s recommendations, which I referenced on Second Reading. Subsequent to Second Reading and the lobbying by the three Members who have just spoken—the hon. Member for Batley and Spen, and my hon. Friends the Members for Watford and for Eastbourne—I can now announce to the Committee that the Government have decided to enact the Law Commission’s recommendations, so there will be a new and separate standalone offence that is specific to epilepsy for the very first time. I can firmly commit to that and announce it today.

15:15
The question then arises which legislative vehicle the offence will go in. I am aware of the private Member’s Bill, but it will take a very long time and we probably would not want to rely on it, so I am in the process of getting cross-Government agreement on which legislative vehicle will be used. I do not want to say any more about that now, because it is still subject to collective agreement, but I am expecting to come back to the House on Report and confirm which Bill the measure will go in.

Kim Leadbeater

I genuinely appreciate the Minister’s comments, but why would we spend more time doing other pieces of legislation when we can do it right here and right now? The amendment will solve the problem without causing any more pain or suffering over a long period of time.

Chris Philp

One of the pieces of legislation that could be used is this Bill, because it is in scope. If the hon. Lady can bear with me until Report, I will say more about the specific legislative vehicle that we propose to use.

On the precise wording to be used, I will make a couple of points about the amendments that have been tabled—I think amendment 113 is not being moved, but I will speak to it anyway. Amendment 112, which was tabled by the hon. Member for Batley and Spen, talks about bringing physical harm in general into the scope of clause 150. Of course, that goes far beyond epilepsy trolling, because it would also bring into scope the existing offence of assisting or encouraging suicide, so there would be duplicative law: there would be the existing offence of assisting or encouraging suicide and the new offence, because a communication that encouraged physical harm would do the same thing.

If we included all physical harm, it would duplicate the proposed offence of assisting or encouraging self-harm that is being worked on by the Ministry of Justice and the Law Commission. It would also duplicate offences under the Offences Against the Person Act 1861, because if a communication caused one person to injure another, there would be duplication between the offence that will be created by clause 150 and the existing offence. Clearly, we cannot have two offences that criminalise the same behaviour. To the point made by the hon. Member for Aberdeen North, it would not be right to create two epilepsy trolling offences. We just need one, but it needs to be right.

Kirsty Blackman

Will the Minister give way?

Chris Philp

In a second.

The physical harm extension goes way beyond the epilepsy point, which is why I do not think that that would be the right way to do it, although the Government have accepted that we will do it and need to do it, but by a different mechanism.

I was about to speak to amendment 113, the drafting of which specifically mentions epilepsy and which was tabled by my hon. Friend the Member for Blackpool North and Cleveleys (Paul Maynard), but was the hon. Lady’s question about the previous point?

Kirsty Blackman

My question was about the announcement that the Minister is hoping to make on Report. I appreciate that he has committed to introduce the new offence, which is great. If the Bill is to be the legislative vehicle, does he expect to amend it on Report, or does he expect that that will have to wait until the amendment goes through the Lords?

Chris Philp

That is a good question, and it ties into my next point. Clearly, amendment 113 is designed to create a two-sentence epilepsy trolling offence. When trying to create a brand-new offence—in this case, epilepsy trolling—it is unlikely that two sentences’ worth of drafting will do the trick, because a number of questions need to be addressed. For example, the drafting will need to consider what level of harm should be covered and exactly what penalty would be appropriate. If it was in clause 150, the penalty would be two years, but it might be higher or lower, which needs to be addressed. The precise definitions of the various terms need to be carefully defined as well, including “epilepsy” and “epileptic seizures” in amendment 113, which was tabled by my hon. Friend the Member for Blackpool North and Cleveleys. We need to get proper drafting.

My hon. Friend the Member for Eastbourne mentioned that the Epilepsy Society had some thoughts on the drafting. I know that my colleagues in the Ministry of Justice and, I am sure, the office of the parliamentary counsel, would be keen to work with experts from the Epilepsy Society to ensure that the drafting is correct. Report will likely be before summer recess—it is not confirmed, but I am hoping it will be—and getting the drafting nailed down that quickly would be challenging.

I hope that, in a slightly indirect way, that answers the question. We do not have collective agreement about the precise legislative vehicle to use; however, I hope it addresses the questions about how the timing and the choreography could work.

Kim Leadbeater

We have talked a lot about the Epilepsy Society this afternoon, and quite rightly too, as they are the experts in this field. My understanding is that it is perfectly happy with the language in this amendment—

Chris Philp

Which one?

Kim Leadbeater

Amendment 112. I think that the Epilepsy Society feels that this would be covered. I am also confused, because the Minister said previously that it was his belief and intention that this clause would cover epilepsy trolling, but he is now acknowledging that it does not. Why would we not, therefore, just accept the amendment that covers it and save everybody a lot of time?

Chris Philp

Representations have been made by the three Members here that epilepsy deserves its own stand-alone offence, and the Government have just agreed to do that, so take that as a win. On why we would not just accept amendment 112, it may well cover epilepsy, and may well cover it to the satisfaction of the Epilepsy Society, but it also, probably inadvertently, does a lot more than that. It creates a duplication with the offence of assisting or encouraging suicide.

Kim Leadbeater

Surely that is almost a bonus?

Chris Philp

No, it is not a bonus, because we cannot have two different laws that criminalise the same thing. We want to have laws that are, essentially, mutually exclusive. If a person commits a particular act, it should be clear which Act the offence is being committed under. Imagine that there were two different offences for the same act with different sentences—one is two years and one is 10 years. Which sentence does the judge then apply? We do not want to have law that overlaps, where the same act is basically a clear offence under two different laws. Just by using the term “physical harm”, amendment 112 creates that. I accept that it would cover epilepsy, but it would also cover a whole load of other things, which would then create duplication.

That is why the right way to do this is essentially through a better drafted version of amendment 113, which specifically targets epilepsy. However, it should be done with drafting that has been done properly—with respect to my hon. Friend the Member for Blackpool North and Cleveleys, who drafted the amendment—with definitions that are done properly, and so on. That is what we want to do.

Dean Russell

Having been involved on this Bill for quite a while now and having met Zach, I know the concerns that the Epilepsy Society have had. For me, we just need the Minister to tell us, which I think he has, that this will become law, whatever the vehicle for that is. If we know that this will be an offence by the end of this year—hopefully by summer, if not sooner—so that people cannot send flashing images to people with epilepsy, like Zach, then I will feel comfortable in not backing the amendment, on the premise that the Government will do something, moving forward. Am I correct in that understanding?

Chris Philp

Yes. Just to be clear, in no world will a new law pass by the summer recess. However, I can say that the Government are committed, unequivocally, to there being a new offence in law that will criminalise epilepsy trolling specifically. That commitment is categoric. The only matter on which I need to come back to the House, which I will try to do on Report, is to confirm specifically which Bill that offence will go in. The commitment to legislate is made unequivocally today.

Caroline Ansell

I welcome the Minister’s announcement and that commitment. I particularly welcome that the new offence will have epilepsy in the title. People who seek out those who may be triggered and have seizures to cause this harm use all sorts of tags, organisations and individuals to deliberately and specifically target those who suffer from epilepsy. It is therefore wholly right that this new offence, whether in this Bill or another, cites epilepsy, because those who would seek to do harm know it and call it that.

I have not had the privilege of meeting Zach; however, thanks to this online world, which we are experiencing through this legislation as the wild west, I was able to see the most beautiful tribute interview he did with his mum. He said that if the change were to be made and the offence were to be recognised, “we win.” He is so right that we all win.

Chris Philp

My hon. Friend makes an extremely powerful point that is incapable of being improved upon.

Chris Philp

Or perhaps it is.

Kim Leadbeater

It is wonderful that we have such consensus on this issue. I am grateful to colleagues for that. I am very concerned about the pressures on parliamentary time, and the fact that we are kicking this issue down the road again. We could take action today to get the process moving. That is what Zach and his family want and what other people who have been subjected to this hideous bullying want. Without a firm timeframe for another way of getting this done, I am struggling to understand why we cannot do this today.

Chris Philp

The progress that the campaign has made, with the clear commitment from the Government that we are going to legislate for a specific epilepsy trolling offence, is a huge step forward. I entirely understand the hon. Lady’s impatience. I have tried to be as forthcoming as I can be about likely times, in answer to the question from the hon. Member for Aberdeen North, within the constraints of what is currently collectively agreed, beyond which I cannot step.

Amendment 112 will sort out the epilepsy issue, but unfortunately it will create duplicative criminal law. We cannot let our understandable sense of urgency end up creating a slightly dysfunctional criminal statute book. There is a path that is as clear as it reasonably can be. Members of the Committee will probably have inferred the plan from what I said earlier. This is a huge step forward. I suggest that we bank the win and get on with implementing it.

Dean Russell

I appreciate that there will be differences of opinion, but I feel that Zach should be smiling today whatever the outcome—if there is a vote, or if this is a probing amendment. When I have chatted about this previously over many months, it has been a real challenge. The Minister quite rightly said that the Bill already covered epilepsy. I felt that to be true. This is a firming up of the agreement we had. This is the first time I have heard this officially in any form. My message to Zach and the Epilepsy Society, who may well be watching the Committee, is that I hope they will see this as a win. With my head and my heart together, I feel that it is a win, but I forewarn the Minister that I will continue to be like a dog with a bone and make sure that those promises are delivered upon.

Chris Philp

I think that is probably a good place to leave my comments. I can offer public testimony of my hon. Friend’s tenacity in pursuing this issue.

I ask the hon. Member for Batley and Spen to withdraw the amendment. I have given the reasons why: because it would create duplicative criminal law. I have been clear about the path forward, so I hope that on that basis we can work together to get this legislated for as a new offence, which is what she, her constituent and my hon. Friends the Members for Watford and for Eastbourne and others have been calling for.

Kim Leadbeater

I appreciate the Minister’s comments and the support from across the House. I would like to push the amendment to a vote.

Question put, That the amendment be made.

Division 47

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 10


Conservative: 10

The Chair

Amendment 113 was tabled by Paul Maynard, who is not on the Committee. Does any Member wish to move the amendment?

Amendment proposed: 113, in clause 150, page 127, line 28, at end insert “; or

(b) physical harm resulting from an epileptic seizure, where the seizure has been triggered by the intentional sending of flashing images to a person with epilepsy.”—(Kim Leadbeater.)

15:30
Question put, That the amendment be made.

Division 48

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 10


Conservative: 10

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss the following:

Clauses 151 to 155 stand part.

Clause 157 stand part.

Barbara Keeley

Part 10 of the Bill sets out three new offences involving harmful, false or threatening communications. Clause 156 includes a new offence on cyber-flashing, to which my hon. Friend the Member for Pontypridd will speak shortly.

For many years, charities have been calling for an update to the offences included in the Malicious Communications Act 1988 and the Communications Act 2003. Back in 2018, the Law Commission pointed out that using the criminal law to deal with harmful online conduct was hindered by several factors, including limited law enforcement capacity to pursue the scale of abusive communications, what the commission called a “persistent cultural tolerance” of online abuse, and difficulties in striking a balance between protecting people from harm and maintaining rights of freedom of expression—a debate that we keep coming to in Committee and one that is still raging today. Reform of the legislation governing harmful online communications is welcome—that is the first thing to say—but the points laid out by the Law Commission in 2018 still require attention if the new offences are to result in the reduction of harm.

My hon. Friend the Member for Batley and Spen spoke about the limited definition of harm, which relates to psychological harm but does not protect against all harms resulting from messages received online, including those that are physical. We also heard from the hon. Member for Ochil and South Perthshire about the importance of including an offence of encouraging or assisting self-harm, which we debated last week with schedule 7. I hope that the Minister will continue to consider the merits of new clause 36 when the time comes to vote on it.

Those are important improvements about what should constitute an offence, but we share the concerns of the sector about the extent to which the new offences will result in prosecution. The threshold for committing one of the offences in clause 150 is high. When someone sends the message, there must be

“a real and substantial risk that it would cause harm to a likely audience”,

and they must have

“no reasonable excuse for sending the message.”

The first problem is that the threshold of having to prove the intention to cause distress is an evidential threshold. Finding evidence to prove intent is notoriously difficult. Professor Clare McGlynn’s oral evidence to the Committee was clear:

“We know from the offence of non-consensual sending of sexual images that it is that threshold that limits prosecutions, but we are repeating that mistake here with this offence.”

Professor McGlynn highlighted the story of Gaia Pope. With your permission, Ms Rees, I will make brief reference to it, in citing the evidence given to the Committee. In the past few weeks, it has emerged that shortly before Gaia Pope went missing, she was sent indecent images through Facebook, which triggered post-traumatic stress disorder from a previous rape. Professor McGlynn said:

“We do not know why that man sent her those images, and I guess my question would be: does it actually matter why he sent them? Unfortunately, the Bill says that why he sent them does matter, despite the harm it caused, because it would only be a criminal offence if it could be proved that he sent them with the intention of causing distress or for sexual gratification and being reckless about causing distress.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 58, Q105.]

The communications offences should be grounded upon consent rather than the motivation of the perpetrator. That is a clear omission in the Bill, which my hon. Friend the Member for Pontypridd will speak more about in relation to our amendments 41 and 42 to clause 156. The Government must act or risk missing a critical opportunity to tackle the harms resulting from communications offences.

We then come to the problem of the “reasonable excuse” defence and the “public interest” defence. Clause 150(5) sets out that the court must consider

“whether the message is, or is intended to be, a contribution to a matter of public interest”.

The wording in the clause states that this should not “determine the point”. If that is the case, why does the provision exist? Does the Minister recognise that there is a risk of the provision being abused? In a response to a question from the hon. Member for Aberdeen North, the Minister has previously said that:

“Clause 150…does not give a get-out-of-jail-free card”.––[Official Report, Online Safety Public Bill Committee, 7 June 2022; c. 275.]

Could he lay out what the purpose of this “matter of public interest” defence is? Combined with the reasonable excuse defence in subsection (1), the provisions risk sending the wrong message when it comes to balancing harms, particularly those experienced by women, of which we have already heard some awful examples.

There is a difference in the threshold of harm between clause 150, on harmful communications offences, and clause 151, on false communications offences. To constitute a false communications offence, the message sender must have

“intended the message, or the information in it, to cause non-trivial psychological or physical harm to a likely audience”.

To constitute a harmful communications offence, the message sender must have

“intended to cause harm to a likely audience”

and there must have been

“a real and substantial risk that it would cause harm to a likely audience”.

Will the Minister set out the Government’s reasoning for that distinction? We need to get these clauses right because people have been let down by inadequate legislation and enforcement on harmful online communications offences for far too long.

Chris Philp

Let me start by saying that many of these clauses have been developed in careful consultation with the Law Commission, which has taken a great deal of time to research and develop policy in this area. It is obviously quite a delicate area, and it is important to make sure that we get it right.

The Law Commission is the expert in this kind of thing, and it is right that the Government commissioned it, some years ago, to work on these provisions, and it is right that, by and large, we follow its expert advice in framing these offences, unless there is a very good reason not to. That is what we have done—we have followed the Law Commission’s advice, as we would be expected to do. The clauses replace previous offences—for example, those in the Malicious Communications Act 1988—and update and improve those provisions in the form we see them in the Bill.

The shadow Minister, the hon. Member for Worsley and Eccles South, asked a number of questions about the drafting of the clauses and the thresholds that have to be met for an offence to be committed. We are trying to strike a balance between criminalising communications that deserve to be criminalised and not criminalising communications that people would consider should fall below the criminal threshold. There is obviously a balance to strike in doing that. We do not want to infringe free speech by going too far and having legitimate criticism and debate being subject to criminal sanctions. There is a balance to strike here between, on the one hand, public protection and where the criminal law sits versus, on the other hand, free speech and people expressing themselves. That is why clause 150 is constructed as it is, on the advice of the Law Commission.

As the hon. Member set out, the offence is committed only where there is a “real and substantial risk” that the likely audience would suffer harm. Harm is defined as

“psychological harm amounting to at least serious distress.”

Serious distress is quite a high threshold—it is a significant thing, not something trivial. It is important to make that clear.

The second limb is that there is an intention to cause harm. Intention can in some circumstances be difficult to prove, but there are also acts that are so obviously malicious that there can be no conceivable motivation or intention other than to cause harm, where the communication is so obviously malfeasant. In those cases, establishing intent is not too difficult.

In a number of specific areas, such as intimate image abuse, my right hon. Friend the Member for Basingstoke and others have powerfully suggested that establishing intent is an unreasonably high threshold, and that the bar should be set simply at consent. For the intimate image abuse offence, the bar is set at the consent level, not at intent. That is being worked through by the Law Commission and the Ministry of Justice, and I hope that it will be brought forward as soon as possible, in the same way as the epilepsy trolling offence that we discussed a short while ago. That work on intimate image abuse is under way, and consent, not intent, is the test.

For the generality of communications—the clause covers any communications; it is incredibly broad in scope—it is reasonable to have the intent test to avoid criminalising what people would consider to be an exercise of free speech. That is a balance that we have tried to strike. The intention behind the appalling communications that we have heard in evidence and elsewhere is clear: it is inconceivable that there was any other motivation or intention than to cause harm.

There are some defences—well, not defences, but conditions to be met—in clause 150(1)(c). The person must have “no reasonable excuse”. Subsection (5) makes it clear that

“In deciding whether a person has a reasonable excuse…one of the factors that a court must consider (if it is relevant in a particular case) is whether the message is, or is intended to be, a contribution to a matter of public interest (but that does not determine the point)”

of whether there is a reasonable excuse—it simply has to be taken into account by the court and balanced against the other considerations. That qualification has been put in for reasons of free speech.

There is a delicate balance to strike between criminalising what should be criminal and, at the same time, allowing reasonable free speech. There is a line to draw, and that is not easy, but I hope that, through my comments and the drafting of the clause, the Committee will see that that line has been drawn and a balance struck in a carefully calibrated way. I acknowledge that the matter is not straightforward, but we have addressed it with advice from the Law Commission, which is expert in this area. I commend clause 150 to the Committee.

The other clauses in this group are a little less contentious. Clause 151 sets out a new false communication offence, and I think it is pretty self-explanatory as drafted. The threatening communications offence in clause 152 is also fairly self-explanatory—the terms are pretty clear. Clause 153 contains interpretative provisions. Clause 154 sets out the extra-territorial application, and clause 155 sets out the liability of corporate officers. Clause 157 repeals some of the old offences that the new provisions replace.

Those clauses—apart from clause 150—are all relatively straightforward. I hope that, in following the Law Commission’s advice, we have struck a carefully calibrated balance in the right place.

Barbara Keeley

I would like to take the Minister back to the question I asked about the public interest defence. There is a great deal of concern that a lot of the overlaying elements create loopholes. He did not answer specifically the question of the public interest defence, which, combined with the reasonable excuse defence, sends the wrong message.

Chris Philp

The two work together. On the reasonable excuse condition, for the offence to have been committed, it has to be established that there was no reasonable excuse. The matter of public interest condition—I think the hon. Lady is referring to subsection (5)—simply illustrates one of the ways in which a reasonable excuse can be established, but, as I said in my remarks, it is not determinative. It does not mean that someone can say, “There is public interest in what I am saying,” and they automatically have a reasonable excuse—it does not work automatically like that. That is why in brackets at the end of subsection (5) it says

“but that does not determine the point”.

That means that if a public interest argument was mounted, a magistrate or a jury, in deciding whether the condition in subsection (1)(c)—the “no reasonable excuse” condition—had been met, would balance the public interest argument, but it would not be determinative. A balancing exercise would be performed. I hope that provides some clarity about the way that will operate in practice.

15:45
Barbara Keeley

That was about as clear as mud, actually, but let us leave it there.

Question put and agreed to.

Clause 150 accordingly ordered to stand part of the Bill.

Clauses 151 to 155 ordered to stand part of the Bill.

Clause 156

Sending etc photograph or film of genitals

Alex Davies-Jones

I beg to move amendment 41, in clause 156, page 131, line 15, at end insert—

“(za) B has not consented for A to share the photograph or film with B, or”.

This amendment makes it an offence to send an image of genitals to another person if the recipient has not given consent to receive the image.

The Chair

With this it will be convenient to discuss amendment 42, in clause 156, page 131, line 20, at end insert—

“(1A) A person consents if the person agrees by choice, and has the freedom and capacity to make that choice.”

This amendment is linked to Amendment 41.

Alex Davies-Jones

With your permission, Ms Rees, I will also speak to clause stand part.

Labour welcomes the clause. We see it as a positive step forward that the Government have committed to creating a new offence in certain circumstances where sending a photograph or film of a person’s genitals to another person will cause distress or humiliation. However, the Government have missed a huge opportunity to accurately capture the problems caused by sharing intimate images online. I will come to that shortly in addressing amendments 41 and 42.

We know that the act of sending unsolicited genital images—cyber-flashing, or sending dick pics—is a huge problem here in the UK. Research from Bumble has shown how disproportionally the issue affects young women. The statistics are shocking and speak for themselves. A whopping 48% of millennial women said that they had been sent an unsolicited sexual image in the last year alone. I must pay tribute to the right hon. Member for Basingstoke, who we all know shared her own experiences of cyber-flashing relatively recently. She is not alone—not in this House or in the country.

I have my own experiences, as do friends, colleagues and even my staff members, and we all share the same concerns about the prevalence of cyber-flashing. The Minister does not need to be reminded of it; he knows of the extent of the issues. We heard compelling evidence only a few weeks ago from Professor Clare McGlynn and Nima Elmi from Bumble, among others.

Labour firmly believes, as Professor McGlynn has outlined, that cyber-flashing is problematic because it is non-consensual conduct of a sexual nature. Distributing these images is not in and of itself wrong, but doing so without the consent of the recipient is. The non-consensual act breaches women’s rights to sexual autonomy, to be treated with dignity and to be free from sexual violence, regardless of the motive of the perpetrator.

We know that men’s motivations for cyber-flashing are varied and overlapping. They include misogyny, causing distress, sexual gratification, humour, boosting status among peers, sexual intimidation, and transactional motivations. Yet there is no evidence that the harms experienced by women are worse when offenders have the specific motivations identified in motive-based proposals, such as causing distress.

For example, a woman may be sent unsolicited penis images while on public transport, making her feel threatened and fearful for her safety, regardless of whether the sender intended to cause her alarm or was simply trying to impress his friends as a bit of banter. That is why the consent approach really is crucial, as I will now discuss in relation to amendments 41 and 42.

Amendment 41 would make it an offence to send an image of genitals to another person if the recipient has not given consent to receive that image. Labour recognises that there are two main options when drafting a new cyber-flashing criminal offence. The first is what we are trying to achieve with these amendments—a comprehensive consent-based offence requiring proof of non-consent. The alternative, as currently proposed by the Law Commission, is far too limited. It offers a motive-based offence, which applies only on proof of specific motives on the part of the offender, such as to cause distress, alarm or humiliation, to get sexual gratification, or to cause distress by being reckless. This is hugely problematic for women and girls across the country, and the Minister must recognise the message this sends to them.

Proving a motive behind an offence as simple as merely sending a photograph is nigh on impossible. If we really want to see systemic change in attitudes to women and girls, we fundamentally should not be creating laws that place the burden on the victim. A consent-based offence, as in our amendments, covers all forms of cyber-flashing, regardless of the motives of the sender. Motive requirements create an unjustified hierarchy of abuses and victims, and they do not reflect victims’ experiences. Requiring proof of specific motives will make investigations and prosecutions more difficult.

We know from police and victims that investigations and prosecutions for sharing sexual images without consent, such as revenge porn, are not taken forward due to similar motive requirements. How, therefore, can the Minister think that the provisions in the Bill related to cyber-flashing go far enough? Will they actually create change? I mentioned on Second Reading our genuine concerns about the levels of misogyny that have become far too normalised across our communities and within our society as a whole.

The consent-based offence provides a much better foundation for education and prevention projects. It sends the message that all sexual activity should be grounded in consent. It better supports education about online activities, with a focus on consent-based practices, and makes clear that any taking or sharing of sexual images without consent is wrong, harmful and criminal. Those are all positives.

The stakeholders are calling for a consent-based approach. The Opposition want the same. Even the Minister’s own Back Benchers can see that the Bill fails to capture and address the real harms women and girls face online. The Minister can likely sense my exasperation. It comes from a place of genuine frustration. I cannot understand how there has not been any movement on this from the Government side.

My final point—and indeed plea—is to urge the Minister to consider what is going on internationally on this issue. He will know that a consent-based cyber-flashing offence has been adopted in Texas and is being debated in other US states. Consent is easily obtained and criminal charges easily avoided. It is important to remember that avoiding being charged with a criminal offence is straightforward. All the sender needs to do is ask, “Would you like to see a picture of my genitals?” It is as simple as that. I am sure even the Minister can agree on that point. I urge him to genuinely consider amendments 41 and 42. There has been no movement from the Minister and no concessions thus far as we have scrutinised the Bill, but he must know that the Bill is far from perfect in its current form.

Kirsty Blackman

I would like to make a couple of comments. The shadow Minister mentioned education and prevention projects, which are key. In Scotland, our kids’ sex, health and relationship education in schools teaches consent from the earliest possible age. That is vital. We have a generation of men who think it is okay to send these images and not seek consent. As the shadow Minister said, the problem is everywhere. So many women have received images that they had no desire to see. They did not ask for them, and they did not consent to receive them, but they get them.

Requiring someone to prove the intent behind the offence is just impossible. It is so unworkable, and that makes it really difficult. This is yet another issue that makes it clear that we need to have reference to violence against women and girls on the face of the Bill. If that were included, we would not be making such a passionate case here. We would already have a code of conduct and assessments that have to take place on the basis of the specific harm to women and girls from such offences. We would not be making the case so forcefully because it would already be covered.

I wish the Minister would take on board how difficult it is for women and girls online, how much of an issue this specific action causes and how much pain and suffering it causes. It would be great if the Minister could consider moving somewhat on this issue in order to protect women and girls.

Dame Maria Miller (Basingstoke) (Con)

I want to make sure that the record is clear that while I did receive a dick pic, I am not a millennial. That shows how widespread this problem is. My children would want that on the record.

Research done by YouGov showed that half of millennial women have been sent a photo of a penis, and that nine in 10 women who have ever received such a picture did not want to have it sent to them. To anybody who is trying to—I do not feel anybody today is—advocate that this is a small issue or a minority problem, the data suggest that it is not.

For the record, I think the reason I was sent that picture was not sexual at all. I think it was intimidatory. I was sitting in a train carriage on my way into Parliament on a hot day, and I think it was sent as intimidation because I could not leave that carriage and I had, in error, left my AirDrop on. Okay, that was my fault, but let us not victim blame.

I very much welcome the Minister’s approach, because he is the first person to take forward a series of new offences that are needed to clarify the law as it affects people in this area. As he was talking, I was reflecting on his use of the word “clarity”, and I think he is absolutely right. He is rightly looking to the Law Commission as the expert for how we interpret and how we get the most effective law in place.

Although we are not talking about the intimate image abuse recommendations in this part of the Bill, I draw to the Committee’s attention that I, and others, will have received an email from the Law Commission today setting out that it will bring forward its recommendations next month. I hope that that means that the Minister will bring forward something concrete to us about those particular offences in the coming weeks. He is right that when it comes to cyber-flashing, we need to get it right. We need to make sure that we follow the experts. The Law Commission was clear when it undertook its review that the current law does not adequately address these issues. I was pleased when it made that recommendation.

A great many people have looked at these issues, and I pay tribute to each and every one of them, though they come to slightly different conclusions about how we interpret the Law Commission’s recommendations and how we move forward. Professor Clare McGlynn is an expert. Bumble has done work on this; my hon. Friend the Member for Brecon and Radnorshire (Fay Jones) has done a great deal of work too, and I recognise her contribution.

The offence is particularly pernicious because it is as prevalent as indecent exposure. It is right that the offence is recognised in the Sexual Offences Act 2003 as a result. As the hon. Member for Pontypridd said, it is another form of gendered crime online. On the evidence of harm that it causes, she referenced the evidence that we got from Professor McGlynn about Gaia Pope. That was particularly concerning. I do not think any of us in the Committee would argue that this is not the most serious of offences, and I commend the Minister for bringing forward a serious set of recommendations to tackle it.

16:08
The issue is quite specific: how we make sure we have the most effective law in place. Clause 156 amends the Sexual Offences Act 2003 and outlines that the offence is committed if somebody sends a photo or a film to another person with the intention of causing alarm, distress or humiliation, or

“for the purpose of obtaining sexual gratification”

and they are reckless as to whether they cause alarm, distress or humiliation. I welcome that, and I understand the Law Commission’s recommendation focusing on the perpetrator’s motives, not the victim’s consent. I have great sympathy for the argument made by the hon. Member for Pontypridd, but I understand why the Law Commission, as the expert in law, has made that decision. I wonder whether there is a way forward that the Minister might want to contemplate.

I listened to the hon. Lady’s argument—I have made a similar argument in the past—and I will repeat my question: what if the sender of an obscene picture sent it for all sorts of reasons? Maybe it was a joke. Indeed, to go back to my own personal experience, I do not think that the person in the carriage had any sexual motivations at all; he was being intimidatory.

Perhaps the Minister could look at line 19 on page 131, which addresses reckless behaviour. The idea of somebody acting with recklessness is important. At the moment, in proposed new section 66A(1)(b) of the 2003 Act, there is a tie between obtaining sexual gratification and being reckless. The Minister could find a way forward if he simply changed “and” to “or”. I do not think that my personal experience would be caught by this law at all but, as a non-millennial 58-year-old woman, I think it should be. The Minister needs to reflect on that a little.

A way forward that might adhere to what the legal experts at the Law Commission propose—carefully drawing the law so that it does not unintentionally catch people—would be to broaden the provisions slightly by putting in “or” rather than “and”, so that those who act recklessly, such as the individual who sent an image to me, are caught within the law. That would avoid shifting the debate to the issue of consent. The Minister and I have both had long meetings with the Law Commission to understand why it has taken the approach that it has.

I put that suggestion on the table for the Minister to consider between now and Report so that he can find a way forward. Cyber-flashing is at least as harmful as indecent exposure; in fact, I would argue that it is more harmful, because people can experience cyber-flashing in the privacy of their own homes, whereas it is incredibly difficult to experience indecent exposure in that way. I hope that the Minister will look at that.

This seismic change will particularly affect young people: millennials and those who are younger, whatever they are called—generation Z. As parliamentarians, we are interested not just in the law, but in how we make sure it bites. It would be helpful if the Minister explained how we can make the Bill as preventive as possible, so that we do not simply punish young people but actually start to train them to understand that they will be committing a serious offence if they send indecent images of male genitalia to others—predominantly women—without their consent, as they are clearly doing on a large scale. Will the Minister indicate whether he will have conversations with those of his colleagues who are responsible for relationship and sex education, to ensure that young people are aware of this new sex offence and that they do not inadvertently fall foul of the law?
Chris Philp

I thank the Members who have contributed to the debate. Rather like with the provisions in clause 150, which we discussed a few minutes ago, a difficult and delicate balance needs to be struck. We want to criminalise that which should be criminal, but not inadvertently criminalise that which should not be. The legal experts at the Law Commission have been studying the matter and consulting other legal experts for quite some time. As my right hon. Friend the Member for Basingstoke said in her excellent speech, their recommendations have been our starting point.

It is probably worth making one or two points about how the clause works. There are two elements of intention, set out in subsection (1). First, the act of sending has to be intentional; it cannot be done accidentally. I think that is reasonable. Secondly, as set out in subsection (1)(a), there must be an intention to cause the person who sees the image alarm, distress or intimidation.

I understand the point that establishing intent could, in some circumstances, present a higher hurdle. As we discussed in relation to clause 150, we are, separately from this, working on the intimate image abuse offence, which does not require intention to be established; it simply requires lack of consent. I was not aware, until my right hon. Friend mentioned it a few moments ago—she was ahead of me there—that the Law Commission has given a timeframe for coming back. I am not sure whether that implies it will be concomitant with Ministry of Justice agreement or whether that will have to follow, but I am very pleased to hear that there is a timeframe. Clearly, it is an adjacent area to this and it will represent substantial progress.

I understand that it can sometimes be hard to establish intention, but there will be circumstances in which the context of such an incident will often make it clear that there was an intention to cause alarm, distress or humiliation.

Alex Davies-Jones

Has the Minister ever received a dick pic?

Chris Philp

Is that a rhetorical question?

Alex Davies-Jones

No, it is a genuine question.

Alex Davies-Jones

So he cannot possibly know how it feels to receive one. I appreciate the comments that he is trying to make, and that this is a fine balance, but I do see this specific issue of sending a photograph or film of genitals as black and white: they are sent either with or without consent. It is as simple as that. What other circumstances could there be? Can he give me an example of when one could be sent without the intention to cause distress, harm or intimidation?

Chris Philp

It is a fair question. There might be circumstances in which somebody simply misjudges a situation—has not interpreted it correctly—and ends up committing a criminal offence; stumbling into it almost by accident. Most criminal offences require some kind of mens rea—some kind of intention to commit a criminal offence. If a person does something by accident, without intention, that does not normally constitute a criminal offence. Most criminal offences on the statute book require the person committing the offence to intend to do something bad. If we replace the word “intent” with “without consent”, the risk is that someone who does something essentially by accident will have committed a criminal offence.

I understand that the circumstances in which that might happen are probably quite limited, and the context of the incidents that the hon. Member for Pontypridd and my right hon. Friend the Member for Basingstoke have described would generally support the fact that there is a bad intention, but we have to be a little careful not accidentally to draw the line too widely. If a couple are exchanging images, do they have to consent prior to the exchange of every single image? We have to think carefully about such circumstances before amending the clause.

Dame Maria Miller

I have to say, just as an aside, that the Minister has huge levels of empathy, so I am sure that he can put himself into the shoes of someone who receives such an image. I am not a lawyer, but I know that there is a concept in law of acting recklessly, so if someone acts recklessly, as my hon. Friend has set out in his Bill, they can be committing a criminal offence. That is why I thought he might want to consider not having the conditional link between the two elements of subsection (1)(b), but instead having them as an either/or. If he goes back to the Law Commission’s actual recommendations, rather than the interpretation he was given by the MOJ, he will see that they set out that one of the conditions should be that defendants who are posting in this way are likely to cause harm. If somebody is acting in a way that is likely to cause harm, they would be transgressing. The Bill acknowledges that somebody can act recklessly. It is a well-known concept in law that people can be committing an offence if they act recklessly—reckless driving, for example. I wonder whether the Minister might think about that, knowing how difficult it would be to undertake what the hon. Member for Pontypridd is talking about, as it directly contravenes the Law Commission’s recommendations. I do not think what I am suggesting would contravene the Law Commission’s recommendations.

Chris Philp

I will commit to consider the clause further, as my right hon. Friend has requested. It is important to do so in the context of the Law Commission’s recommendations, but she has pointed to wording in the Law Commission’s original report that could be used to improve the drafting here. I do not want to make a firm commitment to change, but I will commit to considering whether the clause can be improved upon. My right hon. Friend referred to the “likely to cause harm” test, and asked whether recklessness as to whether someone suffers alarm, distress or humiliation could be looked at as a separate element. We need to be careful; if we sever that from sexual gratification, we need to have some other qualification on sexual gratification. We might have sexual gratification with consent, which would be fine. If we severed them, we would have to add another qualification.

It is clear that there is scope for further examination of clause 156. That does not necessarily mean it will be possible to change it, but it is worth examining it further in the light of the comments made by my right hon. Friend. The testimony we heard from witnesses, the testimony of my right hon. Friend and what we heard from the hon. Member for Pontypridd earlier do demonstrate that this is a widespread problem that is hugely distressing and intrusive and that it represents a severe violation. It does need to be dealt with properly.

We need to be cognisant of the fact that in some communities there is a culture of these kinds of pictures being freely exchanged between people who have not met or communicated before—on some dating websites, for example. We need to draft the clause in such a way that it does not inadvertently criminalise those communities—I have been approached by members of those communities who are concerned.

Alex Davies-Jones

They have consent to do that.

Chris Philp

The hon. Member for Pontypridd says from a sedentary position that they have given consent. The consent is not built into the website’s terms and conditions; it is an assumed social norm for people on those websites. We need to tread carefully and be thoughtful, to ensure that by doing more to protect one group we do not inadvertently criminalise another.

There is a case for looking at the issue again. My right hon. Friend has made the point thoughtfully and powerfully, and in a way that suggests we can stay within the confines of the Law Commission’s advice, while being more thoughtful. I will certainly undertake to go away and do that, in consultation with my right hon. Friend and others.

Nick Fletcher (Don Valley) (Con)

I am pleased the Minister will go away and look at this. I am sure there are laws already in place that cover these things, but I know that this issue is very specific. An awful lot of the time, we put laws in place, but we could help an awful lot of people through education, although the last thing we want to do is victim blame. The Government could work with companies that provide devices and have those issued with AirDrop in contacts-only mode, as opposed to being open to everybody. That would stop an awful lot of people getting messages that they should not be receiving in the first place.

Chris Philp

My hon. Friend makes a very powerful and important point. Hopefully, people listening to our proceedings will hear that, as well as those working on media literacy—principally, Ofcom and the Government, through their media literacy strategy. We have had a couple of specific tips that have come out of today’s debate. My right hon. Friend the Member for Basingstoke and my hon. Friend the Member for Don Valley mentioned disabling a device’s AirDrop, or making it contacts-only. A point was also made about inadvertently sharing geolocations, whether through Snapchat or Strava. Those are two different but important points that the general public should be more aware of than they are.

16:15
For the time being, I will resist amendments 41 and 42, but in so doing I commit myself to look further at these measures. It is worth saying—this was mentioned a short time ago—that there is nothing in law dealing with this issue, so we have been debating points of detail from around the world. Those are important points of detail, and I am in no way minimising or dismissing them, but we should recognise that, today, Parliament is introducing this offence, which does not exist at the moment. We are taking a gigantic stride forward. While it is important to ensure that we get the details right, let us not forget that a gigantic stride forward is being taken here.
Alex Davies-Jones

I wholeheartedly agree with the Minister’s comments. This is a gigantic step forward that is long overdue, and we wholeheartedly welcome the new offence being created, but, as he rightly pointed out, it is important that we get this right and that we make the measure as strong as possible so that the legislation causes direct and meaningful change.

To us, the issue is simple: “Do you want to see my genitals, yes or no?” We will push amendment 41 to the vote.

Question put, That the amendment be made.

Division 49

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 9


Conservative: 9

Amendments made: 3, in clause 156, page 131, line 37, leave out “12 months” and insert
“the general limit in a magistrates’ court”.
Amendment 5, in clause 156, page 131, leave out lines 40 to 42.—(Chris Philp.)
This amendment is consequential on Amendment 3.
Clause 156, as amended, ordered to stand part of the Bill.
Clause 157 ordered to stand part of the Bill.
Clause 158
Consequential amendments
Question proposed, That the clause stand part of the Bill.
The Chair

With this it will be convenient to discuss schedule 13.

Barbara Keeley

We have argued that changes to the legislation are long overdue to protect people from the harms caused by online communications offences. The clause and schedule 13 include necessary amendments to the legislation, so we do not oppose them standing part of the Bill.

Chris Philp

The clause cross-references schedule 13 and sets out amendments to existing legislation consequential on the communications offences in part 10. Schedule 13 has a number of consequential amendments, divided broadly into two parts. It makes various changes to the Sexual Offences Act 2003, amends the Regulatory Enforcement and Sanctions Act 2008 in relation to the Malicious Communications Act 1988, and makes various other changes, all of which are consequential on the clauses we have just debated. I therefore commend clause 158 and its associated schedule 13 to the Committee.

Question put and agreed to.

Clause 158 accordingly ordered to stand part of the Bill.

Schedule 13 agreed to.

Clause 159

Providers that are not legal persons

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to consider:

Government amendment 159.

Clauses 160 and 161 stand part.

That schedule 14 be the Fourteenth schedule to the Bill.

Alex Davies-Jones

Labour supports clause 159, because it is vital that the Bill includes provisions for Ofcom to issue a penalty notice or confirmation decision when the provider may not be a legal person in the traditional sense. We have repeatedly maintained that it is central to the success of the Bill that, once implemented, it properly and sufficiently gives Ofcom the relevant powers, autonomy and independence to properly pursue providers of regulated services and their wrongdoings.

We recognise the complexity of the service providers’ business models and therefore agree that the Bill must be broad enough to ensure that penalty notices and confirmation decisions can be given, even when the provider may constitute an association, or an organisation between a group of people. Ultimately, as we have made clear, Labour will continue to support giving the regulator the tools required to keep us all safe online.

We have already raised concerns over Ofcom’s independence and the interference of and over-reliance on the Secretary of State’s powers within the Bill as it stands. However, we are in agreement on clause 159 and feel that it provides a vital tool for Ofcom to have at its disposal should the need for a penalty notice or confirmation decision arise. That is why we support the clause and have not sought to amend it.

Government amendment 159, as we know, ensures that if the provider of a service consists of two or more individuals, those individuals are jointly liable to pay a fee demanded under new schedule 2. As I will come on to in my comments on clauses 160 and 161, we welcome the provisions and clarifications around liability for fees when the provider of a service consists of two or more individuals.

As with clause 159, we welcome the clarity of provisions in the Bill that confirm actions to be taken where a group of two or more individuals act together. It is absolutely right that where two or more individuals together are the providers of a regulated service, they should be jointly and severally liable for any duty, requirement or liability to pay a fee.

We also welcome the clarification that that liability and joint responsibility will also apply in the event of a penalty notice or confirmation decision. We believe that these provisions are vital to capturing the true extent of where responsibility should lie, and we hope they will go some way to remedying the hands-off approach that service providers have managed to get away with for too long when it comes to regulation of the internet. We do, however, feel that the Government could have gone further, as we outlined in amendment 50, which we spoke to when we addressed clause 123.

Labour firmly believes that Ofcom’s ability to take action against non-compliance en masse is critical. That is why we welcome clause 160 and will not be seeking to amend it at this stage. We also fundamentally support clause 161, which contains provisions on how joint liability will operate.

We will speak to our concerns about supply chains when we debate a later clause—I believe it is new clause 13—because it is vital that this Bill captures the challenges around supply chain failures and where responsibility lies. With that in mind, we will support clause 161, with a view to the Minister understanding our broader concerns, which we will address when we debate new clause 13.

Finally, schedule 14 establishes that decisions or notices can be given jointly to both a regulated provider and its parent company. We particularly support the confirmation that all relevant entities must be given the opportunity to make representations when Ofcom seeks to establish joint liability, including on the matters contained in the decision or notice and whether joint liability would be appropriate.

As we have made clear, we see the provisions outlined in this schedule as fundamental to Ofcom’s ability to issue truly meaningful decisions, penalties and notices to multiple parties. The fact that, in this instance, service providers will be jointly liable to comply is key to capturing the extent to which it has been possible to perpetuate harm online for so long. That is why we support the intention behind schedule 14 and have not sought to amend it.

Chris Philp

The shadow Minister has set out clearly the purpose of and intent behind these clauses, and how they work, so I do not think I will add anything. I look forward to our future debate on the new clause.

There is one point of correction that I wish to make, and it relates to a question that the hon. Member for Aberdeen North asked this morning and that is germane to amendment 159. That amendment touches on the arrangements for recouping the set-up costs that Ofcom incurs prior to the Bill receiving Royal Assent. The hon. Member for Aberdeen North asked me over what time period those costs would be collected, and I answered slightly off the cuff. Now I have had a chance to dig through the papers, I will take this opportunity to confirm exactly how that works.

To answer the question a little bit better than I did this morning, the place to go is today’s amendment paper. The relevant provisions are on page 43 of the amendment paper, in paragraph 7(5) of Government new schedule 2, which we will debate later. If we follow the drafting through—this is quite a convoluted trail to follow—it states that the cost can be recouped over a period that is not less than three years and not more than five years. I hope that gives the hon. Member for Aberdeen North a proper answer to her question from this morning, and I hope it provides clarity and points to where in the new schedule the information can be found. I wanted to take the first opportunity to clarify that point.

Beyond that, the hon. Member for Pontypridd has summarised the provisions in this group very well, and I have nothing to add to her comments.

Question put and agreed to.

Clause 159 accordingly ordered to stand part of the Bill.

Clause 160

Individuals providing regulated services: liability

Amendment made: 159, in clause 160, page 133, line 6, after “71” insert

“or Schedule (Recovery of OFCOM’s initial costs)”.—(Chris Philp.)

This amendment ensures that, if the provider of a service consists of two or more individuals, those individuals are jointly liable to pay a fee demanded under new schedule 2.

Clause 160, as amended, ordered to stand part of the Bill.

Clause 161 ordered to stand part of the Bill.

Schedule 14 agreed to.

Clause 162

Information offences: supplementary

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss clauses 163 to 165 stand part.

Alex Davies-Jones

Labour supports the intention behind clause 162, because we believe that only by creating specific offences will the messaging around liability and the overall message about public safety really hit home for those at the top in Silicon Valley. We welcome the clarification on exactly how Ofcom will be able to exercise these important powers, and we support the process of giving notice, confirmation decisions and subsequent penalties. We see the clause as fundamental to the Bill’s overall success, although, as the Minister will recall, we feel that the Bill could go further in addressing broader offences beyond those around information practices. However, that is a debate for another day.

In this clause, we believe that the importance and, indeed, the power of information notices is crystal clear for service providers to see, and Labour fully supports and welcomes that move. That is why we will support clause 162 and have not sought to amend it at this stage. We welcome the clarity in clause 163 around the process that applies when a person relies on a defence in an information offence. We see this clause as sitting alongside current legal precedents and are therefore happy to support it.

We fully support and welcome clause 164. We believe it is central to the entire argument around liability that the Minister knows Labour has been making for some time now. We have heard in Committee evidence sessions some truly compelling insights from people such as Frances Haugen, and we know for certain that companies are prone to covering up information that they know will be received unfavourably.

16:30
I point the Minister to one such worrying instance that occurred only last week in which Blake Lemoine, a senior software engineer in Google’s responsible AI unit, was placed on paid leave after claiming that the tech group’s chatbot had become sentient. From Lemoine’s testimonies, the conversations with the Language Model for Dialogue Applications, known as LaMDA, seem incredibly real. The chatbot seemingly confessed to having a hunger for spiritual knowledge, as well as feelings of loneliness. Although they do not constitute an information offence per se, we can and must all recognise that if what has been said is true, those developments in the world of AI are very worrying. That is why we welcome clause 164, which will place more liability on corporate officers for information offences committed by that entity.
Although we believe the liability should go further and transparency around information captured by platforms should be broader, we welcome the impact that the clause could have, given that both an entity and corporate officer could be found guilty of an offence in certain circumstances. That sits with legal precedents elsewhere and we are happy to support it.
Finally, we welcome the clarity outlined in clause 165, which sets out how information offences apply to providers that are not legal persons according to the law under which they are formed. As Members know, subsection (2) specifies:
“Proceedings for an offence alleged to have been committed by a relevant entity must be brought against the entity in its own name”.
We agree with that approach, given the Bill’s provisions for personal liability, which we have discussed at length.
We welcome the provisions outlined in subsection (5), which provide that if the relevant entity commits an offence and the offence was committed
“with the consent or connivance of an officer”,
or can be attributed to the neglect of an officer, the officer also commits the offence. It is a welcome step indeed that the Bill captures both officer liability and, to a certain degree, group liability, in the form of partnership and unincorporated association. We are happy to support clause 165 and have not sought to make changes at this stage.
Chris Philp

Once again, the shadow Minister has described the various clauses in this group. They speak, as she said, to the important and very strong measures around information offences. It is so important that where someone fails to provide the information that Ofcom requires, not only is there a liability on the company to pay very large fines or have their service cut off, as we discussed earlier, but individuals have criminal liability as well.

Clause 162 gives further information about how information-related criminal offences operate and how criminal proceedings can be brought against a person who fails to comply with an information notice or a requirement imposed when Ofcom exercises its powers of entry and inspection. Clause 163 goes further to explain how defences to accusations of criminal offences can operate, and it is helpful to have that clearly set out.

Clause 164 allows for corporate officers of regulated providers to be found liable for offences committed by the provider under the Act. For example, corporate officers can also be found liable for information offences committed by their company. That is extremely important, because it means that senior personnel can be held liable even where they are not named by their company in an information response. That means the most senior executives will have their minds focused on making sure the information requirements are properly met.

Clause 165 provides further information about how information-related criminal offences will operate under the Bill when the regulated provider is not a legal person—when it is, for example, a partnership or an unincorporated association. I hope the clauses give the specificity and clarification required to operate the personal criminal liability, which gives the enforcement powers in the Bill such strong teeth.

Question put and agreed to.

Clause 162 accordingly ordered to stand part of the Bill.

Clauses 163 to 165 ordered to stand part of the Bill.

Clause 166

Extra-territorial application

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss clause 167 stand part.

Alex Davies-Jones

Labour welcomes clause 166, which specifies that references to regulated services and Ofcom’s information-gathering powers apply to services provided from outside the United Kingdom as well as to services provided from within the United Kingdom. While we recognise the challenges around internet regulation in the UK, we live in a global world, and we are pleased that the legislation has been drawn up in a way that will capture services based overseas.

We feel the Bill is lacking in its ability to regulate against content that may have originated from outside the UK. While it is welcome that regulated services based abroad will be within scope, we have concerns that that will do little to capture specific content that may not originate within the UK. We have raised these points at length in previous debates, so I will not dwell on them now, but the Minister knows that the Bill will continue to fall short when it does not capture, for example, child sexual exploitation and abuse content that was filmed and originated abroad. That is a huge loophole, which will allow harmful content to be present and to be perpetuated online well into the future. Although we support clause 166 for now, I urge the Minister to reconsider his view on how all-encompassing the current approach to content can be as he considers his Department’s strategy before Report.

Clause 167 outlines that the information offences in the Bill apply to acts done in the United Kingdom and outside the United Kingdom. We welcome its provisions, but we feel that the Government could go further. We welcome the clarification that it will be possible to prosecute information offences in any part of the UK as if they occurred there. Given the devastating pressures that our legal system already faces thanks to this Government’s cuts and shambolic approach to justice, such flexibility is crucial and a welcome step forward.

Chris Philp

Last week or the week before, we debated extensively the points about extraterritorial application in relation to protecting children, and I made it clear that the Bill protects people as we would wish it to.

Clause 166 relates to extraterritorial enforceability. It is important to make sure that the duties, enforceable elements and sanctions apply worldwide, reflecting the realities of the internet, and clause 166 specifies that references to regulated services in the Bill include services provided from outside the United Kingdom. That means that services based overseas must also comply, as well as those in the UK, if they reach UK users.

The clause ensures that Ofcom has effective information-gathering powers and can seek information from in-scope companies overseas for the purposes of regulating and enforcing the regime. Obviously, companies such as Facebook are firmly in scope, as hon. Members would expect. The clause makes it clear that Ofcom can request information held outside the UK and interview individuals outside the UK, if that is necessary for its investigations.

Clause 167 explains that the information-related personal criminal offences in the Bill—for example, failing to comply with Ofcom’s information notices—apply to acts done inside and outside the UK. That means that those offences can be criminally prosecuted whether the perpetrator is based in the UK or outside the UK. That will send a clear message to the large global social media firms that no matter where they may be based in the world or where their services may be provided from, we expect them to comply and the enforcement provisions in the Bill will apply to them.

Question put and agreed to.

Clause 166 accordingly ordered to stand part of the Bill.

Clause 167 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned.—(Steve Double.)

16:39
Adjourned till Thursday 23 June at half-past Eleven o’clock.
Written evidence reported to the House
OSB78 James Wilson
OSB79 Meta (supplementary submission)
OSB80 Aviva
OSB81 Church of Scotland
OSB82 WebGroup Czech Republic, a.s. and NKL Associates s.r.o.
OSB83 Professor Uta Kohl, Professor of Law at Southampton Law School, and Dr Napoleon Xanthoulis, Lecturer in Law at Southampton Law School
OSB84 Barnardo’s
OSB85 The Children’s Society
OSB86 Professor Clare McGlynn, Durham Law School, Durham University, and Professor Lorna Woods OBE, School of Law, University of Essex

Online Safety Bill (Fifteenth sitting)

Committee stage
Thursday 23rd June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 23 June 2022
The Committee consisted of the following Members:
Chairs: † Sir Roger Gale, Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Thursday 23 June 2022
[Sir Roger Gale in the Chair]
Online Safety Bill
11:30
The Chair

Good morning, ladies and gentlemen. Please ensure your phones are switched to silent.

Clause 168

Publication by OFCOM

Question proposed, That the clause stand part of the Bill.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure to serve under your chairmanship, Sir Roger. Clause 168 is a very short and straightforward clause. Ofcom will be required to publish a variety of documents under the Online Safety Bill. The clause simply requires that this be done in a way that is appropriate and likely to bring it to the attention of any audience who are going to be affected by it. Ofcom is already familiar with this type of statutory obligation through existing legislation, such as the Digital Economy Act 2017, which places similar obligations on Ofcom. Ofcom is well versed in publishing documents in a way that is publicly accessible. Clause 168 puts the obligation on a clear statutory footing.

Barbara Keeley (Worsley and Eccles South) (Lab)

As the Minister said, clause 168 rightly sets out that the material the Bill requires of Ofcom is published in a way that will bring it to the attention of any audience likely to be affected by it. It will be important that all the guidance is published in a way that is easily available and accessible, including for people who are not neurotypical or who experience digital exclusion. I think we would all agree, after the work we have done on the Bill, that the subjects are complex and the landscape is difficult to understand. I hope Ofcom will make its documents as accessible as possible.

Question put and agreed to.

Clause 168 accordingly ordered to stand part of the Bill.

Clause 169

Service of notices

Question proposed, That the clause stand part of the Bill.

Chris Philp

Clause 169 sets out the process for the service of any notice under the Bill, including notices to deal with child sexual exploitation and abuse or terrorism content, information notices, enforcement notices, penalty notices and public statement notices to providers of regulated services both within and outside the United Kingdom. The clause sets out that Ofcom may give a notice to a person by handing it to them, leaving it at the person’s last known address, sending it by post to that address or sending it by email to the person’s email address. It provides clarity regarding who Ofcom must give notice to in respect of different structures. For example, notice may be given to an officer of a body corporate.

Barbara Keeley

As the Minister said, clause 169 sets out the process of issuing notices or decisions by Ofcom. It mostly includes provisions about how Ofcom is to contact the company, which seem reasonable. The Opposition do not oppose clause 169.

Question put and agreed to.

Clause 169 accordingly ordered to stand part of the Bill.

Clause 170

Repeal of Part 4B of the Communications Act

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to consider clauses 171 and 172.

Chris Philp

Clause 170 repeals the video-sharing platform regime. While the VSP and online safety regimes have similar objectives, the new framework in the Bill will be broader and will apply to a wider range of online platforms. It is for this reason that we will repeal the VSP regime and transition those entities regulated as VSPs across to the online safety regime, which is broader and more effective in its provisions. The clause simply sets out the intention to repeal the VSP regime.

Clause 171 repeals part 3 of the Digital Economy Act 2017. As we have discussed previously, the Online Safety Bill now captures all online sites that display pornography, including commercial pornography sites, social media sites, video sharing platforms, forums and search engines. It will provide much greater protection to children than the Digital Economy Act. The Digital Economy Act was criticised for not covering social media platforms, which this Bill does cover. By removing that section from the Digital Economy Act, we are laying the path to regulate properly and more comprehensively.

Finally, in this group, clause 172 amends section 1B of the Protection of Children Act 1978 and creates a defence to the offence of making an indecent photograph of a child for Ofcom, its staff and those assisting Ofcom in exercising its online safety duties. Clearly, we do not want to criminalise Ofcom staff while they are discharging their duties under the Bill that we are imposing on them, so it is reasonable to set out that such a defence exists. I hope that provides clarity to the Committee on the three clauses.

Barbara Keeley

The provisions in clauses 170 to 172, as the Minister has said, repeal or amend existing laws for the purposes of the Bill. As Labour supports the need to legislate on the issue of online safety, we will not oppose the clauses. However, I want to note that the entire process, up until the final abandonment of part 3 of the Digital Economy Act under clause 171 appears shambolic. It has been five years now since that part of the Act could have been implemented, which means five years during which children could have been better protected from the harms of pornographic content.

When the Government eventually admitted that part 3 was being ditched, the Minister at the time, the hon. Member for Boston and Skegness (Matt Warman), said that the Government would seek to take action on pornography more quickly than on other parts of the online harms regime. Stakeholders and charities have expressed concerns that we could now see a delay to the implementation of the duties on pornographic content providers, which is similar to the postponement and eventual abandonment of part 3 of the Digital Economy Act. I appreciate that the Minister gave some reassurance of his

“desire to get this done as quickly as possible”—[Official Report, Online Safety Bill Committee, 9 June 2022; c. 308.]

in our debate on clauses 31 to 33, but would it not be better to set out timeframes in the Bill?

Under clause 193, it appears that the only clauses in part 5 to be commenced once the Bill receives Royal Assent will be the definitions—clause 66 and clause 67(4)—and not the duties. That is because Ofcom is expected to issue a call for evidence, after which draft proposals for consultation are published, which then need to be agreed by the Secretary of State and laid before Parliament. There are opportunities there for delays and objections at any stage and, typically, enforcement will be implemented only in a staged fashion, from monitoring to supervision. The consultations and safeguarding processes are necessary to make the guidance robust; we understand that. However, children cannot wait another three years for protections, having been promised protection under part 3 of the Digital Economy Act five years ago, which, as I have said, was never implemented.

The provisions on pornography in part 5 of the Bill require no secondary legislation so they should be implemented as quickly as possible to minimise the amount of time children continue to be exposed to harmful content. It would be irresponsible to wait any longer than absolutely necessary, given the harms already caused by this drawn-out process.

Kirsty Blackman (Aberdeen North) (SNP)

Thank you, Sir Roger, for chairing this meeting this morning. I want to agree with the Opposition’s points about the timing issue. If an Act will repeal another one, it needs to make sure that there is no gap in the middle and, if the repeal takes place on one day, that the Bill’s provisions that relate to that are in force and working on the same day, rather than leaving a potential set-up time gap.

On clause 170 and repealing the part of the Communications Act 2003 on video-sharing platform services, some concerns have been raised that the requirements in the Online Safety Bill do not exactly mirror the same provisions in the video-sharing platform rules. I am not saying necessarily or categorically that the Online Safety Bill is less strong than the video-sharing platform rules currently in place. However, if the legislation on video-sharing platform services is repealed, the Online Safety Act, as it will be, will become the main way of regulating video-sharing platforms, and the concern is that there could be a degradation in the protections provided on those platforms and an increase in some of the issues and concerns we have seen raised. Will the Minister keep that under review and consider how that could be improved? We do not want to see this getting worse simply because one regime has been switched for another that, as the Minister said, is broader and has stronger protections. Will he keep under review whether that turns out to be the case when the Act has bedded in, when Ofcom has the ability to take action and properly regulate—particularly, in this case, video-sharing platforms?

Chris Philp

I agree with the hon. Member for Worsley and Eccles South, that we want to see these provisions brought into force as quickly as possible, for the reasons that she set out. We are actively thinking about ways of ensuring that these provisions are brought into force as fast as possible. It is something that we have been actively discussing with Ofcom, and that, I hope, will be reflected in the road map that it intends to publish before the summer. That will of course remain an area of close working between the Department for Digital, Culture, Media and Sport and Ofcom, ensuring that these provisions come into force as quickly as possible. Of course, the illegal duties will be brought into force more quickly. That includes the CSEA offences set out in schedule 6.

The hon. Member for Aberdeen North raised questions in relation to the repeal of part 3 of the Digital Economy Act. Although that is on the statute book, it was never commenced. When it is repealed, we will not be removing from force something that is applied at the moment, because the statutory instrument to commence it was never laid. So the point she raised about whether the Bill would come into force the day after the Digital Economy Act is repealed does not apply; but the point she raised about bringing this legislation into force quickly is reasonable and right, and we will work on that.

The hon. Lady asked about the differences in scope between the video-sharing platform and the online safety regime. As I said, the online safety regime does have an increased scope compared with the VSP regime, but I think it is reasonable to keep an eye on that as she suggested, and keep it under review. There is of course a formal review mechanism in clause 149, but I think that more informally, it is reasonable that as the transition is made we keep an eye on it, as a Government and as parliamentarians, to ensure that nothing gets missed out.

I would add that, separately from the Bill, the online advertising programme is taking a holistic look at online advertising in general, and that will also be looking at matters that may also touch on the VSPs and what they regulate.

Question put and agreed to.

Clause 170 accordingly ordered to stand part of the Bill.

Clauses 171 and 172 ordered to stand part of the Bill.

Clause 173

Powers to amend section 36

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to take clauses 174 to 176 stand part.

Chris Philp

The clause gives the Secretary of State the power to amend the list of fraud offences in section 36, which underpins the duties about fraudulent advertising. These are the new duties that were introduced following feedback from Parliament, the Joint Committee, Martin Lewis and many other people. The power is there to ensure that we can keep the list of fraud offences up to date. The power to make those changes is subject to some constraints, as we would expect. The clause lists the criteria that any new offences must meet before the Secretary of State can include them in the section 36 list: the criteria relate to the prevalence on category 1 services of paid-for advertisements that amount to the new offence, and to the risk and severity of harm that such content poses to individuals in the UK.

The clause further limits the Secretary of State’s power to include new fraud offences, listing types of offence that may not be added. Offences from the Consumer Protection from Unfair Trading Regulations would be one instance. As I mentioned, the power to update section 36 is necessary to ensure that the legislation is future-proofed against new legislation and changes in criminal behaviour. Hon. Members have often said that it is important to ensure that the Bill is future-proof, and here is an example of exactly that future-proofing.

11:45
Clause 174, also in this group, sets out that the Bill includes a number of particular exemptions to ensure that it remains targeted and proportionate. We recognise that certain kinds of content and services are currently low risk and merit an exemption from the framework, but in the future that may change, and if the risk level does change, parliamentarians and the public will expect us to bring those into the scope of the Bill. Again, this is an example of future-proofing, so that if the world changes in a way that we have not anticipated, the Bill can be updated to ensure that nothing slips through the net. Again, that is consistent with what members of the Committee have been saying about the need to ensure that the Bill is future proof.
Clause 175 provides for updating the list of categories of education and childcare providers. That is essential to ensure that the exemption in paragraph 10 of schedule 1 continues to apply to the correct providers over time, which I think, again, is a reasonable proposition.
Clause 176 provides powers to amend schedules 5, 6 and 7. Those schedules, as colleagues will recall, cover priority criminal offences, which is schedule 7, child sexual exploitation and abuse offences, which is schedule 6, and terrorism offences, which is schedule 5. Clearly, if new offences are created or if there are existing offences that Parliament believes need to be added to these priority lists of offences, we need the flexibility to do that. An example might be a new offence created by a devolved Administration, a new offence that Parliament here at Westminster legislates for that we think needs to be a priority offence, or an existing offence that is not on the list now but in the future we think needs to be added to ensure that platforms proactively protect the public. We need this flexibility. Again, this speaks to the future-proofing of the Bill that Members have spoken about. It is an extremely important aspect of the Bill’s ability to respond to threats that may emerge in the future and to new legislation.
Alex Davies-Jones (Pontypridd) (Lab)

Good morning, Sir Roger. As the Minister has outlined, clause 173 gives the Secretary of State the power to amend the list of fraud offences in what will be section 36 in relation to the duties about fraudulent advertising. Although we recognise that this power is subject to some constraints, Labour has concerns about what we consider to be an unnecessary power given to the Secretary of State to amend duties about fraudulent advertising on category 1 services.

We welcome the provisions outlined in clause 173(2), which lists the criteria that any new offences must meet before the Secretary of State may include them in the list of fraud offences in section 36. The Minister outlined some of those. Along the same lines, the provision in clause 173(3) to further limit the Secretary of State’s power to include new fraud offences—it lists types of offences that may not be added to section 36—is a positive step.

However, we firmly believe that delegated law making of this nature, even when there are these minor constraints in place, is a worrying course for the Government to pursue when we have already strongly verbalised our concerns about Ofcom’s independence. Can the Minister alleviate our concerns by clarifying exactly how this process will work in practice? He must agree with the points that colleagues from across the House have made about the importance of Ofcom being truly independent and free from any political persuasion, influence or control. We all want to see the Bill change things for the better so I am keen to hear from the Minister the specific reasoning behind giving the Secretary of State the power to amend this important legislation through what will seemingly be a simple process.

As we all know, clause 174 allows the Secretary of State to make regulations to amend or repeal provisions relating to exempt content or services. Regulations made under this clause can be used to exempt certain content or services from the scope of the regulatory regime, or to bring them into scope. It will come as no surprise to the Minister that we have genuine concerns about the clause, given that it gives the Secretary of State of the day the power to amend the substantive scope of the regulatory regime. In layman’s terms, we see this clause as essentially giving the Secretary of State the power to, through regulations, exempt certain content and services from the scope of the Bill, or bring them into scope. Although we agree with the Minister that a degree of flexibility is crucial to the Bill’s success and we have indeed raised concerns throughout the Bill’s proceedings about the need to future-proof the Bill, it is a fine balance, and we feel that these powers in this clause are in excess of what is required. I will therefore be grateful to the Minister if he confirms exactly why this legislation has been drafted in a way that will essentially give the Secretary of State free rein on these important regulations.

Clauses 175 and 176 seek to give the Secretary of State additional powers, and again Labour has concerns. Clause 175 gives the Secretary of State the power to amend the list in part 2 of schedule 1, specifically paragraph 10. That list sets out descriptions of education and childcare relating to England; it is for the relevant devolved Ministers to amend the list in their respective areas. Although we welcome the fact that certain criteria must be met before the amendments can be made, this measure once again gives the Secretary of State of the day the ability substantively to amend the scope of the regime more broadly.

Those concerns are felt even more strongly when we consider clause 176, which gives the Secretary of State the power to amend three key areas in the Bill—schedules 5, 6 and 7, which relate to terrorism offences, to child sexual exploitation and abuse content offences—except those extending to Scotland—and to priority offences in some circumstances. Alongside stakeholders, including Carnegie, we strongly feel that the Secretary of State should not be able to amend the substantive scope of the regime at this level, unless moves have been initiated by Ofcom and followed by effective parliamentary oversight and scrutiny. Parliament should have a say in this. There should be no room for this level of interference in a regulatory regime, and the Minister knows that these powers are at risk of being abused by a bad actor, whoever the Secretary of State of the day may be. I must, once again, press the Minister to specifically address the concerns that Labour colleagues and I have repeatedly raised, both during these debates and on Second Reading.

Kirsty Blackman

I have a couple of questions, particularly on clause 176 and the powers to amend schedules 6 and 7. I understand the logic for schedule 5 being different—in that terrorism offences are a wholly reserved matter—and therefore why only the Secretary of State would be making any changes.

My question is on the difference in the ways to amend schedules 6 and 7—I am assuming that Government amendment 126, which asks the Secretary of State to consult Scottish Ministers and the Department of Justice in Northern Ireland, and which we have already discussed, will be voted on and approved before we come to clause 176. I do not understand the logic for having different procedures to amend the child sexual exploitation and abuse offences and the priority offences. Why have the Government chosen two different procedures for amending the two schedules?

I understand why that might not be a terribly easy question to answer today, and I would be happy for the Minister to get in touch afterwards with the rationale. It seems to me that both areas are very important, and I do not quite understand why the difference is there.

Chris Philp

Let me start by addressing the questions the shadow Minister raised about these powers. She used the phrase “free rein” in her speech, but I would not exactly describe it as free rein. If we turn to clause 179, which we will come to in a moment or two, and subsection (1)(d), (e), (f) and (g), we see that all the regulations made under clauses 173 to 176, which we are debating, require an SI under the affirmative procedure. Parliament will therefore get a chance to have its say, to object and indeed to vote down a provision if it wishes to. It is not that the Secretary of State can act alone; changes are subject to the affirmative SI procedure.

It is reasonable to have a mechanism to change the lists of priority offences and so on by affirmative SI, because the landscape will change and new offences will emerge, and it is important that we keep up to date. The only alternative is primary legislation, and a slot for a new Act of Parliament does not come along all that often—perhaps once every few years for any given topic. I think that would lead to long delays—potentially years—before the various exemptions, lists of priority offences and so on could be updated. I doubt that it is Parliament’s intention, and it would not be good for the public if we had to wait for primary legislation to change the lists. The proposed mechanism is the only sensible and proportionate way to do it, and it is subject to a parliamentary vote.

A comment was made about Ofcom’s independence. The way the offences are defined has no impact on Ofcom’s operational independence. That is about how Ofcom applies the rules; this is about what the rules themselves are. It is right that we are able to update them relatively nimbly by affirmative SI.

The hon. Member for Aberdeen North asked about the differences in the way schedules 6 and 7 can be updated. I will happily drop her a line with further thoughts if she wants me to, but in essence we are happy to get the Scottish child sexual exploitation and abuse offences, set out in part 2 of schedule 6, adopted as soon as Scottish Ministers want. We do not want to delay any measures on child exploitation and abuse, and that is why it is done automatically. Schedule 7, which sets out the other priority offences, could cover any topic at all—any criminal offence could fall under that schedule—whereas schedule 6 is only about child sexual exploitation and abuse. Given that the scope of schedule 7 takes in any criminal offence, it is important to consult Scottish Ministers if it is a Scottish offence but then use the statutory instrument procedure, which applies it to the entire UK internet. Does the hon. Lady want me to write to her, or does that answer her question?

Kirsty Blackman

That is actually incredibly helpful. I do not need a further letter, thanks.

Chris Philp

I am grateful to the hon. Lady for saving DCMS officials a little ink, and electricity for an email.

I hope I have addressed the points raised in the debate, and I commend the clause to the Committee.

Question put and agreed to.

Clause 173 accordingly ordered to stand part of the Bill.

Clauses 174 and 175 ordered to stand part of the Bill.

Clause 176

Powers to amend Schedules 5, 6 and 7

Amendment made: 126, in clause 176, page 145, line 4, at end insert—

“(5A) The Secretary of State must consult the Scottish Ministers before making regulations under subsection (3) which—

(a) add an offence that extends only to Scotland, or

(b) amend or remove an entry specifying an offence that extends only to Scotland.

(5B) The Secretary of State must consult the Department of Justice in Northern Ireland before making regulations under subsection (3) which—

(a) add an offence that extends only to Northern Ireland, or

(b) amend or remove an entry specifying an offence that extends only to Northern Ireland.”—(Chris Philp.)

This amendment ensures that the Secretary of State must consult the Scottish Ministers or the Department of Justice in Northern Ireland before making regulations which amend Schedule 7 in connection with an offence which extends to Scotland or Northern Ireland only.

Clause 176, as amended, ordered to stand part of the Bill.

Clause 177

Power to make consequential provision

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss the following:

Clause 178 stand part.

Government amendment 160.

Clause 179 stand part.

Chris Philp

As new services and functions emerge and evolve, and platforms and users develop new ways to interact online, the regime will need to adapt. Harms online will also continue to change, and the framework will not function effectively if it cannot respond to these changes. These clauses provide the basis for the exercise of the Secretary of State’s powers under the Bill to make secondary legislation. The Committee has already debated the clauses that confer the relevant powers.

Clause 177 gives the Secretary of State the power to make consequential changes to this legislation or regulations made under it. It further provides that the regulations may amend or repeal relevant provisions made under the Communications Act 2003 or by secondary legislation made under that Act. The power is necessary to give effect to the various regulation-making powers in the Bill, which we have mostly already debated, and to ensure that the provisions of the 2003 Act and regulations that relate to online safety can continue to be updated as appropriate. That is consistent with the principle that the Bill must be flexible and future-proof. The circumstances in which these regulation-making powers may be exercised are specified and constrained by the clauses we have previously debated. Clause 178 ensures that the regulation-making powers in the Bill may make different provisions for different purposes, in particular ensuring that regulations make appropriate provisions for different types of service.

Amendment 160 forms part of a group of amendments that will allow Ofcom to recover costs from the regulated services for work that Ofcom carries out before part 6 of the Bill is commenced. As I said previously, the costs may be recouped over a period of three to five years. Currently, the costs of preparations for the exercise of safety functions include only costs incurred after commencement. The amendment makes sure that initial costs incurred before commencement can be recouped as well.

Clause 179 sets out the procedure that should be used for the regulation-making powers in the Bill to ensure that the procedures used are proportionate to the power conferred. We have ensured that all so-called Henry VIII powers, which enable secondary legislation to be used to amend primary legislation, are subject to the affirmative procedure. That way, Parliament will have proper oversight of any changes to this legislation or other Acts of Parliament. I accept the shadow Minister’s point that this cannot be a matter for the Secretary of State acting alone; proper parliamentary approval is needed. That is why clause 179 is constructed as it is.
Alex Davies-Jones

Again, Labour has concerns about clause 177, which gives the Secretary of State a power to make consequential provisions relating to the Bill or regulations under the Bill. As we know, the power is exercised by regulation and includes the ability to amend the Communications Act 2003. I will spare the Committee a repetition of my sentiments, but we do feel that the clause is part of an extremely worrying package of clauses related to the Secretary of State’s powers, which we feel are broadly unnecessary.

We have the same concerns about clause 178, which sets out how the powers to make regulations conferred on the Secretary of State may be used. Although we recognise that it is important in terms of flexibility and future-proofing that regulations made under the Bill can make different provisions for different purposes, in particular relating to different types of service, we are concerned about the precedent that this sets for future legislation that relies on an independent regulatory system.

Labour supports amendment 160, which will ensure that the regulations made under new schedule 2, which we will debate shortly, are subject to the affirmative procedure. That is vital if the Bill is to succeed. We have already expressed our concerns about the lack of scrutiny of other provisions in the Bill, so we see no issue with amendment 160.

The Minister has outlined clause 179, and he knows that we welcome parliamentary oversight and scrutiny of the Bill more widely. We regard this as a procedural clause and have therefore not sought to amend it.

Question put and agreed to.

Clause 177 accordingly ordered to stand part of the Bill.

Clause 178 ordered to stand part of the Bill.

Clause 179

Parliamentary procedure for regulations

Amendment made: 160, in clause 179, page 146, line 13, at end insert “, or

(k) regulations under paragraph 7 of Schedule (Recovery of OFCOM’s initial costs),”—(Chris Philp.)

This amendment provides that regulations under NS2 are subject to the affirmative procedure.

Clause 179, as amended, ordered to stand part of the Bill.

Clause 180

“Provider” of internet service

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to consider the following:

Clauses 181 to 188 stand part.

Amendment 76, in clause 189, page 154, line 34, after “including” insert “but not limited to”.

This amendment clarifies the definition of “content” in the bill in order that anything communicated by means of an internet service is considered content, not only those examples listed.

Alex Davies-Jones

I will address clauses 180 to 182 together, before moving on to discuss our concerns about the remaining clauses in this group.

As we know, clause 180 determines who is the provider of an internet service and therefore who is subject to the duties imposed on providers. Labour has already raised concerns about the Bill’s lack of future-proofing and its inability to incorporate internet services that may include user-to-user models. The most obvious of those are user-to-user chat functions in gaming, which the hon. Member for Aberdeen North has raised on a number of occasions; we share her concerns.

Broadly, we think the Bill as it stands fails to capture the rapidity of technological advances, and the gaming industry is a key example of this. The Bill targets the providers that have control over who may use the user-to-user functions of a game, but in our view the clarity just is not there for emerging tech in the AI space in particular, so we would welcome the Minister’s comments on where he believes this is defined or specified in the Bill.

Clause 181 defines “user”, “United Kingdom user” and “interested person” in relation to regulated services. We welcome the clarification outlined in subsections (3) and (4) of the role of an employee at a service provider and their position when uploading content. We support the clarity on the term “internet service” in clause 182, and we welcome the provisions to capture services that are accessed via an app specifically, rather than just via an internet browser.

We welcome clause 183, which sets out the meaning of “search engine”. It is important to highlight the difference between search engines and user-to-user services, which has been attempted throughout the Bill. We heard from Google about its definition of “search”, and Labour agrees that, at their root, search services exist as an index of the web, and are therefore different from user-to-user services. We also fully appreciate the rapid nature of the internet—hundreds of web pages are created every single second—meaning that search services have a fundamental role to play in assisting users to find authoritative information that is most relevant to what they are seeking. Although search engines do not directly host content, they have an important role to play in ensuring that a delicate balance is maintained between online safety and access to lawful information. We are therefore pleased to support clause 183, which we feel broadly outlines the responsibilities placed on search services more widely.

On clause 184, Labour supports the need for a proactive technology to be used by regulated service providers to comply with their duties on illegal content, content that is harmful to children, and fraudulent advertising. In our consideration of proactive technology elsewhere in the Bill, Labour has made it clear that we support measures to keep us all safe. When speaking to new clause 20, which we debated with clause 37, I made it clear that we disagree with the Bill’s stance on proactive technology. As it is, the Bill will leave Ofcom unable to proactively require companies to use technology that can detect child abuse. Sadly, I was not particularly reassured by the Minister’s response, but it is important to place on the record again our feeling that proactive technology has an important role to play in improving online safety more widely.

Clause 185 provides information to assist Ofcom in its decision making on whether, in exercising its powers under the Bill, content is communicated publicly or privately. We see no issues with the process that the clause outlines. It is fundamentally right that, in the event of making an assessment of public or private content, Ofcom has a list of factors to consider and a subsequent process to follow. We will therefore support clause 185, which we have not sought to amend.

Clause 186 sets out the meaning of the term “functionality”. Labour supports the clause, particularly the provisions in subsection (2), which include the detailed ways in which platforms’ functionality can affect subsequent online behaviours. Despite our support, I put on the record our concern that the definitions in the clause do little to imagine or capture the broad nature of platforms or, indeed, the potential for them to expand into the AI space in future.

The Minister knows that Labour has advocated a systems-based approach to tackling online safety that would put functionality at the heart of the regulatory system. It is a frustrating reality that those matters are not outlined until clause 186. That said, we welcome the content of the clause, which we have not sought to amend.

Clause 187 aims to define “harm” as “physical or psychological harm”. Again, we feel that that definition could go further. My hon. Friend the Member for Batley and Spen spoke movingly about her constituent Zach in an earlier debate, and made a compelling case for clarity on the interplay between the physical and psychological harm that can occur online. The Minister said that the Government consider the Bill to cover a range of physical and psychological harms, but many charities disagree. What does he say to them?

We will shortly be considering new clause 23, and I will outline exactly how Labour feels that the Bill fails to capture the specific harms that women and girls face online. It is another frustrating reality that the Government have not taken the advice of so many stakeholders, and of so many women and girls, to ensure that those harms are on the face of the Bill.

Labour agrees with the provisions in clause 188, which sets out the meaning of “online safety functions” and “online safety matters”, so we have not sought to amend the clause.

Would it be appropriate for me to speak to the SNP amendment as well, Sir Roger?

The Chair

Not really. If the hon. Lady has finished with her own amendments, we should, as a courtesy, allow the SNP spokesperson to speak to her amendment first.

Kirsty Blackman

Thank you, Sir Roger. I thank the shadow Minister for running through some of our shared concerns about the clauses. Similarly, I will talk first about some of the issues and questions that I have about the clauses, and then I will speak to amendment 76. Confusingly, amendment 76 was tabled to clause 189, which we are not discussing right now. I should have raised that when I saw the provisional selection of amendments. I will do my best not to stray too far into clause 189 while discussing the amendment.

I have raised before with the Minister some of the questions and issues that I have. Looking specifically at clause 181, I very much appreciate the clarification that he has given us about users, what the clause actually means, and how the definition of “user” works. To be fair, I agree with the way the definition of “user” is written. My slight concern is that, in measuring the number of users, platforms might find it difficult to measure the number of unregistered users and the number of users who are accessing the content through another means.

Let us say, for example, that someone is sent a WhatsApp message with a TikTok link and they click on that. I do not know whether TikTok has the ability to work out who is watching the content, or how many people are watching it. Therefore, I think that TikTok might have a difficulty when it comes to the child safety duties and working out the percentage or number of children who are accessing the service, because it will not know who is accessing it through a secondary means.

I am not trying to give anyone a get-out clause. I am trying to ensure that Ofcom can properly ensure that platforms that have a significant number of children accessing them through secondary means are still subject to the child safety duties even though there may not be a high number of children accessing the platform or the provider directly. My major concern is assessing whether they are subject to the child safety duties laid out in the Bill.

I will move straight on to our amendment 76, which would amend the definition of “content” in clause 189. I have raised this issue with the Minister already. The clause, as amended, would state that

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including but not limited to”—

and then a list. The reason I suggest that we should add those words “but not limited to” is that if we are to have a list, we should either make an exhaustive list or have clarity that there are other things that may not be on the list.

I understand that it could be argued that the word “including” suggests that the provision actually goes much wider than what is in the list. I understand that that is the argument that the Minister may make, but can we have some more clarity from him? If he is not willing to accept the amendment but he is willing to be very clear that, actually, the provision does include things that we have not thought of and that do not currently exist and that it genuinely includes anything communicated by means of an internet service, that will be very helpful.

I think that the amendment would add something positive to the Bill. It is potentially the most important amendment that I have tabled in relation to future-proofing the Bill, because it does feel as though the definition of “content”, even though it says “including”, is unnecessarily restrictive and could be open to challenge should someone invent something that is not on the list and say, “Well, it’s not mentioned, so I am not going to have to regulate this in the way we have to regulate other types of content.”

I have other questions about the same provision in clause 189, but I will hold on to those until we come to the next grouping.

Alex Davies-Jones

I rise briefly to support amendment 76, in the name of the hon. Member for Aberdeen North. Labour supports broadening the definition of “content” in this way. I refer the Minister to our earlier contributions about the importance of including newspaper comments, for example, in the scope of the Bill. This is a clear example of a key loophole in the Bill. We believe that a broadened definition of “content” would be a positive step forward to ensure that there is future-proofing, to prevent any unnecessary harm from any future content.

Chris Philp

The shadow Minister, in her first contribution to the debate, introduced the broad purpose of the various clauses in this group, so I do not propose to repeat those points.

I would like to touch on one or two issues that came up. One is that clause 187 defines the meaning of “harm” throughout the Bill, although clause 150, as we have discussed, has its own internal definition of harm that is different. The more general definition of harm is made very clear in clause 187(2), which states:

“‘Harm’ means physical or psychological harm.”

That means that harm has a very broad construction in the Bill, as it should, to make sure that people are being protected as they ought to be.

12:15
The shadow Minister made a point about women and girls. I suspect that we will debate this in some detail when we reach one of the new clauses that she has tabled, which I guess we will come to on Tuesday. I do not want to speak to this issue at length, given that we will probably discuss it more then, but I will make a couple of brief points.
On the risk assessment duties from which the safety duties flow, as we have debated previously, clause 10(6)(d) states that the risk assessments have to cover
“individuals with a certain characteristic or members of a certain group”,
which obviously includes women and girls, meaning that matters that are particular to women and girls—or, indeed, to other groups, such as ethnic minorities, people with a particular sexual orientation and so on—will have to be addressed in those risk assessment duties, and that will then flow through into the other safety duties.
The same applies to the safety duties relating to adults in clause 12(5)(d). Again, an individual’s characteristics, which include gender, have to be properly taken into account. I will also mention in passing that many of the priority offences in schedule 7 are offences where women are overwhelmingly likely to be the victims, such as harassment, stalking and so on, but I suspect that we will debate this issue in much more detail on Tuesday, so we can go through these points then.
I understand the point of amendment 76 to clause 189. We agree that the Bill should cover matters that are not on the list, but the word “including” does not limit what can be included to the things that follow it. It can include other things as well—it is not restrictive—so we do not think the words “but not limited to” need to be added, particularly when the very first sentence of the definition says that
“‘content’ means anything communicated by means of an internet service”.
If a judge or Ofcom comes to interpret the clause in due course, the use of the word “anything” in the previous line is very important, because “anything” is universal and, as the word suggests, means absolutely everything. The word “including” follows “anything”, so it is clear that the list of items that follows—I am happy to put this on the record—is not an exclusive or exhaustive list. Indeed, it could not possibly be; if it was, the “anything” in the previous sentence would not work.
On the definition of “user” and the numbers, the duties to protect children apply to children who access content “by means of” a site, which includes those who access it through one site and on to another, as we have discussed previously. The first point of access, from which someone may then go to a second site, would, of course, be tracking the numbers, and that would then get caught. Beyond that, even where a site has people looking at its content who are not registered, very often it will be tracking them by way of cookies. The main point is that the duty is on the primary access point to protect children who are accessing content through its site, and to keep track of the numbers for the purpose of working out whether the “significant” test is met.
I hope I have responded to the points raised. Obviously, I believe that clauses 181 to 189 should stand part of the Bill. Although I agree with the intent behind amendment 76, I do not think it is necessary, because the existing drafting already achieves what the hon. Member for Aberdeen North seeks to achieve.
Question put and agreed to.
Clause 180 accordingly ordered to stand part of the Bill.
Clauses 181 to 188 ordered to stand part of the Bill.
Clause 189
Interpretation: general
The Chair

Amendment 111 is not claimed; it has been tabled by the hon. Member for Stroud (Siobhan Baillie), who is not a member of the Committee. I am assuming that nobody wishes to take ownership of it and we will not debate it.

If the hon. Member for Aberdeen North wishes to move amendment 76, she will be able to do so at the end of the stand part debate.

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

As we know, the clause sets out the meanings of various terms used in the Bill. Throughout our Committee debates, Labour has raised fundamental concerns on a number of points where we feel the interpretation of the Bill requires clarification. We raised concerns as early as clause 8, when we considered the Bill’s ability to capture harm in relation to newly produced CSEA content and livestreaming. The Minister may feel he has sufficiently reassured us, but I am afraid that simply is not the case. Labour has no specific issues with the interpretations listed in clause 189, but we will likely seek to table further amendments on Report in the areas that we feel require clarification.

Kirsty Blackman

In one of our earlier debates, I asked the Minister about the difference between “oral” and “aural”, and I did not get a very satisfactory answer. I know the difference in their dictionary definition—I understand that they are different, although the words sound the same. I am confused that clause 189 uses “oral” as part of the definition of content, but clause 49 refers to

“one-to-one live aural communications”

in defining things that are excluded.

I do not understand why the Government have chosen to use those two different words in different places in the Bill. It strikes me that, potentially, we mean one or the other. If they do mean two different things, why has one thing been chosen for clause 49 and another thing for clause 189? Why has the choice been made that clause 49 relates to communications that are heard, but clause 189 relates to communications that are said? I do not quite get the Government’s logic in using those two different words.

I know this is a picky point, but in order to have good legislation, we want it to make sense, for there to be a good rationale for everything that is in it and for people to be able to understand it. At the moment, I do not properly understand why the choice has been made to use two different words.

More generally, the definitions in clause 189 seem pretty sensible, notwithstanding what I said in the previous debate in respect of amendment 76, which, with your permission, Sir Roger, I intend to move when we reach the appropriate point.

Chris Philp

As the hon. Member for Pontypridd said, clause 189 sets out various points of definition and interpretation necessary for the Bill to be understood and applied.

I turn to the question raised by the hon. Member for Aberdeen North. First, I strongly commend and congratulate her on having noticed the use of the two words. Anyone who thinks that legislation does not get properly scrutinised by Parliament has only to look to the fact that she spotted this difference, 110 pages apart, in two different clauses—clauses 49 and 189. That shows that these things do get properly looked at. I strongly congratulate her on that.

I think the best way of addressing her question is probably to follow up with her after the sitting. Clause 49 relates to regulated user-to-user content. We are in clause 49(2)—is that right?

Kirsty Blackman

Subsection (5).

Chris Philp

It is cross-referenced in subsection (5). The use of the term “aural” in that subsection refers to sound only—what might typically be considered telephony services. “Oral” is taken to cover livestreaming, which includes pictures and voice. That is the intention behind the use of the two different words. If that is not sufficient to explain the point—it may not be—I would be happy to expand in writing.

Kirsty Blackman

That would be helpful, in the light of the concerns I raised and what the hon. Member for Pontypridd mentioned about gaming, and how those communications work on a one-to-one basis. Having clarity in writing on whether clause 49 relates specifically to telephony-type services would be helpful, because that is not exactly how I read it.

Chris Philp

Given that the hon. Lady has raised the point, it is reasonable that she requires more detail. I will follow up in writing on that point.

Amendment proposed: 76, in clause 189, page 154, line 34, after “including” insert “but not limited to”.(Kirsty Blackman.)

This amendment clarifies the definition of “content” in the bill in order that anything communicated by means of an internet service is considered content, not only those examples listed.

Question put, That the amendment be made.

Division 50

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

Clause 189 ordered to stand part of the Bill.
Clause 190
Index of defined terms
Question proposed, That the clause stand part of the Bill.
Alex Davies-Jones

Labour has not tabled any amendments to clause 190, which lists the provisions that define or explain terms used in the Bill. However, it will come as no surprise that we dispute the Bill’s definition of harm, and I am grateful to my hon. Friend the Member for Batley and Spen for raising those important points in our lively debate about amendment 112 to clause 150. We maintain that the Minister has missed the point, in that the Bill’s definition of harm fails to truly capture physical harm caused as a consequence of being online. I know that the Minister has promised to closely consider that as we head to Report stage, but I urge him to bear in mind the points raised by Labour, as well as his own Back Benchers.

The Minister knows, because we have repeatedly raised them, that we have concerns about the scope of the Bill’s provisions relating to priority content. I will not repeat myself, but he will be unsurprised to learn that this is an area in which we will continue to prod as the Bill progresses through Parliament.

Chris Philp

I have made points on those issues previously. I do not propose to repeat now what I have said before.

Question put and agreed to.

Clause 190 accordingly ordered to stand part of the Bill.

Clause 191 ordered to stand part of the Bill.

Clause 192

Extent

Chris Philp

I beg to move amendment 141, in clause 192, page 160, line 9, at end insert—

“(aa) section (Offence under the Obscene Publications Act 1959: OFCOM defence);”.

This amendment provides for NC35 to extend only to England and Wales.

The Chair

With this it will be convenient to discuss Government new clause 35—Offence under the Obscene Publications Act 1959: OFCOM defence

“(1) Section 2 of the Obscene Publications Act 1959 (prohibition of publication of obscene matter) is amended in accordance with subsections (2) and (3).

(2) After subsection (5) insert—

“(5A) A person shall not be convicted of an offence against this section of the publication of an obscene article if the person proves that—

(a) at the time of the offence charged, the person was a member of OFCOM, employed or engaged by OFCOM, or assisting OFCOM in the exercise of any of their online safety functions (within the meaning of section 188 of the Online Safety Act 2022), and

(b) the person published the article for the purposes of OFCOM’s exercise of any of those functions.”

(3) In subsection (7)—

(a) the words after “In this section” become paragraph (a), and

(b) at the end of that paragraph, insert “;

(b) “OFCOM” means the Office of Communications.””

This new clause (to be inserted after clause 171) amends section 2 of the Obscene Publications Act 1959 to create a defence for OFCOM and their employees etc to the offence of the publication of an obscene article.

Chris Philp

New clause 35 amends section 2 of the Obscene Publications Act 1959 to create a defence for Ofcom to the offence of publishing an obscene article where Ofcom is exercising its online safety duties. Ofcom has a range of functions that may result in its staff handling such content, so we want to ensure that that is covered properly. We have debated that already.

Clause 192 covers territorial extent. The regulation of the internet, as a reserved matter, covers all of the United Kingdom, but particular parts of the Bill extend to particular areas of the UK. In amending the Obscene Publications Act, we are ensuring that the Bill applies to the relevant parts of the United Kingdom, because that area of legislation has different areas of applicability. The clause and our amendments are important in ensuring that that is done in the right way.

12:30
Alex Davies-Jones

The clause provides that the Bill extends to England, Wales, Scotland and Northern Ireland, subject to the exceptions set out in subsections (2) to (7). We welcome clarification of how the devolved nations may be affected by the provisions of the Bill—that is of particular importance to me as a Welsh MP. It is important to clarify how amendments or appeals, as outlined in subsection (7), may work in the context of devolution more widely.

Labour also supports new clause 35 and Government amendment 141. Clearly, those working for Ofcom should have a defence to the offence of publishing obscene articles as, sadly, we see that as a core part of establishing the online safety regime in full. We know that having such a defence available is likely to be an important part of the regulator’s role and that of its employees. Labour is therefore happy to support this sensible new clause and amendment.

Kirsty Blackman

The Opposition spokesperson has said it all.

Amendment 141 agreed to.

Clause 192, as amended, ordered to stand part of the Bill.

Clause 193

Commencement and transitional provision

The Chair

Amendment 139 was tabled by a Member who is not a member of the Committee, and nobody has claimed it, so we come to amendment 49.

Alex Davies-Jones

I beg to move amendment 49, in clause 193, page 161, line 1, leave out subsection (2) and insert—

“(2) Subject to subsection (2A) below, the other provisions of this Act come into force on such day as the Secretary of State may by regulations appoint.

(2A) The provisions of Part 5 shall come into force at the end of the period of three months beginning with the day on which this Act is passed.”

This amendment would bring Part 5 into force three months after the Act is passed.

We all understand the need for the Bill, which is why we have been generally supportive in Committee. I hope we can also agree that the measures that the Bill introduces must come into force as soon as is reasonably possible. That is particularly important for the clauses introducing protections for children, who have been subject to the harms of the online world for far too long already. I was glad to hear the Minister say in our discussions of clauses 31 to 33 that the Government share the desire to get such protections in place quickly.

My hon. Friend the Member for Worsley and Eccles South also spoke about our concerns about the commencement and transitional provisions when speaking to clauses 170 to 172. We fundamentally believe that the provisions on pornography in part 5 cannot, and should not, be susceptible to further delay, because they require no secondary legislation. I will come to that point in my comments on the amendment. More broadly, I will touch briefly on the reasons why we cannot wait for the legislation and make reference to a specific case that I know colleagues across the House are aware of.

My hon. Friend the Member for Reading East (Matt Rodda) has been a powerful voice on behalf of his constituents Amanda and Stuart Stephens, whose beloved son Olly was tragically murdered in a field outside his home. A BBC “Panorama” investigation, shown only a few days ago, investigated the role that social media played in Olly’s death. It specifically highlighted disturbing evidence that some social media algorithms may still promote violent content to vulnerable young people. That is another example highlighting the urgent need for the Bill, along with a regulatory process to keep people safe online.

We also recognise, however, the important balance between the need for effective development of guidance by Ofcom, informed by consultation, and the need to get the duties up and running. In some cases, that will mean stipulating deadlines in the Bill; the absence of such deadlines is, we feel, a serious omission and oversight at present.

The amendment would bring part 5 of the Bill into force three months after it is enacted. The Minister knows how important part 5 is, so I do not need to repeat myself. The provisions of the amendment, including subsequent amendments that Labour and others will likely table down the line, are central to keeping people safe online. We have heard compelling evidence from experts and speeches from colleagues across the House that have highlighted how vital it is that the Bill goes further on pornographic content. The amendment is simple. It seeks to make real, meaningful change as soon as is practically possible. The Bill is long delayed, and providers and users are desperate for clarity and positive change, which is what led us to tabling the amendment.

Kirsty Blackman

In the interests of not having to make a speech in this debate, I want to let the hon. Member know that I absolutely support the amendment. It is well balanced, brings the most important provisions into force as soon as possible, and allows the Secretary of State to appoint dates for the others.

Alex Davies-Jones

I welcome the hon. Member’s intervention, and I am grateful for her and her party’s support for this important amendment.

It is also worth drawing colleagues’ attention to the history of issues, which have been brought forward in this place before. We know there was reluctance on the part of Ministers when the Digital Economy Act 2017 was on the parliamentary agenda to commence the all-important part 3, which covered many of the provisions now in part 5. Ultimately, the empty promises made by the Minister’s former colleagues have led to huge, record failures, even though the industry is ready, having had years to prepare to implement the policy. I want to place on record my thanks to campaigning groups such as the Age Verification Providers Association and others, which have shown fierce commitment in getting us this far.

It might help if I cast colleagues’ minds back to the Digital Economy Act 2017, which received Royal Assent in April of that year. Following that, in November 2018, the then Minister of State for Digital and Creative Industries told the Science and Technology Committee that part 3 of the DEA would be in force “by Easter next year”. Then, in December 2018, both Houses of Parliament approved the necessary secondary legislation, the Online Pornography (Commercial Basis) Regulations 2018, and the required statutory guidance.

But shortly after, in April 2019, the first delay arose when the Government published an online press release stating that part 3 of the DEA would not come into force until 15 July 2019. However, June 2019 came around and still there was nothing. On 20 June, five days after it should have come into force, the then Under-Secretary of State told the House of Lords that the Government had failed to notify the European Commission of the statutory guidance, which would need to be done, and that that would result in a delay to the commencement of part 3

“in the region of six months”.—[Official Report, House of Lords, 20 June 2019; Vol. 798, c. 883.]

However, on 16 October 2019, the then Secretary of State announced via a written statement to Parliament that the Government

“will not be commencing part 3 of the Digital Economy Act 2017 concerning age verification for online pornography.”—[Official Report, 16 October 2019; Vol. 666, c. 17WS.]

A mere 13 days later, the Government called a snap general election. I am sure those are pretty staggering realities for the Minister to hear—and defend—but I am willing to listen to his defence. It really is not good enough. The industry is ready, the technology has been there for quite some time, and, given this Government’s fondness for a U-turn, there are concerns that part 5 of the Bill, which we have spent weeks deliberating, could be abandoned in a similar way as part 3 of the DEA was.

The Minister has failed to concede on any of the issues we have raised in Committee. It seems we are dealing with a Government who are ignoring the wide-ranging gaps and issues in the Bill. He has a relatively last-ditch opportunity to at least bring about some positive change, and to signify that he is willing to admit that the legislation as it stands is far from perfect. The provisions in part 5 are critical—they are probably the most important in the entire Bill—so I urge him to work with Labour to make sure they are put to good use in a more than reasonable timeframe.

Chris Philp

On the implementation of part 3 of the Digital Economy Act 2017, all the events that the shadow Minister outlined predated my time in the Department. In fact, apart from the last few weeks of the period she talked about, the events predated my time as a Minister in different Departments, and I cannot speak for the actions and words of Ministers prior to my arrival in DCMS. What I can say, and I have said in Committee, is that we are determined to get the Bill through Parliament and implemented as quickly as we can, particularly the bits to do with child safety and the priority illegal content duties.

The shadow Minister commented at the end of her speech that she thought the Government had been ignoring parliamentary opinion. I take slight issue with that, given that we published a draft Bill in May 2021 and went through a huge process of scrutiny, including by the Joint Committee of the Commons and the Lords. We accepted 66 of the Joint Committee’s recommendations, and made other very important changes to the Bill. We have made changes such as addressing fraudulent advertising, which was previously omitted, and including commercial pornography—meaning protecting children—which is critical in this area.

The Government have made a huge number of changes to the Bill since it was first drafted. Indeed, we have made further changes while the Bill has been before the Committee, including amending clause 35 to strengthen the fraudulent advertising duties on large search companies. Members of Parliament, such as the right hon. Member for East Ham (Sir Stephen Timms), raised that issue on Second Reading. We listened to what was said at that stage and we made the changes.

There have also been quite a few occasions during these Committee proceedings when I have signalled—sometimes subtly, sometimes less so—that there are areas where further changes might be forthcoming as the Bill proceeds through both Houses of Parliament. I do not think the hon. Member for Pontypridd, or any member of the Committee, should be in any doubt that the Government are very open to making changes to the Bill where we are able to and where they are right. We have done so already and we might do so again in the future.

On the specifics of the amendment, we share the intention to protect children from accessing pornography online as quickly as possible. The amendment seeks to set a three-month timeframe within which part 5 must come into force. However, an important consideration for the commencement of part 5 will be the need to ensure that all kinds of providers of online pornography are treated the same, including those hosting user-generated content, which are subject to the duties of part 3. If we take a piecemeal approach, bringing into force part 5, on commercial pornography, before part 3, on user-to-user pornography, that may enable some of the services, which are quite devious, to simply reconfigure their services to circumvent regulation or cease to be categorised as part 5 services and try to be categorised as part 3 services. We want to do this in a comprehensive way to ensure that no one will be able to wriggle out of the provisions in the Bill.

Parliament has also placed a requirement on Ofcom to produce, consult on and publish guidance for in-scope providers on meeting the duties in part 5. The three-month timescale set out in the amendment would be too quick to enable Ofcom to properly consult on that guidance. It is important that the guidance is right; if it is not, it may be legally challenged or turn out to be ineffective.

I understand the need to get this legislation implemented quickly. I understand the scepticism that flows from the long delays and eventual cancellation of part 3 of the Digital Economy Act 2017. I acknowledge that, and I understand where the sentiment comes from. However, I think we are in a different place today. The provisions in the Bill have been crafted to address some of the concerns that Members had about the previous DEA measures—not least the fact that they are more comprehensive, as they cover user-to-user, which the DEA did not. There is therefore a clear commitment to getting this done, and getting it done fast. However, we also have to get it done right, and I think the process we have set out does that.

The Ofcom road map is expected before the summer. I hope that will give further reassurance to the Committee and to Parliament about the speed with which these things can get implemented. I share Members’ sentiments about needing to get this done quickly, but I do not think it is practical or right to do it in the way set out in amendment 49.

Alex Davies-Jones

I am grateful for the Minister’s comments. However, I respectfully disagree, given the delays already since 2017. The industry is ready for this. The providers of the age verification services are ready for this. We believe that three months is an adequate timeframe, and it is vital that we get this done as quickly as possible. With that in mind, I will be pushing amendment 49 to a vote.

Question put, That the amendment be made.

Division 51

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

Clause 193 ordered to stand part of the Bill.
Clause 194
Short title
Question proposed, That the clause stand part of the Bill.
12:45
Chris Philp

This very important and concise clause sets out that the Bill, when passed, will be cited as the Online Safety Act 2022, which I hope is prophetic when it comes to the lightning speed of passage through the House of Lords.

Question put and agreed to.

Clause 194 accordingly ordered to stand part of the Bill.

New Clause 35

Offence under the Obscene Publications Act 1959: OFCOM defence

“(1) Section 2 of the Obscene Publications Act 1959 (prohibition of publication of obscene matter) is amended in accordance with subsections (2) and (3).

(2) After subsection (5) insert—

‘(5A) A person shall not be convicted of an offence against this section of the publication of an obscene article if the person proves that—

(a) at the time of the offence charged, the person was a member of OFCOM, employed or engaged by OFCOM, or assisting OFCOM in the exercise of any of their online safety functions (within the meaning of section 188 of the Online Safety Act 2022), and

(b) the person published the article for the purposes of OFCOM’s exercise of any of those functions.’

(3) In subsection (7)—

(a) the words after ‘In this section’ become paragraph (a), and

(b) at the end of that paragraph, insert ‘;

(b) “OFCOM” means the Office of Communications.’”—(Chris Philp.)

This new clause (to be inserted after clause 171) amends section 2 of the Obscene Publications Act 1959 to create a defence for OFCOM and their employees etc to the offence of the publication of an obscene article.

Brought up, read the First and Second time, and added to the Bill.

New Clause 42

Recovery of OFCOM’s initial costs

“Schedule (Recovery of OFCOM’s initial costs) makes provision about fees chargeable to providers of regulated services in connection with OFCOM’s recovery of costs incurred on preparations for the exercise of their online safety functions.”—(Chris Philp.)

This new clause introduces NS2.

Brought up, and read the First time.

Chris Philp

I beg to move, That the clause be read a Second time.

The Chair

With this it will be convenient to discuss Government new clause 43 and Government new schedule 2.

Chris Philp

New clause 42 introduces new schedule 2. New clause 43 provides that the additional fees charged to providers under new schedule 2 must be paid into the Consolidated Fund. We discussed that a few days ago. That is where the fees are currently destined, and I owe my right hon. Friend the Member for Basingstoke some commentary on this topic in due course. The Bill already provided that monetary penalties must be paid into the Consolidated Fund; those provisions are now placed in this new clause.

New schedule 2, which is quite detailed, makes provision in connection with Ofcom’s ability to recover its initial costs, which we have previously debated. As discussed, it is important not only that the taxpayer is protected from the ongoing costs, but that the set-up costs are recovered as well. The taxpayer should not have to pay for the regulatory framework; the people who are being regulated should pay, whether the costs are incurred before or after commencement, in line with the “polluter pays” principle. Deep in new schedule 2 is the answer to the question that the hon. Member for Aberdeen North asked a day or two ago about the period over which set-up costs can be recovered, with that period specified as between three and five years. I hope that provides an introduction to the new clauses and the new schedule.

Alex Davies-Jones

We welcome this grouping, which includes two new clauses and a new schedule. Labour has raised concerns about the future funding of Ofcom more widely, specifically when we discussed groupings on clause 42. The Minister’s response did little to alleviate our concerns about the future of Ofcom’s ability to raise funds to maintain its position as the regulator. Despite that, we welcome the grouping, particularly the provisions in the new schedule, which will require Ofcom to seek to recover the costs it has incurred when preparing to take on functions as the regulator of services under the Bill by charging fees to providers of services. This is an important step, which we see as being broadly in line with the kind of mechanisms already in place for other, similar regulatory regimes.

Ultimately, it is right that fees charged to providers under new schedule 2 must be paid into the Consolidated Fund, and it is important that Ofcom can recover its costs before a full fee structure and governance process is established. However, I have some questions for the Minister. How many people has Ofcom hired into roles, and can any of those costs count towards the calculation of fees? We want to ensure that other areas of regulation do not lose out as a consequence. Broadly speaking, though, we are happy to support the grouping and have not sought to table amendments at this stage.

Chris Philp

So far as I am aware, all the costs incurred by Ofcom in relation to the duties in the Bill can be recouped by way of fees. If that is not correct, I will write to the hon. Lady saying so, but my understanding is that any relevant Ofcom cost will be in the scope of the fees.

Question put and agreed to.

New clause 42 accordingly read a Second time, and added to the Bill.

New Clause 43

Payment of sums into the Consolidated Fund

“(1) Section 400 of the Communications Act (destination of penalties etc) is amended as follows.

(2) In subsection (1), after paragraph (i) insert—

‘(j) an amount paid to OFCOM in respect of a penalty imposed by them under Chapter 6 of Part 7 of the Online Safety Act 2022;

(k) an amount paid to OFCOM in respect of an additional fee charged under Schedule (Recovery of OFCOM’s initial costs) to the Online Safety Act 2022.’

(3) In subsection (2), after ‘applies’ insert ‘(except an amount mentioned in subsection (1)(j) or (k))’.

(4) After subsection (3) insert—

‘(3A) Where OFCOM receive an amount mentioned in subsection (1)(j) or (k), it must be paid into the Consolidated Fund of the United Kingdom.’

(5) In the heading, omit ‘licence’.”—(Chris Philp.)

This new clause provides that additional fees charged to providers under NS2 must be paid into the Consolidated Fund. The Bill already provided that monetary penalties must be paid into the Consolidated Fund, and those provisions are now placed in this clause.

Brought up, read the First and Second time, and added to the Bill.

New Clause 3

Establishment of Advocacy Body

“(1) There is to be a body corporate (‘the Advocacy Body’) to represent interests of child users of regulated services.

(2) A ‘child user’—

(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and

(b) includes both any existing child user and any future child user.

(3) The work of the Advocacy Body may include—

(a) representing the interests of child users;

(b) the protection and promotion of these interests;

(c) any other matter connected with those interests.

(4) The ‘interests of child users’ means the interest of children in relation to the discharge by any regulated company of its duties under this Act, including—

(a) safety duties about illegal content, in particular CSEA content;

(b) safety duties protecting children;

(c) ‘enforceable requirements’ relating to children.

(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.

(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.

(7) The Secretary of State may appoint an organisation known to represent children to be designated the functions under this Act, or may create an organisation to carry out the designated functions.”—(Barbara Keeley.)

This new clause creates a new advocacy body for child users of regulated internet services.

Brought up, and read the First time.

Barbara Keeley

I beg to move, That the clause be read a Second time.

New clause 3 would make provision for a statutory user advocacy body representing the interests of children. It would also allow the Secretary of State to appoint a new or existing body as the statutory user advocate. A strong, authoritative and well-resourced voice that can speak for children in regulatory debates would ensure that complex safeguarding issues are well understood, and would also actively inform the regulator’s decisions.

Charities have highlighted that the complaints and reporting mechanisms in the Bill may not always be appropriate for children. Ofcom’s own evidence shows that only 14% of 12 to 15-year-old children have ever reported content. Children who are most at risk of online harms may find it incredibly challenging to complete a multi-stage reporting and complaints process. Dame Rachel de Souza told the Committee:

“I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

A children’s advocacy body would be able to support children in navigating redress mechanisms that are fundamentally targeted at adults. Given how many children now use the internet, that is an essential element that is missing from the Bill. That is why the super-complaints mechanism needs to be strengthened with specific arrangements for children, as advocated by the National Society for the Prevention of Cruelty to Children and other children’s organisations. A statutory user advocacy body could support the regulator, as well as supporting child users. It would actively promote the interests of children in regulatory decision making and offer support by ensuring that an understanding of children’s behaviour and safeguarding is front and centre in its approach.

Kim Leadbeater (Batley and Spen) (Lab)

My hon. Friend is making a really valid point. As I look around the room—I mean this with no disrespect to anybody—I see that we are all of an age at which we do not understand the internet in the same way that children and young people do. Surely, one of the key purposes of the Bill is to make sure that children and young people are protected from harms online, and as the Children’s Commissioner said in her evidence, their voices have to be heard. I am sure that, like me, many Members present attend schools as part of their weekly constituency visits, and the conversations we have with young people are some of the most empowering and important parts of this job. We have to make sure that the voices of the young people who we all represent are heard in this important piece of legislation, and it is really important that we have an advocacy body to ensure that.

Barbara Keeley

I very much agree with my hon. Friend. She is quite right: we have to remember that we do not see these things as children and young people do.

The user advocacy body that my hon. Friend has just spoken in support of could also shine a light on the practices that are most harmful to children by using data, evidence and specialist expertise to point to new and emerging areas of harm. That would enable the regulator to ensure its risk profiles and regulatory approach remain valid and up to date. In his evidence, Andy Burrows of the NSPCC highlighted the importance of an advocacy body acting as an early warning system:

“Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

The provision in the new clause is comparable to those that already exist in many other sectors. For example, Citizens Advice is the statutory user advocate for consumers of energy and the postal services, and there are similar arrangements representing users of public transport. Establishing a children’s user advocacy body would ensure that the most vulnerable online users of all—children at risk of online sexual abuse—receive equivalent protections to customers of post offices or passengers on a bus.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

The hon. Lady will recall the issue that I raised earlier in the Committee’s deliberations, regarding the importance of victim support that gives people somewhere to go other than the platforms. I think that is what she is now alluding to. Does she not believe that the organisations that are already in place, with the right funding—perhaps from the fines coming from the platforms themselves—would be in a position to do this almost immediately, and that we should not have to set up yet another body, or have I misunderstood what she has said?

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I do not think that the right hon. Lady has misunderstood what I said. I said that the new clause would allow the Secretary of State to appoint a new or existing body as the statutory user advocate, so it could very much be either.

New clause 3 would also rebalance the interests of children against the vocal and well-resourced regulated companies. I think that is a key argument for having an advocacy body. Without such a counterbalance, large tech companies could attempt to capture independent expert voices, fund highly selective research with the intent to skew the evidence base, and then challenge regulatory decisions with the evidence base they have created.

Those tactics are not new; similar tactics are used in other regulated sectors, such as the tobacco industry. In line with other sectors, the user advocacy body should be funded by a levy on regulated companies. That would be in line with the “polluter pays” principle in part 6 and would be neutral to the Exchequer—another reason to accept it. Compared with the significant benefits and improved outcomes it would create, the levy would represent only a minimal additional burden on companies.

There is strong support for the creation of a user advocate. Research by the NSPCC shows that 88% of UK adults who responded to a YouGov survey think that it is necessary for the Bill to introduce a requirement for an independent body that can protect the interests of children at risk of online harms, including grooming and child sexual abuse.

It is also a popular option among children. YoungMinds has said that young people do not feel they are being included enough in the drafting of the Bill. It evidenced that with research it undertook that found that almost 80% of young people aged 11 to 25 surveyed had never even heard of the Bill.

A young woman told the NSPCC why she felt a children’s advocacy body is needed. She is a survivor of online grooming, and it is worth sharing what she said in full, because it is powerful and we have not shared the voices of young people enough. She said:

“When I was 13, a man in his 30s contacted me on Facebook. I added him because you just used to add anyone on Facebook. He started messaging me and I liked the attention. We’d speak every day, usually late at night for hours at a time…He started asking for photos, so I sent some. Then he asked for some explicit photos, so I did that too, and he reciprocated…In my eyes, telling anyone in my life about this man was not an option. We need to stop putting the responsibility on a vulnerable child to prevent crime and start living in a world which puts keeping children safe first. That means putting child safety at the heart of policy. I want a statutory child user advocacy body funded by the industry levy. This would play a vital role in advocating for children’s rights in regulatory debates. Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side. Having a body stand up for the rights of children in such a vulnerable position is invaluable…it is so rare that voices like mine have a chance to be heard by policy makers. Watching pre legislative debates I’ve been struck by how detached from my lived experience they can be”—

that is very much the point that my hon. Friend the Member for Batley and Spen made—

“and indeed the lived experiences of thousands of others. If we want to protect children, we need to understand and represent what they need.”

I hope that the Committee will recognise the bravery of that young woman in speaking about her experiences as a survivor of online grooming. I hope that the Minister will respect the insights she offers and consider the merits of having a user advocacy body to support children and young people experiencing harms online.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I read new clause 3 in conjunction with the starred new clause 44, because it makes sense to consider the funding of the advocacy body, and the benefits of that funding, when discussing the merits of such a body. Part of that is because the funding of the advocacy body, and the fact that it needs to be funded, is key to its operation, and a key reason why we need it.

13:00
I have talked at length about how stretched charitable and third sector organisations are just now, about how tight their budgets are and about how they are having to make decisions on what they do or do not pursue. They do not have enough money to pursue everything, so they can do only the most important things. I accept that we have a super-complaints procedure, and I am glad about that, but it is not funded—it does not have the funding that we would hope an advocacy body would have—so it is lacking, and charitable organisations will not necessarily be able to raise all their concerns because they may not have the money, time or resources to go through that procedure.
The advocacy body would be key to doing two things. One, as was mentioned by the right hon. Member for Basingstoke, is providing voices and evidence from victims about what has happened and what needs to change for it not to happen again. The other is looking at emerging threats, thereby protecting not just victims but potential victims. As we begin to see threats emerge on the internet that we have not yet considered, the advocacy body would be the best-placed organisation to highlight them to Ofcom.
I will gently push back on the point made by the hon. Member for Batley and Spen. Having been online for 28 years—since I was eight—I feel like I have a pretty good grasp of the internet and one that is not dissimilar to my children’s. More than 20 years ago, I was on forums and MSN Messenger having conversations with guys 20 years older than me, so I have lived experience of this, but I agree with the hon. Member for Worsley and Eccles South that the voices and experiences of children and young people are not central enough, given how important the Bill is for their protection.
I am well aware that I am of the first generation with that experience. In fact, not many people of my age have been on the internet for quite as long—my dad was a very early adopter and bought a modem—and I was given quite a free rein online, which I do not recommend people give their children. I imagine that the majority of people scrutinising and making decisions in Ofcom will not have my experience of seeing and accessing the internet as children. Empathy is all well and good, but that is different from the amplification of voices that a user advocacy body could bring.
We have heard from a number of organisations, including those that have provided oral and written evidence. Apart from the NSPCC and others that we have mentioned, one good organisation that does regular work in this area is Girlguiding, which does an annual survey about girls’ experiences of the internet, and it makes for bleak reading. It shows that young women and girls have overwhelmingly had negative experiences online. However, that must be balanced against how, during the covid lockdowns, when we were all isolated from our friends, our peer groups and the people we normally spend time with, a huge number of those women and girls—particularly those in the most marginalised groups such as LGBTQI—found solace and important community on the internet.
That hammers home how important it is for us to make the internet a safe place, because people need to be able to find that community online. People, particularly young people, need to be able to access friends, groups, advice and assistance, and all the good things that we have online, even games. Games are a whole load of fun, and I have no problem with children and young people playing them. In fact, I think they should be encouraged to play some of the excellent games available online, but we need to be able to keep them safe.
No matter how many staff Ofcom employs to deal with the provisions in the Online Safety Bill, I do not believe it can possibly have the expertise that an advocacy body, specifically one advocating on behalf of child users, could bring to the table in performing scrutiny, working with other organisations, and undertaking risk assessments and child safety duties. If Ofcom is to rely on charities and third sector organisations to do this, it needs to provide funding to them. If it is just going to say, “It’s fine, because we’ll just listen to the NSPCC or Girlguiding”—or to any of the other organisations bringing concerns forward—there may be a gap without funding for those organisations. Ofcom cannot rely on third sector organisations or place that responsibility on them, because this issue is too important.
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

The hon. Lady is making some excellent points. I wholeheartedly agree with her about funding for bodies that might be able to support the advocacy body or act as part of it. She makes a really important point, which we have not focused on enough during the debate, about the positive aspects of the internet. It is very easy to get bogged down in all the negative stuff, which a lot of the Bill focuses on, but she is right that the internet provides a safe space, particularly for young people, to seek out their own identity. Does she agree that the new clause is important because it specifically refers to protected characteristics and to the Equality Act 2010? I am not sure where else that appears in the Bill, but it is important that it should be there. We are thinking not just about age, but about gender, disability and sexual orientation, which is why this new clause could be really important.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I absolutely agree. I had not thought about it in those terms, but the hon. Member is right that the new clause gives greater importance to those protected characteristics and lays that out in the Bill.

I appreciate that, under the risk assessment duties set out in the Bill, organisations have to look at protected characteristics in groups and at individuals with those protected characteristics, which I welcome, but I also welcome the inclusion of protected characteristics in the new clause in relation to the duties of the advocacy body. I think that is really important, especially, as the hon. Member for Batley and Spen just said, in relation to the positive aspects of the internet. It is about protecting free speech for children and young people and enabling them to find community and enjoy life online and offline.

Will the Minister give serious consideration to the possibility of a user advocacy body? Third sector organisations are calling for that, and I do not think Ofcom could possibly have the expertise to match such a body.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I want briefly to interject to underline the point I made in my intervention on the hon. Member for Worsley and Eccles South. I welcome the discussion about victims’ support, which picks up on what we discussed on clause 110. At that point I mentioned the NSPCC evidence that talked about the importance of third party advocacy services, due to the lack of trust in the platforms, as well as for some of the other reasons that the hon. Members for Worsley and Eccles South, for Batley and Spen, and for Aberdeen North have raised.

When we discussed clause 110, the Minister undertook to think about the issue seriously and to talk to the Treasury about whether funding could be taken directly from fines rather than those all going into the Treasury coffers. I hope the debate on new clause 3 will serve to strengthen his resolve, given the strength of support for such a measure, whether that is through a formal user advocacy service or by using existing organisations. I hope he uses the debate to strengthen his arguments about such a measure with the Treasury.

I will not support the new clause tabled by the hon. Member for Worsley and Eccles South, because I think the Minister has already undertaken to look at this issue. As I say, I hope this discussion strengthens his resolve to do so.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me start by stating the fact that this Bill, as drafted, rightly has incredibly strong protections for children. The children’s safety duties that we have already debated are extremely strong. They apply to any platform with significant numbers of children using it and they impose a duty on such companies to protect children from harm. The priority illegal safety duties are listed in schedule 6, on child sexual exploitation and abuse offences—they have their very own schedule because we attach such importance to them. Committee members should be in no doubt that protecting children is at the very heart of the Bill. I hope that has been obvious from the debates we have had.

On children’s ability to raise complaints and seek redress under the Bill, it is worth reminding ourselves of a couple of clauses that we have debated previously, through which we are trying to make sure it is as easy as possible for children to report problematic content or to raise complaints. Members will recall that we debated clause 17. Clause 17(6)(c) allows for

“a parent of, or other adult with responsibility for, a child”

to raise content-reporting claims with providers, so that children are not left on their own. We have also been clear under the complaints procedures set out in clause 18(2)(c) that those procedures must be

“easy to access, easy to use (including by children)”.

That is an explicit reference to accessibility for children.

The hon. Member for Aberdeen North has also already referred to the fact that in both the children’s risk assessment duties and the adults’ risk assessment duties people’s characteristics, including whether they are a member of a particular group, have to be taken into account. The children’s risk assessment duties are set out in clause 10(6)(d). Children with particular characteristics—orientation, race and so on—have to be particularly considered. The fact that a clause on the children’s risk assessment duties even exists in the first place shows that specific and special consideration has to be given to children and the risks they face. That is hardwired right into the architecture of the Bill.

All the provisions that I have just mentioned—starting with clause 10 on children’s risk assessment duties, right through to the end of the Bill and the priority offences in schedule 6, on child sexual exploitation and abuse offences—show that, right throughout the whole Bill, the protection of children is integral to what we are trying to do with the Bill.

On the consultation that happened in forming and framing the Bill, really extensive engagement and consultation took place throughout the preparation of this piece of legislation, including direct consultation with children themselves, their parents and the many advocacy groups for children. There should be no doubt at all that children have been thoroughly consulted as the Bill has been prepared.

On the specifics of new clause 3, which relate to advocacy for children, as the hon. Member for Aberdeen North referred to in passing a moment ago, there is a mechanism in clause 140 for organisations that represent particular groups, such as children, to raise super-complaints with Ofcom when there is a problem. In fact, when we debated that clause, I used children as an example when I spoke about the “eligible entities” that can raise super-complaints—I used the NSPCC speaking for children as a specific example of the organisations I would expect the term “eligible entity” to include. Clause 140 explicitly empowers organisations such as the NSPCC and others to speak for children.

13:15
Having a statutory organisation to speak for children is such a good idea that it has been done already. In fairness, I should say that it was done by the Labour Government. The Children Act 2004 established the Children’s Commissioner for England, with equivalents for Wales, Scotland and Northern Ireland. The 2004 Act states, right at the start, that the function of the Children’s Commissioner is:
“promoting awareness of the views and interests of children in England.”
So the very first function of the Children’s Commissioner, as stated in the Act, is to act as an advocate for children.
The Act goes on to state that the Children’s Commissioner may encourage persons
“exercising functions or engaged in activities affecting children…to take account of their views and interests”.
Ofcom, in exercising its regulatory functions, is clearly doing precisely what I have just read out. We therefore have a statutory advocacy organisation for children: it is the Children’s Commissioner and it is doing exactly what the hon. Member for Worsley and Eccles South is calling for.
This is a good moment to pay tribute to the current Children’s Commissioner, Dame Rachel de Souza, who gave evidence to the Committee before the Whitsun recess. Dame Rachel is extremely active, energetic and effective at advocating for children in general, but she is particularly active and effective at advocating for children in the digital sphere. I am sure the whole Committee will want to put on record its thanks to our existing statutory advocate, Dame Rachel, who is doing such a good job in that area.
I hope those comments make it clear that we already have a statutory advocate: the Children’s Commissioner. Clause 140 contains facilities for other organisations besides our existing statutory advocate to formally and legally raise with Ofcom issues that may arise. Ofcom is bound to reply—it is not optional. Ofcom has to listen to complaints and it has to respond.
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I agree wholeheartedly about the importance of the role of the Children’s Commissioner and she does a fantastic job, but is it not testament to the fact that there is a need for this advocacy body that she is advocating for it and thinks it is a really good idea? The Children Act 2004 is a fantastic Act, but that was nearly 20 years ago and the world has changed significantly since then. The Bill shows that. The fact that she is advocating for it may suggest that she sees the need for a separate entity.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

There is a danger in creating statutory bodies with overlapping responsibilities. I just read out the current statutory functions of the Children’s Commissioner under the 2004 Act. If we were to agree to the new clause, we would basically be creating a second statutory advocate or body with duties that are the same as some of those that the Children’s Commissioner already exercises. I read from section 2 of the Act, where those duties are set out. I do not think that having two people with conflicting or competing duties would be particularly helpful.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful to the Minister for his support for Labour legislation. Does he acknowledge that we have different Children’s Commissioners across the nations of the UK? Each would have the same right to advocate for children, so we would have four bodies rather than the single body focusing on this specific issue that the Children’s Commissioners across the UK are advocating for.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I do not have in front of me the relevant devolved legislation—I have only the Children Act 2004 directly in front of me—but I assume it is broadly similar. The hon. Member for Aberdeen North can correct me if I am wrong, but I assume it is probably broadly similar in the way—[Interruption.] She is not sure, so I do not feel too bad about not being sure either. I imagine it is similar. I am not sure that having similar statutory bodies with the same function—we would create another with the new clause—is necessarily helpful.

The Bill sets out formal processes that allow other organisations, such as the NSPCC, to raise complaints that have to be dealt with. That ensures that the voices of groups—including children, but not just children—will be heard. I suspect that if we have a children’s advocacy body, other groups will want one as well and might feel that they have been overlooked by omission.

The good thing about the way the super-complaint structure in clause 140 works is that it does not prescribe what the groups are. Although I am sure that children will be top of the list, there will be other groups that want to advocate and to be able to bring super-complaints. I imagine that women’s groups will be on that list, along with groups advocating for minorities and people with various sexual orientations. Clause 140 is not exclusive; it allows all these groups to have a voice that must be heard. That is why it is so effective.

My right hon. Friend the Member for Basingstoke and the hon. Member for Batley and Spen asked whether the groups have enough resources to advocate on issues under the super-complaint process. That is a fair question. The allocation of funding to different groups tends to be done via the spending review process. Colleagues in other Departments—the Department for Education or, in the case of victims, the Ministry of Justice—allocate quite a lot of money to third-sector groups. The victims budget was approximately £200 million a year or two ago, and I am told it has risen to £300 million for the current financial year. That is the sort of funding that can find its way into the hands of the organisations that advocate for particular groups of victims. My right hon. Friend asked whether the proceeds of fines could be applied to fund such work, and I have undertaken to raise that with the Treasury.

We already have a statutory advocate for children: the four Children’s Commissioners for the four parts of the United Kingdom. We have the super-complaints process, which covers more than children’s groups, crucial though they are. We have given Ofcom statutory duties to consult when developing its codes of practice, and we have money flowing via the Ministry of Justice, the DFE and others into advocacy groups. Although we agree with the intention behind new clause 3, we believe its objectives are very well covered via the mechanisms that I have just set out at some length.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

There have not been all that many times during the debate on the Bill when the Minister has so spectacularly missed the point as he has on this section. I understand everything he said about provisions already being in place to protect children and the provisions regarding the super-complaints, but the new clause is not intended to be a replacement for the super-complaints procedure, which we all support—in fact, we have tried to strengthen that procedure. The new clause is intended to be an addition—another, very important layer.

Unfortunately, I do not have at the front of my mind the legislation that set up the Children’s Commissioner for Scotland, or the one for England. The Minister talked through some of the provisions and phrasing in the Children Act 2004. He said that the role of the Children’s Commissioner for England is to encourage bodies to act positively on behalf of children—to encourage. There is no requirement for the body to act in the way the Children’s Commissioner says it should act. Changes have been made in Wales establishing the Future Generations Commissioner, who has far more power.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As far as I can tell, the user advocacy body proposed in new clause 3 would not have the ability to compel Ofcom either.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

But it would be a statutory consultee that is specifically mentioned in this provision. I cannot find in the Bill a provision giving Ofcom a statutory duty to consult the four Children’s Commissioners. The new clause would make the children’s advocacy body a statutory consultee in decisions that affect children.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The Bill will require Ofcom to consult people who represent the interests of children. Although they are not named, it would be astonishing if the four Children’s Commissioners were not the first people on that list when Ofcom develops the relevant codes of practice. The statutory obligation to consult those groups when developing codes of practice and, indeed, guidance is set out in clauses 37(6)(d) and 69(3)(d).

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That is very helpful, but there are still shortcomings in what the Minister says. The Bill, as drafted, requires Ofcom to require things of other organisations. Some of the detail is in the Bill, some of the detail will come in secondary legislation and some of the detail will come in the codes of practice published by Ofcom. We broadly agree that the Bill will ensure people are safer on the internet than they currently are, but we do not have all the detail on the Government’s intent. We would like more detail on some things, but we are not saying, “We need every little bit of detail.” If we did, the Bill would not be future-proof. We would not be able to change and update the Bill if we required everything to be in the Bill.

The Bill is not a one-off; it will continually change and grow. Having a user advocacy body would mean that emerging threats can quickly be brought to Ofcom’s attention. Unlike the Children’s Commissioners, who have a hundred other things to do, the entire purpose of this body would be to advocate on behalf of children online. The Children’s Commissioners do an amazing job, but this is not their No. 1 priority. If the Minister wants this to be a world-leading Bill, its No. 1 priority should be to protect the human rights of children.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I think the hon. Lady is being a little unfair to the Children’s Commissioners. Dame Rachel de Souza is doing a fantastic job of advocating specifically in the digital sphere. She really is doing a fantastic job, and I say that as a Minister. I would not say she is leaving any gaps.

These digital children’s safety issues link to wider children’s safety issues that exist offline, such as sexual exploitation, grooming and so on, so it is useful that the same person advocates for children in both the offline and online worlds.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The new clause asks for an additional body. It is not saying the Children’s Commissioners should be done away with. The Children’s Commissioners do an amazing job, as we have recognised, but the No. 1 priority, certainly for the Children’s Commissioner in Scotland, is to protect the human rights of children; it is not to protect children online, which is what the user advocacy body would do. The body would give the benefit of its experience and would use its resources, time and energy specifically to advocate between Ofcom, children, and children’s organisations and groups.

The Minister is right that the Bill takes massive steps forward in protecting children online, and he is right that the Children’s Commissioners do a very good job. The work done by the Children’s Commissioners in giving us evidence on behalf of children and children’s organisations has been incredibly powerful and incredibly helpful, but there is still a layer missing. If this Bill is to be future-proof, if it is to work and if it is not to put an undue burden on charitable organisations, we need a user advocacy body. The Minister needs to consider that.

I appreciate that the Government provide money to victim support organisations, which is great, but I am also making a case about potential victims. If the money only goes to those who support people who have already been harmed, it will not allow them to advocate to ensure that more people are not harmed. It will allow them to advocate on behalf of those who have been harmed—absolutely—but it will not effectively tackle potential and emerging harms. It is a key place where the Bill misses out. I am quite disappointed that the Minister has not recognised that something may be lacking and is so keen to defend his position, because it seems to me that the position of the Opposition is so obviously the right one.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I wholeheartedly agree with what the hon. Member for Aberdeen North just said, but I wish to emphasise some elements because it seems to me that the Minister was not listening, although he has listened to much that has been said. I made some specific points, used quotes and brought forward some evidence. He feels that children have been consulted in the drafting of the Bill; I cited a YoungMinds survey that showed that that was very much not what young people feel. YoungMinds surveyed a large group of young people and a very large proportion of them had not even heard of the Bill.

The evidence of the young survivor of online grooming was very powerful. She very much wanted a user-advocacy body and spoke strongly about that. The Minister is getting it wrong if he thinks that somebody in that situation, who has been groomed, would go to a parent. The quote that I cited earlier was:

“Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side.”

There were clearly adults in her life she could have gone to, but she did not because she was in that vulnerable position—a position of weakness. That is why some kind of independent advocacy body for children is so important.

I do not think children and young people do feel consulted about the Bill because the organisations and charities are telling us that. I join all Opposition Members in supporting and paying tribute to the remarkable job that the Children’s Commissioner does. I quoted her setting out her worries about the Bill. I quoted her saying that

“the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

That is what she said. She did not say, “I’m the person charged with doing this. I’m the person who has the resource and my office has the resource.”

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I hope that I did not in any way confuse the debate earlier, because these two things are very separate. A user-advocacy service and individual victim support are two separate issues. The Minister has already taken up the issue of victim support, which is what the Children’s Commissioner was talking about, but that is separate from advocacy, which is much broader and not necessarily related to an individual problem.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Indeed, but the Children’s Commissioner was very clear about certain elements being missing in the Bill, as are the NSPCC and other organisations. It is just not right for the Minister to land it back with the Children’s Commissioner as part of her role, because she has to do so many other things. The provisions in the Bill in respect of a parent or adult assisting a young person in a grooming situation are a very big concern. The Children’s Commissioner cited her own survey of 2,000 children, a large proportion of whom had not succeeded in getting content about themselves removed. From that, we see that she understands that the problem exists. We will push the new clause to a Division.

Question put, That the clause be read a Second time.

Division 52

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

Ordered, That further consideration be now adjourned. —(Steve Double.)
13:35
Adjourned till Tuesday 28 June at twenty-five minutes past Nine o’clock.
Written evidence reported to the House
OSB87 Glassdoor, Inc
OSB88 Open Rights Group

Online Safety Bill (Sixteenth sitting)

Committee stage
Tuesday 28th June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 28 June 2022 - (28 Jun 2022)
The Committee consisted of the following Members:
Chairs: † Sir Roger Gale, Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 28 June 2022
(Morning)
[Sir Roger Gale in the Chair]
Online Safety Bill
09:25
None Portrait The Chair
- Hansard -

Good morning, ladies and gentlemen. Please be kind enough to make sure that your mobile phones are switched off.

New Clause 4

Duty to disclose information to OFCOM

“(1) This section sets out the duties to disclose information to OFCOM which apply in relation to all regulated user-to-user services.

(2) A regulated user-to-user service must disclose to OFCOM anything relating to that service of which that regulator would reasonably expect notice.

(3) This includes —

(a) any significant changes to its products or services which may impact upon its performance of its safety duties;

(b) any significant changes to its moderation arrangements which may impact upon its performance of its safety duties;

(c) any significant breaches in respect of its safety duties.”—(Barbara Keeley.)

This new clause creates a duty to disclose information to Ofcom.

Brought up, and read the First time.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

Good morning, Sir Roger. The new clause would require regulated companies to disclose proactively to the regulator material changes in its operations that may impact on safety, and any significant breaches as a result of its safety duties. Category 1 services should be under regulatory duties to disclose proactively to the regulator matters about which it could reasonably expect to be informed. For example, companies should notify Ofcom about significant changes to their products and services, or to their moderation arrangements, that may impact on the child abuse threat and the company’s response to it. A similar proactive duty already applies in the financial services sector. The Financial Conduct Authority handbook states:

“A firm must deal with its regulators in an open and cooperative way, and must disclose to the FCA appropriately anything relating to the firm of which that regulator would reasonably expect notice.”

The scope of the duty we are suggesting could be drawn with sufficient clarity so that social media firms properly understand their requirements and companies do not face unmanageable reporting burdens. Such companies should also be subject to red flag disclosure requirements, whereby they would be required to notify the regulator of any significant lapses in, or changes to, systems and processes that compromise children’s safety or could put them at risk. For example, if regulation had been in place over the last 12 months, Facebook might reasonably have been expected to report on the technology and staffing issues to which it attributes its reduced detection of child abuse content.

Experience from the financial services sector demonstrates the importance of disclosure duties as a means of regulatory intelligence gathering. Perhaps more importantly, they provide a useful means of hard-wiring regulatory compliance into company decisions on the design and operation of their sites.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

Thank you for chairing this meeting, Sir Roger. I have a quick question for the Minister that relates to the new clause, which is a reasonable request for a duty on providers to disclose information to Ofcom. We would hope that the regulator had access to that information, and if companies are making significant changes, it is completely reasonable that they should have to tell Ofcom.

I do not have any queries or problems with the new clause; it is good. My question for the Minister is—I am not trying to catch anyone out; I genuinely do not know the answer—if a company makes significant changes to something that might impact on its safety duties, does it have to do a new risk assessment at that point, or does it not have to do so until the next round of risk assessments? I do not know the answer, but it would be good if the direction of travel was that any company making drastic changes that massively affected security—for example, Snapchat turning on the geolocation feature when it did an update—would have to do a new risk assessment at that point, given that significant changes would potentially negatively impact on users’ safety and increase the risk of harm on the platform.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

It is a pleasure, as always, to serve under your chairmanship, Sir Roger. As the hon. Member for Worsley and Eccles South said, the new clause is designed to introduce a duty on providers to notify Ofcom of anything that Ofcom could reasonably be expected to be notified of.

The Bill already has extremely strong information disclosure provisions. I particularly draw the Committee’s attention to clause 85, which sets out Ofcom’s power to require information by issuing an information notice. If Ofcom issues an information notice—the particulars of which are set out in clause 86—the company has to abide by that request. As the Committee will recall, the strongest sanctions are reserved for the information duties, extending not only to fines of up to 10% of qualifying worldwide revenue and to service discontinuation—unplugging the website, as it were—but also to personal criminal liability for named executives, with prison sentences of up to two years. We take those information duties extremely seriously, which is why the sanctions are as strong as they are.

The hon. Member for Aberdeen North asked what updates would occur if there were a significant design change. I draw the Committee’s attention to clause 10, which deals with children’s risk assessment duties, but there are similar duties in relation to illegal content and the safety of adults. The duty set out in clause 10(2), which cross-refers to schedule 3, makes it clear. The relevant words are “suitable and sufficient”. Clearly if there were a massive design change that would, in this case, adversely affect children, the risk assessment would not be suitable and sufficient if it were not updated to reflect that design change. I hope that answers the hon. Lady’s question.

Turning to the particulars of the new clause, if we incentivise companies to disclose information they have not been asked for by Ofcom, there is a danger that they might, through an excessive desire to comply, over-disclose and provide a torrent of information that would not be very helpful. There might also be a risk that some companies that are not well intentioned would deliberately dump enormous quantities of data in order to hide things within it. The shadow Minister, the hon. Member for Worsley and Eccles South, mentioned an example from the world of financial services, but the number of companies potentially within the scope of the Bill is so much larger than even the financial services sector. Some 25,000 companies may be in scope, a number that is much larger—probably by one order of magnitude, and possibly by two—than the financial services sector regulated by the FCA. That disparity in scale makes a significant difference.

Given that there are already strong information provision requirements in the Bill, particularly clause 85, and because of the reasons of scale that I have mentioned, I will respectfully resist the new clause.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

We believe that the platforms need to be proactive about disclosure, and that this is a reasonable clause, so we will push it to a vote.

Question put, That the clause be read a Second time.

Division 53

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 5
Duty to distinguish paid-for advertisements
“(1) A provider of a Category 2A service must operate the service using systems and processes designed to clearly distinguish to users of that service paid-for advertisements from all other content appearing in or via search results of the service.
(2) The systems and processes described under subsection (1)—
(a) must include clearly displaying the words “paid-for advertisement” next to any paid-for advertisement appearing in or via search results of the service, and
(b) may include measures such as but not limited to the application of colour schemes to paid-for advertisements appearing in or via search results of the service.
(3) The reference to paid-for advertisements appearing “in or via search results of a search service” does not include a reference to any advertisements appearing as a result of any subsequent interaction by a user with an internet service other than the search service.
(4) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.
(5) The duties set out in this section extend to the design, operation and use of a Category 2A service that hosts paid-for advertisements targeted at users of that service in the United Kingdom.
(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).
(7) For the meaning of “paid-for advertisement”, see section 189 (interpretation: general).”—(Alex Davies-Jones.)
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 54

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 6
Duty to verify advertisements
“(1) A provider of a Category 2A service must operate an advertisement verification process for any relevant advertisement appearing in or via search results of the service.
(2) In this section, “relevant advertisement” means any advertisement for a service or product to be designated in regulations made by the Secretary of State.
(3) The verification process under subsection (1) must include a requirement for advertisers to demonstrate that they are authorised by a UK regulatory body.
(4) In this section, “UK regulatory body” means a UK regulator responsible for the regulation of a particular service or product to be designated in regulations made by the Secretary of State.
(5) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.
(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).
(7) Regulations under this section shall be made by statutory instrument.
(8) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before and approved by resolution of each House of Parliament.”—(Alex Davies-Jones.)
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 55

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 7
Report on duties to protect content of democratic importance and journalistic content
“(1) The Secretary of State must publish a report which—
(a) reviews the extent to which Category 1 services have fulfilled their duties under—
(i) Clause 15; and
(ii) Clause 16;
(b) analyses the effectiveness of Clauses 15 and 16 in protecting against—
(i) foreign state actors;
(ii) extremist groups and individuals; and
(iii) sources of misinformation and disinformation.
(2) The report must be laid before Parliament within one year of this Act being passed.”—(Alex Davies-Jones.)
This new clause would require the Secretary of State to publish a report reviewing the effectiveness of Clauses 15 and 16.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 56

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 8
OFCOM’s guidance about user identity verification
“(1) OFCOM must produce guidance for providers of Category 1 services on how to comply with the duty set out in section 57(1).
(2) In producing the guidance (including revised or replacement guidance), OFCOM must have regard to—
(a) ensuring providers offer forms of identity verification which are likely to be accessible to vulnerable adult users and users with protected Characteristics under the Equality Act 2010,
(b) promoting competition, user choice, and interoperability in the provision of identity verification,
(c) protection of rights, including rights to privacy, freedom of expression, safety, access to information, and the rights of children,
(d) alignment with other relevant guidance and regulation, including with regards to Age Assurance and Age Verification.
(3) In producing the guidance (including revised or replacement guidance), OFCOM must set minimum standards for the forms of identity verification which Category services must offer, addressing—
(a) effectiveness,
(b) privacy and security,
(c) accessibility,
(d) time-frames for disclosure to Law Enforcement in case of criminal investigations,
(e) transparency for the purposes of research and independent auditing,
(f) user appeal and redress mechanisms.
(4) Before producing the guidance (including revised or replacement guidance), OFCOM must consult—
(a) the Information Commissioner,
(b) the Digital Markets Unit,
(c) persons whom OFCOM consider to have technological expertise relevant to the duty set out in section 57(1),
(d) persons who appear to OFCOM to represent the interests of users including vulnerable adult users of Category 1 services, and
(e) such other persons as OFCOM considers appropriate.
(5) OFCOM must publish the guidance (and any revised or replacement guidance).”—(Alex Davies-Jones.)
This new clause would require Ofcom to set a framework of principles and minimum standards for the User Verification Duty.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 57

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 9
Risk assessments: submission to OFCOM and publication
“Whenever a Category 1 service carries out any risk assessment pursuant to Part 3 of this Act, the service must—
(a) submit the risk assessment to OFCOM; and
(b) publish the risk assessment on the service’s website.”—(Barbara Keeley.)
This new clause requires any risk assessment carried out by a Category 1 service under Part 3 to be submitted to Ofcom and published.
Brought up, and read the First time.
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

Throughout these debates it has been clear that we agree on both sides that the Online Safety Bill must be a regime that promotes the highest levels of transparency. This will ensure that platforms can be held accountable for their systems and processes. Like other regulated industries, they must be open and honest with the regulator and the public about how their products work and how they keep users safe.

As we know, platforms duck and dive to avoid sharing information that could make life more difficult for them or cast them in a dim light. The Bill must give them no opportunity to shirk their responsibilities. The Bill enables the largest platforms to carry out a risk assessment safe in the knowledge that it may never see the light of day. Ofcom can access such information if it wants, but only following a lengthy process and as part of an investigation. This creates no incentive for platforms to carry out thorough and proper risk assessments. Instead, platforms should have to submit these risk assessments to Ofcom not only on request but as a matter of course. Limiting this requirement to only the largest platforms will not overload Ofcom, but will give it the tools and information it needs to oversee an effective regime.

In addition, the public have a right to know the risk profile of the services they use. This happens in all other regulated industries, with consumers having easy access to the information they need to make informed decisions about the products they use. At present, the Bill does not give users the information they deserve about what to expect online. Parents in particular will be empowered by information about the risk level of platforms their children use. Therefore, it is imperative that risk assessments are made publicly available, as well as submitted to the regulator as a matter of course.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a couple of comments on the point about parental empowerment. I have been asked by my children for numerous apps. I have a look at them and think, “I don’t know anything about this app. I have never seen or heard of it before, and I have no idea the level of user-to-user functionality in this app.” Nowhere is there a requirement for this information to be set out. There is nowhere that parents can easily find this information.

With iPhones, if a kid wants an app, they have to request it from their parent and their parent needs to approve whether or not they get it. I find myself baffled by some of them because they are not ones that I have ever heard of or come across. To find out whether they have that level of functionality, I have to download and use the app myself in the way that, hopefully, my children would use it in order to find out whether it is safe for them.

A requirement for category 1 providers to be up front and explain the risks and how they manage them, and even how people interact with their services, would increase the ability of parents to be media literate. We can be as media literate as we like, but if the information is not there and we cannot find it anywhere, we end up having to make incredibly restrictive decisions in relation to our children’s ability to use the internet, which we do not necessarily want to make. We want them to be able to have fun, and the information being there would be very helpful, so I completely agree on that point.

My other point is about proportionality. The Opposition moved new clause 4, relating to risk assessments, and I did not feel able to support it on the basis of the arguments that the Minister made about proportionality. He made the case that Ofcom would receive 25,000 risk assessments and would be swamped by the number that it might receive. This new clause balances that, and has the transparency that is needed.

It is completely reasonable for us to put the higher burden of transparency on category 1 providers and not on other providers because they attract the largest market share. A huge percentage of the risk that might happen online happens with category 1 providers, so I am completely happy to support this new clause, which strikes the right balance. It answers the Minister’s concerns about Ofcom being swamped, because only category 1 providers are affected. Asking those providers to put the risk assessment on their site is the right thing to do. It will mean that there is far more transparency and that people are better able to make informed decisions.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I understand the intention behind the new clause, but I want to draw the Committee’s attention to existing measures in the Bill that address this matter. I will start with the point raised by the hon. Member for Aberdeen North, who said that as a parent she would like to be able to see a helpful summary of what the risks are prior to her children using a new app. I am happy to say to her that that is already facilitated via clause 13(2), which appears at the top of page 13. There is a duty there

“to summarise in the terms of service the findings of the most recent adults’ risk assessment of a service”,

including the levels of risk, and the nature and severity of those risks. That relates specifically to adults, but there is an equivalent provision relating to children as well.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I just gently say that if there is a requirement for people to sign up or begin to go through the sign-up process in order to see the terms of service, that is not as open and transparent. That is much more obstructive than it could be. A requirement for providers to make their terms of service accessible to any user, whether or not they are registered, would assist transparency.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I think the terms of service are generally available to be viewed by anyone. I do not think people have to be registered users to view the terms of service.

In addition to the duty to summarise the findings of the most recent risk assessment in relation to adults in clause 13(2), clause 11 contains obligations to specify in the terms of service, in relation to children, where children might be exposed to risks using that service. I suggest that a summary in the terms of service, which is an easy place to look, is the best way for parents or anybody else to understand what the risks are, rather than having to wade through a full risk assessment. Obviously, the documents have not been written yet, because the Bill has not been passed, but I imagine they would be quite long and possibly difficult to digest for a layperson, whereas a summary is more readily digestible. Therefore, I think the hon. Lady’s request as a parent is met by the duties set out in clause 11, and the duties for adults are set out in clause 13.

09:44
On transparency and disclosure more generally, beyond the summaries that will be published, I would point to the transparency duties in clause 64, which we have discussed previously. Ofcom must specify what it requires to be published publicly and the platforms will then have to comply with that. That is a good mechanism for Ofcom to force publication of what it thinks needs to be brought into the light of day to meet the wider public interest, and the interests of users and parents. I hope that I have set out how, in clauses 11, 13 and 64, the transparency and disclosure obligations are met. In addition, clause 136 will require Ofcom to produce a report about providing researchers with access to information, which is important.
So what are the issues with the new clause? First, for the reasons that I have set out, the Bill already addresses the point. However, exposing the entire risk assessment publicly also carries some risks itself. For example, if the risk assessment identifies weaknesses or vulnerabilities in the service—ways that malfeasant people could exploit it to get at children or do something else that we would consider harmful—then exposing to everybody, including bad actors, the ways of beating the system and doing bad things on the service would not necessarily be in the public interest. A complete disclosure could help those looking to abuse and exploit the systems. That is why the transparency duties in clause 64 and the duties to publish accessible summaries in clauses 11 and 13 meet the objectives—the quite proper objectives—of the shadow Minister, the hon. Member for Worsley and Eccles South, and the hon. Member for Aberdeen North, without running the risks that are inherent in new clause 9, which I would therefore respectfully and genuinely resist.
Barbara Keeley

The Minister seems to be resisting so many measures that have been put forward that would improve transparency, particularly by making information publicly available. As I made clear, the public have a right to know the risk profile of the services they use. We have debated this issue reasonably exhaustively now. Therefore, I will press the new clause to a Division.

Question put, That the clause be read a Second time.

Division 58

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 10
Special circumstances
“(1) This section applies where OFCOM has reasonable grounds for believing that circumstances exist that present a threat—
(a) to the health or safety of the public, or
(b) to national security.
(2) OFCOM may, in exercising their media literacy functions, give priority for a specified period to specified objectives designed to address the threat presented by the circumstances mentioned in subsection (1).
(3) OFCOM may give a public statement notice to—
(a) a specified provider of a regulated service, or
(b) providers of regulated services generally.
(4) A “public statement notice” is a notice requiring a provider of a regulated service to make a publicly available statement, by a date specified in the notice, about steps the provider is taking in response to the threat presented in the circumstances mentioned in subsection (1).
(5) OFCOM may, by a public statement notice or a subsequent notice, require a provider of a regulated service to provide OFCOM with such information as they may require for the purpose of responding to that threat.
(6) If OFCOM takes any of the steps set out in this Chapter, they must publish their reasons for doing so.
(7) In subsection (2) “media literacy functions” means OFCOM’s functions under section 11 of the Communications Act (duty to promote media literacy), so far as functions under that section relate to regulated services.”—(Alex Davies-Jones.)
This new clause gives Ofcom the power to take particular steps where it considers that there is a threat to the health and safety of the public or to national security, without the need for a direction from the Secretary of State.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 59

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 12
Secretary of State’s powers to suggest modifications to a code of practice
“(1) The Secretary of State may on receipt of a code write within one month of that day to OFCOM with reasoned, evidence-based suggestions for modifying the code.
(2) OFCOM shall have due regard to the Secretary of State’s letter and must reply to the Secretary of State within one month of receipt.
(3) The Secretary of State may only write to OFCOM twice under this section for each code.
(4) The Secretary of State and OFCOM shall publish their letters as soon as reasonably possible after transmission, having made any reasonable redactions for public safety and national security.
(5) If the draft of a code of practice contains modifications made following changes arising from correspondence under this section, the affirmative procedure applies.”—(Alex Davies-Jones.)
This new clause gives the Secretary of State powers to suggest modifications to a code of practice, as opposed to the powers of direction proposed in clause 40.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 60

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 13
Liability for companies associated with regulated services
“(1) A relevant regulated entity (“C”) is liable for penalties set out in the Bill where a person or company (“A”) associated with C and considered by a user to be a component of C does not comply with the duties established in the Bill.
(2) Subsection (1) applies whether or not C has made A aware of the duties established in the Bill.
(3) But it is a defence for C to prove that C had in place adequate procedures designed to prevent persons associated with C from undertaking such conduct.
(4) In this section a “relevant regulated entity” means a regulated service as defined in section 3(4) of this Act.
(5) For the purposes of this section, A is associated with C if A is a person who performs services for or on behalf of C notwithstanding—
(a) the capacity in which A performs services for or on behalf of C;
(b) whether or not A is an employee, agent or subsidiary of C.
(6) Whether or not A is a person who performs services for or on behalf of C is to be determined by reference to all the relevant circumstances and not merely by reference to the nature of the relationship between A and C.
(7) If A is an employee of C, it is to be presumed unless the contrary is shown that A is a person who performs services for or on behalf of C.”—(Alex Davies-Jones.)
Brought up, and read the First time.
Alex Davies-Jones (Pontypridd) (Lab)

I beg to move, That the clause be read a Second time.

Good morning, Sir Roger. As my hon. Friend the Member for Worsley and Eccles South mentioned when speaking to new clause 11, Labour has genuine concerns about supply chain risk assessment duties. That is why we have tabled new clause 13, which seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties, drawing on existing legislation.

As we know, platforms, particularly those supporting user-to-user generated content, often employ services from third parties. At our evidence sessions we heard from Danny Stone of the Antisemitism Policy Trust that this has included Twitter explaining that racist GIFs were not its own but were provided by another service. The hands-off approach that platforms have managed to get away with for far too long is exactly what the Bill is trying to fix, yet without this important new clause we fear there will be very little change.

We have already raised issues with the reliance on third party providers more widely, particularly content moderators, but the same problems also apply to some types of content. Labour fears a scenario in which a company captured by the regulatory regime established by the Bill will argue that an element of its service is not within the ambit of the regulator simply because it is part of a supply chain, represented by, but not necessarily the responsibility of, the regulated services.

The contracted element, supported by an entirely separate company, would argue that it is providing business-to-business services. That is not user-to-user generated content per se but content designed and delivered at arm’s length, provided to the user-to-user service to deploy to its users. The result would likely be a lengthy, costly and unhelpful legal process during which systems could not be effectively regulated. The same may apply in relation to moderators, where complex contract law would need to be invoked.

We recognise that in UK legislation there are concerns and issues around supply chains. The Bribery Act 2010, for example, says that a company is liable if anyone performing services for or on the company’s behalf is found culpable of specific actions. We therefore strongly urge the Minister to consider this new clause. We hope he will see the extremely compelling reasons why liability should be introduced for platforms failing to ensure that associated parties, considered to be a part of a regulated service, help to fulfil and abide by relevant duties.

Chris Philp

The new clause seeks to impose liability on a provider where a company providing regulated services on its behalf does not comply with the duties in the Bill. The provider would be liable regardless of whether it has any control over the service in question. We take the view this would impose an unreasonable burden on businesses and cause confusion over which companies are required to comply with the duties in the Bill.

As drafted, the Bill ensures legal certainty and clarity over which companies are subject to duties. Clause 180 makes it clear that the Bill’s duties fall on companies with control over the regulated service. The point about who is in control is very important, because the liability should follow the control. These companies are responsible for ensuring that any third parties, such as contractors or individuals involved in running the service, are complying with the Bill’s safety duties, so that they cannot evade their duties in that way.

Companies with control over the regulated service are best placed to keep users safe online, assess risk, and put in place systems and processes to minimise harm, and therefore bear the liability if there is a transgression under the Bill as drafted. Further, the Bill already contains robust provisions in clause 161 and schedule 14 that allow Ofcom to hold parent and subsidiary companies jointly liable for the actions of other companies in a group structure. These existing mechanisms promote strong compliance within groups of companies and ensure that the entities responsible for breaches are the ones held responsible. That is why we feel the Bill as drafted achieves the relevant objectives.

Question put, That the clause be read a Second time.

Division 61

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 14
Duty to promote media literacy: regulated user-to-user services and search services
“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.
(2) This section applies only in relation to OFCOM’s duty to regulate—
(a) user-to-user services, and
(b) search services.
(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—
(a) to reach audiences who are less engaged with, and harder to reach through, traditional media literacy initiatives;
(b) to address gaps in the availability and accessibility of media literacy provisions targeted at vulnerable users;
(c) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;
(d) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by—
(i) carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public;
(ii) seeking to ensure, through the exercise of OFCOM’s online safety functions, that providers of regulated services take appropriate measures to improve users’ media literacy;
(iii) seeking to improve the evaluation of the effectiveness of the initiatives and measures mentioned in sub paras (2)(d)(i) and (ii) (including by increasing the availability and adequacy of data to make those evaluations);
(e) to promote better coordination within the media literacy sector.
(4) OFCOM may prepare such guidance about the matters referred to in subsection (2) as it considers appropriate.
(5) Where OFCOM prepares guidance under subsection (4) it must—
(a) publish the guidance (and any revised or replacement guidance); and
(b) keep the guidance under review.
(6) OFCOM must co-operate with the Secretary of State in the exercise and performance of their duty under this section.”—(Alex Davies-Jones.)
This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.
Brought up, and read the First time.
Alex Davies-Jones

I beg to move, That the clause be read a Second time.

The Chair

With this it will be convenient to discuss the following:

New clause 15—Media literacy strategy

“(1) OFCOM must prepare a strategy which sets out how they intend to undertake their duty to promote media literacy in relation to regulated user-to-user services and regulated search services under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The strategy must—

(a) set out the steps OFCOM propose to take to achieve the pursuit of the objectives set out in section (Duty to promote media literacy: regulated user-to-user services and search services),

(b) set out the organisations, or types of organisations, that OFCOM propose to work with in undertaking the duty;

(c) explain why OFCOM considers that the steps it proposes to take will be effective;

(d) explain how OFCOM will assess the extent of the progress that is being made under the strategy.

(3) In preparing the strategy OFCOM must have regard to the need to allocate adequate resources for implementing the strategy.

(4) OFCOM must publish the strategy within the period of 6 months beginning with the day on which this section comes into force.

(5) Before publishing the strategy (or publishing a revised strategy), OFCOM must consult—

(a) persons with experience in or knowledge of the formulation, implementation and evaluation of policies and programmes intended to improve media literacy;

(b) the advisory committee on disinformation and misinformation, and

(c) any other person that OFCOM consider appropriate.

(6) If OFCOM have not revised the strategy within the period of 3 years beginning with the day on which the strategy was last published, they must either—

(a) revise the strategy, or

(b) publish an explanation of why they have decided not to revise it.

(7) If OFCOM decides to revise the strategy they must—

(a) consult in accordance with subsection (3), and

(b) publish the revised strategy.”

This new clause requires Ofcom to publish a strategy related to their duty to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 16—Media literacy strategy: progress report

“(1) OFCOM must report annually on the delivery of the strategy required under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The report must include—

(a) a description of the steps taken in accordance with the strategy during the year to which the report relates; and

(b) an assessment of the extent to which those steps have had an effect on the media literacy of the public in that year.

(3) The assessment referred to in subsection (2)(b) must be made in accordance with the approach set out by OFCOM in the strategy (see section (Duty to promote media literacy: regulated user-to-user services and search services) (2)(d).

(4) OFCOM must—

(a) publish the progress report in such manner as they consider appropriate; and

(b) send a copy of the report to the Secretary of State who must lay the copy before Parliament.”

This new clause is contingent on NC15.

Alex Davies-Jones

The UK has a vast media literacy skills and knowledge gap, which leaves the population at risk of harm. Indeed, research from Ofcom found that a third of internet users are unaware of the potential for inaccurate or biased information. Similarly, about 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to do so.

Good media literacy is our first line of defence against bad information online. It can make the difference between decisions based on sound evidence and decisions based on poorly informed opinions that can harm health and wellbeing, social cohesion and democracy. Clause 103 of the draft Bill proposed a new media literacy duty for Ofcom to replace the one in section 11 of the Communications Act 2003, but sadly the Government scrapped it from the final Bill.

Media literacy initiatives in the Online Safety Bill are now mentioned only in the context of risk assessments, but there is no active requirement for internet companies to promote media literacy. The draft Bill’s media literacy provision needed to be strengthened, not cut. New clauses 14, 15 and 16 would introduce a new, stronger media literacy duty on Ofcom, with specific objectives. They would require the regulator to produce a statutory strategy for delivering on it and then to report on progress made towards increasing media literacy under the strategy. There is no logical reason for the Minister not to accept these important new clauses or work with Labour on them.

Over the past few weeks, we have debated a huge range of issues that are being perpetuated online as we speak, from vile, misogynistic content about women and girls to state-sponsored disinformation. It is clear that the lessons have not been learned from the past few years, when misinformation was able to significantly undermine public health, most notably throughout the pandemic. Harmful and, more importantly, false statistics were circulated online, which caused significant issues in encouraging the uptake of the vaccine. We have concerns that, without a robust media literacy strategy, the consequences of misinformation and disinformation could go further.

The issues that Labour has raised about the responsibility of those at the top—the Government—have been well documented. Only a few weeks ago, we spoke about the Secretary of State actually contributing to the misinformation discourse by sharing a picture of the Labour leader that was completely out of context. How can we be in a position where those at the top are contributing to this harmful discourse? The Minister must be living in a parallel universe if he cannot see the importance of curbing these harmful behaviours online as soon as possible. He must know that media literacy is at the very heart of the Bill’s success more widely. We genuinely feel that a strengthened media literacy policy would be a huge step forward, and I sincerely hope that the Minister will therefore accept the justification behind these important new clauses.

Kirsty Blackman

I agree entirely with these new clauses. Although the Bill will make things safer, it will do that properly only if supported by proper media literacy and the upskilling of everybody who spends any portion of their lives online. They all need better media literacy, and I am not excluding myself from that. Everybody, no matter how much time they have spent online, can learn more about better ways to fact-check and assess risk, and about how services use our data.

I pay tribute to all those involved in media literacy—all the educators at all levels, including school teachers delivering it as part of the curriculum, school teachers delivering it not as part of the curriculum, and organisations such as CyberSafe Scotland in my constituency, which is working incredibly hard to upskill parents and children about the internet. They also include organisations such as the Silver City Surfers in Aberdeen, where a group of young people teaches groups of elderly people how to use the internet. All those things are incredibly helpful and useful, but we need to ensure that Ofcom is at the top of that, producing materials and taking its duties seriously. It must produce the best possible information and assistance for people so that up-to-date media literacy training can be provided.

As we have discussed before, Ofcom’s key role is to ensure that when threats emerge, it is clear and tells people, “This is a new threat that you need to be aware of,” because the internet will grow and change all the time, and Ofcom is absolutely the best placed organisation to be recognising the new threats. Obviously, it would do that much better with a user advocacy panel on it, but given its oversight and the way it will be regulating all the providers, Ofcom really needs to take this issue as seriously as it can. It is impossible to overstate the importance of media literacy, so I give my wholehearted backing to the three new clauses.

10:00
Kim Leadbeater (Batley and Spen) (Lab)

I rise to speak in favour of new clauses 14 to 16, on media literacy. As we have discussed in Committee, media literacy is absolutely vital to ensure that internet users are aware of the tools available to protect themselves. Knowledge and understanding of the risks online, and how to protect against them, are the first line of defence for us all.

We all know that the Bill will not eliminate all risk online, and it will not entirely clean up the internet. Therefore, ensuring that platforms have robust tools in place, and that users are aware of them, is one of the strongest tools in the Bill to protect internet users. As my hon. Friend the Member for Pontypridd said, including the new clauses in the Bill would help to ensure that we all make decisions based on sound evidence, rather than on poorly informed opinions that can harm not just individuals but democracy itself. The new clauses, which would place a duty on Ofcom to promote media literacy and publish a strategy, are therefore crucial.

I am sure we all agree about the benefits of public health information that informs us of the role of a healthy diet and exercise, and of ways that we can adopt a healthier lifestyle. I do not want to bring up the sensitive subject of the age of members of the Committee, as it got me into trouble with some of my younger colleagues last week, but I am sure many of us will remember the Green Cross Code campaign, the stop smoking campaigns, the anti-drink driving ads, and the powerful campaign to promote the wearing of seatbelts—“Clunk click every trip”. These were publicly funded and produced information campaigns that have stuck in our minds and, I am sure, protected thousands of lives across the country. They laid out the risks and clearly stated the actions we all need to take to protect ourselves.

When it comes to online safety, we need a similar mindset to inform the public of the risks and how we can mitigate them. Earlier in Committee, the right hon. Member for Basingstoke, a former Secretary of State for Digital, Culture, Media and Sport, shared her experience of cyber-flashing and the importance of knowing how to turn off AirDrop to prevent such incidents from occurring in the first place. I had no idea about this simple change that people can make to protect themselves from such an unpleasant experience. That is the type of situation that could be avoided with an effective media literacy campaign, which new clauses 14 to 16 would legislate for.

I completely agree that platforms have a significant duty to design and implement tools for users to protect themselves while using platforms’ services. However, I strongly believe that only a publicly funded organisation such as Ofcom can effectively promote their use, explain the dangers of not using them and target such information at the most vulnerable internet users. That is why I wholeheartedly support these vital new clauses.

Chris Philp

The Government obviously recognise and support the intent behind the new clause, which is to make sure that work is undertaken by Ofcom specifically, and the Government more widely, on media literacy. That is important for the reasons laid out by the hon. Members for Aberdeen North and for Batley and Spen.

Ofcom already has a statutory duty to promote media literacy in relation to electronic media, which includes everything in scope of the Bill and more beyond. That is set out in the Communications Act 2003, so the statutory duty exists already. The duty proposed in new clause 14 is actually narrower in scope than the existing statutory duty on Ofcom, and I do not think it would be a very good idea to give Ofcom an online literacy duty with a narrower scope than the one it has already. For that reason, I will resist the amendment, because it narrows the duties rather than widens them.

I would also point out that a number of pieces of work are being done non-legislatively. The campaigns that the hon. Member for Batley and Spen mentioned—dating often, I think, back to the 1980s—were of course done on a non-legislative basis and were just as effective for it. In that spirit, Ofcom published “Ofcom’s approach to online media literacy” at the end of last year, which sets out how Ofcom plans to expand, and is expanding, its media literacy programmes, which cover many of the objectives specified in the new clause. Therefore, Ofcom itself has acted already—just recently—via that document.

Finally, I have two points about what the Government are doing. First, about a year ago the Government published their own online media literacy strategy, which has been backed with funding and is being rolled out as we speak. When it comes to disinformation more widely, which we have debated previously, we also have the counter-disinformation unit working actively on that area.

Therefore, through the Communications Act 2003, the statutory basis exists already, and on a wider basis than in these new clauses; and, through the online media literacy strategy and Ofcom’s own approach, as recently set out, this important area is well covered already.

Alex Davies-Jones

We feel that we cannot have an online safety Bill without a core digital media literacy strategy. We are disappointed that clause 103 was removed from the draft Bill. We do not feel that the current regime, under the Communications Act 2003, is robust enough. Clearly, the Government do not think it is robust enough, which is why they tried to replace it in the first place. We are sad to see that replacement now scrapped altogether. We fully support these new clauses.

Question put, That the clause be read a Second time.

Division 62

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 17
Algorithmic prompts: prohibition of protected characteristics
“(1) A search service which uses an algorithm to suggest search terms to users, an “algorithmic prompt”, must not apply any algorithm where any of the words in the search term relate to any protected characteristic as defined in the Equality Act 2010.
(2) If the word relating to a protected characteristic is not the first word input, the algorithmic prompt must cease as soon as the word relating to a protected characteristic is input by the user.”—(Kirsty Blackman.)
This new clause removes the ability of search services to allow their algorithms to create prompts in relation to protected characteristics. This removes entirely the possibility that a prompt would contain discriminatory language toward an individual or group with protected characteristics.
Brought up, and read the First time.
Kirsty Blackman

I beg to move, That the clause be read a Second time.

I tabled new clause 17, in relation to protected characteristics, because of some of the points made by Danny Stone. I missed the relevant evidence session because, unfortunately, I was in the Chamber at the time, responding to the Chancellor of the Exchequer, but I am referring to the points Danny Stone made in that session about the algorithmic prompts in search functions.

We have an issue with search functions; we have an issue with the algorithmic prompts that there are in search functions. There is an issue if someone puts in something potentially derogatory, if they put in something relating to someone with a protected characteristic. For example, if someone were to type “Jews are”, the results that they get with those algorithmic prompts can be overwhelmingly racist, overwhelmingly antisemitic, overwhelmingly discriminatory. The algorithm should not be pushing those things.

To give organisations like Google some credit, if something like that is highlighted to them, they will address it. Some of them take a long time to sort it, but they will have a look at it, consider sorting it and, potentially, sort it. But that is not good enough. By that point, the damage is done. By that point, the harm has been put into people’s minds. By that point, someone who is from a particular group and has protected characteristics has already seen that Google—or any other search provider—is pushing derogatory terms at people with protected characteristics.

I know that the prompts work like that because of artificial intelligence; firms are not intentionally writing these terms in order to push them towards people, but the AI allows that to happen. If such companies are going to be using artificial intelligence—some kind of software algorithm—they have a responsibility to make sure that none of the content they are generating on the basis of user searches is harmful. I asked Google about this issue during one of our evidence sessions, and the response they gave was, “Oh, algorithmic prompts are really good, so we should keep them”—obviously I am paraphrasing. I do not think that is a good enough argument. I do not think the value that is added by algorithmic prompts is enough to counter the harm that is caused by some of those prompts.

As such, the new clause specifically excludes protected characteristics from any algorithm that is used in a search engine. The idea is that if a person starts to type in something about any protected characteristic, no algorithmic prompt will appear, and they will just be typing in whatever they were going to type in anyway. They will not be served with any negative, harmful, discriminatory content, because no algorithmic prompt will come up. The new clause would achieve that across the board for every protected characteristic term. Search engines would have to come up with a list of such terms and exclude all of them from the work of the algorithm in order to provide that layer of protection for people.

I do not believe that that negative content could be in any way balanced by the potential good that could arise from somebody being able to type “Jews are” and getting a prompt that says “funny”. That would be a lovely, positive thing for people to see, but the good that could be caused by those prompts is outweighed by the negativity, harm and pain that is caused by the prompts we see today, which platforms are not quick enough to act on.

As I say, the harm is done by the time the report is made; by the time the concern is raised, the harm has already happened. New clause 17 would prevent that harm from ever happening. It would prevent anybody from ever being injured in any way by an algorithmic prompt from a search engine. That is why I have tabled that new clause, in order to provide a level of protection for any protected characteristic as defined under the Equality Act 2010 when it comes to search engine prompts.

Barbara Keeley

The problem underlying the need for this new clause is that under the Bill, search services will not have to address or risk assess legal harm to adults on their sites, while the biggest user-to-user services will. As Danny Stone of the Antisemitism Policy Trust told us in evidence, that includes sites such as Google and Microsoft Bing, and voice search assistants including Amazon’s Alexa and Apple’s Siri. Search services rightly highlight that the content returned by a search is not created or published by them, but as the hon. Member for Aberdeen North has said, algorithmic indexing, promotion and search prompts provided in the search bar are their responsibility. As she has pointed out, and as we have heard in evidence sessions, those algorithms can cause significant harm.

Danny Stone told us on 26 May:

“Search returns are not necessarily covered because, as I say, they are not the responsibility of the internet companies, but the systems that they design as to how those things are indexed and the systems to prevent them going to harmful sites by default are their responsibility, and at present the Bill does not address that.”––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 130, Q207.]

The hon. Member for Aberdeen North mentioned the examples from Microsoft Bing that Danny gave in his evidence—“Jews are” and “gays are”. He gave other examples of answers that were returned by search services, such as using Amazon Alexa to search, “Is George Soros evil?” The response was, “Yes, he is.” “Are the White Helmets fake?” “Yes, they are set up by an ex-intelligence officer.” The issue is that the search prompts that the hon. Member has talked about are problematic, because just one person giving an answer to Amazon could prompt that response. The second one, about the White Helmets, was a comment on a website that was picked up. Clearly, that is an issue.

Danny Stone’s view is that it would be wise to have something that forces search companies to have appropriate risk assessments in place for the priority harms that Parliament sets, and to enforce those terms and conditions consistently. It is not reasonable to exempt major international and ubiquitous search services from risk assessing and having a policy to address the harms caused by their algorithms. We know that leaving it up to platforms to sort this out themselves does not work, which is why Labour is supporting the new clause proposed by our SNP colleague.

10:15
Chris Philp

It is important to make clear how the Bill operates, and I draw the Committee’s attention in particular to clauses 23 to 26, which deal with the risk assessment and safety duties for search services. I point in particular to clause 23(5)(a), which deals with the risk assessment duties for illegal content. The provision makes it clear that those risk assessments have to be carried out

“taking into account (in particular) risks presented by algorithms used by the service”.

Clause 25 relates to children’s risk assessment duties, and subsection (5)(a) states that children’s risk assessment duties have to be carried out

“taking into account (in particular) risks presented by algorithms”.

The risks presented by algorithms are expressly accounted for in clauses 23 and 25 in relation to illegal acts and to children. Those risk assessment duties flow into safety duties as we know.

By coincidence, yesterday I met with Google’s head of search, who talked about the work Google is doing to ensure that its search work is safe. Google has the SafeSearch work programme, which is designed to make the prompts better constructed.

In my view, the purpose of the new clause is covered by existing provisions. If we were to implement the proposal—I completely understand and respect the intention behind it, by the way—there could be an unintended consequence in the sense that it would ban any reference in the prompts to protected characteristics, although people looking for help, support or something like that might find such prompts helpful.

Through a combination of the existing duties and the list of harms, which we will publish in due course, as well as legislating via statutory instrument, we can ensure that people with protected characteristics, and indeed other people, are protected from harmful prompts while not, as it were, throwing the baby out with the bathwater and banning the use of certain terms in search. That might cause an unintended negative consequence for some people, particularly those from marginalised groups who were looking for help. I understand the spirit of the new clause, but we shall gently resist it.

Kirsty Blackman

The Minister has highlighted clauses 23 and 25. Clause 25 is much stronger than clause 23, because clause 23 covers only illegal content and priority illegal content, whereas clause 25 extends to non-designated content that is harmful to children. Some of the things we are talking about, which might not be on the verge of illegality but are nonetheless wrong and discriminatory, would not fall into the categories of illegal or priority illegal content, and so would be caught only where the search service, which presumably an organisation such as Google is, has a children’s risk assessment duty. Such organisations are getting a much easier ride in that regard.

I want to make the Minister aware of this. If he turns on Google SafeSearch, which excludes explicit content, and googles the word “oral” and looks at the images that come up, he will see that those images are much more extreme than he might imagine. My point is that, no matter the work that the search services are trying to do, they need to have the barriers in place before that issue happens—before people are exposed to that harmful or illegal content. The existing situation does not require search services to have enough in place to prevent such things happening. The Minister was talking about moderation and things that happen after the fact in some ways, which is great, but does not protect people from the harm that might occur. I very much wish to press the new clause to the vote.

Question put, That the clause be read a Second time.

Division 63

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 18
Identification of information incidents by Ofcom
“(1) OFCOM must maintain arrangements for identifying and understanding patterns in the presence and dissemination of harmful misinformation and disinformation on regulated services.
(2) Arrangements for the purposes of subsection (1) must in particular include arrangements for—
(a) identifying, and assessing the severity of, actual or potential information incidents; and
(b) consulting with persons with expertise in the identification, prevention and handling of disinformation and misinformation online (for the purposes of subsection (2)(a)).
(3) Where an actual or potential information incident is identified, OFCOM must as soon as reasonably practicable—
(a) set out any steps that OFCOM plans to take under its online safety functions in relation to that situation; and
(b) publish such recommendations or other information that OFCOM considers appropriate.
(4) Information under subsection (3) may be published in such a manner as appears to OFCOM to be appropriate for bringing it to the attention of the persons who, in OFCOM’s opinion, should be made aware of it.
(5) OFCOM must prepare and issue guidance about how it will exercise its functions under this section and, in particular—
(a) the matters it will take into account in determining whether an information incident has arisen;
(b) the matters it will take into account in determining the severity of an incident; and
(c) the types of responses that OFCOM thinks are likely to be appropriate when responding to an information incident.
(6) For the purposes of this section—
‘harmful misinformation or disinformation’ means misinformation or disinformation which, taking into account the manner and extent of its dissemination, may have a material adverse effect on users of regulated services or other members of the public;
‘information incident’ means a situation where it appears to OFCOM that there is a serious or systemic dissemination of harmful misinformation or disinformation relating to a particular event or situation.”—(Kirsty Blackman.)
This new clause would insert a new clause into the Bill to give Ofcom a proactive role in identifying and responding to the sorts of information incidents that can occur in moments of crisis.
Brought up, and read the First time.
Kirsty Blackman

I beg to move, That the clause be read a Second time.

The Chair

With this it will be convenient to discuss new clause 45—Sharing of information relating to counter-disinformation

“(1) The Secretary of State must produce a report setting out any steps the Secretary of State has taken to tackle the presence of disinformation on Part 3 services.

(2) The purpose of the report is to assist OFCOM in carrying out its regulatory duties under this Act.

(3) The first report must be submitted to OFCOM and laid before Parliament within six months of this Act being passed.

(4) Thereafter, the Secretary of State must submit an updated report to OFCOM and lay it before Parliament at least once every three months.”

Kirsty Blackman

My hon. Friend the Member for Ochil and South Perthshire is not present and he had intended to move this new clause. If the Committee does not mind, I will do more reading and look at my notes more than I would normally when giving a speech.

Misinformation and disinformation arise during periods of uncertainty, either acutely, such as during a terror attack, or over a long period, as with the pandemic. That often includes information gaps and a proliferation of inaccurate claims that spread quickly. Where there is a vacuum of information, we can have bad actors or the ill-informed filling it with false information.

Information incidents are not dealt with effectively enough in the Bill, which is focused on regulating the day-to-day online environment. I accept that clause 146 gives the Secretary of State powers of direction in certain special circumstances, but their effectiveness in real time would be questionable. The Secretary of State would have to ask Ofcom to prioritise its media literacy function or to make internet companies report on what they are doing in response to a crisis. That is just too slow, given the speed at which such incidents can spread.

The new clause might involve Ofcom introducing a system whereby emerging incidents could be reported publicly and different actors could request the regulator to convene a response group. The provision would allow Ofcom to be more proactive in its approach and, in I hope rare moments, to provide clear guidance. That is why the new clause is a necessary addition to the Bill.

Many times, we have seen horrendous incidents unfold on the internet, in a very different way from how they ever unfolded in newspapers, on news websites or among people talking. We have seen the untold and extreme harm that such information incidents can cause, as significant, horrific events can be spread very quickly. We could end up in a situation where an incident happens and, for example, a report spreads that a Muslim group was responsible when there is absolutely no basis of truth to that. A vacuum can be created and bad actors step into it in order to spread discrimination and lies, often about minority groups who are already struggling. That is why we move the new clause.

For the avoidance of doubt, new clause 45, which was tabled by Labour, is also to be debated in this group. I am more than happy to support it.

Alex Davies-Jones

As we know, the new clause would give Ofcom a proactive role in identifying and responding to misinformation incidents that can occur in a moment of crisis. As we have discussed, there are huge gaps in the Bill’s ability to sufficiently arm Ofcom with the tools it will likely need to tackle information incidents in real time. It is all very well that the Bill will ensure that things such as risk assessments are completed, but, ultimately, if Ofcom is not able to proactively identify and respond to incidents in a crisis, I have genuine concerns about how effective this regulatory regime will be in the wider sense. Labour is therefore pleased to support the new clause, which is fundamental to ensuring that Ofcom can be the proactive regulator that the online space clearly needs.

The Government’s methods of tackling disinformation are opaque, unaccountable and may not even work. New clause 45, which would require reporting to Parliament, may begin to address this issue. When Ministers are asked how they tackle misinformation or disinformation harms, they refer to some unaccountable civil service team involved in state-based interference in online media.

I thank those at Carnegie UK Trust for their support when researching the following list, and for supporting my team and me to make sense of the Bill. First, we have the counter-disinformation unit, which is based in the Department for Digital, Culture, Media and Sport and intends to address mainly covid issues that breach companies’ terms of service and, recently, the Russia-Ukraine conflict. In addition, the Government information cell, which is based in the Foreign, Commonwealth and Development Office, focuses on war and national security issues, including mainly Russia and Ukraine. Thirdly, there is the so-called rapid response unit, which is based in the Cabinet Office, and mainly tackles proactive counter-messaging.

Those teams appear to nudge service providers in different ways where there are threats to national security or the democratic process, or risks to public health, yet we have zero record of their effectiveness. The groups do not publish logs of action to any external authority for oversight of what they raise with companies using the privileged authority of Her Majesty’s Government, nor do they publish the effectiveness of their actions. As far as we know, they are not rooted in expert independent external advisers. That direct state interference in the media is very worrying.

In our recent debate on amendment 83, which calls on the Government to include health misinformation and disinformation in the Bill, the Minister clearly set out why he thinks the situation is problematic. He said,

“We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.

In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.”––[Official Report, Online Safety Public Bill Committee, 14 June 2022; c. 408.]

Until we know more about those units, the boundary between their actions and those of a press office remains unclear. In the new regulatory regime, Ofcom needs to be kept up to date on the issues they are raising. The Government should reform the system and bring those units out into the open. We support Carnegie’s longer-term strategic goal to set up a new external oversight body and move the current Government functions under Ofcom’s independent supervision. The forthcoming National Security Bill may tackle that, but I will leave that for the Minister to consider.

There must be a reporting system that requires the Government to set out their operational involvement with social media companies to address misinformation and disinformation, which is why we have tabled new clause 45. I hope the Minister will see that the current efforts in these units are hugely lacking in transparency, which we all want and have learned is fundamental to keep us all safe online.

Chris Philp

We agree that it is important that the Bill contains measures to tackle disinformation and misinformation that may emerge during serious information incidents, but the Bill already contains measures to address those, including the powers vested in the Secretary of State under clause 146, which, when debated, provoked some controversy. Under that clause, the Secretary of State will have the power to direct Ofcom when exercising its media literacy functions in the context of an issue of public health or safety or national security.

Moreover, Ofcom will be able to require platforms to issue a public statement about the steps they are taking to respond to a threat to public health or safety or to national security. As we discussed, it is appropriate that the Secretary of State will make those directions, given that the Government have the access to intelligence around national security and the relevant health information. Ofcom, as a telecoms regulator, obviously does not have access to that information, hence the need for the Secretary of State’s involvement.

10:30
It is also worth saying that under the existing framework, companies will have to address harmful disinformation that could spread during information incidents, such as the recent pandemic. The Government have already committed to designating some forms of harmful health mis- and disinformation as priority harmful content in secondary legislation, which further supports the point.
Ofcom already has reporting duties under the Bill’s framework to carry out reviews of the prevalence and severity of content harmful to children and adults on regulated services. Under clause 135, Ofcom must also produce its own transparency report, in addition to which there will be an advisory committee on disinformation and misinformation, set out in clause 130, to provide advice to Ofcom about how these issues can be addressed.
The shadow Minister, the hon. Member for Pontypridd, has already made reference to DCMS’s counter-disinformation unit. She has quoted me extensively—I thank her for that—setting out the work it has been doing. She asked about further reporting in terms of oversight of that counter-disinformation unit. Obviously, setting out the full details of what it does could provide inappropriately detailed information to hostile states, such as Russia, that are trying to pump out that disinformation. However, the activities of the CDU are of course open to parliamentary scrutiny in the usual way, whether that is through oral questions, Backbench Business and Opposition day debates, or scrutiny by Select Committees, just as every other area of Government activity is open to parliamentary scrutiny using any of the means available.
On the regular reports sought through new clause 45, we think the work of the CDU is already covered in the way I have just set out. It would not be appropriate to lift up the hood to the point that the Russians and others can see exactly what is going on. Ofcom is already required to consult with the Secretary of State and relevant experts when developing its codes of practice, which gives the Secretary of State an appropriate mechanism.
I have been brief in the interest of time, but I hope I have set out how the Bill as drafted already provides a response to mis- and disinformation. I have also pointed out the existing parliamentary scrutiny to which the Government in general and the CDU in particular are subject. I therefore ask the hon. Member for Aberdeen North to withdraw the new clause.
Kirsty Blackman

I do not think the urgency and speed that are needed for these incidents are adequately covered by the Bill, so I would like to push new clause 18 to a vote.

Question put, That the clause be read a Second time.

Division 64

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 19
Research conducted by regulated services
“(1) OFCOM may, at any time it considers appropriate, produce a report into how regulated services commission, collate, publish and make use of research.
(2) For the purposes of the report, OFCOM may require services to submit to OFCOM—
(a) a specific piece of research held by the service, or
(b) all research the service holds on a topic specified by OFCOM.”—(Kirsty Blackman.)
Brought up, and read the First time.
Kirsty Blackman

I beg to move, That the clause be read a Second time.

I think you are probably getting fed up with me, Sir Roger, so I will try my best not to speak for too long. The new clause is one of the most sensible ones we have put forward. It simply allows Ofcom to ask regulated services to submit to Ofcom

“a specific piece of research held by the service”

or

“all research the service holds”

on a specific topic. It also allows Ofcom to produce a report into

“how regulated services commission, collate, publish and make use of research.”

The issues that we heard raised by Frances Haugen about the secretive nature of these very large companies gave us a huge amount of concern. Providers will have to undertake risk assessments on the basis of the number of users they have, the risk of harm to those users and what percentage of their users are children. However, Ofcom is just going to have to believe the companies when they say, “We have 1 million users,” unless it has the ability to ask for information that proves the risk assessments undertaken are adequate and that nothing is being hidden by those organisations. In order to find out information about a huge number of the platforms, particularly ones such as Facebook, we have had to have undercover researchers posing as other people, submitting reports and seeing how they come out.

We cannot rely on these companies, which are money-making entities. They exist to make a profit, not to make our lives better. In some cases they very much do make our lives better—in some cases they very much do not—but that is not their aim. Their aim is to try to make a profit. It is absolutely in their interests to underplay the number of users they have and the risk faced by people on their platforms. It is very much in their interest to underplay how the algorithms are firing content at people, taking them into a negative or extreme spiral. It is also in their interests to try to hide that from Ofcom, so that they do not have to put in the duties and mitigations that keep people safe.

We are not asking those companies to make the information public, but if we require them to provide to Ofcom their internal research, whether on the gender or age of their users, or on how many of their users are viewing content relating to self-harm, it will raise their standards. It will raise the bar and mean that those companies have to act in the best interests—or as close as they can get to them—of their users. They will have to comply with what is set out in the Bill and the directions of Ofcom.

I see no issue with that. Ofcom is not going to share the information with other companies in a way that could subvert competition law. Ofcom is a regulator; it literally does not do that. Our proposal would mean that Ofcom has the best, and the most, information in order to take sensible decisions to properly regulate the platforms. It is not a difficult provision for the Minister to accept.

Barbara Keeley

The transparency requirements set out in the Bill are welcome but limited. Numerous amendments have been tabled by the Opposition and by our colleagues in the SNP to increase transparency, so that we can all be better informed about the harms around us, and so that the regulator can determine what protections are needed for existing and emerging harms. This new clause is another important provision in that chain and I speak in support of it.

We know that there is research being undertaken all the time by companies that is never published—neither publicly nor to the regulator. As the hon. Member for Aberdeen North said, publishing research undertaken by companies is an issue championed by Frances Haugen, whose testimony last month the Committee will remember. A few years ago, Frances Haugen brought to the public’s attention the extent to which research is held by companies such as Facebook—as it was called then—and never reaches the public realm.

Billions of members of the public are unaware that they are being tracked and monitored by social media companies as subjects in their research studies. The results of those studies are only published when revealed by brave whistleblowers. However, their findings could help charities, regulators and legislators to recognise harms and help to make the internet a safer place. For example, Frances Haugen leaked one Facebook study that found that a third of teenage girls said Instagram made them feel worse about their bodies. Facebook’s head of safety, Antigone Davis, fielded questions on this issue from United States Senators last September. She claimed that the research on the impact of Instagram and Facebook on children’s health was “not a bombshell”. Senator Richard Blumenthal responded:

“I beg to differ with you, Ms Davis, this research is a bombshell. It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children and that it has concealed those facts and findings.”

It is this kind of cover-up that new clause 19 seeks to prevent.

I remind the Committee of one more example that Frances Haugen illustrated to us in her evidence last month. Meta conducts frequent analyses of the estimated age of its users, which is often different from the ages they submit when registering, both among adults and children. Frances told us that Meta does this so that adverts can be targeted more effectively. However, if Ofcom could request this data, as the new clause would require, it would give an important insight into how many under-13s were in fact creating accounts on Facebook. Ofcom should be able to access such information, so I hope hon. Members and the Minister will support the new clause as a measure to increase transparency and support greater protections for children.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me start by saying that I completely agree with the premise of the new clause. First, I agree that these large social media companies are acting principally for motives of their own profit and not the public good. Secondly, I agree with the proposition that they are extremely secretive, and do not transparently and openly disclose information to the public, the Government or researchers, and that is a problem we need to solve. I therefore wholeheartedly agree with the premise of the hon. Member for Aberdeen North’s new clause and her position.

However, I am honestly a bit perplexed by the two speeches we have just heard, because the Bill sets out everything the hon. Members for Aberdeen North and for Worsley and Eccles South asked for in unambiguous, black and white terms on the face of the Bill—or black and green terms, because the Bill is published on green paper.

Clause 85 on page 74 outlines the power Ofcom has to request information from the companies. Clause 85(1) says very clearly that Ofcom may require a person

“to provide them with any information”—

I stress the word “any”—

“that they require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions.”

Ofcom can already request anything of these companies.

For the avoidance of doubt, clause 85(5) lists the various purposes for which Ofcom can request information, and clause 85(5)(l)—on page 75, line 25—includes

“the purpose of carrying out research, or preparing a report, in relation to online safety matters”.

Ofcom can request anything, expressly including requesting information to carry out research, which is exactly what the hon. Member for Aberdeen North quite rightly asks for.

The hon. Lady then said, “What if they withhold information or, basically, lie?” Clause 92 on page 80 sets out the situation when people commit an offence. The Committee will see that clause 92(3)(a) states that a person “commits an offence” if

“the person provides information that is false in a material respect”.

Again, clause 92(5)(a) states that a person “commits an offence” if

“the person suppresses, destroys or alters, or causes or permits the suppression, destruction or alteration of, any information required to be provided.”

In short, if the person or company who receives the information request lies, or falsifies or destroys information, they are committing an offence that will trigger not only civil sanctions—under which the company can pay a fine of up to 10% of global revenue or be disconnected—but a personal offence that is punishable by up to two years in prison.

I hope I have demonstrated that clauses 85 and 92 already clearly contain the powers for Ofcom to request any information, and that if people lie, destroy information or suppress information, as the hon. Member for Aberdeen North rightly says they do at the moment, that will be a criminal offence with full sanctions available. I hope that demonstrates to the Committee’s satisfaction that the Bill does this already, and that it is important that it does so for the reasons that the hon. Lady set out.

10:45
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a question for the Minister that hopefully, given the Committee’s work, he might be able to answer. New clause 19(2)(b) would give Ofcom the power to require services to submit to it

“all research the service holds on a topic specified by OFCOM.”

Ofcom could say, “We would like all the research you have on the actual age of users.”

My concern is that clause 85(1) allows Ofcom to require companies to provide it

“with any information that they require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions.”

Ofcom might not know what information the company holds. I am concerned that Ofcom is only able to say, as it is empowered to do by clause 85(1), “Could you please provide us with the research piece you did on under-age users or on the age of users?”, instead of having a more general power to say, “Could you provide us with all the research you have done?” I am worried that the power in clause 85(1) is more specific.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

If the Minister holds on for two seconds, he will get to make an actual speech. I am worried that the power is not general enough. I would very much like to hear the Minister confirm what he thinks.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am not going to make a full speech. I have conferred with colleagues. The power conferred by clause 85(1) is one to require any information in a particular domain. Ofcom does not have to point to a particular research report and say, “Please give me report X.” It can ask for any information that is relevant to a particular topic. Even if it does not know what specific reports there may be—it probably would not know what reports there are buried in these companies—it can request any information that is at all relevant to a topic and the company will be obliged to provide any information relevant to that request. If the company fails to do so, it will be committing an offence as defined by clause 92, because it would be “suppressing”, to use the language of that clause, the information that exists.

I can categorically say to the hon. Lady that the general ability of Ofcom is to ask for any relevant information—the word “any” does appear—and even if the information notice does not specify precisely what report it is, Ofcom does have that power and I expect it to exercise it and the company to comply. If the company does not, I would expect it to be prosecuted.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Given that clarification, I will not press the new clause. The Minister has made the case strongly enough and has clarified clause 85(1) to my satisfaction. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 23

Priority illegal content: violence against women and girls

“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—

(a) constitutes,

(b) encourages, or

(c) promotes

violence against women or girls.

(2) ‘Violence against women and girls’ is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (‘the Istanbul Convention’).” —(Alex Davies-Jones.)

This new clause applies provisions to priority illegal content to content which constitutes, encourages or promotes violence against women and girls.

Brought up, and read the First time.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

This new clause would apply provisions applied to priority illegal content also to content that constitutes, encourages or promotes violence against women and girls. As it stands, the Bill is failing women and girls. In an attempt to tackle that alarming gap, the new clause uses the Istanbul convention definition of VAWG, given that the Home Secretary has so recently agreed to ratify the convention—just a decade after it was signed.

The Minister might also be aware that GREVIO—the Group of Experts on Action against Violence against Women and Domestic Violence—which monitors the implementation of the Istanbul convention, published a report in October 2021 on the digital dimension of violence against women and girls. It stated that domestic laws are failing to place the abuse of women and girls online

“in the context of a continuum of violence against women that women and girls are exposed to in all spheres of life, including in the digital sphere.”

The purpose of naming VAWG in the Bill is to require tech companies to be responsible for preventing and addressing VAWG as a whole, rather than limiting their obligations only to specific criminal offences listed in schedule 7 and other illegal content. It is also important to note that the schedule 7 priority list was decided on without any consultation with the VAWG sector. Naming violence against women and girls will also ensure that tech companies are held to account for addressing emerging forms of online hate, which legislation is often unable to keep up with.

We only need to consider accounts from survivors of online violence against women and girls, as outlined in “VAWG Principles for the Online Safety Bill”, published in September last year, to really see the profound impact that the issue is having on people’s lives. Ellesha, a survivor of image-based sexual abuse, was a victim of voyeurism at the hands of her ex-partner. She was filmed without her consent and was later notified by someone else that he had uploaded videos of her to Pornhub. She recently spoke at an event that I contributed to—I believe the right hon. Member for Basingstoke and others also did—on the launch of the “Violence Against Women and Girls Code of Practice”. I am sure we will come to that code of practice more specifically on Report. Her account was genuinely difficult to listen to.

This is an issue that Ellesha, with the support of EVAW, Glitch, and a huge range of other organisations, has campaigned on for some time. She says:

“Going through all of this has had a profound impact on my life. I will never have the ability to trust people in the same way and will always second guess their intentions towards me. My self confidence is at an all time low and although I have put a brave face on throughout this, it has had a detrimental effect on my mental health.”

Ellesha was informed by the police that they could not access the websites where her ex-partner had uploaded the videos, so she was forced to spend an immense amount of time trawling through all of the videos uploaded to simply identify herself. I can only imagine how distressing that must have been for her.

Pornhub’s response to the police inquiries was very vague in the first instance, and it later ignored every piece of following correspondence. Eventually the videos were taken down, likely by the ex-partner himself when he was released from the police station. Ellesha was told that Pornhub had only six moderators at the time—just six for the entire website—and it and her ex-partner ultimately got away with allowing the damaging content to remain, even though the account was under his name and easily traced back to his IP address. That just is not good enough, and the Minister must surely recognise that the Bill fails women in its current form.

If the Minister needs any further impetus to genuinely consider the amendment, I point him to a BBC report from last week that highlighted how much obscene material of women and girls is shared online without their consent. The BBC’s Angus Crawford investigated Facebook accounts and groups that were seen to be posting pictures and videos of upskirting. Naturally, Meta—Facebook’s owner—said that it had a grip on the problem and that those accounts and groups had all been removed, yet the BBC was able to find thousands of users sharing material. Indeed, one man who posted videos of himself stalking schoolgirls in New York is now being investigated by the police. This is the reality of the internet; it can be a powerful, creative tool for good, but far too often it seeks to do the complete opposite.

I hate to make this a gendered argument, but there is a genuine difference between the experiences of men and women online. Last week the Minister came close to admitting that when I queried whether he had ever received an unsolicited indecent picture. I am struggling to understand why he has failed to consider these issues in a Bill proposed by his Department.

The steps that the Government are taking to tackle violence against women and girls offline are broadly to be commended, and I welcome a lot of the initiatives. The Minister must see sense and do the right thing by also addressing the harms faced online. We have a genuine opportunity in the Bill to prevent violence against women and girls online, or at least to diminish some of the harms they face. Will he please do the right thing?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister is right to raise the issue of women and girls being disproportionately—one might say overwhelmingly—the victims of certain kinds of abuse online. We heard my right hon. Friend the Member for Basingstoke, the shadow Minister and others set that out in a previous debate. The shadow Minister is right to raise the issue.

Tackling violence against women and girls has been a long-standing priority of the Government. Indeed, a number of important new offences have already been and are being created, with protecting women principally in mind—the offence of controlling or coercive behaviour, set out in the Serious Crime Act 2015 and amended in the Domestic Abuse Act 2021; the creation of a new stalking offence in 2012; a revenge porn offence in 2015; and an upskirting offence in 2019. All of those offences are clearly designed principally to protect women and girls who are overwhelmingly the victims of those offences. Indeed, the cyber-flashing offence created by clause 156—the first time we have ever had such an offence in this jurisdiction—will, again, overwhelmingly benefit women and girls who are the victims of that offence.

All of the criminal offences I have mentioned—even if they are not mentioned in schedule 7, which I will come to in a moment—will automatically flow into the Bill via the provisions of clause 52(4)(d). Criminal offences where the victim is an individual, which these clearly all are, automatically flow into the provisions of the Bill, including the offences I just listed, which have been created particularly with women in mind.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I hope that my hon. Friend will discuss the Law Commission’s recommendations on intimate image abuse. When I raised this issue in an earlier sitting, he was slightly unsighted by the fact that the recommendations were about to come out—I can confirm again that they will come out on 7 July, after some three years of deliberation. It is unfortunate that this will be a week after the end of the Committee’s deliberations, and I hope that the timing will not preclude the Minister from mopping it up in his legislation.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank my right hon. Friend for her question and for her tireless work in this area. As she says, the intimate image abuse offence being worked on is an extremely important piece in the jigsaw puzzle to protect women, particularly as it has as its threshold—at least in the previous draft—consent, without any test of intent, which addresses some points made by the Committee previously. As we have discussed before, it is a Ministry of Justice lead, and I am sure that my right hon. Friend will make representations to MOJ colleagues to elicit a rapid confirmation of its position on the recommendations, so that we can move to implement them as quickly as possible.

I remind the Committee of the Domestic Abuse Act 2021, which was also designed to protect women. Increased penalties for stalking and harassment have been introduced, and we have ended the automatic early release of violent and sex offenders from prison—something I took through Parliament as a Justice Minister a year or two ago. Previously, violent and sex offenders serving standard determinate sentences were often released automatically at the halfway point of their sentence, but we have now ended that practice. Rightly, a lot has been done outside the Bill to protect women and girls.

Let me turn to what the Bill does to further protect women and girls. Schedule 7 sets out the priority offences—page 183 of the Bill. In addition to all the offences I have mentioned previously, which automatically flow into the illegal safety duties, we have set out priority offences whereby companies must not just react after the event, but proactively prevent the offence from occurring in the first place. I can tell the Committee that many of them have been selected because we know that women and girls are overwhelmingly the victims of such offences. Line 21 lists the offence of causing

“intentional harassment, alarm or distress”.

Line 36 mentions the offence of harassment, and line 37 the offence of stalking. Those are obviously offences where women and girls are overwhelmingly the victims, which is why we have picked them out and put them in schedule 7—to make sure they have the priority they deserve.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The Minister is making a good speech about the important things that the Bill will do to protect women and girls. We do not dispute that it will do so, but I do not understand why he is so resistant to putting this on the face of the Bill. It would cost him nothing to do so, and it would raise the profile. It would mean that everybody would concentrate on ensuring that there are enhanced levels of protection for women and girls, which we clearly need. I ask him to reconsider putting this explicitly on the face of the Bill, as he has been asked to do by us and so many external organisations.

11:04
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I completely understand and accept the point that there are groups of people in society who suffer disproportionate harms, as we have debated previously, and that obviously includes women and girls. There are of course other groups as well, such as ethnic minorities or people whose sexual orientation makes them the target of completely unacceptable abuse in a way that other groups do not suffer.

I accept the point about having this “on the face of the Bill”. We have debated this. That is why clauses 10 and 12 use the word “characteristic”—we debated this word previously. The risk assessment duties, which are the starting point for the Bill’s provisions, must specifically and expressly—it is on the face of the Bill—take into account characteristics, first and foremost gender, but also racial identity, sexual orientation and so on. Those characteristics must be expressly addressed by the risk assessments for adults and for children, in order to make sure that the special protections or vulnerabilities or the extra levels of abuse people with those characteristics suffer are recognised and addressed. That is why those provisions are in the Bill, in clauses 10 and 12.

A point was raised about platforms not responding to complaints raised about abusive content that has been put online—the victim complains to the platform and nothing happens. The hon. Members for Pontypridd and for Aberdeen North are completely right that this is a huge problem that needs to be addressed. Clause 18(2) places a duty—they have to do it; it is not optional—on these platforms to operate a complaints procedure that is, in paragraph (c),

“easy to access, easy to use (including by children)”

and that, in paragraph (b),

“provides for appropriate action to be taken”.

They must respond. They must take appropriate action. That is a duty under clause 18. If they do not comply with that duty on a systemic basis, they will be enforced against. The shadow Minister and the hon. Member for Aberdeen North are quite right. The days of the big platforms simply ignoring valid complaints from victims have to end, and the Bill will end them.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I am extremely impressed by the Minister’s knowledge of the Bill, as I have been throughout the Committee’s sittings. It is admirable to see him flicking from page to page, finding where the information about violence against women and girls is included, but I have to concur with the hon. Member for Aberdeen North and my Front-Bench colleagues. There is surely nothing to be lost by specifically including violence against women and girls on the face of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I hope I have made very clear in everything I have said, which I do not propose to repeat, that the way the Bill operates, in several different areas, and the way the criminal law has been constructed over the past 10 years, building on the work of previous Governments, is that it is designed to make sure that the crimes committed overwhelmingly against women and girls are prioritised. I think the Bill does achieve the objective of providing that protection, which every member of this Committee wishes to see delivered. I have gone through it in some detail. It is woven throughout the fabric of the Bill, in multiple places. The objective of new clause 23 is more than delivered.

In conclusion, we will be publishing a list of harms, including priority harms for children and adults, which will then be legislated for in secondary legislation. The list will be constructed with the vulnerability of women and girls particularly in mind. When Committee members see that list, they will find it reassuring on this topic. I respectfully resist the new clause, because the Bill is already incredibly strong in this important area as it has been constructed.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Bill is strong, but it could be stronger. It could be, and should be, a world-leading piece of legislation. We want it to be world-leading and we feel that new clause 23 would go some way to achieving that aim. We have cross-party support for tackling violence against women and girls online. Placing it on the face of the Bill would put it at the core of the Bill—at its heart—which is what we all want to achieve. With that in mind, I wish to press the new clause to a vote.

Question put, That the clause be read a Second time.

Division 65

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 24
Civil claims for breach of duty
“A user may bring civil proceedings against the provider of a regulated service in respect of a breach by a provider of any of its duties under Part 3 of this Act.”—(Barbara Keeley.)
This new clause would enable users to bring civil proceedings against providers when providers fail to meet their duties under Part 3.
Brought up, and read the First time.
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

New clause 24 would enable users to bring civil proceedings against providers when they fail to meet their duties under part 3 of the Bill. As has been said many times, power is currently skewed significantly against individuals and in favour of big corporations, leading people to feel that they have no real ability to report content or complain to companies because, whenever they do, there is no response and no action. We have discussed how the reporting, complaints and super-complaints mechanisms in the Bill could be strengthened, as well as the potential merits of an ombudsman, which we argued should be considered when we debated new clause 1.

In tabling this new clause, we are trying to give users the right to appeal through another route—in this case, the courts. As the Minister will be aware, that was a recommendation of the Joint Committee, whose report stated:

“While we recognise the resource challenges both for individuals in accessing the courts and the courts themselves, we think the importance of issues in this Bill requires that users have a right of redress in the courts. We recommend the Government develop a bespoke route of appeal in the courts to allow users to sue providers for failure to meet their obligations under the Act.”

The Government’s response to that recommendation was that the Bill would not change the current situation, which allows individuals to

“seek redress through the courts in the event that a company has been negligent or is in breach of its contract with the individual.”

It went on to note:

“Over time, as regulatory precedent grows, it will become easier for individuals to take user-to-user services to court when necessary.”

That seems as close as we are likely to get to an admission that the current situation for individuals is far from easy. We should not have to wait for the conclusion of the first few long and drawn-out cases before it becomes easier for people to fight companies in the courts.

Some organisations have rightly pointed out that a system of redress based on civil proceedings in the courts risks benefiting those with the resources to sue—as we know, that is often the case. However, including that additional redress system on the face of the Bill should increase pressure on companies to fulfil their duties under part 3, which will hopefully decrease people’s need to turn to the redress mechanism.

If we want the overall system of redress to be as strong as possible, individuals must have the opportunity to appeal failures of a company’s duty of care as set out in the Bill. The Joint Committee argued that the importance of the issues dealt with by the Bill requires that users have a right of redress in the courts. The Government did not respond to that criticism in their formal response, but it is a critical argument. A balancing act between proportionate restrictions and duties versus protections against harms is at the heart of this legislation, and has been at the heart of all our debates. Our position is in line with that of the Joint Committee: these issues are too important to deny individuals the right to appeal failures of duty by big companies through the courts.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I agree with the shadow Minister’s point that it is important to make sure social media firms are held to account, which is the entire purpose of the Bill. I will make two points in response to the proposed new clause, beginning with the observation that the first part of its effect is essentially to restate an existing right. Obviously, individuals are already at liberty to seek redress through the courts where a company has caused that individual to suffer loss through negligence or some other behaviour giving rise to grounds for civil liability. That would, I believe, include a breach of that company’s terms of service, so simply restating in legislation a right that already exists as a matter of law and common law is not necessary. We do not do declaratory legislation that just repeats an existing right.

Secondly, the new clause creates a new right of action that does not currently exist, which is a right of individual action if the company is in breach of one of the duties set out in part 3 of the Bill. Individuals being able to sue for a breach of a statutory duty that we are creating is not the way in which we are trying to construct enforcement under the Bill. We will get social media firms to comply through Ofcom acting as the regulator, rather than via individuals litigating these duties on a case-by-case basis. A far more effective way of dealing with the problems, as we discussed previously when we debated the ombudsman, is to get Ofcom to deal with this on behalf of the whole public on a systemic basis, funded not by individual litigants’ money, which is what would happen, at least in the first instance, if they had to proceed individually. Ofcom should act on behalf of us all collectively—this should appeal to socialists—using charges levied from the industry itself.

That is why we want to enforce against these companies using Ofcom, funded by the industry and acting on behalf of all of us. We want to fix these issues not just on an individual basis but systemically. Although I understand the Opposition’s intent, the first part simply declares what is already the law, and the second bit takes a different route from the one that the Bill takes. The Bill’s route is more comprehensive and will ultimately be more effective. Perhaps most importantly of all, the approach that the Bill takes is funded by the fees charged on the polluters—the social media firms—rather than requiring individual citizens, at least in the first instance, to put their hand in their own pocket, so I think the Bill as drafted is the best route to delivering these objectives.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I will say a couple of things in response to the Minister. It is individuals who are damaged by providers breaching their duties under part 3 of the Bill. I understand the point about—

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Yes, but it is not systems that are damaged; it is people. As I said in my speech, the Government’s response that, as regulatory precedent grows, it will become easier over time for individuals to take user-to-user services to court where necessary clearly shows that the Government think it will happen. What we are saying is: why should it wait? The Minister says it is declaratory, but I think it is important, so we will put the new clause to a vote.

Question put, That the clause be read a Second time.

Division 66

Ayes: 6


Labour: 5
Scottish National Party: 1

Noes: 9


Conservative: 9

New Clause 25
Annual reporting by OFCOM to Parliament
“(1) OFCOM must publish and lay before Parliament an annual report on the operation of its regulatory functions under this Act.
(2) The report must include—
(a) an overall assessment of the continued effectiveness of this Act in reducing harm online;
(b) figures of the volume of content removed by category 1 services in compliance with their duties under this Act;
(c) details of the exercise of any powers by OFCOM under Chapter 4, Part 7 of this Act, including—
(i) the number of times each power has been exercised, and
(ii) the service providers subject to the power;
(d) the number of reports received by OFCOM from regulated services in compliance with their duties under this Act, including details of the type of content that the reports concern.”—(Kim Leadbeater.)
This new clause would require Ofcom to publish and lay before Parliament an annual report on the operation of its regulatory functions under the Act.
Brought up, and read the First time.
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

New clause 25 would place an obligation on Ofcom to report annually to Parliament with an update on the effectiveness of the Online Safety Bill, which would also indicate Ofcom’s ability to implement the measures in the Bill to tackle online harms.

As we have discussed, chapter 7 of the Bill compels Ofcom to compile and issue reports on various aspects of the Bill as drafted. Some of those reports are to be made public by Ofcom, and others are to be issued to the Secretary of State, who must subsequently lay them before Parliament. However, new clause 25 would place a direct obligation on Ofcom to be transparent to Parliament about the scale of harms being tackled, the type of harms encountered and the effectiveness of the Bill in achieving its overall objectives.

The current proposal in clause 135 for an annual transparency report is not satisfactory. Those transparency reports are not required to be laid before Parliament. The clause places vague obligations on reporting patterns, and it will not give Parliament the breadth of information needed to allow us to assess the Online Safety Bill’s effectiveness.

Clause 149 is welcome. It will ensure that a review conducted by the Secretary of State in consultation with Ofcom is placed before Parliament. However, that review is a one-off that will provide just a small snapshot of the Bill’s effectiveness. It may not fully reflect Ofcom’s concerns as the regulator, and most importantly it will not disclose the data and information that Parliament needs to accurately assess the impact of the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Does the hon. Member agree with me that there is no point in having world-leading legislation if it does not actually work?

11:15
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I agree with the hon. Member wholeheartedly. It should be Parliament that is assessing the effectiveness of the Bill. The Committee has discussed many times how groundbreaking the Bill could be, how difficult it has been to regulate the internet for the first time, the many challenges encountered, the relationship between platforms and regulator and how other countries will be looking at the legislation as a guide for their own regulations. Once this legislation is in place, the only way we can judge how well it is tackling harm in the UK is with clear public reports detailing information on what harms have been prevented, who has intervened to remove that harm, and what role the regulator—in this case Ofcom—has had in protecting us online.

New clause 25 will place a number of important obligations on Ofcom to provide us with that crucial information. First, Ofcom will report annually to Parliament on the overall effectiveness of the Act. That report will allow Ofcom to explore fully where the Act is working, where it could be tightened and where we have left gaps. Throughout the Bill we are heaping considerable responsibility on to Ofcom, and it is only right that Ofcom is able to feed back publicly and state clearly where its powers allow it to act, and where it is constrained and in need of assistance.

Secondly, new clause 25 will compel Ofcom to monitor, collate and publish figures relating to the number of harms removed by category 1 services, which is an important indicator for us to know the scale of the issue and that the Act is working.

Thirdly, we need to know how often Ofcom is intervening, compared with how often the platforms themselves are acting. That crucial figure will allow us to assess the balance of regulation, which assists not only us in the UK but countries looking at the legislation as a guide for their own regulation.

Finally, Ofcom will detail the harms removed by type to identify any areas where the Act may be falling short, and where further attention may be needed.

I hope the Committee understands why this information is absolutely invaluable, when we have previously discussed our concerns that this groundbreaking legislation will need constant monitoring. I hope it will also understand why the information needs to be transparent in order to instil trust in the online space, to show the zero-tolerance approach to online harms, and to show countries across the globe that the online space can be effectively regulated to protect citizens online. Only Parliament, as the legislature, can be an effective monitor of that information. I hope I can count on the Government’s support for new clause 25.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I speak in support of new clause 25. As my hon. Friend has argued, transparency is critical to the Bill. It is too risky to leave information and data about online harms unpublished. That is why we have tabled several amendments to the Bill to increase reporting, both to the regulator and publicly.

New clause 25 is an important addition that would offer an overview of the effectiveness of the Bill and act as a warning bell for any unaddressed historical or emerging harms. Not only would such a report benefit legislators, but the indicators included in the report would be helpful for both Ofcom and user advocacy groups. We cannot continue to attempt to regulate the internet blind. We must have the necessary data and analysis to be sure that the provisions in the Bill are as effective as they can be. I hope the Minister can support this new clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The idea that a report on Ofcom’s activities be delivered to Parliament so that it can be considered is an excellent one. In fact, it is such an excellent idea that it has been set out in statute since 2002: the Office of Communications Act 2002 already requires Ofcom to provide a report to the Secretary of State on the carrying out of all of its functions, which will include the new duties we are giving Ofcom under the Bill. The Secretary of State must then lay that report before each House of Parliament. That is a well-established procedure for Ofcom and for other regulatory bodies. It ensures the accountability of Ofcom to the Department and to Parliament.

I was being slightly facetious there, because the hon. Member for Batley and Spen is quite right to raise the issue. However, the duty she is seeking to create via new clause 25 is already covered by the duties in the Office of Communications Act. The reports that Ofcom publish under that duty will include their new duties under the Bill. Having made that clear, I trust that new clause 25 can be withdrawn.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I would like to press new clause 25 to a Division. It is important that it is included in the Bill.

Question put, That the clause be read a Second time.

Division 67

Ayes: 5


Labour: 4
Scottish National Party: 1

Noes: 9


Conservative: 9

New clause 26
Report on synthetic media content harms
“(1) The Secretary of State must publish and lay before Parliament a report on the harms caused to users by synthetic media content appearing on regulated services.
(2) The report must contain analysis of the harms caused specifically to individuals working in the entertainment industry, including, but not limited to, infringements of their intellectual property rights.
(3) The report must be published within six months of this Act being passed.
(4) In this section, ‘synthetic media content’ means any content that has been produced or modified by automated means.”—(Alex Davies-Jones.)
This new clause would require the Secretary of State to publish and lay before Parliament a report on the harms caused to users by synthetic media content (aka “deepfakes”). The report must contain particular reference to the harms caused to those working in the entertainment industry.
Brought up, and read the First time.
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

This new clause would require the Secretary of State to publish and lay before Parliament a report on the harms caused to users by synthetic media content, also known as deepfakes. The report must contain particular reference to the harms caused to those working in the entertainment industry.

The Government define artificial intelligence as

“technologies with the ability to perform tasks that would otherwise require human intelligence, such as visual perception, speech recognition, and language translation”.

That kind of technology has advanced rapidly in recent years, and commercial AI companies can be found across all areas of the entertainment industries, including voice, modelling, music, dance, journalism and gaming—the list goes on.

One key area of development is AI-made performance synthetisation, which is the process of creating a synthetic performance. That has a wide range of applications, including automated audiobooks, interactive digital avatars and “deepfake” technology, which often, sadly, has more sinister implications. Innovation for the entertainment industry is welcome and, when used ethically and responsibly, can have various benefits. For example, AI systems can create vital sources of income for performers and creative workers. From an equalities perspective, it can be used to increase accessibility for disabled workers.

However, deepfake technology has received significant attention globally due to its often-malicious application. Deepfakes have been defined as,

“realistic digital forgeries of videos or audio created with cutting-edge machine learning techniques.”

An amalgamation of artificial intelligence, falsification and automation, deepfakes use deep learning to replicate the likeness and actions of real people. Over the past few years, deepfake technology has become increasingly sophisticated and accessible. Various apps can be downloaded for free, or at low cost, to utilise deepfake technology.

Deepfakes can cause short-term and long-term social harms to individuals working in the entertainment industry, and to society more broadly. Currently, deepfakes are mostly used in pornography, inflicting emotional and reputational damage, and in some cases violence towards the individual—mainly women. The US entertainment union, the Screen Actors Guild, estimates that 96% of deepfakes are pornographic and depict women, and 99% of deepfake subjects are from the entertainment industry.

However, deepfakes used without consent pose a threat in other key areas. For example, deepfake technology has the power to alter the democratic discourse. False information about institutions, policies, and public leaders, powered by a deepfake, can be exploited to spin information and manipulate belief. For example, deepfakes have the potential to sabotage the image and reputation of a political candidate and may alter the course of an election. They could be used to impersonate the identities of business leaders and executives to facilitate fraud, and also have the potential to accelerate the already declining trust in the media.

Alongside the challenges presented by deepfakes, there are issues around consent for performers and creative workers. In a famous case, the Canadian voiceover artist Bev Standing won a settlement after TikTok synthesised her voice without her consent and used it for its first ever text-to-speech voice function. Many artists in the UK are also having their image, voice or likeness used without their permission. AI systems have also started to replace jobs for skilled professional performers because using them is often perceived to be a cheaper and more convenient way of doing things.

Audio artists are particularly concerned by the development of digital voice technology for automated audiobooks, using the same technology used for digital voice assistants such as Siri and Alexa. It is estimated that within one or two years, high-end synthetic voices will have reached human levels. Equity recently conducted a survey on this topic, which found that 65% of performers responding thought that the development of AI technology poses a threat to employment opportunities in the performing arts sector. That figure rose to 93% for audio artists. Pay is another key issue; it is common for artists to not be compensated fairly, and sometimes not be paid at all, when engaging with AI. Many artists have also been asked to sign non-disclosure agreements without being provided with the full information about the job they are taking part in.

Government policy making is non-existent in this space. In September 2021 the Government published their national AI strategy, outlining a 10-year plan to make Britain a global AI superpower. In line with that strategy, the Government have delivered two separate consultations looking at our intellectual property system in relation to AI.

None Portrait The Chair
- Hansard -

Order. I am sorry, but I must interrupt the hon. Lady to adjourn the sitting until this afternoon, when Ms Rees will be in the Chair.

Before we leave the room, my understanding is that it is hoped that the Bill will report this afternoon. That is a matter for the usual channels; it is nothing to do with the Chair. However, of course, it is an open-ended session, so if you are getting close to the mark, you may choose to go on. If that poses a problem for Ms Rees, I am prepared to take the Chair again to see it through if we have to. On the assumption that I do not, thank you all very much indeed for the courtesy you have shown throughout this session, which has been exemplary. I also thank the staff; thank you very much.

11:25
The Chair adjourned the Committee without Question put (Standing Order No. 88).
Adjourned till this day at Two o’clock.

Online Safety Bill (Seventeenth sitting)

Committee stage
Tuesday 28th June 2022

(1 year, 10 months ago)

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 28 June 2022
The Committee consisted of the following Members:
Chairs: Sir Roger Gale, † Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
Moore, Damien (Southport) (Con)
† Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 28 June 2022
(Afternoon)
[Christina Rees in the Chair]
Online Safety Bill
New Clause 26
Report on synthetic media content harms
“(1) The Secretary of State must publish and lay before Parliament a report on the harms caused to users by synthetic media content appearing on regulated services.
(2) The report must contain analysis of the harms caused specifically to individuals working in the entertainment industry, including, but not limited to, infringements of their intellectual property rights.
(3) The report must be published within six months of this Act being passed.
(4) In this section, “synthetic media content” means any content that has been produced or modified by automated means.”—(Alex Davies-Jones.)
This new clause would require the Secretary of State to publish and lay before Parliament a report on the harms caused to users by synthetic media content (aka “deepfakes”). The report must contain particular reference to the harms caused to those working in the entertainment industry.
Brought up, read the First time, and motion made (this day), That the clause be read a Second time.
14:00
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

Before we adjourned, I was discussing the Government’s national artificial intelligence strategy and the two separate consultations launched by the Government to look at the intellectual property system in relation to AI. In those consultations, the Intellectual Property Office recognised that AI

“is playing an increasing role in...artistic creativity.”

However, specific questions about reviewing or enhancing performers’ rights were notably absent from both Government consultations. If the UK Government really want to make Britain a global AI and creative superpower, strengthening the rights of performers and other creatives must be at the heart of the national AI strategy.

Another key challenge is that our intellectual property framework is desperately out of date. Currently, performers have two sets of rights under the Copyright, Designs and Patents Act 1988: the right to consent to the making of a recording of a performance; and the right to control the subsequent use of such recordings, such as the right to make copies. However, as highlighted by Dr Mathilde Pavis, senior lecturer in law at the University of Exeter, AI-made performance synthetisation challenges our intellectual property framework because it reproduces performances without generating a recording or a copy, and therefore falls outside the scope of the Act. An unintended consequence is that people are left vulnerable to abuse and exploitation. Without effective checks and balances put in place by the Government, that will continue. That is why 93% of Equity members responding to a recent survey stated that the Government should introduce a new legal protection for performers, so that a performance cannot be reproduced by AI technology without the performer’s consent.

Advances in AI, including deepfake technology, have reinforced the urgent need to introduce image rights—also known as personality rights or publicity rights. That refers to

“the expression of a personality in the public domain”,

such as an individual’s name, likeness or other personal indicators. Provision of image rights in law enables performers to safeguard meaningful income streams, and to defend their artistic integrity, career choices, brand and reputation. More broadly, for society, it is an important tool for protecting privacy and allowing an individual to object to the use of their image without consent.

In the UK, there is no codified law of image rights or privacy. Instead, we have a patchwork of statutory and common-law causes of action, which an individual can use to protect various aspects of their image and personality. However, none of that is fit for purpose. Legal provision for image rights can be found around the world, so the Government here can and should do more. For example, some American states recognise the right through their statute, and some others through common law. California has both statutory and common-law strains of authority, which protect slightly different forms of the right.

The Celebrities Rights Act of 1985 was passed in California and extended the personality rights for a celebrity to 70 years after their death. In 2020, New York State passed a Bill that recognised rights of publicity for “deceased performers” and “deceased personalities”. Guernsey has created a statutory regime under which image rights can be registered. The legislation centres on the legal concept of a “personnage”—the person or character behind a personality that is registered. The image right becomes a property right capable of protection under the legislation through registration, which enables the image right to be protected, licensed and assigned.

The Minister will know that Equity is doing incredible work to highlight the genuine impact that this type of technology is having on our creative industry and our performers. He must therefore see the sense in our new clause, which would require the Government at least to consider the matter of synthetic media content, which thus far they have utterly failed to do.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship again, Ms Rees. I thank the shadow Minister, the hon. Member for Pontypridd, for raising the issues that she has done about synthetic and digitally manipulated content, which we are very conscious of. We are conscious of the risk of harm to those who work in the entertainment industry and of course, in particular, to victims of deepfake pornography.

We take intellectual property infringement extremely seriously. The Government have recently published a counter-infringement strategy, setting out a range of steps that we intend to take to strengthen the whole system approach to tackling infringement of intellectual property rights. It is widely acknowledged that the United Kingdom has an intellectual property framework that is genuinely world leading and considered among the best in the world. That includes strong protections for performers’ rights. We intend that to continue. However, we are not complacent and the law is kept under review, not least via the counter-infringement strategy I mentioned a moment ago.

Harmful synthetic media content, including the deepfakes that the hon. Member for Pontypridd mentioned, is robustly addressed by the safety duties set out in the Bill in relation to illegal content—much deepfake content, if it involves creating an image of someone, would be illegal—as well as content that could be harmful to children and content that will be on the “legal but harmful” adult list. Those duties will tackle the most serious and illegal forms of deepfake and will rightly cover certain threats that undermine our democracy. For example, a manipulated media image that contained incitement to violence, such as a deepfake of a politician telling people to attack poll workers because they are rigging an election, would obviously already fall foul of the Bill under the illegal duties.

In terms of reporting and codes of practice, the Bill already requires Ofcom to produce codes of practice setting out the ways in which providers can take steps to reduce the harm arising from illegal and harmful content, which could include synthetic media content such as deepfakes where those contain illegal content.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Minister uses the example of a deepfake of a politician inciting people to attack poll workers during an election. Given that some of the technology is so advanced that it is really difficult to spot when the deepfakes actually occur, could it be argued that Ofcom as regulator or even the platforms themselves would be averse to removing or reporting the content, as it could fall foul of the democratic content exemption in the Bill?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The democratic content protection that the shadow Minister refers to, in clause 15, is not an exemption; it is a duty to take into account content of democratic importance. That is on line 34 of page 14. When making a decision, it has to be taken into account—it is not determinative; it is not as if a politician or somebody involved in an election gets a free pass to say whatever they like, even if it is illegal, and escapes the provisions of the Bill entirely. The platform simply has to take it into account. If it was a deepfake image that was saying such a thing, the balancing consideration in clause 15 would not even apply, because the protection applies to content of democratic importance, not to content being produced by a fake image of a politician.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is important that we get this right. One of our concerns on clause 15, which we have previously discussed, relates to this discussion of deepfakes, particularly of politicians, and timeframes. I understand the Minister’s point on illegal content. If there is a deepfake of a politician—on the eve of poll, for example—widely spreading disinformation or misinformation on a platform, how can the Minister confidently say that that would be taken seriously, in a timely manner? That could have direct implications on a poll or an election. Would the social media companies have the confidence to take that content down, given clause 15?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The protections in clause 15—they are not exemptions—would only apply to content that is of bona fide, genuine democratic importance. Obviously, a deepfake of a politician would not count as genuine democratic content, because it is fake. If it was a real politician, such as the hon. Lady, it would benefit from that consideration. If it was a fake, it would not, because it would not be genuine content of democratic importance.

It is also worth saying that if—well, I hope when—our work with the Law Commission to review the criminal law related to the non-consensual taking and sharing of intimate images is taken forward, that will then flow into the duties in the Bill. Deepfakes of intimate images are rightly a concern of many people. That work would fall into the ambit of the Bill, either via clause 52, which points to illegal acts where there is an individual victim, or schedule 7, if a new intimate image abuse offence were added to schedule 7 as a priority offence. There are a number of ways in which deepfakes could fall into the ambit of the Bill, including if they relate to extreme pornography.

The new clause would require the production of a report, not a change to the substantive duties in the Bill. It is worth saying that the Bill already provides Ofcom with powers to produce and publish reports regarding online safety matters. Those powers are set out in clause 137. The Bill will ensure that Ofcom has access to the information required to prepare those reports, including information from providers about the harm caused by deepfakes and how companies tackle the issue. We debated that extensively this morning when we talked about the strong powers that already exist under clause 85.

The hon. Lady has raised important points about intellectual property, and I have pointed to our counter-infringement strategy. She has also raised important points about deepfakes, both in a political context and especially in the context of intimate images being generated by AI. I hope I have set out how the Bill addresses concerns in those areas. The Bill as drafted addresses those important issues in a way that is certainly adequate.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I welcome the Minister’s comments and I am grateful for his reassurance on some of the concerns that were raised. At this stage we will not press the matter to a vote. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 27

OFCOM: power to impose duties on regulated services

“OFCOM: power to impose duties on regulated services

(1) OFCOM may carry out an assessment of the risk of harm posed by any regulated service.

(2) Where OFCOM assess a service to pose a very high risk of harm, OFCOM may, notwithstanding the categorisation of the service or the number or profile of its users, impose upon the service duties equivalent to—

(a) the children’s risk assessment duties set out in sections 10 and 25 of this Act; and

(b) the safety duties protecting children set out in sections 11 and 26 of this Act.”—(Kirsty Blackman.)

This new clause enables Ofcom to impose on any regulated service duties equivalent to the children’s risk assessment duties and the safety duties protecting children.

Brought up, and read the First time.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

This is another attempt to place a higher bar and more requirements on regulated services that are likely to cause the most serious risks of harm. The Minister has consistently said that he is keen to consider regulating the companies and platforms that have the highest potential risk of harm more strictly than the normal regime would allow. Some of the platforms would not be category 1 on the basis that they have a small number of members, but the potential for harm—radicalisation, extremism, severe damage to people or extreme pornography—is very high.

I am not yet happy that the Minister has provided an adequate answer to the question about the regulation of the highest-risk platforms that do not meet the category 1 thresholds. If he is unwilling to accept this amendment or any of the other amendments tabled by the Opposition on this specific issue, I hope that he will give consideration to a Government amendment on Report or when the Bill goes through the House of Lords in order that this loose end can be tied up.

As I have said before—I do not want to go too much over comments that I have made previously—it is reasonable for us to have a higher bar and a stricter regulation regime for specific platforms that Ofcom will easily be able to identify and that create the highest harm. Again, as I have said, this is another way of going about it. The new clause suggests that if Ofcom assesses that a service poses a very high risk of harm, it might, notwithstanding the categorisation of that service, require it to perform the children's risk assessment duties and the safety duties protecting children. This is specifically about the children's risk assessment.

I have previously raised concerns about not being able to accurately assess the number of child users that a service has. I am still not entirely comfortable that platforms will be able to accurately assess the number of child users they have, and therefore they might not be subject to the child user requirements, because they have underplayed or understated the number of children using their service, or because there are only a few hundred children using the service, which is surely massively concerning for the wellbeing of those few hundred children.

I hope the Minister can give us some comfort that he is not just considering what action to take, but that he will take some sort of action on Report or when the Bill proceeds through the House of Lords.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

It is a pleasure to serve with you in the Chair again, Ms Rees. I rise to speak in support of new clause 27.

We have argued that the Government’s approach to categorising services fails to take account of the harms that could result from smaller services. I understand that a risk-based approach rather than a size-based approach is being considered, and that is welcome. The new clause would go some way to improving the categorisation of services as it stands. It is critical that there are ways for Ofcom to assess companies’ risk of harm to users and to place additional duties on them even when they lie outside the category to which they were initially assigned. Ofcom should be able to consult any organisation that it sees fit to consult, including user advocacy groups and civil society, in assessing whether a service poses

“a very high risk of harm”.

Following that, Ofcom should have powers to deliver the strictest duties on companies that expose adults to the most dangerous harms. That should always be proportionate to the risk of harm.

Labour supports the new clause and the arguments made by the hon. Member for Aberdeen North.

14:15
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Aberdeen North for raising those considerations, because protecting children is clearly one of the most important things that the Bill will do. The first point that it is worth drawing to the Committee’s attention again is the fact that all companies, regardless of the number of child users they may have, including zero child users, have duties to address illegal content where it affects children. That includes child sexual exploitation and abuse content, and illegal suicide content. Those protections for the things that would concern us the most—those illegal things—apply to companies regardless of their size. It is important to keep that in mind as we consider those questions.

It is also worth keeping in mind that we have designed the provisions in clause 31 to be a bit flexible. The child user condition, which is in clause 31(3) on page 31 of the Bill, sets out that one of two tests must be met for the child user condition to be met. The condition is met if

“there is a significant number of children who are users of the service…or…the service…is of a kind likely to attract a significant number of users who are children.”

When we debated the issue previously, we clarified that the word “user” did not mean that they had to be a registered user; they could be somebody who just stumbles across it by accident or who goes to it intentionally, but without actually registering. We have built in a certain amount of flexibility through the word “likely”. That helps a little bit. We expect that where a service poses a very high risk of harm to children, it is likely to meet the test, as children could be attracted to it—it might meet the “likely to attract” test.

New clause 27 would introduce the possibility that even when there were no children on the service and no children were ever likely to use it, the duties would be engaged—these duties are obviously in relation to content that is not illegal; the illegal stuff is covered already elsewhere. There is a question about proportionality that we should bear in mind as we think about this. I will be resisting the new clause on that basis.

However, as the hon. Member for Aberdeen North said, I have hinted or more than hinted to the Committee previously that we have heard the point that has been made—it was made in the context of adults, but applies equally to children here—that there is a category of sites that might have small numbers of users but none the less pose a high risk of harm, not harm that is illegal, because the “illegal” provision applies to everybody already, but harm that falls below the threshold of illegality. On that area, we heard hon. Members’ comments on Second Reading. We have heard what members of the Committee have had to say on that topic as well. I hope that if I say that that is something that we are reflecting on very carefully, the hon. Member for Aberdeen North will understand that those comments have been loudly heard by the Government. I hope that I have explained why I do not think new clause 27 quite works, but the point is understood.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I appreciate the Minister’s comments, but in the drafting of the new clause, we have said that Ofcom “may” impose these duties. I would trust the regulator enough not to impose the child safety duties on a site that literally has no children on it and that children have no ability to access. I would give the regulator greater credit than the Minister did, perhaps accidentally, in his comments. If it were up to Ofcom to make that decision and it had the power to do so where it deemed that appropriate, it would be most appropriate for the regulator to have the duty to make the decision.

I wish to press the new clause to a Division.

Question put, That the clause be read a Second time.

Division 68

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 28
Empowerment features for child users
“(1) This section applies where a Part 3 service has empowerment features for adults of a type described in section 14(2).
(2) OFCOM may require a service to provide equivalent features designed specifically for child users.
(3) Where OFCOM places a requirement on a service under subsection (2) it must provide guidance to the service on how to ensure the features are easily accessible and understandable for children.”—(Kirsty Blackman.)
This new clause enables Ofcom to require services to provide empowerment features for child users.
Brought up, and read the First time.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

The new clause attempts to address an asymmetry in the Bill in relation to the lack of user empowerment features for child users. As far as I am aware, there is no requirement for user empowerment functions for child users in the Bill. The new clause would require that if a service has to have user empowerment features in place for adults, then

“OFCOM may require a service to provide equivalent features designed specifically for child users.”

Ofcom would be able then to provide guidance on how those user empowerment features for child users would work.

This provision is especially important for the fairly small number of platforms and providers that are very much aimed at children, and where the vast majority of users are children. We are not talking about Facebook, for example, although if Facebook did have child user empowerment, it would be a good thing. I am thinking about organisations and games such as Roblox, which is about 70% children; Fortnite, although it has quite a lot of adult users too; and Minecraft, which has significant numbers of child users. On those platforms that are aimed at children, not having a child-centred, child-focused user empowerment requirement is an oversight. It is missing from the Bill.

It is important that adults have the ability to make privacy choices about how they use sites and to make choices about some of the content that they can see on a site by navigating the user empowerment functions that exist. But it is also important for children to have that choice. I do not see why adults should be afforded that level of choice and flexibility over the way that they use platforms and the providers that they engage with, but children should not. We are not just talking here about kids who are eight: we are talking about children far older, and for whom adult-centred, adult-written user empowerment functions may not be the best option or as easy to access as ones that are specifically focused on and designed for children.

I have had a discussion with the National Society for the Prevention of Cruelty to Children about the user empowerment functions for child users. We have previously discussed the fact that complaints features have to be understandable by the users of services, so if the Minister is unwilling to accept the new clause, will he give some consideration to what happens when the provider of the platform is marketing that platform to children?

The Roblox website is entirely marketed as a platform for children. It is focused in that way, so will the Minister consider whether Ofcom should be able to require differential user empowerment functions, particularly in cases where the overwhelming majority of users are children? Also, it would not be beyond the wit of man for platforms such as Facebook to have two differential user empowerment functions based on whether somebody is under the age of 18—whether they are a child or an adult—because users tell Facebook their date of birth when signing up. We have talked a lot about age verification and the ways in which that could work.

I would appreciate it if the Minister would consider this important matter. It is something that is lacking at the moment, and we are doing our children a disservice by not providing them with the same functionality that we are providing, or requiring, for adult users.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Labour argued in favour of greater empowerment provisions for children during the debate on new clause 3, which would have brought in a user advocacy body for children. YoungMinds has pointed out that many young people are unaware of the Bill, and there has been little engagement with children regarding its design. I am sure members of the Committee would agree that the complexity of the Bill is evidence enough of that.

New clause 28 would make the online world more accessible for children and increase their control over the content they see. We know that many children use category 1 services, so they should be entitled to the same control over harmful content as adults. As such, Labour supports the new clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Aberdeen North for her, as ever, thoughtful comments on the new clause. She has already referred to the user empowerment duties for adults set out in clause 57, and is right to say that those apply only to adults, as is made clear in the very first line of subsection (1) near the bottom of page 52.

As always, the hon. Lady’s analysis of the Bill is correct: the aim of those empowerment duties is to give adults more control over the content they see and the people with whom they interact online. One of the reasons why those empowerment duties have been crafted specifically for adults is that, as we discussed in a freedom of expression context, the Bill does not ultimately censor free speech regarding content that is legal but potentially harmful. Platforms can continue to display that information if their policies allow, so we felt it was right to give adults more choice over whose content they see, given that it could include content that is harmful but falls on the right side of the legal threshold.

As Members would expect, the provisions of the Bill in relation to children are very different from the provisions for adults. There are already specific provisions in the Bill that relate to children, requiring all social media companies whose platforms are likely to be accessed by children—not just the big ones—to undertake comprehensive risk assessments and protect children from any kind of harmful activity. If we refer to the children's risk assessment duties in clause 10, and specifically clause 10(6)(e), we see that those risk assessments include an assessment looking at the content that children will encounter and—critically—who they might encounter online, including adults.

To cut to the chase and explain why user empowerment has been applied to adults but not children, the view was taken that children are already protected a lot more than adults through the child risk assessment duties and child safety duties. Therefore, they do not need the user empowerment provisions, because all of them, regardless of whether they choose to be verified or not, are already being protected from harmful content by the much stronger provisions in the Bill relating to children. That is why it was crafted as it is.

14:29
The hon. Lady referred to submissions made by the NSPCC. If they have an argument that advances a different line of reasoning or suggests that what I have just said is in some way flawed, I would be very happy to look at that. She has my email address, and she is very welcome to send that through.
However, on my reading of the Bill as it stands, because of the existing strong protections for children, they do not need to also benefit from the user empowerment duties as set out. Of course, there are also some questions around data protection and safeguarding if children end up self-identifying on a public basis. That is why they are omitted. I hope that makes sense, but I would be happy to read any further submission if she has one.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

It does make sense, and I do understand what the Minister is talking about in relation to clause 10 and the subsections that he mentioned. However, that only sets out what the platforms must take into account in their child risk assessments.

If we are talking about 15-year-olds, they are empowered in their lives to make many decisions on their own behalf, as well as decisions guided by parents or parental decisions taken for them. We are again doing our children a disservice by failing to allow young people the ability to opt out—the ability to choose not to receive certain content. Having a requirement to include whether or not these functionalities exist in a risk assessment is very different from giving children and young people the option to choose, and to decide what they do—and especially do not—want to see on whichever platform they are interacting on.

I have previously mentioned the fact that if a young person is on Roblox, or some of those other platforms, it is difficult for them to interact only with people who are on their friends list. It is difficult for that young person to exclude adult users from contacting them. A lot of young people want to exclude content, comments or voice messages from people they do not know. They want to go on the internet and have fun and enjoy themselves without the risk of being sent an inappropriate message or photo and having to deal with those things. If they could choose those empowerment functions, that just eliminates the risk and they can make that choice.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Could I develop the point I was making earlier on how the Bill currently protects children? Clause 11, which is on page 10, is on safety duties for children—what the companies have to do to protect children. One thing that they may be required by Ofcom to do, as mentioned in subsection (4)(f), is create

“functionalities allowing for control over content that is encountered, especially by children”.

Therefore, there is a facility to require the platforms to create the kind of functionalities that, as that subsection is drafted, relate not just to identity but to the kind of content being displayed. Does that go some way towards addressing the hon. Lady's concern?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That is very helpful. I am glad that the Minister is making clear that he thinks that Ofcom will not just be ignoring this issue because the Bill is written to allow user empowerment functions only for adults.

I hope the fact that the Minister kindly raised clause 11(4) will mean that people can see its importance, and that Ofcom will understand that it should give consideration to it, because that list of things could have just been lost in the morass of the many, many lists of things in the Bill. I am hoping that the Minister's comments will go some way on that. Notwithstanding that, I will press the new clause to a vote.

Question put, That the clause be read a Second time.

Division 69

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 29
Accessibility to adult users with learning disabilities
“(1) This section applies to the following functions—
(a) any user empowerment features provided under section 14;
(b) any content reporting systems or processes under section 17 or section 27;
(c) any complaints procedure under section 18 or section 28.
(2) The service must, as part of its compliance with any duties under the sections listed in subsection (1), ensure that the functions are accessible and understandable to adult users with learning disabilities.”—(Kirsty Blackman.)
This new clause requires complaints, user empowerment and user reporting functions to be accessible and understandable to adult users with learning disabilities.
Brought up, and read the First time.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

I mentioned this in earlier consideration. The issue was raised with me by Mencap, specifically in relation to the people it represents who have learning disabilities and who have a right to access the internet just as we all do. They should be empowered to use the internet with a level of safety and be able to access complaints, to make content reports and to use the user empowerment functions. Everybody who is likely to use the platforms should be able to access and understand those functions.

Will the Minister make it clear that he expects Ofcom, when drafting guidance about the user empowerment functions and their accessibility, the content reporting and the complaints procedures, to consult people about how those things work? Will he make it clear that he hopes Ofcom will take into account the level of accessibility? This is not just about writing things in plain English—or whatever that campaign is about writing things in a way that people can understand—it is about actually speaking to groups that represent people with learning disabilities to ensure that content reporting, the empowerment functions and the complaints procedures are accessible, easy to find and easy to understand, so that people can make the complaints that they need to make and can access the internet on an equal and equitable basis.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I rise to speak in support of the new clause. Too often people with learning disabilities are left out of discussions about provisions relevant to them. People with learning disabilities are disproportionately affected by online harms and can receive awful abuse online.

At the same time, Mencap has argued that social media platforms enable people with learning disabilities to develop positive friendships and relationships. It is therefore even more important that people with learning disabilities do not lose out on the features described in clause 14, which allow them to control the content to which they are exposed. It is welcome that clauses 17, 18, 27 and 28 specify that reporting and complaints procedures must be easy to access and use.

The Bill, however, should go further to ensure that the duties on complaints and reporting explicitly cater to adults with learning disabilities. In the case of clause 14 on user empowerment functions, it must be made much clearer that those functions are easy to access and use. The new clause would be an important step towards ensuring that the Bill benefits everyone who experiences harms online, including people with learning disabilities. Labour supports the new clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Aberdeen North once again for the thoughtfulness with which she has moved her new clause. To speak first to the existing references to accessibility in the Bill, let me start with user empowerment in clause 14.

Clause 14(4) makes it clear that the features included in “a service in compliance” with the duty in this clause must be made available to all adult users. I stress “all” because, by definition, that includes people with learning disabilities or others with characteristics that mean they may require assistance. When it comes to content reporting duties, clause 17(2)—line 6 of page 17—states that it has to be easy for any “affected persons” to report the content. They may be people who are disabled or have a learning difficulty or anything else. Clause 17(6)(d) further makes it clear that adults who are “providing assistance” to another adult are able to raise content reporting issues.

There are references in the Bill to being easy to report and to one adult assisting another. Furthermore, clause 18(2)(c), on page 18, states that the complaints system has to be

“easy to use (including by children)”.

It also makes it clear through the definition of “affected person”, which we have spoken about, that an adult assisting another adult is allowed to make a complaint on behalf of the second adult. Those things have been built into the structure of the Bill.

Furthermore, to answer the question from the hon. Member for Aberdeen North, I am happy to put on record that Ofcom, as a public body, is subject to the public sector equality duty, so by law it must take into account the ways in which people with certain characteristics, such as learning disabilities, may be impacted when performing its duties, including writing the codes of practice for user empowerment, redress and complaints duties. I can confirm, as the hon. Member requested, that Ofcom, when drafting its codes of practice, will have to take accessibility into account. It is not just a question of my confirming that to the Committee; it is a statutory duty under the Equality Act 2010 and the public sector equality duty that flows from it.

I hope that the words of the Bill, combined with that statutory public sector equality duty, make it clear that the objectives of new clause 29 are met.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The Minister mentioned learning difficulties. That is not what we are talking about. Learning difficulties are things such as dyslexia and attention deficit hyperactivity disorder. Learning disabilities are lifelong intellectual impairments and very different things—that is what we are talking about.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am very happy to accept the shadow Minister’s clarification. The way that clauses 14, 17 and 18 are drafted, and the public sector equality duty, include the groups of people she referred to, but I am happy to acknowledge and accept her clarification.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

That is fine, but I have a further point to make. The new clause would be very important to all those people who support people with learning disabilities. So many of the services that people use do not take account of people's learning disabilities. I have done a huge amount of work to try to support people with learning disabilities over the years. This is a very important issue to me.

There are all kinds of good examples, such as easy-read versions of documents, but the Minister said when batting back this important new clause that the expression “all adult users” includes people with learning disabilities. That is not the case. He may not have worked with a lot of people with learning disabilities, but they are excluded from an awful lot. That is why I support making that clear in the Bill.

We on the Opposition Benches say repeatedly that some things are not included by an all-encompassing grouping. That is certainly the case here. Some things need to be said for themselves, such as violence against women and girls. That is why this is an excellent new clause that we support.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I thank the Minister, particularly for providing the clarification that I asked for about who is likely to be consulted or taken into account when Ofcom is writing the codes of practice. Notwithstanding that, and particularly given the rather excellent speech from the shadow Minister, the hon. Member for Worsley and Eccles South, I am keen to press the new clause to a vote.

Question put, That the clause be read a Second time.

Division 70

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 36
Communication offence for encouraging or assisting self-harm
“(1) In the Suicide Act 1961, after section 3 insert—
“3A Communication offence for encouraging or assisting self-harm
(1) A person (“A”) commits an offence if—
(a) A sends a message,
(b) the message encourages or could be used to assist another person (“B”) to inflict serious physical harm upon themselves, and
(c) A’s act was intended to encourage or assist the infliction of serious physical harm.
(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, A.
(3) A may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.
(4) A person guilty of an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;
(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.
(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.
(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.
(7) If A arranges for a person (“A2”) to do an Act and A2 does that Act, A is also to be treated as having done that Act for the purposes of subsection (1).
(8) In proceedings for an offence to which this section applies, it shall be a defence for A to prove that—
(a) B had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from A;
(b) B’s intention to inflict serious physical harm upon themselves was not initiated by A; and
(c) the message was wholly motivated by compassion towards B or to promote the interests of B’s health or wellbeing.””—(Kirsty Blackman.)
14:45
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 71

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 37
The Digital Regulation Committee
“(1) There shall be a Committee, to be known as the Digital Regulation Committee and in this section referred to as “the Committee”, to undertake the following functions in connection with the provisions of this Act—
(a) to review all codes of practice and any other relevant publication produced by OFCOM; and
(b) to monitor and report on any other matter relevant to the functioning of this Act.
(2) The Committee may publish reports in connection with its activities under subsection (1).
(3) The Secretary of State must—
(a) respond to the recommendations contained in any report by the Committee within three months; and
(b) publish and lay copies of their response in both Houses of Parliament.
(4) The Committee shall consist of twelve members—
(a) who shall be drawn from both the House of Commons and from members of the House of Lords; and
(b) none of whom shall be a Minister of the Crown.
(5) The membership and Chair of the Committee shall be appointed by regulations made by the Secretary of State.
(6) Details of the tenure of office of members of, the procedure of and other matters relating to, the Committee shall be set out in regulations made by the Secretary of State.
(7) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before and approved by resolution of each House of Parliament.”—(Kirsty Blackman.)
Brought up, and read the First time.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

I drafted this new clause following a number of conversations and debates that we had in Committee about how the Act will be scrutinised. How will we see whether the Act is properly achieving what it is supposed to achieve? We know that there is currently a requirement in the Bill for a review to take place but, as has been mentioned already, that is a one-off thing; it is not a rolling update on the efficacy of the Act and whether it is achieving the duties that it is supposed to achieve.

This is particularly important because the Secretary of State has powers to make changes to parts of the Act. Presumably the Government would not have put that in if they did not think there was a possibility or a likelihood that changes would have to be made to the Act at some future point. The Bill is certainly not perfect, but even from the Government's point of view it is not perfect for all time. There is a requirement for the Act to be updated; it will have to change. New priority harms may have to be added. New details about different illegal acts may have to be added to the duties. That flexibility is given, and the Secretary of State has that flexibility in a number of cases.

If the Act were just going to be a standing thing, if it were not going to be updated, it would never be future-proof; it would never work in the changing world that we have. We know that this legislation has taken a very long time to get here. We have been sadly lacking in significant regulation in the online world for more than 20 years, certainly. For a very long time we have not had this. Now that the Act is here—or it will be once the Bill passes through both Houses of Parliament—we want it to work.

That is the point of every amendment we have tabled: we are trying to make the Bill better so that it works and can keep people as safe as possible. At the moment, we do not know how safe the internet will be as a result of the Bill. Even once it begins to be implemented, we will not have enough information on the improvements it has created to be able to say, “Actually, this was a world-leading piece of legislation.”

It may be that the digital regulation committee that I am suggesting in this new clause has a regular look at the implementation of the Bill going forward and says, "Yep, that's brilliant." The committee might look at the implementation and the increasing time we spend online, with all the harms that can come with that, and say, "Actually, you need to tweak that a bit" or, "That is not quite fulfilling what it was intended to." The committee might also say, "This brand new technology has come in and it is not entirely covered by the Act as it is being implemented." A digital regulation committee was proposed by the Joint Committee, I think, to scrutinise implementation of the legislation.

The Government will say that they will review—they always do. I have been in so many Delegated Legislation Committees that involve the Treasury and the Government saying, “Yes, we keep everything under review—we always review everything.” That line is used in so many of these Committees, but it is just not true. In January I asked the Department for Digital, Culture, Media and Sport

“how many and what proportion of (a) primary and (b) secondary legislation sponsored by (i) their Department…has undergone a post legislative review”.

It was a written question I put to a number of Departments including DCMS. The reply I got from the Minister here was:

“The number of post legislative reviews the Department has undertaken on primary and secondary legislation in each of the last five years is not held within the Department.”

The Government do not even know how many pieces of primary or secondary legislation they have reviewed. They cannot tell us that all of them have been reviewed. Presumably, if they could tell us that all of them have been reviewed, the answer to my written question would have been, “All of them.” I have a list of the number they sponsored. It was six in 2021, for example. If the Department had reviewed the implementation of all those pieces of legislation, I would expect it to be shouting that from the rooftops in response to a written question. It should be saying, “Yes, we are wonderful. We have reviewed all these and found that most of them are working exactly as we intended them to.”

I do not have faith in the Government or in DCMS—nor pretty much in any Government Department. I do not have faith in their ability or intention to adequately and effectively review the implementation of this legislation, to ensure that the review is done timeously and sent to the Digital, Culture, Media and Sport Committee, or to ensure those proper processes that are supposed to be in place are actually in place and that the Bill is working.

It is unfortunate for the Minister that he sent me that reply earlier in the year, but I only asked the question because I was aware of the significant lack of work the Government are doing to review whether legislation has achieved its desired effect, including whether it has cost the amount of money they said it would, whether it has kept the number of people safe that they said it would, and whether it has done what it needs to do.

I have a lack of faith in the Government generally, but specifically on this issue because of the shifting nature of the internet. This is not to take away from the DCMS Committee, but I have sat on a number of Select Committees and know that they are very busy—they have a huge amount of things to scrutinise. This would not stop them scrutinising this Act and taking action to look at whether it is working. It would give an additional line of scrutiny, transparency and defence, in order to ensure that this world-leading legislation is actually world-leading and keeps people safe in the way it is intended to.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is an honour to support the new clause moved by the hon. Member for Aberdeen North. This was a recommendation from the Joint Committee report, and we believe it is important, given the sheer complexity of the Bill. The Minister will not be alarmed to hear that I am all in favour of increasing the scrutiny and transparency of this legislation.

Having proudly served on the DCMS Committee, I know it does some excellent work on a very broad range of policy areas, as has been highlighted. It is important to acknowledge that there will of course be cross-over, but ultimately we support the new clause. Given my very fond memories of serving on the Select Committee, I want to put on the record my support for it. My support for this new clause is not meant as any disrespect to that Committee. It is genuinely extremely effective in scrutinising the Government and holding them to account, and I know it will continue to do that in relation to both this Bill and other aspects of DCMS. The need for transparency, openness and scrutiny of this Bill is fundamental if it is truly to be world-leading, which is why we support the new clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am grateful for the opportunity to discuss this issue once again. I want to put on the record my thanks to the Joint Committee, which the hon. Member for Ochil and South Perthshire sat on, for doing such fantastic work in scrutinising the draft legislation. As a result of its work, no fewer than 66 changes were made to the Bill, so it was very effective.

I want to make one or two observations about scrutinising the legislation following the passage of the Bill. First, there is the standard review mechanism in clause 149, on pages 125 and 126, which provides for a statutory review not before two years and not after five years of the Bill receiving Royal Assent.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

On that review function, it would help if the Minister could explain a bit more why it was decided to do that as a one-off, and not on a rolling two-year basis, for example.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

That is a fairly standard clause in legislation. Clearly, for most legislation and most areas of Government activity, the relevant departmental Select Committee would be expected to provide the ongoing scrutiny, so ordinarily the DCMS Committee would do that. I hear the shadow Minister’s comments: she said that this proposal is not designed in any way to impugn or disrespect that Committee, but I listened to the comments of the Chair of that Committee on Second Reading, and I am not sure he entirely shares that view—he expressed himself in quite forthright terms.

On the proposal, we understand that the Joint Committee did valuable work. This is an unusual piece of legislation, in that it is completely groundbreaking. It is unlike any other, so the case for having a particular Committee look at it may have some merit. I am not in a position to give a definitive Government response to that because the matter is still under consideration, but if we were to establish a special Committee to look at a single piece of legislation, there are two ways to do it. It could either be done in statute, as the new clause seeks, or it could be done by Standing Orders.

Generally speaking, it is the practice of the House to establish Committees by Standing Orders of the House rather than by statute. In fact, I think the only current Committee of the House established by statute—Ms Rees, you will correct me if I am wrong, as you are more of an expert on these matters than me—is the Intelligence and Security Committee, which was established by the Intelligence Services Act 1994. That is obviously very unusual, because it has special powers. It looks into material that would ordinarily be classified as secret, and it has access to the intelligence services. It is a rather unusual Committee that has to be granted special powers because it looks into intelligence and security matters. Clearly, those considerations do not apply here. Were a particular Committee to be established, the right way of doing that would not be in statute, as the new clause proposes, but via the Standing Orders of the House, if that is something that Parliament wants to do.

Dean Russell Portrait Dean Russell (Watford) (Con)
- Hansard - - - Excerpts

As another member of the Joint Committee, I totally understand the reasoning. I want to put on record my support for setting up a Committee through the approach the Minister mentioned of using Standing Orders. I will not support the new clause, but I strongly support the Joint Committee continuing in some form to enable scrutiny. When we look forward to the metaverse, virtual reality and all the things that are coming, it is important that that scrutiny continues. No offence to Opposition colleagues, but I do not think the new clause is the right way to do that. However, the subject is worth further exploration, and I would be very supportive of that happening.

15:00
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

First, let me also put on record my thanks to my hon. Friend for his service on the Joint Committee. He did a fantastic job and, as I said, the Committee’s recommendations have been powerfully heard. I thank him for his acknowledgment that if one were to do this, the right way to do it would be through Standing Orders. I have heard the point he made in support of some sort of ongoing special committee. As I say, the Government have not reached a view on this, but if one were to do that, I agree with my hon. Friend that Standing Orders would be the right mechanism.

One of the reasons for that can be found in the way the new clause has been drafted. Subsections (5) and (6) say:

“The membership and Chair of the Committee shall be appointed by regulations made by the Secretary of State…the tenure of office of members of, the procedure of and other matters…shall be set out in regulations made by the Secretary of State.”

I know those regulations are then subject to approval by a resolution of the House, but given the reservations expressed by Opposition Members about powers for the Secretary of State over the last eight sitting days, it is surprising to see the new clause handing the Secretary of State—in the form of a regulation-making power—the power to form the Committee.

That underlines why doing this through Standing Orders, so that the matter is in the hands of the whole House, is the right way to proceed, if that is something we collectively wish to do. For that reason, we will not support the new clause. Obviously, we will get back to the House in due course once thinking has been done about potential Committees, but that can be done as a separate process to the legislation. In any case, post-legislative scrutiny will not be needed until the regime is up and running, which will be after Royal Assent, so that does not have enormous time pressure on it.

A comment was made about future-proofing the Bill and making sure it stays up to date. There is a lot in that, and we need to make sure we keep up to date with changing technologies, but the Bill is designed to be tech agnostic, so if there is change in technology, that is accommodated by the Bill because the duties are not specific to any given technology. A good example is the metaverse. That was not conceived or invented prior to the Bill being drafted; none the less, it is captured by the Bill. The architecture of the Bill, relying on codes of practice produced by Ofcom, is designed to ensure flexibility so that the codes of practice can be kept up to date. I just wanted to make those two points in passing, as the issue was raised by the hon. Member for Aberdeen North.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The reason the new clause is drafted in that way is because I wanted to recognise the work of the Joint Committee and to take on board its recommendations. If it had been entirely my drafting, the House of Lords would certainly not have been involved, given that I am not the biggest fan of the House of Lords, as its Members are not elected. However, the decision was made to submit the new clause as drafted.

The Minister has said that the Government have not come to a settled view yet, which I am taking as the Minister not saying no. He is not standing up and saying, “No, we will definitely not have a Standing Committee.” I am not suggesting he is saying yes, but given that he is not saying no, I am happy to withdraw the new clause. If the Minister is keen to come forward at a future stage with suggestions for changes to Standing Orders, which I understand have to be introduced by the Leader of the House or the Cabinet Office, then they would be gladly heard on this side of the House. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 38

Adults’ risk assessment duties

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM makes any significant change to a risk profile that relates to services of the kind in question.

(3) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adults’ risk assessment relating to the impacts of that proposed change.

(4) A duty to make and keep a written record, in an easily understandable form, of every risk assessment under subsections (2) and (3).

(5) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults (with each kind separately assessed).

(6) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of harm to adults presented by different kinds of priority content that is harmful to adults;

(d) the level of risk of harm to adults presented by priority content that is harmful to adults which particularly affects individuals with a certain characteristic or members of a certain group;

(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults from the matters identified in accordance with paragraphs (b) to (f);

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(7) In this section references to risk profiles are to the risk profiles for the time being published under section 83 which relate to the risk of harm to adults presented by priority content that is harmful to adults.

(8) The provisions of Schedule 3 apply to any assessment carried out under this section in the same way they apply to any relating to a Part 3 service.”—(John Nicolson.)

This new clause applies adults’ risk assessment duties to pornographic sites.

Brought up, and read the First time.

John Nicolson Portrait John Nicolson (Ochil and South Perthshire) (SNP)
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

New clause 39—Safety duties protecting adults—

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to summarise in the terms of service the findings of the most recent adults’ risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults).

(3) A duty to include provisions in the terms of service specifying, in relation to each kind of priority content that is harmful to adults that is to be treated in a way described in subsection (3), which of those kinds of treatment is to be applied.

(4) These are the kinds of treatment of content referred to in subsection (3)—

(a) taking down the content;

(b) restricting users’ access to the content.

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults (as identified in the most recent adults’ risk assessment of the service), by reference to—

(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently in relation to content which the provider reasonably considers is priority content that is harmful to adults or a particular kind of priority content that is harmful to adults.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) In this section—

“adults’ risk assessment” has the meaning given by section 12;

“non-designated content that is harmful to adults” means content that is harmful to adults other than priority content that is harmful to adults.”

This new clause applies safety duties protecting adults to regulated provider pornographic content.

New clause 40—Duties to prevent users from encountering illegal content—

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to operate an internet service using proportionate systems and processes designed to—

(a) prevent individuals from encountering priority illegal content that amounts to an offence in either Schedule 6 or paragraphs 17 and 18 of Schedule 7 by means of the service;

(b) minimise the length of time for which the priority illegal content referred to in subsection (a) is present;

(c) where the provider is alerted by a person to the presence of the illegal content referred to in subsection (a), or becomes aware of it in any other way, swiftly take down such content.

(3) A duty to operate systems and processes that—

(a) verify the identity and age of all persons depicted in the content;

(b) obtain and keep on record written consent from all persons depicted in the content;

(c) only permit content uploads from verified content providers and must have a robust process for verifying the age and identity of the content provider;

(d) all uploaded content must be reviewed before publication to ensure that the content is not illegal and does not otherwise violate its terms of service;

(e) uploaded content must not be marketed by content search terms that give the impression that the content contains child exploitation materials or the depiction of non-consensual activities;

(f) the service must offer the ability for any person depicted in the content to appeal to remove the content in question.”

This new clause applies duties to prevent users from encountering illegal content to regulated providers of pornographic content.

John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

Big porn, or the global online pornography industry, is a proven driver of big harms. It causes the spread of image-based sexual abuse and child sexual abuse material. It normalises sexual violence and harmful sexual attitudes and behaviours, and it offers children easy access to violent, sexist and racist sexual content, which is proven to cause them a whole range of harms. In part, the Government recognised how harmful pornography can be to children by building one small aspect of pornography regulation into the Bill.

The Bill is our best chance to regulate the online pornography industry, which it currently does not mention. Over two decades, the porn industry has shown itself not to be trustworthy about regulating itself. Vanessa Morse, the head of the Centre to End All Sexual Exploitation, said:

“If we fail to see the porn industry as it really is, efforts to regulate will flounder.”

If the Minister has not yet read CEASE’s “Expose Big Porn” report, I recommend that he does so. The report details some of the harrowing harms that are proliferated by porn companies. Importantly, these harms are being done with almost zero scrutiny. We all know who the head of Meta or the chief executive officer of Google is, but can the Minister tell me who is in charge of MindGeek? This company dominates the market, yet it is almost completely anonymous—or at least the high heid yins of the company are.

New clause 38 seeks to identify pornography websites as providers of category 1 services, introduce a relevant code of practice and designate a specific regulator, in order to ensure compliance. Big porn must be made to stop hosting illegal extreme porn and the legal but harmful content prohibited by its own terms of service. If anyone thought that social media platforms were indifferent to a harm taking place on their site, they pale in comparison with porn sites, which will do the absolute minimum that they can. To show the extent of the horrible searches allowed, one video found by CEASE was titled “Oriental slave girl tortured”. I will not read out some of the other titles in the report, but there are search terms that promote non-consensual activity, violence, incest and racial slurs. For example, “Ebony slave girl” is a permitted term. This is just one of the many examples of damaging content on porn sites, which are perpetuating horrific sexual practices that, sadly, are too often being viewed by children.

Over 80% of the UK public would support strict new porn laws. I really think there is an appetite among the public to introduce such laws. The UK Government must not pass up this opportunity to regulate big porn, which is long overdue.

Barbara Keeley

As we heard from the hon. Member for Ochil and South Perthshire, new clauses 38 to 40 would align the duties on pornographic content so that both user-to-user sites and published pornography sites are subject to robust duties that are relevant to the service. Charities have expressed concerns that many pornography sites might slip through the net because their content does not fall under the definition of “pornographic content” in clause 66. The new clauses aim to address that. They are based on the duties placed on category 1 services, but they recognise the unique harms that can be caused by pornographic content providers, some of which the hon. Member graphically described with the titles that he gave. The new clauses also contain some important new duties that are not currently in the Bill, including the transparency arrangements in new clause 39 and important safeguards in new clause 40.

The Opposition have argued time and again for publishing duties when it comes to risk assessments. New clause 39 would introduce a duty to summarise in the terms of service the findings of the most recent adult risk assessments of a service. That is an important step towards making risk assessments publicly accessible, although Labour’s preference would be for them to be published publicly and in full, as I argued in the debate on new clause 9, which addressed category 1 service risk assessments.

New clause 40 would introduce measures to prevent the upload of illegal content, such as by allowing content uploads only from verified content providers, and by requiring all uploaded content to be reviewed. If the latter duty were accepted, there would need to be proper training and support for any human content moderators. We have heard during previous debates about the awful circumstances of human content moderators. They are put under such pressure for that low-paid work, and we do not want to encourage that.

New clause 40 would also provide protections for those featured in such content, including the need for written consent and identity and age verification. Those are important safeguards that the Labour party supports. I hope the Minister will consider them.

Chris Philp

I thank the hon. Member for Ochil and South Perthshire for raising these issues with the Committee. It is important first to make it clear that websites providing user-to-user services are covered in part 3 of the Bill, under which they are obliged to protect children and prevent illegal content, including some forms of extreme pornography, from circulating. Such websites are also obliged to prevent children from accessing those services. For user-to-user sites, those matters are all comprehensively covered in part 3.

New clauses 38, 39 and 40 seek to widen the scope of part 5 of the Bill, which applies specifically to commercial pornography sites. Those are a different part of the market. Part 5 is designed to close a loophole in the original draft of the Bill that was identified by the Joint Committee, on which the hon. Member for Ochil and South Perthshire and my hon. Friend the Member for Watford served. Protecting children from pornographic content on commercial porn sites had been wrongly omitted from the original draft of the Bill. Part 5 of the Bill as currently drafted is designed to remedy that oversight. That is why the duties in part 5 are narrowly targeted at protecting children in the commercial part of the market.

A much wider range of duties is placed by part 3 on the user-to-user part of the pornography market. The user-to-user services covered by part 3 are likely to include the largest sites with the least control; as the content is user generated, there is no organising mind—whatever gets put up, gets put up. It is worth drawing the distinction between the services covered in part 3 and part 5 of the Bill.

In relation to part 5 services publishing their own material, Parliament can legislate, if it chooses to, to make some of that content illegal, as it has done in some areas—some forms of extreme pornography are illegal. If Parliament thinks that the line is drawn in the wrong place and needs to be moved, it can legislate to move that line as part of the general legislation in this area.

I emphasise most strongly that user-to-user sites, which are probably what the hon. Member for Ochil and South Perthshire was mostly referring to, are comprehensively covered by the duties in part 3. The purpose of part 5, which was a response to the Joint Committee’s report, is simply to stop children viewing such content. That is why the Bill has been constructed as it has.

Question put, That the clause be read a Second time.

Division 72

Ayes: 6


Labour: 4
Scottish National Party: 2

Noes: 9


Conservative: 9

15:15
New Clause 39
Safety duties protecting adults
“(1) This section sets out duties which apply in relation to internet services within section 67(2).
(2) A duty to summarise in the terms of service the findings of the most recent adults’ risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults).
(3) A duty to include provisions in the terms of service specifying, in relation to each kind of priority content that is harmful to adults that is to be treated in a way described in subsection (4), which of those kinds of treatment is to be applied.
(4) These are the kinds of treatment of content referred to in subsection (3)—
(a) taking down the content;
(b) restricting users’ access to the content.
(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults (as identified in the most recent adults’ risk assessment of the service), by reference to—
(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and
(b) any other provisions of the terms of service designed to mitigate or manage those risks.
(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—
(a) are clear and accessible, and
(b) are applied consistently in relation to content which the provider reasonably considers is priority content that is harmful to adults or a particular kind of priority content that is harmful to adults.
(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults present on the service, a duty to notify OFCOM of—
(a) the kinds of such content identified, and
(b) the incidence of those kinds of content on the service.
(8) In this section—
‘adults’ risk assessment’ has the meaning given by section 12;
‘non-designated content that is harmful to adults’ means content that is harmful to adults other than priority content that is harmful to adults.”—(John Nicolson.)
This new clause applies safety duties protecting adults to regulated provider pornographic content.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 73

Ayes: 6


Labour: 4
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 40
Duties to prevent users from encountering illegal content
“(1) This section sets out duties which apply in relation to internet services within section 67(2).
(2) A duty to operate an internet service using proportionate systems and processes designed to—
(a) prevent individuals from encountering priority illegal content that amounts to an offence in either Schedule 6 or paragraphs 17 and 18 of Schedule 7 by means of the service;
(b) minimise the length of time for which the priority illegal content referred to in subsection (a) is present;
(c) where the provider is alerted by a person to the presence of the illegal content referred to in subsection (a), or becomes aware of it in any other way, swiftly take down such content.
(3) A duty to operate systems and processes that—
(a) verify the identity and age of all persons depicted in the content;
(b) obtain and keep on record written consent from all persons depicted in the content;
(c) only permit content uploads from verified content providers and must have a robust process for verifying the age and identity of the content provider;
(d) all uploaded content must be reviewed before publication to ensure that the content is not illegal and does not otherwise violate its terms of service;
(e) uploaded content must not be marketed by content search terms that give the impression that the content contains child exploitation materials or the depiction of non-consensual activities;
(f) the service must offer the ability for any person depicted in the content to appeal to remove the content in question.”—(John Nicolson.)
This new clause applies duties to prevent users from encountering illegal content to regulated providers of pornographic content.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 74

Ayes: 6


Labour: 4
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 41
Co-operation and disclosure of information: UK regulators
“(1) OFCOM may co-operate with a regulator established by statute or a recognised self-regulatory body in the United Kingdom, including by disclosing online safety information to that regulator, for the purposes of—
(a) tackling harm arising from illegal content, primary priority content harmful to children, priority content harmful to children, or priority content that is harmful to adults, or
(b) criminal investigations or proceedings relating to a matter to which the regulator’s functions relate.
(2) Where information is disclosed to a person in reliance on subsection (1), the person may not—
(a) use the information for a purpose other than the purpose for which it was disclosed, or
(b) further disclose the information, except with OFCOM’s consent (which may be general or specific) or in accordance with an order of a court or tribunal.
(3) A disclosure of information under subsection (1) does not breach—
(a) any obligation of confidence owed by the person making the disclosure, or
(b) any other restriction on the disclosure of information.”—(Alex Davies-Jones.)
This new clause would give Ofcom the power to co-operate with other regulators for the purposes of tackling harm from illegal content and criminal investigations and proceedings.
Brought up, and read the First time.
Alex Davies-Jones

I beg to move, That the clause be read a Second time.

The new clause would give Ofcom the power to co-operate with other regulators for the purposes of tackling harm from illegal content, and for criminal investigations and proceedings. The Minister will be aware that the vast range of human and business activity covered online presents a complex map of potential harms. Some harms will fall into or be adjacent to the purview of other regulators with domain-specific expertise. The relationship formalised through the Digital Regulation Cooperation Forum is well known. Indeed, Ofcom already has a working relationship with the Advertising Standards Authority and the Internet Watch Foundation, among others. Within this regulatory web, Ofcom will have the most relevant powers and expertise, so many regulators will look to it for help in tackling online safety issues. The Minister must recognise that public protection will most effectively be achieved through regulatory interlock. To protect people, Ofcom should be empowered to co-operate with others and to share information. The Bill should, therefore, as much as it can, enable Ofcom to work with other regulators and share online safety information with them.

Ofcom should also be able to bring the immense skills of other regulators into its work. The Bill gives Ofcom the general ability to co-operate with overseas regulators, but, with the exception of references to consulting the Information Commissioner’s Office when drawing up codes of practice and various items of guidance, the Bill is largely silent on co-operation with UK regulators.

The Communications Act 2003 limits the UK regulators with which Ofcom can share information—excluding the ICO, for instance—yet the Online Safety Bill takes a permissive approach to overseas regulators. The Bill should extend co-operation and information sharing in respect of online safety to include regulators overseeing the offences in schedule 7, the primary priority and priority harms to children, and the priority harms to adults.

Elsewhere in regulation, the Financial Conduct Authority has a general duty to co-operate. The same should apply here. Increasing safety through co-operation between relevant regulators is most easily achieved through our new clause, which will allow Ofcom to co-operate more widely. That is limited to co-operation in respect of harmful illegal content, harms to children and priority harms to adults. It is implicit that Ofcom will share information only with the regulators responsible for those precise matters. We have spoken frequently about the importance of co-operation, collaboration and consultation. This simple new clause would help to remedy the slight limitations placed on Ofcom in the Bill.

Ms Rees, with your permission, at this point—because this is likely to be my last contribution to the Bill Committee—[Interruption.] For shame. I place on record my sincere thanks to you and Sir Roger for chairing these Committee sittings, as well as all the Hansard staff, the Clerks, the Table Office, our civil servants, the Doorkeepers, the tech staff and broadcasting team who enable our proceedings to be broadcast to the public, and all members of the Committee for allowing great scrutiny of this legislation to take place. I look forward to continuing that scrutiny on Report.

The Chair

Thank you.

Barbara Keeley

I will take this opportunity, as my hon. Friend has done, to add a few words of thanks. She has already thanked all the people in this place who we should be thanking, including the Clerks, who have done a remarkable job over the course of our deliberations with advice, drafting, and support to the Chair. I also thank the stakeholder organisations. This Bill is uniquely one in which the stakeholders—the children’s charities and all those other organisations—have played an incredible part. I know from meetings that they have already advertised that those organisations will continue playing that part over the coming weeks, up until Report. It has been fantastic.

Finally, I will mention two people who have done a remarkable amount of work: my researcher Iona and my hon. Friend’s researcher Freddie, who have done a huge amount to help us prepare speaking notes. It is a big task, because this is a complex Bill. I add my thanks to you, Ms Rees, for the way you have chaired this Committee. Please thank Sir Roger on our behalf as well.

Kirsty Blackman

Seeing as we are not doing spurious points of order, I will also take the opportunity to express our thanks. The first one is to the Chairs: thank you very much, Ms Rees and Sir Roger, for the excellent work you have done in the Chair. This has been a very long Bill, and the fact that you have put up with us for so long has been very much appreciated.

I thank all the MPs on the Committee, particularly the Labour Front-Bench team and those who have been speaking for the Labour party. They have been very passionate and have tabled really helpful amendments—it has been very good to work with the Labour team on the amendments that we have put together, particularly the ones we have managed to agree on, which is the vast majority. We thank Matt Miller, who works for my hon. Friend the Member for Ochil and South Perthshire. He has been absolutely wonderful. He has done an outstanding amount of work on the Bill, and the amazing support that he has given us has been greatly appreciated. I also thank the Public Bill Office, especially for putting up with the many, many amendments we submitted, and for giving us a huge amount of advice on them.

Lastly, I thank the hundreds of organisations that got in touch with us, and the many people who took the time to scrutinise the Bill, raise their concerns, and bring those concerns to us. Of those hundreds of people and organisations, I particularly highlight the work of the National Society for the Prevention of Cruelty to Children. Its staff have been really helpful to work with, and I have very much appreciated their advice and support in drafting our amendments.

Chris Philp

I feel slightly out of place, but I will add some concluding remarks in a moment; I should probably first respond to the substance of the new clause. The power to co-operate with other regulators and share information is, of course, important, but I am pleased to confirm that it is already in the Bill—it is not the first time that I have said that, is it?

Clause 98 amends section 393(2)(a) of the Communications Act 2003. That allows Ofcom to disclose information and co-operate with other regulators. Our amendment will widen the scope of the provision to include carrying out the functions set out in the Bill.

The list of organisations with which Ofcom can share information includes a number of UK regulators—the Competition and Markets Authority, the Information Commissioner, the Financial Conduct Authority and the Payment Systems Regulator—but that list can be amended, via secondary legislation, if it becomes necessary to add further organisations. In the extremely unlikely event that anybody wants to look it up, that power is set out in subsections (3)(i) and (4)(c) of section 393 of the Communications Act 2003. As the power is already created by clause 98, I hope that we will not need to vote on new clause 41.

I echo the comments of the shadow Minister about the Digital Regulation Cooperation Forum. It is a non-statutory body, but it is extremely important that regulators in the digital arena co-operate with one another and co-ordinate their activities. I am sure that we all strongly encourage the relevant regulators to work with the DRCF and to co-operate in this and adjacent fields.

I will bring my remarks to a close with one or two words of thanks. Let me start by thanking Committee members for their patience and dedication over the nine days we have been sitting—50-odd hours in total. I think it is fair to say that we have given the Bill thorough consideration, and of course there is more to come on Report, and that is before we even get to the House of Lords. This is the sixth Bill that I have taken through Committee as Minister, and it is by far the most complicated and comprehensive, running to 194 clauses and 15 schedules, across 213 pages. It has certainly been a labour. Given its complexity, the level of scrutiny it has received has been impressive—sometimes onerous, from my point of view.

The prize for the most perceptive observation during our proceedings definitely goes to the hon. Member for Aberdeen North, who noticed an inconsistency between use of the word “aural” in clause 49 and “oral” in clause 189, about 120 pages later.

I certainly thank our fantastic Chairs, Sir Roger Gale and Ms Rees, who have chaired our proceedings magnificently and kept us in order, and even allowed us to finish a little early, so huge thanks to them. I also thank the Committee Clerks for running everything so smoothly and efficiently, the Hansard reporters for deciphering our sometimes near-indecipherable utterances, and the Officers of the House for keeping our sittings running smoothly and safely.

I also thank all those stakeholders who have offered us their opinions; I suspect that they will continue to do so during the rest of the passage of the Bill. Their engagement has been important and very welcome. It has really brought external views into Parliament, which is really important.

I conclude by thanking the people who have been working on the Bill the longest and hardest: the civil servants in the Department for Digital, Culture, Media and Sport. Some members of the team have been working on the Bill in its various forms, including White Papers and so on, for as long as five years. The Bill has had a long gestation. Over the last few months, as we have been updating the Bill, rushing to introduce it, and perhaps even preparing some amendments for Report, they have been working incredibly hard, so I give a huge thanks to Sarah Connolly and the whole team at DCMS for all their incredible work.

Finally, as we look forward to Report, which is coming up shortly, we are listening, and no doubt flexibility will be exhibited in response to some of the points that have been raised. I look forward to working with members of the Committee and Members of the House more widely as we seek to make the Bill as good as it can be. On that note, I will sit down for the last time.

The Chair

Before I ask Alex Davies-Jones whether she wishes to press the new clause to a vote, I thank you all for the very respectful way in which you have conducted proceedings. It is much appreciated. Let me say on behalf of Sir Roger and myself that it has been an absolute privilege to co-chair this Bill Committee.

Dame Maria Miller (Basingstoke) (Con)

On a point of order, Ms Rees. On behalf of the Back Benchers, I thank you and Sir Roger for your excellent chairpersonships, and the Minister and shadow Ministers for the very courteous way in which proceedings have taken place. It has been a great pleasure to be a member of the Bill Committee.

Alex Davies-Jones

I am content with the Minister’s assurance that the provisions of new clause 41 are covered in the Bill, and therefore do not wish to press it to a vote. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Schedule 2

Recovery of OFCOM’s initial costs

Recovery of initial costs

1 (1) This Schedule concerns the recovery by OFCOM of an amount equal to the aggregate of the amounts of WTA receipts which, in accordance with section 401(1) of the Communications Act and OFCOM’s statement under that section, are retained by OFCOM for the purpose of meeting their initial costs.

(2) OFCOM must seek to recover the amount described in sub-paragraph (1) (“the total amount of OFCOM’s initial costs”) by charging providers of regulated services fees under this Schedule (“additional fees”).

(3) In this Schedule—

“initial costs” means the costs incurred by OFCOM before the day on which section 75 comes into force on preparations for the exercise of their online safety functions;

“WTA receipts” means the amounts described in section 401(1)(a) of the Communications Act which are paid to OFCOM (certain receipts under the Wireless Telegraphy Act 2006).

Recovery of initial costs: first phase

2 (1) The first phase of OFCOM’s recovery of their initial costs is to take place over a period of several charging years to be specified in regulations under paragraph 7 (“specified charging years”).

(2) Over that period OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the total amount of OFCOM’s initial costs.

(3) OFCOM may not charge providers additional fees in respect of any charging year which falls before the first specified charging year.

(4) OFCOM may require a provider to pay an additional fee in respect of a charging year only if the provider is required to pay a fee in respect of that year under section 71 (and references in this Schedule to charging providers are to be read accordingly).

(5) The amount of an additional fee payable by a provider is to be calculated in accordance with regulations under paragraph 7.

Further recovery of initial costs

3 (1) The second phase of OFCOM’s recovery of their initial costs begins after the end of the last of the specified charging years.

(2) As soon as reasonably practicable after the end of the last of the specified charging years, OFCOM must publish a statement specifying—

(a) the amount which is at that time the recoverable amount (see paragraph 6), and

(b) the amounts of the variables involved in the calculation of the recoverable amount.

(3) OFCOM’s statement must also specify the amount which is equal to that portion of the recoverable amount which is not likely to be paid or recovered. The amount so specified is referred to in sub-paragraphs (4) and (5) as “the outstanding amount”.

(4) Unless a determination is made as mentioned in sub-paragraph (5), OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the outstanding amount.

(5) The Secretary of State may, as soon as reasonably practicable after the publication of OFCOM’s statement, make a determination specifying an amount by which the outstanding amount is to be reduced, and in that case OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the difference between the outstanding amount and the amount specified in the determination.

(6) Additional fees mentioned in sub-paragraph (4) or (5) must be charged in respect of the charging year immediately following the last of the specified charging years (“year 1”).

(7) The process set out in sub-paragraphs (2) to (6) is to be repeated in successive charging years, applying those sub-paragraphs as if—

(a) in sub-paragraph (2), the reference to the end of the last of the specified charging years were to the end of year 1 (and so on for successive charging years);

(b) in sub-paragraph (6), the reference to year 1 were to the charging year immediately following year 1 (and so on for successive charging years).

(8) Any determination by the Secretary of State under this paragraph must be published in such manner as the Secretary of State considers appropriate.

(9) Sub-paragraphs (4) and (5) of paragraph 2 apply to the charging of additional fees under this paragraph as they apply to the charging of additional fees under that paragraph.

(10) The process set out in this paragraph comes to an end in accordance with paragraph 4.

End of the recovery process

4 (1) The process set out in paragraph 3 comes to an end if a statement by OFCOM under that paragraph records that—

(a) the recoverable amount is nil, or

(b) all of the recoverable amount is likely to be paid or recovered.

(2) Or the Secretary of State may bring that process to an end by making a determination that OFCOM are not to embark on another round of charging providers of regulated services additional fees.

(3) The earliest time when such a determination may be made is after the publication of OFCOM’s first statement under paragraph 3.

(4) A determination under sub-paragraph (2)—

(a) must be made as soon as reasonably practicable after the publication of a statement by OFCOM under paragraph 3;

(b) must be published in such manner as the Secretary of State considers appropriate.

(5) A determination under sub-paragraph (2) does not affect OFCOM’s power—

(a) to bring proceedings for the recovery of the whole or part of an additional fee for which a provider became liable at any time before the determination was made, or

(b) to act in accordance with the procedure set out in section 120 in relation to such a liability.

Providers for part of a year only

5 (1) For the purposes of this Schedule, the “provider” of a regulated service, in relation to a charging year, includes a person who is the provider of the service for part of the year.

(2) Where a person is the provider of a regulated service for part of a charging year only, OFCOM may refund all or part of an additional fee paid to OFCOM under paragraph 2 or 3 by that provider in respect of that year.

Calculation of the recoverable amount

6 For the purposes of a statement by OFCOM under paragraph 3, the “recoverable amount” is given by the formula—

C - (F - R) - D

where—

C is the total amount of OFCOM’s initial costs,

F is the aggregate amount of the additional fees received by OFCOM at the time of the statement in question,

R is the aggregate amount of the additional fees received by OFCOM that at the time of the statement in question have been, or are due to be, refunded (see paragraph 5(2)), and

D is the amount specified in a determination made by the Secretary of State under paragraph 3 (see paragraph 3(5)) at a time before the statement in question or, where more than one such determination has been made, the sum of the amounts specified in those determinations.

If no such determination has been made before the statement in question, D = 0.
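
By way of a purely illustrative calculation (the figures below are hypothetical and do not appear in the schedule): if OFCOM's initial costs were C = £100 million, additional fees of F = £60 million had been received, R = £5 million of those fees had been or were due to be refunded, and the Secretary of State had made a single determination of D = £10 million, the recoverable amount at the time of the statement would be

\[ C - (F - R) - D = 100 - (60 - 5) - 10 = \pounds 35\ \text{million}. \]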

Regulations about recovery of initial costs

7 (1) The Secretary of State must make regulations making such provision as the Secretary of State considers appropriate in connection with the recovery by OFCOM of their initial costs.

(2) The regulations must include provision as set out in sub-paragraphs (3), (4) and (6).

(3) The regulations must specify the total amount of OFCOM’s initial costs.

(4) For the purposes of paragraph 2, the regulations must specify—

(a) the charging years in respect of which additional fees are to be charged, and

(b) the proportion of the total amount of initial costs which OFCOM must seek to recover in each of the specified charging years.

(5) The following rules apply to provision made in accordance with sub-paragraph (4)(a)—

(a) the initial charging year may not be specified;

(b) only consecutive charging years may be specified;

(c) at least three charging years must be specified;

(d) no more than five charging years may be specified.

(6) The regulations must specify the computation model that OFCOM must use to calculate fees payable by individual providers of regulated services under paragraphs 2 and 3 (and that computation model may be different for different charging years).

(7) The regulations may make provision about what OFCOM may or must do if the operation of this Schedule results in them recovering more than the total amount of their initial costs.

(8) The regulations may amend this Schedule or provide for its application with modifications in particular cases.

(9) Before making regulations under this paragraph, the Secretary of State must consult—

(a) OFCOM,

(b) providers of regulated user-to-user services,

(c) providers of regulated search services,

(d) providers of internet services within section 67(2), and

(e) such other persons as the Secretary of State considers appropriate.

Interpretation

8 In this Schedule—

“additional fees” means fees chargeable under this Schedule in respect of the recovery of OFCOM’s initial costs;

“charging year” has the meaning given by section 76;

“initial charging year” has the meaning given by section 76;

“initial costs” has the meaning given by paragraph 1(3), and the “total amount” of initial costs means the amount described in paragraph 1(1);

“recoverable amount” has the meaning given by paragraph 6;

“specified charging year” means a charging year specified in regulations under paragraph 7 for the purposes of paragraph 2.” —(Chris Philp.)

This new Schedule requires Ofcom to seek to recover their costs which they have incurred (before clause 75 comes into force) when preparing to take on functions as the regulator of services under the Bill by charging fees to providers of services.

Brought up, read the First and Second time, and added to the Bill.

The Chair

New schedule 1 was tabled by Carla Lockhart, who is not on the Committee. Does any Member wish to move new schedule 1? No.

We now come to the final Question in the proceedings. The Committee has finished its work.

Bill, as amended, to be reported.

15:32
Committee rose.
Written evidence reported to the House
OSB89 Mental Health Foundation
OSB90 CEASE UK
OSB91 Amazon UK
OSB92 Demos (supplementary submission)
OSB93 Dave ‘Yardfish’
OSB94 Sam Guinness
OSB95 M. Jenny Edwards, Criminologist and international subject matter expert (SME), Chandler Edwards
OSB96 Domestic Abuse Commissioner
OSB97 The football authorities (Kick It Out, The FA, The Premier League, EFL, Women’s Super League, Women’s Championship, National League, Isthmian League, Southern League, Northern Premier League, Professional Footballers Association, League Managers’ Association, Professional Game Match Officials, and Women in Football) (joint submission)
OSB98 Suzy Lamplugh Trust
OSB99 Liberty
OSB100 Ibrahim Chaudry

Online Safety Bill

Day 1
Consideration of Bill, as amended in the Public Bill Committee
[Relevant Documents: Report of the Joint Committee on the Draft Online Safety Bill, Session 2021-22: Draft Online Safety Bill, HC 609, and the Government Response, CP 640; Letter from the Minister for Tech and the Digital Economy to the Chair of the Joint Committee on Human Rights relating to the Online Safety Bill, dated 16 June 2022; Letter from the Chair of the Joint Committee on Human Rights to the Secretary of State for Digital, Culture, Media and Sport relating to the Online Safety Bill, dated 19 May 2022; First Report of the Digital, Culture, Media and Sport Committee, Amending the Online Safety Bill, HC 271]
New Clause 19
Duties to protect news publisher content
“(1) This section sets out the duties to protect news publisher content which apply in relation to Category 1 services.
(2) Subject to subsections (4), (5) and (8), a duty, in relation to a service, to take the steps set out in subsection (3) before—
(a) taking action in relation to content present on the service that is news publisher content, or
(b) taking action against a user who is a recognised news publisher.
(3) The steps referred to in subsection (2) are—
(a) to give the recognised news publisher in question a notification which—
(i) specifies the action that the provider is considering taking,
(ii) gives reasons for that proposed action by reference to each relevant provision of the terms of service,
(iii) where the proposed action relates to news publisher content that is also journalistic content, explains how the provider took the importance of the free expression of journalistic content into account when deciding on the proposed action, and
(iv) specifies a reasonable period within which the recognised news publisher may make representations,
(b) to consider any representations that are made, and
(c) to notify the recognised news publisher of the decision and the reasons for it (addressing any representations made).
(4) If a provider of a service reasonably considers that the provider would incur criminal or civil liability in relation to news publisher content present on the service if it were not taken down swiftly, the provider may take down that content without having taken the steps set out in subsection (3).
(5) A provider of a service may also take down news publisher content present on the service without having taken the steps set out in subsection (3) if that content amounts to a relevant offence (see section 52 and also subsection (10) of this section).
(6) Subject to subsection (8), if a provider takes action in relation to news publisher content or against a recognised news publisher without having taken the steps set out in subsection (3), a duty to take the steps set out in subsection (7).
(7) The steps referred to in subsection (6) are—
(a) to swiftly notify the recognised news publisher in question of the action taken, giving the provider’s justification for not having first taken the steps set out in subsection (3),
(b) to specify a reasonable period within which the recognised news publisher may request that the action is reversed, and
(c) if a request is made as mentioned in paragraph (b)—
(i) to consider the request and whether the steps set out in subsection (3) should have been taken prior to the action being taken,
(ii) if the provider concludes that those steps should have been taken, to swiftly reverse the action, and
(iii) to notify the recognised news publisher of the decision and the reasons for it (addressing any reasons accompanying the request for reversal of the action).
(8) If a recognised news publisher has been banned from using a service (and the ban is still in force), the provider of the service may take action in relation to news publisher content present on the service which was generated or originally published or broadcast by the recognised news publisher without complying with the duties set out in this section.
(9) For the purposes of subsection (2)(a), a provider is not to be regarded as taking action in relation to news publisher content in the following circumstances—
(a) a provider takes action in relation to content which is not news publisher content, that action affects related news publisher content, the grounds for the action only relate to the content which is not news publisher content, and it is not technically feasible for the action only to relate to the content which is not news publisher content;
(b) a provider takes action against a user, and that action affects news publisher content that has been uploaded to or shared on the service by the user.
(10) Section (Providers’ judgements about the status of content) (providers’ judgements about the status of content) applies in relation to judgements by providers about whether news publisher content amounts to a relevant offence as it applies in relation to judgements about whether content is illegal content.
(11) OFCOM’s guidance under section (Guidance about illegal content judgements) (guidance about illegal content judgements) must include guidance about the matters dealt with in section (Providers’ judgements about the status of content) as that section applies by reason of subsection (10).
(12) Any provision of the terms of service has effect subject to this section.
(13) In this section—
(a) references to “news publisher content” are to content that is news publisher content in relation to the service in question;
(b) references to “taking action” in relation to content are to—
(i) taking down content,
(ii) restricting users’ access to content, or
(iii) taking other action in relation to content (for example, adding warning labels to content);
(c) references to “taking action” against a person are to giving a warning to a person, or suspending or banning a person from using a service, or in any way restricting a person’s ability to use a service.
(14) Taking any step set out in subsection (3) or (7) does not count as “taking action” for the purposes of this section.
(15) See—
section 16 for the meaning of “journalistic content”;
section 49 for the meaning of “news publisher content”;
section 50 for the meaning of “recognised news publisher”.”—(Damian Collins.)
Member’s explanatory statement
This new clause requires providers to notify a recognised news publisher and provide a right to make representations before taking action in relation to news publisher content or against the publisher (except in certain circumstances), and to notify a recognised news publisher after action is taken without that process being followed and provide an opportunity for the publisher to request that the action is reversed.
Brought up, and read the First time.
12:50
Mr Speaker

With this it will be convenient to discuss the following:

New clause 2—Secretary of State’s powers to suggest modifications to a code of practice

“(1) The Secretary of State may on receipt of a code write within one month of that day to OFCOM with reasoned, evidence-based suggestions for modifying the code.

(2) OFCOM shall have due regard to the Secretary of State’s letter and must reply to the Secretary of State within one month of receipt.

(3) The Secretary of State may only write to OFCOM twice under this section for each code.

(4) The Secretary of State and OFCOM shall publish their letters as soon as reasonably possible after transmission, having made any reasonable redactions for public safety and national security.

(5) If the draft of a code of practice contains modifications made following changes arising from correspondence under this section, the affirmative procedure applies.”

New clause 3—Priority illegal content: violence against women and girls

“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—

(a) constitutes,

(b) encourages, or

(c) promotes

violence against women and girls.

(2) ‘Violence against women and girls’ is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (‘the Istanbul Convention’).”

This new clause applies the provisions that apply to priority illegal content to content which constitutes, encourages or promotes violence against women and girls.

New clause 4—Duty about content advertising or facilitating prostitution: Category 1 and Category 2B services

“(1) A provider of a Category 1 or Category 2B service must operate the service so as to—

(a) prevent individuals from encountering content that advertises or facilitates prostitution;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.

(2) A provider of a Category 1 or Category 2B service must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) If a person is the provider of more than one Category 1 or Category 2B service, the duties set out in this section apply in relation to each such service.

(4) The duties set out in this section extend only to the design, operation and use of a Category 1 or Category 2B service in the United Kingdom.

(5) For the meaning of ‘Category 1 service’ and ‘Category 2B service’, see section 81 (register of categories of services).

(6) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 5—Duty about content advertising or facilitating prostitution: Category 2A services

“(1) A provider of a Category 2A service must operate that service so as to minimise the risk of individuals encountering content which advertises or facilitates prostitution in or via search results of the service.

(2) A provider of a Category 2A service must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) The reference to encountering content which advertises or facilitates prostitution “in or via search results” of a search service does not include a reference to encountering such content as a result of any subsequent interactions with an internet service other than the search service.

(4) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend only to the design, operation and use of a Category 2A service in the United Kingdom.

(6) For the meaning of ‘Category 2A service’, see section 81 (register of categories of services).

(7) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 6—Duty about content advertising or facilitating prostitution: internet services providing pornographic content

“(1) A provider of an internet service within the scope of section 67 of this Act must operate that service so as to—

(a) prevent individuals from encountering content that advertises or facilitates prostitution;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.

(2) A provider of an internet service under this section must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) If a person is the provider of more than one internet service under this section, the duties set out in this section apply in relation to each such service.

(4) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 8—Duties about advertisements for cosmetic procedures

“(1) A provider of a regulated service must operate the service using systems and processes designed to—

(a) prevent individuals from encountering advertisements for cosmetic procedures that do not meet the conditions specified in subsection (3);

(b) minimise the length of time for which any such advertisement is present;

(c) where the provider is alerted by a person to the presence of such an advertisement, or becomes aware of it in any other way, swiftly take it down.

(2) A provider of a regulated service must include clear and accessible provisions in the terms of service giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) The conditions under subsection (1)(a) are that the advertisement—

(a) contains a disclaimer as to the health risks of the cosmetic procedure, and

(b) includes a certified service quality indicator.

(4) If a person is the provider of more than one regulated service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend only to the design, operation and use of a regulated service in the United Kingdom.

(6) For the meaning of ‘regulated service’, see section 3 (‘Regulated service’, ‘Part 3 service’ etc).”

This new clause would place a duty on all internet service providers regulated by the Bill to prevent individuals from encountering adverts for cosmetic procedures that do not contain a disclaimer as to the health risks of the procedure nor include a certified service quality indicator.

New clause 9—Content harmful to adults risk assessment duties: regulated search services

“(1) This section sets out the duties about risk assessments which apply in relation to all regulated search services.

(2) A duty to carry out a suitable and sufficient priority adults risk assessment at a time set out in, or as provided by Schedule 3.

(3) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adult risk assessment relating to the impacts of that proposed change.

(5) An ‘adults risk assessment’ of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the level of risk of individuals who are users of the service encountering each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) risks presented by algorithms used by the service, and the way that the service indexes, organises and presents search results;

(b) the level of risk of functionalities of the service facilitating individuals encountering search content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(c) the nature, and severity, of the harm that might be suffered by individuals from the matters identified in accordance with paragraphs (a) and (b);

(d) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section, references to risk profiles are to the risk profiles for the time being published under section 84 which relate to the risk of harm to adults presented by priority content that is harmful to adults.

(7) See also—section 20(2) (records of risk assessments), and Schedule 3 (timing of providers’ assessments).”

New clause 10—Safety Duties Protecting Adults: regulated search services

“(1) This section sets out the duties about protecting adults which apply in relation to all regulated search services.

(2) A duty to summarise in the policies of the search service the findings of the most recent adults’ risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults).

(3) A duty to include provisions in the search service policies specifying, in relation to each kind of priority content that is harmful to adults that is to be treated in a way described in subsection (4), which of those kinds of treatment is to be applied.

(4) The duties set out in subsections (2) and (3) apply across all areas of a service, including the way the search engine is operated and used as well as search content of the service, and (among other things) require the provider of a service to take or use measures in the following areas, if it is proportionate to do so—

(a) regulatory compliance and risk management arrangements,

(b) design of functionalities, algorithms and other features relating to the search engine,

(c) functionalities allowing users to control the content they encounter in search results,

(d) content prioritisation and ranking,

(e) user support measures, and

(f) staff policies and practices.

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults (as identified in the most recent adults’ risk assessment of the service), by reference to—

(a) any provisions of the policies included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the policies in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently in relation to content which the provider reasonably considers is priority content that is harmful to adults or a particular kind of priority content that is harmful to adults.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) A duty to ensure that the provisions of the publicly available statement referred to in subsections (5) and (7) are clear and accessible.

(9) In this section—

‘adults’ risk assessment’ has the meaning given by section 12;

‘non-designated content that is harmful to adults’ means content that is harmful to adults other than priority content that is harmful to adults.”

New clause 18—Child user empowerment duties

“(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section ‘non-verified user’ means a user who has not verified their identity to the provider of a service (see section 57(1)).

(12) In this section references to features include references to functionalities and settings.”

New clause 24—Category 1 services: duty not to discriminate, harass or victimise against service users

“(1) The following duties apply to all providers of Category 1 services.

(2) A duty not to discriminate, on the grounds of a protected characteristic, against a person wishing to use the service by not providing the service, if the result of not providing the service is to cause harm to that person.

(3) A duty not to discriminate, on the grounds of a protected characteristic, against any user of the service in a way that causes harm to the user—

(a) as to the terms on which the provider provides the service to the user;

(b) by terminating the provision of the service to the user;

(c) by subjecting the user to any other harm.

(4) A duty not to harass, on the grounds of a protected characteristic, a user of the service in a way that causes harm to the user.

(5) A duty not to victimise because of a protected characteristic a person wishing to use the service by not providing the user with the service, if the result of not providing the service is to cause harm to that person.

(6) A duty not to victimise a service user—

(a) as to the terms on which the provider provides the service to the user;

(b) by terminating the provision of the service to the user;

(c) by subjecting the user to any other harm.

(7) In this section—

references to harassing, discriminating or victimising have the same meaning as set out in Part 2 of the Equality Act 2010;

‘protected characteristic’ means a characteristic listed in section 4 of the Equality Act 2010.”

This new clause would place a duty, regulated by Ofcom, on Category 1 service providers not to discriminate, harass or victimise users of their services on the basis of a protected characteristic if doing so would result in them being caused harm. Discrimination, harassment and victimisation, and protected characteristics, have the same meaning as in the Equality Act 2010.

New clause 25—Report on duties that apply to all internet services likely to be accessed by children

“(1) Within 12 months of this Act receiving Royal Assent, the Secretary of State must commission an independent evaluation of the matters under subsection (2) and must lay the report of the evaluation before Parliament.

(2) The evaluation under subsection (1) must consider whether the following duties should be imposed on all providers of services on the internet that are likely to be accessed by children, other than services regulated by this Act—

(a) duties similar to those imposed on regulated services by sections 10 and 25 of this Act to carry out a children’s risk assessment, and

(b) duties similar to those imposed on regulated services by sections 11 and 26 of this Act to protect children’s online safety.”

This new clause would require the Secretary of State to commission an independent evaluation on whether all providers of internet services likely to be accessed by children should be subject to child safety duties and must conduct a children’s risk assessment.

New clause 26—Safety by design

“(1) In exercising their functions under this Act—

(a) The Secretary of State, and

(b) OFCOM

must have due regard to the principles in subsections (2)-(3).

(2) The first principle is that providers of regulated services should design those services to prevent harmful content from being disseminated widely, and that this is preferable in the first instance to both—

(a) removing harmful content after it has already been disseminated widely, and

(b) restricting which users can access the service or part of it on the basis that harmful content is likely to disseminate widely on that service.

(3) The second principle is that providers of regulated services should safeguard freedom of expression and participation, including the freedom of expression and participation of children.”

This new clause requires the Secretary of State and Ofcom to have due regard to the principle that internet services should be safe by design.

New clause 27—Publication of risk assessments

“Whenever a Category 1 service carries out any risk assessment pursuant to Part 3 of this Act, the service must publish the risk assessment on the service’s website.”

New clause 38—End-to-end encryption

“Nothing in this Act shall prevent providers of user-to-user services protecting their users’ privacy through end-to-end encryption.”

Government amendment 57.

Amendment 202, in clause 6, page 5, line 11, at end insert—

“(ba) the duty about pornographic content set out in Schedule [Additional duties on pornographic content].”

This amendment ensures that user-to-user services must meet the new duties set out in NS1.

Government amendments 163, 58, 59 and 60.

Amendment 17, in clause 8, page 7, line 14, at end insert—

“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—

(i) enable users to encounter illegal content on other regulated user-to-user services, and

(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”

This amendment would incorporate into the duties a requirement to consider cross-platform risk.

Amendment 15, in clause 8, page 7, line 14, at end insert—

“(5A) The duties set out in this section apply in respect of content which reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

This amendment extends the illegal content risk assessment duties to cover content which could be foreseen to facilitate or aid the discovery or dissemination of CSEA content.

Government amendments 61 and 62.

Amendment 18, page 7, line 30 [Clause 9], at end insert—

“(none) ‘, including by being directed while on the service towards priority illegal content hosted by a different service;’

This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.

Amendment 16, in clause 9, page 7, line 35, at end insert—

“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”

This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.

Amendment 19, in clause 9, page 7, line 35, at end insert—

“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content.”

This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.

Government amendments 63 to 67.

Amendment 190, page 10, line 11, in clause 11, at end insert “, and—

(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”

This amendment requires services to take or use proportionate measures to mitigate the harm to children caused by habit-forming features of a service.

Government amendments 68 and 69.

Amendment 42, page 11, line 16, in clause 11, at end insert—

“(c) the benefits of the service to children’s well-being.”

Amendment 151, page 12, line 43, leave out Clause 13.

This amendment seeks to remove Clause 13 from the Bill.

Government amendment 70.

Amendment 48, page 13, line 5, in clause 13, leave out “is to be treated” and insert

“the provider decides to treat”

This amendment would mean that providers would be free to decide how to treat content that has been designated ‘legal but harmful’ to adults.

Amendment 49, page 13, line 11, in clause 13, at end insert—

‘(ca) taking no action;”

This amendment provides that providers would be free to take no action in response to content referred to in subsection (3).

Government amendments 71 and 72.

Amendment 157, page 14, line 11, in clause 14, leave out subsections (6) and (7).

This amendment is consequential to Amendment 156, which would require all users of Category 1 services to be verified.

Government amendments 73, 164, 74 and 165.

Amendment 10, page 16, line 16, in clause 16, leave out from “or” until the end of line 17.

Government amendments 166 and 167.

Amendment 50, page 20, line 21, in clause 19, at end insert—

“(6A) A duty to include clear provision in the terms of service that the provider will not take down, or restrict access to content generated, uploaded or shared by a user save where it reasonably concludes that—

(a) the provider is required to do so pursuant to the provisions of this Act, or

(b) it is otherwise reasonable and proportionate to do so.”

This amendment sets out a duty for providers to include in terms of service a commitment not to take down or restrict access to content generated, uploaded or shared by a user except in particular circumstances.

Government amendment 168.

Amendment 51, page 20, line 37, in clause 19, at end insert—

“(10) In any claim for breach of contract brought in relation to the provisions referred to in subsection (7), where the breach is established, the court may make such award by way of compensation as it considers appropriate for the removal of, or restriction of access to, the content in question.”

This amendment means that where a claim is made for a breach of the terms of service resulting from Amendment 50, the court has the power to award compensation as it considers appropriate.

Government amendment 169.

Amendment 47, page 22, line 10, in clause 21, at end insert—

“(ba) the duties about adults’ risk assessment duties in section (Content harmful to adult risk assessment duties: regulated search services),

(bb) the safety duties protecting adults in section (Safety duties protecting adults: regulated search services).”

Government amendments 75 to 82.

Amendment 162, page 31, line 19, in clause 31, leave out “significant”

This amendment removes the requirement for there to be a “significant” number of child users, and replaces it with “a number” of child users.

Government amendments 85 to 87.

Amendment 192, page 36, line 31, in clause 37, at end insert—

“(ha) persons whom OFCOM consider to have expertise in matters relating to the Equality Act 2010,”

This amendment requires Ofcom to consult people with expertise on the Equality Act 2010 about codes of practice.

Amendment 44, page 37, line 25, in clause 39, leave out from beginning to the second “the” in line 26.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 45, page 38, line 8, leave out Clause 40.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 13, page 38, line 12, in clause 40, leave out paragraph (a).

Amendment 46, page 39, line 30, leave out Clause 41.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 14, page 39, line 33, in clause 41, leave out subsection (2).

Amendment 21, page 40, line 29, in clause 43, leave out “may require” and insert “may make representations to”

Amendment 22, page 40, line 33, in clause 43, at end insert—

‘(2A) OFCOM must have due regard to representations by the Secretary of State under subsection (2).”

Government amendments 88 to 89 and 170 to 172.

Amendment 161, page 45, line 23, in clause 49, leave out paragraph (d).

This amendment removes the exemption for one-to-one live aural communications.

Amendment 188, page 45, line 24, in clause 49, leave out paragraph (e).

This amendment removes the exemption for comments and reviews on provider content.

Government amendments 90 and 173.

Amendment 197, page 47, line 12, in clause 50, after “material” insert

“or special interest news material”.

Amendment 11, page 47, line 19, in clause 50, after “has” insert “suitable and sufficient”.

Amendment 198, page 47, line 37, in clause 50, leave out the first “is” and insert

“and special interest news material are”.

Amendment 199, page 48, line 3, in clause 50, at end insert—

““special interest news material” means material consisting of news or information about a particular pastime, hobby, trade, business, industry or profession.”

Amendment 12, page 48, line 7, in clause 50, after “a” insert “suitable and sufficient”.

Government amendments 91 to 94.

Amendment 52, page 49, line 13, in clause 52, leave out paragraph (d).

This amendment limits the list of relevant offences to those specifically specified.

Government amendments 95 to 100.

Amendment 20, page 51, line 3, in clause 54, at end insert—

‘(2A) Priority content designated under subsection (2) must include—

(a) content that contains public health related misinformation or disinformation, and

(b) misinformation or disinformation that is promulgated by a foreign state.”

This amendment would require the Secretary of State’s designation of “priority content that is harmful to adults” to include public health-related misinformation or disinformation, and misinformation or disinformation spread by a foreign state.

Amendment 53, page 51, line 47, in clause 55, after “State” insert “reasonably”.

This amendment, together with Amendment 54, would mean that the Secretary of State must reasonably consider the risk of harm to each one of an appreciable number of adults before specifying a description of the content.

Amendment 54, page 52, line 1, in clause 55, after “to” insert “each of”.

This amendment is linked to Amendment 53.

Amendment 55, page 52, line 12, in clause 55, after “OFCOM” insert

“, Parliament and members of the public in a manner the Secretary of State considers appropriate”.

This amendment requires the Secretary of State to consult Parliament and the public, as well as Ofcom, in a manner the Secretary of State considers appropriate before making regulations about harmful content.

Government amendments 147 to 149.

Amendment 43, page 177, line 23, in schedule 4, after “ages” insert

“, including the benefits of the service to their well-being,”

Amendment 196, page 180, line 9, in schedule 4, at end insert—

Amendment 187, page 186, line 32, in schedule 7, at end insert—

Human trafficking

22A An offence under section 2 of the Modern Slavery Act 2015.”

This amendment includes Human Trafficking as a priority offence.

Amendment 211, page 187, line 23, in schedule 7, at end insert—

Government new clause 14.

Government new clause 15.

Government amendments 83 to 84.

Amendment 156, page 53, line 7, in clause 57, leave out subsections (1) and (2) and insert—

‘(1) A provider of a Category 1 service must require all adult users of the service to verify their identity in order to access the service.

(2) The verification process—

(a) may be of any kind (and in particular, it need not require documentation to be provided),

(b) must—

(i) be carried out by a third party on behalf of the provider of the Category 1 service,

(ii) ensure that all anonymous users of the Category 1 service cannot be identified by other users, apart from where provided for by section (Duty to ensure anonymity of users).”

This amendment would require all users of Category 1 services to be verified. The verification process would have to be carried out by a third party and to ensure the anonymity of users.

Government amendment 101.

Amendment 193, page 58, line 33, in clause 65, at end insert—

“(ea) persons whom OFCOM consider to have expertise in matters relating to the Equality Act 2010,”

This amendment requires Ofcom to consult people with expertise on the Equality Act 2010 in respect of guidance about transparency reports.

Amendment 203, page 60, line 33, in clause 68, at end insert—

‘(2B) A duty to meet the conditions set out in Schedule [Additional duties on pornographic content].”

This amendment ensures that commercial pornographic websites must meet the new duties set out in NS1.

Government amendments 141, 177 to 184, 142 to 145, 185 to 186 and 146.

New schedule 1—Additional duties on pornographic content

“30 All user-to-user services and an internet service which provides regulated provider pornographic content must meet the following conditions for pornographic content and content that includes sexual photographs and films (“relevant content”).

The conditions are—

(a) the service must not contain any prohibited material,

(b) the service must review all relevant content before publication.

31 In this Schedule—

“photographs and films” has the same meaning as section 34 of the Criminal Justice and Courts Act 2015 (meaning of “disclose” and “photograph or film”)

“prohibited material” has the same meaning as section 368E(3) of the Communications Act 2003 (harmful material).”

The new schedule sets out additional duties for pornographic content which apply to user-to-user services under Part 3 and commercial pornographic websites under Part 5.

Government amendments 150 and 174.

Amendment 191, page 94, line 24, in clause 12, at end insert—

“Section [Category 1 services: duty not to discriminate against, harass or victimise service users] Duty not to discriminate against, harass or victimise

This amendment makes NC24 an enforceable requirement.

Government amendment 131.

Mr Speaker

I welcome the new Minister to the Dispatch Box.

Damian Collins

Thank you, Mr Speaker. I am honoured to have been appointed the Minister responsible for the Online Safety Bill. Having worked on these issues for a number of years, I am well aware of the urgency and importance of this legislation, in particular to protect children and tackle criminal activity online—that is why we are discussing this legislation.

Relative to the point of order from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), I have the greatest respect for him and his standing in this House, but it feels like we have been discussing this Bill for at least five years. We have had a Green Paper and a White Paper. We had a pre-legislative scrutiny process, which I was honoured to be asked to chair. We have had reports from the Digital, Culture, Media and Sport Committee and from other Select Committees and all-party parliamentary groups of this House. This legislation does not want for scrutiny.

We have also had a highly collaborative and iterative process in the discussion of the Bill. We have had 66 Government acceptances of recommendations made by the Joint Committee on the draft Online Safety Bill. We have had Government amendments in Committee. We are discussing Government amendments today and we have Government commitments to table amendments in the House of Lords. The Bill has received a huge amount of consultation. It is highly important legislation, and the victims of online crime, online fraud, bullying and harassment want to see us get the Bill into the Lords and on the statute book as quickly as possible.

Sir Jeremy Wright (Kenilworth and Southam) (Con)

I warmly welcome my hon. Friend to his position. He will understand that those of us who have followed the Bill in some detail since its inception had some nervousness as to who might be standing at that Dispatch Box today, but we could not be more relieved that it is him. May I pick up on his point about the point of order from our right hon. Friend the Member for Haltemprice and Howden (Mr Davis)? Does he agree that an additional point to add to his list is that, unusually, this legislation has a remarkable amount of cross-party consensus behind its principles? That distinguishes it from some of the other legislation that perhaps we should not consider in these two weeks. I accept there is plenty of detail to be examined but, in principle, this Bill has a lot of support in this place.

Damian Collins

I completely agree with my right hon. and learned Friend. That is why the Bill passed Second Reading without a Division and the Joint Committee produced a unanimous report. I am happy for Members to cast me in the role of poacher turned gamekeeper on the Bill, but looking around the House, there are plenty of gamekeepers turned poachers here today who will ensure we have a lively debate.

Mr Speaker

And the other way, as well.

Damian Collins

Exactly. The concept at the heart of this legislation is simple. Tech companies, like those in every other sector, must take appropriate responsibility for the consequences of their business decisions. As they continue to offer their users the latest innovations that enrich our lives, they must consider safety as well as profit. They must treat their users fairly and ensure that the internet remains a place for robust debate. The Bill has benefited from input and scrutiny from right across the House. I pay tribute to my predecessor, my hon. Friend the Member for Croydon South (Chris Philp), who has worked tirelessly on the Bill, not least through 50 hours of Public Bill Committee, and the Bill is better for his input and work.

We have also listened to the work of other Members of the House, including my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the right hon. Member for Barking (Dame Margaret Hodge), my right hon. Friend the Member for Haltemprice and Howden and the Chair of the Select Committee, my hon. Friend the Member for Solihull (Julian Knight), who have all made important contributions to the discussion of the Bill.

We have also listened to those concerned about freedom of expression online. It is worth pausing on that, as there has been a lot of discussion about whether the Bill is censoring legal speech online and much understandable outrage from those who think it is. I asked the same questions when I chaired the Joint Committee on the Bill. This debate does not reflect the actual text of the Bill itself. The Bill does not require platforms to restrict legal speech—let us be absolutely clear about that. It does not give the Government, Ofcom or tech platforms the power to make something illegal online that is legal offline. In fact, if those concerned about the Bill studied it in detail, they would realise that the Bill protects freedom of speech. In particular, the Bill will temper the huge power over public discourse wielded by the big tech companies behind closed doors in California. They are unaccountable for the decisions they make on censoring free speech on a daily basis. Their decisions about what content is allowed will finally be subject to proper transparency requirements.

Dame Maria Miller (Basingstoke) (Con)

My hon. Friend did not have the joy of being on the Bill Committee, as I did with my hon. Friend the Member for Croydon South (Chris Philp), who was the Minister at that point. The point that my hon. Friend has just made about free speech is so important for women and girls who are not able to go online because of the violent abuse that they receive, and that has to be taken into account by those who seek to criticise the Bill. We have to make sure that people who currently feel silenced do not feel silenced in future and can participate online in the way that they should be able to do. My hon. Friend is making an excellent point and I welcome him to his position.

Damian Collins

My right hon. Friend is entirely right on that point. The structure of the Bill is very simple. There is a legal priority of harms, and things that are illegal offline will be regulated online at the level of the criminal threshold. There are protections for freedom of speech and there is proper transparency about harmful content, which I will come on to address.

Joanna Cherry (Edinburgh South West) (SNP)

Does the Minister agree that, in moderating content, category 1 service providers such as Twitter should be bound by the duties under our domestic law not to discriminate against anyone on the grounds of a protected characteristic? Will he take a look at the amendments I have brought forward today on that point, which I had the opportunity of discussing with his predecessor, who I think was sympathetic?

Damian Collins

The hon. and learned Lady makes a very important point. The legislation sets regulatory thresholds at the criminal law level based on existing offences in law. Many of the points she made are covered by existing public law offences, particularly in regard to discriminating against people based on their protected characteristics. As she well knows, the internet is a reserved matter, so the legal threshold is set where UK law stands, but where law may differ in Scotland, the police authorities in Scotland can still take action against individuals in breach of the law.

Joanna Cherry

The difficulty is that Twitter claims it is not covered by the Equality Act 2010. I have seen legal correspondence to that effect. I am not talking about the criminal law here. I am talking about Twitter’s duty not to discriminate against women, for example, or those who hold gender critical beliefs in its moderation of content. That is the purpose of my amendment today—it would ensure that Twitter and other service providers providing a service in the United Kingdom abide by our domestic law. It is not really a reserved or devolved matter.

Damian Collins

The hon. and learned Lady is right. There are priority offences where the companies, regardless of their terms of service, have to meet their obligations. If something is illegal offline, it is illegal online as well. There are priority areas where the company must proactively look for that. There are also non-priority areas where the company should take action against anything that is an offence in law and meets the criminal threshold online. The job of the regulator is to hold them to account for that. They also have to be transparent in their terms of service as category 1 companies. If they have clear policies against discrimination, which they on the whole all do, they will have to set out what they would do, and the regulator can hold them to account to make sure they do what they say. The regulator cannot make them take down speech that is legal or below a criminal threshold, but they can hold them to account publicly for the decisions they make.

One of the most important aspects of this Bill with regard to the category 1 companies is transparency. At the moment, the platforms make decisions about curating their content—who to take down, who to suppress, who to leave up—but those are their decisions. There is no external scrutiny of what they do or even whether they do what they say they will do. As a point of basic consumer protection law, if companies say in their terms of service that they will do something, they should be held to account for it. What is put on the label also needs to be in the tin and that is what the Bill will do for the internet.

I now want to talk about journalism and the role of the news media in the online world, which is a very important part of this Bill. The Government are committed to defending the invaluable role of a free media. Online safety legislation must protect the vital role of the press in providing people with reliable and accurate sources of information. Companies must therefore put in place protections for journalistic content. User-to-user services will not have to apply their safety duties in part 3 of the Bill to news publishers’ content shared on their services. News publishers’ content on their own sites will also not be in scope of regulation.

12:59

New clause 19 and associated amendments introduce a further requirement on category 1 services to notify a recognised news publisher and offer a right of appeal before removing or moderating its content or taking any action against its account. This new provision will reduce the risk of major online platforms taking over-zealous, arbitrary or accidental moderation decisions against news publisher content, which plays an invaluable role in UK democracy and society.

We recognise that there are cases where platforms must be able to remove content without having to provide an appeal, and the new clause has been drafted to ensure that platforms will not be required to provide an appeal before removing content that would give rise to civil or criminal liability to the service itself, or where it amounts to a relevant offence as defined by the Bill. This means that platforms can take down without an appeal content that would count as illegal content under the Bill.

Moreover, in response to some of the concerns raised, in particular by my right hon. and learned Friend the Member for Kenilworth and Southam as well as by other Members, about the danger of creating an inadvertent loophole for bad actors, we have committed to further tightening the definition of “recognised news publisher” in the House of Lords to ensure that sanctioned entities, such as RT, cannot benefit from these protections.

As the legislation comes into force, the Government are committed to ensuring that protections for journalism and news publisher content effectively safeguard users’ access to such content. We have therefore tabled amendments 167 and 168 to require category 1 companies to assess the impact of their safety duties on how news publisher and journalistic content are treated when hosted on the service. They must then demonstrate the steps they are taking to mitigate any impact.

In addition, a series of amendments, including new clause 20, will require Ofcom to produce a report assessing the impact of the Online Safety Bill on the availability and treatment of news publisher content and journalistic content on category 1 services. This will include consideration of the impact of new clause 19, and Ofcom must do this within two years of the relevant provisions being commenced.

The Bill already excludes comments sections on news publishers’ sites from its safety duties. These comments are crucial for enabling reader engagement with the news and encouraging public debate, as well as for the sustainability of the news media. We have tabled a series of amendments to strengthen these protections, reflecting the Government’s commitment to media freedom. The amendments will create a higher bar for removing the protections in place for comments sections on recognised news publishers’ sites by ensuring that these can only be brought into the scope of regulation via primary legislation.

Government amendments 70 and 71 clarify the policy intention of the clause 13 adult safety duties to improve transparency about how providers treat harmful content, rather than incentivise its removal. The changes respond to concerns raised by stakeholders that the drafting did not make it sufficiently clear that providers could choose simply to allow any form of legal content, rather than promote, restrict or remove it, regardless of the harm to users.

This is a really important point that has sometimes been missed in the discussion on the Bill. There are very clear duties relating to illegal harm that companies must proactively identify and mitigate. The transparency requirements for other harmful content are very clear that companies must set out what their policies are. Enforcement action can be taken by the regulator for breach of their policies, but the primary objective is that companies make clear what their policies are. It is not a requirement for companies to remove legal speech if their policies do not allow that.

Dame Margaret Hodge (Barking) (Lab)

I welcome the Minister to his position, and it is wonderful to have somebody else who—like the previous Minister, the hon. Member for Croydon South (Chris Philp)—knows what he is talking about. On this issue, which is pretty key, I think it would work if minimum standards were set on the risk assessments that platforms have to make to judge what is legal but harmful content, but at the moment such minimum standards are not in the Bill. Could the Minister comment on that? Otherwise, there is a danger that platforms will set a risk assessment that allows really vile harmful but legal content to carry on appearing on their platform.

Damian Collins

The right hon. Lady makes a very important point. There have to be minimum safety standards, and I think that was also reflected in the report of the Joint Committee, which I chaired. Those minimum legal standards are set where the criminal law is set for these priority legal offences. A company may have higher terms of service—it may operate at a higher level—in which case it will be judged on the operation of its terms of service. However, for priority illegal content, it cannot have a code of practice that is below the legal threshold, and it would be in breach of the provisions if it did. For priority illegal offences, the minimum threshold is set by the law.

Dame Margaret Hodge

I understand that in relation to illegal harmful content, but I am talking about legal but harmful content. I understand that the Joint Committee that the hon. Member chaired recommended that for legal but harmful content, there should be minimum standards against which the platforms would be judged. I may have missed it, but I cannot see that in the Bill.

Damian Collins

The Joint Committee’s recommendation was for a restructuring of the Bill, so that rather than having general duty of care responsibilities that were not defined, we defined those responsibilities based on existing areas of law. The core principle behind the Bill is to take things that are illegal offline, and to regulate such things online based on the legal threshold. That is what the Bill does.

In schedule 7, which did not exist in the draft phase, we have written into the Bill a long list of offences in law. I expect that, as this regime is created, the House will insert more regulations and laws into schedule 7 as priority offences in law. Even if an offence in law is not listed in the priority illegal harms schedule, it can still be a non-priority harm, meaning that even if a company does not have to look for evidence of that offence proactively, it still has to act if it is made aware of the offence. I think the law gives us a very wide range of offences, clearly defined against offences in law, where there are clearly understood legal thresholds.

The question is: what is to be done about other content that may be harmful but sits below the threshold? The Government have made it clear that we intend to bring forward amendments that set out clear priorities for companies on the reporting of such harmful content, where we expect the companies to set out what their policies are. That will include setting out clearly their policies on things such as online abuse and harassment, the circulation of real or manufactured intimate images, content promoting self-harm, content promoting eating disorders or legal suicide content—this is content relating to adults—so the companies will have to be transparent on that point.

Chris Philp (Croydon South) (Con)

I congratulate the Minister on his appointment, and I look forward to supporting him in his role as he previously supported me in mine. I think he made an important point a minute ago about content that is legal but considered to be harmful. It has been widely misreported in the press that this Bill censors or prohibits such content. As the Minister said a moment ago, it does no such thing. There is no requirement on platforms to censor or remove content that is legal, and amendment 71 to clause 13 makes that expressly clear. Does he agree that reports suggesting that the Bill mandates censorship of legal content are completely inaccurate?

Damian Collins

I am grateful to my hon. Friend, and as I said earlier, he is absolutely right. There is no requirement for platforms to take down legal speech, and they cannot be directed to do so. What we have is a transparency requirement to set out their policies, with particular regard to some of the offences I mentioned earlier, and a wide schedule of things that are offences in law that are enforced through the Bill itself. This is a very important distinction to make. I said to him on Second Reading that I thought the general term “legal but harmful” had added a lot of confusion to the way the Bill was perceived, because it created the impression that the removal of legal speech could be required by order of the regulator, and that is not the case.

Debbie Abrahams (Oldham East and Saddleworth) (Lab)

I congratulate the Minister on his promotion and on his excellent chairmanship of the prelegislative scrutiny Committee, which I also served on. Is he satisfied with the Bill in relation to disinformation? It was concerning that there was only one clause on disinformation, and we know the impact—particularly the democratic impact—that that has on our society at large. Is he satisfied that the Bill will address that?

Damian Collins

It was a pleasure to serve alongside the hon. Lady on the Joint Committee. There are clear new offences relating to knowingly false information that will cause harm. As she will know, that was a Law Commission recommendation; it was not in the draft Bill but it is now in the Bill. The Government have also said that as a consequence of the new National Security Bill, which is going through Parliament, we will bring in a new priority offence relating to disinformation spread by hostile foreign states. As she knows, one of the most common areas for organised disinformation has been at state level. As a consequence of the new national security legislation, that will also be reflected in schedule 7 of this Bill, and that is a welcome change.

The Bill requires all services to take robust action to tackle the spread of illegal content and activity. Providers must proactively reduce the risk on their services of illegal activity and the sharing of illegal content, and they must identify and remove illegal content once it appears on their services. That is a proactive responsibility. We have tabled several interrelated amendments to reinforce the principle that companies must take a safety-by-design approach to managing the risk of illegal content and activity on their services. These amendments require platforms to assess the risk of their services being used to commit, or to facilitate the commission of, a priority offence and then to design and operate their services to mitigate that risk. This will ensure that companies put in place preventive measures to mitigate a broad spectrum of factors that enable illegal activity, rather than focusing solely on the removal of illegal content once it appears.

Henry Smith (Crawley) (Con)

I congratulate my hon. Friend on his appointment to his position. On harmful content, there are all too many appalling examples of animal abuse on the internet. What are the Government’s thoughts on how we can mitigate such harmful content, which is facilitating wildlife crime? Might similar online protections be provided for animals to the ones that clause 53 sets out for children?

Damian Collins

My hon. Friend raises an important point that deserves further consideration as the Bill progresses through its parliamentary stages. There is, of course, still a general presumption that any illegal activity that could also constitute illegal activity online—for example, promoting or sharing content that could incite people to commit violent acts—is within scope of the legislation. There are some priority illegal offences, which are set out in schedule 7, but the non-priority offences also apply if a company is made aware of content that is likely to be in breach of the law. I certainly think this is worth considering in that context.

In addition, the Bill makes it clear that platforms have duties to mitigate the risk of their service facilitating an offence, including where that offence may occur on another site, such as can occur in cross-platform child sexual exploitation and abuse—CSEA—offending, or even offline. This addresses concerns raised by a wide coalition of children’s charities that the Bill did not adequately tackle activities such as breadcrumbing—an issue my hon. Friend the Member for Solihull (Julian Knight), the Chair of the Select Committee, has raised in the House before—where CSEA offenders post content on one platform that leads to offences taking place on a different platform.

We have also tabled new clause 14 and a related series of amendments in order to provide greater clarity about how in-scope services should determine whether they have duties with regard to content on their services. The new regulatory framework requires service providers to put in place effective and proportionate systems and processes to improve user safety while upholding free expression and privacy online. The systems and processes that companies implement will be tailored to the specific risk profile of the service. However, in many cases the effectiveness of companies’ safety measures will depend on them making reasonable judgments about types of content. Therefore, it is essential to the effective functioning of the framework that there is clarity about how providers should approach these judgments. In particular, such clarity will safeguard against companies over-removing innocuous content if they wrongly assume mental elements are present, or under-removing content if they act only where all elements of an offence are established beyond reasonable doubt. The amendments make clear that companies must consider all reasonably available contextual information when determining whether content is illegal content, a fraudulent advert, content that is harmful to children, or content that is harmful to adults.

Kirsty Blackman (Aberdeen North) (SNP)

I was on the Bill Committee and we discussed lots of things, but new clause 14 was not discussed: we did not have conversations about it, and external organisations have not been consulted on it. Is the Minister not concerned that this is a major change to the Bill and it has not been adequately consulted on?

Damian Collins

As I said earlier, in establishing the threshold for priority illegal offences, the current threshold of laws that exist offline should provide good guidance. I would expect that as the codes of practice are developed, we will be able to make clear what those offences are. On the racial hatred that the England footballers received after the European championship football final, people have been prosecuted for what they posted on Twitter and other social media platforms. We know what race hate looks like in that context, we know what the regulatory threshold should look at and we know the sort of content we are trying to regulate. I expect that, in the codes of practice, Ofcom can be very clear with companies about what we expect, where the thresholds are and where we expect them to take enforcement action.

13:15

Dame Caroline Dinenage (Gosport) (Con)

I congratulate my hon. Friend on taking his new position; we rarely have a new Minister so capable of hitting the ground running. He makes a crucial point about clarity and transparency for both users and the social media providers and other platforms, because it is important that we make sure they are 100% clear about what is expected of them and the penalties for not fulfilling their commitments. Does he agree that opaqueness—a veil of secrecy—has been one of the obstacles, and that a whole raft of content has been taken down for the wrong reasons while other content has been left to proliferate because of the lack of clarity?

Damian Collins

That is entirely right, and in closing I say that the Bill does what we have always asked for it to do: it gives absolute clarity that illegal things offline must be illegal online as well, and be regulated online. It establishes clear responsibilities and liabilities for the platforms to do that proactively. It enables a regulator to hold the platforms to account on their ability to tackle those priority illegal harms and provide transparency on other areas of harmful content. At present we simply do not know about the policy decisions that companies choose to make: we have no say in it; it is not transparent; we do not know whether they do it. The Bill will deliver in those important regards. If we are serious about tackling issues such as fraud and abuse online, and other criminal offences, we require a regulatory system to do that and proper legal accountability and liability for the companies. That is what the Bill and the further amendments deliver.

Alex Davies-Jones (Pontypridd) (Lab)

It is an honour to respond on the first group of amendments on behalf of the Opposition.

For those of us who have been working on this Bill for some time now, it has been extremely frustrating to see the Government take such a siloed approach in navigating this complex legislation. I remind colleagues that in Committee Labour tabled a number of hugely important amendments that sought to make the online space safer for us all, but the Government responded by voting against each and every one of them. I certainly hope the new Minister—I very much welcome him to his post—has a more open-minded approach than his predecessor and indeed the Secretary of State; I look forward to what I hope will be a more collaborative approach to getting this legislation right.

With that in mind, it must be said that time and again this Government claim that the legislation is world-leading but that is far from the truth. Instead, once again the Government have proposed hugely significant and contentious amendments only after line-by-line scrutiny in Committee; it is not the first time this has happened in this Parliament, and it is extremely frustrating for those of us who have debated this Bill for more than 50 hours over the past month.

I will begin by touching on Labour’s broader concerns around the Bill. As the Minister will be aware, we believe that the Government have made a fundamental mistake in their approach to categorisation, which undermines the very structure of the Bill. We are not alone in this view and have the backing of many advocacy and campaign groups including the Carnegie UK Trust, Hope Not Hate and the Antisemitism Policy Trust. Categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet.

We all know that smaller platforms such as 4chan and BitChute have significant numbers of users who are highly motivated to promote very dangerous content. Their aim is to promote radicalisation and to spread hate and harm.

Debbie Abrahams

Not only that: people migrate from one platform to another, a fact that just has not been reflected on by the Government.

Alex Davies-Jones

My hon. Friend is absolutely right, and has touched on elements that I will address later in my speech. I will look at cross-platform harm and breadcrumbing; the Government have taken action to address that issue, but they need to go further.

Damian Collins

I am sorry to intervene so early in the hon. Lady’s speech, and thank her for her kind words. I personally agree that the question of categorisation needs to be looked at again, and the Government have agreed to do so. We will hopefully discuss it next week during consideration of the third group of amendments.

Alex Davies-Jones

I welcome the Minister’s commitment, which is something that the previous Minister, the hon. Member for Croydon South (Chris Philp) also committed to in Committee. However, it should have been in the Bill to begin with, or been tabled as an amendment today so that we could discuss it on the Floor of the House. We should not have to wait until the Bill goes to the other place to discuss this fundamental, important point that I know colleagues on the Minister’s own Back Benches have been calling for. Here we are, weeks down the line, with nothing having been done to fix that problem, which we know will be a persistent problem unless action is taken. It is beyond frustrating that no indication was given in Committee of these changes, because they have wide-ranging consequences for the effects of the Bill. Clearly, the Government are distracted with other matters, but I remind the Minister that Labour has long called for a safer internet, and we are keen to get the Bill right.

Let us start with new clause 14, which provides clarification about how online services should determine whether content should be considered illegal, and therefore how the illegal safety duty should apply. The new clause is deeply problematic, and is likely to reduce significantly the amount of illegal content and fraudulent advertising that is correctly identified and acted on. First, companies will be expected to determine whether content is illegal or fraudulent based on information that is

“reasonably available to a provider”,

with reasonableness determined in part by the size and capacity of the provider. That entrenches the problems I have outlined with smaller, high-risk companies being subject to fewer duties despite the acute risks they pose. Having less onerous applications of the illegal safety duties will encourage malign actors to migrate illegal activity on to smaller sites that have less pronounced regulatory expectations placed on them. That has particularly concerning ramifications for children’s protections, which I will come on to shortly. On the other end of the scale, larger sites could use new clause 14 to argue that their size and capacity, and the corresponding volumes of material they are moderating, make it impractical for them reliably and consistently to identify illegal content.

The second problem arises from the fact that the platforms will need to have

“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”.

That significantly raises the threshold at which companies are likely to determine that content is illegal. In practice, companies have routinely failed to remove content where there is clear evidence of illegal intent. That has been the case in instances of child abuse breadcrumbing, where platforms use their own definitions of what constitutes a child abuse image for moderation purposes. Charities believe it is inevitable that companies will look to use this clause to minimise their regulatory obligations to act.

Finally, new clause 14 and its resulting amendments do not appear to be adequately future-proofed. The new clause sets out that judgments should be made

“on the basis of all relevant information that is reasonably available to a provider.”

However, on Meta’s first metaverse device, the Oculus Quest product, that company records only two minutes of footage on a rolling basis. That makes it virtually impossible to detect evidence of grooming, and companies can therefore argue that they cannot detect illegal content because the information is not reasonably available to them. The new clause undermines and weakens the safety mechanisms that the Minister, his team, the previous Minister, and all members of the Joint Committee and the Public Bill Committee have worked so hard to get right. I urge the Minister to reconsider these amendments and withdraw them.

I will now move on to improving the children’s protection measures in the Bill. In Committee, it was clear that one thing we all agreed on, cross-party and across the House, was trying to get the Bill to work for children. With colleagues in the Scottish National party, Labour Members tabled many amendments and new clauses in an attempt to achieve that goal. However, despite their having the backing of numerous children’s charities, including the National Society for the Prevention of Cruelty to Children, 5Rights, Save the Children, Barnardo’s, The Children’s Society and many more, the Government sadly did not accept them. We are grateful to those organisations for their insights and support throughout the Bill’s passage.

We know that children face significant risks online, from bullying and sexist trolling to the most extreme grooming and child abuse. Our amendments focus in particular on preventing grooming and child abuse, but before I speak to them, I associate myself with the amendments tabled by our colleagues in the Scottish National party, the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I associate myself with the sensible changes they have suggested to the Bill at this stage, including a change to children’s access assessments through amendment 162 and a strengthening of duties to prevent harm to children caused by habit-forming features through amendment 190.

Since the Bill was first promised in 2017, the number of online grooming crimes reported to the police has increased by more than 80%. Last year, around 120 sexual communication with children offences were committed every single week, and those are only the reported cases. The NSPCC has warned that that amounts to a

“tsunami of online child abuse”.

We now have the first ever opportunity to legislate for a safer world online for our children.

However, as currently drafted, the Bill falls short by failing to grasp the dynamics of online child abuse and grooming, which rarely occurs on one single platform or app, as mentioned by my hon. Friend the Member for Oldham East and Saddleworth (Debbie Abrahams). In well-established grooming pathways, abusers exploit the design features of open social networks to contact children, then move their communication across to other, more encrypted platforms, including livestreaming sites and encrypted messaging services. For instance, perpetrators manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with large numbers of children, who they then groom through direct messages before moving to encrypted services such as WhatsApp, where they coerce children into sending sexual images. That range of techniques is often referred to as child abuse breadcrumbing, and is a significant enabler of online child abuse.

I will give a sense of how easy it is for abusers to exploit children by recounting the words and experiences of a survivor, a 15-year-old girl who was groomed on multiple sites:

“I’ve been chatting with this guy online who’s…twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to ‘prove my trust’ to him, like doing video chats with my chest exposed. Every time I did these things for him, he would ask for more and I felt like it was too late to back out. This whole thing has been slowly destroying me and I’ve been having thoughts of hurting myself.”

I appreciate that it is difficult listening, but that experience is being shared by thousands of other children every year, and we need to be clear about the urgency that is needed to change that.

It will come as a relief to parents and children that, through amendments 58 to 61, the Government have finally agreed to close the loophole that allowed for breadcrumbing to continue. However, I still wish to speak to our amendments 15, 16, and 17 to 19, which were tabled before the Government changed their mind. Together with the Government’s amendments, these changes will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material.

Amendment 15 would ensure that platforms have to include in their illegal content risk assessment content that

“reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

Amendment 16 would ensure that platforms have to maintain proportionate systems and processes to minimise the presence of such content on their sites. The wording of our amendments is tighter and includes aiding the discovery or dissemination of content, whereas the Government’s amendments cover only “commission or facilitation”. Can the Minister tell me why the Government chose that specific wording and opposed the amendments that we tabled in Committee, which would have done the exact same thing? I hope that in the spirit of collaboration that we have fostered throughout the passage of the Bill with the new Minister and his predecessor, the Minister will consider the merit of our amendments 15 and 16.

Labour is extremely concerned about the significant powers that the Bill in its current form gives to the Secretary of State. We see that approach to the Bill as nothing short of a shameless attempt at power-grabbing from a Government whose so-called world-leading Bill is already failing in its most basic duty of keeping people safe online. Two interlinked issues arise from the myriad of powers granted to the Secretary of State throughout the Bill: the first is the unjustified intrusion of the Secretary of State into decisions that are about the regulation of speech, and the second is the unnecessary levels of interference and threats to the independence of Ofcom that arise from the powers of direction to Ofcom in its day-to-day matters and operations. That is not good governance, and it is why Labour has tabled a range of important amendments that the Minister must carefully consider. None of us wants the Bill to place undue powers in the hands of only one individual. That is not a normal approach to regulation, so I fail to see why the Government have chosen to go down that route in this case.

Chris Philp

I thank the shadow Minister for giving way—I will miss our exchanges across the Dispatch Box. She is making a point about the Secretary of State powers in, I think, clause 40. Is she at all reassured by the undertakings given in the written ministerial statement tabled by the Secretary of State last Thursday, in which the Government committed to amending the Bill in the Lords to limit the use of those powers to exceptional circumstances only, and precisely defined those circumstances as only being in connection with issues such as public health and public safety?

Alex Davies-Jones

I thank the former Minister for his intervention, and I am grateful for that clarification. We debated at length in Committee the importance of the regulator’s independence and the prevention of overarching Secretary of State powers, and of Parliament having a say and being reconvened if required. I welcome the fact that that limitation on the power will be tabled in the other place, but it should have been tabled as an amendment here so that we could have discussed it today. We should not have to wait for the Bill to go to the other place for us to have our say. Who knows what will happen to the Bill tomorrow, next week or further down the line with the Government in utter chaos? We need this to be done now. The Minister must recognise that this is an unparalleled level of power, and one with which the sector and Back Benchers in his own party disagree. Let us work together and make sure the Bill really is fit for purpose, and that Ofcom is truly independent and without interference and has the tools available to it to really create meaningful change and keep us all safe online once and for all.

13:30
I must put on record my support for amendments 11 and 12, tabled by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). In Committee, we heard multiple examples of racist, extremist and other harmful publishers, from Holocaust deniers to white supremacists, who would stand to benefit from the recognised news publisher exemption as it stands, either overnight or by making minor administrative changes. As long as the exemption protects antisemites and extremists, it is not fit for purpose. That much should be clear to all of us. In Committee, in response to an amendment tabled by my hon. Friend the Member for Batley and Spen (Kim Leadbeater), the then Minister promised a concession so that Russia Today would be excluded from the recognised news publisher exemption. I welcome the Minister’s comments at the Dispatch Box today to confirm that. I am pleased that the Government have promised to exclude sanctioned news bodies such as Russia Today, but their approach does not go far enough. Disinformation outlets rarely have the profile of Russia Today.
Andrew Percy (Brigg and Goole) (Con)

While the shadow Minister is on the subject of exemptions for antisemites, will she say where the Opposition are on the issue of search? Search platforms and search engines provide some of the most appalling racist, Islamophobic and antisemitic content.

Alex Davies-Jones

I thank the hon. Gentleman, who is absolutely right. In Committee, we debated at length the impact search engines have, and they should be included in the Bill’s categorisation of difficult issues. In one recent example on a search engine, the imagery that comes up when we search for desk ornaments is utterly appalling and needs to be challenged and changed. If we are to truly tackle antisemitism, racism and extremist content online, then the provisions need to be included in the Bill, and journalistic exemptions should not apply to this type of content. Such disinformation outlets often operate more discreetly and are less likely to attract sanctions. Furthermore, any amendment will provide no answer to the many extremist publishers who seek to exploit the terms of the exemption. For those reasons, we need to go further.

The amendments are not a perfect or complete solution. Deficiencies remain, and the amendments do not address the fact that the exemption continues to exclude dozens of independent local newspapers around the country on the arbitrary basis that they have no fixed address. The Independent Media Association, which represents news publishers, describes the news publisher criteria as

“punishing quality journalism with high standards”.

I hope the Minister will reflect further on that point. As a priority, we need to ensure that the exemption cannot be exploited by bad actors. We must not give a free pass to those propagating racist, misogynistic or antisemitic harm and abuse. By requiring some standards of accountability for news providers, however modest, the amendments are an improvement on the Bill as drafted. In the interests of national security and the welfare of the public, we must support the amendments.

Finally, I come to a topic that I have spoken about passionately in this place on a number of occasions and that is extremely close to my heart: violence against women and girls. Put simply, in their approach to the Bill the Government are completely failing and falling short in their responsibilities to keep women and girls safe online. Labour has been calling for better protections for some time now, yet still the Government are failing to see the extent of the problem. They have only just published an initial indicative list of priority harms to adults, in a written statement that many colleagues may have missed. While it is claimed that this will add to scrutiny and debate, the final list of harms will not be on the face of the Bill but will be included in secondary legislation after the Bill has received Royal Assent. Non-designated content that is harmful will not require action on the part of service providers, even though by definition it is still extremely harmful. How can that be acceptable?

Many campaigners have made the case that protections for women and girls are not included in the draft Bill at all, a concern supported by the Petitions Committee in its report on online abuse. Schedule 7 includes a list of sexual offences and aggravated offences, but the Government have so far made no concessions here and the wider context of violence against women and girls has not been addressed. That is why I urge the Minister to carefully consider our new clause 3, which seeks to finally name violence against women and girls as a priority harm. The Minister’s predecessor said in Committee that women and girls receive “disproportionate” levels of abuse online. The Minister in his new role will likely be well briefed on the evidence, and I know this is an issue he cares passionately about. The case has been put forward strongly by hon. Members on all sides of the House, and the message is crystal clear: women and girls must be protected online, and we see this important new clause as the first step.

Later on, we hope to see the Government move further and acknowledge that there must be a code of practice on tackling violence against women and girls content online.

Dame Maria Miller

The hon. Lady raises the issue of codes of practice. She will recall that in Committee we talked about that specifically and pressed the then Minister on that point. It became very clear that Ofcom would be able to issue a code of practice on violence against women and girls, which she talked about. Should we not be seeking an assurance that Ofcom will do that? That would negate the need to amend the Bill further.

Alex Davies-Jones

I welcome the right hon. Lady’s comments. We did discuss this at great length in Committee, and I know she cares deeply and passionately about this issue, as do I. It is welcome that Ofcom can issue a code of practice on violence against women and girls, and we should absolutely be urging it to do that, but we also need to make it a fundamental aim of the Bill. If the Bill is to be truly world leading, if it is truly to make us all safe online, and if we are finally to begin to tackle the scourge of violence against women and girls in all its elements—not just online but offline—then violence against women and girls needs to be named as a priority harm in the Bill. We need to take the brave new step of saying that enough is enough. Words are not enough. We need actions, and this is an action the Minister could take.

Damian Collins

I think we would all agree that when we look at the priority harms set out in the Bill, women and girls are disproportionately the victims of those offences. The groups in society that the Bill will most help are women and girls in our community. I am happy to work with the hon. Lady and all hon. Members to look at what more we can do on this point, both during the passage of the Bill and in future, but as it stands the Bill is the biggest step forward in protecting women and girls, and all users online, that we have ever seen.

Alex Davies-Jones

I am grateful to the Minister for the offer to work on that further, but we have an opportunity now to make real and lasting change. We talk about how we tackle this issue going forward. How can we solve the problem of violence against women and girls in our community? Three women a week are murdered at the hands of men in this country—that is shocking. How can we truly begin to tackle a culture change? This is how it starts. We have had enough of words. We have had enough of Ministers standing at the Dispatch Box saying, “This is how we are going to tackle violence against women and girls; this is our new plan to do it.” They have an opportunity to create a new law that makes it a priority harm, and that makes women and girls feel like they are being listened to, finally. I urge the Minister and Members in all parts of the House, who know that this is a chance for us finally to take that first step, to vote for new clause 3 today and make women and girls a priority by showing understanding that they receive a disproportionate level of abuse and harm online, and by making them a key component of the Bill.

Mr David Davis (Haltemprice and Howden) (Con)

I join everybody else in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), to the Front Bench. He is astonishingly unusual in that he is both well-intentioned and well-informed, a combination we do not always find among Ministers.

I will speak to my amendments to the Bill. I am perfectly willing to be in a minority of one—one of my normal positions in this House. To be in a minority of one on the issue of free speech is an honourable place to be. I will start by saying that I think the Bill is fundamentally mis-designed. It should have been several Bills, not one. It is so complex that it is very difficult to forecast the consequences of what it sets out to do. It has the most fabulously virtuous aims, but unfortunately the way things will be done under it, with the use of Government organisations to make decisions that, properly, should be taken on the Floor of the House, is in my view misconceived.

We all want the internet to be safe. Right now, there are too many dangers online—we have been hearing about some of them from the hon. Member for Pontypridd (Alex Davies-Jones), who made a fabulous speech from the Opposition Front Bench—from videos propagating terror to posts promoting self-harm and suicide. But in its well-intentioned attempts to address those very real threats, the Bill could actually end up being the biggest accidental curtailment of free speech in modern history.

There are many reasons to be concerned about the Bill. Not all of them are to be dealt with in this part of the Report stage—some will be dealt with later—and I do not have time to mention them all. I will make one criticism of the handling of the Bill at this point. I have seen much smaller Bills have five days on Report in the past. This Bill demands more than two days. That was part of what I said in my point of order at the beginning.

One of the biggest problems is the “duties of care” that the Bill seeks to impose on social media firms to protect users from harmful content. That is a more subtle issue than the tabloid press have suggested. My hon. Friend the Member for Croydon South (Chris Philp), the previous Minister, made that point and I have some sympathy with him. I have spoken to representatives of many of the big social media firms, some of which cancelled me after speeches that I made at the Conservative party conference on vaccine passports. I was cancelled for 24 hours, which was an amusing process, and they put me back up as soon as they found out what they had done. Nevertheless, that demonstrated how delicate and sensitive this issue is. That was a clear suppression of free speech without any of the pressures that are addressed in the Bill.

When I spoke to the firms, they made it plain that they did not want the role of online policemen, and I sympathise with them, but that is what the Government are making them do. With the threat of huge fines and even prison sentences if they consistently fail to abide by any of the duties in the Bill—I am using words from the Bill—they will inevitably err on the side of censorship whenever they are in doubt. That is the side they will fall on.

Worryingly, the Bill targets not only illegal content, which we all want to tackle—indeed, some of the practices raised by the Opposition Front Bencher, the hon. Member for Pontypridd, should simply be illegal full stop—but so-called “legal but harmful” content. Through clause 13, the Bill imposes duties on companies with respect to legal content that is “harmful to adults”. It is true that the Government have avoided using the phrase “legal but harmful” in the Bill, preferring “priority content”, but we should be clear about what that is.

The Bill’s factsheet, which is still on the Government’s website, states on page 1:

“The largest, highest-risk platforms will have to address named categories of legal but harmful material”.

This is not just a question of transparency—they will “have to” address that. It is simply unacceptable to target lawful speech in this way. The “Legal to Say, Legal to Type” campaign, led by Index on Censorship, sums up this point: it is both perverse and dangerous to allow speech in print but not online.

Damian Collins

As I said, a company may be asked to address this, which means that it has to set out what its policies are, how it would deal with that content and its terms of service. The Bill does not require a company to remove legal speech that it has no desire to remove. The regulator cannot insist on that, nor can the Government or the Bill. There is nothing to make legal speech online illegal.

Mr Davis

That is exactly what the Minister said earlier and, indeed, said to me yesterday when we spoke about this issue. I do not deny that, but this line of argument ignores the unintended consequences that the Bill may have. Its stated aim is to achieve reductions in online harm, not just illegal content. Page 106 of the Government’s impact assessment lists a reduction in the prevalence of legal but harmful content as a “key evaluation” question. The Bill aims to reduce that—the Government say that both in the online guide and the impact assessment. The impact assessment states that an increase in “content moderation” is expected because of the Bill.

A further concern is that the large service providers already have terms and conditions that address so-called legal but harmful content. A duty to state those clearly and enforce them consistently risks legitimising and strengthening the application of those terms and conditions, possibly through automated scanning and removal. That is precisely what happened to me before the Bill was even dreamed of. That was done under an automated system, backed up by somebody in Florida, Manila or somewhere who decided that they did not like what I said. We have to bear in mind how cautious the companies will be. That is especially worrying because, as I said, providers will be under significant pressure from outside organisations to include restrictive terms and conditions. I say this to Conservative Members, and we have some very well-intentioned and very well-informed Members on these Benches: beware of the gamesmanship that will go on in future years in relation to this.

Ofcom and the Department see these measures as transparency measures—that is the line. Lord Michael Grade, who is an old friend of mine, came to see me and he talked about this not as a pressure, but as a transparency measure. However, these are actually pressure measures. If people are made to announce things and talk about them publicly, that is what they become.

It is worth noting that several free speech and privacy groups have expressed scepticism about the provisions, yet they were not called to give oral evidence in Committee. A lot of other people were, including pressure groups on the other side and the tech companies, which we cannot ignore, but free speech advocates were not.

13:45
The clause is also part of the Bill where real democratic scrutiny is missing. Without being too pious about this, the simple truth is that the comparative power of Parliament has diminished over the past decade or two, with respect to Government, and this is another example. The decision on what counts as
“priority content that is harmful to adults”
will initially be made by the Secretary of State and then be subject to the draft affirmative procedure, in a whipped Statutory Instrument Committee. I have never, ever been asked to serve on an SI Committee and the Whips’ Office has never asked me to volunteer to do that—I wonder why. I hasten to add that I am not volunteering to do so now either, but the simple truth is that this will be considered in a whipped, selected Committee. When we talked earlier about constraints on the power, we heard comments such as, “We will only do this in the case of security, and so on.” Heavens above, we have been through two years of ferocious controversy on matters of public health security. This is not something that should somehow be protected from free speech, I’m afraid.
We cannot allow such significant curtailments of free expression to take place without proper parliamentary debate or amendment. These questions need to be discussed and decided in the Chamber, if need be, annually. When I first came into the House of Commons, we had an annual Companies Act, because companies law and accounting law were going through change. They were not changing anything like as fast as the internet. The challenges were not coming up anything like as fast as they do with the internet, so why do we not have an annual Bill on this matter? I would be perfectly happy to see that, so that we can make decisions here. If we do not do that, we could do this on an ad hoc basis as the issues arise, including some that the hon. Member for Pontypridd raised. We could have been dealing with that before now on a simpler basis than that of the Bill.
If a category of speech is important enough to be censored, which is what we are really asking for, it is important enough to be debated in this Chamber and by the whole of Parliament—the Commons and the Lords. Otherwise, the Government’s claim that the Bill will protect free speech will appear absurd. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) has tabled some amendments relating to the press. Even there, it is incredibly difficult to get this right because of the sheer complexity of the Bill and the size of the problem that we are trying to address. That is why I have tabled amendment 151, which seeks to remove clause 13 entirely, because it introduces the authoritarian concept of “legal but harmful” content—decided by the Government. “Legal but harmful” is a perfectly reasonable concept, but if it is decided by the Government alone, that is authoritarian. This is described as “priority content”, but everybody knows what it actually means. The Government have run away from using “legal but harmful” in a public context, but they use it everywhere else.
My amendments are designed to protect free speech while making the internet a safer place for everyone. I do not want to see content relating to suicide, self-harm or abuse of women, or whatever it may be, and I tabled two amendments to make them explicitly illegal, and the House can decide on those. That is what we should do. That is where the power of the House and the proper judgment lies.
The Bill has significantly improved since it was in draft form and the new Minister has a very honourable history in that reform. I compliment and commend him on that and thank him for those actions. I also welcome the measures taken against such things as cyber-flashing, but more needs to be done. The Bill falls far short of what it needs to be, and it would be remiss of us, in our duty as MPs, to let it pass without serious alteration.
I say this to the Whip on the Front Bench, and I hope that I have his attention: the Bill needs many more days on Report. I hope that he will reflect that back to the Chief Whip at the end of this business, because only with more days can we get it right. This is probably one of the most important Bills to go through this House in this decade, and we have not quite got it right yet.
John Nicolson (Ochil and South Perthshire) (SNP)

I rise to speak to the amendments in my name and those of other right hon. and hon. Members. I welcome the Minister to his place after his much-deserved promotion; as other hon. Members have said, it is great to have somebody who is both passionate and informed as a Minister. I also pay tribute to the hon. Member for Croydon South (Chris Philp), who is sitting on the Back Benches: he worked incredibly hard on the Bill, displayed a mastery of detail throughout the process and was extremely courteous in his dealings with us. I hope that he will be speedily reshuffled back to the Front Bench, which would be much deserved—but obviously not that he should replace the Minister, who I hope will remain in his current position or indeed be elevated from it.

But enough of all this souking, as we say north of the border. As one can see from the number of amendments tabled, the Bill is not only an enormous piece of legislation but a very complex one. Its aims are admirable—there is no reason why this country should not be the safest place in the world to be online—but a glance through the amendments shows how many holes hon. Members think it still has.

The Government have taken some suggestions on board. I welcome the fact that they have finally legislated outright to stop the wicked people who attempt to trigger epileptic seizures by sending flashing GIFs; I did not believe that such cruelty was possible until I was briefed about it in preparation for debates on the Bill. I pay particular tribute to wee Zach, whose name is often attached to what has been called Zach’s law.

The amendments to the Bill show that there has been a great deal of cross-party consensus on some issues, on which it has been a pleasure to work with friends in the Labour party. The first issue is addressed, in various ways, by amendments 44 to 46, 13, 14, 21 and 22, which all try to reduce the Secretary of State’s powers under the Bill. In all the correspondence that I have had about the Bill, and I have had a lot, that is the area that has most aggrieved the experts. A coalition of groups with a broad range of interests, including child safety, human rights, women and girls, sport and democracy, all agree that the Secretary of State is granted too many powers under the Bill, which threatens the independence of the regulator. Businesses are also wary of the powers, in part because they cause uncertainty.

The reduction of ministerial powers under the Bill was advised by the Joint Committee on the Draft Online Safety Bill and by the Select Committee on Digital, Culture, Media and Sport, on both of which I served. In Committee, I asked the then Minister whether any stakeholder had come forward in favour of these powers. None had.

Even DCMS Ministers do not agree with the powers. The new Minister was Chair of the Joint Committee, and his Committee’s report said:

“The powers for the Secretary of State to a) modify Codes of Practice to reflect Government policy and b) give guidance to Ofcom give too much power to interfere in Ofcom’s independence and should be removed.”

The Government have made certain concessions with respect to the powers, but they do not go far enough. As the Minister said, the powers should be removed.

We should be clear about exactly what the powers do. Under clause 40, the Secretary of State can

“modify a draft of a code of practice”.

That allows the Government a huge amount of power over the so-called independent communications regulator. I am glad that the Government have listened to the suggestions that my colleagues and I made on Second Reading and in Committee, and have committed to using the power only in “exceptional circumstances” and by further defining “public policy” motives. But “exceptional circumstances” is still too opaque and nebulous a phrase. What exactly does it mean? We do not know. It is not defined—probably intentionally.

The regulator must not be politicised in this way. Several similar pieces of legislation are going through their respective Parliaments or are already in force. In Germany, Australia, Canada, Ireland and the EU, with the Digital Services Act, different Governments have grappled with the issue of making digital regulation future-proof and flexible. None of them has added political powers. The Bill is sadly unique in making such provision.

When a Government have too much influence over what people can say online, the implications for freedom of speech are particularly troubling, especially when the content that they are regulating is not illegal. There are ways to future-proof and enhance the transparency of Ofcom in the Bill that do not require the overreach that these powers give. When we allow the Executive powers over the communications regulator, the protections must be absolute and iron-clad, but as the Bill stands, it gives leeway for abuse of those powers. No matter how slim the Minister feels the chance of that may be, as parliamentarians we must not allow it.

Amendment 187 on human trafficking is an example of a relatively minor change to the Bill that could make a huge difference to people online. Our amendment seeks to deal explicitly with what Meta and other companies refer to as domestic servitude, which is very newsworthy, today of all days, and which we know better as human trafficking. Sadly, this abhorrent practice has been part of our society for hundreds if not thousands of years. Today, human traffickers are aided by various apps and platforms. The same platforms that connect us with old friends and family across the globe have been hijacked by the very worst people in our world, who are using them to create networks of criminal enterprise, none more cruel than human trafficking.

Investigations by the BBC and The Wall Street Journal have uncovered how traffickers use Instagram, Facebook and WhatsApp to advertise, sell and co-ordinate the trafficking of young women. One would have thought that the issue would be of the utmost importance to Meta—Facebook, as it was at the time—yet, as the BBC reported, The Wall Street Journal found that

“the social media giant only took ‘limited action’ until ‘Apple Inc. threatened to remove Facebook’s products from the App Store, unless it cracked down on the practice’.”

I and my friends across the aisle who sat on the DCMS Committee and the Joint Committee on the draft Bill know exactly what it is like to have Facebook’s high heid yins before us. They will do absolutely nothing to respond to legitimate pressure. They understand only one thing: the force of law and of financial penalty. Only when its profits were in danger did Meta take the issue seriously.

The omission of human trafficking from schedule 7 is especially worrying, because if human trafficking is not directly addressed as priority illegal content, we can be certain that it will not be prioritised by the platforms. We know from their previous behaviour that the platforms never do anything that will cost them money unless they are forced to do so. We understand that it is difficult to regulate in respect of human trafficking on platforms: it requires work across borders and platforms, with moderators speaking different languages. It is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and for the costs that will be entailed. If human trafficking is not designated as a priority harm, I fear that it will fall by the wayside.

In Committee, the then Minister said that the relevant legislation was covered by other parts of the Bill and that it was not necessary to incorporate offences under the Modern Slavery Act 2015 into priority illegal content. He referred to the complexity of offences such as modern slavery, and said how illegal immigration and prostitution priority offences might cover that already. That is simply not good enough. Human traffickers use platforms as part of their arsenal at every stage of the process, from luring in victims to co-ordinating their movements and threatening their families. The largest platforms have ample capacity to tackle these problems and must be forced to be proactive. The consequences of inaction will be grave.

Chris Philp

It is a pleasure to follow the hon. Member for Ochil and South Perthshire (John Nicolson).

Let me begin by repeating my earlier congratulations to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) on assuming his place on the Front Bench. Let me also take this opportunity to extend my thanks to those who served on the Bill Committee with me for some 50 sitting hours—it was, generally speaking, a great pleasure—and, having stepped down from the Front Bench, to thank the civil servants who have worked so hard on the Bill, in some cases over many years.

13:59
It would be strange if I did not broadly support the Government amendments, given that I have spent most of the last three or four months concocting them. I will touch on one or two of them, and then mention some areas in which I think the House might consider going further when the Bill proceeds to the other end of the building. I certainly welcome new clause 19, which gives specific protection to content generated by news media publishers by ensuring that there is a right of appeal before it can be removed. I take the view—and I think the Government do as well—that protecting freedom of the press is critical, but as we grant news media publishers this special protection, it is important for us to ensure that we are granting it to organisations that actually deserve it.
That, I think, is the purpose of amendments 11 and 12, tabled by my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright). The amendments apply to clause 50, which defines the term “recognised news publisher”. During the evidence sessions in Committee, some concern was expressed that the definition was too wide, and that some organisations—"bad actors”, as the Minister put it—might manage to organise themselves in such a way that they would benefit from this exemption. My right hon. and learned Friend’s amendments are designed to tighten that definition a little bit. There is some concern that the drafting of the amendments might effectively give rise to back-door press regulation because determining whether news publishers’ terms and conditions are “suitable and sufficient” constitutes a value judgment, but I certainly agree that clause 50 needs tightening up.
I welcome—unsurprisingly—the reference in the written ministerial statement to tabling an amendment in the House of Lords providing that sanctioned organisations cannot benefit from this exemption. I suggest, however, that their lordships might like to consider going even further, for example by saying that where content amounts to a foreign interference offence as defined by the National Security Bill, introduced by my hon. Friend the Member for North East Hampshire (Mr Jayawardena)—the Under-Secretary of State for International Trade, who is in his place on the Front Bench—the organisation propagating it should not be able to benefit from the “recognised news publisher” exemption. Their lordships may wish to consider that, along with any other ideas for tightening the definition in clause 50.
Let me now say a word about free speech. It has been widely misreported that the Bill mandates censorship of speech that is legal but harmful. As I said in my intervention on the Minister earlier, that is categorically untrue. While the large social media platforms will have to address such content as part of their terms and conditions, they are not compelled in the actions that they have to take in relation to it; they simply have to risk-assess it, adopt a policy—what that policy is will be up to them—and then apply that policy consistently. They are not obliged to take any action, and they are certainly not obliged to remove the content entirely. Lest there should be any doubt about that, Government amendment 71 to clause 13 makes it explicit that it is reasonable to take no action if the platform sees fit.
Joanna Cherry

I hear what the hon. Gentleman is saying, but he will have heard the speech made by his colleague, the right hon. Member for Haltemprice and Howden (Mr Davis). Does he not accept that it is correct to say that there is a risk of an increase in content moderation, and does he therefore see the force of my amendment, which we have previously discussed privately and which is intended to ensure that Twitter and other online service providers are subject to anti-discrimination law in the United Kingdom under the Equality Act 2010?

Chris Philp

I did of course hear what was said by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis). To be honest, I think that increased scrutiny of content which might constitute abuse or harassment, whether of women or of ethnic minorities, is to be warmly welcomed. The Bill provides that the risk assessors must pay attention to the characteristics of the user. There is no cross-reference to the Equality Act—I know the hon. and learned Lady has submitted a request on that, to which my successor Minister will now be responding—but there are references to characteristics in the provisions on safety duties, and those characteristics do of course include gender and race.

In relation to the risk that these duties are over-interpreted or over-applied, for the first time ever there is a duty for social media firms to have regard to freedom of speech. At present these firms are under no obligation to have regard to it, but clause 19(2) imposes such a duty, and anyone who is concerned about free speech should welcome that. Clauses 15 and 16 go further: clause 15 creates special protections for “content of democratic importance”, while clause 16 does the same for content of journalistic importance. So while I hugely respect and admire my right hon. Friend the Member for Haltemprice and Howden, I do not agree with his analysis in this instance.

I would now like to ask a question of my successor. He may wish to refer to it later or write to me, but if he feels like intervening, I will of course give way to him. I note that four Government amendments have been tabled; I suppose I may have authorised them at some point. Amendments 72, 73, 78 and 82 delete some words in various clauses, for example clauses 13 and 15. They remove the words that refer to treating content “consistently”. The explanatory note attached to amendment 72 acknowledges that, and includes a reference to new clause 14, which defines how providers should go about assessing illegal content, what constitutes illegal content, and how content is to be determined as being in one of the various categories.

As far as I can see, new clause 14 makes no reference to treating, for example, legal but harmful content “consistently”. According to my quick reading—without the benefit of highly capable advice—amendments 72, 73, 78 and 82 remove the obligation to treat content “consistently”, and it is not reintroduced in new clause 14. I may have misread that, or misunderstood it, but I should be grateful if, by way of an intervention, a later speech or a letter, my hon. Friend the Minister could give me some clarification.

Damian Collins

I think that the codes of practice establish what we expect the response of companies to be when dealing with priority illegal harm. We would expect the regulator to apply those methods consistently. If my hon. Friend fears that that is no longer the case, I shall be happy to meet him to discuss the matter.

Chris Philp

Clause 13(6)(b), for instance, states that the terms of service must be

“applied consistently in relation to content”,

and so forth. As far as I can see, amendment 72 removes the word “consistently”, and the explanatory note accompanying the amendment refers to new clause 14, saying that it does the work of the previous wording, but I cannot see any requirement to act consistently in new clause 14. Perhaps we could pick that up in correspondence later.

Damian Collins

If there is any area of doubt, I shall be happy to follow it up, but, as I said earlier, I think we would expect that if the regulator establishes through the codes of practice how a company will respond proactively to identify illegal priority content on its platform, it is inherent that that will be done consistently. We would accept the same approach as part of that process. As I have said, I shall be happy to meet my hon. Friend and discuss any gaps in the process that he thinks may exist, but that is what we expect the outcome to be.

Chris Philp

I am grateful to my hon. Friend for his comments. I merely observe that the “consistency” requirements were written into the Bill, and, as far as I can see, are not there now. Perhaps we could discuss it further in correspondence.

Let me turn briefly to clause 40 and the various amendments to it—amendments 44, 45, 13, 46 and others—and the remarks made by the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), about the Secretary of State’s powers. I intervened on the hon. Lady earlier on this subject. It also arose in Committee, when she and many others made important points on whether the powers in clause 40 went too far and whether they impinged reasonably on the independence of the regulator, in this case Ofcom. I welcome the commitments made in the written ministerial statement laid last Thursday—coincidentally shortly after my departure—that there will be amendments in the Lords to circumscribe the circumstances in which the Secretary of State can exercise those powers to exceptional circumstances. I heard the point made by the hon. Member for Ochil and South Perthshire that it was unclear what “exceptional” meant. The term has a relatively well defined meaning in law, but the commitment in the WMS goes further and says that the bases upon which the power can be exercised will be specified and limited to certain matters such as public health or matters concerning international relations. That will severely limit the circumstances in which those powers can be used, and I think it would be unreasonable to expect Ofcom, as a telecommunications regulator, to have expertise in those other areas that I have just mentioned. I think that the narrowing is reasonable, for the reasons that I have set out.

Julian Knight (Solihull) (Con)

Those areas are still incredibly broad and open to interpretation. Would it not be easier just to remove the Secretary of State from the process and allow this place to take directly from Ofcom the code of standards that we are talking about so that it can be debated fully in the House?

Chris Philp

I understand my hon. Friend’s point. Through his work as the Chairman of the Select Committee he has done fantastic work in scrutinising the Bill. There might be circumstances where one needed to move quickly, which would make the parliamentary intervention he describes a little more difficult, but he makes his point well.

Julian Knight

So why not quicken up the process by taking the Secretary of State out of it? We will still have to go through the parliamentary process regardless.

Chris Philp

The Government are often in possession of information—for example, security information relating to the UK intelligence community—that Ofcom, as the proposer of a code or a revised code, may not be in possession of. So the ability of the Secretary of State to propose amendments in those narrow fields, based on information that only the Government have access to, is not wholly unreasonable. My hon. Friend will obviously comment further on this in his speech, and no doubt the other place will give anxious scrutiny to the question as well.

I welcome the architecture in new clause 14 in so far as it relates to the definition of illegal content; that is a helpful clarification. I would also like to draw the House’s attention to amendment 16 to clause 9, which makes it clear that acts that are concerned with the commission of a criminal offence or the facilitation of a criminal offence will also trigger the definitions. That is a very welcome widening.

I do not want to try the House’s patience by making too long a speech, given how much the House has heard from me already on this topic, but there are two areas where, as far as I can see, there are no amendments down but which others who scrutinise this later, particularly in the other place, might want to consider. These are areas that I was minded to look at a bit more over the summer. No doubt it will be a relief to some people that I will not be around to do so. The first of the two areas that might bear more thought is clause 137, which talks about giving academic researchers access to social media platforms. I was struck by Frances Haugen’s evidence on this. The current approach in the Bill is for Ofcom to do a report that will take two years, and I wonder if there could be a way of speeding that up slightly.

The second area concerns the operation of algorithms promoting harmful content. There is of course a duty to consider how that operates, but when it comes to algorithms promoting harmful content, I wonder whether we could be a bit firmer in the way we treat that. I do not think that would restrain free speech, because the right of free speech is the right to say something; it is not the right to have an algorithm automatically promoting it. Again, Frances Haugen had some interesting comments on that.

Sir Jeremy Wright

I agree that there is scope for more to be done to enable those in academia and in broader civil society to understand more clearly what the harm landscape looks like. Does my hon. Friend agree that if they had access to the sort of information he is describing, we would be able to use their help to understand more fully and more clearly what we can do about those harms?

Chris Philp

My right hon. and learned Friend is right, as always. We can only expect Ofcom to do so much, and I think inviting expert academic researchers to look at this material would be welcome. There is already a mechanism in clause 137 to produce a report, but on reflection it might be possible to speed that up. Others who scrutinise the Bill may also reach that conclusion. It is important to think particularly about the operation of algorithmic promotion of harmful content, perhaps in a more prescriptive way than we do already. As I have said, Frances Haugen’s evidence to our Committee in this area was particularly compelling.

14:15
Damian Collins

I agree with my hon. Friend on both points. I discussed the point about researcher access with him last week, when our roles were reversed, so I am sympathetic to that. There is a difference between that and the researcher access that the Digital Services Act in Europe envisages, which will not have the legal powers that Ofcom will have to compel and demand access to information. It will be complementary but it will not replace the primary powers that Ofcom will have, which will really set our regime above those elsewhere. It is certainly my belief that the algorithmic amplification of harmful content must be addressed in the transparency reports and that, where it relates to illegal activities, it must absolutely be within the scope of the regulator to state that actively promoting illegal content to other people is an offence under this legislation.

Chris Philp

On my hon. Friend’s first point, he is right to remind the House that the obligations to disclose information to Ofcom are absolute; they are hard-edged and they carry criminal penalties. Researcher access in no way replaces that; it simply acts as a potential complement to it. On his second point about algorithmic promotion, of course any kind of content that is illegal is prohibited, whether algorithmically promoted or otherwise. The more interesting area relates to content that is legal but perceived as potentially harmful. We have accepted that the judgments on whether that content stays up or not are for the platforms to make. If they wish, they can choose to allow that content simply to stay up. However, it is slightly different when it comes to algorithmically promoting it, because the platform is taking a proactive decision to promote it. That may be an area that is worth thinking about a bit more.

Damian Collins

On that point, if a platform has a policy not to accept a certain sort of content, I think the regulators should expect it to say in its transparency report what it is doing to ensure that it is not actively promoting that content through a newsfeed, on Facebook or “next up” on YouTube. I expect that to be absolutely within the scope of the powers we have in place.

Chris Philp

In terms of content that is legal but potentially harmful, as the Bill is drafted, the platforms will have to set out their policies, but their policies can say whatever they like, as we discussed earlier. A policy could include actively promoting content that is harmful through algorithms, for commercial purposes. At the moment, the Bill as constructed gives them that freedom. I wonder whether that is an area that we can think about making slightly more prescriptive. Giving them the option to leave the content up there relates to the free speech point, and I accept that, but choosing to algorithmically promote it is slightly different. At the moment, they have the freedom to choose to algorithmically promote content that is toxic but falls just on the right side of legality. If they want to do that, that freedom is there, and I just wonder whether it should be. It is a difficult and complicated topic and we are not going to make progress on it today, but it might be worth giving it a little more thought.

I think I have probably spoken for long enough on this Bill, not just today but over the last few months. I broadly welcome these amendments but I am sure that, as the Bill completes its stages, in the other place as well, there will be opportunities to slightly fine-tune it that all of us can make a contribution to.

Dame Margaret Hodge

First, congratulations to the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins). I think his is one of the very few appointments in these latest shenanigans that is based on expertise and ability. I really welcome him, and the work he has done on the Bill this week has been terrific. I also thank the hon. Member for Croydon South (Chris Philp). When he held the position, he was open to discussion and he accepted a lot of ideas from many of us across the House. As a result, I think we have a better Bill before us today than we would have had. My gratitude goes to him as well.

I support much of the Bill, and its aim of making the UK the safest place to be online is one that we all share. I support the systems-based approach and the role of Ofcom. I support holding the platforms to account and the importance of protecting children. I also welcome the cross-party work that we have done as Back Benchers, and the roles played by both Ministers and by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). I thank him for his openness and his willingness to talk to us. Important amendments have been agreed on fraudulent advertising, bringing forward direct liability so there is not a two-year wait, and epilepsy trolling—my hon. Friend the Member for Batley and Spen (Kim Leadbeater) promoted that amendment.

I also welcome the commitment to bring forward amendments in the Lords relating to the amendments tabled by the hon. Member for Brigg and Goole (Andrew Percy) and the right hon. and learned Member for Kenilworth and Southam—I think those amendments are on the amendment paper but it is difficult to tell. It is important that the onus on platforms to be subject to regulation should be based not on size and functionality but on risk of harm. I look forward to seeing those amendments when they come back from the other place. We all know that the smallest platforms can present the greatest risk. The killing of 51 people in the mosques in Christchurch, New Zealand is probably the most egregious example, as the individual concerned had been on 8chan before committing that crime.

I am speaking to amendments 156 and 157 in my name and in the names of other hon. and right hon. Members. These amendments would address the issue of anonymous abuse. I think we all accept that anonymity is hugely important, particularly to vulnerable groups such as victims of domestic violence, victims of child abuse and whistleblowers. We want to retain anonymity for a whole range of groups and, in framing these amendments, I was very conscious of our total commitment to doing so.

Equally, freedom of speech is very important, as the right hon. Member for Haltemprice and Howden (Mr Davis) said, but freedom of speech has never meant freedom to harm, which is not a right this House should promote. It is difficult to define, and it is difficult to get the parameters correct, but we should not think that freedom of speech is an absolute right without constraints.

Joanna Cherry

I agree with the right hon. Lady that freedom of speech is not absolute. As set out in article 10 of the European convention on human rights, there have to be checks and balances. Nevertheless, does she agree that freedom of speech is an important right that this House should promote, with the checks and balances set out in article 10 of the ECHR?

Dame Margaret Hodge

Absolutely. I very much welcome the hon. and learned Lady’s amendment, which clarifies the parameters under which freedom of speech can be protected and promoted.

Equally, freedom of speech does not mean freedom from consequences. The police and other enforcement agencies can pursue unlawful abuse, assuming they have the resources, which we have not discussed this afternoon. I know the platforms have committed to providing the finance for such resources, but I still question whether the resources are there.

The problem with the Bill and the Government amendments, particularly Government amendment 70, is that they weaken the platforms’ duty on legal but harmful abuse. Such abuse is mainly anonymous and the abusers are clever. They do not break the law; they avoid the law with the language they use. It might be best if I give an example. People do not say, in an antisemitic way, “I am going to kill all Jews.” We will not necessarily find that online, but we might find, “I am going to harm all globalists.” That is legal but harmful and has the same intent. We should think about that, without being beguiled by the absolute right to freedom of speech that I am afraid the right hon. Member for Haltemprice and Howden is promoting, otherwise we will find that the Bill does not meet the purposes we all want.

Much of the abuse is anonymous. We do not know how much, but much of it is. When there was racist abuse at the Euros, Twitter claimed that 99% of postings of racist abuse were identifiable. Like the Minister, I wrote to Twitter to challenge that claim and found that Twitter was not willing to share its data with me, claiming GDPR constraints.

It is interesting that, in recent days, the papers have said that one reason Elon Musk has given for pulling out of his takeover is that he doubts Twitter’s claim that fake and spam accounts represent less than 5% of users. There is a lack of understanding and knowledge of the extent of anonymous abuse.

In the case I have shared with the Minister on other occasions, I received 90,000 posts in the two months from the publication of the Equality and Human Rights Commission report to the shenanigans about the position of the previous leader of the Labour party—from October to Christmas. The posts were monitored for me by the Community Security Trust. When I asked how many of the posts were anonymous, I was told that it had been unable to do that analysis. I wish there were the resources to do so, but I think most of the posts were anonymous and abusive.

There is certainly public support for trying to tackle abusive posts. A June 2021 YouGov poll found that 78% of the public are in favour of revealing the identity of those who post online, and we should bear that in mind. If people feel strongly about this, and the poll suggests that they do, we should respond and not put it to one side.

The Government have tried to tackle this with a compromise following the very good work by the hon. Member for Stroud (Siobhan Baillie). The Bill places a duty on the platforms to give users the option to verify their identity. If a user chooses to remain unverified, they may not be able to interact with verified accounts. Although I support the motives behind that amendment, I have concerns.

First, the platform itself would have to verify who holds the account, which gives the platforms unprecedented access to personal details. Following Cambridge Analytica, we know how such data can be abused. Data on 87 million identities was stolen; we know it was used to influence the Trump election in 2016, and it may have been a factor in the Brexit referendum.

Secondly, the police have been very clear on how I should deal with anonymous online abuse. They say that the last thing I should do is remove it, as they need it to be able to judge whether there is a real threat within the abuse that they should take seriously. So individuals having that right does not diminish the real harm they could face if the online abuse is removed.

Thirdly, one of the problems with a lot of online abuse is not just that it is horrible or can be dangerous in particular circumstances, but that it prevents democracy. It inhibits freedom of speech by inhibiting engagement in free, democratic discourse. Online abuse is used to undermine an individual’s credibility. A lot of the abuse I receive seeks to undermine my credibility. It says that I am a bad woman, that I abuse children, that I break tax law and that I do this, that and the other. Building that picture of me as someone who cannot be believed undermines my ability to enter into legitimate democratic debate on issues I care about. Simply removing anonymous online abuse from my account does not stop the circulation of abusive, misleading content that undermines my democratic right to free speech. Therefore, in its own way, it undermines free speech.

Amendments 156 and 157, in my name and in the name of other colleagues, are based on a strong commitment to protecting anonymity, especially for vulnerable groups. We seek to tackle anonymous abuse not by denying anonymity but by ensuring traceability. It is quite simple. The Government recognise the feasibility and importance of that with age verification; they have now accepted the argument on age verification, and I urge them to take it further. Although I have heard that various groups are hostile to what we are suggesting, in a meeting I held last week with HOPE not hate there was agreement that what we are proposing made sense, and therefore we and the Government should pursue it.

14:30
Under our proposed scheme, any individual who chooses to go on a platform would have to have their identity verified, not by the platform but by a third party. We would thus remove the platform’s ability to access the individual’s data, which it could use in an inappropriate way. Such a scheme is perfectly feasible, particularly now that the Government have introduced the age verification mechanism. More than 99% of us have bank accounts, so there is a simple way of verifying someone’s identity through a third-party mechanism without giving platforms the powers I have described. Everybody would be able to enter any platform and have total anonymity, and only if and when an individual posts something that breaks the law will they lose their right to anonymity.
To go back to a point I made in an intervention on the Minister, that would also involve having minimum standards on harmful but legal abuse. Under a minimum standards platform, only if someone posted abuse that is harmful—this would mainly be illegal abuse, but it would also be harmful but legal abuse—would they lose their right to anonymity. I think that is good, because one could name and shame. Most importantly, this would be the most effective tool for preventing a lot of online abuse from happening in the first place, and we should all be focusing our energies on doing so.
My hon. Friend the Member for Pontypridd (Alex Davies-Jones), our Front Bencher, has talked about women who are particularly vulnerable, and I think our measure would be very important—my experience justifies that. It would be a powerful deterrent. I hope that our Front-Bench team will support the proposition we are putting before the House. I will not press it to a vote if they do not, although I would regret the fact that they did not support it.
I regret that the Government do not feel able to support our proposition, but I think its time will come. A lot of the stuff that we are doing in this Bill is innovative, and we are not sure where everything will land. We are likely to get some things wrong and others right. I say to all Members, from across this House, that if we really want to reduce the amount of harmful abuse online, tackling anonymous abuse, rather than anonymity, must be central to our concerns. I urge my Front-Bench team and the Government to think carefully about this.
Nick Fletcher (Don Valley) (Con)

I rise to speak on amendments 50, 51 and 55, and I share the free speech concerns that I think lie behind amendment 151. As I said in Committee to the previous Minister, my hon. Friend the Member for Croydon South (Chris Philp), who knew this Bill inside out—it was amazing to watch him do it—I have deep concerns about how the duty on “legal but harmful” content will affect freedom of speech. I do not want people to be prevented from saying what they think. I am known for saying what I think, and I believe others should be allowed the same freedom, offline and online. What is harmful can be a subjective question, and many of us in this House might have different answers. When we start talking about restricting content that is perfectly legal, we should be very careful.

This Bill is very complex and detailed, as I know full well, having been on the Committee. I support the Bill—it is needed—but when it comes to legal but harmful content, we need to make sure that free speech is given enough protection. We have to get the right balance, but clause 19 does not do that. It says only that social media companies have

“a duty to have regard to the importance of protecting users’ right to freedom of expression within the law.”

There is no duty to do anything about freedom of speech; it just says, “You have to think about the importance of it”. That is not enough.

I know that the Bill does not state that social media companies have to restrict content—I understand that—but in the real world that is what will happen. If the Government define certain content as harmful, no social media company will want to be associated with it. The likes of Meta will want to be seen to get tough on legally defined harmful content, so of course it will be taken down or restricted. We have to counterbalance that instinct by putting stronger free speech duties in the Bill if we insist on it covering legal but harmful.

The Government have said that we cannot have stronger free speech obligations on private companies, and, in general, I agree with that. However, this Bill puts all sorts of other obligations on Facebook, Twitter and Instagram, because they are not like other private companies. These companies and their chief executive officers are household names all around the world, and their power and influence are incredible. In 2021, Facebook’s revenue was $117 billion, which is higher than the GDP—

Andrew Percy

Is that not exactly why there has to be action on legal but harmful content? The cross-boundary, cross-national powers of these organisations mean that we have to insist that they take action against harm, whether lawful or unlawful. We are simply asking those organisations to risk assess and ensure that appropriate warnings are provided, just as they are in respect of lots of harms in society; the Government require corporations and individuals to risk assess those harms and warn about them. The fact that these organisations are so transnational and huge is absolutely why we must require them to risk assess legal but harmful content.

Nick Fletcher

I understand what my hon. Friend is saying, but the list of what is legal but harmful will be set by the Secretary of State, not by Parliament. All we ask is for that to be discussed on the Floor of the House before we place those duties on the companies. That is all I am asking us to do.

Facebook has about 3 billion active users globally. That is more than double the population of China, the world’s most populous nation, and it is well over half the number of internet users in the entire world. These companies are unlike any others we have seen in history. For hundreds of millions of people around the world, they are the public square, which is how the companies have described themselves. Twitter founder Jack Dorsey said in 2018:

“We believe many people use Twitter as a digital public square. They gather from all around the world to see what’s happening, and have a conversation about what they see.”

In 2019, Mark Zuckerberg said:

“Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square.”

Someone who is blocked from these platforms is blocked from the public square, as we saw when the former President of the United States was blocked. Whatever we might think about Donald Trump, it cannot be right that he was banned from Twitter. We have to have stronger protection for free speech in the digital public square than clause 19 gives. The Bill gives the Secretary of State the power to define what is legal but harmful by regulations. As I have said, this is an area where free speech could easily be affected—

Adam Afriyie (Windsor) (Con)

I commend my hon. Friend for the powerful speech he is making. It seems to many of us here that if anyone is going to be setting the law or a regulation, it should really be done in the Chamber of this House. I would be very happy if we had annual debates on what may be harmful but is currently lawful, in order to make it illegal. I very much concur with what he is saying.

Nick Fletcher

I thank my hon. Friend for his contribution, which deals with what I was going to finish with. It is not enough for the Secretary of State to have to consult Ofcom; there should be public consultation too. I support amendment 55, which my hon. Friend has tabled.

Anna McMorrin (Cardiff North) (Lab)

Not too long ago, the tech industry was widely looked up to and the internet was regarded as the way forward for democracy and freedoms. Today that is not the case. Every day we read headlines about data leaks, racist algorithms, online abuse, and social media platforms promoting, and becoming swamped in, misinformation, misogyny and hate. These problems are not simply the fault of those platforms and tech companies; they are the result of a failure to govern technology properly. That has resulted from years of muddled thinking and a failure to bring forward this Bill, and now, a failure to ensure that the Bill is robust enough.

Ministers have talked up the Bill, and I welcome the improvements that were made in Committee. Nevertheless, Ministers had over a decade in which to bring forward proposals, and in that time online crime exploded. Child sexual abuse online has become rife; the dark web provides a location for criminals to run rampant and scams are widespread.

Delay has also allowed disinformation to spread, including state-sponsored propaganda and disinformation, such as from Russia’s current regime. False claims and fake fact checks are going viral. That encourages other groups to adopt such tactics, in an attempt to undermine democracy, from covid deniers to climate change deniers—it is rampant.

Today I shall speak in support of new clause 3, to put violence against women and girls on the face of the Bill. As a female MP, I, along with my colleagues, have faced a torrent of abuse online, attacking me personally and professionally. I have been sent images such as that of a person with a noose around their neck, as well as numerous messages containing antisemitic and misogynistic abuse directed towards both me and my children. It is deeply disturbing, but also unsurprising, that one in five women across the country have been subjected to abuse; I would guess that that figure is actually much higher.

Joanna Cherry

I am really sorry to hear about the abuse that the hon. Lady and her family have received. Many women inside and without this Chamber, such as myself, receive terrible abuse on Twitter, including repeated threats to shoot us if we do not shut the f-u-c-k up. Twitter refuses to take down memes of a real human hand pointing a gun at me and other feminists and lesbians, telling us to shut the f-u-c-k up. Does she see the force of my amendment to ensure that Twitter apply its moderation policy evenly across society with regard to all protected characteristics, including sex?

Anna McMorrin

The hon. and learned Lady makes a very good point, and that illustrates what I am talking about in my speech—the abuse that women face online. We need this legislation to ensure that tech companies take action.

There is a very dark side to the internet, deeply rooted in misogyny. The End Violence Against Women organisation released statistics last year, stating that 85% of women who experienced online abuse from a partner or ex-partner also experienced abuse offline. According to the latest Office for National Statistics figures, 92% of women who were killed in the year ending March 2021 were killed by men. Just yesterday, a woman was stabbed in the back by a male cyclist in east London, near to where Zara Aleena was murdered just two weeks ago. And in 2021, nearly 41,000 women were victims of sexual assault—and those were just the ones who reported it. We know that the actual figure was very much higher. That was the highest number of sexual offences ever recorded within a 12-month period. It is highly unlikely that any of those women will ever see their perpetrator brought to justice, because of the current 1.3% prosecution rate of rape cases. Need I continue?

14:45
Violence against women and girls is an ever-growing epidemic, and time is running out. This Government are more concerned with piecemeal actions that fail to tackle the root causes of the issue. Although the introduction of new criminal offences such as cyber-flashing and rape threats is a welcome first step, there are significant concerns about their enforceability. The cyber-flashing offence requires the police to prove a perpetrator’s intent to cause harm, which is incredibly difficult to evidence. That is the loophole through which perpetrators avoid consequences.
I doubt there are many women who have not been sent unsolicited images of male genitals online. There are accounts of women being airdropped images on public transport while on their way to work. What does that leave them feeling? Violated—scared, not knowing who in their train carriage or on their bus has sent those unsolicited images. The online dating platform Bumble conducted research on cyber-flashing and found that, among its users, nearly half of women aged 18 to 24 had received a sexual photo that they did not ask for in the last year alone.
So, considering the scale of this issue and the Government’s appalling record on prosecuting sexual assault offences, why would this new offence be any different? Acts of violence towards women are not merely isolated incidents. We know that, unfortunately, there is systemic misogyny within our society that results in a shocking number of women losing their lives, but we refuse to see it. Failing to name violence against women and girls on the face of the Bill is putting the lives of countless women at risk, and will leave behind a dangerous and damning legacy, even by this Government’s standards.
I welcome the new Minister to his place and hope that he will look at this issue in a new light. I hope that the Government can put politics to one side for just a moment, match their words with deeds and commit to protecting women across the country by supporting new clause 3.
Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. The House will see that a great many people still wish to speak. May I explain that there are two groups of amendments? We will finish debating this group at 4.30 pm, after which there will be some votes, and debate on the next group of amendments will last until 7 o’clock. By my calculations, there might be more time for speeches during the debate on the next group, so if anyone wishes to speak on that group rather than the current group, I would be grateful if they came and indicated that to me. Meanwhile, if everyone takes about eight minutes and no longer, everyone will have the opportunity to speak. I call Sir Jeremy Wright.

Sir Jeremy Wright

I shall speak to the amendments in my name and the names of other right hon. and hon. Members, to whom I am grateful for their support. I am also grateful to the organisations that helped me to work through some of the problems I am about to identify, including the Carnegie Trust, Reset and the Antisemitism Policy Trust.

On the first amendments I shall talk about, amendments 42 and 43, I have been able to speak to Lego, so I can honestly say that these amendments were put together with Lego. Let me explain. The focus of the Bill, quite rightly, is on safety, and there is no safety more important than the safety of children. In that respect, the Bill is clear: platforms must give the safety of children the utmost priority and pay close attention to ways to enhance it. In other parts of the Bill, however, there are countervailing duties—for example, in relation to freedom of speech and privacy—where, predominantly in relation to adults, we expect platforms to conduct a balancing exercise. It seems right to me to think about that in the context of children, too.

As I said, the emphasis is rightly on children’s safety, but the safest approach would be to prohibit children from any online activity at all. We would not regard such an approach as sensible, because there are benefits to children in being able to engage—safely, of course—in online activity and to use online products and services. It seems to me that we ought to recognise that in the language of the Bill. Amendment 42 would do that when consideration is given to the safety duties designed to protect children set out in clause 11, which requires that “proportionate measures” must be taken to protect children’s safety and goes on to explain what factors might be taken into account when deciding what is proportionate, by adding

“the benefits to children’s well-being”

of the product or service in that list of factors. Amendment 43 would do the same when consideration is given to the online safety objectives set out in schedule 4. Both amendments are designed to ensure that the appropriate balance is struck when judgments are taken by platforms.

Others have spoken about journalistic content, and I am grateful for what the Minister said about that, but my amendment 10 is aimed at the defect that I perceive in clause 16. The Bill gives additional protections and considerations to journalists, which is entirely justifiable, given the important role that journalism plays in our society, but those extra protections mean that it will be harder for platforms to remove potentially harmful content that is also journalistic content. We should be sure, therefore, that the right people get the benefit of that protection.

It is worth having look at what clause 16 says and does. It sets out that a platform—a user-to-user service—in category 1 will have

“A duty to operate a service using proportionate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about…how to treat such content (especially decisions about whether to take it down or restrict users’ access to it), and…whether to take action against a user generating, uploading or sharing such content.”

So it is important, because of the significance of those protections, that we get right the definitions of those who should benefit from them. Amendment 10 would amend clause 16(8), which states that:

“For the purposes of this section content is “journalistic content”, in relation to a user-to-user service, if…the content is”

either

“news publisher content in relation to that service”—

the definition of which I will return to—

“or…regulated user-generated content in relation to that service”.

That is the crucial point. The content also has to be

“generated for the purposes of journalism”

and be linked to the UK.

The first problem here is that journalism is not defined in the Bill. There are definitions of journalism, but none appears in the text of this Bill. “UK-linked” does not narrow it down much, and “regulated user-generated content” is a very broad category indeed. Clause 16 as drafted offers the protection given to journalistic content not just to news publishers, but to almost everybody else who chooses to define themselves as a journalist, whether or not that is appropriate. I do not think that that is what the Bill is intended to do, or an approach that this House should endorse. Amendment 10 would close the loophole by removing the second limb, regulated user-generated content that is not news publisher content. Let me be clear: I do not think that that is the perfect answer to the question I have raised, but it is better than the Bill as it stands, and if the Government can come up with a way of reintroducing protections of this kind for types of journalistic content beyond news publisher content that clearly deserve them, I will be delighted and very much open to it. Currently, however, the Bill is defective and needs to be remedied.

That brings us to the definition of news publisher content, because it is important that if we are to give protection to that category of material, we are clear about what we mean by it. Amendments 11 and 12 relate to the definition of news publisher content that arises from the definition of a recognised news publisher in clauses 49 and 50. That matters for the same reason as I just set out: we should give these protections only to those who genuinely deserve them. That requires rigorous definition. Clause 50 states that if an entity is not named in the Bill, as some are, it must fulfil a set of conditions set out in subsection (2), which includes having a standards code and policies and procedures for handling and resolving complaints. The difficulty here is that in neither case does the Bill refer to any quality threshold for those two things, so having any old standards code or any old policy for complaints will apparently qualify. That cannot be right.

I entirely accept that inserting a provision that the standards code and the complaints policies and procedures should be both “suitable and sufficient” opens the question whose job it becomes to decide what is suitable and sufficient. I am familiar with all the problems that may ensue, so again, I do not say that the amendment is the final word on the subject, but I do say that the Government need to look more carefully at what the value of those two items on the list really is if the current definition stands. If we are saying that we want these entities to have a standards code and a complaints process that provide some reassurance that they are worthy of the protections the Bill gives, it seems to me that meaningful criteria must apply, which currently they do not.

The powers of the Secretary of State have also been discussed by others, but I perhaps differ from their view in believing that there should be circumstances in which the Secretary of State should hold powers to act in genuine emergency situations. However, being able to direct Ofcom, as the Bill allows the Secretary of State to do, to modify a code of practice

“for reasons of public policy”

is far too broad. Amendment 13 would simply remove that capacity, with amendment 14 consequential upon it.

I accept that on 7 July the Secretary of State issued a written statement that helps to some extent on that point—it was referred to by my hon. Friend the Member for Croydon South (Chris Philp). First, it states that the Secretary of State would act only in “exceptional circumstances”, although it does not say who defines what exceptional circumstances are, leaving it likely that the Secretary of State would do so, which does not help us much. Secondly, it states the intention to replace the phrase

“for reasons of public policy”

with a list of circumstances in which the Secretary of State might act. I agree with my hon. Friend the Member for Solihull (Julian Knight) that that is still too broad. The proposed list comprises

“national security, public safety, public health, the UK’s international relations and obligations, economic policy and burden to business.”—[Official Report, 7 July 2022; Vol. 717, c. 69WS.]

The platforms we are talking about are businesses. Are we really saying that a burden on them would give the Secretary of State reason to say to Ofcom, the independent regulator, that it must change a code of practice? That clearly cannot be right. This is still too broad a provision. The progress that has been made is welcome, but I am afraid that there needs to be more to further constrain this discretion. That is because, as others have said, the independence of the regulator is crucial not just to this specific part of the Bill but to the credibility of the whole regulatory and legislative structure here, and therefore we should not undermine it unless we have to.

15:00
Madam Deputy Speaker, may I also say something very briefly about new clause 14? This is the Government’s additional new clause, which is designed to assist platforms in understanding some of the judgments that they have to make and how to make them, particularly in relation to illegal content. When people first look at this Bill, they will assume that everyone knows what illegal content is and therefore it should be easy to identify and take it down, or take the appropriate action to avoid its promotion. But, as new clause 14 makes clear, what the platform has to do is not just identify content but have reasonable grounds to infer that all elements of an offence, including the mental elements, are present or satisfied, and, indeed, that the platform does not have reasonable grounds to infer that the defence to the offence may be successfully relied upon. That is right, of course, because criminal offences very often are not committed just by the fact of a piece of content; they may also require an intent, or a particular mental state, and they may require that the individual accused of that offence does not have a proper defence to it. The question of course is how on earth a platform is supposed to know either of those two things in each case. This is helpful guidance, but the Government will have to think carefully about what further guidance they will need to give—or Ofcom will need to give—in order to help a platform to make those very difficult judgments.
Julian Knight

Although this is not contained within these measures, it pertains to them. Does my right hon. and learned Friend agree that, down the line, Ofcom will want to look at a regime of compliance officers in order to give the guidance that he seeks?

Sir Jeremy Wright

Yes, that is a possible way forward. Ofcom will need to produce a code of practice in this area. I am sure my hon. Friend on the Front Bench will say that that is a suitable way to deal with the problem that I have identified. It may well be, but at this stage, it is right for the House to recognise that the drafting of the Bill at the moment seeks to offer support to platforms, for which I am sure they will be grateful, but it will need to offer some more in order to allow these judgments to be made.

I restate the point that I have made in previous debates on this subject: there is little point in this House passing legislation aimed at making the internet a safer place if the legislation does not work as it is intended to. If our regime does not work, we will keep not a single person any safer. It is important, therefore, that we think about this Bill not in its overarching statements and principles but, particularly at this stage of consideration, in terms of how it will actually work.

You will not find a bigger supporter of the Bill in this House than me, Madam Deputy Speaker, but I want to see it work well and be effective. That means that some of the problems that I am highlighting must be addressed. Because humility is a good way to approach debates on something as ground-breaking and complex as this, I do not pretend that I have all the right answers. These amendments have been tabled because the Bill as it stands does not quite yet do the job that we want it to do. It is a good Bill—it needs to pass—but it can be better, and I very much hope that this process will improve it.

Joanna Cherry

I rise to speak to new clause 24 and amendments 193 and 191 tabled in my name. I also want to specifically give my support to new clause 6 and amendments 33 and 34 in the name of the right hon. Member for Kingston upon Hull North (Dame Diana Johnson).

The purpose of my amendments, as I have indicated in a number of interventions, is to ensure that, when moderating content, category 1 service providers such as Twitter abide by the anti-discrimination law of our domestic legal systems—that is to say the duties set out in the Equality Act 2010 not to discriminate against, harass or victimise their users on the grounds of a protected characteristic.

I quickly want to say a preliminary word about the Bill. Like all responsible MPs, I recognise the growing concern about online harms, and the need to protect service users, especially children, from harmful and illegal content online. That said, the House of Lords’ Communications and Digital Committee was correct to note that the internet is not currently the unregulated Wild West that some people say it is, and that civil and criminal law already applies to activities online as well as offline.

The duty of care, which the Bill seeks to impose on online services, will be a significant departure from existing legislation regulating online content. It will allow for a more preventative approach to regulating illegal online content and will form part of a unified regulatory framework applying to a wide range of online services. I welcome the benefits that this would represent, especially with respect to preventing the proliferation of child sexual and emotional abuse online.

Before I became an MP, I worked for a number of years as a specialist sex crimes prosecutor, so I am all too aware of how children are targeted online. Sadly, there are far too many people in our society, often hiding in plain sight, who seek to exploit children. I must emphasise that child safeguarding should be a No. 1 priority for any Government. In so far as this Bill does that, I applaud it. However, I do have some concerns that there is a significant risk that the Bill will lead to censorship of legal speech by online platforms. For the reasons that were set out by the right hon. Member for Haltemprice and Howden (Mr Davis), I am also a bit worried that it will give the Government unacceptable controls over what we can and cannot say online, so I am keen to support any amendments that would ameliorate those aspects of the Bill. I say this to those Members around the Chamber who might be looking puzzled: make no mistake, when the Bill gives greater power to online service providers to regulate content, there is a very real risk that they will be lobbied by certain groups to regulate what is actually legal free speech by other groups. That is partly what my amendment is designed to avoid.

Sir Jeremy Wright

What the hon. and learned Lady says is sensible, but does she accept—this is a point the Minister made earlier—that, at the moment, the platforms have almost unfettered control over what they take down and what they leave up? What this Bill does is present a framework for the balancing exercise that they ought to apply in making those decisions.

Joanna Cherry

That is why I am giving the Bill a cautious welcome, but I still stand by my very legitimate concerns about the chilling effect of aspects of this Bill. I will give some examples in a moment about the problems that have arisen when organisations such as Twitter are left to their own devices on their moderation of content policy.

As all hon. Members will be aware, under the Equality Act there are a number of protected characteristics. These include: age; gender reassignment; being married or in a civil partnership; being pregnant or on maternity leave; disability; race, including colour, nationality, ethnic or national origin; religion or belief; sex and sexual orientation. It is against the law to discriminate against, victimise or harass anyone because of any of those protected characteristics, but Twitter does discriminate on the basis of some of those protected characteristics. It often discriminates against women in the way that I described in an intervention earlier. It takes down expressions of feminist belief, but refuses to take down expressions of the utmost violent intent against women. It also discriminates against women who hold gender-critical beliefs. I remind hon. Members that, in terms of the Employment Appeal Tribunal’s decision in the case of Maya Forstater, the belief that sex matters is worthy of respect in a democratic society and, under the Equality Act, people cannot lawfully discriminate against women, or indeed men, who hold those views.

Twitter also sometimes discriminates against lesbians, gay men and bisexual people who assert that their sexual orientation is on the basis of sex, not gender, despite the fact that same-sex orientation, such as I hold, is a protected characteristic under the Equality Act.

At present, Twitter claims not to be covered by the Equality Act. I have seen correspondence from its lawyers that sets out the purported basis for that claim, partly under reference to schedule 25 to the Equality Act, and partly because it says:

“Twitter UK is included in an Irish Company and is incorporated in the Republic of Ireland. It does pursue economic activity through a fixed establishment in the UK but that relates to income through sales and marketing with the main activity being routed through Ireland.”

I very much doubt whether that would stand up in court, since Twitter is clearly providing a service in the United Kingdom, but it would be good if we took the opportunity of this Bill to clarify that the Equality Act applies to Twitter, so that when it applies moderation of content under the Bill, it will not discriminate against any of the protected characteristics.

The Joint Committee on Human Rights, of which I am currently the acting Chair, looked at this three years ago. We had a Twitter executive before our Committee and I questioned her at length about some of the content that Twitter was content to support in relation to violent threats against women and girls and, on the other hand, some of the content that Twitter took down because it did not like the expression of certain beliefs by feminists or lesbians.

We discovered on the Joint Committee on Human Rights that Twitter’s hateful conduct policy does not include sex as a protected characteristic. It does not reflect the domestic law of the United Kingdom in relation to anti-discrimination law. Back in October 2019, in the Committee’s report on democracy, freedom of expression and freedom of association, we recommended that Twitter should include sex as a protected characteristic in its hateful conduct policy, but Twitter has not done that. It seems Twitter thinks it is above the domestic law of the United Kingdom when it comes to anti-discrimination.

At that Committee, the Twitter executive assured me that certain violent memes that often appear on Twitter directed against women such as me and against many feminists in the United Kingdom, threatening us with death by shooting, should be removed. However, just in the past 48 hours I have seen an example of Twitter’s refusing to remove that meme. Colleagues should be assured that there is a problem here, and I would like us to direct our minds to it, as the Bill gives us an opportunity to do.

Whether or not Twitter is correctly praying in aid the loophole it says there is in the Equality Act—I think that is questionable—the Bill gives us the perfect opportunity to clarify matters. Clause 3 clearly brings Twitter and other online service providers within the regulatory scheme of the Bill as a service with

“a significant number of United Kingdom users”.

The Bill squarely recognises that Twitter provides a service in the United Kingdom to UK users, so it is only a very small step to amend the Bill to make it absolutely clear that when it does so it should be subject to the Equality Act. That is what my new clause 24 seeks to do.

I have also tabled amendments 193 and 191 to ensure that Twitter and other online platforms obey non-discrimination law regarding Ofcom’s production of codes of practice and guidance. The purpose of those amendments is to ensure that Ofcom consults with persons who have expertise in the Equality Act before producing those codes of practice.

I will not push the new clauses to a vote. I had a very productive meeting with the Minister’s predecessor, the hon. Member for Croydon South (Chris Philp), who expressed a great deal of sympathy when I explained the position to him. I have been encouraged by the cross-party support for the new clauses, both in discussions before today with Members from all parties and in some of the comments made by various hon. Members today.

I am really hoping that the Government will take my new clauses away and give them very serious consideration, that they will look at the Joint Committee’s report from October 2019 and that either they will adopt these amendments or perhaps somebody else will take them forward in the other place.

Damian Collins

I can assure the hon. and learned Lady that I am happy to carry on the dialogue that she had with my predecessor and meet her to discuss this at a further date.

Joanna Cherry

I am delighted to hear that. I must tell the Minister that I have had a huge number of approaches from women, from lesbians and from gay men across the United Kingdom who are suffering as a result of Twitter’s moderation policy. There is a lot of support for new clause 24.

Of course, it is important to remember that the Equality Act protects everyone. Gender reassignment is there with the protected characteristics of sex and sexual orientation. It is really not acceptable for a company such as Twitter, which provides a service in the United Kingdom, to seek to flout and ignore the provisions of our domestic law on anti-discrimination. I am grateful to the Minister for the interest he has shown and for his undertaking to meet me, and I will leave it at that for now.

15:15
Julian Knight

We live in the strangest of times, and the evidence of that is that my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who has knowledge second to none in this area, has ended up in charge of it. I have rarely seen such an occurrence. I hope he is able to have a long and happy tenure and that the blob does not discover that he knows what he is doing.

I backed the Bill on Second Reading and I will continue to back it. I support most of the content within it and, before I move on to speak to the amendments I have tabled, I want to thank the Government for listening to the recommendations of the Digital, Culture, Media and Sport Committee, which I chair. The Government have accepted eight of the Committee’s key recommendations, demonstrating that the Committee is best placed to provide Parliamentary scrutiny of DCMS Bills as they pass through this House and after they are enacted.

I also pay tribute to the work of the Joint Committee on the draft Bill, which my hon. Friend the Member for Folkestone and Hythe chaired, and the Public Bill Committee, which has improved this piece of legislation during its consideration. The Government have rightfully listened to the Select Committee’s established view that it would be inappropriate to establish a permanent joint committee on digital regulation. I also welcome the news that the Government are set to bring forward amendments in the House of Lords to legislate for a new criminal offence for epilepsy trolling, which was recommended by both the Joint Committee and the Select Committee.

That said, the Digital, Culture, Media and Sport Committee continues to have concerns about some aspects of the Bill, particularly the lack of provision for funding digital literacy, a key area in which we are falling behind and need to make progress. However, my primary concern and that of my colleagues on the Committee relates to the powers within this Bill that would, in effect, give the Secretary of State the opportunity to interfere with Ofcom’s role in the issuing of codes of practice to service providers.

It is for that reason that I speak to amendments 44 to 46 standing in my name on the amendment paper. Clause 40, in my view, gives the Secretary of State unprecedented powers and would bring into question the future integrity of Ofcom itself. Removing the ability to exercise those powers in clause 39 would mean we could lose clauses 40 and 41, which outline the powers granted and how they would be sent to the House for consideration.

Presently, Ofcom sets out codes of practice under which,

“companies can compete fairly, and businesses and customers benefit from the choice of a broad range of services”.

Under this Bill Ofcom, which, I remind the House, is an independent media regulator, will be required to issue codes of practice to service providers, for example codes outlining measures that would enable services to comply with duties to mitigate the presence of harmful content.

Currently, codes of practice from Ofcom are presented to the House for consideration “as soon as practicable”, something I support. My concern is the powers given in this Bill that allow the Secretary of State to reject the draft codes of practice and to send them back to Ofcom before this House knows the recommendations exist, let alone having a chance to consider or debate them.

I listened with interest to my hon. Friend the Member for Croydon South (Chris Philp), who is not in his place but who was a very fine Minister during his time in the Department. To answer his query on the written ministerial statement and the letter written to my Committee on this matter, I say to him and to those on the Front Bench that if the Government disagree with what Ofcom is saying, they can bring the matter to the House and explain that disagreement. That would allow things to be entirely transparent and open, allow greater scrutiny rather than less, and allow for less delay than would be the case if there is forever that ping-pong between the Secretary of State and Ofcom until it gets its work right.

I want to make it clear that the DCMS Committee and I believe that this is nothing more than a power grab by the Executive. I am proud that in western Europe we have a free press without any interference from Government, and I believe that the Bill, if constituted in this particular form, has the potential to damage that relationship—I say potential, because I do not believe that is the intention of what is being proposed here, but there is the potential for the Bill to jeopardise that relationship in the long term. That is why I hope that Members will consider supporting my amendments, and I will outline why they should do so.

As William Perrin, a trustee of the Carnegie UK Trust, made clear in evidence to my Committee,

“the underpinning convention of regulation of media in Western Europe is that there is an independent regulator and the Executive does not interfere in their day to day decision-making for very good reason.”

Likewise, Dr Edina Harbinja, a senior lecturer at Aston University, raised concerns that the Bill made her

“fear that Ofcom’s independence may be compromised”

and that

“similar powers are creeping into other law reform pieces and proposals, such as…data protection”.

My amendments seek to cut red tape, bureaucracy and endless recurring loops that in some cases may result in significant delays in Ofcom managing to get some codes of practice approved. The amendments will allow the codes to come directly to this House for consideration by Members without another level of direct interference from the Secretary of State. Let me make it very clear that this is not a comment on any Secretary of State, at any time in the past, but in some of these cases I expect that Ofcom will require a speedy turnaround to get these codes of practices approved—for instance, measures that it wishes to bring forward to better safeguard children online. In addition, the Secretary of State has continually made it clear in our Select Committee hearings that she is a great supporter of more parliamentary scrutiny. I therefore hope that the Government will support my amendment so that we do not end up in a position where future Secretaries of State could potentially prevent draft codes coming before the House due to endless delays and recurring loops.

I also want to make it abundantly clear that my amendment does not seek to prevent the Secretary of State from having any involvement in the formulation of new codes of practice from Ofcom. Indeed, as Ofcom has rightly pointed out, the Secretary of State is already a statutory consultee when Ofcom wishes to draft new codes of practice or amend those that already exist. She can also, every three years, set out guidelines that Ofcom would have to follow when creating such codes of practice. The Government therefore already play a crucial role in influencing the genesis and the direction of travel in this area.

On Friday the Secretary of State wrote to my office outlining some of the concerns shared by Members of this House and providing steps on how her Department would address those concerns. In her letter, she recognises that the unprecedented powers awarded to the Secretary of State are of great concern to Members and goes on to state that

“regulatory independence is vital to the success of the framework”.

I have been informed that in order to appease some of these concerned Members, the Government intend to bring forward amendments around the definitions of “exceptional circumstances” and “public policy”, as referenced earlier. These definitions, including “economic policy” and “business interests”, are so broad that I cannot think of anything that would not be covered by these exceptional circumstances.

If the Secretary of State accepts our legitimate concerns, surely Ministers should accept my amendments becoming part of the Bill today, leaving a cleaner process rather than an increasingly complex system of unscrutinised ministerial interference with the regulator. The DCMS Committee and I are very clear that clause 40 represents a power grab by the Government that potentially threatens the independence of Ofcom, which is a fundamental principle of ensuring freedom of speech and what should be a key component of this legislation. The Government must maintain their approach to ensuring independent, effective, and trustworthy regulation.

I will not press my amendments to a vote, but I hope my concerns will spark not just thoughts and further engagement from Ministers but legislative action in another place as the Bill progresses, because I really do think that this could hole the Bill under the waterline and has the potential for real harm to our democratic way of life going forward as we tackle this whole new area.

Mr Kevan Jones (North Durham) (Lab)

I rise to speak to my new clause 8, which would place a duty on all internet site providers regulated by this Bill to prevent individuals from encountering adverts for cosmetic procedures that do not contain disclaimers as to health risks of the procedure or include certified service quality indicators.

I have been campaigning for a number of years for better regulation of the non-surgical and cosmetic surgery industry, which is frankly a wild west in terms of lack of regulation, only made worse by the internet. I pay tribute to my constituent Dawn Knight, who has been a fierce campaigner in this area. We are slowly making progress. I thank the former Health Minister, the hon. Member for Charnwood (Edward Argar), for his work in bringing amendments on licensing to the Bill that became the Health and Care Act 2022. That is now out for consultation. It is a first, welcome step in legislation to tame the wild west that is the cosmetic surgery sector. My amendment would enhance and run parallel to that piece of legislation.

Back in 2013, Sir Bruce Keogh first raised the issue of advertising in his recommendations on regulation of the cosmetic surgery industry, saying that adverts for cosmetic and aesthetic procedures should be provided with a disclaimer or kitemark, in a manner similar to that around alcohol or gambling regulation. Years ago, adverts were in newspapers and magazines. Now, increasingly, the sector’s main source of advertising revenue is the internet.

People will say, “Why does this matter?” Well, it links to some of the other things that have been raised in this debate. The first is safety. We do not have any data, for which I have been calling for a while, on how many surgical and non-surgical aesthetic procedures in the UK go wrong, but I know who picks up the tab for it—it is us as taxpayers, as the NHS has to put a lot of those procedures right. The horrendous cases that I have seen over the years provide just cause for why people need to be in full control of the facts before they undertake these procedures.

This is a boom industry. It is one where decisions on whether to go ahead with a procedure are not usually made with full information on the potential risks. It is sold, certainly online, as something similar to buying any other service. As we all know, any medical procedure has health risks connected to it, and people should be made aware of them in the adverts that are now online. I have tried writing to Facebook and others to warn them about some of the more spurious claims that some of the providers are making, but have never got a reply from Facebook. This is about patient safety. My amendment would ensure that these adverts at least raise in people’s minds the fact that there is a health risk to these procedures.

Again, people will say, “Why does this matter?” Well, the target for this sector is young people. As I said, a few years ago these adverts were in newspapers and magazines; now they are on Facebook, Twitter, Instagram and so on, and we know what they are selling: they are bombarding young people with the perfect body image.

We only have to look at the Mental Health Foundation’s report on this subject to see the effect the industry is having on young people, with 37% feeling upset and 31% feeling ashamed of their own body image. That is causing anxiety and mental health problems, but it is also forcing some people to go down the route of cosmetic surgery—both surgical and non-surgical—when there is nothing wrong with their body. It is the images, often photoshopped and sadly promoted by certain celebrities, that force them down that route.

Someone has asked me before, “Do you want to close down the cosmetic surgery industry?” I am clear that I do not; what I want is for anyone going forward for these procedures to be in full control of the facts. Personally, if I had a blank sheet of paper, I would say that people should have mental health assessments before they undertake these procedures. If we had a kitemark on adverts, as Sir Bruce Keogh recommended, or something that actually said, “This is not like buying any other service. This is a medical procedure that could go wrong”, people would be in full awareness of the facts before they went forward.

15:30
It is a modest proposal for the Bill, but it could have a major impact on the industry out there at the moment, which for many years has been completely unregulated. I do not propose pressing my new clause to a vote, but will the Minister work with his Department of Health and Social Care colleagues? Following the Health and Care Act 2022, there is a consultation on the regulations, and we could make a real difference for those I am worried about and concerned for—the more and more young people who are being bombarded with these adverts. In some cases, dangerous and potentially life-threatening procedures are being sold to them as if they are just like any other service, and they are not.
Damian Collins

The right hon. Gentleman makes a very important point and, as he knows, there is a wider ongoing Government review related to advertising online, which is a very serious issue. I assure him that we will follow up with colleagues in the Department of Health and Social Care to discuss the points he has raised.

Mr Jones

I am grateful to the Minister and I will be keeping a beady eye to see how far things go. The proposal would make a difference. It is a simple but effective way of protecting people, especially young people.

Madam Deputy Speaker (Dame Eleanor Laing)

Very good, that was wonderfully brief.

Damian Hinds (East Hampshire) (Con)

May I join others in welcoming my hon. Friend the Member for Folkestone and Hythe (Damian Collins) to his place on the Front Bench? He brings a considerable amount of expertise. I also, although it is a shame he is not here to hear me say nice things about him, pay tribute, as others have, to my hon. Friend the Member for Croydon South (Chris Philp). I had the opportunity to work with him, his wonderful team of officials and wonderful officials at the Home Office on some aspects of this Bill, and it was a great pleasure to do so. As we saw again today, his passion for this subject is matched only by his grasp of its fine detail.

I particularly echo what my hon. Friend said about algorithmic promotion, because if we address that, alongside what the Government have rightly done on ID verification options and user empowerment, we would address some of the core wiring and underpinnings of online harm at an even more elemental level.

I want to talk about two subjects briefly. One is fraud, and the other is disinformation. Opposition amendment 20 refers to disinformation, but that amendment is not necessary because of the amendments that the Government are bringing to the National Security Bill to address state-sponsored disinformation. I refer the House in particular to Government amendment 9 to that Bill. That in turn amends this Bill—it is the link, or so-called bridge, between the two. Disinformation is a core part of state threat activity and it is one of the most disturbing, because it can be done at huge volume and at very low cost, and it can be quite hard to detect. When someone has learned how to change the way people think, that makes that part of their weaponry look incredibly valuable to them.

We often talk about this in the context of elections. I think we are actually pretty good—when I say “we”, I mean our country, some other countries and even the platforms themselves—at addressing disinformation in the context of the elections themselves: the process of voting, eligibility to vote and so on. However, first, that is often not the purpose of disinformation at election time and, secondly, most disinformation occurs outside election times. Although our focus on interference with the democratic process is naturally heightened coming up to big democratic events, it is actually a 365-day-a-year activity.

There are multiple reasons and multiple modes for foreign states to engage in that activity. In fact, in many ways, the word “disinformation” is a bit unsatisfactory because a much wider set of things comes under the heading of information operations. That can range from simple untruths to trying to sow many different versions of an event, particularly a foreign policy or wartime event, to confuse the audience, who are left thinking, “Oh well, whatever story I’m being told by the BBC, my newspaper, or whatever it is, they are all much of a muchness.” Those states are competing for truth, even though in reality, of course, there is one truth. Sometimes the aim is to big up their own country, or to undermine faith in a democracy like ours, or the effectiveness of free societies.

Probably the biggest category of information operations is when there is not a particular line to push at all, but rather the disinformer is seeking to sow division or deepen division in our society, often by telling people things that they already believe, but more loudly and more aggressively to try to make them dislike some other group in society more. The purpose, ultimately, is to destabilise a free and open society such as ours and that has a cancerous effect. We talk sometimes of disinformation being spread by foreign states. Actually, it is not spread by foreign states; it is seeded by foreign states and then spread usually by people here. So they create these fake personas to plant ideas and then other people, seeing those messages and personas, unwittingly pick them up and pass them on themselves. It is incredibly important that we tackle that for the health of our democracy and our society.

The other point I want to mention briefly relates to fraud and the SNP amendments in the following group, but also Government new clause 14 in this group. I strongly support what the Government have done, during the shaping of the Bill, on fraud; there have been three key changes on fraud. The first was to bring user-generated content fraud into the scope of the Bill. That is very important for a particularly wicked form of fraud known as romance fraud. The second was to bring fraudulent advertising into scope, which is particularly important for categories of fraud such as investment fraud and e-commerce. The third big change was to make fraud a priority offence in the Bill, meaning that it is the responsibility of the platforms not just to remove that content when they are made aware of it, but to make strenuous efforts to try to stop it appearing in front of their users in the first place. Those are three big changes that I greatly welcome.

There are three further things I think the Government will need to do on fraud. First, there is a lot of fraudulent content beyond categories 1 and 2A as defined in the Online Safety Bill, so we are going to have to find ways—proportionate ways—to make sure that that fraudulent content is suppressed when it appears elsewhere, but without putting great burdens on the operators of all manner of community websites, village newsletters and so on. That is where the DCMS online advertising programme has an incredibly important part to play.

The second thing is about the huge variety of channels and products. Telecommunications are obviously important, alongside online content, but even within online, as the so-called metaverse develops further, with the internet of things and the massive potential for defrauding people through deep fakes and so on, we need to be one step ahead of these technologies. I hope that in DCMS my hon. Friends will look to create a future threats unit that seeks to do that.

Thirdly, we need to make sure everybody’s incentives are aligned on fraud. At present, the banks reimburse people who are defrauded and I hope that rate of reimbursement will shortly be increasing. They are not the only ones involved in the chain that leads to people being defrauded and often they are not the primary part of that chain. It is only right and fair, as well as economically efficient, to make sure the other parts of the chain that are involved share in that responsibility. The Bill makes sure their incentives are aligned because they have to take proportionate steps to stop fraudulent content appearing in front of customers, but we need to look at how we can sharpen that up to make sure everybody’s incentives are absolutely as one.

This is an incredibly important Bill. It has been a long time coming and I congratulate everybody, starting with my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), my hon. Friend the Member for Croydon South (Chris Philp) and others who have been closely involved in creating it. I wish my hon. Friend the Minister the best of luck.

None Portrait Several hon. Members rose—
- Hansard -

Nigel Evans Portrait Mr Deputy Speaker (Mr Nigel Evans)
- Hansard - - - Excerpts

We will now introduce a six-minute limit on speeches. It may come down but, if Members can take less than six minutes, please do so. I intend to call the Minister at 4.20 pm.

Jamie Stone Portrait Jamie Stone (Caithness, Sutherland and Easter Ross) (LD)
- View Speech - Hansard - - - Excerpts

May I, on behalf of my party, welcome the Minister to his place?

I have been reflecting on the contributions made so far and why we are here. I am here because I know of a female parliamentary candidate who pulled out of that process because of the online abuse. I also know of somebody not in my party—it would be unfair to name her or her party—who stood down from public life in Scotland mostly because of online abuse. This is something that threatens democracy, which we surely hold most dear.

Most of us are in favour of the Bill. It is high time that we had legislation that keeps users safe online, tackles illegal content and seeks to protect freedom of speech, while also enforcing the regulation of online spaces. It is clear to me from the myriad amendments that the Bill as it currently stands is not complete and does not go far enough. That is self-evident. It is a little vague on some issues.

I have tabled two amendments, one of which has already been mentioned and is on media literacy. My party and I believe Ofcom should have a duty to promote and improve the media literacy of the public in relation to regulated user-to-user services and search services. That was originally in the Bill but it has gone. Media literacy is mentioned only in the context of risk assessments. There is no active requirement for internet companies to promote media literacy.

The pandemic proved that a level of skill is needed to navigate the online world. I offer myself as an example. The people who help me out in my office here and in my constituency are repeatedly telling me what I can and cannot do and keeping me right. I am of a certain age, but that shows where education is necessary.

My second amendment is on end-to-end encryption. I do not want anything in this Bill to prevent providers of online services from protecting their users’ privacy through end-to-end encryption. It does provide protection to individuals, and if it is circumvented or broken, criminals and hostile foreign states can breach security. Privacy means security.

There are also concerns about the use of the word “harm” in the Bill. It remains vague and threatens to capture a lot of unintended content. I look forward to seeing what comes forward from the Government on that front. The Bill focuses too much on content as opposed to activity and system design. Regulation of social media must respect the rights to privacy and free expression of those who use it. However, as the right hon. Member for Barking (Dame Margaret Hodge) said, that does not mean a laissez-faire approach: bullying and abuse prevent people from expressing themselves and must at all costs be stamped out, not least because of the two examples I mentioned at the start of my contribution.

As I have said before, the provisions on press exemption are poorly drafted. Under the current plans, the Russian propaganda channel Russia Today, on which I have said quite a bit in this place in the past, would qualify as a recognised news publisher and would therefore be exempt from regulation. That cannot be right. It is the same news channel that had its licence revoked by Ofcom.

I will help you by being reasonably brief, Mr Deputy Speaker, and conclude by saying that as many Members have said, the nature of the Bill means that the Secretary of State will have unprecedented powers to decide crucial legislation later. I speak—I will say it again—as a former chair of the Scottish Parliament’s statutory instruments committee, so I know from my own experience that all too often, instruments that have far-reaching effects are not given the consideration in this place that they should receive. Such instruments should be debated by the rest of us in the Commons.

As I said at the beginning of my speech, the myriad amendments to the Bill make it clear that the rest of us are not willing to allow it to remain so inherently undemocratic. We are going in the right direction, but a lot can be done to improve it. I wait with great interest to see how the Minister responds and what is forthcoming in the period ahead.

15:45
Andrew Percy Portrait Andrew Percy
- View Speech - Hansard - - - Excerpts

This has been an interesting debate on a Bill I have followed closely. I have been particularly struck by some of the arguments that claim the Bill is an attack on freedom of speech. I always listen intently to my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and to the hon. and learned Member for Edinburgh South West (Joanna Cherry), but I think they are wrong in the conclusions they have reached about legal but harmful content. Indeed, many of the criticisms that the hon. and learned Member for Edinburgh South West made of the various platforms were criticisms of the present situation, and that is exactly why I think this legislation will improve the position. However, those Members raised important points that I am sure will be responded to. I have also been a strong advocate of the inclusion of small but high-harm platforms, as the Minister and the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), both know—we have all had those discussions.

In the time I have, I want to focus principally on the issue of search and on new clauses 9 and 10, which stand in my name. As the shadow Minister has highlighted, last week we were—like many people in this place, perhaps—sent the most remarkable online prompt, which was to simply search Google for the words “desk ornament”. The top images displayed in response to that very mundane and boring search were of swastikas, SS bolts and other Nazi memorabilia presented as desk ornaments. Despite there having been awareness of that fact since, I believe, the previous weekend, and even though Google is making millions of pounds in seconds from advertising, images promoting Nazism were still available for all to see as a result of those searches.

When he gave evidence to the Bill Committee recently, Danny Stone, the Antisemitism Policy Trust’s very capable chief executive, pointed out that Amazon’s Alexa had used just one comment posted by one individual on Amazon’s website to inform potentially millions of users who cared to ask that George Soros was responsible for all of the world’s evils, and that Alexa had used a comment from another website to inform those who searched for it that the humanitarian group the White Helmets was an illicit operation founded by a British spy.

As we have seen throughout the covid pandemic, similar results come up in response to other searches, such as those around vaccines and covid. The Antisemitism Policy Trust has previously demonstrated that Microsoft Bing, the platform that lies behind Alexa, was directing users to hateful searches such as “Jews are bastards” through autocompletes, as well as pointing people to homophobic stories. We even had the sickening situation of Google’s image carousel highlighting Jewish baby strollers in response to people searching for portable barbecues.

Our own Alexa searches highlighted the issue some time ago. Users who asked Alexa “Do Jews control the media?” were responded to with a quote from a website called Jew Watch—that should tell Members all they need to know about the nature of the platform—saying that Jews control not only the media, but the financial system too. The same problem manifests itself across search platforms in other languages, as we highlighted not so long ago with Siri in Spanish. When asked, “Do the Jews control the media?” she responds with an article that states that Jews do indeed control international media. This goes on and on, irrespective of whether the search is voice or text-based.

The largest search companies in the world are falling at the first hurdle when it comes to risk assessing for harms on their platforms. That is the key point when we ask for lawful but harmful content to be responded to. It is about risk assessment—requiring companies that do not respect borders, operate globally and are in many ways more powerful than Governments to risk assess and warn about lawful but deeply harmful content that all of us in the House would be disgusted by.

At present, large traditional search services including Google and Microsoft Bing, and voice search assistants including Alexa and Siri, will be exempted from having to risk assess their systems and address harm to adults, despite the fact that other large user-to-user services will have to do so. How can it be possible that Google does not have to act, when Meta—Facebook—and Twitter do? That does not seem consistent with the aims of the Bill.

There is a lot more that I would like to have said on the Bill. I welcome the written ministerial statement last week in relation to small but high-harm platforms. I hope that as the Bill progresses to the other place, we can look again at search. Some of the content generated is truly appalling, even though it may very well be considered lawful.

Feryal Clark Portrait Feryal Clark (Enfield North) (Lab)
- View Speech - Hansard - - - Excerpts

I join everyone else in the House in welcoming the Minister to his place.

I rise to speak in support of amendments 15 and 16. At the core of this issue is the first duty of any Government: to keep people safe. Too often in debates, which can become highly technical, we lose sight of that fact. We are not just talking about technology and regulation; we are talking about real lives and real people. It is therefore incumbent on all of us in this place to have that at the forefront of our minds when discussing such legislation.

Labelling social media as the wild west of today is hardly controversial—that is plain and obvious for all to see. There has been a total failure on the part of social media companies to make their platforms safe for everyone to use, and that needs to change. Regulation is not a dirty word, but a crucial part of ensuring that as the internet plays a bigger role in every generation’s lives, it meets the key duty of keeping people safe. It has been a decade since we first heard of this Bill, and almost four years since the Government committed to it, so I am afraid that there is nothing even slightly groundbreaking about the Bill as it is today. We have seen progress being made in this area around the world, and the UK is falling further and further behind.

Of particular concern to me is the impact on children and young people. As a mother, I worry for the world that my young daughter will grow up in, and I will do all I can in this place to ensure that children’s welfare is at the absolute forefront. I can see no other system or institution that children are allowed to engage with that is so wanting in safeguards and regulation. If there was a faulty slide in a playground, it would be closed off and fixed. If a sports field was covered with glass or litter, that would be reported and dealt with. Whether we like it or not, social media has become the streets our children hang out in, the world they grow up in and the playground they use. It is about time we started treating it with the same care and attention.

There are far too many holes in the Bill that allow for the continued exploitation of children. Labour’s amendments 15 and 16 tackle the deeply troubling issue of “breadcrumbing”. That is where child abusers use social networks to lay trails to illegal content elsewhere online and share videos of abuse edited to fall within content moderation guidelines. The amendments would give the regulators powers to tackle that disgusting practice and ensure that there is a proactive response to it. They would bring into regulatory scope the millions of interactions with accounts that actively enable child abuse. Perhaps most importantly, they would ensure that social media companies tackled child abuse at the earliest possible stage.

In its current form, even with Government amendment 14, the Bill merely reinforces companies’ current focus only on material that explicitly reaches the criminal threshold. That is simply not good enough. Rather than acknowledging that issue, Government amendments 71 and 72 let social media companies off the hook. They remove the requirement for companies to apply their terms and conditions “consistently”. That was addressed very eloquently by the hon. Member for Croydon South (Chris Philp) and the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), who highlighted that Government amendment 14 simply does not go far enough.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

On the amendments that the former Minister, my hon. Friend the Member for Croydon South (Chris Philp), spoke to, the word “consistently” has not been removed from the text. There is new language that follows the use of “consistently”, but the use of that word will still apply in the context of the companies’ duties to act against illegal content.

Feryal Clark Portrait Feryal Clark
- View Speech - Hansard - - - Excerpts

I welcome the Minister’s clarification and look forward to the amendments being made to the Bill. Other than tying one of our hands behind our back in relation to trying to keep children safe, however, the proposals as they stand do not achieve very much. This will undermine the entire regulatory system, practically rendering it completely ineffective.

Although I welcome the Bill and some of the Government amendments, it still lacks a focus on ensuring that tech companies have the proper systems in place to fulfil their duty of care and keep our children safe. The children of this country deserve better. That is why I wholeheartedly welcome the amendments tabled by my hon. Friend the Member for Pontypridd (Alex Davies-Jones) and urge Government Members to support them.

None Portrait Several hon. Members rose—
- Hansard -

Nigel Evans Portrait Mr Deputy Speaker (Mr Nigel Evans)
- View Speech - Hansard - - - Excerpts

Order. We will stick with a time limit of six minutes, but I put everybody on notice that we may have to move that down to five.

Adam Afriyie Portrait Adam Afriyie
- Hansard - - - Excerpts

I very much welcome the Bill, which has been a long time in the making. It has travelled from my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) to my hon. Friend the Member for Croydon South (Chris Philp) and now to my hon. Friend the Member for Folkestone and Hythe (Damian Collins); I say a huge thank you to them for their work. The Bill required time because this is a very complex matter. There are huge dangers and challenges in terms of the potential for offences against freedom of speech. I am glad that Ministers have recognised that and that we are very close to an outcome.

The Bill is really about protection—it is about protecting our children and our society from serious harms—and nobody here would disagree that we want to protect children from harm online. That is what 70% to 80% of the Bill achieves. Nobody would disagree that we need to prevent acts of terror and incitement to violence. We are all on the same page on that across the House. What we are talking about today, and what we have been talking about over the past several months, are nips and tucks to try to improve elements of the Bill. The framework appears to be generally correct. We need to drill down into some of the details to ensure that the areas that each of us is concerned about are dealt with in the Bill we finally produce, as it becomes an Act of Parliament.

There are several amendments tabled in my name and those of other right hon. and hon. Members. I can only canter through them cursorily in the four minutes and 30 seconds remaining to me, but I will put these points on the record in the hope that the Minister will respond positively to many of them.

Amendments 48 and 49 would ensure that providers can decide to keep user-generated content online, taking no action if that content is not harmful. In effect, the Government have accepted those amendments by tabling amendment 71, so I thank the Minister for that.

My amendment 50 says that the presumption should be tipped further in favour of freedom of expression and debate by ensuring that under their contractual terms of service, except in particular circumstances, providers are obliged to leave content online. I emphasise that I am not talking about harmful or illegal content; amendment 50 seeks purely to address content that may be controversial but does not cross the line.

16:00
I turn to amendment 51. It appears that the Bill protects the media, journalists, Governments and us politicians, while providers have some protections against being fined unjustly. In many ways, the only people who are not protected are the public—the users with user-generated legal content. It seems to me that we need to increase the powers for citizens to get an outcome if their content is taken down inaccurately, incorrectly or inappropriately. We would do well to look at ensuring, as amendment 51 would, that citizens can also seek compensation. Tens of millions of pounds, if not hundreds of millions, are about to go to the regulator and to the Government in a form of quasi-taxation, so there needs to be a mechanism for a judge to decide that if somebody has been harmed by the inappropriate removal of legal content, they can get some redress.
Government amendment 94 is quite interesting. I can certainly see the reason for it and the purpose that it seeks to achieve, but it will require providers to take into account the entire criminal code. Effectively, they will have to act as a policeman, policing all internet content against all legislation. I am sure that that is not the intent behind amendment 94. I simply urge the Government to take a look at my amendment 52, which would ensure that relevant offences include only those specified, so that providers do not need to understand the entire criminal code.
The primary area of concern, which many other hon. Members have voiced, is that it looks as if the Secretary of State will be given the power to specify priority harms without that decision necessarily being passed on the Floor of the House. It seems to me that it is Parliament that should primarily be making regulations and legislation, so I really urge the Government to take another look and ensure that if a Secretary of State seeks to modify the priority harms or specify certain content as harmful or illegal, it is debated in the Chamber of the House of Commons. That is the primary function of this place.
Technology moves very quickly, so personally I would welcome an annual debate on areas that may need improvement. Now that we are outside the European Union and have autonomy, those are the kinds of things that we must decide in this Chamber.
Munira Wilson Portrait Munira Wilson (Twickenham) (LD)
- View Speech - Hansard - - - Excerpts

I rise to speak to new clauses 25 and 26 in my name. The Government rightly seek to make the UK the safest place in the world to go online, especially for our children, and some of their amendments will start to address previous gaps in the Bill. However, I believe that the Bill still falls short in its aim not only to protect children from harm and abuse, but, importantly, to empower and enable young people to make the most of the online world.

I welcome the comments that the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) made about how we achieve the balance between rights and protecting children from harm. I also welcome his amendments on children’s wellbeing, which seek to achieve that balance.

With one in five children going online, keeping them safe is more difficult but more important than ever. I speak not only as the mother of two very young children who are growing up with iPads in their hands, but as—like everyone else in the Chamber—a constituency Member of Parliament who speaks regularly to school staff and parents who are concerned about the harms caused by social media in particular, but also those caused by games and other services to which children have access.

The Bill proffers a broad and vague definition of content that is legal yet harmful. As many have already said, it should not be the responsibility of the Secretary of State, in secondary legislation, to make decisions about how and where to draw the line; Parliament should set clear laws that address specific, well-defined harms, based on strong evidence. The clear difficulty that the Government have in defining what content is harmful could have been eased had the Bill focused less on removing harmful content and more on why service providers allow harmful content to spread so quickly and widely. Last year, the 5Rights Foundation conducted an experiment in which it created several fake Instagram profiles for children aged between 14 and 17. When the accounts were searched for the term “skinny”, while a warning pop-up message appeared, among the top results were

“accounts promoting eating disorders and diets, as well as pages advertising appetite-suppressant gummy bears.”

Ultimately, the business models of these services profit from the spread of such content. New clause 26 requires the Government and Ofcom to focus on ensuring that internet services are safe by design. They should not be using algorithms that give prominence to harmful content. The Bill should focus on harmful systems rather than on harmful content.

Damian Collins Portrait Damian Collins
- View Speech - Hansard - - - Excerpts

It does focus on systems as well as content. We often talk about content because it is the exemplar for the failure of the systems, but the systems are entirely within the scope of the Bill.

Munira Wilson Portrait Munira Wilson
- View Speech - Hansard - - - Excerpts

I thank the Minister for that clarification, but there are still many organisations out there, not least the Children’s Charities Coalition, that feel that the Bill does not go far enough on safety by design. Concerns have rightly been expressed about freedom of expression, but if we focus on design rather than content, we can protect freedom of expression while keeping children safe at the same time. New clause 26 is about tackling harms downstream, safeguarding our freedoms and, crucially, expanding participation among children and young people. I fear that we will always be on the back foot when trying to tackle harmful content. I fear that regulators or service providers will become over-zealous in taking down what they consider to be harmful content, removing legal content from their platforms just in case it is harmful, or introducing age gates that deny children access to services outright.

Of course, some internet services are clearly inappropriate for children, and illegal content should be removed—I think we all agree on that—but let us not lock children out of the digital world or let their voices be silenced. Forty-three per cent. of girls hold back their opinions on social media for fear of criticism. Children need a way to exercise their rights. Even the Children’s Commissioner for England has said that heavy-handed parental controls that lock children out of the digital world are not the solution.

I tabled new clause 25 because the Bill’s scope, focusing on user-to-user and search services, is too narrow and not sufficiently future-proof. It should cover all digital technology that is likely to be accessed by children. The term

“likely to be accessed by children”

appears in the age-appropriate design code to ensure that the privacy of children’s data is protected. However, that more expansive definition is not included in the Bill, which imposes duties on only a subset of services to keep children safe. Given rapidly expanding technologies such as the metaverse—which is still in its infancy—and augmented reality, as well as addictive apps and games that promote loot boxes and gambling-type behaviour, we need a much more expansive definition.

Nigel Evans Portrait Mr Deputy Speaker (Mr Nigel Evans)
- Hansard - - - Excerpts

I am grateful to the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) for keeping her powder dry and deferring her speech until the next group of amendments, so Members now have five minutes each.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- View Speech - Hansard - - - Excerpts

I rise to speak in favour of amendments 15 to 19 in the names of my hon. Friends and, later, amendments 11 and 12 in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright).

As we discussed at great length in Committee—my first Bill Committee; a nice simple one to get me started—the Bill has a number of critical clauses to address the atrocious incidence of child sexual exploitation online. Amendments 15 to 19 are aimed at strengthening those protections and helping to ensure that the internet is a safer place for every young person. Amendments 15 and 16 will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material. Amendments 17 to 19 will tackle the issue of cross-platform abuse, where abuse starts on one platform and continues on another. These are urgent measures that children’s charities and advocacy groups have long called for, and I seriously hope this House will support them.

Last week, along with the shadow Minister and the then Minister, I attended an extremely moving reception hosted by one of those organisations, the NSPCC. It included a speech by Rachel, a mother of a victim of online grooming and child sexual exploitation. She outlined in a very powerful way how her son Ben was forced from the age of 13 to take and share photos of himself that he did not want to, and to enter Skype chats with multiple men. He was then blackmailed with those images and subjected to threats of violence to his family. Rachel said to us:

“We blamed ourselves and I thought we had failed…I felt like I hadn’t done enough to protect our children”.

I want to say to you, Rachel, that you did not fail Ben. Responsibility for what happened to Ben lies firmly with the perpetrators of these heinous crimes, but what did fail Ben and has failed our young people for far too long is the lack of urgency and political will to regulate the wild west of the internet. No one is pretending that this is an easy task, and we are dealing with a highly complex piece of legislation, but if we are to protect future Bens we have to strengthen this Bill as much as possible.

Another young woman, Danielle, spoke during the NSPCC event. She had been a victim of online CSE that had escalated into horrific real-world physical and sexual abuse. She told us how she has to live with the fear that her photos may appear online and be shared without her knowledge or control. She is a strong young woman who is moving on with her life with huge resilience, but her trauma is very real. Amendment 19 would ensure that proportionate measures are in place to prevent the encountering or dissemination of child abuse content—for example, through intelligence sharing of new and emerging threats. This will protect Danielle and people like her, giving them some comfort that measures are in place to stop the spread of these images and to place far more onus on the platforms to get on top of this horrific practice.

Amendments 11 and 12, in the name of the right hon. and learned Member for Kenilworth and Southam, will raise the threshold for non-broadcast media outlets to benefit from the recognised news publisher exemption by requiring that such publishers are subject to complaints procedures that are both suitable and sufficient. I support those amendments, which, while not perfect, are a step forward in ensuring that this exception is protected from abuse.

I am also pleased that the Government have listened to some of my and other Members’ concerns and have now agreed to bring forward amendments at a later stage to exclude sanctioned publishers such as Russia Today from accessing this exemption. However, there are hundreds if not thousands of so-called news publishers across the internet that pose a serious threat, from the far right and also from Islamist, antisemitic and dangerous conspiratorial extremism. We must act to ensure that journalistic protections are not abused by those wishing to spread harm. Let us be clear that this is as much about protecting journalism as it is about protecting users from harm.

We cannot overstate the seriousness of getting this right. Carving out protections within the Bill creates a risk that if we do not get the criteria for this exemption right, harmful and extremist websites based internationally will simply establish offices in the UK, just so that they too can access this powerful new protection. Amendments 11 and 12 will go some way towards ensuring that news publishers are genuine, but I recognise that the amendments are not the perfect solution and that more work is needed as the Bill progresses in the other place.

In closing, I hope that we can find consensus today around the importance of protecting children online and restricting harmful content. It is not always easy, but I know we can find common ground in this place, as we saw during the Committee stage of the Bill when I was delighted to gain cross-party support to secure the introduction of Zach’s law, inspired by my young constituent Zach Eagling, which will outlaw the dreadful practice of epilepsy trolling online.

Nigel Evans Portrait Mr Deputy Speaker (Mr Nigel Evans)
- Hansard - - - Excerpts

You will resume your seat no later than 4.20 pm. We will therefore not put the clock on you.

16:14
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I will try to avoid too much preamble, but I thank the former Minister, the hon. Member for Croydon South (Chris Philp), for all his work in Committee and for listening to my nearly 200 contributions, for which I apologise. I welcome the new Minister to his place.

As time has been short today, I am keen to meet the Minister to discuss my new clauses and amendments. If he cannot meet me, I would be keen for him to meet the NSPCC, in particular, on some of my concerns.

Amendment 196 is about using proactive technology to identify CSEA content, which we discussed at some length in Committee. The hon. Member for Croydon South made it very clear that we should use scanning to check for child sexual abuse images. My concern is that new clause 38, tabled by the Lib Dems, might exclude proactive scanning to look for child sexual abuse images. I hope that the Government do not lurch in that direction, because we need proactive scanning to keep children protected.

New clause 18 specifically addresses child user empowerment duties. The Bill currently requires that internet service providers have user empowerment duties for adults but not for children, which seems bizarre. Children need to be able to say yes or no. They should be able to make their own choices about excluding content and not receiving unsolicited comments or approaches from anybody not on their friend list, for example. Children should be allowed to do that, but the Bill explicitly says that user empowerment duties apply only to adults. New clause 18 is almost a direct copy of the adult user empowerment duties, with a few extra bits added. It is important that children have access to user empowerment.

Amendment 190 addresses habit-forming features. I have had conversations about this with a number of organisations, including The Mix. I regularly accessed its predecessor, The Site, more than 20 years ago, and it is concerned that 42% of young people surveyed by YoungMinds show addiction-like behaviour in what they are accessing on social media. There is nothing on that in this Bill. The Mix, the Mental Health Foundation, the British Psychological Society, YoungMinds and the Royal College of Psychiatrists are all unhappy about the Bill’s failure to regulate habit-forming features. It is right that we provide support for our children, and it is right that our children are able to access the internet safely, so it is important to address habit-forming behaviour.

Amendment 162 addresses child access assessments. The Bill currently says that providers need to do a child access assessment only if there is a “significant” number of child users. I do not think that is enough and I do not think it is appropriate, and the NSPCC agrees. The amendment would remove the word “significant.” OnlyFans, for example, should not be able to dodge the requirement to child risk assess its services because it does not have a “significant” number of child users. These sites are massively harmful, and we need to ensure changes are made so they cannot wriggle out of their responsibilities.

Finally, amendment 161 is about live, one-to-one oral communications. I understand why the Government want to exempt live, one-to-one oral communications, as they want to ensure that phone calls continue to be phone calls, which is totally fine, but they misunderstand the nature of things like Discord and how people communicate on Fortnite, for example. People are having live, one-to-one oral communications, some of which are used to groom children. We cannot explicitly exempt them and allow a loophole for perpetrators of abuse in this Bill. I understand what the Government are trying to do, but they need to do it in a different way so that children can be protected from the grooming behaviour we see on some online platforms.

Once again, if the Minister cannot accept these amendments, I would be keen to meet him. If he cannot meet me, will he please meet the NSPCC?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

We have had a wide-ranging debate of passion and expert opinion from Members in all parts of the House, which shows the depth of interest in this subject, and the depth of concern that the Bill is delivered and that we make sure we get it right. I speak as someone who only a couple of days ago became the Minister for online safety, although I was previously involved in engaging with the Government on this subject. As I said in my opening remarks, this has been an iterative process, where Members from across the House have worked successfully with the Government to improve the Bill. That is the spirit in which we should complete its stages, both in the Commons and in the Lords, and look at how we operate this regime when it has been created.

I wish to start by addressing remarks made by the hon. Member for Pontypridd (Alex Davies-Jones), the shadow Minister, and by the hon. Member for Cardiff North (Anna McMorrin) about violence against women and girls. There is a slight assumption that if the Government do not accept an amendment that writes “violence against women and girls” into the priority harms in the Bill, somehow the Bill does not address that issue. I think we would all agree that that is not the case. The provisions on harmful content that is directed at any individual, particularly the new harms offences approved by the Law Commission, do create offences in respect of harm that is likely to lead to actual physical harm or severe psychological harm. As the father of a teenage girl, who was watching earlier but has now gone to do better things, I say that the targeting of young girls, particularly vulnerable ones, with content that is likely to make them more vulnerable is one of the most egregious aspects of the way social media works. It is right that we are looking to address serious levels of self-harm and suicide in the Bill and in the transparency requirements. We are addressing the self-harm and suicide content that falls below the illegal threshold, where a vulnerable young girl is being sent and prompted with content that can make her more vulnerable and could lead her to harm herself, or worse. It is absolutely right that that was in the scope of the Bill.

New clause 3, perfectly properly, cites international conventions on violence against women and girls, and how that is defined. At the moment, with the way the Bill is structured, the schedule 7 offences are all based on existing areas of UK law, where there is an existing, clear criminal threshold. Those offences, which are listed extensively, will all apply as priority areas of harm. If there is, through the work of the Law Commission or elsewhere, a clear legal definition of misogyny and violence against women and girls that is not included, I think it should be included within scope. However, if new clause 3 was approved, as tabled, it would be a very different sort of offence, where it would not be as clear where the criminal threshold applied, because it is not cited against existing legislation. My view, and that of the Government, is that existing legislation covers the sorts of offences and breadth of offences that the shadow Minister rightly mentioned, as did other Members. We should continue to look at this—

Anna McMorrin Portrait Anna McMorrin
- Hansard - - - Excerpts

The Minister is not giving accurate information there. Violence against women and girls is defined by article 3 of the Council of Europe convention on preventing violence against women and domestic violence—the Istanbul convention. So there is that definition and it would be valid to put that in the Bill to ensure that all of that is covered.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I was referring to the amendment’s requirement to list that as part of the priority illegal harms. The priority illegal harms set out in the Bill are all based on existing UK Acts of Parliament where there is a clear established criminal threshold—that is the difference. The spirit of what that convention seeks to achieve, which we would support, is reflected in the harm-based offences written into the Bill. The big change in the structure of the Bill since the draft Bill was published—the Joint Committee on the Draft Online Safety Bill and I pushed for this at the time—is that far more of these offences have been clearly written into the Bill so that it is absolutely clear what they apply to. The new offences proposed by the Law Commission, particularly those relating to self-harm and suicide, are another really important addition. We know what the harms are. We know what we want this Bill to do. The breadth of offences that the hon. Lady and her colleagues have set out is covered in the Bill. But of course as law changes and new offences are put in place, the structure of the Bill, through the inclusion of new schedule 7 on priority offences, gives us the mechanism in the future, through instruments of this House, to add new offences to those primary illegal harms as they occur. I expect that that is what would happen. I believe that the spirit of new clause 3 is reflected in the offences that are written into the Bill.

The hon. Member for Pontypridd mentioned Government new clause 14. It is not true that the Government came up with it out of nowhere. There has been extensive consultation with Ofcom and others. The concern is that some social media companies, and some users of services, may have sought to interpret the criminal threshold as being based on whether a court of law has found that an offence has been committed, and only then might they act. Actually, we want them to pre-empt that, based on a clear understanding of where the legal threshold is. That is how the regulatory codes work. So it is an attempt, not to weaken the provision but to bring clarity to the companies and the regulator over the application.

The hon. Member for Ochil and South Perthshire (John Nicolson) raised an important point with regard to the Modern Slavery Act. As the Bill has gone along, we have included existing migration offences and trafficking offences. I would be happy to meet him further to discuss that aspect. Serious offences that exist in law should have an application, either as priority harms or as non-priority legal harms, and we should consider how we do that. I do not know whether he intends to press the amendment, but either way, I would be happy to meet him and to discuss this further.

My hon. Friend the Member for Solihull, the Chair of the Digital, Culture, Media and Sport Committee, raised an important matter with regard to the power of the Secretary of State, which was a common theme raised by several other Members. The hon. Member for Ochil and South Perthshire rightly quoted me, or my Committee’s report, back to me—always a chilling prospect for a politician. I think we have seen significant improvement in the Bill since the draft Bill was published. There was a time when changes to the codes could be made by the negative procedure; now they have to be by a positive vote of both Houses. The Government have recognised that they need to define the exceptional circumstances in which that provision might be used, and to define specifically the areas that are set out. I accept from the Chair of the Select Committee and my right hon. and learned Friend the Member for Kenilworth and Southam that those things could be interpreted quite broadly—maybe more broadly than people would like—but I believe that progress has been made in setting out those powers.

I would also say that this applies only to the period when the codes of practice are being agreed, before they are laid before Parliament. This is not a general provision. I think sometimes there has been a sense that the Secretary of State can at any time pick up the phone to Ofcom and have it amend the codes. Once the codes are approved by the House they are fixed. The codes do not relate to the duties. The duties are set out in the legislation. This is just the guidance that is given to companies on how they comply. There may well be circumstances in which the Secretary of State might look at those draft codes and say, “Actually, we think Ofcom has given the tech companies too easy a ride here. We expected the legislation to push them further.” Therefore it is understandable that in the draft form the Secretary of State might wish to have the power to raise that question, and not dictate to Ofcom but ask it to come back with amendments.

I take on board the spirit of what Members have said and the interest that the Select Committee has shown. I am happy to continue that dialogue, and obviously the Government will take forward the issues that they set out in the letter that was sent round last week to Members, showing how we seek to bring in that definition.

A number of Members raised the issue of freedom of speech provisions, particularly my hon. Friend the Member for Windsor (Adam Afriyie) at the end of his excellent speech. We have sought to bring, in the Government amendments, additional clarity to the way the legislation works, so that it is absolutely clear what the priority legal offences are. Where we have transparency requirements, it is absolutely clear what they apply to. The amendment that the Government tabled reflects the work that he and his colleagues have done, setting out that if we are discussing the terms of service of tech companies, it should be perfectly possible for them to say that this is not an area where they intend to take enforcement action and the Bill does not require them to do so.

The hon. Member for Batley and Spen (Kim Leadbeater) mentioned Zach’s law. The hon. Member for Ochil and South Perthshire raised that before the Joint Committee. So, too, did my hon. Friend the Member for Watford (Dean Russell); he and the hon. Member for Ochil and South Perthshire are great advocates on that. It is a good example of how a clear offence, something that we all agree to be wrong, can be tackled through this legislation; in this case, a new offence will be created, to prevent the pernicious targeting of people with epilepsy with flashing images.

Finally, in response to the speech by the hon. Member for Aberdeen North (Kirsty Blackman), I certainly will continue dialogue with the NSPCC on the serious issues that she has raised. Obviously, child protection is foremost in our mind as we consider the legislation. She made some important points about the ability to scan for encrypted images. The Government have recently made further announcements on that, to be reflected as the Bill progresses through the House.

Nigel Evans Portrait Mr Deputy Speaker (Mr Nigel Evans)
- Hansard - - - Excerpts

To assist the House, I anticipate two votes on this first section and one vote immediately on the next, because it has already been moved and debated.

16:30
Proceedings interrupted (Programme Order, this day).
The Deputy Speaker put forthwith the Question already proposed from the Chair (Standing Order No. 83E), That the clause be read a Second time.
Question agreed to.
New clause 19 accordingly read a Second time, and added to the Bill.
The Deputy Speaker then put forthwith the Questions necessary for the disposal of the business to be concluded at that time (Standing Order No. 83E).
New Clause 3
Priority illegal content: violence against women and girls
“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—
(a) constitutes,
(b) encourages, or
(c) promotes
violence against women and girls.
(2) “Violence against women and girls” is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (“the Istanbul Convention”).”—(Alex Davies-Jones.)
This new clause applies provisions to priority illegal content to content which constitutes, encourages or promotes violence against women and girls.
Brought up,
Question put, That the clause be added to the Bill.
16:30

Division 35

Ayes: 226

Noes: 292

Clause 5
Overview of Part 3
Amendment made: 57, page 4, line 36, at beginning insert ““priority offence”,”.—(Damian Collins.)
This is a technical amendment providing for a signpost to the definition of “priority offence” in the clause giving an overview of Part 3.
Clause 6
Providers of user-to-user services: duties of care
Amendment made: 163, page 5, line 30, at end insert—
“(da) the duties to protect news publisher content set out in section (Duties to protect news publisher content),”. —(Damian Collins.)
This amendment is consequential on NC19.
Clause 8
Illegal content risk assessment duties
Amendments made: 58, page 6, line 45, at end insert—
“(ba) the level of risk of the service being used for the commission or facilitation of a priority offence;”
This amendment adds another matter to the matters that should be included in a provider’s risk assessment regarding illegal content on a user-to-user service, so that the risks around use of a service for the commission or facilitation of priority offences are included.
Amendment 59, page 7, line , at end insert
“or by the use of the service for the commission or facilitation of a priority offence”.
This amendment ensures that providers’ risk assessments about illegal content on a user-to-user service must consider the risk of harm from use of the service for the commission or facilitation of priority offences.
Amendment 60, page 7, line 4, after “content” insert
“or the use of the service for the commission or facilitation of a priority offence”.—(Damian Collins.)
This amendment ensures that providers’ risk assessments about illegal content on a user-to-user service must consider the risk of functionalities of the service facilitating the use of the service for the commission or facilitation of priority offences.
Clause 9
Safety duties about illegal content
Amendments made: 61, page 7, line 24, leave out from “measures” to the end of line 26 and insert
“relating to the design or operation of the service to—
(a) prevent individuals from encountering priority illegal content by means of the service,
(b) effectively mitigate and manage the risk of the service being used for the commission or facilitation of a priority offence, as identified in the most recent illegal content risk assessment of the service, and
(c) effectively mitigate and manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service (see section 8(5)(f)).”
The substantive changes made by this amendment are: (1) making it clear that compliance with duties to mitigate risks as mentioned in paragraphs (a) to (c) is to be achieved by the way a service is designed or operated, and (2) paragraph (b) is a new risk mitigation duty on providers to deal with the risks around use of a user-to-user service for the commission or facilitation of priority offences.
Amendment 62, page 7, line 29, leave out paragraph (a).
This amendment omits the provision that now appears in subsection (2) of this clause: see paragraph (a) of the provision inserted by Amendment 61.
Amendment 63, page 7, line 37, after “is” insert “designed”.
This adds a reference to design to clause 9(4) which provides for the illegal content duties for user-to-user services to apply across all areas of a service.
Amendment 64, page 8, line 5, leave out “paragraphs (a) and (b)” and insert “paragraph (b)”.
This amendment is a technical change consequential on Amendment 62.
Amendment 65, page 8, line 9, leave out from “consistently” to end of line 10.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply the terms of service of a user-to-user service consistently. The omitted words relate to the material now dealt with in NC14.
Clause 11
Safety duties protecting children
Amendments made: 66, page 10, line 5, after “measures” insert
“relating to the design or operation of the service”.
This amendment makes it clear that compliance with duties to mitigate risks of harm from content harmful to children on user-to-user services is to be achieved by the way a service is designed or operated.
Amendment 67, page 10, line 9, after “service” insert “(see section 10(6)(g))”.
This is a technical amendment to put beyond doubt the meaning of a provision about risks identified in a risk assessment relating to content that is harmful to children on user-to-user services.
Amendment 68, page 10, line 22, after “is” insert “designed”.
This adds a reference to design to clause 11(4) which provides for the children’s safety duties for user-to-user services to apply across all areas of a service.
Amendment 69, page 11, line 2, leave out from “consistently” to end of line 4.(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply the terms of service of a user-to-user service consistently. The omitted words relate to the material now dealt with in NC14.
Clause 13
Safety duties protecting adults
Amendments made: 70, page 13, line 4, leave out subsection (3) and insert—
“(3) If a provider decides to treat a kind of priority content that is harmful to adults in a way described in subsection (4), a duty to include provisions in the terms of service specifying how that kind of content is to be treated (separately covering each kind of priority content that is harmful to adults which a provider decides to treat in one of those ways).”
This amendment ensures that a provider must state in the terms of service how priority content that is harmful to adults is to be treated, if the provider decides to treat it in one of the ways listed in clause 13(4).
Amendment 71, page 13, line 12, at end insert—
“(e) allowing the content without treating it in a way described in any of paragraphs (a) to (d).”
This amendment expands the list in clause 13(4) with the effect that if a provider decides to treat a kind of priority content that is harmful to adults by not recommending or promoting it, nor taking it down or restricting it etc, the terms of service must make that clear.
Amendment 72, page 13, line 23, leave out from “consistently” to end of line 25.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply the terms of service of a Category 1 service consistently. The omitted words relate to the material now dealt with in NC14.
Clause 15
Duties to protect content of democratic importance
Amendment made: 73, page 15, line 4, leave out from “consistently” to end of line 5.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply the terms of service of a Category 1 service consistently. The omitted words relate to the material now dealt with in NC14.
Clause 16
Duties to protect journalistic content
Amendments made: 164, page 15, line 44, at end insert—
“(5A) Subsections (3) and (4) do not require a provider to make a dedicated and expedited complaints procedure available to a recognised news publisher in relation to a decision if the provider has taken the steps set out in section (Duties to protect news publisher content)(3) in relation to that decision.”
This amendment ensures that where a recognised news publisher has a right to make representations about a proposal to take action in relation to news publisher content or against the recognised news publisher under the new clause introduced by NC19, a provider is not also required to offer that publisher a complaints procedure under clause 16.
Amendment 74, page 16, line 11, leave out from “consistently” to end of line 12.
This amendment omits words from a provision imposing a duty to apply the terms of service of a Category 1 service consistently. The omitted words relate to the material now dealt with in NC14.
Amendment 165, page 16, line 13, leave out “section” and insert “Part”.—(Damian Collins.)
This is a technical amendment ensuring that the definition of “journalistic content” applies for the purposes of Part 3 of the Bill.
Clause 18
Duties about complaints procedures
Amendment made: 166, page 19, line 14, at end insert—
“(iiia) section (Duties to protect news publisher content) (news publisher content)”.
This amendment ensures that users and affected persons can complain if a provider is not complying with a duty set out in NC19.(Damian Collins.)
Clause 19
Duties about freedom of expression and privacy
Amendments made: 167, page 20, line 18, at end insert—
“(5A) An impact assessment relating to a service must include a section which considers the impact of the safety measures and policies on the availability and treatment on the service of content which is news publisher content or journalistic content in relation to the service.”
This amendment requires a provider of a Category 1 service to include a section in their impact assessments considering the effect of the provider’s measures and policies on the availability on the service of news publisher content and journalistic content.
Amendment 168, page 20, line 37, at end insert—
“(10) See—
section 16 for the meaning of “journalistic content”;
section 49 for the meaning of “news publisher content”.” —(Damian Collins.)
This amendment inserts a signpost to definitions of terms used in the new subsection inserted by Amendment 167.
Clause 20
Record-keeping and review duties
Amendment made: 169, page 21, line 45, at end insert
“and for the purposes of subsection (6), also includes the duties set out in section (Duties to protect news publisher content) (news publisher content).”—(Damian Collins.)
This amendment ensures that providers have a duty to review compliance with the duties set out in NC19 regularly, and after making any significant change to the design or operation of the service.
Clause 24
Safety duties about illegal content
Amendments made: 75, page 23, line 43, after “measures” insert
“relating to the design or operation of the service”.
This amendment makes it clear that compliance with duties to mitigate risks of harm from illegal content on search services is to be achieved by the way a service is designed or operated.
Amendment 76, page 23, line 45, at end insert “(see section 23(5)(c))”.
This is a technical amendment to put beyond doubt the meaning of a provision about risks identified in a risk assessment relating to illegal content on search services.
Amendment 77, page 24, line 8, after “is” insert “designed”.
This adds a reference to design to clause 24(4) which provides for the illegal content duties for search services to apply across all areas of a service.
Amendment 78, page 24, line 23, leave out from “consistently” to end of line 24.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply a search service’s publicly available statement consistently. The omitted words relate to the material now dealt with in NC14.
Clause 26
Safety duties protecting children
Amendments made: 79, page 26, line 5, after “measures” insert
“relating to the design or operation of the service”.
This amendment makes it clear that compliance with duties to mitigate risks of harm from content that is harmful to children on search services is to be achieved by the way a service is designed or operated.
Amendment 80, page 26, line 9, after “service” insert “(see section 25(5)(d))”.
This is a technical amendment to put beyond doubt the meaning of a provision about risks identified in a risk assessment relating to content that is harmful to children on search services.
Amendment 81, page 26, line 20, after “is” insert “designed”.
This adds a reference to design to clause 26(4) which provides for the children’s safety duties for search services to apply across all areas of a service.
Amendment 82, page 26, line 40, leave out from “consistently” to end of line 42.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply a search service’s publicly available statement consistently. The omitted words relate to the material now dealt with in NC14.
Clause 37
Codes of practice about duties
Amendments made: 85, page 35, line 32, at end insert “or offences within Schedule 5 (terrorism offences)”.
This amendment ensures that a code of practice under clause 37(1) encompasses duties to deal with the use of a service in connection with terrorism offences as well as terrorism content.
Amendment 86, page 35, line 36, at end insert “or offences within Schedule 6 (child sexual exploitation and abuse offences)”.
This amendment ensures that a code of practice under clause 37(2) encompasses duties to deal with the use of a service in connection with child sexual exploitation and abuse offences as well as CSEA content.
Amendment 87, page 36, line 21, leave out “content” and insert “matters”. —(Damian Collins.)
This amendment is consequential on Amendments 85 and 86.
Clause 47
Duties and the first codes of practice
Amendments made: 88, page 44, line 32, at end insert “or offences within Schedule 5 (terrorism offences)”.
This amendment ensures that a reference to the illegal content duties on providers encompasses a reference to terrorism offences as well as terrorism content.
Amendment 89, page 44, line 35, after “content” insert “or offences within Schedule 6 (child sexual exploitation and abuse offences)”.—(Damian Collins.)
Clause 48
OFCOM’s guidance: record-keeping duties and children’s access assessments
Amendments made: 170, page 45, line 4, at end insert—
‘(A1) OFCOM must produce guidance for providers of Category 1 services to assist them in complying with their duties set out in section (Duties to protect news publisher content) (news publisher content).”
This amendment requires Ofcom to produce guidance for providers of Category 1 services to assist them with complying with their duties under NC19.
Amendment 171, page 45, line 9, leave out “the guidance” and insert “guidance under subsection (1)”.
This amendment means that the consultation requirements in clause 48 would not apply to guidance required to be produced as a result of Amendment 170.
Amendment 172, page 45, leave out “the guidance” and insert “guidance under this section”. —(Damian Collins.)
This amendment requires Ofcom to publish guidance required to be produced as a result of Amendment 170.
Clause 49
“Regulated user-generated content”, “user-generated content”, “news publisher content”
Amendments made: 90, page 46, line 1, leave out “operated” and insert “controlled”.
This amendment uses the term “control” in relation to a person responsible for a bot, which is the language used in NC14.
Amendment 173, page 46, line 11, leave out “on, or reviews of,” and insert “or reviews relating to”. —(Damian Collins.)
This amendment ensures that the wording in this provision is consistent with the wording in paragraph 4(1)(a) of Schedule 1.
Clause 52
“Illegal content” etc
Amendments made: 91, page 49, line 1, leave out paragraph (b).
This amendment leaves out material which is now dealt with in NC14.
Amendment 92, page 49, line 9, leave out paragraph (a) and insert—
“(a) a priority offence, or”
This is a technical amendment to insert a defined term, a “priority offence” (see Amendment 95).
Amendment 93, page 49, line 10, leave out paragraphs (b) and (c).
This amendment is consequential on the new approach of referring to a “priority offence”.
Amendment 94, page 49, line 13, leave out paragraph (d) and insert—
“(d) an offence within subsection (4A).
‘(4A) An offence is within this subsection if—
(a) it is not a priority offence,
(b) the victim or intended victim of the offence is an individual (or individuals), and
(c) the offence is created by this Act or, before or after this Act is passed, by—
(i) another Act,
(ii) an Order in Council,
(iii) an order, rules or regulations made under an Act by the Secretary of State or other Minister of the Crown, including such an instrument made jointly with a devolved authority, or
(iv) devolved subordinate legislation made by a devolved authority with the consent of the Secretary of State or other Minister of the Crown.”
New subsection (4A), inserted by this amendment, describes offences which are relevant for the purposes of the concept of “illegal content”, but which are not priority offences as defined by new subsection (4B) (see Amendment 95). Subsection (4A)(c) requires there to have been some involvement of HMG in relation to the creation of the offence.
Amendment 95, page 49, line 14, at end insert—
‘(4B) “Priority offence” means—
(a) an offence specified in Schedule 5 (terrorism offences),
(b) an offence specified in Schedule 6 (offences related to child sexual exploitation and abuse), or
(c) an offence specified in Schedule 7 (other priority offences).”
This amendment inserts a definition of “priority offence” into clause 52.
Amendment 96, page 49, line 23, that subsection (8) of clause 52 be transferred to the end of line 14 on page 49.
This is a technical amendment moving provision to a more appropriate position in clause 52.
Amendment 97, page 49, leave out line 23 and insert “But an offence is not within subsection (4A)”.
This is a technical amendment consequential on the changes to clause 52 made by Amendment 94.
Amendment 98, page 49, line 35, at end insert—
‘(9A) References in subsection (3) to conduct of particular kinds are not to be taken to prevent content generated by a bot or other automated tool from being capable of amounting to an offence (see also section (Providers’ judgements about the status of content)(7) (providers’ judgements about the status of content)).”
This amendment ensures that content generated by bots is capable of being illegal content (so that the duties about dealing with illegal content may apply to such content).
Amendment 99, page 50, line 1, leave out subsection (12) and insert—
‘(12) In this section—
“devolved authority” means—
(a) the Scottish Ministers,
(b) the Welsh Ministers, or
(c) a Northern Ireland department;
“devolved subordinate legislation” means—
(a) an instrument made under an Act of the Scottish Parliament,
(b) an instrument made under an Act or Measure of Senedd Cymru, or
(c) an instrument made under Northern Ireland legislation;
“Minister of the Crown” has the meaning given by section 8 of the Ministers of the Crown Act 1975 and also includes the Commissioners for Her Majesty’s Revenue and Customs;
“offence” means an offence under the law of any part of the United Kingdom.”
This amendment inserts definitions into clause 52 that are needed as a result of Amendment 94.
Amendment 100, page 50, line 2, at end insert—
‘(13) See also section (Providers’ judgements about the status of content) (providers’ judgements about the status of content).” —(Damian Collins.)
This amendment inserts a signpost into clause 52 pointing to the NC about Providers’ judgements inserted by NC14.
Schedule 3
Timing of providers’ assessments
Amendments made: 147, page 175, line 23, leave out the definition of “illegal content risk assessment guidance”.
This technical amendment is consequential on Amendment 107.
Amendment 148, page 175, line 30, leave out from second “to” to end of line 31 and insert “OFCOM’s guidance under section 85(1).”
Schedule 3 is about the timing of risk assessments etc. This amendment ensures that the provisions about guidance re risk assessments work with the changes made by Amendment 107.
Amendment 149, page 175, line 36, leave out from second “to” to end of line 37 and insert “OFCOM’s guidance under section 85(1A).” —(Damian Collins.)
Schedule 3 is about the timing of risk assessments etc. This amendment ensures that the provisions about guidance re risk assessments work with the changes made by Amendment 107.
Schedule 7
Priority offences
Amendment proposed: 187, page 186, line 32, at end insert—
“Human trafficking
22A An offence under section 2 of the Modern Slavery Act 2015.” —(John Nicolson.)
This amendment includes Human Trafficking as a priority offence.
16:45

Division 36

Ayes: 229

Noes: 294

Mr Deputy Speaker (Mr Nigel Evans)

I am anticipating another Division, as I said, and then I understand there may be some points of order, which I will hear after that Division.

That concludes proceedings on new clauses, new schedules and amendments to those parts of the Bill that have to be concluded by 4.30 pm.

It has been pointed out to me that, in this unusually hot weather, Members should please remember to drink more water. I tried it myself once. [Laughter.]

In accordance with the programme (No. 2) order of today, we now come to new clauses, new schedules and amendments relating to those parts of the Bill to be concluded by 7 pm. We begin with new clause 14, which the House has already debated. I therefore call the Minister to move new clause 14 formally.

New Clause 14

Providers’ judgements about the status of content

“(1) This section sets out the approach to be taken where—

(a) a system or process operated or used by a provider of a Part 3 service for the purpose of compliance with relevant requirements, or

(b) a risk assessment required to be carried out by Part 3, involves a judgement by a provider about whether content is content of a particular kind.

(2) Such judgements are to be made on the basis of all relevant information that is reasonably available to a provider.

(3) In construing the reference to information that is reasonably available to a provider, the following factors, in particular, are relevant—

(a) the size and capacity of the provider, and

(b) whether a judgement is made by human moderators, by means of automated systems or processes or by means of automated systems or processes together with human moderators.

(4) Subsections (5) to (7) apply (as well as subsection (2)) in relation to judgements by providers about whether content is—

(a) illegal content, or illegal content of a particular kind, or

(b) a fraudulent advertisement.

(5) In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is content of the kind in question (and a provider must treat content as content of the kind in question if reasonable grounds for that inference exist).

(6) Reasonable grounds for that inference exist in relation to content and an offence if, following the approach in subsection (2), a provider—

(a) has reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied, and

(b) does not have reasonable grounds to infer that a defence to the offence may be successfully relied upon.

(7) In the case of content generated by a bot or other automated tool, the tests mentioned in subsection (6)(a) and (b) are to be applied in relation to the conduct or mental state of a person who may be assumed to control the bot or tool (or, depending what a provider knows in a particular case, the actual person who controls the bot or tool).

(8) In considering a provider’s compliance with relevant requirements to which this section is relevant, OFCOM may take into account whether providers’ judgements follow the approaches set out in this section (including judgements made by means of automated systems or processes, alone or together with human moderators).

(9) In this section—

“fraudulent advertisement” has the meaning given by section 34 or 35 (depending on the kind of service in question);

“illegal content” has the same meaning as in Part 3 (see section 52);

“relevant requirements” means—

(a) duties and requirements under this Act, and

(b) requirements of a notice given by OFCOM under this Act.”—(Damian Collins.)

This new clause clarifies how providers are to approach judgements (human or automated) about whether content is content of a particular kind, and in particular, makes provision about how questions of mental state and defences are to be approached when considering whether content is illegal content or a fraudulent advertisement.

Brought up.

Question put, That the clause be added to the Bill.

16:59

Division 37

Ayes: 288

Noes: 229

New clause 14 read a Second time, and added to the Bill.
New Clause 15
Guidance about illegal content judgements
“(1) OFCOM must produce guidance for providers of Part 3 services about the matters dealt with in section (Providers’ judgements about the status of content) so far as relating to illegal content judgements.
(2) “Illegal content judgements” means judgements of a kind mentioned in subsection (4) of that section.
(3) Before producing the guidance (including revised or replacement guidance), OFCOM must consult such persons as they consider appropriate.
(4) OFCOM must publish the guidance (and any revised or replacement guidance).”—(Damian Collins.)
This new clause requires OFCOM to give guidance to providers about how they should approach judgements about whether content is illegal content or a fraudulent advertisement.
Brought up, and added to the Bill.
Thangam Debbonaire (Bristol West) (Lab)

On a point of order, Mr Deputy Speaker. Despite over 50 members of the Government resigning last week and many more Tory MPs submitting letters of no confidence in their own leader, the Conservative party continues to prop up this failed Prime Minister until September. They are complicit. They know—indeed, they have said—he is not fit to govern. They told the public so just days ago. Now they seem to be running scared and will not allow the Opposition to table a vote of no confidence. [Hon. Members: “Shame!”] Yes. This is yet another outrageous breach of the conventions that govern our country from a man who disrespected the Queen and illegally prorogued Parliament. Now he is breaking yet another convention. Every single day he is propped up by his Conservative colleagues, he is doing more damage to this country.

Mr Deputy Speaker, are you aware of any other instances where a Prime Minister has so flagrantly ignored the will of this House by refusing to grant time to debate a motion of no confidence in the Government, despite the fact that even his own party does not believe he should be Prime Minister any more? Do you agree with me that this egregious breach of democratic convention only further undermines confidence in this rotten Government?

Margaret Beckett (Derby South) (Lab)

Further to that point of order, Mr Deputy Speaker. I recognise that under the present Prime Minister, this Government have specialised in constitutional innovation. Nevertheless, it certainly seems to me, and I hope it does to you and to the House authorities, that this is stretching the boundaries of what is permissible into the outrageous and beyond, and threatening the democracy of this House.

Dame Angela Eagle (Wallasey) (Lab)

Further to that point of order, Mr Deputy Speaker. The convention is that if the Leader of the Opposition tables a motion of no confidence, it is taken as the next available business. That is what has been done, yet even though we know that large swathes of the party in Government have no confidence in their Prime Minister, they are refusing to acknowledge and honour a time-honoured convention that is the only way to make a debate on that possible. Do you not agree that it is for this House of Commons to test whether any given Prime Minister has its confidence and that his or her Prime Ministership is always based on that? One of the prerequisites for being appointed Prime Minister of this country by the Queen is that that person shall have the confidence of the House of Commons. If we are not allowed to test that now, when on earth will we be allowed to test it?

17:15
Chris Bryant (Rhondda) (Lab)

Further to that point of order, Mr Deputy Speaker. As you know, “Erskine May” says very clearly:

“By established convention, the Government always accedes to the demand from the Leader of the Opposition”

in regard to a no-confidence motion. There has been a very long tradition of all sorts of different kinds of votes of no confidence. Baldwin, Melbourne, Wellington and Salisbury all resigned after a vote on an amendment to the Loyal Address; they considered that to be a vote of no confidence. Derby and Gladstone resigned after an amendment to the Budget; they considered that to be a vote of no confidence. Neville Chamberlain resigned after a motion to adjourn the House, even though he won the vote, because he saw that as a motion of no confidence. So it is preposterous that the Government are trying to say that the motion that is being tabled for tomorrow somehow does not count.

Let me remind Government Members that on 2 August 1965, the motion tabled by the Conservatives was:

“That this House has no confidence in Her Majesty’s Government and deplores the Prime Minister’s conduct of the nation’s affairs.”

I think this House agrees with that today.

Of course, we briefly had the Fixed-term Parliaments Act 2011, which set in statute that there was only one way of having a motion of no confidence, but this Government overturned and repealed that Act. The then Minister, the right hon. Member for Surrey Heath (Michael Gove), came on behalf of the Government to tell the Joint Committee on the Fixed-term Parliaments Act:

“It seems to us to be cleaner and clearer to have a return to a more classical understanding of what a vote of confidence involves.”

It is simple: the Prime Minister is disgraced, he does not enjoy the confidence of the House, and if he simply tries to prevent the House from coming to that decision, it is because he is a coward.

Several hon. Members rose—

Mr Deputy Speaker (Mr Nigel Evans)

I will only allow three more points of order, because this is eating into time for very important business. [Interruption.] They are all similar points of order and we could carry on with them until 7 o’clock, but we are not going to do so.

Karin Smyth (Bristol South) (Lab)

Further to that point of order, Mr Deputy Speaker. At the Public Administration and Constitutional Affairs Committee this morning, Sir John Major presented evidence to us about propriety and ethics. In that very sombre presentation, he talked about being

“at the top of a slope”

down towards the loss of democracy in this country. Ultimately, the will of Parliament is all we have, so if we do not have Parliament to make the case, what other option do we have?

Several hon. Members rose—

Mr Deputy Speaker

Order. I ask the final Members please to show restraint as far as language is concerned, because I am not happy with some of the language that has been used.

Clive Efford (Eltham) (Lab)

Further to that point of order, Mr Deputy Speaker. There have been 50 resignations of Ministers; the Government are mired in controversy; people are acting up as Ministers who are not quite Ministers, as I understand it; and legislation is being delayed. When was there ever a better time for the House to table a motion of no confidence in a Government? This is a cowardly act not by the Prime Minister, but by the Conservative party, which does not want a vote on this issue. Conservative Members should support the move to have a vote of no confidence and have the courage to stand up for their convictions.

Angus Brendan MacNeil (Na h-Eileanan an Iar) (SNP)

Further to that point of order, Mr Deputy Speaker. How can the Conservative party have no confidence in and write letters about the Prime Minister one week yet refuse to come to Parliament the following week to declare that in front of the public?

Kevin Brennan (Cardiff West) (Lab)

Further to that point of order, Mr Deputy Speaker. Can you inform the House of whether Mr Speaker has received any explanation from the Government for this craven and egregious breach of parliamentary convention? If someone were to table a motion under Standing Order No. 24 for tomorrow, has he given any indication of what his attitude would be towards such a motion?

Mr Deputy Speaker

I will answer the question about Standing Order No. 24 first, because I can deal with it immediately: clearly, if an application is made, Mr Speaker will determine it himself.

The principles concerning motions of no confidence are set out at paragraph 18.44 of “Erskine May”, which also gives examples of motions that have been debated and those that have not. “May” says:

“By established convention, the Government always accedes to the demand from the Leader of the Opposition to allot a day for the discussion of a motion tabled by the official Opposition which, in the Government’s view, would have the effect of testing the confidence of the House.”

I can only conclude, therefore, that the Government have concluded that the motion, as tabled by the official Opposition, does not have that effect. That is a matter for the Government, though, rather than for the Chair.

May I say that there are seven more sitting days before recess? As Deputy Speaker, I would anticipate that there will be further discussions.

We now have to move on with the continuation of business on the Bill.

New Clause 7

Duties regarding user-generated pornographic content: regulated services

“(1) This section sets out the duties which apply to regulated services in relation to user-generated pornographic content.

(2) A duty to verify that each individual featuring in the pornographic content has given their permission for the content in which they feature to be published or made available by the service.

(3) A duty to remove pornographic content featuring a particular individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.

(4) For the meaning of ‘pornographic content’, see section 66(2).

(5) In this section, ‘user-generated pornographic content’ means any content falling within the meaning given by subsection (4) and which is also generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.

(6) For the meaning of ‘regulated service’, see section 2(4).”—(Dame Diana Johnson.)

Brought up, and read the First time.

Dame Diana Johnson (Kingston upon Hull North) (Lab)

I beg to move, That the clause be read a Second time.

Mr Deputy Speaker

With this it will be convenient to discuss the following:

New clause 33—Meaning of “pornographic content”

“(1) In this Act ‘pornographic content’ means any of the following—

(a) a video work in respect of which the video works authority has issued an R18 certificate;

(b) content that was included in a video work to which paragraph (a) applies, if it is reasonable to assume from its nature that its inclusion was among the reasons why the certificate was an R18 certificate;

(c) any other content if it is reasonable to assume from its nature that any classification certificate issued in respect of a video work including it would be an R18 certificate;

(d) a video work in respect of which the video works authority has issued an 18 certificate, and that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal;

(e) content that was included in a video work to which paragraph (d) applies, if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that its inclusion was among the reasons why the certificate was an 18 certificate;

(f) any other content if it is reasonable to assume from its nature—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that any classification certificate issued in respect of a video work including it would be an 18 certificate;

(g) a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if—

(i) it includes content that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal, and

(ii) it is reasonable to assume from the nature of that content that its inclusion was among the reasons why the video works authority made that determination;

(h) content that was included in a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that its inclusion was among the reasons why the video works authority made that determination;

(i) any other content if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that the video works authority would determine that a video work including it was not suitable for a classification certificate to be issued in respect of it.

(2) In this section—

‘18 certificate’ means a classification certificate which—

(a) contains, pursuant to section 7(2)(b) of the Video Recordings Act 1984, a statement that the video work is suitable for viewing only by persons who have attained the age of 18 and that no video recording containing that work is to be supplied to any person who has not attained that age, and

(b) does not contain the statement mentioned in section 7(2)(c) of that Act that no video recording containing the video work is to be supplied other than in a licensed sex shop;

‘classification certificate’ has the same meaning as in the Video Recordings Act 1984 (see section 7 of that Act);

‘content’ means—

(a) a series of visual images shown as a moving picture, with or without sound;

(b) a still image or series of still images, with or without sound; or

(c) sound;

‘R18 certificate’ means a classification certificate which contains the statement mentioned in section 7(2)(c) of the Video Recordings Act 1984 that no video recording containing the video work is to be supplied other than in a licensed sex shop;

‘the video works authority’ means the person or persons designated under section 4(1) of the Video Recordings Act 1984 as the authority responsible for making arrangements in respect of video works other than video games;

‘video work’ means a video work within the meaning of the Video Recordings Act 1984, other than a video game within the meaning of that Act.”

This new clause defines pornographic content for the purposes of the Act and would apply to user-to-user services and commercial pornographic content.

Amendment 205, in clause 34, page 33, line 23, at end insert—

“(3A) But an advertisement shall not be regarded as regulated user-generated content and precluded from being a ‘fraudulent advertisement’ by reason of the content constituting the advertisement being generated directly on, uploaded to, or shared on a user-to-user service before being modified to a paid-for advertisement.”

Amendment 206, page 33, line 30, after “has” insert

“or may reasonably be expected to have”.

Amendment 207, in clause 36, page 35, line 12, at end insert—

“(3A) An offence under section 993 of the Companies Act 2006 (fraudulent trading).”

Amendment 208, page 35, line 18, after “(3)” insert “, 3(A)”.

Amendment 209, page 35, line 20, after “(3)” insert “, 3(A)”

Amendment 210, page 35, line 23, after “(3)” insert “, 3(A)”

Amendment 201, in clause 66, page 59, line 8, leave out from “Pornographic content” to end of line 10 and insert

“has the same meaning as section [meaning of pornographic content]”.

This amendment defines pornographic content for the purposes of the Part 5. It is consequential on NC33.

Amendment 56, page 59, line 8, after “content” insert “, taken as a whole,”

This amendment would require that content is considered as a whole before being defined as pornographic content.

Amendment 33, in clause 68, page 60, line 33, at end insert—

“(2A) A duty to verify that every individual featured in regulated provider pornographic content is an adult before the content is published on the service.

(2B) A duty to verify that every individual featured in regulated provider pornographic content that is already published on the service when this Act is passed is an adult and, where that is not the case, remove such content from the service.

(2C) A duty to verify that each individual appearing in regulated provider pornographic content has given their permission for the content in which they appear to be published or made available by the internet service.

(2D) A duty to remove regulated provider pornographic content featuring an individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.”

This amendment creates a duty to verify that each individual featured in pornographic content is an adult and has agreed to the content being uploaded before it is published. It would also impose a duty to remove content if the individual withdraws consent at any time.

Amendment 34, page 60, line 37, leave out “subsection (2)” and insert “subsections (2) to (2D)”.

This amendment is consequential on Amendment 33.

Amendment 31, in clause 182, page 147, line 16, leave out from “unless” to end of line 17 and insert—

“(a) a draft of the instrument has been laid before each House of Parliament,

“(b) the Secretary of State has made a motion in the House of Commons in relation to the draft instrument, and

(c) the draft instrument has been approved by a resolution of each House of Parliament.”

This amendment would require a draft of a statutory instrument containing regulations under sections 53 or 54 to be debated on the floor of the House of Commons, rather than in a delegated legislation committee (as part of the affirmative procedure).

Amendment 158, in clause 192, page 155, line 26, after “including” insert “but not limited to”.

This amendment clarifies that the list of types of content in clause 192 is not exhaustive.

Dame Diana Johnson

May I welcome the Minister to his place, as I did not get an opportunity to speak on the previous group of amendments?

New clause 7 and amendments 33 and 34 would require online platforms to verify the age and consent of all individuals featured in pornographic videos uploaded to their site, as well as enabling individuals to withdraw their consent to the footage remaining on the website. Why are the amendments necessary? Let me read a quotation from a young woman:

“I sent Pornhub begging emails. I pleaded with them. I wrote, ‘Please, I’m a minor, this was assault, please take it down.’”

She received no reply and the videos remained live. That is from a BBC article entitled “I was raped at 14, and the video ended up on a porn site”.

This was no one-off. Some of the world’s biggest pornography websites allow members of the public to upload videos without verifying that everyone in the film is an adult or that everyone in the film gave their permission for it to be uploaded. As a result, leading pornography websites have been found to be hosting and profiting from filmed footage of rape, sex trafficking, image-based sexual abuse and child sexual abuse.

In 2020, The New York Times documented the presence of child abuse videos on Pornhub, one of the most popular pornography websites in the world, prompting Mastercard, Visa and Discover to block the use of their cards for purchases on the site. The New York Times reporter Nicholas Kristof wrote about Pornhub:

“Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.”

Even before that, in 2019, PayPal took the decision to stop processing payments for Pornhub after an investigation by The Sunday Times revealed that the site contained child abuse videos and other illegal content. The newspaper reported:

“Pornhub is awash with secretly filmed ‘creepshots’ of schoolgirls and clips of men performing sex acts in front of teenagers on buses. It has also hosted indecent images of children as young as three.

The website says it bans content showing under-18s and removes it swiftly. But some of the videos identified by this newspaper’s investigation had 350,000 views and had been on the platform for more than three years.”

One of the women who is now being forced to take legal action against Pornhub’s parent company, MindGeek, is Crystal Palace footballer Leigh Nicol. Leigh’s phone was hacked and private content was uploaded to Pornhub without her knowledge. She said in an interview:

“The damage is done for me so this is about the next generation. I feel like prevention is better than someone having to react to this. I cannot change it alone but if I can raise awareness to stop it happening to others then that is what I want to do…The more that you dig into this, the more traumatising it is because there are 14-year-old kids on these websites and they don’t even know about it. The fact that you can publish videos that have neither party’s consent is something that has to be changed by law, for sure.”

Leigh Nicol is spot on.

Unfortunately, when this subject was debated in Committee, the previous Minister, the hon. Member for Croydon South (Chris Philp), argued that the content I have described—including child sexual abuse images and videos—was already illegal, and there was therefore no need for the Government to introduce further measures. However, that misses the point: the Minister was arguing against the very basis of his own Government’s Bill. At the core of the Bill, as I understand it, is a legal duty placed on online platforms to combat and remove content that is already illegal, such as material relating to terrorism. In keeping with that, my amendments would place a legal duty on online platforms hosting pornographic content to combat and remove illegal content through the specific and targeted measure of verifying the age and consent of every individual featured in pornographic content on their sites. The owners and operators of pornography websites are getting very rich from hosting footage of rape, trafficking and child sexual abuse, and they must be held to account under the law and required to take preventive action.

The Organisation for Security and Co-operation in Europe, which leads action to combat human trafficking across 57 member states, recommends that Governments require age and consent verification on pornography websites in order to combat exploitation. The OSCE told me:

“These sites routinely feature sexual violence, exploitation and abuse, and trafficking victims. Repeatedly these sites have chosen profits over reasonable prevention and protection measures. At the most basic level, these sites should be required to ensure that each person depicted is a consenting adult, with robust age verification and the right to withdraw consent at any time. Since self-regulation hasn’t worked, this will only work through strong, state-led regulation”.

Who else supports that? Legislation requiring online platforms to verify the age and consent of all individuals featured in pornographic content on their sites is backed by leading anti-sexual exploitation organisations including CEASE—the Centre to End All Sexual Exploitation—UK Feminista and the Traffickinghub movement, which has driven the global campaign to expose the abuses committed by, in particular, Pornhub.

New clause 7 and amendments 33 and 34 are minimum safety measures that would stop the well-documented practice of pornography websites hosting and profiting from videos of rape, trafficking and child sexual abuse. I urge the Government to reconsider their position, and I will seek to test the will of the House on new clause 7 later this evening.

Adam Afriyie

I echo the concerns expressed by the right hon. Member for Kingston upon Hull North (Dame Diana Johnson). Some appalling abuses are taking place online, and I hope that the Bill goes some way to address them, to the extent that that is possible within the framework that it sets up. I greatly appreciate the right hon. Lady’s comments and her contribution to the debate.

I have a tight and narrow point for the Minister. In amendment 56, I seek to ensure that only pornographic material is caught by the definition in the Bill. My concern is that we catch these abuses online, catch them quickly and penalise them harshly, but also that sites that may display, for example, works of art featuring nudes—or body positivity community sites, of which there are several—are not inadvertently caught in our desire to clamp down on illegal pornographic sites. Perhaps the Minister will say a few words about that in his closing remarks.

Barbara Keeley (Worsley and Eccles South) (Lab)

I rise to speak to this small group of amendments on behalf of the Opposition. Despite everything that is going on at the moment, we must remember that this Bill has the potential to change lives for the better. It is an important piece of legislation, and we cannot miss the opportunity to get it right. I would like to join my hon. Friend the Member for Pontypridd (Alex Davies-Jones) in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins) to his role. His work as Chair of the Joint Committee on this Bill was an important part of the pre-legislative scrutiny process, and I look forward to working in collaboration with him to ensure that this legislation does as it should in keeping us all safe online. I welcome the support of the former Minister, the hon. Member for Croydon South (Chris Philp), on giving access to data to academic researchers and on looking at the changes needed to deal with the harm caused by the way in which algorithmic prompts work. It was a pity he was not persuaded by the amendments in Committee, but better late than never.

17:34
Earlier we debated new clause 14, which will reduce the amount of illegal content and fraudulent advertising that is identified and acted upon. In our view, this new clause undermines and weakens the safety mechanisms that members of the Joint Committee and the Public Bill Committee worked so hard to get right. I hope the Government will reconsider this part of the Bill when it goes through its stages in the House of Lords. Even without new clause 14, though, there are problems with the provisions around fraudulent advertising. Having said that, we were pleased that the Government conceded to our calls in Committee to ensure that major search engines and social media sites would be subject to the same duties to prevent fraudulent advertising from appearing on their sites.
However, there are other changes that we need to see if the Bill is to be successful in reducing the soaring rates of online fraud and changing the UK’s reputation as the
“scam capital of the world”,
according to Which? The Government voted against other amendments tabled in Committee by me and my hon. Friend the Member for Pontypridd that would have tackled the reasons why people become subject to online fraud. Our amendments would have ensured that customers had better protection against scams and a clearer understanding of which search results were paid-for ads. In rejecting our amendments, the Government have missed an opportunity to tackle the many forms of scamming that people experience online.
One of those forms of scamming is in the world of online ticketing. In my role as shadow Minister for the Arts and Civil Society, I have worked on this issue and been informed by the expertise of my hon. Friend the Member for Washington and Sunderland West (Mrs Hodgson), who chairs the all-party parliamentary group on ticket abuse. I would like to thank her and those who have worked with the APPG on the anti-ticket touting campaign for their insights. Ticket reselling websites have a well-documented history of breaching consumer protection laws. These breaches include cases of fraud such as the sale of non-existent tickets. If our amendment had been passed, secondary ticketing websites such as Viagogo would have had to be members of a regulatory body responsible for secondary ticketing such as the Society of Ticket Agents and Retailers, and they would have had to comply with established standards.
I have used ticket touting as an example, but the repercussions of this change go wider to include scamming by holiday websites, debt services and fraudulent passport renewal companies. Our amendments, together with amendments 205 to 210, which were tabled by the hon. Members for Ochil and South Perthshire (John Nicolson) and for Aberdeen North (Kirsty Blackman), would improve protection against scams and close loopholes in the definitions of fraudulent advertising. I hope the Minister recognises how many more scams these clauses would prevent if the amendments were accepted.
Part 5 of the Bill includes provisions that relate to pornographic content, which we have already heard about in this debate. For too long, we have seen a proliferation of websites with illegal and harmful content rife with representations of sexual violence, incest, rape and exploitation, and I thank my right hon. Friend the Member for Kingston upon Hull North (Dame Diana Johnson) for the examples she has just given us. We welcomed the important changes made to the Bill before the Committee stage, which meant that all pornographic content, not just user-generated content, would now be included within the duties in part 5.
Other Members have tabled important amendments to this part of the Bill. New clause 33 and new schedule 1, tabled by the hon. Members for Ochil and South Perthshire and for Aberdeen North, will ensure parity between online and offline content standards for pornography. New clause 33 is important in specifying that content that fails to obtain an R18 certificate has to be removed, just as happens in the offline world under the Video Recordings Act 1984. My right hon. Friend the Member for Kingston upon Hull North tabled amendment 33 and new clause 7, which place new duties on user-generated commercial pornography sites to verify the age and obtain consent of people featured in pornographic content, and to remove content should that consent be withdrawn. These are safeguards that should have been put in place by pornography platforms from the very start.
I would like to raise our concern about how quickly these duties can be brought into force. Clause 196 lays out that the only clauses in part 5 to be enacted once the Bill receives Royal Assent will be those covering the definitions—clauses 66 and 67(4)—and not those covering the duties. Children cannot wait another three years for protections from harm, having been promised this five years ago under part 3 of the Digital Economy Act 2017, which was never implemented. I hope the Minister appreciates the need for speed in regulating this particularly high harm part of the internet.
Part 11 clarifies companies’ liability and outlines the type of information offences contained in the Bill. It is important that liability is at the heart of discussions about the practical applications of the Bill, because we know that big internet companies have got away with doing nothing for far too long. However, the current focus on information offences means that criminal liability for repeated and systematic failures resulting in serious harm to users remains a crucial omission from the Bill.
My hon. Friend the Member for Pontypridd was vocal in making the point, but it needs to be made again, that we are very concerned about the volume of last-minute amendments tabled by the Government, and particularly their last-ditch attempt at power grabbing through amendment 144. The Secretary of State should not have the ability to decide what constitutes a priority offence without appropriate scrutiny, and our amendments would bring appropriate parliamentary oversight.
Amendment 31, in my name and in the name of my hon. Friend, would require that any changes to clauses 53 or 54, on harmful content, are debated on the Floor of the House rather than in a Delegated Legislation Committee. Without this change, the Secretary of State of the day will have the power to make decisions about priority content quietly through secondary legislation, which could have real-life consequences. Any changes to priority content are worthy of proper debate. If the Minister is serious about proper scrutiny of the online safety regime, he should carefully consider amendment 31. I urge hon. Members to support the amendment.
Finally, part 12 includes clarifications and definitions. The hon. Members for Ochil and South Perthshire and for Aberdeen North tabled amendment 158, which would expand the definition of content in the Bill. This is an important future-proofing measure.
As I mentioned, we are concerned about the delays to the implementation of certain duties set out in part 12. We are now in a situation in which many children who need protection will no longer even be children by the time this legislation and its protections come into effect. Current uncertainty about the running of Government will compound the concerns of many charities and children’s advocacy groups. I hope the Minister will agree that we cannot risk further delays.
At its core, the Online Safety Bill should be about reducing harm, and we are all aligned on that aim. I am disappointed that the Government have reversed some of the effectiveness of the scrutiny in Committee by now amending the Bill to such a degree. I hope the Minister considers our amendments in the collaborative spirit in which they are intended, and recognises their potential to make this Bill stronger and more effective for all.
Sir Jeremy Wright

I think it is extraordinarily important that this Bill does what the hon. Member for Worsley and Eccles South (Barbara Keeley) has just described. As the Bill moves from this place to the other place, we must debate what the right balance is between what the Secretary of State must do—in the previous group of amendments, we heard that many of us believe that is too extensive as the Bill stands—what the regulator, Ofcom, must do and what Parliament must do. There is an important judgment call for this House to make on whether we have that balance right in the Bill as it stands.

These amendments are very interesting. I am not convinced that the amendments addressed by the hon. Lady get the balance exactly right either, but there is cause for further discussion about where we in this House believe the correct boundary is between what an independent regulator should be given authority to do under this legislative and regulatory structure and what we wish to retain to ourselves as a legislature.

Adam Afriyie

My right hon. and learned Friend is highlighting, and I completely agree, that there is a very sensitive balance between different power bases and between different approaches to achieving the same outcome. Does he agree that as even more modifications are made—the nipping and tucking I described earlier—this debate and future debates, and these amendments, will contribute to those improvements over the weeks and months ahead?

Sir Jeremy Wright

Yes, I agree with my hon. Friend about that. I hope it is some comfort to the hon. Member for Worsley and Eccles South when I say that if the House does not support her amendment, it should not be taken that she has not made a good point that needs further discussion—probably in the other place, I fear. We are going to have to think carefully about that balance. It is also important that we do not retain to ourselves as a legislature those things that the regulator ought to have in its own armoury. If we want Ofcom to be an effective and independent regulator in this space, we must give it sufficient authority to fulfil that role. She makes interesting points, although I am not sure I can go as far as supporting her amendments. I know that is disappointing, but I do think that what she has done is prompted a further debate on exactly this balance between Secretary of State, Executive, legislature and regulator, which is exactly where we need to be.

I have two other things to mention. The first relates to new clause 7 and amendment 33, which the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) tabled. She speaks powerfully to a clear need to ensure that this area is properly covered. My question, however, is about practicalities. I am happy to take an intervention if she can answer it immediately. If not, I am happy to discuss it with her another time. She has heard me speak many times about making sure that this Bill is workable. The challenge in what she has described in her amendments may be that a platform needs to know how it is to determine and “verify”—that is the word she has used—that a participant in a pornographic video is an adult and a willing participant. It is clearly desirable that the platform should know both of those things, but the question that will have to be answered is: by what mechanism will it establish that? Will it ask the maker of the pornographic video and be prepared to accept the assurances it is given? If not, by what other mechanism should it do this? For example, there may be a discussion to be had on what technology is available to establish whether someone is an adult or is not—that bleeds into the discussion we have had about age assurance. It may be hard for a platform to establish whether someone is a willing participant.

Jess Phillips (Birmingham, Yardley) (Lab)

This has been quite topical this week. When we have things on any platform that is on our television, people absolutely have to have signed forms to say that they are a willing participant. It is completely regular within all other broadcast media that people sign consent forms and that people’s images are not allowed to be used without their consent.

Sir Jeremy Wright

Yes, I am grateful to the hon. Lady for that useful addition to this debate, but it speaks to the point I was seeking to clarify, which is whether what the right hon. Member for Kingston upon Hull North has in mind is to ensure that a platform would be expected to make use of those mechanisms that already exist in order to satisfy itself of the things that she rightly asks it to be satisfied of, or whether something beyond that would be required to meet her threshold. If it is the former, that is manageable for platforms and perfectly reasonable for us to expect of them. If it is the latter, we need to understand a little more clearly how she expects a platform to achieve that greater assurance. If it is that, she makes an interesting point.

Finally, let me come to amendment 56, tabled by my hon. Friend the Member for Windsor (Adam Afriyie). Again, I have a practical concern. He seeks to ensure that the pornographic content is “taken as a whole”, but I think it is worth remembering why we have included pornographic content in the context of this Bill. We have done it to ensure that children are not exposed to this content online and that where platforms are capable of preventing that from happening, that is exactly what they do. There is a risk that if we take this content as a whole, it is perfectly conceivable that there may be content online that is four hours long, only 10 minutes of which is pornographic in nature. It does not seem to me that that in any way diminishes our requirement of a platform to ensure that children do not see those 10 minutes of pornographic content.

Adam Afriyie

I am very sympathetic to that view. I am merely flagging up for the Minister that if we get the opportunity, we need to have a look at it again in the Lords, to be absolutely certain that we are not ruling out certain types of art, and certain types of community sites that we would all think were perfectly acceptable, that are probably not accessible to children, just to ensure that we are not creating further problems down the road that we would have to correct.

17:45
Sir Jeremy Wright

I follow that point. I will channel, with some effort, the hon. Member for Birmingham, Yardley (Jess Phillips), who I suspect would say that these things are already up for debate and discussed in other contexts—the ability to distinguish between art and pornography is something that we have wrestled with in other media. Actually, in relation to the Bill, I think that one of our guiding principles ought to be that we do not reinvent the wheel where we do not have to, and that we seek to apply to the online world the principles and approaches that we would expect in all other environments. That is probably the answer to my hon. Friend’s point.

I think it is very important that we recognise the need for platforms to do all they can to ensure that the wrong type of material does not reach vulnerable users, even if that material is a brief part of a fairly long piece. Those, of course, are exactly the principles that we apply to the classification of films and television. It may well be that a small portion of a programme constitutes material that is unsuitable for a child, but we would still seek to put it the wrong side of the 9 o’clock watershed or use whatever methods we think the regulator ought to adopt to ensure that children do not see it.

Good points are being made. The practicalities are important; it may be that because of a lack of available time and effort in this place, we have to resolve those elsewhere.

John Nicolson

I wish to speak to new clause 33, my proposed new schedule 1 and amendments 201 to 203. I notice that the Secretary of State is off again. I place on record my thanks to Naomi Miles of CEASE—the Centre to End All Sexual Exploitation—and Ceri Finnegan of Barnardo’s for their support.

The UK Government have taken some steps to strengthen protections on pornography and I welcome the fact that young teenagers will no longer be able to access pornography online. However, huge quantities of extreme and harmful pornography remain online, and we need to address the damage that it does. New clause 33 would seek to create parity between online and offline content—consistent legal standards for pornography. It includes a comprehensive definition of pornography and puts a duty on websites not to host content that would fail to attain the British Board of Film Classification standard for R18 classification.

The point of the Bill, as the Minister has repeatedly said, is to make the online world a safer place, by doing what we all agree must be done—making what is illegal offline, illegal online. That is why so many Members think that the lack of regulation around pornography is a major omission in the Bill.

The new clause stipulates age and consent checks for anyone featured in pornographic content. It addresses the proliferation of pornographic content that is both illegal and harmful, protecting women, children and minorities on both sides of the camera.

The Bill presents an opportunity to end the proliferation of illegal and harmful content on the internet. Representations of sexual violence, animal abuse, incest, rape, coercion, abuse and exploitation—particularly directed towards women and children—are rife. Such content can normalise dangerous and abusive acts and attitudes, leading to real-world harm. As the hon. Member for Pontypridd (Alex Davies-Jones) said in her eloquent speech earlier, we are seeing an epidemic of violence against women and girls online. When bile and hatred are so prolific online, they bleed into the offline space. There are real-world harms that flow from that.

The Minister has said how much of a priority tackling violence against women and girls is for him. Knowing that, and knowing him, he will understand that pornography is always harmful to children, and certain kinds of pornographic content are also potentially harmful to adults. Under the Video Recordings Act 1984, the BBFC has responsibility for classifying pornographic content to ensure that it is not illegal, and that it does not promote an interest in abusive relationships, such as incest. Nor can it promote acts likely to cause serious physical harm, such as breath restriction or strangulation. In the United Kingdom, it is against the law to supply pornographic material that does not meet this established BBFC classification standard, but there is no equivalent standard in the online world because the internet evolved without equivalent regulatory oversight.

I know too that the Minister is determined to tackle some of the abusive and dangerous pornographic content online. The Bill does include a definition of pornography, in clause 66(2), but that definition is inadequate; it is too brief and narrow in scope. In my amendment, I propose a tighter and more comprehensive definition, based on that in part 3 of the Digital Economy Act 2017, which was debated in this place and passed into law. The amendment will remove ambiguity and prevent confusion, ensuring that all websites know where they stand with regard to the law.

The new duty on pornographic websites aligns with the UK Government’s 2020 legislation regulating UK-established video-sharing platforms and video-on-demand services, both of which appeal to the BBFC’s R18 classification standards. The same “high standard of rules in place to protect audiences”, as the 2020 legislation put it, and “certain content standards” should apply equally to online pornography and offline pornography, UK-established video-sharing platforms and video-on-demand services.

Let me give some examples sent to me by Barnardo’s, the children’s charity, which, with CEASE, has done incredibly important work in this area. The names have been changed in these examples, for obvious reasons.

“There are also children who view pornography to try to understand their own sexual abuse. Unfortunately, what these children find is content that normalises the most abhorrent and illegal behaviours, such as 15-year-old Elizabeth, who has been sexually abused by a much older relative for a number of years. The content she found on pornography sites depicted older relatives having sex with young girls and the girls enjoying it. It wasn’t until she disclosed her abuse that she realised that it was not normal.

Carrie is a 16-year-old who was being sexually abused by her stepfather. She thought this was not unusual due to the significant amount of content she had seen on pornography sites showing sexual relationships within stepfamilies.”

That is deeply disturbing evidence from Barnardo’s.

Although in theory the Bill will prevent under-18s from accessing such content, the Minister knows that under-18s will be able to bypass regulation through technology like VPNs, as the DCMS Committee and the Bill Committee—I served on both—were told by experts in various evidence sessions. The amendment does not create a new law; it merely moves existing laws into the online space. There is good cause to regulate and sometimes prohibit certain damaging offline content; I believe it is now our duty to provide consistency with legislation in the online world.

Kirsty Blackman

I want to talk about several things, but particularly new clause 7. I am really pleased that the new clause has come back on Report, as we discussed it in the Bill Committee but unfortunately did not get enough support for it there—as was the case with everything we proposed—so I thank the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) for tabling it. I also thank my hon. Friend the Member for Inverclyde (Ronnie Cowan) for his lobbying and for providing us with lots of background information. I agree that it is incredibly important that new clause 7 is agreed, particularly the provisions on consent and making sure that participants are of an appropriate age to be taking part. We have heard so many stories of so many people whose videos are online—whose bodies are online—and there is nothing they can do about it because of the lack of regulation. My hon. Friend the Member for Ochil and South Perthshire (John Nicolson) has covered new clause 33 in an awful lot of detail—very good detail—so I will not comment on that.

The right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) mentioned how we need to get the balance right, and specifically talked about the role of the regulator. In many ways, this Bill has failed to get the balance right in its attempts to protect children online. Many people who have been involved in writing this Bill, talking about this Bill, scrutinising this Bill and taking part in every piece of work that we have done around it do not understand how children use the internet. Some people do, absolutely, but far too many of the people who have had any involvement in this Bill do not. They do not understand the massive benefits to children of using the internet, the immense amount of fun they can have playing Fortnite, Fall Guys, Minecraft, or whatever it is they happen to be playing online and how important that is to them in today’s crazy world with all of the social media pressures. Children need to decompress. This is a great place for children to have fun—to have a wonderful time—but they need to be protected, just as we would protect them going out to play in the park, just the same as we would protect them in all other areas of life. We have a legal age for smoking, for example. We need to make sure that the protections are in place, and the protections that are in place need to be stronger than the ones that are currently in the Bill.

I did not have a chance earlier—or I do not think I did—to support the clause about violence against women and girls. As I said in Committee, I absolutely support that being in the Bill. The Government may say, “Oh we don’t need to have this in the Bill because it runs through everything,” but having that written in the Bill would make it clear to internet service providers—to all those people providing services online and having user-generated content on their sites—how important this is and how much of a scourge it is. Young women who spend their time on social media are more likely to have lower outcomes in life as a result of problematic social media use, as a result of the pain and suffering that is caused. We should be putting such a measure in the Bill, and I will continue to argue for that.

We have talked a lot about pornographic content in this section. There is not enough futureproofing in the Bill. My hon. Friend the Member for Ochil and South Perthshire and I tabled amendment 158 because we are concerned about that lack of futureproofing. The amendment edits the definition of “content”. The current definition of “content” says basically anything online, and it includes a list of stuff. We have suggested that it should say “including but not limited to”, on the basis that we do not know what the internet will look like in two years’ time, let alone what it will look like in 20 years’ time. If this Bill is to stand the test of time, it needs to be clear that that list is not exhaustive. It needs to be clear that, when we are getting into virtual reality metaverses where people are meeting each other, that counts as well. It needs to be clear that the sex dungeon that exists in the child’s game Roblox is an issue—that that content is an issue no matter whether it fits the definition of “content” or whether it fits the fact that it is written communication, images or whatever. It does not need to fit any of that. If it is anything harmful that children can find on the internet, it should be included in that definition of “content”, no matter whether it fits any of those specific categories. We just do not know what the internet is going to look like.

I have one other specific thing in relation to the issues of content and pornography. One of the biggest concerns that we heard is the massive increase in the amount of self-generated child sexual abuse images. A significant number of new images of child sexual abuse are self-generated. Everybody has a camera phone these days. Kids have camera phones these days. They have much more potential to get themselves into really uncomfortable and difficult situations than when most of us were younger. There is so much potential for that to be manipulated unless we get this right.

18:03
I have concerns about the age assurance that was mentioned. If it is livestreamed content, content that is being generated right now at this moment, scanning for those child sexual abuse images will be very difficult. There will not be a hash of it. It has not been discovered before. It is not an image that will have been looked at, categorised and worked out before. It is something that evil people are convincing and forcing children to do today.
That is another place where the Bill fails to recognise how young people use the internet today. It fails to talk specifically about things such as livestreaming and how duties on providing services for children on the internet should reduce things such as the ability to livestream and to have private conversations with potential abusers. All those things are not in the Bill in the way I would like. I understand that the codes of practice will come through and there will be guidance on the risk assessments, but I have not seen enough so far to convince me that people know what they are doing when they are writing those codes of practice.
From what I have heard from Ofcom, it has generally been pretty sensible, but nearly every person I have encountered talking about this Bill who has had or continues to have any say over it does not understand how children actually use the internet. I have been online for nearly 30 years, since I was younger than my children are now. I grew up on the internet. I spend a lot of time on the internet. I have spoken to so many people who I did not know online. I have had so many ridiculous, harmful conversations that I would be aghast and devastated if my children were having now. I do not want that to be happening to tomorrow’s generation of children.
No matter what we put in place, there will always be loopholes and bad actors and there will always be issues, but we want the Bill to be genuinely the best possible. The biggest failing is that lack of understanding of how children interact with the internet. We want it to be a safe place for them. We want them to be able to have great experiences online and to enjoy themselves, and we want to put up those protections in the same way that we put up crossing patrols and so on to protect children, and we are just not there yet.
I appreciate that the Minister and the previous Minister, the hon. Member for Croydon South (Chris Philp), have brought forward a significant number of amendments, although it is unfortunate that they have come at such short notice that we have not had enough time to look at them properly, and I appreciate that they will bring forward more, but there is still more they need to do. Even with all the amendments and all the commitments we have seen, I am still not comfortable enough that my children and my children’s children will be as safe on the internet as they should be.
Lloyd Russell-Moyle (Brighton, Kemptown) (Lab/Co-op)

I rise to support new clauses 7 and 33 in particular. I support them sometimes from a different angle from my hon. Friends, but fundamentally from the same angle: consent. I am not afraid to say that I have a different perspective from some hon. Members in this House in that I view sex work as a legitimate form of work under regulated and protected conditions, and pornography as part of that. What I do have a problem with is the lack of consent that occurs far too often not only in the industry—that may be too broad a term—but in particular content that we see online at the moment.

That is true particularly for those sex workers who might have produced content with consent at the time, as adults, but who later in life realise that they do not wish that material to be available any more—not just because they may be embarrassed about it, but perhaps because they just do not want that material commercially available and people making profits off their bodies any more. They are struggling to get content taken down because they are told, “You gave consent at the time and that can’t now be removed. You have to allow your body to be used.” We would not allow any other form of worker or artist to suffer that. In any other form of music or production, if they wished to remove their consent for it to be played, it would be taken down, but in pornography there seems to be a free-for-all where, even if people remove their consent, it still proliferates in copies of copies that are put all over the internet. That is not even to mention people who never gave their consent at all and experience revenge porn or their phones being hacked and the devastation that that can cause.

I might come from a different position on some of this, but I think we can be united in saying that of course we need better action on under-18s, which is very important, but even for those who have supposedly given their consent at one point or another, the removal of consent must be put into the Bill and platforms must have a strict responsibility to remove that content. Without that being in the Bill, there is a danger that platforms will continue to exploit loophole after loophole and the content will still be there when it should not be.

Ronnie Cowan (Inverclyde) (SNP)

I was not planning to speak, but we have a couple of minutes so I will abuse that position.

I just want to say that I do not want new clause 7 to be lost in this debate and become part of the flotsam and jetsam of the tide of opinion that goes back and forth in this place, because new clause 7 is about consent. We are trying very hard to teach young men all about consent, and if we cannot do it from this place, then when can we do it? We can work out the details of the technology in time, as we always do. It is out there. Other people are way ahead of us in this matter. In fact, the people who produce this pornography are way ahead of us in this matter.

Dame Diana Johnson

While we have been having this debate, Iain Corby, executive director at the Age Verification Providers Association, has sent me an email in which he said that the House may be interested to know that one of the members of that organisation offers adult sites a service that facilitates age verification and the obtaining and maintaining of records of consent. So it is possible to do this if the will is there.

Ronnie Cowan

I absolutely agree. We can also look at this from the point of view of gambling reform and age verification for that. The technology is there, and we can harness and use it to protect people. All I am asking is that we do not let this slip through the cracks this evening.

Damian Collins

We have had an important debate raising a series of extremely important topics. While the Government may not agree with the amendments that have been tabled, that is not because of a lack of seriousness of concern about the issues that have been raised.

The right hon. Member for Kingston upon Hull North (Dame Diana Johnson) spoke very powerfully. I have also met Leigh Nicol, the lady she cited, and she discussed with me the experience that she had. Sadly, it was during lockdown and it was a virtual meeting rather than face to face. There are many young women, in particular, who have experienced the horror of having intimate images shared online without their knowledge or consent and then gone through the difficult experience of trying to get them removed, even when it is absolutely clear that they should be removed and are there without their consent. That is the responsibility of the companies and the platforms to act on.

Thinking about where we are now, before the Bill passes, the requirement to deal with illegal content, even the worst illegal content, on the platforms is still largely based on the reporting of that content, without the ability for us to know how effective they are at actually removing it. That is largely based on old legislation. The Bill will move on significantly by creating proactive responsibilities not just to discover illegal content but to act to mitigate it and to be audited to see how effectively it is done. Under the Bill, that now includes content that would be considered to be an abuse of children: a child cannot give consent to have sex or to appear in pornographic content. Companies need to make sure that what they are doing is sufficient to meet that need.

It should be for the regulator, Ofcom, as part of putting together the codes of practice, to understand, even on more extreme content, what systems companies have in place to ensure that they are complying with the law and certainly not knowingly hosting content that has been flagged to them as being non-consensual pornography or child abuse images, which is effectively what pornography with minors would be; and to understand what systems they have in place to make sure that they are complying with the law and, as hon. Members have said, making sure that they are using available technologies in order to deliver that.

Jess Phillips

We have an opportunity here today to make sure that the companies are doing it. I am not entirely sure why we would not take that opportunity to legislate to make sure that they are. With the greatest of respect to the Minister back in a position of authority, it sounds an awful lot like the triumph of hope over experience.

Damian Collins

It is because of the danger of such a sentiment that this Bill is so important. It not just sets the targets and requirements of companies to act against illegal content, but enables a regulator to ensure that they have the systems and processes in place to do it, that they are using appropriate technology and that they apply the principle that their system should be effective at addressing this issue. If they are defective, that is a failure on the company’s part. It cannot be good enough that the company says, “It is too difficult to do”, when they are not using technologies that would readily solve that problem. We believe that the technologies that the companies have and the powers of the regulator to have proper codes of practice in place and to order the companies to make sure they are doing it will be sufficient to address the concern that the hon. Lady raises.

Dame Diana Johnson

I am a little taken aback that the Minister believes that the legislation will be sufficient. I do not understand why he has not responded to the point that my hon. Friend the Member for Birmingham, Yardley (Jess Phillips) was making that we could make this happen by putting the proposal in the Bill and saying, “This is a requirement.” I am not sure why he thinks that is not the best way forward.

Damian Collins

It is because the proposal would not make such content more illegal than it is now. It is already illegal and there are already legal duties on companies to act. The regulator’s job is to ensure they have the systems in place to do that effectively, and that is what the Bill sets out. We believe that the Bill addresses the serious issue that the right hon. Lady raises in her amendments. That legal requirement is there, as is the ability to have the systems in place.

If I may, I will give a different example based on the fraud example given by the shadow Minister, the hon. Member for Worsley and Eccles South (Barbara Keeley). On the Joint Committee that scrutinised the Bill, we pushed hard to have fraudulent ads included within the scope of the Bill, which has been one of the important amendments to it. The regulator can consider what systems the company should have in place to identify fraud, but also what technologies it employs to make it far less likely that fraud would be there in the first place. Google has a deal with the Financial Conduct Authority, whereby it limits advertisers from non-accredited companies advertising on its platform. That makes it far less likely that fraud will be discovered because, if the system works, only properly recognised organisations will be advertising.

Facebook does not have such a system in place. As a consequence, since the Google system went live, we have seen a dramatic drop in fraud ads on Google, but a substantial increase in fraud ads on Facebook and platforms such as Instagram. That shows that if we have the right systems in place, we can have a better outcome and change the result. The job of the regulator with illegal pornography and other illegal content should be to look at those systems and say, “Do the companies have the right technology to deliver the result that is required?” If they do not, that would still be a failure of the codes.

Barbara Keeley

The Minister is quoting a case that I quoted in Committee, and the former Minister, the hon. Member for Croydon South (Chris Philp), would not accept amendments on this issue. We could have tightened up on fraudulent advertising. If Google can do that for financial ads, other platforms can do it. We tabled an amendment that the Government did not accept. I do not know why this Minister is quoting something that we quoted in Committee—I know he was not there, but he needs to know that we tried this and the former Minister did not accept what we called for.

Damian Collins

I am quoting that case merely because it is a good example of how, if we have better systems, we can get a better result. As part of the codes of practice, Ofcom will be able to look at some of these other systems and say to companies, “This is not just about content moderation; it is about having better systems that detect known illegal activity earlier and prevent it from getting on to the platform.” It is not about how quickly it is removed, but how effective companies are at stopping it ever being there in the first place. That is within the scope of regulation, and my belief is that those powers exist at the moment and therefore should be used.

Jess Phillips

Just to push on this point, images of me have appeared on pornographic sites. They were not necessarily illegal images of anything bad happening to me, but other Members of Parliament in this House and I have suffered from that. Is the Minister telling me that this Bill will allow me to get in touch with that site and have an assurance that that image will be taken down and that it would be breaking the law if it did not do so?

Damian Collins

The Bill absolutely addresses the sharing of non-consensual images in that way, so that would be something the regulator should take enforcement action against—

Damian Collins

Well, the regulator is required, and has the power, to take enforcement action against companies for failing to do so. That is what the legislation sets out, and we will be in a very different place from where we are now. That is why the Bill constitutes a very significant reform.

Lloyd Russell-Moyle

Will the Minister give way?

Damian Collins

Very briefly, and then I want to wrap up.

Lloyd Russell-Moyle

Could the Minister give me a reassurance about when consent is withdrawn? The image may initially have been there “consensually”—I would put that in inverted commas—so the platform is okay to put it there. However, if someone contacts the platform saying that they now want to change their consent—they may want to take a role in public life, having previously had a different role; I am not saying that about my hon. Friend the Member for Birmingham, Yardley (Jess Phillips)—my understanding is that there is no ability legally to enforce that content coming down. Can the Minister correct me, and if not, why is he not supporting new clause 7?

18:15
Damian Collins

With people who have appeared in pornographic films consensually and signed contracts to do so, that would be a very different matter from the question of intimate images being shared without consent. When someone has not consented for such images to be there, that would be a very different matter. I am saying that the Bill sets out very clearly—it did not do so in draft form—that non-consensual sexual images and extreme pornography are within the scope of the regulator’s power. The regulator should be taking action not just on what a company does to take such content down when it is discovered after the event, but on what systems the company has in place and whether it deploys all available technology to make sure that such content is never there in the first place.

Before closing, I want to touch briefly on the point raised about the Secretary of State’s powers to designate priority areas of harm. This is now under the affirmative procedure in the Bill, and it requires the approval of both Houses of Parliament. The priority illegal harms will be based on offences that already exist in law, and we are writing those priority offences into the Bill. The other priorities will be areas where the regulator will seek to test whether companies adhere to their terms of service. The new transparency requirements will set that out, and the Government have said that we will set out in more detail which of those priority areas of harm such transparency will apply to. There is still more work to be done on that, but we have given an indicative example. However, when it comes to adding a new priority illegal offence to the Bill, the premise is that it will already be an offence that Parliament has created, and writing it into the Bill will be done with the positive consent of Parliament. I think that is a substantial improvement on where the Bill was before. I am conscious that I have filled my time.

Question put, That the clause be read a Second time.

18:17

Division 38

Ayes: 220

Noes: 285

Clause 34
Duties about fraudulent advertising: Category 1 services
Amendment made: 83, page 33, line 20, leave out “and (9)” and insert “, (9) and (9A)”.—(Damian Collins.)
This technical amendment ensures that a reference to clause 52 takes account of the new subsection inserted by Amendment 98.
Clause 35
Duties about fraudulent advertising: Category 2A services
Amendment made: 84, page 34, line 21, leave out “and (9)” and insert “, (9) and (9A)”.—(Damian Collins.)
This technical amendment ensures that a reference to clause 52 takes account of the new subsection inserted by Amendment 98.
Clause 63
Interpretation of this Chapter
Amendment made: 101, page 56, line 32, leave out “and (9)” and insert “, (9) and (9A)”.—(Damian Collins.)
This technical amendment ensures that a reference to clause 52 takes account of the new subsection inserted by Amendment 98.
Clause 176
Powers to amend section 36
Amendment made: 141, page 141, line 39, leave out “and (9)” and insert “, (9) and (9A)”.—(Damian Collins.)
This technical amendment ensures that a reference to clause 52 takes account of the new subsection inserted by Amendment 98.
Clause 177
Powers to amend or repeal provisions relating to exempt content or services
Amendments made: 177, page 142, line 6, at beginning insert “Subject to subsection (2A),”.
Amendment 178, page 142, line 7, after “49” insert “(2)(e),”.
Amendment 179, page 142, line 8, leave out paragraph (b).
Amendment 180, page 142, line 12, at end insert—
“(2A) Regulations under subsection (2) may not have the effect that comments and reviews on provider content present on a service of which the provider is a recognised news publisher become regulated user-generated content within the meaning of Part 3.”
Amendment 181, page 142, line 25, leave out “any” and insert “either”.
Amendment 182, page 142, line 28, leave out paragraph (b).
Amendment 183, page 142, line 34, at end insert—
“(7A) Subject to subsection (7B), the Secretary of State may by regulations amend paragraph 4 of Schedule 1 (limited functionality services) if the Secretary of State considers that it is appropriate because of the risk of harm to individuals in the United Kingdom presented by a service described in that paragraph.
(7B) Regulations under subsection (7A) may not have the effect that a service described in paragraph 4 of Schedule 1 of which the provider is a recognised news publisher is no longer exempt under that paragraph.”
Amendment 184, page 142, line 46, leave out subsection (11) and insert—
‘(11) In this section—
“comments and reviews on provider content” and “one-to-one live aural communications” have the meaning given by section 49;
“recognised news publisher” has the meaning given by section 50;
“regulated provider pornographic content” and “published or displayed” have the same meaning as in Part 5 (see section 66).’—(Damian Collins.)
Amendments 177 to 184 ensure that the power to amend the definition of “regulated user-generated content” in clause 49 cannot be exercised so as to include comments and reviews on content on services provided by recognised news publishers, and the power to amend paragraph 4 of Schedule 1 (limited functionality services) cannot be exercised so as to remove such services which are provided by recognised news publishers from the exemption.
Clause 179
Powers to amend Schedules 5, 6 and 7
Amendments made: 142, page 145, leave out lines 13 and 14 and insert—
“But an offence may be added to that Schedule only on the grounds in subsection (4) or (4A), and subsection (5) limits the power to add an offence.”
This amendment is consequential on Amendments 143 and 144.
Amendment 143, page 145, line 15, leave out from beginning to “the” and insert
“The first ground for adding an offence to Schedule 7 is that”.
This amendment is consequential on Amendment 144.
Amendment 144, page 145, line 24, at end insert—
“(4A) The second ground for adding an offence to Schedule 7 is that the Secretary of State considers it appropriate to do so because of—
(a) the prevalence of the use of regulated user-to-user services for the commission or facilitation of that offence,
(b) the risk of harm to individuals in the United Kingdom presented by the use of such services for the commission or facilitation of that offence, and
(c) the severity of that harm.”
This amendment extends the Secretary of State’s power to make regulations adding an offence to Schedule 7 so that it will be a priority offence. The new grounds concern the prevalence of user-to-user services being used to commit or facilitate the offence in question.
Amendment 145, page 146, line 5, leave out “and (9)” and insert “, (9) and (9A)”.—(Damian Collins.)
This technical amendment ensures that a reference to clause 52 takes account of the new subsection inserted by Amendment 98.
Clause 182
Parliamentary procedure for regulations
Amendment made: 185, page 147, line 3, after “(6)” insert “, (7A)”.—(Damian Collins.)
This amendment ensures that regulations under clause 177(7A) (inserted by Amendment 183) are subject to the affirmative procedure.
Amendment proposed: 31, page 147, line 16, leave out from “unless” to end of line 17 and insert—
“(a) a draft of the instrument has been laid before each House of Parliament,
“(b) the Secretary of State has made a motion in the House of Commons in relation to the draft instrument, and
(c) the draft instrument has been approved by a resolution of each House of Parliament.”—(Barbara Keeley.)
This amendment would require a draft of a statutory instrument containing regulations under sections 53 or 54 to be debated on the floor of the House of Commons, rather than in a delegated legislation committee (as part of the affirmative procedure).
Question put, That the amendment be made.
18:34

Division 39

Ayes: 188

Noes: 283

Clause 193
Index of defined terms
Amendments made: 186, page 158, line 25, at end insert—

“journalistic content (in Part 3)    section 16”

This is a technical amendment adding a definition of “journalistic content” to the index of defined terms.
Amendment 146, page 159, line 18, at end insert—

“priority offence (in Part 3)    section 52”

—(Damian Collins.)
This technical amendment adds a definition of “priority offence” to the index of defined terms.
Schedule 8
Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services
Amendment made: 150, page 188, line 29, at end insert—
“7A Features, including functionalities, that a provider considers may contribute to risks of harm to individuals using the service, and measures taken or in use by the provider to mitigate and manage those risks.”—(Damian Collins.)
This amendment adds a new matter to Schedule 8, which is about things that providers can be asked to provide transparency reports about. The new matter is about risks around functionalities used by user-to-user services.
Ordered, That further consideration be now adjourned. —(James Duddridge.)
Bill to be further considered tomorrow.
Business of the House (Today)
Ordered,
That, at today’s sitting, the Speaker shall put the Questions necessary to dispose of proceedings on the Motion in the name of Mark Spencer relating to Restoration and Renewal of the Palace of Westminster not later than two hours after the commencement of proceedings on the Motion for this Order; such Questions shall include the Questions on any Amendments selected by the Speaker which may then be moved; the business on that Motion may be entered upon and proceeded with at any hour, though opposed; and Standing Order No. 41A (Deferred divisions) shall not apply.—(James Duddridge.)

Online Safety Bill

[2nd Allocated Day]
[Relevant documents: Report of the Joint Committee on the Draft Online Safety Bill, Session 2021-22: Draft Online Safety Bill, HC 609, and the Government Response, CP 640; First Report of the Digital, Culture, Media and Sport Committee, Amending the Online Safety Bill, HC 271; Second Report of the Petitions Committee, Session 2021-22, Tackling Online Abuse, HC 766, and the Government response, HC 1224; Letter from the Minister for Tech and the Digital Economy to the Chair of the Joint Committee on Human Rights relating to the Online Safety Bill, dated 16 June 2022; Letter from the Chair of the Joint Committee on Human Rights to the Secretary of State for Digital, Culture, Media and Sport relating to the Online Safety Bill, dated 19 May 2022; e-petition 272087, Hold online trolls accountable for their online abuse via their IP address; e-petition 332315, Ban anonymous accounts on social media; e-petition 575833, Make verified ID a requirement for opening a social media account; e-petition 582423, Repeal Section 127 of the Communications Act 2003 and expunge all convictions; e-petition 601932, Do not restrict our right to freedom of expression online.]
Further consideration of Bill, as amended in the Public Bill Committee
Mr Speaker

Before I call the Minister to open the debate, I have something to say about the scope of today’s debate. This is day 2 of debate on consideration of the Bill as amended in the Public Bill Committee. We are debating today only the new clauses, amendments and new schedules listed on the selection paper that I have issued today.

Members may be aware that the Government have tabled a programme motion that would recommit certain clauses and schedules to a Public Bill Committee. There will be an opportunity to debate that motion following proceedings on consideration. The Government have also published a draft list of proposed amendments to the Bill that they intend to bring forward during the recommittal process. These amendments are not in scope for today. There will be an opportunity to debate, at a future Report stage, the recommitted clauses and schedules, as amended on recommittal in the Public Bill Committee.

Most of today’s amendments and new clauses do not relate to the clauses and schedules that are being recommitted. These amendments and new clauses have been highlighted on the selection paper. Today will be the final chance for the Commons to consider them: there will be no opportunity for them to be tabled and considered again at any point during the remaining Commons stages.

New Clause 11

Notices to deal with terrorism content or CSEA content (or both)

“(1) If OFCOM consider that it is necessary and proportionate to do so, they may give a notice described in subsection (2), (3) or (4) relating to a regulated user-to-user service or a regulated search service to the provider of the service.

(2) A notice under subsection (1) that relates to a regulated user-to-user service is a notice requiring the provider of the service—

(a) to do any or all of the following—

(i) use accredited technology to identify terrorism content communicated publicly by means of the service and to swiftly take down that content;

(ii) use accredited technology to prevent individuals from encountering terrorism content communicated publicly by means of the service;

(iii) use accredited technology to identify CSEA content, whether communicated publicly or privately by means of the service, and to swiftly take down that content;

(iv) use accredited technology to prevent individuals from encountering CSEA content, whether communicated publicly or privately, by means of the service; or

(b) to use the provider’s best endeavours to develop or source technology for use on or in relation to the service or part of the service, which—

(i) achieves the purpose mentioned in paragraph (a)(iii) or (iv), and

(ii) meets the standards published by the Secretary of State (see section 106(10)).

(3) A notice under subsection (1) that relates to a regulated search service is a notice requiring the provider of the service—

(a) to do either or both of the following—

(i) use accredited technology to identify search content of the service that is terrorism content and to swiftly take measures designed to secure, so far as possible, that search content of the service no longer includes terrorism content identified by the technology;

(ii) use accredited technology to identify search content of the service that is CSEA content and to swiftly take measures designed to secure, so far as possible, that search content of the service no longer includes CSEA content identified by the technology; or

(b) to use the provider’s best endeavours to develop or source technology for use on or in relation to the service which—

(i) achieves the purpose mentioned in paragraph (a)(ii), and

(ii) meets the standards published by the Secretary of State (see section 106(10)).

(4) A notice under subsection (1) that relates to a combined service is a notice requiring the provider of the service—

(a) to do any or all of the things described in subsection (2)(a) in relation to the user-to-user part of the service, or to use best endeavours to develop or source technology as described in subsection (2)(b) for use on or in relation to that part of the service;

(b) to do either or both of the things described in subsection (3)(a) in relation to the search engine of the service, or to use best endeavours to develop or source technology as described in subsection (3)(b) for use on or in relation to the search engine of the service;

(c) to do any or all of the things described in subsection (2)(a) in relation to the user-to-user part of the service and either or both of the things described in subsection (3)(a) in relation to the search engine of the service; or

(d) to use best endeavours to develop or source—

(i) technology as described in subsection (2)(b) for use on or in relation to the user-to-user part of the service, and

(ii) technology as described in subsection (3)(b) for use on or in relation to the search engine of the service.

(5) For the purposes of subsections (2) and (3), a requirement to use accredited technology may be complied with by the use of the technology alone or by means of the technology together with the use of human moderators.

(6) See—

(a) section (Warning notices), which requires OFCOM to give a warning notice before giving a notice under subsection (1), and

(b) section 105 for provision about matters which OFCOM must consider before giving a notice under subsection (1).

(7) A notice under subsection (1) relating to terrorism content present on a service must identify the content, or parts of the service that include content, that OFCOM consider is communicated publicly on that service (see section 188).

(8) For the meaning of “accredited” technology, see section 106(9) and (10).”—(Julia Lopez.)

This clause replaces existing clause 104. The main changes are: for user-to-user services, a notice may require the use of accredited technology to prevent individuals from encountering terrorism or CSEA content; for user-to-user and search services, a notice may require a provider to use best endeavours to develop or source technology to deal with CSEA content.

Brought up, and read the First time.

15:33
Mr Speaker

With this it will be convenient to discuss the following:

Government new clause 12—Warning notices.

Government new clause 20—OFCOM’s reports about news publisher content and journalistic content.

Government new clause 40—Amendment of Enterprise Act 2002.

Government new clause 42—Former providers of regulated services.

Government new clause 43—Amendments of Part 4B of the Communications Act.

Government new clause 44—Repeal of Part 4B of the Communications Act: transitional provision etc.

Government new clause 51—Publication by providers of details of enforcement action.

Government new clause 52—Exemptions from offence under section 152.

Government new clause 53—Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2).

New clause 1—Provisional re-categorisation of a Part 3 service

“(1) This section applies in relation to OFCOM’s duty to maintain the register of categories of regulated user-to-user services and regulated search services under section 83.

(2) If OFCOM—

(a) consider that a Part 3 service not included in a particular part of the register is likely to meet the threshold conditions relevant to that part, and

(b) reasonably consider that urgent application of duties relevant to that part is necessary to avoid or mitigate significant harm,

New clause 16—Communication offence for encouraging or assisting self-harm

“(1) In the Suicide Act 1961, after section 3 insert—

“3A Communication offence for encouraging or assisting self-harm

(1) A person (“D”) commits an offence if—

(a) D sends a message,

(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and

(c) D’s act was intended to encourage or assist the infliction of serious physical harm.

(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.

(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.

(4) A person guilty of an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;

(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.

(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.

(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.

(7) If D arranges for a person (“D2”) to do an Act and D2 does that Act, D is also to be treated as having done that Act for the purposes of subsection (1).

(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—

(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and

(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and

(c) the message was wholly motivated by compassion towards D or to promote the interests of P’s health or wellbeing.””

This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.

New clause 17—Liability of directors for compliance failure

“(1) This section applies where OFCOM considers that there are reasonable grounds for believing that a provider of a regulated service has failed, or is failing, to comply with any enforceable requirement (see section 112) that applies in relation to the service.

(2) If OFCOM considers that the failure results from any—

(a) action,

(b) direction,

(c) neglect, or

(d) with the consent

This new clause would enable Ofcom to exercise its enforcement powers under Chapter 6, Part 7 of the Bill against individual directors, managers and other officers at a regulated service provider where it considers the provider has failed, or is failing, to comply with any enforceable requirement.

New clause 23—Financial support for victims support services

“(1) The Secretary of State must by regulations make provision for penalties paid under Chapter 6 to be used for funding for victims support services.

(2) Those regulations must—

(a) specify criteria setting out which victim support services are eligible for financial support under this provision;

(b) set out a means by which the amount of funding available should be determined;

(c) make provision for the funding to be reviewed and allocated on a three year basis.

(3) Regulations under this section—

(a) shall be made by statutory instrument, and

(b) may not be made unless a draft has been laid before and approved by resolution of each House of Parliament.”

New clause 28—Establishment of Advocacy Body

“(1) There is to be a body corporate (“the Advocacy Body”) to represent interests of child users of regulated services.

(2) A “child user”—

(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and

(b) includes both any existing child user and any future child user.

(3) The work of the Advocacy Body may include—

(a) representing the interests of child users;

(b) the protection and promotion of these interests;

(c) any other matter connected with those interests.

(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—

(a) safety duties about illegal content, in particular CSEA content;

(b) safety duties protecting children;

(c) “enforceable requirements” relating to children.

(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.

(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.

(7) The Advocacy Body must assess emerging threats to child users of regulated services and must bring information regarding these threats to OFCOM.

(8) The Advocacy Body may undertake research on their own account.

(9) The Secretary of State must either appoint an organisation known to represent children to be designated the functions under this Act, or create an organisation to carry out the designated functions.

(10) The budget of the Advocacy Body will be subject to annual approval by the board of OFCOM.

(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 71).”

New clause 29—Duty to promote media literacy: regulated user-to-user services and search services

“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.

(2) This section applies only in relation to OFCOM’s duty to regulate—

(a) user-to-user services, and

(b) search services.

(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—

(a) to reach audiences who are less engaged with, and harder to reach through, traditional media literacy initiatives;

(b) to address gaps in the availability and accessibility of media literacy provisions targeted at vulnerable users;

(c) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;

(d) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by—

(i) carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public;

(ii) seeking to ensure, through the exercise of OFCOM’s online safety functions, that providers of regulated services take appropriate measures to improve users’ media literacy;

(iii) seeking to improve the evaluation of the effectiveness of the initiatives and measures mentioned in sub paras (2)(d)(i) and (ii) (including by increasing the availability and adequacy of data to make those evaluations);

(e) to promote better coordination within the media literacy sector.

(4) OFCOM may prepare such guidance about the matters referred to in subsection (2) as it considers appropriate.

(5) Where OFCOM prepares guidance under subsection (4) it must—

(a) publish the guidance (and any revised or replacement guidance); and

(b) keep the guidance under review.

(6) OFCOM must co-operate with the Secretary of State in the exercise and performance of their duty under this section.”

This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 30—Media literacy strategy

“(1) OFCOM must prepare a strategy which sets out how they intend to undertake their duty to promote media literacy in relation to regulated user-to-user services and regulated search services under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The strategy must—

(a) set out the steps OFCOM propose to take to achieve the pursuit of the objectives set out in section (Duty to promote media literacy: regulated user-to-user services and search services),

(b) set out the organisations, or types of organisations, that OFCOM propose to work with in undertaking the duty;

(c) explain why OFCOM considers that the steps it proposes to take will be effective;

(d) explain how OFCOM will assess the extent of the progress that is being made under the strategy.

(3) In preparing the strategy OFCOM must have regard to the need to allocate adequate resources for implementing the strategy.

(4) OFCOM must publish the strategy within the period of 6 months beginning with the day on which this section comes into force.

(5) Before publishing the strategy (or publishing a revised strategy), OFCOM must consult—

(a) persons with experience in or knowledge of the formulation, implementation and evaluation of policies and programmes intended to improve media literacy;

(b) the advisory committee on disinformation and misinformation, and

(c) any other person that OFCOM consider appropriate.

(6) If OFCOM have not revised the strategy within the period of 3 years beginning with the day on which the strategy was last published, they must either—

(a) revise the strategy, or

(b) publish an explanation of why they have decided not to revise it.

(7) If OFCOM decides to revise the strategy they must—

(a) consult in accordance with subsection (3), and

(b) publish the revised strategy.”

This new clause requires Ofcom to prepare and publish a strategy setting out how it will discharge its duty to promote media literacy in relation to regulated user-to-user services and search services.

New clause 31—Research conducted by regulated services

“(1) OFCOM may, at any time it considers appropriate, produce a report into how regulated services commission, collate, publish and make use of research.

(2) For the purposes of the report, OFCOM may require services to submit to OFCOM—

(a) a specific piece of research held by the service, or

(b) all research the service holds on a topic specified by OFCOM.”

New clause 34—Factual Accuracy

“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.

(2) Any Regulated Service must provide an index of the historic factual accuracy of material published by each user who has—

(a) produced user-generated content,

(b) news publisher content, or

(c) comments and reviews on provider content.

(3) The index under subsection (1) must—

(a) satisfy minimum quality criteria to be set by OFCOM, and

(b) be displayed in a way which allows any user easily to reach an informed view of the likely factual accuracy of the content at the same time as they encounter it.”

New clause 35—Duty of balance

“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.

(2) Any Regulated Service which selects or prioritises particular—

(a) user-generated content,

(b) news publisher content, or

(c) comments and reviews on provider content

New clause 36—Identification of information incidents by OFCOM

“(1) OFCOM must maintain arrangements for identifying and understanding patterns in the presence and dissemination of harmful misinformation and disinformation on regulated services.

(2) Arrangements for the purposes of subsection (1) must in particular include arrangements for—

(a) identifying, and assessing the severity of, actual or potential information incidents; and

(b) consulting with persons with expertise in the identification, prevention and handling of disinformation and misinformation online (for the purposes of subsection (2)(a)).

(3) Where an actual or potential information incident is identified, OFCOM must as soon as reasonably practicable—

(a) set out any steps that OFCOM plans to take under its online safety functions in relation to that situation; and

(b) publish such recommendations or other information that OFCOM considers appropriate.

(4) Information under subsection (3) may be published in such a manner as appears to OFCOM to be appropriate for bringing it to the attention of the persons who, in OFCOM’s opinion, should be made aware of it.

(5) OFCOM must prepare and issue guidance about how it will exercise its functions under this section and, in particular—

(a) the matters it will take into account in determining whether an information incident has arisen;

(b) the matters it will take into account in determining the severity of an incident; and

(c) the types of responses that OFCOM thinks are likely to be appropriate when responding to an information incident.

(6) For the purposes of this section—

“harmful misinformation or disinformation” means misinformation or disinformation which, taking into account the manner and extent of its dissemination, may have a material adverse effect on users of regulated services or other members of the public;

“information incident” means a situation where it appears to OFCOM that there is a serious or systemic dissemination of harmful misinformation or disinformation relating to a particular event or situation.”

This new clause would give Ofcom a proactive role in identifying and responding to the sorts of information incidents that can occur in moments of crisis.

New clause 37—Duty to promote media literacy: regulated user-to-user services and search services

“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.

(2) This section applies only in relation to OFCOM’s duty to regulate—

(a) user-to-user services, and

(b) search services.

(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—

(a) to encourage the development and use of technologies and systems in relation to user-to-user services and search services which help to improve the media literacy of members of the public, including in particular technologies and systems which—

(i) indicate the nature of content on a service (for example, show where it is an advertisement);

(ii) indicate the reliability and accuracy of the content; and

(iii) facilitate control over what content is received;

(b) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;

(c) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public.

(4) OFCOM must prepare guidance about—

(a) the matters referred to in subsection (3) as it considers appropriate; and

(b) minimum standards that media literacy initiatives must meet.

(5) Where OFCOM prepares guidance under subsection (4) it must—

(a) publish the guidance (and any revised or replacement guidance); and

(b) keep the guidance under review.

(6) Every report under paragraph 12 of the Schedule to the Office of Communications Act 2002 (OFCOM’s annual report) for a financial year must contain a summary of the steps that OFCOM have taken under subsection (1) in that year.”

This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 45—Sharing etc intimate photographs or film without consent

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) A person (A) does not commit an offence under this section if A shares a photograph or film of another person (B) with B or a third person (C) if—

(a) the photograph or film only shows activity that would ordinarily be seen on a public street, except for a photograph or film of breastfeeding;

(b) the photograph or film was taken in public, where the person depicted was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;

(c) A reasonably believed that the photograph or film, taken in public, showed a person depicted who was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;

(d) the photograph or film has been previously shared with consent in public;

(e) A reasonably believed that the photograph or film had been previously shared with consent in public;

(f) the photograph or film shows a young child and is of a kind ordinarily shared by family and friends;

(g) the photograph or film is of a child shared for that child’s medical care or treatment, where there is parental consent.

(4) A person (A) does not commit an offence under this section if A shares information about where to access a photograph or film where this photograph or film has already been made available to A.

(5) It is a defence for a person charged with an offence under this section to prove that they—

(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime;

(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings;

(c) reasonably believed that the sharing was necessary for the administration of justice;

(d) reasonably believed that the sharing was necessary for a genuine medical, scientific or educational purpose; and

(e) reasonably believed that the sharing was in the public interest.

(6) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(7) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(8) “Photograph” includes the negative as well as the positive version.

(9) “Film” means a moving image.

(10) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(11) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(12) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding 6 months or a fine (or both).”

This new clause creates the offence of sharing an intimate image without consent, providing the necessary exclusions such as for children’s medical care or images taken in public places, and establishing the penalty as triable by magistrates only with maximum imprisonment of 6 months.

New clause 46—Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents; and

(c) A intends that the subject of the photograph or film will be caused alarm, distress or humiliation by the sharing of the photograph or film.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(4) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(5) “Photograph” includes the negative as well as the positive version.

(6) “Film” means a moving image.

(7) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(9) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates a more serious offence where there is the intent to cause alarm etc. by sharing an image, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 47—Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents; and

(c) A shared the photograph or film for the purpose of obtaining sexual gratification (whether for the sender or recipient).

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(4) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(5) “Photograph” includes the negative as well as the positive version.

(6) “Film” means a moving image.

(7) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(9) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates a more serious offence where an intimate image is shared for the purpose of obtaining sexual gratification, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 48—Threatening to share etc intimate photographs or film

“(1) A person (A) commits an offence if—

(a) A threatens to share an intimate photograph or film of another person (B) with B or a third person (C); and

(i) A intends B to fear that the threat will be carried out; or A is reckless as to whether B will fear that the threat will be carried out.

(2) “Threatening to share” should be read to include threatening to share an intimate photograph or film that does not exist and other circumstances where it is impossible for A to carry out the threat.

(3) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(4) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(5) References to sharing, or threatening to share, such a photograph or film with another person include—

(a) sending, or threatening to send, it to another person by any means, electronically or otherwise;

(b) showing, or threatening to show, it to another person;

(c) placing, or threatening to place, it for another person to find; or

(d) sharing, or threatening to share, it on or uploading it to a user-to-user service, including websites or online public forums.

(6) “Photograph” includes the negative as well as the positive version.

(7) “Film” means a moving image.

(8) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(9) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(10) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates another more serious offence of threatening to share an intimate image, regardless of whether such an image actually exists, and where the sender intends to cause fear, or is reckless to whether they would cause fear, punishable by 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 49—Special measures in criminal proceedings for offences involving the sharing of intimate images

“(1) Chapter 1 of Part 2 of the Youth Justice and Criminal Evidence Act 1999 (giving of evidence or information for purposes of criminal proceedings: special measures directions in case of vulnerable and intimidated witnesses) is amended as follows.

(2) In section 17 (witnesses eligible for assistance on grounds of fear or distress about testifying), in subsection (4A) after paragraph (b) insert “(c) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”

This new clause inserts intimate image abuse into legislation that qualifies victims for special measures when testifying in court (such as partitions to hide them from view, video testifying etc.) which is already prescribed by law.

New clause 50—Anonymity for victims of offences involving the sharing of intimate images

“(1) Section 2 of the Sexual Offences (Amendment) Act 1992 (Offences to which this Act applies) is amended as follows.

(2) In subsection (1) after paragraph (db) insert—

(dc) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”

Similar to NC49, this new clause gives victims of intimate image abuse the same access to anonymity as victims of other sexual offences, protecting their identities and giving them the confidence to testify against their abuser without fear of repercussions.

New clause 54—Report on the effect of Virtual Private Networks on OFCOM’s ability to enforce requirements

“(1) The Secretary of State must publish a report on the effect of the use of Virtual Private Networks on OFCOM’s ability to enforce requirements under section 112.

(2) The report must be laid before Parliament within six months of the passing of this Act.”

New clause 55—Offence of sending communication facilitating modern slavery and illegal immigration

‘(1) A person (A) commits an offence if—

(a) (A) intentionally shares with a person (B) or with a third person (C) a photograph or film which is reasonably considered to be, or to be intended to be, facilitating or promoting any activities which do, or could reasonably be expected to, give rise to an offence under—

(i) sections 1 (Slavery, servitude and forced labour), 2 (Human trafficking) or 4 (Committing offence with intent to commit an offence under section 2) of the Modern Slavery Act 2015; or

(ii) sections 24 (Illegal Entry and Similar Offences) or 25 (Assisting unlawful immigration etc) of the Immigration Act 1971; and

(a) (A) does so knowing, or when they reasonably ought to have known, that the activities being depicted are unlawful.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) A person (A) does not commit an offence under this section if—

(a) the sharing is undertaken by or on behalf of a journalist or for journalistic purposes;

(b) the sharing is by a refugee organisation registered in the UK and which falls within the scope of sub-section (3) or section 25A of the Immigration Act 1971;

(c) the sharing is by or on behalf of a duly elected Member of Parliament or other elected representative in the UK.

(4) It is a defence for a person charged under this section to prove that they—

(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime; and

(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings.

(5) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both).”

This new clause would create a new criminal offence of intentionally sharing a photograph or film that facilitates or promotes modern slavery or illegal immigration.

Government amendments 234 and 102 to 117.

Amendment 195, in clause 104, page 87, line 10, leave out subsection 1 and insert—

“(1) If OFCOM consider that it is necessary and proportionate to do so, they may—

(a) give a notice described in subsection (2), (3) or (4) relating to a regulated user to user service or a regulated search service to the provider of the service;

(b) give a notice described in subsection (2), (3) or (4) to a provider or providers of Part 3 services taking into account risk profiles produced by OFCOM under section 84.”

Amendment 152, page 87, line 18, leave out ‘whether’.

This amendment is consequential on Amendment 153.

Amendment 153, page 87, line 19, leave out ‘or privately’.

This amendment removes the ability to monitor encrypted communications.

Government amendment 118.

Amendment 204, in clause 105, page 89, line 17, at end insert—

“(ia) the level of risk of the use of the specified technology accessing, retaining or disclosing the identity or provenance of any confidential journalistic source or confidential journalistic material.”

This amendment would require Ofcom to consider the risk of the use of accredited technology by a Part 3 service accessing, retaining or disclosing the identity or provenance of journalistic sources or confidential journalistic material, when deciding whether to give a notice under Clause 104(1) of the Bill.

Government amendments 119 to 130, 132 to 134, 212, 213, 135 and 214.

Amendment 23, in clause 130, page 114, line 3, leave out paragraph (a).

Government amendment 175.

Amendment 160, in clause 141, page 121, line 9, leave out subsection (2).

This amendment removes the bar of conditionality that must be met for super complaints that relate to a single regulated service.

Amendment 24, page 121, line 16, leave out “The Secretary of State” and insert “OFCOM”.

Amendment 25, page 121, line 21, leave out from “(3),” to end of line 24 and insert “OFCOM must consult—

“(a) The Secretary of State, and

“(b) such other persons as OFCOM considers appropriate.”

This amendment would provide that regulations under clause 141 are to be made by OFCOM rather than by the Secretary of State.

Amendment 189, in clause 142, page 121, line 45, leave out from “including” to end of line 46 and insert

“90 day maximum time limits in relation to the determination and notification to the complainant of—”.

This requires the Secretary of State’s guidance to require Ofcom to determine whether a complaint is eligible for the super-complaints procedure within 90 days.

Amendment 26, in clause 146, page 123, line 33, leave out

“give OFCOM a direction requiring”

and insert “may make representations to”.

Amendment 27, page 123, line 36, leave out subsection (2) and insert—

“(2) OFCOM must have due regard to any representations made by the Secretary of State under subsection (1).”

Amendment 28, page 123, line 38, leave out from “committee” to end of line 39 and insert

“established under this section is to consist of the following members—”.

Amendment 29, page 124, line 1, leave out from “committee” to “publish” in line 2 and insert

“established under this section must”.

Amendment 30, page 124, line 4, leave out subsection (5).

Amendment 32, page 124, line 4, leave out clause 148.

Government amendments 176, 239, 138, 240, 215, 241, 242, 217, 218, 243, 219, 244, 245, 220, 221, 140, 246, 222 to 224, 247, 225, 248, 226 and 227.

Amendment 194, in clause 157, page 131, line 16, leave out from beginning to end of line 17 and insert—

“(a) B has not consented for A to send or give the photograph or film to B, and”.

Government amendments 249 to 252, 228, 229 and 235 to 237.

Government new schedule 2—Amendments of Part 4B of the Communications Act.

Government new schedule 3—Video-sharing platform services: transitional provision etc.

Government amendment 238.

Amendment 35, schedule 11, page 198, line 5, leave out “The Secretary of State” and insert “OFCOM”.

This amendment would give the power to make regulations under Schedule 11 to OFCOM.

Amendment 2, page 198, line 9, leave out “functionalities” and insert “characteristics”.

Amendment 1, page 198, line 9, at end insert—

“(1A) In this schedule, “characteristics” of a service include its functionalities, user base, business model, governance and other systems and processes.”

Amendment 159, page 198, line 9, at end insert—

“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”

This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.

Amendment 36, page 198, line 10, leave out “The Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 37, page 198, line 16, leave out “The Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 3, page 198, line 2, leave out “functionalities” and insert “characteristics”.

Amendment 9, page 198, line 28, leave out “and” and insert “or”.

Amendment 4, page 198, line 29, leave out “functionality” and insert “characteristic”.

Amendment 38, page 198, line 32, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 5, page 198, line 34, leave out “functionalities” and insert “characteristics”.

Amendment 39, page 198, line 37, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 40, page 198, line 41, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 6, page 198, line 4, leave out “functionalities” and insert “characteristics”.

Amendment 7, page 199, line 11, leave out “functionalities” and insert “characteristics”.

Amendment 8, page 199, line 28, leave out “functionalities” and insert “characteristics”.

Amendment 41, page 199, line 3, leave out subparagraphs (5) to (11).

This amendment is consequential on Amendment 35.

Government amendments 230, 253 to 261 and 233.

Paul Scully

I was about to speak to the programme motion, Mr Speaker, but you have outlined exactly what I was going to say, so thank you for that—I am glad to get the process right.

I am delighted to bring the Online Safety Bill back to the House for the continuation of Report stage. I start by expressing my gratitude to colleagues across the House for their contributions to the Bill through pre-legislative scrutiny and before the summer recess, and for their engagement with me since I took office as the Minister for Tech and the Digital Economy.

The concept at the heart of this legislation is simple: tech companies, like those in every other sector, must take responsibility for the consequences of their business decisions. As they continue to offer users the latest innovations, they must consider the safety of their users as well as profit. They must treat their users fairly and ensure that the internet remains a place for free expression and robust debate. As Members will be aware, the majority of the Bill was discussed on Report before the summer recess. Our focus today is on the provisions that relate to the regulator’s power and the criminal law reforms. I will take this opportunity also to briefly set out the further changes that the Government recently committed to making later in the Bill’s passage.

Let me take the Government amendments in turn. The Government’s top priority for this legislation has always been the protection of children. We recognise that the particularly abhorrent and pernicious nature of online child sexual exploitation and abuse—CSEA—demands the most robust response possible. Throughout the passage of the Bill, we have heard evidence of the appalling harm that CSEA causes. Repeatedly, we heard calls for strong incentives for companies to do everything they can to innovate and make safety technologies their priority, to ensure that there is no place for offenders to hide online. The Bill already includes a specific power to tackle CSEA, which allows Ofcom, subject to safeguards, to require tech companies to use accredited technology to identify and remove illegal CSEA content in public and private communications. However, we have seen in recent years how the online world has evolved to allow offenders to reach their victims and one another in new ways.

Priti Patel (Witham) (Con)

I am listening to my hon. Friend with great interest on this aspect of child sexual abuse and exploitation, which is a heinous crime. Will he go on to speak about how the Ofcom role will interact with law enforcement, in particular the National Crime Agency, when dealing with these awful crimes?

Paul Scully

It is important that we tackle this in a number of ways. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and I spoke earlier, and I will come to some of what he will outline. It is important that Ofcom recognises the technologies that are available and—with the Children’s Commissioner as one of the statutory consultees—liaises with the social media platforms, and the agencies, to ensure that there are codes of practice that work, and that we get this absolutely right. It is about enforcing the terms and conditions of the companies and being able to produce the evidence and track the exchanges, as I will outline later, for the agency to use for enforcement.

With the rapid developments in technology, on occasions there will be no existing accredited technology available that will satisfactorily mitigate the risks. Similarly, tech companies might be able to better design solutions that integrate more easily with their services than those that are already accredited. The new regulatory framework must incentivise tech companies to ensure that their safety measures keep pace with the evolving threat, and that they design their services to be safe from the outset. It is for these reasons that the Government have tabled the amendments that we are discussing.

New clauses 11 and 12 establish options for Ofcom when deploying its powers under notices to deal with terrorism content and CSEA content. These notices will empower Ofcom to require companies to use accredited technology to identify and remove illegal terrorism and CSEA content or to prevent users from encountering that content or, crucially, to use their best endeavours to develop or to source technology to tackle CSEA. That strikes the right balance of supporting the adoption of new technology, while ensuring that it does not come at the expense of children’s physical safety.

Rehman Chishti (Gillingham and Rainham) (Con)

Terrorism is often linked to non-violent extremism, which feeds into violent extremism and terrorism. How does the Bill define extremism? Previous Governments failed to define it, although it is often linked to terrorism.

Paul Scully

This Bill links with other legislation, and obviously the agencies. We do not seek to redefine extremism where those definitions already exist. As we expand on the changes that we are making, we will first ensure that anything that is already illegal goes off the table. Anything that is against the terms and conditions of those platforms that are hosting that content must not be seen. I will come to the safety net and user protection later.

Charlotte Nichols (Warrington North) (Lab)

Since Elon Musk’s takeover of Twitter, hate speech has ballooned on the platform and the number of staff members at Twitter identifying images of child sexual abuse and exploitation has halved. How can the Minister be sure that the social media companies are able to mark their own homework in the way that he suggests?

Paul Scully

Because if those companies do not, they will get a fine of up to £18 million or 10% of their global turnover, whichever is higher. As we are finding with Twitter, there is also a commercial impetus, because advertisers are fleeing that platform as they see the uncertainty being caused by those changes. A lot of things are moving here to ensure that safety is paramount; it is not just for the Government to act in this area. All we are doing is making sure that those companies enforce their own terms and conditions.

Priti Patel

This point is important: we are speaking about terrorism and counter-terrorism and the state’s role in preventing terrorist activity. For clarity, will the Minister update the House later on the work that takes place between his Department and the platforms and, importantly, between the Home Office and the security services. In particular, some specialist work takes place with the Global Internet Forum to Counter Terrorism, which looks at online terrorist and extremist content. That work can ensure that crimes are prevented and that the right kinds of interventions take place.

Paul Scully

My right hon. Friend talks with experience from her time at the Home Office. She is absolutely right that the Bill sets a framework to adhere to the terms and conditions of the platforms. It also sets out the ability for the services to look at things such as terrorism and CSEA, which I have been talking about—for example, through the evidence of photos being exchanged. The Bill is not re-examining and re-prosecuting the interaction between all the agencies, however, because that is apparent for all to see.

New clauses 11 and 12 bring those powers in line with the wider safety duties by making it clear that the tools may seek to proactively prevent CSEA content from appearing on a service, rather than focusing only on identification and removal after the fact. That will ensure the best possible protection for children, including on services that offer livestreaming.

The safeguards around those powers remain as strong as before to protect user privacy. Any tools that are developed will be accredited using a rigorous assessment process to ensure that they are highly accurate before the company is asked to use them. That will avoid any unnecessary intrusions into user privacy by minimising the risk that the tools identify false positives.

Crucially, the powers do not represent a ban on or seek to undermine any specific type of technology or design, such as end-to-end encryption. They align with the UK Government’s view that online privacy and cyber-security must be protected, but that technological changes should not be implemented in a way that diminishes public safety.

Kit Malthouse (North West Hampshire) (Con)

Can the Minister expand on the notion of “accredited technology”? The definition in the Bill is pretty scant as to where it will emerge from. Is he essentially saying that he is relying on the same industry that has thus far presided over the problem to produce the technology that will police it for us? Within that equation, which seems a little self-defeating, is it the case that if the technology does not emerge for one reason or another—commercial or otherwise—the Government will step in and devise, fund or otherwise create the technology required to be implemented?

Paul Scully

I thank my right hon. Friend. It is the technology sector that develops technology—it is a simple, circular definition—not the Government. We are looking to make sure that it has that technology in place, but if we prescribed it in the Bill, it would undoubtedly be out of date within months, never mind years. That is why it is better for us to have a rounded approach, working with the technology sector, to ensure that it is robust enough.

Kit Malthouse

I may not have been clear in my original intervention: my concern is that the legislation relies on the same sector that has thus far failed to regulate itself and failed to invent the technology that is required, even though it is probably perfectly capable of doing so, to produce the technology that we will then accredit to be used. My worry is that the sector, for one reason or another—the same reason that it has not moved with alacrity already to deal with these problems in the 15 years or so that it has existed—may not move at the speed that the Minister or the rest of us require to produce the technology for accreditation. What happens if it does not?

Paul Scully

Clearly, the Government can choose to step in. We are setting up a framework to ensure that we get the right balance and are not being prescriptive. I take issue with the idea that a lot of this stuff has not been invented, because there is some pretty robust work on age assurance and verification, and other measures to identify harmful and illegal material, although my right hon. Friend is right that it is not being used as robustly as it could be. That is exactly what we are addressing in the Bill.

15:44
Mr David Davis (Haltemprice and Howden) (Con)

My intervention is on the same point as that raised by my right hon. Friend the Member for North West Hampshire (Kit Malthouse), but from the opposite direction, in effect. What if it turns out that, as many security specialists and British leaders in security believe—not just the companies, but professors of security at Cambridge and that sort of thing—it is not possible to implement such measures without weakening encryption? What will the Minister’s Bill do then?

Paul Scully

The Bill is very specific with regard to encryption; this provision will cover solely CSEA and terrorism. It is important that we do not encroach on privacy.

Damian Collins (Folkestone and Hythe) (Con)

I welcome my hon. Friend to his position. Under the Bill, is it not the case that if a company refuses to use existing technologies, that will be a failure of the regulatory duties placed on that company? Companies will be required to demonstrate which technology they will use and will have to use one that is available. On encrypted messaging, is it not the case that companies already gather large amounts of information about websites that people visit before and after they send a message that could be hugely valuable to law enforcement?

Paul Scully

My hon. Friend is absolutely right. Not only is it incumbent on companies to use that technology should it exist; if they hamper Ofcom’s inquiries by not sharing information about what they are doing, what they find and which technologies they are not using, that will be a criminal liability under the Bill.

Dr Luke Evans (Bosworth) (Con)

To take that one step further, is it correct that Ofcom would set minimum standards for operators? For example, the Content Authenticity Initiative does not need primary legislation, but is an industry open-standard, open-source format. That is an example of modern technology that all companies could sign up to use, and Ofcom would therefore determine what needs to be done in primary legislation.

Mr Speaker (Lindsay Hoyle)

Can I be helpful? We did say that our discussions should be within scope, but the Minister is tempting everybody to intervene out of scope. From his own point of view, I would have thought that it would be easier to keep within scope.

Paul Scully

Thank you, Mr Speaker; I will just respond to my hon. Friend the Member for Bosworth (Dr Evans). There is a minimum standard in so far as the operators have to adhere to the terms of the Bill. Our aim is to exclude illegal content and ensure that children are as safe as possible within the remit of the Bill.

The changes will ensure a flexible approach so that companies can use their expertise to develop or source the most effective solution for their service, rather than us being prescriptive. That, in turn, supports the continued growth of our digital economy while keeping our citizens safe online.

Sajid Javid (Bromsgrove) (Con)

My hon. Friend may know that there are third-party technology companies—developers of this accredited technology, as he calls it—that do not have access to all the data that might be necessary to develop technology to block the kind of content we are discussing. They need to be given the right to access that data from the larger platforms. Will Ofcom be able to instruct large platforms that have users’ data to make it available to third-party developers of technology that can help to block such content?

Paul Scully

Ofcom will be working with the platforms over the next few months—in the lead-up to the commencement of the Bill and afterwards—to ensure that the provisions are operational, so that we get them up and running as soon as practicably possible. My right hon. Friend is right to raise the point.

Jim Shannon (Strangford) (DUP)

In Northern Ireland we face the specific issue of the glorification of terrorism. Glorifying terrorism encourages terrorism. Is it possible that the Bill will stop that type of glorification, and therefore stop the terrorism that comes off the back of it?

Paul Scully

I will try to cover the hon. Member’s comments a little bit later, if I may, when I talk about some of the changes coming up later in the process.

Moving away from CSEA, I am pleased to say that new clause 53 fulfils a commitment given by my predecessor in Committee to bring forward reforms to address epilepsy trolling. It creates the two specific offences of sending and showing flashing images to an individual with epilepsy with the intention of causing them harm. Those offences will apply in England, Wales and Northern Ireland, providing people with epilepsy with specific protection from this appalling abuse. I would like to place on record our thanks to the Epilepsy Society for working with the Ministry of Justice to develop the new clause.

The offence of sending flashing images captures situations in which an individual sends a communication in a scatter-gun manner—for example, by sharing a flashing image on social media—and the more targeted sending of flashing images to a person who the sender knows or suspects is a person with epilepsy. It can be committed by a person who forwards or shares such an electronic communication as well as by the person sending it. The separate offence of showing flashing images will apply if a person shows flashing images to someone they know or suspect to have epilepsy by means of an electronic communications device—for example, on a mobile phone or a TV screen.

The Government have listened to parliamentarians and stakeholders about the impact and consequences of this reprehensible behaviour, and my thanks go to my hon. Friends the Members for Watford (Dean Russell), for Stourbridge (Suzanne Webb), for Blackpool North and Cleveleys (Paul Maynard) and for Ipswich (Tom Hunt) for their work and campaigning. [Interruption.] Indeed, and the hon. Member for Batley and Spen (Kim Leadbeater), who I am sure will be speaking on this later.

New clause 53 creates offences that are legally robust and enforceable so that those seeking to cause harm to people with epilepsy will face appropriate criminal sanctions. I hope that will reassure the House that the deeply pernicious activity of epilepsy trolling will be punishable by law.

Suzanne Webb (Stourbridge) (Con)

The Minister is thanking lots of hon. Members, but should not the biggest thanks go, first, to the Government for the inclusion of this amendment; and secondly, to Zach Eagling, the inspirational now 11-year-old who was the victim of a series of trolling incidents when flashing images were pushed his way after a charity walk? We have a huge amount to thank Zach Eagling for, and of course the amazing Epilepsy Society too.

Paul Scully

A number of Members across the House have been pushing for Zach’s law, and I am really delighted that Zach’s family can see in Hansard that that campaigning has really made a direct change to the law.

Dean Russell (Watford) (Con)

I just want to echo the previous points. This has been a hard-fought decision, and I am so proud that the Government have done this, but may I echo the thanks to Zach for being a true hero? We talk about David and Goliath, the giant—the beast—who was taken down, but Zach has beaten the tech giants, and I think this is an incredible success.

Paul Scully

I absolutely echo my hon. Friend’s remarks, and I again thank him for his work.

We are also taking steps to strengthen Ofcom’s enforcement powers, which is why we are giving Ofcom a discretionary power to require non-compliant services to publish or notify their users of enforcement action that it has taken against the service. Ofcom will be able to use this power to direct a service to publish details or notify its UK users about enforcement notices it receives from Ofcom. I thank the Antisemitism Policy Trust for bringing this proposal to our attention and for its helpful engagement on the issue. This new power will promote transparency by increasing awareness among users about breaches of the duty in the Bill. It will help users make much more informed decisions about the services they use, and act as an additional deterrent factor for service providers.

Dr Luke Evans

It is fantastic to have the data released. Does the Minister have any idea how many of these notifications are likely to be put out there when the Bill comes in? Has any work been done on that? Clearly, having thousands of these come out would be very difficult for the public to understand, but half a dozen over a year might be very useful to understand which companies are struggling.

Paul Scully

I think this is why Ofcom has discretion, so that it can determine that. The most egregious examples are the ones people can learn from, and it is about doing this in proportion. My hon. Friend is absolutely right that if we are swamped with small notifications, this will be hidden in plain sight. That would not be useful, particularly for parents, to best understand what is going on. It is all about making more informed decisions.

The House will be aware that we recently announced our intention to make a number of other changes to the Bill. We are making those changes because we believe it is vital that people can continue to express themselves freely and engage in pluralistic debate online. That is why the Bill will be amended to strengthen its provisions relating to children and to ensure that the Bill’s protections for adults strike the right balance with its protections for free speech.

Dame Margaret Hodge (Barking) (Lab)

The Minister is alluding, I assume, to the legal but harmful provision, but what does he think about this as an example? People are clever; they do not use illegal language. They will not say, “I want to kill all Jews”, but they may well—and do—say, “I want to harm all globalists.” What is the Minister’s view of that?

Paul Scully

The right hon. Lady and I have had a detailed chat about some of the abuse that she and many others have been suffering, and there were some particularly egregious examples. This Bill is not, and never will be, a silver bullet. This has to be worked through, with the Government acting with media platforms and social media platforms, and parents also have a role. This will evolve, but we first need to get back to the fundamental point that social media platforms are not geared up to enforce their own terms and conditions. That is ridiculous, a quarter of a century after the world wide web kicked in, and when social media platforms have been around for the best part of 20 years. We are shutting the stable door afterwards, and trying to come up with legislation two decades later.

Mr Speaker

Order. I am really bothered. I am trying to help the Minister, because although broadening discussion of the Bill is helpful, it is also allowing Members to come in with remarks that are out of scope. If we are going to go out of scope, we could be here a long time. I am trying to support the Minister by keeping him in scope.

Paul Scully

Thank you, Mr Speaker; I will try to keep my remarks very much in scope.

The harmful communications offence in clause 151 was a reform to communication offences proposed in the Bill. Since the Bill has been made public, parliamentarians and stakeholders have expressed concern that the threshold that would trigger prosecution for the offence of causing serious distress could bring robust but legitimate conversation into the illegal space. In the light of that concern, we have decided not to take forward the harmful communications offence for now. That will give the Government an opportunity to consider further how the criminal law can best protect individuals from harmful communications, and ensure that protections for free speech are robust.

Jim Shannon

This is about the protection of young people, and we are all here for the same reason, including the Minister. We welcome the changes that he is putting forward, but the Royal College of Psychiatrists has expressed a real concern about the mental health of children, and particularly about how screen time affects them. NHS Digital has referred to one in eight 11 to 16-year-olds being bullied. I am not sure whether we see in the Bill an opportunity to protect them, so perhaps the Minister can tell me the right way to do that.

Paul Scully

The hon. Gentleman talks about the wider use of screens and screen time, and that is why Ofcom’s media literacy programme, and DCMS’s media literacy strategy—

Paul Scully

That is because we have a detailed strategy that tackles many of these issues. Again, none of this is perfect, and as I have said, the Government are working in tandem with the platforms, and with parents and education bodies, to make sure we get that bit right. The hon. Gentleman is right to highlight that as a big issue.

I talked about harmful communications, recognising that we could leave a potential gap in the criminal law. The Government have also decided not to repeal the existing communications offences in the Malicious Communications Act 1988, or those under section 127(1) of the Communications Act 2003. That will ensure that victims of domestic abuse or other extremely harmful communications will still be robustly protected by the criminal law. Along with the planned changes to the harmful communications offence, we are making a number of additional changes to the Bill. That will come later, Mr Speaker, and I will not tread too much into it now, but it includes the removal of the adult safety duties, often referred to as the legal but harmful provision. The amended Bill offers adults a triple shield of protection that requires platforms to remove illegal content and material that violates their terms and conditions, and gives adults user controls to help them avoid seeing certain types of content.

The Bill’s key objective, above everything else, is the safety of children online, and we will be making a number of changes to strengthen the Bill’s existing protections for children. We will make clear that we expect platforms to use age assurance technology to identify the age of their users, and we will also require platforms with minimum age restrictions to explain in their terms of service what measures they have in place to prevent access by those below their minimum age, and to enforce those measures consistently. We are planning to name the Children’s Commissioner as a statutory consultee for Ofcom in its development of the codes of practice, ensuring that children’s views and needs are represented.

Paul Scully

That is the Children’s Commissioner for England, specifically because they have particular reserved duties for the whole of the UK. None the less, Ofcom must also have regard to a wider range of voices, which can easily include the other Children’s Commissioners.

16:00
Mike Amesbury (Weaver Vale) (Lab)

On age assurance, does the Minister not see a weakness? Lots of children and young people are far more sophisticated than many of us in the Chamber and will easily find a workaround, as they do now. The onus is being put on the children, so the Bill is not increasing regulation or the safety of those children.

Paul Scully

As I said, the social media platforms will have to put in place robust age assurance and age verification for material, in an accredited form that is acceptable to Ofcom, which will scrutinise those measures.

Tackling violence against women and girls is a key priority for the Government. It is unacceptable that women and girls suffer disproportionately from abuse online, and it is right that we go further to address that through the Bill. That is why we will name the commissioner for victims and witnesses and the Domestic Abuse Commissioner as statutory consultees for the code of practice and list “coercive or controlling behaviour” as a priority offence. That offence disproportionately affects women and girls, and that measure will mean that companies will have to take proactive measures to tackle such content.

Finally, we are making a number of criminal law reforms, and I thank the Law Commission for the great deal of important work that it has done to assess the law in these areas.

Ruth Edwards (Rushcliffe) (Con)

I strongly welcome some of the ways in which the Bill has been strengthened to protect women and girls, particularly by criminalising cyber-flashing, for example. Does the Minister agree that it is vital that our laws keep pace with the changes in how technology is being used? Will he therefore assure me that the Government will look to introduce measures along the lines set out in new clauses 45 to 50, standing in the name of my right hon. Friend the Member for Basingstoke (Dame Maria Miller), who is leading fantastic work in this area, so that we can build on the Government’s record in outlawing revenge porn and threats to share it?

Paul Scully

I thank my hon. Friend, and indeed I thank my right hon. Friend the Member for Basingstoke (Dame Maria Miller) for the amazing work that she has done in this area. We will table an amendment to the Bill to criminalise more behaviour relating to intimate image abuse, so more perpetrators will face prosecution and potentially time in jail. My hon. Friend has worked tirelessly in this area, and we have had a number of conversations. I thank her for that. I look forward to more conversations to ensure that we get the amendment absolutely right and that it does exactly what we all want.

The changes we are making will include criminalising the non-consensual sharing of manufactured intimate images, which, as we have heard, are more commonly known as deepfakes. In the longer term, the Government will also take forward several of the Law Commission’s recommendations to ensure that the legislation is coherent and takes account of advancements in technology.

We will also use the Bill to bring forward a further communication offence to make the encouragement of self-harm illegal. We have listened to parliamentarians and stakeholders concerned about such behaviour and will use the Bill to criminalise that activity, providing users with protections from that harmful content. I commend my right hon. Friend the Member for Haltemprice and Howden on his work in this area and his advocacy for such a change.

Charlotte Nichols

Intimate image abuse has been raised with me a number of times by younger constituents, who are particularly vulnerable to such abuse. Within the scope of what we are discussing, I am concerned that we have seen only one successful conviction for revenge porn, so if the Government base their intimate image work on the existing legislative framework for revenge porn, it will do nothing and protect no one, and will instead be a waste of everyone’s time and further let down victims who are already let down by the system.

Paul Scully

We will actually base that work on the independent Law Commission’s recommendations, and have been working with it on that basis.

Vicky Ford (Chelmsford) (Con)

On images that promote self-harm, does the Minister agree that images that promote or glamourise eating disorders should be treated just as seriously as any other content promoting self-harm?

Paul Scully

I thank my right hon. Friend, who spoke incredibly powerfully at Digital, Culture, Media and Sport questions, and on a number of other occasions, about her particular experience. That is always incredibly difficult. That area will absolutely be tackled, especially for children, but it is really important—as we will see from further changes in the Bill—that, with the removal of the legal but harmful protections, there are other protections for adults.

Sajid Javid

I think last year over 6,000 people died from suicide in the UK. Much of that, sadly, was encouraged by online content, as we saw from the recent coroner’s report into the tragic death of Molly Russell. On new clause 16, tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), will the Minister confirm that the Government agree with the objectives of new clause 16 and will table an amendment to this Bill—to no other parliamentary vehicle, but specifically to this Bill—to introduce such a criminal offence? Will the Government amendment he referred to be published before year end?

Paul Scully

On self-harm, I do not think there is any doubt that we are absolutely aligned. On suicide, I have some concerns about how new clause 16 is drafted—it amends the Suicide Act 1961, which is not the right place to introduce measures on self-harm—but I will work to ensure we get this measure absolutely right as the Bill goes through the other place.

Dame Caroline Dinenage (Gosport) (Con)

Will my hon. Friend give way?

Priti Patel

Will my hon. Friend give way?

Paul Scully

I will give way first to one of my predecessors.

Dame Caroline Dinenage

I thank my hon. Friend for giving way. He is almost being given stereo questions from across the House, but I think they might be slightly different. I am very grateful to him for setting out his commitment to tackling suicide and self-harm content, and for his commitment to my right hon. Friend the Member for Chelmsford (Vicky Ford) on eating disorder content. My concern is that there is a really opaque place in the online world between what is legal and illegal, which potentially could have been tackled by the legal but harmful restrictions. Can he set out a little more clearly—not necessarily now, but as we move forward—how we really are going to begin to tackle the opaque world between legal and illegal content?

Paul Scully

If my hon. Friend will bear with me—I need to make some progress—I think that will be teased out today and in Committee, should the Bill be recommitted, as we amend the clauses relating directly to what she is talking about, and then as the Bill goes through the other place.

Priti Patel

Will the Minister give way?

Paul Scully

I will give way a final time before I finish.

Priti Patel

I am grateful to the Minister, who has taken a number of interventions. I fully agree with my hon. Friend the Member for Gosport (Dame Caroline Dinenage). This is a grey area and has consistently been so—many Members have given their views on that in previous stages of the Bill. Will the Minister come back in the later stages on tackling violence against women and girls, and show how the Bill will incorporate key aspects of the Domestic Abuse Act 2021, and tie up with the criminal justice system and the work of the forthcoming victims Bill? We cannot look at these issues in isolation—I see that the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar) is also on the Front Bench. Rather, they all have to be put together in a golden thread of protecting victims, making sure that people do not become victims, and ensuring that we go after the perpetrators—we must not forget that at all. The Minister will not be able to answer that now, but I would ask him to please do so in the latter stages.

Paul Scully

I talked about the fact that the Commissioner for Victims and Witnesses and the Domestic Abuse Commissioner will be statutory consultees, because it is really important that their voice is heard in the implementation of the Bill. We are also bringing in coercive control as one of the areas. That is so important when it comes to domestic abuse. Domestic abuse does not start with a slap, a hit, a punch; it starts with emotional abuse—manipulation, coercion and so on. That is why coercive control is an important point not just for domestic abuse, but for bullying, harassment and the wider concerns that the Bill seeks to tackle.

Paul Scully

I will give way and then finish up.

Jamie Stone

I am one of three Scottish Members present, and the Scottish context concerns me. If time permits me in my contribution later, I will touch on a particularly harrowing case. The school involved has been approached but has done nothing. Education is devolved, so the Minister may want to think about that. It would be too bad if the Bill failed in its good intentions because of a lack of communication in relation to a function delivered by the Scottish Government. Can I take it that there will be the closest possible co-operation with the Scottish Government because of their educational responsibilities?

Paul Scully

There simply has to be. These are global companies and we want to make the Bill work for the whole of the UK. This is not an England-only Bill, so the changes must happen for every user, whether they are in Scotland, Northern Ireland, Wales or England.

Paul Scully

I will make a bit of progress, because I am testing Mr Speaker’s patience.

We are making a number of technical amendments to ensure that the new communications offences are targeted and effective. New clause 52 seeks to narrow the exemptions for broadcast and wireless telegraphy licence holders and providers of on-demand programme services, so that the licence holder is exempt only to the extent that communication is within the course of a licensed activity. A separate group of technical amendments ensure that the definition of sending false and threatening communications will capture all circumstances—that is far wider than we have at the moment.

We propose a number of consequential amendments to relevant existing legislation to ensure that new offences operate consistently with the existing criminal law. We are also making a number of wider technical changes to strengthen the enforcement provisions and ensure consistency with other regulatory frameworks. New clause 42 ensures that Ofcom has the power to issue an enforcement notice to a former service provider, guarding against service providers simply shutting down their business and reappearing in a slightly different guise to avoid regulatory sanction. A package of Government amendments will set out how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework.

Finally, new clause 40 will enable the CMA to share information with Ofcom for the purpose of facilitating Ofcom’s online safety functions. That will help to ensure effective co-operation between Ofcom and the CMA.

Dame Maria Miller (Basingstoke) (Con)

I thank my hon. Friend for giving way. In the past 40 minutes or so, he has demonstrated the complexity of the changes that are being proposed for the Bill, and he has done a very good job in setting that out. However, will he join me and many other right hon. and hon. Members who feel strongly that a Standing Committee should look at the Bill’s implementation, because of the complexities that he has so clearly demonstrated? I know that is a matter for the House rather than our consideration of the Bill, but I hope that other right hon. and hon. Members will join me in looking for ways to put that right. We need to be able to scrutinise the measures on an ongoing basis.

Paul Scully

Indeed, there will be, and are, review points in the Bill. I have no doubt that my right hon. Friend will raise that on other occasions as well.

I want to ensure that there is plenty of time for Members to debate the Bill at this important stage, and I have spoken for long enough. I appreciate the constructive and collaborative approach that colleagues have taken throughout the Bill’s passage.

Paul Scully

I will give way a final time.

Debbie Abrahams

I am grateful to the Minister. Does he support Baroness Kidron’s amendment asking for swift, humane access to data where there is a suspicion that online information may have contributed to a child’s suicide? That has not happened in previous instances; does he support that important amendment?

Paul Scully

I am glad that I gave way so that the hon. Lady could raise that point. Baroness Kidron and her organisation have raised that issue with me directly, and they have gathered media support. We will look at that as the Bill goes through this place and the Lords, because we need to see what the powers are at the moment and why they are not working.

Now is the time to take this legislation forward to ensure that it can deliver the safe and transparent online environment that children and adults so clearly deserve.

Mr Speaker

I call the shadow Minister.

Alex Davies-Jones

It is an absolute pleasure to be back in the Chamber to respond on behalf of the Opposition to this incredibly important piece of legislation on its long overdue second day on Report. It certainly has not been an easy ride so far: I am sure that Bill Committee colleagues across the House agree that unpicking and making sense of this unnecessarily complicated Bill has been anything but straightforward.

We should all be incredibly grateful and are all indebted to the many individuals, charities, organisations and families who have worked so hard to bring online safety to the forefront for us all. Today is a particularly important day, as we are joined in the Public Gallery by a number of families who have lost children in connection with online harms. They include Lorin LaFave, Ian Russell, Andy and Judy Thomas, Amanda and Stuart Stephens and Ruth Moss. I sincerely hope that this debate will do justice to their incredible hard work and commitment in the most exceptionally difficult of circumstances.

16:14
We must acknowledge that the situation has been made even harder by the huge changes that we have seen in the Government since the Bill was first introduced. Since its First Reading, it has been the responsibility of three different Ministers and two Secretaries of State. Remarkably, it has seen three Prime Ministers in post, too. We can all agree that legislation that will effectively keep people safe online urgently needs to be on the statute book: that is why Labour has worked hard and will continue to work hard to get the Bill over the line, despite the best efforts of this Government to kick the can down the road.
The Government have made a genuine mess of this important legislation. Before us today are a huge number of new amendments tabled by the Government to their own Bill. We now know that the Government also plan to recommit parts of their own Bill—to send them back into Committee, where the Minister will attempt to make significant changes that are likely to damage even further the Bill’s ability to properly capture online harm.
We need to be moving forwards, not backwards. With that in mind, I am keen to speak to a number of very important new clauses this afternoon. I will first address new clause 17, which was tabled by my right hon. Friend the Member for Barking (Dame Margaret Hodge), who has been an incredibly passionate and vocal champion for internet regulation for many years.
As colleagues will be aware, the new clause will fix the frustrating gaps in Ofcom’s enforcement powers. As the Bill stands, it gives Ofcom the power to fine big tech companies only 10% of their turnover for compliance failures. It does not take a genius to recognise that that can be a drop in the ocean for some of the global multimillionaires and billionaires whose companies are often at the centre of the debate around online harm. That is why the new clause, which will mean individual directors, managers or other officers finally being held responsible for their compliance failures, is so important. When it comes to responsibilities over online safety, it is clear that the Bill needs to go further if the bosses in silicon valley are truly to sit up, take notice and make positive and meaningful changes.
Sir Jeremy Wright (Kenilworth and Southam) (Con)

I am afraid I cannot agree with the hon. Lady that the fines would be a drop in the ocean. These are very substantial amounts of money. In relation to individual director liability, I completely understand where the right hon. Member for Barking (Dame Margaret Hodge) is coming from, and I support a great deal of what she says. However, there are difficulties with the amendment. Does the hon. Member for Pontypridd (Alex Davies-Jones) accept that it would be very odd to end up in a position in which the only individual director liability attached to information offences, meaning that, as long as an individual director was completely honest with Ofcom about their wrongdoing, they would attract no individual liability?

Alex Davies-Jones

It may be a drop in the ocean to the likes of Elon Musk or Mark Zuckerberg—these multibillionaires who are taking over social media and using it as their personal plaything. They are not going to listen to fines; the only way they are going to listen, sit up and take notice is if criminal liability puts their neck on the line and makes them answer for some of the huge failures of which they are aware.

The right hon. and learned Member mentions that he shares the sentiment of the amendment but feels it could be wrong. We have an opportunity here to put things right and put responsibility where it belongs: with the tech companies, the platforms and the managers responsible. In a similar way to what happens in the financial sector or in health and safety regulation, it is vital that people be held responsible for issues on their platforms. We feel that criminal liability will make that happen.

Mr David Davis

May I intervene on a point of fact? The hon. Lady says that fines are a drop in the ocean. The turnover of Google is $69 billion; 10% of that is just shy of $7 billion. That is not a drop in the ocean, even to Elon Musk.

Alex Davies-Jones

We are looking at putting people on the line. It needs to be something that people actually care about. Money does not matter to these people, as we have seen with the likes of Google, Elon Musk and Mark Zuckerberg; what matters to them is actually being held to account. Money may matter to Government Members, but it will be criminal liability that causes people to sit up, listen and take responsibility.

While I am not generally in the habit of predicting the Minister’s response or indeed his motives—although my job would be a hell of a lot easier if I did—I am confident that he will try to peddle the line that it was the Government who introduced director liability for compliance failures in an earlier draft of the Bill. Let me be crystal clear in making this point, because it is important. The Bill, in its current form, makes individuals at the top of companies personally liable only when a platform fails to supply information to Ofcom, which misses the point entirely. Directors must be held personally liable when safety duties are breached. That really is quite simple, and I am confident that it would be effective in tackling harm online much more widely.

We also support new clause 28, which seeks to establish an advocacy body to represent the interests of children online. It is intended to deal with a glaring omission from the Bill, which means that children who experience online sexual abuse will receive fewer statutory user advocacy protections than users of a post office or even passengers on a bus. The Minister must know that that is wrong and, given his Government’s so-called commitment to protecting children, I hope he will carefully consider a new clause which is supported by Members on both sides of the House as well as the brilliant National Society for the Prevention of Cruelty to Children. In rejecting new clause 28, the Government would be denying vulnerable children a strong, authoritative voice to represent them directly, so I am keen to hear the Minister’s justification for doing so, if that is indeed his plan.

Members will have noted the bundle of amendments tabled by my hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) relating to Labour’s concerns about the unnecessary powers to overrule Ofcom that the Bill, as currently drafted, gives the Secretary of State of the day. During Committee evidence sessions, we heard from Will Perrin of the Carnegie UK Trust, who, as Members will know, is an incredibly knowledgeable voice when it comes to internet regulation. He expressed concern about the fact that, in comparison with other regulatory frameworks such as those in place for advertising, the Bill

“goes a little too far in introducing a range of powers for the Secretary of State to interfere with Ofcom’s day-to-day doing of its business.”––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 117.]

Labour shares that concern. Ofcom must be truly independent if it is to be an effective regulator. Surely we have to trust it to undertake logical processes, rooted in evidence, to arrive at decisions once this regime is finally up and running. It is therefore hard to understand how the Government can justify direct interference, and I hope that the Minister will seriously consider amendments 23 to 30, 32, and 35 to 41.

Before I address Labour’s main concerns about the Government’s proposed changes to the Bill, I want to record our support for new clauses 29 and 30, which seek to bring media literacy duties back into the scope of the Bill. As we all know, media literacy is the first line of defence when it comes to protecting ourselves against false information online. Prevention is always better than cure. Whether it is a question of viral conspiracy theories or Russian disinformation, Labour fears that the Government’s approach to internet regulation will create a two-tier internet, leaving some more vulnerable than others.

However, I am sorry to say that the gaps in this Bill do not stop there. I was pleased to see that my hon. Friend the Member for Rotherham (Sarah Champion) had tabled new clause 54, which asks the Government to formally consider the impact that the use of virtual private networks will have on Ofcom’s ability to enforce its powers. This touches on the issue of future-proofing, which Labour has raised repeatedly in debates on the Bill. As we have heard from a number of Members, the tech industry is evolving rapidly, with concepts such as the metaverse changing the way in which we will all interact with the internet in the future. When the Bill was first introduced, TikTok was not even a platform. I hope the Minister can reassure us that the Bill will be flexible enough to deal with those challenges head-on; after all, we have waited far too long.

That brings me to what Labour considers to be an incredible overturn by the Government relating to amendment 239, which seeks to remove the new offence of harmful communications from the Bill entirely. As Members will know, the communications offence was designed by the Law Commission with the intention of introducing a criminal threshold for the most dangerous online harms. Indeed, in Committee it was welcome to hear the then Minister—the present Minister for Crime, Policing and Fire, the right hon. Member for Croydon South (Chris Philp)—being so positive about the Government’s consultation with the commission. In relation to clause 151, which concerns the communications offences, he even said:

“The Law Commission is the expert in this kind of thing…and it is right that, by and large, we follow its expert advice in framing these offences, unless there is a very good reason not to. That is what we have done—we have followed the Law Commission’s advice, as we would be expected to do.” ––[Official Report, Online Safety Public Bill Committee, 21 June 2022; c. 558.]

Less than six months down the line, we are seeing yet another U-turn from this Government, who are doing precisely the opposite of what was promised.

Removing these communications offences from the Bill will have real-life consequences. It will mean that harmful online trends such as hoax bomb threats, abusive social media pile-ons and fake news such as encouraging people to drink bleach to cure covid will be allowed to spread online without any consequence.

Christian Wakeford (Bury South) (Lab)

No Jewish person should have to log online and see Hitler worship, but what we have seen in recent weeks from Kanye West has been nothing short of disgusting, from him saying “I love Hitler” to inciting online pile-ons against Jewish people, and this is magnified by the sheer number of his followers, with Jews actually being attacked on the streets in the US. Does my hon. Friend agree that the Government’s decision to drop the “legal but harmful” measures from the Bill will allow this deeply offensive and troubling behaviour to continue?

Alex Davies-Jones

I thank my hon. Friend for that important and powerful intervention. Let us be clear: everything that Kanye West said online is completely abhorrent and has no place in our society. It is not for any of us to glorify Hitler and his comments or praise him for the work he did; that is absolutely abhorrent and it should never be online. Sadly, however, that is exactly the type of legal but harmful content that will now be allowed to proliferate online because of the Government’s swathes of changes to the Bill, meaning that that would be allowed to be seen by everybody. Kanye West has 30 million followers online. His followers will be able to look at, share, research and glorify that content without any consequence to that content being freely available online.

Dame Margaret Hodge

Further to that point, it is not just that some of the content will be deeply offensive to the Jewish community; it could also harm wider society. Some further examples of postings that would be considered legal but harmful are likening vaccination efforts to Nazi death camps and alleging that NHS nurses should stand trial for genocide. Does my hon. Friend not agree that the changes the Government are now proposing will lead to enormous and very damaging impacts right through society?

Alex Davies-Jones

My right hon. Friend is absolutely right. I am keen to bring this back into scope before Mr Speaker chastises us any further, but she is right to say that this will have a direct real-world impact. This is what happens when we focus on content rather than directly on the platforms and the algorithms on the platforms proliferating this content. That is where the focus needs to be. It is the algorithms that share and amplify this content to these many followers time and again that need to be tackled, rather than the content itself. That is what we have been pleading with the Government to concentrate on, but here we are in this mess.

We are pleased that the Government have taken on board Labour’s policy to criminalise certain behaviours—including the encouragement of self-harm, sharing people’s intimate images without their consent, and controlling or coercive behaviours—but we believe that the communications offences more widely should remain in order to tackle dangerous online harms at their root. We have worked consistently to get this Bill over the line and we have reached out to do so. It has been subject to far too many delays and it is on the Government’s hands that we are again facing substantial delays, when internet regulation has never been more sorely needed. I know that the Minister knows that, and I sincerely hope he will take our concerns seriously. I reach out to him again across the Dispatch Box, and look forward to working with him and challenging him further where required as the Bill progresses. I look forward to getting the Bill on to the statute book.

Madam Deputy Speaker (Dame Rosie Winterton)

I call the Chair of the Select Committee.

Julian Knight (Solihull) (Con)

I welcome the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), to his place. To say that he has been given a hospital pass in terms of this legislation is a slight understatement. It is very difficult to understand, and the ability he has shown at the Dispatch Box in grasping many of the major issues is to his credit. He really is a safe pair of hands and I thank him for that.

Looking at the list of amendments, I think it is a bit of a hotchpotch, yet we are going to deal only with certain amendments today and others are not in scope. That shows exactly where we are with this legislation. We have been in this stasis now for five years. I remember that we were dealing with the issue when I joined the Digital, Culture, Media and Sport Committee, and it is almost three years since the general election when we said we would bring forward this world-leading legislation. We have to admit that is a failure of the political class in all respects, but we have to understand the problem and the realities facing my hon. Friend, other Ministers and the people from different Departments involved in drafting this legislation.

We are dealing with companies that are more powerful than the oil barons and railway barons of the 19th century. These companies are more important than many states. The total value of Alphabet, for instance, is more than the total GDP of the Netherlands, and that is probably a low estimate of Alphabet’s global reach and power. These companies are, in many respects, almost new nation states in their power and reach, and they have been brought about by individuals having an idea in their garage. They still have that culture of having power without the consequences that flow from it.

16:30
These companies have created wonderful things that enhance our lives in many respects through better communication and increased human knowledge, which we can barely begin to imagine, but they have done it with a skater boy approach—the idea that they are beyond the law. They had that enshrined in law in the United States, where they have effectively become nothing more than a megaphone or a noticeboard, and they have always relied on that. They are based or domiciled, in the main, in the United States, which is where they draw their legal power. They will always be in that position of power.
We talk about 10% fines and even business interruption to ensure these companies have skin in the game, but we have to realise these businesses are so gigantic and of such importance that they could simply ignore what we do in this place. Will we really block a major social media platform? The only time something like that has been done was when a major social media platform blocked a country, if I remember rightly. We have to understand where we are coming from in that respect.
This loose cannon, Elon Musk, is an enormously wealthy man, and he is quite strange, isn’t he? He is intrinsically imbued with the power of silicon valley and those new techno-masters of the universe. We are dealing with those realities, and this Bill is very imperfect.
Mr David Davis

My hon. Friend is giving a fascinating disquisition on this industry, but is not the implication that, in effect, these companies are modern buccaneer states and we need to do much more to legislate? I am normally a deregulator, but we need more than one Bill to do what we seek to do today.

Julian Knight

My right hon. Friend is correct. We spoke privately before this debate, and he said this is almost five Bills in one. There will be a patchwork of legislation, and there is a time limit. This is a carry-over Bill, and we have to get it on the statute book.

This Bill is not perfect by any stretch of the imagination, and I take the Opposition’s genuine concerns about legal but harmful material. The shadow Minister mentioned the tragic case of Molly Russell. I heard her father being interviewed on the “Today” programme, and he spoke about how at least three quarters of the content he had seen that had prompted that young person to take her life had been legal but harmful. We have to stand up, think and try our best to ensure there is a safer space for young people. This Bill does part of that work, but only part. The work will be done in the execution of the Bill, through the wording on age verification and age assurance.

Dame Maria Miller

Given the complexities of the Bill, and given the Digital, Culture, Media and Sport Committee’s other responsibilities, will my hon. Friend join me in saying there should be a special Committee, potentially of both Houses, to keep this area under constant review? That review, as he says, is so badly needed.

Julian Knight

I thank my right hon. Friend for her question, which I have previously addressed. The problem is the precedent it would set. Any special Committee set up by a Bill would be appointed by the Whips, so we might as well forget about the Select Committee system. This is not a huge concern for the Digital, Culture, Media and Sport Committee, because the advent of any such special Committee would probably be beyond the next general election, and I am not thinking to that timeframe. I am concerned about the integrity of Parliament. The problem is that if we do that in this Bill, the next Government will come along and do it with another Bill and then another Bill. Before we know it, we will have a Select Committee system that is Whips-appointed and narrow in definition, and that cuts across something we all vote for.

There are means by which we can have legislative scrutiny—that is the point I am making in my speech. I would very much welcome a Committee being set up after a year, temporarily, to carry out post-legislative scrutiny. My Committee has a Sub-Committee on disinformation and fake news, which could also look at this Bill going forward. So I do not accept my right hon. Friend’s point, but I appreciate completely the concerns about our needing proper scrutiny in this area. We must also not forget that any changes to Ofcom’s parameters can be put in a statutory instrument, which can be prayed against by the Opposition, and thus we would have the scrutiny of the whole House in debate, which is preferable to having a Whips-appointed Committee.

I have gone into quite a bit of my speech there, so I am grateful for that intervention in many respects. I am not going to touch on every aspect of this issue, but I urge right hon. and hon. Members in all parts of the House to think about the fact that although this is far from perfect legislation and it is a shame that we have not found a way to work through the legal but harmful material issue, we have to understand the parameters we are working in, in the real world, with these companies. We need to see that there is a patchwork of legislation, and the biggest way in which we can effectively let the social media companies know they have skin in the game in society—a liberal society that created them—is through competition legislation, across other countries and other jurisdictions. I am talking about our friends in the European Union and in the United States. We are working together closely now to come up with a suite of competition legislation. That is how we will be able to cover off some of this going forward. I will be supporting this Bill tonight and I urge everyone to do so, because, frankly, after five years I have had enough.

John Nicolson

I rise to speak to the amendments in my name and those of my right hon. and hon. Friends, which of course I support.

It is welcome to see the Online Safety Bill back in the House. As we have debated this Bill and nursed it, as in my case, through both the Bill Committee and the Joint Committee, we have shone a light into some dark corners and heard some deeply harrowing stories. Who can forget the testimony given to us by Molly Russell’s dad, Ian? As we have heard, in the Public Gallery we have bereaved families who have experienced the most profound losses due to the extreme online harms to which their loved ones have been exposed; representatives of those families are watching the proceedings today. The hon. Member for Pontypridd (Alex Davies-Jones) mentioned that Ian is here, but let me mention the names of the children. Amanda and Stuart Stephens are here, and they are the parents of Olly; Andy and Judy Thomas are here, and they are the parents of Frankie; and Lorin LaFave, the mother of Breck is here, as is Ruth Moss, the mother of Sophie. All have lost children in connection with online harms, and I extend to each our most sincere condolences, as I am sure does every Member of the House. We have thought of them time and time again during the passage of this legislation; we have thought about their pain. All of us hope that this Bill will make very real changes, and we keep in our hearts the memories of those children and other young people who have suffered.

In our debates and Committee hearings, we have done our best to harry the social media companies and some of their secretive bosses. They have often been hiding away on the west coast of the US, to emerge blinking into the gloomy Committee light when they have to answer some questions about their nefarious activities and their obvious lack of concern for the way in which children and others are impacted.

We have debated issues of concern and sometimes disagreement in a way that shows the occasional benefits of cross-House co-operation. I have been pleased to work with friends and colleagues in other parties at every stage of the Bill, not least on Zach’s law, which we have mentioned. The result is a basis of good, much-needed legislation, and we must now get it on to the statute book.

It is unfortunate that the Bill has been so long delayed, which has caused great stress to some people who have been deeply affected by the issues raised, so that they have sometimes doubted our good faith. These delays are not immaterial. Children and young teenagers have grown older in an online world full of self-harm—soon to be illegal harms, we hope. It is a world full of easy-to-access pornography with no meaningful age verification and algorithms that provide harmful content to vulnerable people.

I have been pleased to note that calls from Members on the SNP Benches and from across the House to ensure that specific protection is granted to women and girls online have been heeded. New communications offences on cyber-flashing and intimate image abuse, and similar offences, are to be incorporated. The requirements for Ofcom to consult with the Victims’ Commissioner and the Domestic Abuse Commissioner are very welcome. Reporting tools should also be more responsive.

New clause 28 is an important new clause that SNP Members have been proud to sponsor. It calls for an advocacy body to represent the interests of children. That is vital, because the online world that children experience is ever evolving. It is not the online world that we in this Chamber tend to experience, nor is it the one experienced by most members of the media covering the debate today. We need, and young people deserve, a dedicated and appropriately funded body to look out for them online—a strong, informed voice able to stand up to the representations of big tech in the name of young people. This will, we hope, ensure that regulators get it right when acting on behalf of children online.

I am aware that there is broad support for such a body, including from those on the Labour Benches. We on the SNP Benches oppose the removal of the aspect of the Bill related to legal but harmful material. I understand the free speech arguments, and I have heard Ministers argue that the Government have proposed alternative approaches, which, they say, will give users control over the content that they see online. But adults are often vulnerable, too. Removing measures from the Bill that can protect adults, especially those in a mental health spiral or with additional learning needs, is a dereliction of our duty. An on/off toggle for harmful content is a poor substitute for what was originally proposed.

The legal but harmful discussion was and is a thorny one. It was important to get the language of the Bill right, so that people could be protected from harm online without impinging on freedom of expression, which we all hold dear. However, by sending aspects of the Bill back to Committee, with the intention of removing the legal but harmful provisions, I fear that the Government are simply running from a difficult debate, or worse, succumbing to those who have never really supported this Bill—some who rather approve of the wild west, free-for-all internet. It is much better to rise to the challenge of resolving the conflicts, such as they are, between free speech and legal but harmful. I accept that the Government’s proposals around greater clarity and enforcement of terms and conditions and of transparency in reporting to Ofcom offer some mitigation, but not, in my view, enough.

Damian Collins

The hon. Gentleman will remember that, when we served on the Joint Committee that scrutinised the draft Bill, we were concerned that the term “legal but harmful” was problematic and that there was a lack of clarity. We thought it would be better to have more clarity and enforcement based on priority illegal offences and on the terms of service. Does he still believe that, or has he changed his mind?

John Nicolson

It is a fine debate. Like so much in legislation, there is not an absolute right and an absolute wrong. We heard contradictory evidence. It is important to measure the advantages and the disadvantages. I will listen to the rest of the debate very carefully, as I have done throughout.

As a journalist in a previous life, I have long been a proponent of transparency and open democracy—something that occasionally gets me into trouble. We on the SNP Benches have argued from the outset that the powers proposed for the Secretary of State are far too expansive and wide-reaching. That is no disrespect to the Minister or the new Secretary of State, but they will know that there have been quite a few Culture Secretaries in recent years, some more temperate than others.

In wishing to see a diminution of the powers proposed we find ourselves in good company, not least with Ofcom. I note that there have been some positive shifts in the proposals around the powers of the Secretary of State, allowing greater parliamentary oversight. I hope that these indicate a welcome acknowledgement that our arguments have fallen on fertile Government soil—although, of course, it could be that the Conservative Secretary of State realises that she may soon be the shadow Secretary of State and that it will be a Labour Secretary of State exercising the proposed powers. I hope she will forgive me for that moment’s cynicism.

16:45
As we have done throughout the progress of this Bill, the SNP will engage with the Government and our friends and colleagues on other Benches. We have worked hard on this Bill, as have so many other Members. In particular, I pay tribute to my friend the hon. Member for Folkestone and Hythe (Damian Collins), who I see sitting on the Back Benches after an all-too-short ministerial career. It has been a steep learning curve for us all. We have met some wonderful, motivated, passionate people, some with sad stories and some with inspiring stories. Let us do all we can to ensure that we do not let them down.
Priti Patel (Witham) (Con)

Before I speak to specific clauses I pay tribute to all the campaigners, particularly the families who have campaigned so hard to give their loved ones a voice through this Bill and to change our laws. Having had some prior involvement in the early stages of this Bill three years ago as Home Secretary, I also pay tribute to many of the officials and Members of this House on both sides who have worked assiduously on the construction, development and advancement of this Bill. In particular, I pay tribute to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and the work of the Joint Committee; when I was Home Secretary we had many discussions about this important work. I also thank the Minister for the assiduous way in which he has handled interventions and actually furthered the debate with this Bill. There are many Government Departments that have a raft of involvement and engagement.

The victims must be at the heart of everything that we do now to provide safeguards and protections. Children and individuals have lost their lives because of the online space. We know there is a great deal of good in the online space, but also a great deal of harm, and that must unite us all in delivering this legislation. We have waited a long time for this Bill, but we must come together, knowing that this is foundational legislation, which will have to be improved and developed alongside the technology, and that there is much more work to do.

I start by focusing on a couple of the new clauses, beginning with Government new clause 11 on end-to-end encryption. Given my background, the House will not be surprised by my interest in end-to-end encryption, particularly the harmful content and the types of individuals and perpetrators who hide behind it. We must acknowledge the individuals who harm children or who peddle terrorist content through end-to-end encryption, while recognising that encryption services are important to protect privacy.

There is great justification for encryption—business transactions, working for the Government and all sorts of areas of importance—but we must acknowledge in this House that there is more work to do, because these services are being used by those who would do harm to our country, threaten our national interest or threaten the safety of young people and children in particular. We know for a fact that there are sick-minded individuals who seek to abuse and exploit children and vulnerable adults. The Minister will know that, and I am afraid that many of us do. I speak now as a constituency Member of Parliament, and one of my first surgery cases back in 2010 was the sad and tragic case of a mother who came to see me because her son had accessed all sorts of content. Thanks to the Bill, that content will now be ruled as harmful. There were other services associated with access that the family could not see and could not get access to, and encryption platforms are part of that.

There are shocking figures, and I suspect that many of my colleagues in the House will be aware of them. Almost 100,000 reports relating to online child abuse were received by UK enforcement agencies in 2021 alone. That is shocking. The House will recognise my experience of working with the National Crime Agency, to which we must pay tribute for its work in this space, as we should to law enforcement more widely. Police officers and all sorts of individuals in law enforcement are, day in, day out, investigating these cases and looking at some of the most appalling images and content, all in the name of protecting vulnerable children, and we must pay tribute to them as well.

It is also really shocking that that figure of 100,000 reports in 2021 alone is a 29% increase on the previous year. The amount of disturbing content is going up and up, and we are, I am afraid, looking only at the tip of the iceberg. So, I think it is absolutely right—and I will always urge the Government and whichever Secretary of State, be they in the Home Office, DCMS or the MOJ—to put the right measures and powers in place so that we act to prevent child sexual abuse and exploitation, prevent terrorist content from being shielded behind the platforms of encryption and, importantly, bring those involved to face justice. End-to-end encryption is one thing, but we need end-to-end justice for victims and the prevention of the most heinous crimes.

This is where we, as a House, must come together. I commend the hon. Member for Rotherham (Sarah Champion) in particular for her work relating to girls, everything to do with the grooming gangs, and the most appalling crimes against individuals, quite frankly. I will always urge colleagues to support the Bill, on which we will need to build going forward.

I think I can speak with experience about the difficulties in drafting legislation—both more broadly and specifically in this area, which is complex and challenging. It is hard to foresee the multiplicity of circumstances. My hon. Friend the Member for Folkestone and Hythe was absolutely right to say in his comments to the SNP spokesman, the hon. Member for Ochil and South Perthshire (John Nicolson), that we have to focus on illegal content. It is difficult to get the balance right between the lawful and harmful. The illegal side is what we must focus on.

I also know that many campaigners and individuals—they are not just campaigners, but families—have given heartbreaking and devastating accounts of their experiences of online harms. As legislators, we owe them this Bill, because although their suffering is not something that we will experience, it must bring about the type of changes that we all want to see for everyone—children, adults and vulnerable individuals.

May I ask the Minister for reassurances on the definition of “best endeavours”? As my right hon. Friend the Member for Basingstoke (Dame Maria Miller) touched on, when it comes to implementation, that will be the area where the rubber hits the road. That is where we will need to know that our collective work will be meaningful and will deliver protections—not just change, but protections. We must be honest about the many serious issues that will arise even after we pass the Bill—be it, God forbid, a major terrorist incident, or cases of child sexual exploitation—and there is a risk that, without clarity in this area, when a serious issue does arise, we may not know whether a provider undertook best endeavours. I think we owe it to everyone to ensure that we run a slide rule over every single granular detail of this.

Cases and issues relating to best endeavours are debated and discussed extensively in court cases, coroners’ inquests and social services proceedings relating to child safeguarding issues, for example—all right hon. and hon. Members here will have experience of dealing with social services on behalf of their constituents in child protection cases—or, even worse, in serious case reviews or public inquiries that could come in future. I worry that in any response a provider could say that it did its best and had undertaken its best endeavours, as a defence. That would be unacceptable. That would lead those affected to feel as if they suffered an even greater injustice than the violations that they experienced. It is not clear whether best endeavours will be enough to change the culture, behaviour and attitudes of online platforms.

I raise best endeavours in the context of changing attitudes and cultures because in many institutions, that very issue is under live debate right now. That may be in policing, attitudes around women and girls or how we protect other vulnerable groups, even in other services such as the fire service, which we have heard about recently. It is important that we ask those questions and have the scrutiny. We need to hear more about what constitutes best endeavours. Who will hold the providers to account? Ofcom clearly has a role. I know the Minister will do a very earnest and diligent job to provide answers, but the best endeavours principle goes wider than just the Minister on the Front Bench—it goes across the whole of Government. He knows that we will give him every backing to use his sharp elbows—perhaps I can help with my sharp elbows—to ensure that others are held to account.

It will also be for Ofcom to give further details and guidance. As ever, the guidance will be so important. The guidance has to have teeth and statutory powers. It has to be able to put the mirror up and hold people to account. For example, would Ofcom be able, in its notices to providers, to instruct them to use specific technologies and programmes to tackle and end the exposure to exploitation, in relation to end-to-end encryption services, to protect victims? That is an open question, but one that could be put to Ofcom and could be an implementation test. There is no reason why we should not put a series of questions to Ofcom around how it would implement these provisions in practice.

I would like to ask the Minister why vulnerable adults and victims of domestic abuse and violence against women and girls are not included. We must do everything we can in this House. This is not about being party political. When it comes to all our work on women and violence against women and girls, there should be no party politics whatsoever. We should ensure that what is right for one group is consistent and that the laws are strengthened. That will require the MOJ, as well as the Home Office, to ensure that the work is joined up in the right kind of way.

It is right that powers are available for dealing with terrorist threats and tackling child sexual abuse thoroughly. There is some good work around terrorist content. There is excellent work in GIFCT, the Global Internet Forum to Counter Terrorism. The technology companies are doing great work. There is international co-operation in this space. The House should take some comfort in the fact that the United Kingdom leads the world in this space. We owe our gratitude to our intelligence and security agencies. I give my thanks to MI5 in particular for its work and to counter-terrorism policing, because they have led the world robustly in this work.

Damian Collins

My right hon. Friend makes an important point about this being a cross-Government effort. The Online Safety Bill creates a regulatory framework for the internet, but we need to make sure that we have the right offences in law clearly defined. Then it is easy to read them and cross-reference them with this legislation. If we do not have that, it is a job for the whole of Government.

Priti Patel

Exactly that. My hon. Friend is absolutely right. I come back to the point about drafting this legislation, which is not straightforward and easy because of the definitions. It is not just about what is in scope of the Bill but about the implications of the definitions and how they could be applied in law.

The Minister touched on the criminal side of things; interpretation in the criminal courts and how that would be applied in case law are the points that need to be fleshed out. This is where our work on CT is so important, because across the world with Five Eyes we have been consistent. Again, there are good models out there that can be built upon. We will not fix all this through one Bill—we know that. This Bill is foundational, which is why we must move forward.

On new clause 11, I seek clarity—in this respect, I need reassurance not from the Minister but from other parts of government—on how victims and survivors, whether of terrorist activity, domestic abuse or violence against women and girls, will be supported and protected by the new safeguards in the Bill, and by the work of the Victims’ Commissioner.

Rachel Maclean (Redditch) (Con)

I thank my right hon. Friend for sharing her remarks with the House. She is making an excellent speech based on her considerable experience. On the specific issue of child sexual abuse and exploitation, many organisations, such as the Internet Watch Foundation, are instrumental in removing reports and web pages containing that vile and disgusting material. In the April 2020 White Paper, the Government committed to look at how the Internet Watch Foundation could use its technical expertise in that field. Does she agree that it would be good to hear from the Minister about how the Internet Watch Foundation could work with Ofcom to assist victims?

17:00
Priti Patel

My hon. Friend is absolutely right. I thank her for not just her intervention but her steadfast work when she was a Home Office Minister with responsibility for safeguarding. I also thank the Internet Watch Foundation; many of the statistics and figures that we have been using about child sexual abuse and exploitation content, and the take-downs, are thanks to its work. There is some important work to do there. The Minister will be familiar with its work—[Interruption.] Exactly that.

We need the expertise of the Internet Watch Foundation, so it is about integrating that skillset. There is a great deal of expertise out there, including at the Internet Watch Foundation, at GIFCT on the CT side and, obviously, in our services and agencies. As my right hon. Friend the Member for Basingstoke said, it is crucial that we pool organisations’ expertise to implement the Bill, as we will not be able to create it all over again overnight in government.

I thank my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) for tabling new clause 16, which would create new offences to address the challenges caused by those who promote, encourage and assist self-harm. That has been the subject of much of the debate already, which is absolutely right when we think about the victims and their families. In particular, I thank the Samaritans and others for their work to highlight this important issue. I do not need to dwell on the Samaritans’ report, because I think all hon. Members have read it.

All hon. Members who spoke in the early stages of the Bill, which I did not because I was in government, highlighted this essential area. It is important to ensure that we do everything we can to address it in the right way. Like all right hon. and hon. Members, I pay tribute to the family of Molly Russell. There are no words for the suffering that they have endured, but their campaign of bravery, courage and fortitude aims to close every loophole to stop other young people being put at risk.

Right hon. and hon. Members meet young people in schools every week, and we are also parents and, in some cases, grandparents. To know that this grey area leaves so many youngsters at risk is devastating, so we have almost a collective corporate duty to stand up and do the right thing. The long and short of it is that we need to be satisfied, when passing the Bill, that we are taking action to protect vulnerable people and youngsters who are susceptible to dangerous communications.

As I have emphasised, we should also seek to punish those who cause and perpetrate this harm and do everything we can to protect those who are vulnerable, those with learning disabilities, those with mental health conditions, and those who are exposed to self-harm content. We need to protect them and we have a duty to do that, so I look forward to the Minister’s reply.

I welcome new clauses 45 to 50, tabled by my right hon. Friend the Member for Basingstoke. I pay tribute to her for her work; she has been a strong campaigner for protecting the privacy of individuals, especially women and children, and for closing loopholes that have enabled people to be humiliated or harmed in the ways she has spoken about so consistently in the House. I am pleased that the Deputy Prime Minister, my right hon. Friend the Member for Esher and Walton (Dominic Raab), announced last month that the Government would table amendments in the other place to criminalise the sharing of intimate images, photographs and videos without consent; that is long overdue. When I was Home Secretary I heard the most appalling cases, with which my right hon. Friend the Member for Basingstoke will be familiar. I have met so many victims and survivors, and we owe it to them to do the right thing.

It would be reassuring to hear not just from the Minister in this debate, but from other Ministers in the Departments involved in the Bill, to ensure they are consistent in giving voice to the issues and in working through their Ministries on the implementation—not just of this Bill, but of the golden thread that runs throughout the legislation. Over the last three years, we have rightly produced a lot of legislation to go after perpetrators, and support women and girls, including the Domestic Abuse Act 2021. We should use those platforms to stand up for the individuals affected by these issues.

I want to highlight the importance of the provisions to protect women and girls, particularly the victims and survivors of domestic abuse and violence. Some abusive partners and ex-partners use intimate images in their possession; as the Minister said, that is coercive control which means that the victim ends up living their life in fear. That is completely wrong. We have heard and experienced too many harrowing and shocking stories of women who have suffered as a result of the use of such images and videos. It must now be a priority for the criminal justice system, and the online platforms in particular, to remove such content. This is no longer a negotiation. Too many of us—including myself, when I was Home Secretary—have phoned platforms at weekends and insisted that they take down content. Quite frankly, I have then been told, “Twitter doesn’t work on a Saturday, Home Secretary” or “This is going to take time.” That is not acceptable. It is an absolute insult to the victims, and is morally reprehensible and wrong. The platforms must be held to account.

Hon. Members will be well aware of the Home Office’s work on the tackling violence against women and girls strategy. I pay tribute to all colleagues, but particularly my hon. Friend the Member for Redditch (Rachel Maclean), who was the Minister at the time. The strategy came about after much pain, sorrow and loss of life, and it garnered an unprecedented 180,000 responses. The range of concerns raised was predominantly related to the issues we are discussing today. We can no longer stay mute and turn a blind eye. We must ensure that the safety of women in the public space offline—on the streets—and online is respected. We know how women feel about the threats. The strategy highlighted so much; I do not want to go over it again, as it is well documented and I have spoken about it in the House many times.

It remains a cause of concern that the Bill does not include a specific violence against women and girls (VAWG) code of practice. We want and need the Bill. We are not going to fix everything through it, but, having spent valued time with victims and survivors, I genuinely believe that we could move towards a code of practice. Colleagues, this is an area on which we should unite, and we should bring such a provision forward; it is vital.

Let me say a few words in support of new clause 23, which was tabled by my right hon. Friend the Member for Basingstoke. I have always been a vocal and strong supporter of services for victims of crime, and of victims full stop. I think it was 10 years ago that I stood in this House and proposed a victims code of practice—a victims Bill is coming, and we look forward to that as well. This Government have a strong record of putting more resources into support for victims, including the £440 million over three years, but it is imperative that offenders—those responsible for the harm caused to victims—are made to pay, and it is absolutely right that they should pay more in compensation.

Companies profiteering from online platforms where these harms are being perpetrated should be held to account. When companies fail in their duties and have been found wanting, they must make a contribution for the harm caused. There are ways in which we can do that. There has been a debate already, and I heard the hon. Member for Pontypridd (Alex Davies-Jones) speak for the Opposition about one way, but I think we should be much more specific now, particularly in individual cases. I want to see those companies pay the price for their crimes, and I expect the financial penalties issued to reflect the severity of the harm caused—we should support that—and that such money should go to supporting the victims.

I pay tribute to the charities, advocacy groups and other groups that, day in and day out, have supported the victims of crime and of online harms. I have had an insight into that work from my former role in Government, but we should never underestimate how traumatic and harrowing it is. I say that about the support groups, but we have to magnify that multiple times for the victims. This is one area where we must ensure that more is done to provide extra resources for them. I look forward to hearing more from the Minister, but also from Ministers from other Departments in this space.

I will conclude on new clause 28, which has already been raised, on the advocacy body for children. There is a long way to go with this—there really is. Children are harmed in just too many ways, and the harm is unspeakable. We have touched on this in earlier debates and discussions on the Bill, in relation to child users on online platforms, and there will be further harm. I gently urge the Government—if not today or through this Bill, then later—to think about how we can pull together the skills and expertise in organisations outside this House and outside Government that give voice to children who have nowhere else to go.

This is not just about the online space; in the cases in the constituency of the hon. Member for Rotherham (Sarah Champion) and other constituencies, we have seen children being harmed under cover. Statutory services failed them and the state failed them. It was state institutional failure that let children down in the cases in Rotherham and other child grooming cases. We could see that all over again in the online space, and I really urge the Government to make sure that that does not happen—and actually never happens again, because those cases are far too harrowing.

There really is a lot here, and we must come together to ensure that the Bill comes to pass, but there are so many other areas where we can collectively put aside party politics and give voice to those who really need representation.

Dame Margaret Hodge

I pay tribute to all the relatives and families of the victims of online abuse who have chosen to be with us today. I am sure that, for a lot of you, our debate is very dry and detached, yet we would not be here but for you. Our hearts are with you all.

I welcome the Minister to his new role. I hope that he will guide his Bill with the same spirit set by his predecessors, the right hon. Member for Croydon South (Chris Philp) and the hon. Member for Folkestone and Hythe (Damian Collins), who is present today and has done much work on this issue. Both Ministers listened and accepted ideas suggested by Back Benchers across the House. As a result, we had a better Bill.

17:15
We all understand that this is groundbreaking legislation, and that it therefore presents us with complex challenges as we try to legislate to achieve the best answers to the horrific, fast-changing and ever-growing problems of online abuse. Given that complexity, and given that this is our first attempt at regulating online platforms, the new Minister would do well to build on the legacy of his predecessors and approach the amendments on which there are votes tonight as wholly constructive. The policies we are proposing enjoy genuine cross-party support, and are proposed to help the Minister not to cause him problems.
Let me express particular support for new clauses 45 to 50, in the name of the right hon. Member for Basingstoke (Dame Maria Miller), which tackle the abhorrent misogynistic problem of intimate image abuse, and amendments 1 to 14, in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), which address the issue of smaller platforms falling into category 2, which is now outside the scope of regulations. We all know that the smallest platforms can present the greatest risk. The killing of 51 people in the mosque in Christchurch, New Zealand, is probably the most egregious example, as the individual concerned used 8chan to plan his attack.
New clause 15, which I have tabled, seeks to place responsibility for complying with the new law unequivocally on the shoulders of individual directors of online platforms. As the Bill stands, criminal liability is enforced only when senior tech executives fail to co-operate with information requests from Ofcom. I agree that is far too limited, as the right hon. and learned Member for Kenilworth and Southam said. The Bill allows executives to choose and name the individual who Ofcom will hold to account, so that the company itself, not Ofcom, decides who is liable. That is simply not good enough.
Let me explain the thinking behind new clause 15. The purpose of the Bill is to change behaviour. Our experience in many other spheres of life tells us that the most effective way of achieving such change is to make individuals at the top of an organisation personally responsible for the behaviour of that organisation. We need to hold the chairmen and women, directors and senior executives to account by making those individuals personally liable for the practices and actions of their organisation.
Let us look at the construction industry, for example. Years ago, building workers dying on construction sites was an all too regular feature of the construction industry. Only when we reformed health and safety legislation and made the directors of construction companies personally responsible and liable for health and safety standards on their sites did we see an incredible 90% drop in deaths on building sites. Similarly, when we introduced corporate and director liability offences in the Bribery Act 2010, companies stopped trying to bribe their way into contracts.
It is not that we want to lock up directors of construction companies or trading companies, or indeed directors of online platforms; it is that the threat of personal criminal prosecution is the most powerful and effective way of changing behaviour. It is just the sort of deterrent tool that the Bill needs if it is to protect children and adults from online harms. That is especially important in this context, because the business model that underpins the profits that platforms enjoy encourages harmful content. The platforms need to encourage traffic on their sites, because the greater the traffic, the more attractive their sites become to advertisers; and the more advertising revenue they secure, the higher the profits they enjoy.
Harmful content attracts more traffic and so supports the platforms’ business objectives. We know that from studies such as the one by Harvard law professor Jonathan Zittrain, which showed that posts that tiptoe close to violating platforms’ terms and conditions generate far more engagement. We also know that from Mark Zuckerberg’s decisions in the lead-up to and just after the 2020 presidential election, when he personally authorised tweaks to the Facebook algorithm to reduce the spread of election misinformation. However, after the election, despite officials at Facebook asking for the change to stay, he ensured that the previous algorithm was reinstated. An internal Facebook memo revealed that the tweak preventing fake news had led to “a decrease in sessions”, which made the platform less attractive to advertisers and reduced his profits. Restoring fake news helped restore his profits.
The incentives in online platforms’ business models promote rather than prevent online harms, and we will not break those incentives by threatening to fine companies. We know from our experience elsewhere that, even at 10% of global revenue, such fines will inevitably be viewed as a cost to business, which will simply be passed on by raising advertising charges. However, we can and will break the incentives in the business model if we make Mark Zuckerberg or Elon Musk personally responsible for breaking the rules. It will not mean that we will lock them up, much as some of us might be tempted to do so. It will, however, provide that most powerful incentive that we have as legislators to change behaviour.
Furthermore, we know that the directors of online platforms personally take decisions in relation to harmful content, so they should be personally held to account. In 2018, Facebook’s algorithm was promoting posts for users in Myanmar that incited violence against protesters. The whistleblower Frances Haugen showed evidence that Facebook was aware that its engagement-based content was fuelling the violence, but it continued to roll it out on its platforms worldwide without checks. Decisions made at the top resulted in direct ethnic violence on the ground. That same year, Zuckerberg gave a host of interviews defending his decision to keep Holocaust denial on his platform, saying he did not believe that posts should be taken down for people getting it wrong. The debate continued for two years until 2020, when, only after months of protest, he finally decided to remove that abhorrent content.
In what world do we live where overpaid executives running around in their jeans and sneakers are allowed to make decisions on the hoof about how their platforms should be regulated without being held to account for their actions?
Mr David Davis

The right hon. Lady and I have co-operated to deal with international corporate villains, so I am interested in her proposal. However, a great number of these actions are taken by algorithms—I speak as someone who was taken down by a Google algorithm—so what happens then? I see no reason why we should not penalise directors, but how do we establish culpability?

Dame Margaret Hodge

That is for an investigation by the appropriate enforcement agency—Ofcom et al.—and if there is evidence that culpability rests with the managing director, the owner or whoever, they should be prosecuted. It is as simple as that. A case would have to be established through evidence, and that should be carried out by the enforcement agency. I do not think that this is any different from any other form of financial or other crime. In fact, it is from my experience in that that I came to this conclusion.

John Penrose (Weston-super-Mare) (Con)

The right hon. Lady is making a powerful case, particularly on the effective enforcement of rules to ensure that they bite properly and that people genuinely pay attention to them. She gave the example of a senior executive talking about whether people should be stopped for getting it wrong—I think the case she mentioned was Holocaust denial—by making factually inaccurate statements or allowing factually inaccurate statements to persist on their platform. May I suggest that her measures would be even stronger if she were to support new clause 34, which I have tabled? My new clause would require factual inaccuracy to become wrong, to be prevented and to be pursued by the kinds of regulators she is talking about. It would be a much stronger basis on which her measure could then abut.

Dame Margaret Hodge

Indeed. The way the hon. Gentleman describes his new clause, which I will look at, is absolutely right, but can I just make a more general point because it speaks to the point about legal but harmful? What I really fear with the legal but harmful rule is that we create more and more laws to make content illegal and that, ironically, locks up more and more people, rather than creating structures and systems that will prevent the harm occurring in the first place. So I am not always in favour of new laws simply criminalising individuals. I would love us to have kept to the legal but harmful route.

We can look to Elon Musk’s recent controversial takeover of Twitter. Decisions taken by Twitter’s newest owner—by Elon Musk himself—saw use of the N-word increase by nearly 500% within 12 hours of acquisition. And allowing Donald Trump back on Twitter gives a chilling permission to Trump and others to use the site yet again to incite violence.

The tech giants know that their business models are dangerous. Platforms can train their systems to recognise so-called borderline content and reduce engagement. However, it is for business reasons, and business reasons alone, that they actively choose not to do that. In fact, they do the opposite and promote content known to trigger extreme emotions. These platforms are like a “danger for profit” machine, and the decision to allow that exploitation is coming from the top. Do not take my word for it; just listen to the words of Ian Russell. He has said:

“The only person that I’ve ever come across in this whole world…that thought that content”—

the content that Molly viewed—

“was safe was…Meta.”

There is a huge disconnect between what Silicon Valley executives think is safe and what we expect, both for ourselves and for our children. By introducing liability for directors, the behaviour of these companies might finally change. Experience elsewhere has shown us that that would prove to be the most effective way of keeping online users safe. New clause 17 would hold directors of a regulated service personally liable on the grounds that they have failed, or are failing, to comply with any duties set in relation to their service—for instance, failure that leads to the death of a child. The new clause further states that the decision on who was liable would be made by Ofcom, not the provider, meaning that responsibility could not be shirked.

I say to all Members that if we really want to reduce the amount of harmful abuse online, then making senior directors personally liable is a very good way of achieving it. Some 82% of UK adults agree with us, Labour Front Benchers agree and Back Benchers across the House agree. So I urge the Government to rethink their position on director liability and support new clause 17 as a cross-party amendment. I really think it will make a difference.

Damian Collins

As Members know, there is a tradition in the United States that when the President signs a new Bill into law, people gather around him in the Oval Office, and multiple pens are used and presented to people who had a part in that Bill being drafted. If we required the King to do something similar with this Bill and gave a pen to every Minister, every Member who had served on a scrutiny Committee and every hon. Member who introduced an amendment that was accepted, we would need a lot of pens and it would take a long time. In some ways, however, that shows the House at its best; the Bill’s introduction has been a highly collaborative process.

The right hon. Member for Barking (Dame Margaret Hodge) was kind in her words about me and my right hon. Friend the Member for Croydon South (Chris Philp). I know that my successor will continue in the same tradition and, more importantly, that he is supported by a team of officials who have dedicated, in some cases, years of their career to the Bill, who care deeply about it and who want to see it introduced with success. I had better be nice to them because some of them are sitting in the Box.

17:30
It is easy to consider the Bill on Report as it stands now, thinking about some areas where Members think it goes too far and other areas where Members think it does not quite go far enough, but let us not lose sight of the fact that we are establishing a world-leading regulatory system. It is not the first in the world, but it goes further than any other system in the world in the scope of offences. Companies will have to show priority activity in identifying and mitigating the harm of the unlawful activity. A regulator will be empowered to understand what is going on inside the companies, challenge them on the way that they enforce their codes and hold them to account for that. We currently have the ability to do none of those things. Creating a regulator with that statutory power and the power to fine and demand evidence and information is really important.
The case of Molly Russell has rightly been cited as so important many times in this debate. One of the hardships was not just the tragedy that the family had to endure and the cold, hard, terrible fact—presented by the coroner—that social media platforms had contributed to the death of their daughter, but that it took years for the family and the coroner, going about his lawful duty, to get hold of the information that was required and to bring it to people’s attention. I have had conversations with social media companies about how they combat self-harm and suicide, including with TikTok about what they were doing to combat the “blackout challenge”, which has led to the death of children in this country and around the world. They reassure us that they have systems in place to deal with that and that they are doing all that they can, but we do not know the truth. We do not know what they can see and we have no legal power to readily get our hands on that information and publish it. That will change.
This is a systems Bill—the hon. Member for Pontypridd (Alex Davies-Jones) and I have had that conversation over the Dispatch Boxes—because we are principally regulating the algorithms and artificial intelligence that drive the recommendation tools on platforms. The right hon. Member for Barking spoke about that, as have other Members. When we describe pieces of content, they are exemplars of the problem, but the biggest problem is the systems effect. If people posted individually and organically, and that sat on a Facebook page or a YouTube channel that hardly anyone saw, the amount of harm done would be very small. The fact is, however, that those companies have created systems to promote content to people by data-profiling them to keep them on their site longer and to get them coming back more frequently. That has been done for a business reason—to make money. Most of the platforms are basically advertising platforms making money out of other people’s content.
That point touches on every issue that Members have raised so far today. The Bill squarely makes the companies fully legally liable for their business activity, what they have designed to make money for themselves and the detriment that that can cause other people. That amplification of content, giving people more of what they think they want, is seen as a net positive, and people think that it therefore must always be positive, but it can be extremely damaging and negative.
That is why the new measures that the Government are introducing on combating self-harm and suicide are so important. Like other Members, I think that the proposal from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) is important, and I hope that the Government’s amendment will address the issue fully. We are talking not just about the existing, very high bar in the law on assisting suicide, which almost means being present and part of the act. The act of consistently, systematically promoting content that exacerbates depression, anxiety and suicidal feelings among anyone, but particularly young people, must be an offence in law and the companies must be held to account for that.
When Ian Russell spoke about his daughter’s experience, I thought it was particularly moving when he said that police officers were not allowed to view the content on their own. They worked in shifts for short periods of time, yet that content was pushed at a vulnerable girl by a social media platform algorithm when she was on her own, probably late at night, with no one else to see it and no one to protect her. That was done in a systematic way, consistently, over a lengthy period of time. People should be held to account for that. It is outrageous—it is disgusting—that that was allowed to happen. Preventing that is one of the changes that the Bill will help us to deliver.
Mr David Davis

I listened with interest to the comments of the right hon. Member for Barking (Dame Margaret Hodge) about who should be held responsible. I am trying to think through how that would work in practice. Frankly, the adjudication mechanism, under Ofcom or whoever it might be, would probably take a rather different view in the case of a company: bluntly, it would go for “on the balance of probabilities”, whereas with an individual it might go for “beyond reasonable doubt”. I am struggling—really struggling—with the question of which would work best. Does my hon. Friend have a view?

Damian Collins

My right hon. Friend raises a very good question. As well as having a named individual with criminal liability for the supplying of information, should there be somebody who is accountable within a company, whether that comes with criminal sanctions or not—somebody whose job it is to know? As all hon. Members know if they have served on the Digital, Culture, Media and Sport Committee, which I chaired, on the Public Accounts Committee or on other Select Committees that have questioned people from the big tech companies, the frustrating thing is that no matter who they put up, it never seems to be the person who actually knows.

There needs to be someone who is legally liable, whether or not they have criminal liability, and is the accountable officer. In the same way as in a financial institution, it is really important to have someone whose job it is to know what is going on and who has certain liabilities. The Bill gives Ofcom the power to seek information and to appoint experts within a company to dig information out and work with the company to get it, but the companies need to feel the same sense of liability that a bank would if its systems had been used to launder money and it had not raised a flag.

Damian Collins

I will dare to give way to yet another former Committee Chair—the former chair of the Public Accounts Committee.

Dame Margaret Hodge

I draw all hon. Members’ attention to issues relating to Barclays Bank in the wake of the economic crisis. An authority—I think it was the Serious Fraud Office—attempted to hold both the bank and its directors to account, but it failed because there was not a corporate criminal liability clause that worked. It was too difficult. Putting such a provision in the Bill would be a means of holding individual directors as well as companies to account, whatever standard of proof was used.

Damian Collins

I thank the right hon. Lady for that information.

Let me move on to the debate about encryption, which my right hon. Friend the Member for Haltemprice and Howden has mentioned. I think it is important that Ofcom and law enforcement agencies be able to access information from companies that could be useful in prosecuting cases related to terrorism and child sexual exploitation. No one is suggesting that encrypted messaging services such as WhatsApp should be de-encrypted, and there is no requirement in the Bill for encryption to end, but we might ask how Meta makes money out of WhatsApp when it appears to be free. One way in which it makes money is by gathering huge amounts of data and information about the people who use it, about the names of WhatsApp groups and about the websites people visit before and after sending messages. It gathers a lot of background metadata about people’s activity around using the app and service.

If someone has visited a website on which severe illegal activity is taking place and has then used a messaging service, and the person to whom they sent the message has done the same, it should be grounds for investigation. It should be easy for law enforcement to get hold of the relevant information without the companies resisting. It should be possible for Ofcom to ask questions about how readily the companies make that information available. That is what the Government seek to do through their amendments on encryption. They are not about creating a back door for encryption, which could create other dangers, and not just on freedom of expression grounds: once a back door to a system is created, even if it is only for the company itself or for law enforcement, other people tend to find their way in.

Ian Paisley (North Antrim) (DUP)

I thank the hon. Member for jointly sponsoring my private Member’s Bill, the Digital Devices (Access for Next of Kin) Bill. Does he agree that the best way to make progress is to ensure open access for the next of kin to devices that a deceased person leaves behind?

Damian Collins

The hon. Member makes an important point. Baroness Kidron’s amendment has been referred to; I anticipate that future amendments in the House of Lords will also seek to address the issue, which our Joint Committee looked at carefully in our pre-legislative scrutiny.

It should be much easier than it has been for the Russell family and the coroner to gain access to such important information. However, depending on the nature of the case, there may well be times when it would be wrong for families to have access. I think there has to be an expedited and official process through which the information can be sought, rather than a general provision, because some cases are complicated. There should not be a general right in law, but it needs to be a lot easier than it is. Companies should make the information available much more readily than they have done. The Molly Russell inquest had to be delayed for four months because of the late release of thousands of pages of information from Meta to the coroner. That is clearly not acceptable either.

My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) has tabled an amendment relating to small and risky platforms. The categorisation of platforms on the basis of size was linked to duties under the “legal but harmful” provisions, which we expect now to change. The priority illegal harms apply to platforms of all sizes. Surely when illegal activity is taking place on any platform of any size—I hope that the Minister will clarify this later—Ofcom must have the right to intervene and start asking questions. I think that, in practice, that is how we should expect the system to work.

Like other Members who served on the Joint Committee—I am thinking particularly of my hon. Friends the Members for Watford (Dean Russell) and for Stourbridge (Suzanne Webb), both of whom spoke so passionately about this subject, and the hon. Member for Ochil and South Perthshire (John Nicolson) raised it as well—I was delighted to see that the Government had tabled amendments to cover Zach’s law. The fact that someone can deliberately seek out a person with epilepsy and target that person with flashing images with the intention of causing a seizure is a terrible example of the way in which systems can be abused. It is wrong for the platforms to be neutral and have no obligation to identify and stop that action, but the action is wrong in practice as well, and it demonstrates the need for us to ensure that the law keeps pace with the nature of new offences. I was very proud to meet Zach and his mother in October. I said to them then that their work had changed the law, and I am glad that the Government have tabled those amendments.

Dean Russell

May I pay tribute to my hon. Friend for his chairmanship of the Joint Committee last year? We covered a wide range of challenging ethical, moral and technical decisions, with work across both Houses, and I think that the decisions contained in our report informed many of the Government amendments, but it was my hon. Friend’s chairmanship that helped to guide us through that period.

Damian Collins

I am grateful to my hon. Friend for what he has said, and for his significant work on the Committee.

There is a great deal that we could say about this Bill, but let me end by touching on an important topic that I think my hon. Friend the Member for Dover (Mrs Elphicke) will speak about later: the way in which social media platforms are used by people trafficking gangs to recruit those who can help them with bringing people into the country in small boats. It was right that the Government included immigration offences in the list of priority legal harms in schedule 7. It was also right that, following a recommendation from the Joint Committee, they included fraud and scam ads in the scope of the Bill.

We have already accepted, in principle, that advertising can be within the Bill’s scope in certain circumstances, and that priority legal harms can be written into the Bill and identified as such. As I understand it, my hon. Friend’s amendment seeks to bring advertising services—not just organic posts on social media platforms—into the Bill’s scope as well. I know that the Government want to consider illegal activity in advertising as part of the online advertising review, but I hope that this could be an expedited process running in parallel with the Bill as it completes its stages. Illegal activity in advertising would not be allowed in the offline world. Newspaper editors are legally liable for what appears in their papers, and broadcasters can lose their licence if they allow illegal content to feature in advertising. We do not yet have the same enforcement mechanism through the advertising industry with the big online platforms, such as Google and Facebook, where the bulk of display advertising now goes. Their advertising market is bigger than the television advertising market. We are seeing serious examples of illegal activity, and it cannot be right that content that could not be posted organically on a Facebook page can nevertheless appear there if money is put behind it and it is run as an advertisement.

Priti Patel

My hon. Friend is making a very thoughtful speech. This is an important point, because it relates to criminality fuelled by online activity. We have discussed that before in the context of advertising. Tools already exist throughout Government to pick up such criminality, but we need the Bill to integrate them and drive the right outcomes—to stop this criminality, to secure the necessary prosecutions, and to bring about the deterrent effect that my hon. Friend the Member for Dover (Mrs Elphicke) is pursuing.

Mrs Natalie Elphicke (Dover) (Con)

Will my right hon. Friend give way?

Mrs Elphicke

I am grateful to my right hon. Friend for raising this and for his support in this important area that affects our constituencies so much. I will be speaking later to the details of this, which go beyond payment for advertising to the use, showing and sharing of such content. As he has mentioned schedule 7, does he agree that there is—as I have set out in my amendment—a strong case for making sure that it covers all those illegal immigration and modern slavery offences, given the incredible harm that is being caused and that we see on a day-to-day basis?

17:45
Damian Collins

I agree with my hon. Friend, which is why I think it is important that immigration offences were included in schedule 7 of the Bill. I think this is something my right hon. Friend the Member for Croydon South felt strongly about, having been Immigration Minister before he was a tech Minister. It is right that this has been included in the scope of the Bill and I hope that when the code of practice is developed around that, the scope of those offences will be made clear.

On whether advertising should be included as well as other postings, it may well be that at this time the Online Safety Bill is not necessarily the vehicle through which that needs to be incorporated. It could be done separately through the review of the online advertising code. Either way, these are loopholes that need to be closed, and the debate around the Online Safety Bill has brought about a recognition of what offences can be brought within the regulatory scope of the Bill and where Ofcom can have a role in enforcing those measures. Indeed, the measures on disinformation in the National Security Bill are a good example of that. In some ways it required the National Security Bill to create the offence, and then the offence could be read across into the Online Safety Bill and Ofcom could play a role in regulating the platforms to ensure that they complied with requests to take down networks of Russian state-backed disinformation. Something similar could work with immigration offences as well, but whether it is done that way or through the online advertising review or through new legislation, this is a loophole that needs to be closed.

Sarah Champion (Rotherham) (Lab)

I am learning so much sitting here. I am going to speak just on child protection, but all of us are vulnerable to online harms, so I am really grateful to hon. Members across the House who are bringing their specialisms to this debate with the sole aim of strengthening this piece of legislation to protect all of us. I really hope the Government listen to what is being said, because there seems to be a huge amount of consensus on this.

The reason I am focusing on child protection is that every police officer in this field that I talk to says that, in almost every case, abusers are now finding children first through online platforms. We cannot keep up with the speed or the scale of this, so I look to this Bill to try to do so much more. My frustration is that when the Bill first started, we were very much seen as a world leader in this field, but now the abuse has become so prolific, other countries have stepped in and we are sadly lagging behind, so I really hope the Minister does everything he can to get this into law as soon as possible.

Although there are aspects of the Bill that go a long way towards tackling child abuse online, it is far from perfect. I want to speak on a number of specific ways in which the Minister can hopefully improve it. The NSPCC has warned that over 100 online grooming and child abuse image crimes are likely to be recorded every day while we wait for this crucial legislation to pass. Of course, that is only the cases that are recorded. The number is going to be far greater than that. There are vital protections in the Bill, but there is a real threat that the use of virtual private networks—VPNs—could undermine the effectiveness of these measures. VPNs allow internet users to hide their private information, such as their location and data. They are commonly used, and often advertised, as a way for people to protect their data or watch online content. For example, on TV services such as Netflix, people might be able to access something only in the US, so they could use a VPN to circumvent that restriction and watch it in this country.

During the Bill’s evidence sessions, Professor Clare McGlynn said that 75% of children aged 16 and 17 used, or knew how to use, a VPN, which means that they can avoid age verification controls. So if companies use age assurance tools, as listed in the safety duties of this Bill, there is no guarantee that they will provide the protections that are needed. I am also concerned that the use of VPNs could act as a barrier to removing indecent or illegal material from the internet. The Internet Watch Foundation uses a blocking list to remove this content from internet service providers, but users with a VPN usually bypass the protections that those providers apply. It also concerns me that a VPN could be cited in court as a way of circumventing this legislation, which is very much based in the UK. Have the Government tested what will happen if someone uses a VPN to give the appearance of being overseas?

My new clause 54 would require the Secretary of State to publish, within six months of the Bill’s passage, a report on the effect of VPN use on Ofcom’s ability to enforce the requirements under clause 112. If VPNs cause significant issues, the Government must identify those issues and find solutions, rather than avoiding difficult problems.

New clause 28 would establish a user advocacy body to represent the interests of children in regulatory decisions. Children are not a homogenous group, and an advocacy body could reflect their diverse opinions and experiences. This new clause is widely supported in the House, as we have heard, and the NSPCC has argued that it would be an important way to counterbalance the attempts of big tech companies to reduce their obligations, which are placing their interests over children’s needs.

I would like to see more third sector organisations consulted on the code of practice. The Internet Watch Foundation, which many Members have discussed, already has the necessary expertise to drastically reduce the amount of child sexual abuse material on the internet. The Government must work with the IWF and build on its knowledge of web page blocking and image hashing.

Girls in particular face increased risk on social media, with the NSPCC reporting that nearly a quarter of girls who have taken a nude photo have had their image sent to someone else online without their permission. New clauses 45 to 50 would provide important protections to women and girls from intimate image abuse, by making the non-consensual sharing of such photos illegal. I am pleased that the Government have announced that they will look into introducing these measures in the other place, but we are yet to see any measures to compare with these new clauses.

In the face of the huge increase in online abuse, victims’ services must have the necessary means to provide specialist support. Refuge’s tech abuse team, for example, is highly effective at improving outcomes for thousands of survivors, but the demand for its services is rapidly increasing. It is only right that new clause 23 is instated so that a good proportion of the revenue made from the Bill’s provisions goes towards funding these vital services.

The landmark report by the independent inquiry into child sexual abuse recently highlighted that, between 2017-18 and 2020-21, there was an approximately 53% rise in recorded grooming offences. With this crime increasingly taking place online, the report emphasised that internet companies will need more moderators to aid technology in identifying this complex type of abuse. I urge the Minister to also require internet companies to provide sufficient and meaningful support to those moderators, who have to view and deal with disturbing images and videos on a daily basis. They, as well as the victims of these horrendous crimes, deserve our support.

I have consistently advocated for increased prevention of abuse, particularly through education in schools, but we must also ensure that adults, particularly parents, are educated about the threats online. Internet Matters found that parents underestimate the extent to which their children are having negative experiences online, and that the majority of parents believe their 14 to 16-year-olds know more about technology than they do.

The example that most sticks in my mind was provided by the then police chief in charge of child protection, who said, “What is happening on a Sunday night is that the family are sitting in the living room, all watching telly together. The teenager is online, and is being abused online.” In his words, “You wouldn’t let a young child go and open the door without knowing who is there, but that is what we do every day by giving them their iPad.”

If parents, guardians, teachers and other professionals are not aware of the risks and safeguards, how are they able to protect children online? I strongly encourage the Government to accept new clauses 29 and 30, which would place an additional duty on Ofcom to promote media literacy. Minister, you have the potential—

Sarah Champion

Thank you, Madam Deputy Speaker. The Minister has the potential to do so much with this Bill. I urge him to do it, and to do it speedily, because that is what this country really needs.

Mr David Davis

I do not agree with every detail of what the hon. Member for Rotherham (Sarah Champion) said, but I share her aims. She has exactly the right surname for what she does in standing up for children.

To avoid the risk of giving my Whip a seizure, I congratulate the Government and the Minister on all they have done so far, both in delaying the Bill and in modifying their stance.

My hon. Friend the Member for Solihull (Julian Knight), who is no longer in the Chamber, said that this is five Bills in one and should have had massively more time. At the risk of sounding like a very old man, there was a time when this Bill would have had five days on Report. That is what should have happened with such a big Bill.

Opposition Members will not agree, but I am grateful that the Government decided to remove the “legal but harmful” clause. The simple fact is that the hon. Member for Pontypridd (Alex Davies-Jones) and I differ not in our aim—my new clause 16 is specifically designed to protect children—but on the method of achieving it. Once upon a time, there was a tradition that this Chamber would consider a Companies Bill every year, because things change over time. We ought to have a digital Bill every year, specifically to address not “legal but harmful” but, “Is it harmful enough to be made illegal?” Obviously, self-harm material is harmful enough to be made illegal.

The hon. Lady and I have similar aims, but we have different perspectives on how to attack this. My perspective is as someone who has seen many pieces of legislation go badly wrong despite the best of intentions.

The Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), knows he is a favourite of mine. He did a fantastic job in his previous role. I think this Bill is a huge improvement, but he has a lot more to do, as he recognises with the Bill returning to Committee.

One area on which I disagree with many of my hon. and right hon. Friends is the question of encryption. The Bill allows Ofcom to issue notices directing companies to use “accredited technology,” but it might as well say “magic,” because we do not know what is meant by “accredited technology.” Clause 104 will create a pressure to undermine the end-to-end encryption that is not only desirable but crucial to our telecommunications. The clause sounds innocuous and legalistic, especially given that the notices will be issued to remove terrorist or child sexual exploitation content, which we all agree has no place online.

Damian Collins

Rather than it being magic, does my right hon. Friend agree that a company could not ignore it if we demystified the process? If we say that there is an existing technology that is available and proven to work, the company will have to explain why it is not using that technology or something better.

Mr Davis

I will come back to that in some detail.

The first time I used encryption it was one-time pads and Morse, so it was a long time ago. The last time was much more recent. The issue here is that clause 104 causes pressure by requiring real-time decryption. The only way to do that is by either having it unencrypted on the server, having it weakly encrypted or creating a back door. I am talking not about metadata, which I will come back to in a second, but about content. In that context, if the content needs to be rapidly accessible, it is bound to lead to weakened encryption.

This is perhaps a debate for a specialist forum, but it is very dangerous in a whole series of areas. What do we use encryption for? We use it for banking, for legal and privileged conversations, and for conversations with our constituents and families. I could go on and on about the areas in which encryption matters.

Adam Afriyie (Windsor) (Con)

My right hon. Friend will be aware that the measure will encompass every single telephone conversation when it switches to IP. That is data, too.

Mr Davis

That is correct. The companies cannot easily focus the measure on malicious content alone, and that is the problem. With everything we do in enforcing the law, we have to balance the extent to which we make the job of the law enforcement agency possible—ideally, easy—against the rights we take away from innocent citizens. That is the key balance. Many bad things happen in households, but we do not require people to live in houses with glass walls. That shows the intrinsic problem we have.

18:09
That imposition on privacy cannot sit comfortably with anybody who takes privacy rights seriously. As an aside, let me say to the House that the last thing we need, given that we want something to happen quickly, or at least effectively and soon, is to find ourselves in a Supreme Court case or a European Court case on privacy imposition. I do not think that is necessary. That is where I think the argument stands. If we end up in a case like that, it will not be about paedophiles or criminals; it will be about the weakening of the encryption of the data of an investigative journalist or a whistleblower. That is where it will come back to haunt us and we have to put that test on it. That is my main opening gambit.
I am conscious that everybody has spoken for quite a long time, so I am trying to make this short. However, the other thing I wish to say is that we have weapons, particularly in terms of metadata. If I recall correctly, Facebook takes down some 300,000 sites for paedophile content alone and millions for other reasons, so the use of metadata is very important. Europol carried out a survey of what was useful in terms of the data arising from the internet, social media and the like, and content was put at No. 7, after all sorts of other data. I will not labour the point, but I just worry about this. We need to get it right, and so far we have taken more of a blunderbuss approach than a rifle shot. We need to correct that, which is what my two amendments are about.
The other thing I briefly wish to talk about is new clause 16, which a number of people have mentioned in favourable terms. It would make it an offence to encourage or assist another person to self-harm—that includes suicide. I know that the Government have difficulties getting their proposed provisions right in how they interact with other legislation—the suicide legislation and so on. I will be pressing the new clause to a vote. I urge the Government to take this new clause and to amend the Bill again in the Lords if it is not quite perfect. I want to be sure that this provision goes into the legislation. It comes back to the philosophical distinction involving “legal but harmful”, a decision put first in the hands of a Minister and then in the hands of an entirely Whip-chosen statutory instrument Committee, neither of which is a trustworthy vehicle for the protection of free speech. My approach would take it from there and put it in the hands of this Chamber and the other place. Our control, in as much as we control the internet, should be through primary legislation, with maximum scrutiny, exposure and democratic content. If we do it in that way, nobody can argue with us and we will be world leaders, because we are pretty much the only people who can do that.
As I say, we should come back to this area time and time again, because this Bill will not be the last shot at it. People have talked about the “grey area”. How do we assess a grey area? Do I trust Whitehall to do it? No, I do not; good Minister though we have, he will not always be there and another Minister will be in place. We may have the British equivalent of Trump one day, who knows, and we do not want to leave this provision in that context. We want this House, and the public scrutiny that this Chamber gets, to be in control of it.
Sir William Cash (Stone) (Con)

Many years ago, in the 1970s, I was much involved in the Protection of Children Bill, which was one of the first steps in condemning and making illegal explicit imagery of children and their involvement in the making of such films. We then had the broadcasting Acts and the video Acts, and I was very much involved at that time in saying that we ought to prohibit such things in videos and so on. I got an enormous amount of flak for that. We have now moved right the way forward and it is tremendous to see not only the Government but the Opposition co-operating on this theme. I very much sympathise with not only what my right hon. Friend has just said—I am very inclined to support his new clause for that reason—but with what the right hon. Member for Barking (Dame Margaret Hodge) said. I was deeply impressed by the way in which she presented the argument about the personal liability of directors. We cannot distinguish between a company and the people who run it, and I am interested to hear what the Government have to say in reply to that.

Mr Davis

I very much agree with my hon. Friend on that. He and I have been allies in the past—and sometimes opponents—and he has often been far ahead of other people. I am afraid that I do not remember the example from the 1970s, as that was before even my time here, but I remember the intervention he made in the 1990s and the fuss it caused. From that point of view, I absolutely agree with him. My new clause is clearly worded and I hope the House will give it proper consideration. It is important that we put something in the Bill on this issue, even if the Government, quite properly, amend it later.

I wish to raise one last point, which has come up as we have talked through these issues. I refer to the question of individual responsibility. One or two hon. Ladies on the Opposition Benches have cited algorithmic outcomes. As I said to the right hon. Member for Barking, I am worried about how we place the responsibility, and how it would lead the courts to behave, and so on. We will debate that in the next few days and when the Bill comes back again.

There is one other issue that nothing in this Bill covers, and I am not entirely sure why. Much of the behaviour pattern is algorithmic and it is algorithmic with an explicit design. As a number of people have said, it is designed as clickbait; it is designed to bring people back. We may get to a point, particularly if we come back to this year after year, of saying, “There are going to be rules about your algorithms, so you have to write it into the algorithm. You will not use certain sorts of content, pornographic content and so on, as clickbait.” We need to think about that in a sophisticated and subtle way. I am looking at my hon. Friend the Member for Folkestone and Hythe (Damian Collins), the ex-Chairman of the Select Committee, on this issue. If we are going to be the innovators—and we are the digital world innovators— we have to get this right.

Damian Collins

My right hon. Friend is right to raise this important point. The big area here is not only clickbait, but AI-generated recommendation tools, such as a news feed on Facebook or “next up” on YouTube. Mitigating the illegal content on the platforms is not just about content moderation and removal; it is about not promoting it in the first place.

Mr Davis

My hon. Friend is exactly right about that. I used the example of clickbait as shorthand. The simple truth is that “AI-generated” is also a misnomer, because these things are not normally AI; they are normally algorithms written specifically to recommend and to maximise returns and revenue. We are not surprised at that. Why should we be? After all, these are commercial companies we are talking about, and that is what they are going to do. Every commercial company in the world operates within a regulatory framework that prevents it from making profits out of antisocial behaviour.

Aaron Bell (Newcastle-under-Lyme) (Con)

On the AI point, let me say that the advances we have seen over the weekend are remarkable. I have just asked OpenAI.com to write a speech in favour of the Bill and it is not bad. That goes to show that the risks to people are not just going to come from algorithms; people are going to be increasingly scammed by AI. We need a Bill that can adapt with the times as we move forward.

Mr Davis

Perhaps we should run my speech against—[Laughter.] I am teasing. I am coming to the end of my comments, Madam Deputy Speaker. The simple truth is that these mechanisms—call them what you like—are controllable if we put our mind to it. It requires subtlety, testing the thing out in practice and enormous expert input, but we can get this right.

Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

It will be obvious to everyone present that a great many Members wish to speak. Although we have a lot of time for this Bill, it is not infinite, and some speeches, so far, have been extremely long. I am trying to manage this without a formal time limit, because the debate flows better without one, but I hope that Members will now limit themselves to around eight minutes. If they do not do so, there will be a formal time limit of less than eight minutes.

John McDonnell (Hayes and Harlington) (Lab)

The debate so far has been serious, and it has respected the views that have been expressed not only by Members from across the House, on a whole range of issues, but by the families joining us today who have suffered such a sad loss.

I wish to address one detailed element of the Bill, and I do so in my role as secretary of the National Union of Journalists’ cross-party parliamentary group. It is an issue to which we have returned time and again when we have been debating legislation of this sort. I just want to bring it to the attention of the House; I do not intend to divide the House on this matter. I hope that the Government will take up the issue, and then, perhaps, when it goes to the other place, it will be resolved more effectively than it has been in this place. I am happy to offer the NUJ’s services in seeking to provide a way forward on this matter.

Many investigative journalists base their stories on confidential information, disclosed often by whistleblowers. There has always been an historic commitment—in this House as well—to protect journalists’ right to protect their sources. It has been at the core of the journalists’ code of practice, promoted by the NUJ. As Members know, in some instances, journalists have even gone to prison to protect their sources, because they believe that it is a fundamental principle of journalism, and also a fundamental principle of the role of journalism in protecting our democracy.

The growth in the use of digital technology in journalism has raised real challenges in protecting sources. In the case of traditional material, a journalist has possession of it, whereas with digital technology a journalist does not own or control the data in the same way. Whenever legislation of this nature is discussed, there has been a long-standing, cross-party campaign in the House to seek to protect this code of practice of the NUJ and to provide protection for journalists to protect their sources and their information. It goes back as far as the Police and Criminal Evidence Act 1984. If Members can remember the operation of that Act, they will know that it requires the police or the investigatory bodies to obtain a production order, and requires notice to be given to journalists of any attempt to access information. We then looked at it again in the Investigatory Powers Act 2016. Again, what we secured there were arrangements by which there should be prior approval by a judicial commissioner before an investigating authority can seek communications data likely to compromise a journalist’s sources. There has been a consistent pattern.

To comply with Madam Deputy Speaker’s attempt to constrain the length of our speeches, let me briefly explain to Members what amendment 204 would do. It is a moderate probing amendment, which seeks to ask the Government to look again at this matter. When Ofcom is determining whether to issue a notice to intervene, or when it is issuing a notice to a tech platform to monitor user-to-user content, the amendment asks it to consider the level of risk of the specified technology accessing, retaining or disclosing the identity of any confidential journalistic source or confidential journalistic material. The amendment stands in the tradition of the other amendments that have been tabled in this House and that successive Governments have agreed to. It puts the onus on Ofcom to consider how to ensure that technologies can be limited to the purpose that was intended. It should not result in massive data-harvesting operations, which were referred to earlier, or become a back-door way for investigating authorities to obtain journalistic data or material without official judicial approval.

18:15
Mr Davis

I rise in support of the right hon. Gentleman. The production order structure, as it stands, is already being abused: I know of a case under way today. The Bill should be stronger and clearer—it contains almost nothing on this—on the protection of journalists, whistleblowers and all those acting for public interest reasons.

John McDonnell

The right hon. Gentleman and I have some form on this matter going back a number of years. The amendment is in the tradition that this House has followed of passing legislation to protect journalists, their sources and their material. I make this offer again to the Minister: the NUJ is happy to meet and discuss how the matter can be resolved effectively through the tabling of an amendment in the other place or discussions around codes of practice. However, I emphasise to the Minister that, as we have found previously, the stronger protection is through a measure in the Bill itself.

Sir Jeremy Wright (Kenilworth and Southam) (Con)

I rise to speak to amendments 1 to 9 and new clause 1 in my name and the names of other hon. and right hon. Members. They all relate to the process of categorisation of online services, particularly the designation of some user-to-user services as category 1 services. There is some significance in that designation. In the Bill as it stands, perhaps the greatest significance is that only category 1 services have to concern themselves with so-called “legal but harmful” content as far as adults are concerned. I recognise that the Government have advertised their intention to modify the Bill so that users are offered instead mechanisms by which they can insulate themselves from such content, but that requirement, too, would only apply to category 1 services. There are also other obligations to which only category 1 services are subject—to protect content of democratic importance and journalistic content, and extra duties to assess the impact of their policies and safety measures on rights of freedom of expression and privacy.

Category 1 status matters. The Bill requires Ofcom to maintain a register of services that qualify as category 1 based on threshold criteria set out in regulations under schedule 11 of the Bill. As schedule 11 stands, the Secretary of State must make those regulations, specifying threshold conditions, which Ofcom must then apply to designate a service as category 1. That is based only on the number of users of the service and its functionalities, which are defined in clause 189.

Amendments 2 to 8 would replace the word “functionalities” with the word “characteristics”. This term is defined in amendment 1 to include not only functionalities—in other words, what can be done on the platform—but other aspects of the service: its user base; its business model; and its governance and other systems and processes. Incidentally, that definition of the term “characteristics” is already in the Bill in clause 84 dealing with risk profiles, so it is a definition that the Government have used themselves.

Categorisation is about risk, so the amendments ask more of platforms and services where the greatest risk is concentrated; but the greatest risk will not always be concentrated in the functionality of an online service. For example, its user base and business model will also disclose a significant risk in some cases. I suggest that there should be broader criteria available to Ofcom to enable it to categorise. I also argue that the greatest risk is not always concentrated on the platforms with the most users. Amendment 9 would change schedule 11 from its current wording, which requires the meeting of both a scale and a functionality threshold for a service to be designated as category 1, to instead require only one or the other.

Very harmful content being located on smaller platforms is an issue that has been discussed many times in consideration of the Bill. That could arise organically or deliberately, with harmful content migrating to smaller platforms to escape more onerous regulatory requirements. Amendment 9 would resolve that problem by allowing Ofcom to designate a service as category 1 based on its size or on its functionalities—or, better yet, on its broader characteristics.

I do not want to take too many risks, but I think the Government have some sympathy with my position, based on the indicative amendments they have published for the further Committee stage they would like this Bill to have. I appreciate entirely that we are not discussing those amendments today, but I hope, Madam Deputy Speaker, you will permit me to make some brief reference to them, as some of them are on exactly the same territory as my amendments here.

Some of those amendments that the Government have published would add the words “any other characteristics” to schedule 11 provisions on threshold conditions for categorisation, and define them in a very similar way to my amendment 1. They may ask whether that will answer my concerns, and the answer is, “Nearly.” I welcome the Government’s adding other characteristics to the consideration, not just of threshold criteria, but to the research Ofcom will carry out on how threshold conditions will be set in the first place, but I am afraid that they do not propose to change schedule 11, paragraph 1(4), which requires regulations made on threshold conditions to include,

“at least one specified condition about number of users and at least one specified condition about functionality.”

That means that to be category 1, a service must still be big.

I ask the Minister to consider again very carefully a way in which we can meet the genuine concern about high harm on small platforms. The amendment that he is likely to bring forward in Committee will not yet do so comprehensively. I also observe in passing that the reference the Government make in those amendments to any other characteristics is to those that the Secretary of State considers relevant, not those that Ofcom considers relevant—but that is perhaps a conversation for another day.

Secondly, I come on to the process of re-categorisation and new clause 1. It is broadly agreed in this debate that this is a fast-changing landscape; platforms can grow quickly, and the nature and scale of the content on them can change fast as well. If the Government are wedded to categorisation processes with an emphasis on scale, then the capacity to re-categorise a platform that is now category 2B but might become category 1 in the future will be very important.

That process is described in clause 83 of the Bill, but no timeframes or time limits for the re-categorisation process are set out. We can surely anticipate that some category 2B platforms might be reluctant to take on the additional obligations of category 1 status, and may not readily acquiesce in re-categorisation but instead dispute it, including through an appeal to the tribunal provided for in clause 139. That would mean that re-categorisation could take some time after Ofcom has decided to commence it and communicate it to the relevant service. New clause 1 is concerned with what happens in the meantime.

To be clear, I would not expect the powers that new clause 1 would create to be used often, but I can envisage circumstances where they would be beneficial. Let us imagine that the general election is under way—some of us will do that with more pleasure than others. Category 1 services have a particular obligation to protect content of democratic importance, including of course by applying their systems and processes for moderating content even-handedly across all shades of political opinion. There will not be a more important time for that obligation than during an election.

Let us assume also that a service subject to ongoing re-categorisation, because in Ofcom’s opinion it now has considerable reach, is not applying that even-handedness to the moderation of content or even to its removal. Formal re-categorisation and Ofcom powers to enforce a duty to protect democratic content could be months away, but the election will be over in weeks, and any failure to correct disinformation against a particular political viewpoint will be difficult or impossible to fully remedy by retrospective penalties at that point.

New clause 1 would give Ofcom injunction-style powers in such a scenario to act as if the platform is a category 1 service where that is,

“necessary to avoid or mitigate significant harm.”

It is analogous in some ways to the powers that the Government have already given to Ofcom to require a service to address a risk that it should have identified in its risk assessment but did not because that risk assessment was inadequate, and to do so before the revised risk assessment has been done.

Again, the Minister may say that there is an answer to that in a proposed Committee stage amendment to come, but I think the proposal that is being made is for a list of emerging category 1 services—those on a watchlist, as it were, as being borderline category 1—but that in itself will not speed up the re-categorisation process. It is the time that that process might take that gives rise to the potential problem that new clause 1 seeks to address.

I hope that my hon. Friend the Minister will consider the amendments in the spirit they are offered. He has probably heard me say before—though perhaps not, because he is new to this, although I do not think anyone else in the room is—that the right way to approach this groundbreaking, complex and difficult Bill is with a degree of humility. That is never an easy sell in this institution, but I none the less think that if we are prepared to approach this with humility, we will all accept, whether Front Bench or Back Bench, Opposition or Government, that we will not necessarily get everything right first time.

Therefore, these Report stages in this Bill of all Bills are particularly important to ensure that where we can offer positive improvements, we do so, and that the Government consider them in that spirit of positive improvement. We owe that to this process, but we also owe it to the families who have been present for part of this debate, who have lost far more than we can possibly imagine. We owe it to them to make sure that where we can make the Bill better, we make it better, but that we do not lose the forward momentum that I hope it will now have.

Neale Hanvey (Kirkcaldy and Cowdenbeath) (Alba)

I approach my contribution from the perspective of the general principle, the thread that runs through all the amendments on the paper today on safety, freedom of speech, illegal content and so on. That thread is how we deal with the harm landscape and the real-world impact of issues such as cyber-bullying, revenge porn, predatory grooming, self-harm or indeed suicide forums.

There is a serious risk to children and young people, particularly women and girls, on which there has been no debate allowed: the promulgation of gender ideology pushed by Mermaids and other so-called charities, which has created a toxic online environment that silences genuine professional concern, amplifies unquestioned affirmation and brands professional therapeutic concern, such as that of James Esses, a therapist and co-founder of Thoughtful Therapists, as transphobic. That approach, a non-therapeutic and affirmative model, has been promoted and fostered online.

The reality is that adolescent dysphoria is a completely normal thing. It can be a response to disruption from adverse childhood experiences or trauma, it can be a feature of autism or personality disorders or it can be a response to the persistence of misogynistic social attitudes. Dysphoria can present and manifest in many different ways, not just gender. If someone’s gender dysphoria persists even after therapeutic support, I am first in the queue to defend that person and ensure their wishes are respected and protected, but it is an absolute falsity to give young people information that suggests there is a quick-fix solution.

It is not normal to resolve dysphoria with irreversible so-called puberty blockers and cross-sex hormones, or with radical, irreversible, mutilating surgery. Gender ideology is being reinforced everywhere online and, indeed, in our public services and education system, but it is anything but progressive. It attempts to stuff dysphoric or gender non-conforming young people into antiquated, regressive boxes of what a woman is and what a man is, and it takes no account of the fact that it is fine to be a butch or feminine lesbian, a femboy or a boy next door, an old duffer like me, an elite gay sportsman or woman, or anything in between.

18:36
Transitioning will be right for some, but accelerating young people into an affirmative model is absolutely reckless. What do those who perpetuate this myth want to achieve? What is in it for them? Those are fundamental questions that we have to ask. The reality is that the affirmative model is the true conversion therapy—trans-ing away the gay and nullifying same-sex attraction.
I urge all right hon. and hon. Members to watch the four-part documentary “Dysphoric” on YouTube. It is so powerful and shows the growing number of young people who have been transitioned rapidly into those services, and the pain, torment and regret that they have experienced through the irreversible effects of their surgery and treatments. The de-transitioners are bearing the impacts. There is no follow-up to such services, and those people are just left to get on with it. Quite often, their friends in the trans community completely abandon them when they detransition.
I pay particular tribute to Sinead Watson and Ritchie Herron, who are both de-transitioners, for their courage and absolutely incredible resilience in dealing with this issue online and shining a light on this outrage. I also pay tribute to the LGB Alliance, For Women Scotland, and Sex Matters, which have done a huge amount of work to bring this matter to the fore.
Mermaids—the organisation—continues to deny that there is any harm, co-morbidities or serious iatrogenic impacts from hormone treatment or radical surgery. That is a lie; it is not true. Mermaids has promoted the illegal availability of online medicines that do lasting, irreversible damage to young people.
I pay tribute to the Government for the Cass review, which is beginning to shine a light on the matter. I welcome the interim report, but we as legislators must make a connection between what is happening online, how it is policed in society and the message that is given out there. We must link harm to online forums and organisations, as well as to frontline services.
I point out with real regret that I came across a document being distributed through King’s College Hospital NHS Foundation Trust from an organisation called CliniQ, which runs an NHS clinic for the trans community. The document has lots of important safety and health advice, but it normalises self-harm as sexual
“Play that involves blood, cutting and piercing.”
It advises that trans-identifying females can go in
“stealth if it is possible for them”
to private gay clubs, and gives examples of how to obtain sex by deception. It is unacceptable that such information is provided on NHS grounds.
Speaking out about this in Scotland has been a very painful experience for many of us. We have faced doxing, threats, harassment and vilification. In 2019, I raised my concerns about safeguarding with my colleagues in Government. A paper I wrote had this simple message: women are not being listened to in the gender recognition reform debate. I approached the then Cabinet Secretary for Social Security and Older People, Shirley-Anne Somerville, whose brief included equality. She was someone I had known for years and considered a friend; she knew my professional background, my family and, of course, my children. She told me that she shared my concerns—she has children of her own—but she instructed me to be silent. She personally threatened and attempted to bully friends of mine, insisting that they abandon me. I pay great tribute to Danny Stone and the Antisemitism Policy Trust for their support in guiding me through what was an incredibly difficult period of my life. I also pay tribute to the hon. Member for Brigg and Goole (Andrew Percy).
I can see that you are anxious for me to close, Madam Deputy Speaker, so I will—[Interruption.] I will chance my arm a bit further, then.
I am not on my pity pot here; this is not about me. It is happening all over Scotland. Women in work are being forced out of employment. If Governments north and south of the border are to tackle online harms, we must follow through with responsible legislation. Only last week, the First Minister of Scotland, who denied any validity to the concerns I raised in 2019, eventually admitted they were true. But her response must be to halt her premature and misguided legislation, which is without any protection for the trans community, women or girls. We must make the connection from online harms all the way through to meaningful legislation at every stage.
Dame Maria Miller

I rise to speak to the seven new clauses in my name and those of right hon. and hon. Members from across the House. The Government have kindly said publicly that they are minded to listen to six of the seven amendments that I have tabled on Report. I hope they will listen to the seventh, too, once they have heard my compelling arguments.

First, I believe it is important that we discuss these amendments, because the Government have not yet tabled amendments. It is important that we in this place understand the Government’s true intention on implementing the Law Commission review in full before the Bill completes its consideration.

Secondly, the law simply does not properly recognise as a criminal offence the posting online of intimate images—whether real or fake—without consent. Victims say that having a sexual image of them posted online without their consent is akin to a sexual assault. Indeed, Clare McGlynn went even further by saying that there is a big difference between a physical sexual assault and one committed online: victims are always rediscovering the online images and waiting for them to be redistributed, and cannot see when the abuse will be over. In many ways, it is even more acute.

Just in case anybody in the Chamber is unaware of the scale of the problem after the various contributions that have been made, in the past five years more than 12,000 people reported to the revenge porn helpline almost 200,000 pieces of content that fall into that category. Indeed, since 2014 there have been 28,000 reports to the police of intimate images being distributed without consent.

The final reason why I believe it is important that we discuss the new clauses is that Ofcom will be regulating online platforms based on their adherence to the criminal law, among other things. It is so important that the criminal law actually recognises where criminal harm is done, but at the moment, when it comes to intimate image abuse, it does not. Throughout all the stages of the Bill’s passage, successive Ministers have said very positive things to me about the need to address this issue in the criminal law, but we still have not seen pen being put to paper, so I hope the Minister will forgive me for raising this yet again so that he can respond.

New clauses 45 to 50 simply seek to take the Law Commission’s recommendations on intimate image abuse and put them into law as far as the scope of the Bill will allow. New clause 45 would create a base offence for posting explicit images online without consent. Basing the offence on consent, or the lack of it, makes it comparable with three out of four offences already recognised in the Sexual Offences Act 2003. Subsection (10) of the new clause recognises that it is a criminal offence to distribute fake images, deepfakes or images using nudification software, which are currently not covered in law at all.

New clauses 46 and 47 recognise cases where there is a higher level of culpability for the perpetrator, where they intend to cause alarm, distress or humiliation. Two in three victims report that they know the perpetrator as a current or former partner. In evidence to the Public Bill Committee, on which I was very pleased to serve, we heard from the Angelou Centre and Imkaan that some survivors of this dreadful form of abuse are also at risk of honour-based violence. There are yet more layers of abuse.

New clause 48 would make it a crime to threaten to share an intimate image—this can be just as psychologically destructive as actually sharing it—and to use the image to coerce, control or manipulate the victim. I pay real tribute to the team from the Law Commission, under the leadership of Penney Lewis, who did an amazing job of work over three years on their inquiry to collect this information. In the responses to the inquiry there were four mentions of suicide or contemplated suicide as a result of threats to share these sorts of images online without consent. Around one in seven young women and one in nine young men have experienced a threat to share an intimate or sexual image. One in four calls to the Revenge Porn Helpline relate to threats to share. The list of issues goes on. In 2020 almost 3,000 people, mostly men, received demands for money related to sexual images—“sextortion”, as it is called. This new clause would make it clear that such threats are criminal, the police need to take action and there will be proper protection for victims in law.

New clauses 49 and 50 would go further. The Law Commission is clear that intimate image abuse is a type of sexual offending. Therefore, victims should have the same protection afforded to those of other sexual offences. That is backed up by the legal committee of the Council of His Majesty’s District Judges, which argues that it is appropriate to extend automatic lifetime anonymity protections to victims, just as they would be extended to victims of offences under the Modern Slavery Act 2015. Women’s Aid underlined that point, recognising that black and minoritised women are also at risk of being disowned, ostracised or even killed if they cannot remain anonymous. The special measures in these new clauses provide for victims in the same way as the Domestic Abuse Act 2021.

I hope that my hon. Friend the Minister can confirm that the Government intend to introduce the Law Commission’s full recommendations into the Bill, and that those in scope will be included before the Bill reaches its next stage in the other place. I also hope that he will outline how those measures not in scope of the Bill—specifically on the taking and making of sexual images without consent, which formed part of the Law Commission’s recommendations—will be addressed in legislation swiftly. I will be happy to withdraw my new clauses if those undertakings are made today.

Finally, new clause 23, which also stands in my name, is separate from the Law Commission’s recommendations. It would require a proportion of the fines secured by Ofcom to be used to fund victims’ services. I am sure that the Treasury thinks that it is an innovative way of handling things, although one could argue that it did something similar only a few days ago with regard to the pollution of waterways by water companies. I am sure that the Minister might want to refer to that.

The Bill identifies as crimes many thousands more offences than are currently recognised within law. I hope that the Minister can outline how appropriate measures will be put in place to ensure support for victims, who will now, possibly for the first time, have some measures in place to assist them. I raised earlier the importance of keeping the Bill and its effectiveness under review. I hope that the House will think about how we do that materially, so we do not end up having another five or 10 years without such a Bill and having to play catch-up in such a complex area.

18:45
Matt Rodda (Reading East) (Lab)

I am grateful to have the opportunity to speak in this debate. I commend the right hon. Member for Basingstoke (Dame Maria Miller) on her work in this important area. I would like to focus my remarks on legal but harmful content and its relationship to knife crime, and to mention a very harrowing and difficult constituency case of mine. As we have heard, legal but harmful content can have a truly dreadful effect. I pay tribute to the families of the children who have been lost, who have attended the debate, a number of whom are still in the Public Gallery.

Madam Deputy Speaker (Dame Rosie Winterton)

Just to be clear, the hon. Gentleman’s speech must relate to the amendments before us today.

Matt Rodda

Thank you, Madam Deputy Speaker. A boy called Olly Stephens in my constituency was just 13 years old when he was stabbed and brutally murdered in an attack linked to online bullying. He died, sadly, very near his home. His parents had little idea of the social media activity in his life. It is impossible to imagine what they have been through. Our hearts go out to them.

Legal but harmful content played a terrible part in the attack on Olly. The two boys who attacked and stabbed him had been sharing enormous numbers of pictures and videos of knives, repeatedly, over a long period of time. There were often videos of teenagers playing with knives, waving them or holding them. They circulated them on 11 different social media platforms. None of those platforms took any action to remove the content. We all need to learn more about such cases to fully understand the impact of legal but harmful content. Even at this late stage, I hope that the Government will think again about the changes they have made and include this area in the Bill once more.

There is a second aspect of this very difficult case that I want to mention: the fact that Olly’s murder was discussed on social media and was planned to some extent beforehand. The wider issues here underline the need for far greater regulation and moderation of social media, in particular teenagers’ use of these powerful sites. I am finding it difficult to talk about some of these matters, but I hope that the Government will take my points on board and address the issue of legal but harmful content, and that the Minister will think again about these important matters. Perhaps we will have an opportunity to discuss it in the Bill’s later stages.

Adam Afriyie

I am pleased to follow my fairly close neighbour from Berkshire, the hon. Member for Reading East (Matt Rodda). He raised the issue of legal but harmful content, which I will come to, as I address some of the amendments before us.

I very much welcome the new shape and focus of the Bill. Our primary duty in this place has to be to protect children, above almost all else. The refocusing of the Bill certainly does that, and it is now in a position where hon. Members from all political parties recognise that it is so close to fulfilling its function that we want it to get through this place as quickly as possible with today’s amendments and those that are forthcoming in the Lords and elsewhere in future weeks.

The emerging piece of legislation is better and more streamlined. I will come on to further points about “legal but harmful”, but I am pleased to see that removed from the Bill for adults, and I will explain why, given the sensitive case that the hon. Member for Reading East mentioned. The information that he talked about being published online should be illegal, so it would be covered by the Bill. Illegal information should not be published and, within the framework of the Bill, would be taken down quickly. We in this place should not shirk our responsibilities; we should make illegal the things that we and our constituents believe to be deeply harmful. If we are not prepared to do that, we cannot say that some other third party has a responsibility to do it on our behalf, that we will have nothing to do with it, and that they can begin to make the rules, whether they are a commercial company or a regulator without those specific powers.

I welcome the shape of the Bill, but some great new clauses have been tabled. New clause 16 suggests that we should make it an offence to encourage self-harm, which is fantastic. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) has indicated that he will not press it to a vote, because the Government and all of us acknowledge that that needs to be dealt with at some point, so hopefully an amendment will be forthcoming in the near future.

On new clause 23, it is clear that if a commercial company is perpetrating an illegal act or is causing harm, it should pay for it, and a proportion of that payment must certainly support the payments to victims of that crime or breach of the regulations. New clauses 45 to 50 have been articulately discussed by my right hon. Friend the Member for Basingstoke (Dame Maria Miller). The technology around revenge pornography and deepfakes is moving forward every day. With some of the fakes online today, it is not possible to tell that they are fakes, even if they are looked at under a microscope. Those areas need to be dealt with, but it is welcome that she will not necessarily press the new clauses to a vote, because those matters must be picked up and defined in primary legislation as criminal acts. There will then be no lack of clarity and we will not need the legal but harmful concept—that will not need to exist. Something will either be illegal, because it is harmful, or not.

The Bill is great because it provides a framework that enables everything else that hon. Members in the House and people across the country may want to be enacted at a future date. It also enables the power to make those judgments to remain with this House—the democratically elected representatives of the people—rather than some grey bureaucratic body or commercial company whose primary interest is rightly to make vast sums of money for its shareholders. It is not for them to decide; it is for us to decide what is legal and what should be allowed to be viewed in public.

On amendment 152, which interacts with new clause 11, I was in the IT industry for about 15 to 20 years before coming to this place, albeit with a previous generation of technology. When it comes to end-to-end encryption, I am reminded of King Canute, who said, “I’m going to pass a law so that the tide doesn’t come in.” Frankly, we cannot pass a law that bans mathematics, which is effectively what we would be trying to do if we tried to ban encryption. The nefarious types or evildoers who want to hide their criminal activity will simply use mathematics to do that, whether in mainstream social media companies or through a nefarious route. We have to be careful about getting rid of all the benefits of secure end-to-end encryption for democracy, safety and protection from domestic abuse—all the good things that we want in society—on the basis of a tiny minority of very bad people who need to be caught. We should not be seeking to ban encryption; we should be seeking to catch those criminals, and there are ways of doing so.

I welcome the Bill; I am pleased with the new approach and I think it can pass through this House swiftly if we stick together and make the amendments that we need. I have had conversations with the Minister about what I am asking for today: I am looking for an assurance that the Government will enable further debate and table the amendments that they have suggested. I also hope that they will be humble, as my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said, and open to some minor adjustments, even to the current thinking, to make the Bill pass smoothly through the Commons and the Lords.

I would like the Government to confirm that it is part of their vision that it will be this place, not a Minister of State, that decides every year—or perhaps every few months, because technology moves quickly—what new offences need to be identified in law. That will mean that Ofcom and the criminal justice system can get on to that quickly to ensure that the online world is a safer place for our children and a more pleasant place for all of us.

Several hon. Members rose—

Madam Deputy Speaker (Dame Rosie Winterton)

Order. Just a quick reminder: I know it is extremely difficult, and I do not want to interrupt hon. Members when they are making their speeches, but it is important that we try to address the amendments that are before us today. There will be a separate debate on whether to recommit the Bill and on the other ideas, so they can be addressed at that point. As I say, it is important to relate remarks to the amendments that are before us.

Kim Leadbeater (Batley and Spen) (Lab)

I apologise for having left the debate for a short time; I had committed to speaking to a room full of young people about the importance of political education, which felt like the right thing to do, given the nature of the debate and the impact that the Bill will have on our young people.

I am extremely relieved that we are continuing to debate the Bill, despite the considerable delays that we have seen; as I mentioned in this House previously, it is long overdue. I acknowledge that it is still groundbreaking in its scope and extremely important, but we must now ensure that it works, particularly for children and vulnerable adults, and that it goes some way to cleaning up the internet for everyone by putting users first and holding platforms to account.

On new clause 53, I put on record my thanks to the Government for following through with their commitments to me in Committee to write Zach’s law in full into the Bill. My constituent Zach Eagling and his mum Clare came into Parliament a few weeks ago, and I know that hon. Members from both sides of the House were pleased to meet him to thank him for his incredible campaign to make the vile practice of epilepsy trolling completely illegal, with a maximum penalty of a five-year prison sentence. The inspirational Zach, his mum and the Epilepsy Society deserve enormous praise and credit for their incredible campaign, which will now protect the 600,000 people living with epilepsy in the UK. I am delighted to report that Zach and his mum have texted me to thank all hon. Members for their work on that.

I will raise three areas of particular concern with the parts of the Bill that we are focusing on. First, on director liability, the Bill includes stiff financial penalties for platforms, which I hope will force them to comply with these regulations, but until the directors of these companies are liable and accountable for ensuring that their platforms comply and treat the subject with the seriousness it requires, I do not believe that we will see the action needed to protect children and all internet users.

Ultimately, if platforms enforce their own terms and conditions, remove illegal content and comply with the legal but harmful regulations—as they consistently tell us that they will—they have nothing to worry about. When we hear the stories of harm committed online, however, and when we hear from the victims and their families about the devastation that it causes, we must be absolutely watertight in ensuring that those who manage and operate the platforms take every possible step to protect every user on their platform.

We must ensure that, to the directors of those companies, this is a personal commitment as part of their role and responsibility. As we saw with health and safety regulations, direct liability is the most effective way to ensure that companies implement such measures and are scrupulous in reviewing them. That is why I support new clause 17 and thank my right hon. Friend the Member for Barking (Dame Margaret Hodge) for her tireless and invaluable work on this subject.

Let me turn to media literacy—a subject that I raised repeatedly in Committee. I am deeply disappointed that the Government have removed the media literacy duty that they previously committed to introducing. Platforms can boast of all the safety tools they have to protect users, talk about them in meetings, publicise them in press releases and defend them during Committee hearings, but unless users know that they are there and know exactly how to use them, and unless they are being used, their existence is pointless.

19:04
Ofcom recently found that more than a third of children aged eight to 17 said they had seen something “worrying or nasty” online in the past 12 months, but only a third of children knew how to use online reporting or flagging functions. Among adults, a third of internet users were unaware of the potential for inaccurate or biased information online, and just over a third made no appropriate checks before registering their personal details online. Clearly, far more needs to be done to ensure that internet users of all ages are aware of online dangers and of the tools available to keep them safe.
Although programmes such as Google’s “Be Internet Legends” assemblies are a great resource in schools—I was pleased to visit one at Park Road Junior Infant and Nursery School in Batley recently—we cannot rely on platforms to do this themselves. We have had public information campaigns on the importance of wearing seatbelts, and on the dangers of drink-driving and smoking, and the digital world is now one of the largest dangers most people face in their daily lives. The public sector clearly has a role to warn of the dangers and promote healthy digital habits.
Let me give one example from the territory of legal but harmful content, which members have spoken about as opaque, challenging and thorny. I agree with all those comments, but if platforms have a tool within them that switches off legal but harmful content, it strikes me as incredibly important that users know what that tool does—that is, they know what information they may be subjected to if it is switched on, and they know exactly how to turn it off. Yet I have heard nothing from the Government since their announcement last week that suggests they will be taking steps to ensure that this tool is easily accessible to users of all ages and digital abilities, and that is exactly why there is a need for a proper digital media literacy strategy.
I therefore support new clauses 29 and 30, tabled by my colleagues in the SNP, which would empower Ofcom to publish a strategy at least every three years that sets out the measures it is taking to promote media literacy among the public, including through educational initiatives and by ensuring that platforms take the steps needed to make their users aware of online safety tools.
Finally, I turn to the categorisation of platforms under part 7 of the Bill. I feel extremely strongly about this subject and agree with many comments made by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). The categorisation system listed in the Bill is not fit for purpose. I appreciate that categorisation is largely covered in part 3 and schedule 10, but amendment 159, which we will be discussing in Committee, and new clause 1, which we are discussing today, are important steps towards addressing the Government’s implausible position—that the size of a platform equates to the level of risk. As a number of witnesses stated in Committee, that is simply not the case.
It is completely irresponsible and narrow-minded to believe that there are no blind spots in which small, high-risk platforms can fester. I speak in particular about platforms relating to dangerous, extremist content—be it Islamist, right-wing, incel or any other. These platforms, which may fall out of the scope of the Bill, will be allowed to continue to host extremist individuals and organisations, and their deeply dangerous material. I hope the Government will urgently reconsider that approach, as it risks inadvertently pushing people, including young people, towards greater harm online—whether to individuals or to society as a whole.
Although I am pleased that the Bill is back before us today, I am disappointed that aspects have been weakened since we last considered it, and urge the Government to consider closely some proposals we will vote on this evening, which would go a considerable way to ensuring that the online world is a safer place for children and adults, works in the interests of users, and holds platforms accountable and responsible for protecting us all online.
John Penrose

It is a pleasure to follow Zach’s MP, the hon. Member for Batley and Spen (Kim Leadbeater). I particularly want to pick up on her final comments about the difficulties of platforms—not just small platforms, but larger ones—hosting extremist content, be it incels, the alt-right, the radical left or any other kind.

I will speak to my new clauses 34 and 35, which seek to deal with both disinformation and misinformation. They are important amendments, because although the Bill has taken huge steps forward—we are led to believe that it may take a couple more in due course when the revised version comes back if the recommittal is passed—there are still whole categories of harm that it does not yet address. In particular, it focuses, rightly and understandably, on individual harms to children and illegal activities as they relate to adults, but it does not yet deal with anything to do with collective harms to our society and our democracy, which matter too.

We have heard from former journalists in this debate. Journalists know it takes time and money to come up with a properly researched, authoritatively correct, accurate piece of journalism, but it takes a fraction of that time and cost to invent a lie. A lie will get halfway around the world before the truth has got its boots on, as the saying rightly goes. Incidentally, the hon. Member for Rotherham (Sarah Champion) said that it is wonderful that we are all learning so much. I share that sentiment; it is marvellous that we are all comparing and sharing our particular areas of expertise.

One person who seems to have all areas of expertise under his belt is my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who chaired the Joint Committee. He rightly pointed out that this is a systems Bill, and it therefore deals with trying to prevent some things from happening—and yet it is completely silent on misinformation and disinformation, and their effect on us collectively, as a society and a democracy. New clauses 34 and 35 are an attempt to begin to address those collective harms alongside some individual harms we face. One of them deals with a duty of balance; the other deals with factual accuracy.

The duty of balance is an attempt to address the problem as it relates to filter bubbles, because this is a systems Bill and because each of us has a tailored filter bubble, by which each of the major platforms, and some of the minor ones, work out what we are interested in and feed us more of the same. That is fine for people who are interested in fishing tackle; that is super. But if someone is interested in incels and they get fed more and more incel stuff, or they are vaguely left wing and get taken down a rabbit hole into the increasingly radical left—or alternatively alt-right, religious extremism or whatever it may be—pretty soon they get into echo chambers, and from echo chambers they get into radicalisation, and from radicalisation they can pretty soon end up in some very murky, dark and deep waters.

There are existing rules for other old-world broadcasters; the BBC, ITV and all the other established broadcasters have duties of balance and rules on undue prominence imposed on them by Ofcom. My argument is that we should consider ways to impose a similar duty of balance on the people who put together the programs that create our own individual filter bubbles, so that when someone is shown an awful lot of stuff about incels, or alt-right or radical left politics, somewhere in that filter bubble they will be sent something saying, “You do know that this is only part of the argument, don’t you? Do you know that there is another side to this? Here’s the alternative; here’s the balancing point.” We are not doing that at the moment, which is one of the reasons why we have an increasingly divided societal and political debate, and why our public square as a society is becoming increasingly fractious—and dangerous, in some cases. New clause 35 would fix that particular problem.

New clause 34 would deal with the other point—the fact that a lie will get halfway around the world before the truth has got its boots on. It tries to deal with factual accuracy. Factual accuracy is not quite the same thing as truth. Truth is an altogether larger and more philosophical concept to get one’s head around. It is how we string together accurate and correct facts to create a narrative or an explanation. Factual accuracy is an essential building block for truth. We must at least try to ensure that we can all see when someone has made something up or invented something, whether it is that bleach is a good way to cure covid or whatever. When somebody makes something up, we need to know and it needs to be clear. In many cases that is clear, but in many cases, if it is a plausible lie, a deepfake or whatever it may be, it is not clear. We need to be able to see that easily, quickly and immediately, and say, “I can discount this, because I know that the person producing it is a serial liar and tells huge great big porkies, and I shouldn’t be trusting what they are sending me, or I can see that the actual item itself is clearly made up.”

The duty of achieving balance already exists in rules and law in other parts of our society and is tried and tested—it has served us very well and done a good job for us for 40 or 50 years, since TV and radio became ubiquitous—and the same is true, although not for quite such a long time, of factual accuracy. There are increasingly good methods of checking the factual accuracy of individual bits of content and, if necessary, in some cases of doing so in real time too. For example, Adobe is leading a very large global grouping producing something called the Content Authenticity Initiative, which can tell whether something is a deepfake, because it has an audit trail of where the image, the item or whatever it may be came from and how it has been updated, modified or changed during the course of its life.

Dean Russell

On that point, I want to raise the work that my hon. Friend the Member for Bosworth (Dr Evans), who is not in the Chamber at the moment, has done on body image. When images are photoshopped and changed to present an idea of beauty that is very different from what is possible in the real world, that falls very much within this question of truth. What are my hon. Friend’s thoughts on that point?

John Penrose

Addressing that is absolutely essential. That goes for any of the deepfake examples we have heard about, including from my right hon. Friend the Member for Basingstoke (Dame Maria Miller). The whole point about a deepfake is that it is hard to tell, but if we can see that something has been changed, we can easily say, “I know that is not right, I know that is not true, I know that is false, and I can aim away from it and treat it accordingly”.

Just to make sure that everybody understands, this is not some piece of new tech magic; it is already established. Adobe, as I have said, is doing it with the Content Authenticity Initiative, which is widely backed by other very serious tech firms. Others in the journalism world are doing the same thing, with the Journalism Trust Initiative. There is NewsGuard, which produces trust ratings; the Trust Project, which produces trust indicators; and we of course have our own press regulators in this country, the Independent Press Standards Organisation and IMPRESS.

I urge the Government and all here present not to be satisfied with where this Bill stands now. We have all heard how it can be improved. We have all heard that this is a new, groundbreaking and difficult area in which many other countries have not even got as far as we have, but we should not be in any way satisfied with where we are now. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said earlier that we need to approach this Bill in a spirit of being humble, and this is an area in which humility is absolutely essential. I hope all of us realise how much further we have to go, and I hope the Minister will say how he proposes to address these important and so far uncovered issues in due course.

Liz Twist (Blaydon) (Lab)

I wish to address new clauses 16 and 28 to 30, and perhaps make a few passing comments on some others along the way. Many others who, like me, were in the Chamber for the start of the debate will I suspect feel like a broken record, because we keep revisiting the same issues and raising the same points again and again, and I am going to do exactly that.

First, I will speak about new clause 16, which would create a new offence of encouraging or assisting serious self-harm. I do so because I chair the all-party parliamentary group on suicide and self-harm prevention, which has done a good deal of work looking at self-harm and young people over the last two years. We know that suicide is the leading cause of death in men aged under 50 and females aged under 35, with the latest available figures confirming that 5,583 people in England and Wales tragically took their own lives in 2021. We know that self-harm is a strong risk factor for future suicidal ideation, so it is really important that we tackle this issue.

The internet can be an invaluable and very supportive place for some people, who are given the opportunity to access support, but for others it can be harmful. What they see may include content that encourages, maintains or exacerbates self-harm and suicidal behaviours. Detailed information about methods can also increase the likelihood of imitative and copycat suicides, with risks such as contagion effects also present in the online environment.

Richard Burgon (Leeds East) (Lab)

I pay tribute to my hon. Friend for the work she has done. She will be aware of the case of my constituent Joe Nihill, who at the age of 23 took his own life after accessing suicide-related material on the internet. Of course, we fully support new clause 16 and amendment 159. A lot of content about suicide is harmful but not illegal, so does my hon. Friend agree that what we really need is an assurance from the Minister that, when this Bill comes back, it will include protections to ensure that adults such as Joe and adults accessing these materials through smaller platforms get the protection they really need?

Liz Twist

I thank my hon. Friend for those comments, and I most definitely agree with him. One of the points we should not lose sight of is that his constituent was 23 years of age—not a child, but still liable to be influenced by the material on the internet. That is one of the points we need to take forward.

It is really important that we look at the new self-harm offence to make sure that this issue is addressed. That is something that the Samaritans, which I work with, has been campaigning for. The Government have said they will create a new offence, which we will discuss at a future date, but there is real concern that we need to address this issue as soon as possible through new clause 16. I ask the Minister to comment on that so that we can deal with the issue of self-harm straightaway.

I now want to talk about internet and media literacy in relation to new clauses 29 and 30. YoungMinds, which works with young people, is supported by the Royal College of Psychiatrists, the British Psychological Society and the Mental Health Foundation in its proposals to promote the public’s media literacy for both regulated user-to-user services and search services, and to create a strategy to do this. Young people, when asked by YoungMinds what they thought, said they wanted the Online Safety Bill to include a requirement for such initiatives. YoungMinds also found that young people were frustrated by very broad, generalised and outdated messages, and that they want much more nuanced information—not generalised fearmongering, but practical ways in which they can address the issue. I do hope that the Government will take that on board, because if people are to be protected, it is important that we have a more sophisticated media literacy than is reflected in the broad messages we sometimes get at present.

On new clause 28, I do believe there is a need for advocacy services, supported by the Government, to assist young people—not to take responsibility away from them, but to support and protect them. I want to make two other points. I see that the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) has left the Chamber again, but he raised an interesting and important point about the size of platforms covered by the Bill. I believe the Bill needs to cover those smaller or specialised platforms that people might have been pushed on to by changes to the larger platforms. I hope the Government will address that important issue in future, together with the issue of age, so that protection does not stop just with children, and we ensure that others who may have vulnerabilities are also protected.

I will not talk about “legal but harmful” because that is not for today, but there is a lot of concern about those provisions, which we thought were sorted out and agreed on, suddenly being changed. There is a lot of trepidation about what might come in future, and the Minister must understand that we will be looking closely at any proposed changes.

We have been talking about this issue for many years—indeed, since I first came to the House—and during the debate I saw several former Ministers and Secretaries of State with whom I have raised these issues. It is about time that we passed the Bill. People out there, including young people, are concerned and affected by these issues. The internet and social media are not going to stop because we want to make the Bill perfect. We must ensure that we have something in place. The legislation might be capable of revision in future, but we need it now for the sake of our young people and other vulnerable people who are accessing online information.

Suzanne Webb

This is the first time I have been able to speak in the Chamber for some time, due to a certain role I had that prevented me from speaking in here. It is an absolute honour and privilege, on my first outing in some time, to have the opportunity to speak specifically to new clause 53, which is Zach’s law. I am delighted and thrilled that the Government are supporting Zach’s law. I have supported it for more than two years, together with my hon. Friend the Member for Watford (Dean Russell). We heard during the Joint Committee on the Draft Online Safety Bill how those who suffer from epilepsy were sent flashing images on social media by vile trolls. Zach Eagling, whom the law is named after, also has cerebral palsy, and he was one of those people. He was sent flashing images after he took part in a charity walk around his garden. He was only nine years of age.

Zach is inspirational. He is selflessly making a massive difference, and the new clause is world-leading. It is down to Zach, his mum, the UK Epilepsy Society and, of course, the Government that I am able to stand here to talk about new clause 53. I believe that the UK Epilepsy Society is the only charity in the world to have changed the law on any policy area, and that change is new clause 53, which is pretty ground-breaking. I say thank you to Zach and the Epilepsy Society, who ensured that I and my hon. Friend the Member for Watford stepped up and played our part in that.

Being on the Joint Committee on the Draft Online Safety Bill was an absolute privilege, under the excellent chairmanship of my hon. Friend the Member for Folkestone and Hythe (Damian Collins). People have been talking about a Committee to accompany the Bill, which would be an incredibly good thing. In the Joint Committee we talked about exactly that: we should follow the Bill through all its stages, and once it is on the statute book, to ensure that it keeps up with the tech companies. The Joint Committee was brought together with a focus on assembling the right mix of skills. I am a technological luddite, but I brought my skills and understanding of audit and governance. My hon. Friend the Member for Watford brought technology and all his experience from his previous day job. As a result, we had a better Bill because we had a mix of experience and shared our expertise.

This Bill is truly world leading. New clause 53 is one small part of that, but it will make a huge difference to many lives, including, I believe, the 600,000 people who suffer from epilepsy. The simple reality is that the big tech companies can do better and need to step up. I have always said that we do not actually need the Bill or these amendments; what we need is for the tech companies to do what they are supposed to do and regulate their own consumer product. I have always strongly believed that.

During my time on the Committee I learned that we must follow the money—that is what it is all about for the tech companies. We have been listening to horrific stories from grieving parents, some of whom I met briefly, and from those who have suffered at the hands of racism, abuse and threats—the list is endless. The tech companies could stop that now. They do not need the Bill to do it, and they should do the right thing. We should not have to get the Bill on to the statute book to enforce what those companies should be doing in the first place. We keep saying that this issue has been going on for five years. The tech companies know that this has been talked about for five years, so why are they not doing something? For me, the Bill is for all those grieving families who have lost their beautiful children, those who have been at the mercy of keyboard warriors, and those who have been harmed or lost their lives because the tech companies could have done better but did not. This is about accountability. Where are the tech companies?

I wish to touch briefly on bereaved parents whose children have been at the mercy of technology and content. Many families have spent years and years still unable to understand their child’s death. We must consider imposing transparency on the tech companies. Those families cannot get their children back, but they are working hard to ensure that others do not lose theirs. Data should be given to coroners in the event of the death of a child so that the circumstances can be understood. It is important to ensure there is a swift and humane process for coroners to access information where there is reason to suspect that online content has played a part in a child’s death.

In conclusion, a huge hurrah that we have new clause 53, and I thank the Government for this ground-breaking Bill. An even bigger hurrah to Zach, Zach’s mum, and the brilliant Epilepsy Society, and, of course, to Zach’s law, which is new clause 53.

Jamie Stone

Clearly I am on my feet now because I am the Liberal Democrat DCMS spokesman, but many is the time in this place when I have probably erred on the side of painting a rosy picture of my part of the world—the highlands—where children can play among the heather and enjoy themselves, and life is safe and easy. This week just gone, I was pulled up short by two mothers I know who knew all about today’s debate. They asked whether I would be speaking. They told me of their deep concern for a youngster who is being bullied right now, to the point where she was overheard saying among her family that she doubted she would ever make the age of 21. I hope to God that that young person, whom I cannot name, is reached before we see the kind of tragedy we have heard about already today. Something like that doesn’t half put a shadow in front of the sun, and a cold hand on one’s heart. That is why we are here today: we are all singing off the same sheet.

The Liberal Democrats back new clause 17 in the name of the right hon. Member for Barking (Dame Margaret Hodge). Fundamental to being British is a sense of fair play, and a notion that the boss or bosses should carry the can at the end of the day. It should not be beyond the wit of man to do exactly what the right hon. Lady suggested, and nobble those who ultimately hold responsibility for some of this. We are pretty strong on that point.

Having said all that, there is good stuff in the Bill. Obviously, it has been held up by the Government—or Governments, plural—which is regrettable, but it is easy to be clever after the fact. There is much in the Bill, and hopefully the delay is behind us. It has been chaotic, but we are pleased with the direction in which we are heading at the moment.

I have three or four specific points. My party welcomes the move to expand existing offences on sharing intimate images of someone to include images that are created digitally, known as deepfakes. We also warmly welcome the move to create a new criminal offence of assisting or encouraging self-harm online, although I ask the Government for more detail on that as soon as possible. Thirdly, as others have mentioned, the proposed implementation of Zach’s law will make it illegal to send flashing images that target people with epilepsy.

If the pandemic taught me one thing, it was that “media-savvy” is not me. Without my young staff who helped me during that period, it would have been completely beyond my capability to Zoom three times in one week. Not everyone out there has the assistance of able young people, which I had, and I am very grateful for that. One point that I have made before is that we would like to see specific objectives—perhaps delivered by Ofcom as a specific duty—on making more people media savvy. I extol to the House the virtues of new clause 37, tabled by my hon. Friend the Member for Twickenham (Munira Wilson). The more online savvy we can build through training, the better.

At the end of the day, the Bill is well intentioned and, as we have heard, it is essential that it makes a real impact. In the case of the young person I mentioned who is in a dark place right now, we must get it going pretty dashed quick.

Natalie Elphicke

I rise to speak to new clause 55, which stands in my name. I am grateful to my many right hon. and hon. Friends who have supported it, both by putting their name to it and otherwise. I welcome the Minister and his engagement with the new clause and hope to hear from him further as we move through the debate.

The new clause seeks to create a new criminal offence of intentionally sharing a photograph or film that facilitates or promotes modern slavery or illegal immigration. Members may have wondered how so many people—more than 44,000 this year alone—know who to contact to cross the channel, how to go about it and how much it will cost. Like any business, people smuggling relies on word of mouth, a shopfront or digital location on the internet, and advertising. As I will set out, in this context advertising is done not through an advert in the local paper but by posting a video and photos online.

People who use the channel crossing routes come from an astonishing array of countries—from Eritrea and Vietnam to Iraq and Iran—but they all end up arriving on boats that leave from France. Since May 2022, there has been a massive increase in the number of Albanians crossing the channel in small boats. From May to September this year, Albanian nationals comprised 42% of small boat crossings, with more than 11,000 Albanians arriving by small boat, compared with 815 in the entire previous year. It is little wonder that it is easy to find criminal gangs posting in Albanian on TikTok, with videos showing cheery migrants with their thumbs up scooting across the channel on dinghies and motoring into Britain with ease. Those videos have comments, which have been roughly translated as:

“At 8 o’clock the next departure, hurry to catch the road”;

“They passed again today! Get in touch today”;

“Get on the road today, serious escape within a day, not…a month in the forest like most”;

“The trips continue, contact us, we are the best and the fastest”;

and

“Every month, safe passage, hurry up”.

However, far from being safe, the small boat crossings are harmful, dangerous and connected with serious crime here in the UK, including modern slavery, the drugs trade and people trafficking.

With regard to the journey, there have been a number of deaths at sea. The Minister for Immigration recently stated that many people in processing centres

“present with severe burns that they have received through the combination of salty water and diesel fuel in the dinghies.”—[Official Report, 28 November 2022; Vol. 723, c. 683.]

That, of course, underlines why prevention, detection and interception of illegal entry is so important on our sea border. It also speaks to the harm and prevention of harm that my new clause seeks to address: to identify and disrupt the ability of those gangs to post on social media and put up photographs, thereby attracting new business, and communicate in relation to their illegal activity.

The National Crime Agency has identified links with the criminal drugs trade, modern slavery and other serious and violent crime. That is because illegal immigration and modern slavery offences do not just happen abroad. A criminal enterprise of this scale has a number of operators both here in the UK and abroad. That includes people here in the UK who pay for the transit of another. When they do, they do not generally have the good fortune of that other individual in mind. There are particular concerns about young people and unaccompanied children as well as people who find themselves in debt bondage in modern slavery.

That also includes people here in the UK who provide information, such as those TikTok videos, to a friend or contacts in a home country so that other people can make their own arrangements to travel. It includes people here in the UK who take photos of arrivals and post or message them to trigger success fees. Those fees, paid on evidence of arrival, are the method of transacting in this illegal enterprise, and they are thought to be responsible for some of the most terrifying experiences of people making the crossing, including even a pregnant woman and others being forced into boats at gunpoint and knifepoint in poor weather when they did not want to go, and parents separated from their children at the water’s edge, with their children taken and threatened to coerce them into complying.

Last year, 27 people died in the channel in a single day, in the worst small boat incident to date. A newspaper report about those deaths contains comment about a young man who died whose name was Pirot. His friend said of the arrangements for the journey:

“Typically…the smugglers made deals with families at home. Sometimes they turned up at the camp in masks. The crossing costs about £3,000 per person, with cash demanded in full once their loved one had made it to Dover. One of the Iraqi Kurdish smugglers who arranged Pirot’s crossing has since deleted his Facebook page and WhatsApp account”.

TikTok, WhatsApp and Facebook have all been identified as platforms actively used by the people smugglers. Action is needed in the Bill’s remit to protect people from people smugglers and save lives in the channel. The new offence would ensure that people here in the UK who promote illegal immigration and modern slavery face a stronger deterrent and, for the first time, real criminal penalties for their misdeeds. It would make it harder for the people smugglers to sell their wares. It would help to protect people who would be exploited and put at risk by those criminal gangs. The risk to life and injury, the risk of modern slavery, and the risks of being swept into further crime, both abroad and here in the UK, are very real.

The new offence would be another in the toolbox to tackle illegal immigration and prevent modern slavery. I hope that when the Minister makes his remarks, he may consider further expansion of other provisions currently in the Bill but outside the scope of our discussions, such as the schedule 7 priority offences. New clause 55 would tackle the TikTok traffickers and help prevent people from risking their lives by taking these journeys across the English channel.

Carla Lockhart (Upper Bann) (DUP)

I welcome the fact that we are here today to discuss the Bill. It has been a long haul, and we were often dubious as to whether we would see it progressing. The Government have done the right thing by progressing it, because ultimately, as each day passes, harm is being caused by the lack of regulation and enforcement. While some concerns have been addressed, many have not. To that end, this must be not the end but the beginning of a legislative framework that is fit for purpose; one that is agile and keeps up with the speed at which technology changes. For me, probably the biggest challenge for the House and the Government is not how we start but how we end on these issues.

Like many Members, I am quite conflicted when it comes to legal but harmful content. I know that is a debate for another day, but I will make one short point. I am aware of the concerns about free speech. As someone of faith, I am cognisant of the outrageous recent statement from the Crown Prosecution Service that it is “no longer appropriate” to quote certain parts of the Bible in public. I would have serious concerns about similar diktats and censorship being imposed by social media platforms on what are perfectly legitimate texts, and beliefs based on those texts. Of course, that is just one example, but it is a good example of why, given the ongoing war that some wage on certain beliefs and opinions, it would be unwise to bestow such policing powers on social media outlets.

When the Bill was first introduced, I made it very clear that it needed to be robust in its protection of children. In the time remaining, I wish to address some of the amendments that would strengthen the Bill in that regard, as well as the enforcement provisions.

New clause 16 is a very important amendment. None of us would wish to endure the pain of a child or loved one self-harming. Sadly, we have all been moved by the very personal accounts from victims’ families of the pain inflicted by self-harm. We cannot fathom what is in the mind of those who place such content on the internet. The right hon. Member for Haltemprice and Howden (Mr Davis) and those co-signing the new clause have produced a very considered and comprehensive text, dealing with all the issues in terms of intent, degree of harm and so on, so I fully endorse and welcome new clause 16.

Likewise, new clauses 45 and 46 would further strengthen the legislation by protecting children from the sharing of an intimate image without consent. Unfortunately, I have sat face to face—as I am sure many in this House have—with those who have been impacted by such cruel use of social media. The pain and humiliation it imposes on the victim is significant. It can cause scars that last a lifetime. While the content can be removed, the impact cannot be removed from the mind of the victim.

Finally, I make mention of new clause 53. Over recent months I have engaged with campaigners who champion the rights and welfare of those with epilepsy. Those with this condition need to be safe on the internet from those who specifically and callously target them because of it. We make this change knowing that such legislative protection will make them safer online. Special mention must once again go to young Zach, who has been the star in making this change. What an amazing campaign—one that says to society that no matter how young or old you are, you can bring about change in this House.

This is a milestone Bill. I believe it brings great progress in offering protections from online harm. I believe it can be further strengthened in areas such as pornography. We have only to consider that the British Board of Film Classification found that children as young as seven are coming across pornography online, with 51% of 11 to 13-year-olds having seen pornography at some point. That is damaging people’s mental health and their perception of what a healthy relationship should look and feel like. Ultimately, the Bill does not go far enough on that issue. It will be interesting to see how the other place deals with the Bill and makes changes to it. The day of the internet being the wild west, lawless for young and old, must end. I commend the Bill to the House.

Vicky Ford

It is great that the Bill is back in this Chamber. I have worked on it for many years, as have many others, during my time on the Science and Technology Committee and the Women and Equalities Committee, and as Children’s Minister. I just want to make three points.

First, I want to put on the record my support for the amendments tabled by my right hon. Friend the Member for Basingstoke (Dame Maria Miller). She is a true, right and honourable friend of women and girls all across the country. It is vital that women and girls are protected from intimate image abuse, from perverse and extreme pornography, and from controlling and coercive behaviour, and that we create a new offence to criminalise cyber-flashing.

Secondly, I want to talk about new clause 16 and self-harm, especially in relation to eating disorders. As I said in this place on Thursday, it is terrifying how many young people are suffering from anorexia today. The charity Beat estimates that 1.25 million people are suffering from eating disorders. A quarter of them are men; most are women. It also reminds us that anorexia is the biggest killer of all mental illnesses.

It is very hard to talk about one’s own experiences of mental illness. It brings back all the horrors. It makes people judge you differently. And you fear that people will become prejudiced against you. I buried my own experiences for nearly 40 years, but when I did speak out, I was contacted by so many sufferers and families, thanking me for having done so and saying it had brought them hope.

There may be many reasons why we have an increase in eating disorders, and I am sure that lockdown and the fears of the pandemic are a part of it, but I do remember from my own experience of anorexia 40 years ago how I had got it into my head that only by being ultra-thin could I be beautiful or valued. That is why images that glamorise self-harm, images that glamorise eating disorders, are so damaging. So it is really concerning to hear in recent surveys that more than one in four children have seen content about anorexia online. It is great that Ministers have promised that all children will be protected from self-harm, including eating disorders. When it comes to adults, however, I understand that Ministers may be considering an amendment similar to new clause 16 that would make it illegal to encourage self-harm online, but that it might not cover eating disorders, because they are just considering giving adults the right to opt out of seeing such content.
I was lucky that by the time I turned 18 years old I was over the worst of my anorexia, but when I look back at my teenage self, had I been 18 at the peak of my illness and had access to social media, I do not think I would have opted out of that content; I think I might have sought it out. It is incredibly important that the definition of self-harm absolutely recognises that eating disorders are a form of self-harm and are a killer.
My third point is that I welcome the measures to protect children from sexual abuse online, and I join my voice with all those who have thanked the Internet Watch Foundation. I have been honoured to be a champion of the foundation for over a decade. The work it does is so important and so brave. The Everyone’s Invited movement exposed the epidemic of sexual violence being suffered by young women and girls in our schools. As Children’s Minister at the time, I listened to its campaigners and learned from them how online pornography normalises sexual violence. There must be measures to prevent children from accessing all online porn. I was concerned when Barnardo’s contacted me recently to say that more needs to be done to address content that sexualises children in pornography. I hope the Minister will work closely with all the children’s charities, and with the wonderful Children’s Commissioner, as the Bill goes through the rest of its stages.
Jim Shannon

It is a pleasure to speak in the debate. I thank Members who have spoken thus far for their comments. I commend the right hon. Member for Chelmsford (Vicky Ford) for what she said about eating disorders. At this time, we are very aware of that pertinent issue: the impact that social media—the social pressure and the peer pressure—has on those who feel they are too fat when they are not, or that they are carrying weight when they are not. That is part of what the Bill tries to address. I thank the Minister for his very constructive comments—he is always constructive—and for laying out where we are. Some of us perhaps have concerns that the Bill does not go far enough. I know I am one of them, and maybe, Minister, you might be of the same mind yourself—

Jim Shannon

The Minister might be of the same mind himself.

Since I began speaking in these debates, my office has seen an increase in correspondence from parents who are thankful that these difficult issues are being talked about. The world is changing and progressing, and if we are going to live in a world where we want to protect our children and our grandchildren—I have six grandchildren—and all other grandchildren who are involved in social media, the least we can do is make sure they are safe.

I commend the hon. Member for Batley and Spen (Kim Leadbeater) and others, including the hon. Member for Watford (Dean Russell), who have spoken about Zach’s law. We are all greatly impressed that it is in the Bill, thanks to constructive lobbying. New clause 28, which the hon. Member for Rotherham (Sarah Champion) referred to, relates to advocacy for young people. That is an interesting idea, but I feel that advocacy should be for the parents first and not necessarily for young people.

Ahead of the debate, I was in contact with the Royal College of Psychiatrists. It published a report entitled “Technology use and the mental health of children and young people”—new clause 16 relates to that—which gave an overview of research into screen time and social media use by children and young teenagers. It has been concluded that excessive use of phones and social media by a young person is detrimental to their development and mental health—as we all know and as Members have said—and, furthermore, that online abuse and bullying have become more prevalent because of that. The right hon. Member for Witham (Priti Patel) referred to those who are susceptible to online harm. We meet them every day, and parents tell me that our concerns are real.

A recent report by NHS Digital found that one in eight 11 to 16-year-olds reported that they had been bullied online. When parents contact me, they say that bullying online is a key issue for them, and the statistics come from those who choose to be honest and talk about it. Although the Government’s role is to create a Bill that enables protection for our children, there is also an incredible role for schools, which can address bullying. My hon. Friend the Member for Upper Bann (Carla Lockhart) and I talked about some of the young people we know at school who have been bullied online. Schools have stepped in and stopped that, encouraging and protecting children, and they can play that role as well.

We have all read the story of Molly Russell, who was only 14 years old when she took her life. Nobody in this House or outside it could fail to have been moved by her story. Her father stated that he strongly believed that the images, videos and information that she was able to access through Instagram played a crucial part in her life being cut short. The Bill must complete its passage and focus on strengthening protections online for children. Ultimately, the responsibility is on large social media companies to ensure that harmful information is removed, but the Bill puts the onus on us to hold social media firms to account and to ensure that they do so.

Harmful and dangerous content for children comes in many forms, including online abuse and exposure to self-harm and suicide images. In addition, any inappropriate or sexual content has the potential to put children and young people at severe risk. The Bill is set to put provisions in place to protect victims of the sharing of nude or intimate photos. That is increasingly important for young people, who are potentially being groomed online and do not understand the full extent of what they are doing and the risks that come with it. Amendments have been tabled to ensure that, should such cases of photo sharing go to court, provisions are in place to ensure complete anonymity for the victims—for example, through video links in court and so on.

I commend the right hon. Member for Basingstoke (Dame Maria Miller), who is not in her place, for her hard work in bringing forward new clause 48. Northern Ireland, along with England and Wales, will benefit from new clause 53, and I welcome the ability to hand down sentences of between six months and potentially five years.

Almost a quarter of girls who have taken a naked image have had their image sent to someone else online without their permission. Girls face very distinct and increased risks on social media, with more than four in five online grooming crimes targeting girls, and 97% of child abuse material featuring the sexual abuse of girls—wow, we really need to do something to protect our children and to give parents hope. There needs to be increased emphasis and focus on making children’s use of the internet safer by design. Once established, all platforms and services need to have the capacity and capability to respond to emerging patterns of sexual abuse, which often stem from photo sharing.

The Minister referred to terrorism and how terrorism can be promoted online. I intervened on him to mention the glorification of IRA terrorism and how that encourages further acts of terrorism and people who are susceptible to be involved. I am quite encouraged by the Minister’s response, and I think that we need to take a significant step. Some in Northern Ireland, for instance, try to rewrite history and use the glorification of terrorism for that purpose. We would like to see strengthening of measures to ensure that those involved in those acts across Northern Ireland are controlled.

In conclusion, there are many aspects of the Bill that I can speak in support of in relation to the benefits of securing digital protections for those on social media. This is, of course, about protecting not just children, but all of us from the dangers of social media. I have chosen to speak on these issues as they are often raised by constituents. There are serious matters regarding the glorification and encouragement of self-harm that the Bill needs to address. We have heard stories tonight that are difficult to listen to, because they are true stories from people we know, and we have heard horror stories about intimate photo sharing online. I hope that action on those issues, along with the many others that the Government are addressing, will be embedded in the Bill with the intent to finally ensure that we have regulations and protection for all people, especially our children—I think of my children and grandchildren, and like everybody else, my constituents.

Dean Russell

I welcome the Minister to his place; I know that he will be excellent in this role, and it is incredible that he is so across the detail in such a short time.

I will primarily talk about new clause 53—that may not be that surprising, given how often it has been spoken about today—which is, ultimately, about Zach’s law. Zach is a truly heroic figure, as has been said. He is a young child with cerebral palsy, autism and epilepsy who was cruelly trolled by sick individuals who sent flashing images purposely to cause seizures and cause him damage. That was not unique to Zach, sadly; it happened to many people across the internet and social media. When somebody announced that they were looking for support, having been diagnosed with epilepsy, others would purposely identify that and target the person with flashing images to trigger seizures. That is absolutely despicable.

My hon. Friend the Member for Stourbridge (Suzanne Webb) has been my partner in crime—or in stopping the crime—over the past two years, and this has been a passion for us. Somebody said to me recently that we should perhaps do our victory lap in the Chamber today for the work that has been done to change the law, but Zach is the person who will get to go around and do that, as he did when he raised funds after he was first cruelly trolled.

My hon. Friend the Member for Folkestone and Hythe (Damian Collins) also deserves an awful lot of praise. My hon. Friend the Member for Stourbridge and I worked with him on the Joint Committee on the draft Online Safety Bill this time last year. It was incredible to work with Members of both Houses to look at how we can make the Bill better. I am pleased by the response to so many of the measures that we put forward, including our view that the phrase “legal but harmful” created too many grey areas that would not catch the people committing these awful acts—what I often consider to be crimes—online to cause harm.

I want to highlight some of what has been done over the past two years to get Zach’s law to this point. If I ever write a memoir, I am sure that my diaries will not be as controversial as some in the bookshops today, but I would like to dedicate a chapter to Zach’s law, because it has shown the power of one individual, Zach, to change things through the democratic process in this House, to change the law for the entire country and to protect people who are vulnerable.

Not only was Zach’s case raised in the Joint Committee’s discussions, but afterwards my hon. Friend the Member for Stourbridge and I managed to get all the tech companies together on Zoom—most people will probably not be aware of this—to look at making technical changes to stop flashing images being sent to people. There were lots of warm words: lots of effort was supposedly put in so that we would not need a law to stop flashing images. We had Giphy, Facebook, Google, Twitter—all these billion-pound platforms that can do anything they want, yet they could not stop flashing images being sent to vulnerable people. I am sorry, but that is not the work of people who really want to make a difference. Those are people who want to put profit over pain—people who want to ensure that they look after themselves before they look after the most vulnerable.

That is why the Bill is so important: because if the platforms will not do the right thing, we will. Hon. Members may disagree on some of the detail in the Bill, but most of that detail is there to stop the platforms doing the wrong thing. We should not have to force them into it, but we have come to the point where we will. I am sure that the measures in the Bill will go further than they would ever have wanted.
To repeat a phrase that I have used before in this Chamber, Andy Warhol used to talk about the era of 15 minutes of fame, but sadly through social media we now have 15 minutes of shame. People are hate-mobbed because they have a different point of view, or have images shared that they do not want shared, purely to cause them distress. The Bill will help to stop most of that.
As my hon. Friend the Member for Stourbridge says, a key issue is chasing the money. The truth is that a lot of online content is addictive, especially to young kids who scroll throughout the night, watching the next TikTok or reading the next message or the next post. They are trying to see the next piece of content that will give them some enjoyment or connect them to the real world. The platforms have put the “ad” into “addiction” and have caused harm by doing so, making profits they should not have made from the harm that they have done to children.
Ultimately, this debate is about making sure that the Bill is fit for purpose. I totally understand that many hon. Members across the Chamber want lots of changes and additions to it, but as we are coming up to Christmas, perhaps I can use a suitable analogy. We do not want a Christmas tree Bill with so many baubles of new legislation hanging from it that we do not achieve our ultimate goal, which is to protect.
Suzanne Webb

Talking of Christmas, would not the best Christmas present for lovely Zach be to enshrine new clause 53, that amazing amendment, as Zach’s law? Somehow we should formalise it as Zach’s law—that would be a brilliant Christmas present.

Dean Russell

I wholeheartedly agree. Zach, if you are listening right now, you are an absolute hero—you have changed so much for so many people. Without your effort, this would not be happening today. In future, we can look back on this and say, “You know what? Democracy does work.”

I thank all hon. Members for their campaigning work to raise Zach’s law in the public consciousness. It even reached the US. I am sure many hon. Members dance along to Beyoncé of an evening or listen to her in the car when they are bopping home; a few months ago she changed one of her YouTube videos, which had flashing images in it, because the Epilepsy Society reached out to describe the dangers that it would cause. These campaigns work. They are about public awareness and about changing the law. We talk about the 15 minutes of shame that people face on social media, but ultimately the shame is on the platforms for forcing us to legislate to make them do the right thing.

I will end with one small point. The internet has evolved; the world wide web has evolved; social media is evolving; the metaverse, 3D virtual reality worlds and augmented reality are changing. I urge the Government or the House to look at creating a Committee specifically on the Bill. I know that there are lots of arguments that it should be a Sub-Committee of the Digital, Culture, Media and Sport Committee, but the truth is that the online world is changing dramatically. We cannot take snapshots every six months, every year or every two years and assume that they will pick up on all the changes happening in the world.

As the hon. Member for Pontypridd (Alex Davies-Jones) said, TikTok did not even exist when the Bill was first discussed. We now have an opportunity to ask what is coming next, keep pace with it and put ethics and morality at the heart of the Bill to ensure that it is fit for purpose for many decades to come. I thank the Minister for his fantastic work; my partner in crime, my hon. Friend the Member for Stourbridge, for her incredible work; and all Members across the House. Please, please, let us get this through tonight.

Laura Farris (Newbury) (Con)

It is a privilege to follow my hon. Friend the Member for Watford (Dean Russell) and so many hon. Members who have made thoughtful contributions. I will confine my comments to the intersection of new clauses 28 and 45 to 50 with the impact of online pornography on children in this country.

There has been no other time in the history of humanity when we have exposed children to the violent, abusive, sexually explicit material that they currently encounter online. In 2008, only 14% of children under 13 had seen pornography; three years later, that figure had risen to 49%, correlating with the rise in children owning smartphones. Online pornography has a uniquely pernicious impact on children. For very young children, there is an impact just from seeing the content. For older teenagers, there is an impact on their behaviour.

We are seeing more and more evidence of boys exhibiting sexually aggressive behaviour, with actions such as strangulation, which we have dealt with separately in this House, and misogynistic attitudes. Young girls are being conditioned into thinking that their value depends on being submissive or objectified. That is leading children down a pathway towards serious sexual offending by children against children. Overwhelmingly, the victims are young girls.

Hon. Members need not take my word for it: after Everyone’s Invited began documenting the nature and extent of the sexual experiences happening in our schools, an Ofsted review revealed that the most prevalent victims of serious sexual assaults among the under-25s are girls aged 15 to 17. In a recent publication in anticipation of the Bill, the Children’s Commissioner cited the example of a teenage boy arrested for his part in the gang rape of a 14-year-old girl. In his witness statement to the police, the boy said that it felt just like a porn film.

Dr John Foubert, the former White House adviser on rape prevention, has said:

“It wasn’t until 10 years ago when I came to the realization that the secret ingredient in the recipe for rape was not secret at all…That ingredient…is today’s high speed Internet pornography.”

The same view has been expressed, in one form or another, by the chief medical officers for England and for Wales, the Independent Inquiry into Child Sexual Abuse, the Government Equalities Office, the Children’s Commissioner, Ofsted and successive Ministers.

New clause 28 requests an advocacy body to represent and protect the interests of child users. I welcome the principle behind the new clause. I anticipate that the Minister will say that he is already halfway there by making the Children’s Commissioner a statutory consultee to Ofcom, along with the Domestic Abuse Commissioner and others who have been named in this debate. However, whatever the Government make of the Opposition’s new clause, they must surely agree that it alights on one important point: the online terrain in respect of child protection is evolving very fast.

By the time the Bill reaches the statute book, new providers will have popped up again. With them will come unforeseen problems. When the Bill was first introduced, TikTok did not exist, as my hon. Friend the Member for Watford said a moment ago, and neither did OnlyFans. That is precisely the kind of user-generated site that is likely to try and dodge its obligations to keep children safe from harm, partly because it probably does not even accept that it exposes them to harm: it relies on the fallacy that the user is in control, and operates an exploitative business model predicated on that false premise.

I think it important for someone to represent the issue of child protection on a regular basis because of the issue of age verification, which we have canvassed, quite lightly, during the debate. Members on both sides of the House have pointed out that the current system, which allows children to self-certify their date of birth, is hopelessly out of date. I know that Ministers envisage something much more ambitious with the Bill’s age assurance and age verification requirements, including facial recognition technology, but I think it is worth our having a constant voice reporting on the adequacy of whatever age assurance steps internet providers may take, because we know how skilful children can be in navigating the internet. We know that there are those who have the technological skills to use IP shrouding or a VPN. I also think it important for there to be a voice to maintain the pressure on the Government—which is what I myself want to do tonight—for an official Government inquiry into pornography harms, akin to the one on gambling harms that was undertaken in 2019. That inquiry was extremely important in identifying all the harm that was caused by gambling. The conclusions of an equivalent inquiry into pornography would leave no wriggle room for user-generated services to deny the risk of harm.

My right hon. Friend the Member for Basingstoke (Dame Maria Miller) pointed out, very sensibly, that her new clauses 45 to 50 build on all the Law Commission’s recommendations. This dovetails with so much work that has already been done in the House. We have produced, for instance, the Domestic Abuse Act 2021, which dealt with revenge porn, whether threatened or actual and whether genuine or fake, and with coercive control. Many Members recognise what was achieved by all our work a couple of years ago. However, given the indication from Ministers that they are minded to accept the new clauses in one form or another, I should like them to explain to the House how they think the Bill will capture the issue of sexting, if, indeed, it will capture that issue at all.

As the Minister will know, sexting means the exchanging of intimate images by, typically, children, sometimes on a nominally consensual basis. Everything I have read about it seems to say, “Yes, prima facie this is an unlawful act, but no, we do not seek to criminalise children, because we recognise that they make errors of judgment.” However, while I agree that it may be proportionate not to criminalise children for doing this, it remains the case that when an image is sent with the nominal consent of the child—it is nearly always a girl—it is often a product of duress, the image is often circulated far beyond the original recipient, and that often has devastating personal consequences for the young girl involved. All the main internet providers now have technology that can identify a nude image. It would be possible to require them to prevent nude images from being shared when, because of extended age-verification capabilities, they know that the user is a child. If the Government are indeed minded to accept new clauses 45 to 50, I should like them to address that specific issue of sexting rather than letting it fall by the wayside as something separate, or outside the ambit of the Bill.

Mr Deputy Speaker (Mr Nigel Evans)

The last Back-Bench speaker is Miriam Cates.

Miriam Cates (Penistone and Stocksbridge) (Con)

Thank you, Mr Deputy Speaker. I think you are the third person to take the Chair during the debate. It is an honour to follow my hon. Friend the Member for Newbury (Laura Farris); I agree with everything that she said, and my comments will be similar.

This has been a long but fascinating debate. We have discussed only a small part of the Bill today, and just a few amendments, but the wide range of the debate reflects the enormous complexity of what the Bill is intended to do, which is to regulate the online world so that it is subject to rules, regulations, obligations and protective measures equivalent to those in the offline world. We must do this, because the internet is now an essential part of our infrastructure. I think that we see the costs of our high-speed broadband as being in the same category as our energy and water costs, because we could not live without it. Like all essential infrastructure, the internet must be regulated. We must ensure that providers are working in the best interests of consumers, within the law and with democratic accountability.

Regulating the internet through the Bill is not a one-off project. As many Members have said, it will take years to get it right, but we must begin now. I think the process can be compared with the regulation of roads. A century ago there were hardly any private motor cars on the roads. There were no rules; people did not even have to drive on a particular side of the road. There have been more than 100 years of frequent changes to rules and regulations to get it right. It seems crazy now to think there was a time when there were no speed limits and no seat belts. The death rates on the roads, even in the 1940s, were 13 times higher than they are now. Over time, however, with regulation, we have more or less solved the complex problems of road regulation. Similarly, it will take time to get this Bill right, but we must get it on to the statute book and give it time to evolve.

20:15
The crucial point, though, is that we must look at the internet through a child’s eyes. I thoroughly support the sentiment embodied in new clause 28, which, as my hon. Friend said, calls for the establishment of an advocacy body to represent child users of the internet. The internet has many impacts on adults. Some are good—I love Google Maps; I will never get lost again—and some are bad, but the internet has utterly transformed childhood. Some would say that it has destroyed childhood. Childhood is a crucial and irreplaceable time, and before the internet, parents, schools and communities had full control over who influenced their children. People did not let others into their home unless they trusted them and knew that they had the best interests and the welfare of their children at heart. Now, the number of people who are influencing our children in their bedrooms, often malevolently, is off the scale. It is hard to comprehend the impact and the influence that the internet has had on children, and a large number of those providers do not have their best interests at heart.
We have heard a great many tragic stories today about children who have been harmed through other people’s direct access to their lives over mobile phones, but, as my hon. Friend said, one of the overriding results of the internet is the sexualisation of children in a truly destructive way. As my hon. Friend also said, about 50% of 12-year-olds have now seen online pornography, and 1.4 million UK children access porn every month. There is nothing mainstream about this pornography. It is not the same as the dodgy magazines of old. Violence, degrading behaviour, abuse and addiction are all mainstream on pornography sites now.
Dean Russell

Does my hon. Friend agree that the work of charities such as Dignify in Watford, where Helen Roberts does incredible work in raising awareness of this issue, is essential to ensuring that people are aware of the harm that can be done?

Miriam Cates

I completely agree. Other charities, such as CEASE—the Centre to End All Sexual Exploitation—and Barnardo’s, have been mentioned in the debate, and I think it so important to raise awareness. There are many harms on the internet, but pornography is an epidemic. It makes up a third of the material on the internet, and its impact on children cannot be overstated. Many boys who watch porn say that it gives them ideas about the kind of sex that they want to try. It is not surprising that a third of child sexual abuse is committed by other children. During puberty—that very important period of development—boys in particular are subject to an erotic imprint. The kind of sex that they see and the sexual ideas that they have during that time determine what they see as normal behaviour for the rest of their lives. It is crucial for children to be protected from harmful pornography that encourages the objectification and abuse of—almost always—women.

Neale Hanvey

I thank—in this context—my hon. Friend for giving way.

The lawsuits are coming. There can certainly be no more harmful act than encouraging a young person to mutilate their body with so-called gender-affirming surgery with no therapeutic intervention beforehand. In Scotland, the United Nations special rapporteur for violence against women and girls has criticised the Scottish Government’s Gender Recognition Reform (Scotland) Bill. Does the hon. Lady agree that it is time to establish who is a feminist, and who is a fake to their fingertips?

Miriam Cates

I thank the hon. Gentleman for his intervention. He is absolutely right: inciting a child to harm their body, whatever that harm is, should be criminalised, and I support the sentiment of new clause 16, which seeks to do that. Sadly, lots of children, particularly girls, go online and type in “I don’t like my body”. Maybe they are drawn to eating disorder sites, as my right hon. Friend the Member for Chelmsford (Vicky Ford) has mentioned, but often they are drawn into sites that glorify transition, often with adult men in other countries, whom they do not even know, posting pictures of double mastectomies performed on teenage girls.

John Nicolson (Ochil and South Perthshire) (SNP)

The hon. Lady must realise that this is fantasy land. It is incredibly difficult to get gender reassignment surgery. The “they’re just confused” stuff is exactly what was said to me as a young gay man. She must realise that this really simplifies a complicated issue and patronises people going through difficult choices.

Miriam Cates

I really wish it were fantasy land, but I am in contact with parents each and every day who tell me stories of their children being drawn into this. Yes, in this country it is thankfully very difficult to get a double mastectomy when you are under 18, but it is incredibly easy to buy testosterone illegally online and to inject it, egged on by adults in other countries. Once a girl has injected testosterone during puberty, she will have a deep voice, facial hair and male-pattern baldness for life, and she will be infertile. That is a permanent change, it is self-harm and it should be criminalised under this Bill, whether through this clause or through the Government’s new plans. The hon. Member for Kirkcaldy and Cowdenbeath (Neale Hanvey) is absolutely right: this is happening every day and it should be classed as self-harm.

Going back to my comments about the effect on children of viewing pornography, I absolutely support the idea of putting children’s experience at the heart of the Bill but it needs to be about children’s welfare and not about what children want. One impact of the internet has been to blur the boundary between adults and children. As adults, we need to be able to say, “This is the evidence of what is harmful to children, and this is what children should not be seeing.” Of course children will say that they want free access to all content, just like they want unlimited sweets and unlimited chocolate, but as adults we need to be able to say what is harmful for children and to protect them from seeing it.

This brings me to Government new clause 11, which deals with making sure that child sexual abuse material is taken offline. There is a clear link between the epidemic of pornography and the epidemic of child sexual abuse material. The way the algorithms on porn sites work is to draw users deeper and deeper into more and more extreme content—other Members have mentioned this in relation to other areas of the internet—so someone might go on to what they think is a mainstream pornography site and be drawn into more and more explicit, extreme and violent criminal pornography. At the end of this, normal people are drawn into watching children being abused, often in real time and often in other countries. There is a clear link between the epidemic of porn and the child sexual abuse material that is so prevalent online.

Last week in the Home Affairs Committee we heard from Professor Alexis Jay, who led the independent inquiry into child sexual abuse. Her report is harrowing, and it has been written over seven years. Sadly, its conclusion is that seven years later, there are now even more opportunities for people to abuse children because of the internet, so making sure that providers have a duty to remove any child sexual abuse material that they find is crucial. Many Members have referred to the Internet Watch Foundation. One incredibly terrifying statistic is that in 2021, the IWF removed 252,194 web pages containing child sexual abuse material and an unknown number of images. New clause 11 is really important, because it would put the onus on the tech platforms to remove those images when they are found.

It is right to put the onus on the tech companies. All the way through the writing of this Bill, at all the consultation meetings we have been to, we have heard the tech companies say, “It’s too hard; it’s not possible because of privacy, data, security and cost.” I am sure that is what the mine owners said in the 19th century when they were told by the Government to stop sending children down the mines. It is not good enough. These are the richest, most powerful companies in the world. They are more powerful than an awful lot of countries, yet they have no democratic accountability. If they can employ real-time facial recognition at airports, they can find a way to remove child abuse images from the internet.

This leads me on to new clause 17, tabled by the right hon. Member for Barking (Dame Margaret Hodge), which would introduce individual director liability for non-compliance. I completely support that sentiment and I agree that this is likely to be the only way we will inject some urgency into the process of compliance. Why should directors who are profiting from the platforms not be responsible if children suffer harm as a result of using their products? That is certainly the case in many other industries. The right hon. Lady used the example of the building trade. Of course there will always be accidents, but if individual directors face the prospect of personal liability, they will act to address the systemic issues, the problems with the processes and the malevolent algorithms that deliberately draw users towards harm.

Sir William Cash

My hon. Friend knows that I too take a great interest in this, and I am glad that the Government have agreed to continue discussions on this question. Is she aware that the personal criminal liability for directors flows from the corporate criminal liability in the company of which they are a director, and that their link to the criminal act itself, even if the company has not been or is not being prosecuted, means that the matter has to be made clear in the legislation, so that we do not have any uncertainty about the relationship of the company director and the company of which he is a director?

Miriam Cates

I was not aware of that, but I am now. I thank my hon. Friend for that information. This is a crucial point. We need the accountability of the named director associated with the company, the platform and the product in order to introduce the necessary accountability. I do not know whether the Minister will accept this new clause today, but I very much hope that we will look further at how we can make this possible, perhaps in another place.

I very much support the Bill. We need to get it on the statute book, although it will probably need further work, and I support the Government amendments. However, given the link between children viewing pornography and child sexual abuse, I hope that when the Bill goes through the other place, their Lordships will consider how regulations around pornographic content can be strengthened, in order to drastically reduce the number of children viewing porn and eventually being drawn into criminal activities themselves. In particular, I would like their Lordships to look at tightening and accelerating the age verification and giving equal treatment to all pornography, whether it is on a porn site or a user-to-user service and whether it is online or offline. Porn is harmful to children in whatever form it comes, so the liability on directors and the criminality must be exactly the same. I support the Bill and the amendments in the Government’s name, but it needs to go further when it goes to the other place.

Paul Scully

I thank Members for their contributions during today’s debate and for their ongoing engagement with such a crucial piece of legislation. I will try to respond to as many of the issues raised as possible.

My right hon. Friend the Member for Haltemprice and Howden (Mr Davis), who is not in his place, proposed making the promotion of self-harm a criminal offence. The Government are sympathetic to the intention behind that proposal; indeed, we asked the Law Commission to consider how the criminal law might address that, and have agreed in principle to create a new offence of encouraging or assisting serious self-harm. The form of the offence recommended by the Law Commission is based on the broadly comparable offence of encouraging or assisting suicide. Like that offence, it covers the encouragement of, or assisting in, self-harm by means of communication and in other ways. When a similar amendment was tabled by the hon. Members for Ochil and South Perthshire (John Nicolson) and for Aberdeen North (Kirsty Blackman) in Committee, limiting the offence to encouragement or assistance by means of sending a message, the then Minister, my right hon. Friend the Member for Croydon South, said it would give only partial effect to the Law Commission’s recommendation. It remains the Government’s intention to give full effect to the Law Commission’s recommendations in due course.

20:43
I recognise the strong cross-party support for the amendment and the terrible damage done by online communications that encourage self-harm. The Molly Russell case has been mentioned by many Members today, and I send my condolences to Mr Russell, who was here earlier—I welcomed him to the Gallery—to listen to the early parts of this debate, along with other people who have suffered in a similar fashion. That case illustrates all too clearly that we have to do much more to protect young people like Molly from such harmful content. As we signalled in a written ministerial statement on 29 November, the Government intend to introduce in the Lords a new communications offence of encouraging self-harm.
My right hon. Friend the Member for Chelmsford (Vicky Ford) spoke so powerfully and movingly. She bared her soul in her personal testimony, having kept this deep inside herself for four decades. For her to come in front of us, in public, and give her testimony, all I can say is thank you. I commit to working with her to see what more we can do to ensure that eating disorders are captured in legislation as best we can. This will clearly be for children, but we want to see what more we can do for everyone and to protect the most vulnerable.
New clause 16, tabled by my right hon. Friend the Member for Haltemprice and Howden, would add a communications offence of encouraging or assisting self-harm to the Suicide Act 1961. I recognise the link between self-harm and suicide, but the two are distinct. The 1961 Act is about encouraging or assisting suicide, not self-harm, so any offence covering the latter should be separate from that Act. I would like to have a chat with him about the drafting of proposed new section 3A(8)(c), as I do not follow its logic and would like to test it a little more. For those reasons, I hope he will agree not to press his amendment and to allow the Government to move an amendment in the Lords.
New clause 17 would enable Ofcom to use enforcement sanctions directly against senior managers if their actions, directions, negligence or consent cause a service’s failure to comply with any of the enforceable requirements. It is vital that senior executives take their new responsibilities seriously. Under the Bill, Ofcom will be able to hold senior tech executives criminally liable if they fail to ensure that their company provides Ofcom with the information it needs to regulate effectively.
The existing provisions have been carefully designed to ensure that tech executives take personal responsibility for ensuring compliance with the framework, while ensuring sufficient legal clarity on what amounts to an offence and who can be prosecuted. The senior management liability is targeted specifically at the obligations to ensure that Ofcom is provided with the information it needs to regulate, as this is essential to the effective functioning of the regime. This approach is similar to the regulation of a number of other sectors, such as telecommunications.
New clause 17 would make senior managers personally liable, far beyond the current proposals, for the actions of the entities for which they work. The framework establishes a range of enforcement requirements, and a regulated service is the proper legal entity to be liable for failures to comply with those requirements. It would not be appropriate to extend that liability to any director or manager of a regulated service.
The Government do not believe it would be proportionate or effective to expand the scope of individual liability under this Bill, for a number of reasons. There is a real risk of damaging the UK’s attractiveness as a place to start and grow a digital business. It might also lead to unintended consequences, such as tech executives driving an over-zealous approach to content take-down, for fear of going to prison for a regulatory failing.
Sir William Cash

I have raised this on a number of occasions in the past few hours, as have my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) and the right hon. Member for Barking (Dame Margaret Hodge). Will the Minister be good enough to ensure that this matter is thoroughly looked at and, furthermore, that the needed clarification is thought through?

Paul Scully

I was going to come to my hon. Friend in two seconds.

In the absence of clearly defined offences, the changes we are making to the Bill mean that it is likely to be almost impossible to take enforcement action against individuals. We are confident that Ofcom will have all the tools necessary to drive the necessary culture change in the sector, from the boardroom down.

This is not the last stage of the Bill. It will be considered in Committee—assuming it is recommitted today—and will come back on Report and Third Reading before going to the House of Lords, so there is plenty of time further to discuss this and to give my hon. Friend the clarification he needs.

Dame Margaret Hodge

Is the Minister saying he is open to changing his view on why he is minded to reject new clause 17 tonight?

Paul Scully

I do not think I am changing my view. I am saying that this is not the last stage of the Bill, so there will be plenty of opportunity further to test this, should Members want to do so.

On new clause 28, the Government recognise and agree with the intent behind this amendment to ensure that the interests of child users of regulated services are represented. Protecting children online is the top priority in this Bill, and its key measures will ensure that children are protected from harmful content. The Bill appoints a regulator with comprehensive powers to force tech companies to keep children safe online, and the Bill’s provisions will ensure that Ofcom will listen and respond to the needs of children when identifying priority areas for regulatory action, setting out guidance for companies, taking enforcement action and responding to super-complaints.

Right from the outset, Ofcom must ensure that its risk assessment and priorities reflect the needs of children. For example, Ofcom is required to undertake research that will help understand emerging risks to child safety. We have heard a lot today about the emerging risks with changing technology, and it is important that we keep on top of those and have that children’s voice at the heart of this. The Bill also expands the scope of the Communications Consumer Panel to online safety matters. That independent panel of experts ensures that user needs are at the heart of Ofcom’s regulatory approach. Ofcom will also have the flexibility to choose other mechanisms to better understand user experiences and emerging threats. For example, it may set up user panels or focus groups.

Importantly, Ofcom will have to engage with expert bodies representing children when developing codes of practice and other regulatory guidance. For example, Ofcom will be required to consult persons who represent the interests of children when developing its codes of practice. That means that Ofcom’s codes will be fully informed by how children behave online, how they experience harm and what impact the proposed measures will have on their online experience. The super-complaints process will further enable independent bodies advocating for children to have their voices heard, and will help Ofcom to recognise and eliminate systemic failures.

As we have heard, the Government also plan to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it develops its code of practice. That amendment will be tabled in the House of Lords. Through this consultation, the commissioner will be able to flag systemic issues or issues of particular importance to the regulator, helping Ofcom to target investigations and, if necessary, sanctions at matters that most affect children’s online experience.

As such, there are ample opportunities in the framework for children’s voices to be heard, and the Government are not convinced of the need to legislate for another child user advocacy body. There are plenty of bodies out there that Ofcom will already be reaching out to and there is an abundance of experience in committed representative groups that are already engaged and will be engaged with the online safety framework. They include the existing statutory body responsible for promoting the interests of children, the Children’s Commissioner. Adding an additional statutory body would duplicate existing provision, creating a confusing landscape, and that would not be in the best interests of children.

Sarah Champion

I hear what the Minister is saying about creating a statutory body, but will he assure this House that there is a specific vehicle for children’s voices to be heard in this? I ask because most of us here are not facing the daily traumas, or the constant reinvention of apps and social media ways of reaching out to children, that our children are. So unless their voices are heard, this Bill is not going to be robust enough.

Paul Scully

As I say, we are putting the Children’s Commissioner as a statutory consultee in the Bill. Ofcom will also have to have regard to all these other organisations, such as the 5Rights Foundation and the NSPCC, that are already there. It is in the legislation that Ofcom will have to have regard to those advocates, but we are not specifically suggesting that there should be a separate body duplicating that work. These organisations are already out there and Ofcom will have to reach out to them when coming up with its codes of practice.

We also heard from my hon. Friend the Member for Dover (Mrs Elphicke) about new clause 55. She spoke powerfully and I commend her for all the work she is doing to tackle the small boats problem, which is affecting so many people up and down this country. I will continue to work closely with her as the Bill continues its passage, ahead of its consideration in the Lords, to ensure that this legislation delivers the desired impact on the important issues of illegal immigration and modern slavery. The legislation will give our law enforcement agencies and the social media companies the powers and guidance they need to stop the promotion of organised criminal activity on social media. Clearly, we have to act.

My right hon. Friend the Member for Witham (Priti Patel), who brings to bear her experience as a former Home Secretary, spoke eloquently about the need for joined-up government, to make sure that the various pieces of legislation and all Departments are working together in this space. This is a really good example of where government has to join up.

Mrs Elphicke

Will the Minister confirm that, in line with the discussions that have been had, the Government will look to bring back amendments, should they be needed, in line with new clause 55 and perhaps schedule 7, as the Bill goes to the Lords or returns for further consideration in this House?

Paul Scully

All that I can confirm is that we will work with my hon. Friend and with colleagues in the Home Office to make sure that this legislation works in the way that she intends.

We share with my right hon. Friend the Member for Basingstoke (Dame Maria Miller) the concern about the abuse of deep fake images and the need to tackle the sharing of intimate images where the intent is wider than that covered by current offences. We have committed to bring forward Government amendments in the Lords to do just that, and I look forward to working with her to ensure that, again, we get that part of the legislation exactly right.

We also recognise the intent behind my right hon. Friend’s amendment to provide funding for victim support groups via the penalties paid by entities for failing to comply with the regulatory requirements. Victim and survivor support organisations play a critical role in providing support and tools to help people rebuild their lives. That is why the Government continue to make record investments in this area, increasing the funding for victim and witness support services to £192 million a year by 2024-25. We want to allow the victim support service to provide consistency for victims requiring support.

Dame Maria Miller

I thank my hon. Friend for giving way and for his commitment to look at this matter before the Bill reaches the House of Lords. Can he just clarify to me that it is his intention to implement the Law Commission’s recommendations that are within the scope of the Bill prior to the Bill reaching the House of Lords? If that is the case, I am happy to withdraw my amendments.

Paul Scully

I cannot confirm today at what stage we will legislate. We will continue to work with my right hon. Friend and the Treasury to ensure that we get this exactly right. We will, of course, give due consideration to the Law Commission’s recommendations.

Dame Maria Miller

Unless I am mistaken, no other stages of the Bill will come before the House where this can be discussed. Either it will be done or it will not. I had hoped that the Minister would answer in the affirmative.

Paul Scully

I understand. We are ahead of the Lords on publication, so yes is the answer.

I have two very quick points for my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright). He was right to speak about acting with humility. We will bring forward amendments for recommittal to amend the approach for category 1 designation—not just the smaller companies that he was talking about, but companies that are pushing that barrier to get to category 1. I very much get his view that the process could be delayed unduly, and we want to make sure that we do not get the unintended consequences that he describes. I look forward to working with him to get the changes to the Bill to work exactly as he describes.

Finally, let me go back to the point that my right hon. Friend the Member for Haltemprice and Howden made about encrypted communications. We are not talking about banning end-to-end encryption or about breaking encryption—for the reasons set out about open banking and other areas. The amendment would leave Ofcom powerless to protect thousands of children and could leave unregulated spaces online for offenders to act, and we cannot therefore accept that.

John McDonnell

Just briefly, because I know that the Minister is about to finish, can he respond on amendment 204 with regard to the protection of journalists?

Paul Scully

I am happy to continue talking to the right hon. Gentleman, but I believe that we have enough protections in the Bill, with the human touch that we have added after the automatic flagging up of inquiries. The NCA will also have to have due regard to protecting sources. I will continue to work with him on that.

I have not covered everybody’s points, but this has been a very productive debate. I thank everyone for their contributions. We are really keen to get the Bill on the books and to act quickly to ensure that we can make children as safe as possible online.

Question put and agreed to.

New clause 11 accordingly read a Second time, and added to the Bill.

New Clause 12

Warning notices

‘(1) OFCOM may give a notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) to a provider relating to a service or part of a service only after giving a warning notice to the provider that they intend to give such a notice relating to that service or that part of it.

(2) A warning notice under subsection (1) relating to the use of accredited technology (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a) and (3)(a)) must—

(a) contain details of the technology that OFCOM are considering requiring the provider to use,

(b) specify whether the technology is to be required in relation to terrorism content or CSEA content (or both),

(c) specify any other requirements that OFCOM are considering imposing (see section 106(2) to (4)),

(d) specify the period for which OFCOM are considering imposing the requirements (see section 106(6)),

(e) state that the provider may make representations to OFCOM (with any supporting evidence), and

(f) specify the period within which representations may be made.

(3) A warning notice under subsection (1) relating to the development or sourcing of technology (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) and (3)(b)) must—

(a) describe the proposed purpose for which the technology must be developed or sourced (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a)(iii) and (iv) and (3)(a)(ii)),

(b) specify steps that OFCOM consider the provider needs to take in order to comply with the requirement described in section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) or (3)(b), or both those requirements (as the case may be),

(c) specify the proposed period within which the provider must take each of those steps,

(d) specify any other requirements that OFCOM are considering imposing,

(e) state that the provider may make representations to OFCOM (with any supporting evidence), and

(f) specify the period within which representations may be made.

(4) A notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) that relates to both the user-to-user part of a combined service and the search engine of the service (as described in section (Notices to deal with terrorism content or CSEA content (or both))(4)(c) or (d)) may be given to the provider of the service only if—

(a) two separate warning notices have been given to the provider (one relating to the user-to-user part of the service and the other relating to the search engine), or

(b) a single warning notice relating to both the user-to-user part of the service and the search engine has been given to the provider.

(5) A notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) may not be given to a provider until the period allowed by the warning notice for the provider to make representations has expired.’—(Paul Scully.)

This clause, which would follow NC11, also replaces part of existing clause 104. There are additions to the warning notice procedure to take account of the new options for notices under NC11.

Brought up, read the First and Second time, and added to the Bill.

New Clause 20

OFCOM’s reports about news publisher content and journalistic content

‘(1) OFCOM must produce and publish a report assessing the impact of the regulatory framework provided for in this Act on the availability and treatment of news publisher content and journalistic content on Category 1 services (and in this section, references to a report are to a report described in this subsection).

(2) Unless the Secretary of State requires the production of a further report (see subsection (6)), the requirement in subsection (1) is met by producing and publishing one report within the period of two years beginning with the day on which sections (Duties to protect news publisher content) and 16 come into force (or if those sections come into force on different days, the period of two years beginning with the later of those days).

(3) A report must, in particular, consider how effective the duties to protect such content set out in sections (Duties to protect news publisher content) and 16 are at protecting it.

(4) In preparing a report, OFCOM must consult—

(a) persons who represent recognised news publishers,

(b) persons who appear to OFCOM to represent creators of journalistic content,

(c) persons who appear to OFCOM to represent providers of Category 1 services, and

(d) such other persons as OFCOM consider appropriate.

(5) OFCOM must send a copy of a report to the Secretary of State, and the Secretary of State must lay it before Parliament.

(6) The Secretary of State may require OFCOM to produce and publish a further report if the Secretary of State considers that the regulatory framework provided for in this Act is, or may be, having a detrimental effect on the availability and treatment of news publisher content or journalistic content on Category 1 services.

(7) But such a requirement may not be imposed—

(a) within the period of three years beginning with the date on which the first report is published, or

(b) more frequently than once every three years.

(8) For further provision about reports under this section, see section 138.

(9) In this section—

“journalistic content” has the meaning given by section 16;

“news publisher content” has the meaning given by section 49;

“recognised news publisher” has the meaning given by section 50.

(10) For the meaning of “Category 1 service”, see section 82 (register of categories of services).’—(Paul Scully.)

This inserts a new clause (after clause 135) which requires Ofcom to publish a report on the impact of the regulatory framework provided for in the Bill within two years of the relevant provisions coming into force. It also allows the Secretary of State to require Ofcom to produce further reports.

Brought up, read the First and Second time, and added to the Bill.

New Clause 40

Amendment of Enterprise Act 2002

‘In Schedule 15 to the Enterprise Act 2002 (enactments relevant to provisions about disclosure of information), at the appropriate place insert—

‘Online Safety Act 2022.”’—(Paul Scully.)



This amendment has the effect that the information gateway in section 241 of the Enterprise Act 2002 allows disclosure of certain kinds of information by a public authority (such as the Competition and Markets Authority) to OFCOM for the purposes of OFCOM’s functions under this Bill.

Brought up, read the First and Second time, and added to the Bill.

New Clause 42

Former providers of regulated services

‘(1) A power conferred by Chapter 6 of Part 7 (enforcement powers) to give a notice to a provider of a regulated service is to be read as including power to give a notice to a person who was, at the relevant time, a provider of such a service but who has ceased to be a provider of such a service (and that Chapter and Schedules 13 and 15 are to be read accordingly).

(2) “The relevant time” means—

(a) the time of the failure to which the notice relates, or

(b) in the case of a notice which relates to the requirement in section 90(1) to co-operate with an investigation, the time of the failure or possible failure to which the investigation relates.’—(Paul Scully.)

This new clause, which is intended to be inserted after clause 162, provides that a notice that may be given under Chapter 6 of Part 7 to a provider of a regulated service may also be given to a former provider of a regulated service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 43

Amendments of Part 4B of the Communications Act

‘Schedule (Amendments of Part 4B of the Communications Act) contains amendments of Part 4B of the Communications Act.’—(Paul Scully.)

This new clause introduces a new Schedule amending Part 4B of the Communications Act 2003 (see NS2).

Brought up, read the First and Second time, and added to the Bill.

New Clause 44

Repeal of Part 4B of the Communications Act: transitional provision etc

‘(1) Schedule (Video-sharing platform services: transitional provision etc) contains transitional, transitory and saving provision—

(a) about the application of this Act and Part 4B of the Communications Act during a period before the repeal of Part 4B of the Communications Act (or, in the case of Part 3 of Schedule (Video-sharing platform services: transitional provision etc), in respect of charging years as mentioned in that Part);

(b) in connection with the repeal of Part 4B of the Communications Act.

(2) The Secretary of State may by regulations make transitional, transitory or saving provision of the kind mentioned in subsection (1)(a) and (b).

(3) Regulations under subsection (2) may amend or repeal—

(a) Part 2A of Schedule 3;

(b) Schedule (Video-sharing platform services: transitional provision etc).

(4) Regulations under subsection (2) may, in particular, make provision about—

(a) the application of Schedule (Video-sharing platform services: transitional provision etc) in relation to a service if the transitional period in relation to that service ends on a date before the date when section 172 comes into force;

(b) the application of Part 3 of Schedule (Video-sharing platform services: transitional provision etc), including further provision about the calculation of a provider’s non-Part 4B qualifying worldwide revenue for the purposes of paragraph 19 of that Schedule;

(c) the application of Schedule 10 (recovery of OFCOM’s initial costs), and in particular how fees chargeable under that Schedule may be calculated, in respect of charging years to which Part 3 of Schedule (Video-sharing platform services: transitional provision etc) relates.’—(Paul Scully.)

This new clause introduces a new Schedule containing transitional provisions (see NS3), and provides a power for the Secretary of State to make regulations containing further transitional provisions etc.

Brought up, read the First and Second time, and added to the Bill.

New Clause 51

Publication by providers of details of enforcement action

‘(1) This section applies where—

(a) OFCOM have given a person (and not withdrawn) any of the following—

(i) a confirmation decision;

(ii) a penalty notice under section 119;

(iii) a penalty notice under section 120(5);

(iv) a penalty notice under section 121(6), and

(b) the appeal period in relation to the decision or notice has ended.

(2) OFCOM may give to the person a notice (a “publication notice”) requiring the person to—

(a) publish details describing—

(i) the failure (or failures) to which the decision or notice mentioned in subsection (1)(a) relates, and

(ii) OFCOM’s response, or

(b) otherwise notify users of the service to which the decision or notice mentioned in subsection (1)(a) relates of those details.

(3) A publication notice may require a person to publish details under subsection (2)(a) or give notification of details under subsection (2)(b) or both.

(4) A publication notice must—

(a) specify the decision or notice mentioned in subsection (1)(a) to which it relates,

(b) specify or describe the details that must be published or notified,

(c) specify the form and manner in which the details must be published or notified,

(d) specify a date by which the details must be published or notified, and

(e) contain information about the consequences of not complying with the notice.

(5) Where a publication notice requires a person to publish details under subsection (2)(a) the notice may also specify a period during which publication in the specified form and manner must continue.

(6) Where a publication notice requires a person to give notification of details under subsection (2)(b) the notice may only require that notification to be given to United Kingdom users of the service (see section 184).

(7) A publication notice may not require a person to publish or give notification of anything that, in OFCOM’s opinion—

(a) is confidential in accordance with subsections (8) and (9), or

(b) is otherwise not appropriate for publication or notification.

(8) A matter is confidential under this subsection if—

(a) it relates specifically to the affairs of a particular body, and

(b) publication or notification of that matter would or might, in OFCOM’s opinion, seriously and prejudicially affect the interests of that body.

(9) A matter is confidential under this subsection if—

(a) it relates to the private affairs of an individual, and

(b) publication or notification of that matter would or might, in OFCOM’s opinion, seriously and prejudicially affect the interests of that individual.

(10) A person to whom a publication notice is given has a duty to comply with it.

(11) The duty under subsection (10) is enforceable in civil proceedings by OFCOM—

(a) for an injunction,

(b) for specific performance of a statutory duty under section 45 of the Court of Session Act 1988, or

(c) for any other appropriate remedy or relief.

(12) For the purposes of subsection (1)(b) “the appeal period”, in relation to a decision or notice mentioned in subsection (1)(a), means—

(a) the period during which any appeal relating to the decision or notice may be made, or

(b) where such an appeal has been made, the period ending with the determination or withdrawal of that appeal.’—(Paul Scully.)

This new clause, which is intended to be inserted after clause 129, gives OFCOM the power to require a person to whom a confirmation decision or penalty notice has been given to publish details relating to the decision or notice or to otherwise notify service users of those details.

Brought up, read the First and Second time, and added to the Bill.

New Clause 52

Exemptions from offence under section 152

‘(1) A recognised news publisher cannot commit an offence under section 152.

(2) An offence under section 152 cannot be committed by the holder of a licence under the Broadcasting Act 1990 or 1996 in connection with anything done under the authority of the licence.

(3) An offence under section 152 cannot be committed by the holder of a multiplex licence in connection with anything done under the authority of the licence.

(4) An offence under section 152 cannot be committed by the provider of an on-demand programme service in connection with anything done in the course of providing such a service.

(5) An offence under section 152 cannot be committed in connection with the showing of a film made for cinema to members of the public.’—(Paul Scully.)

This new clause contains exemptions from the offence in clause 152 (false communications). The clause ensures that holders of certain licences are only exempt if they are acting as authorised by the licence and, in the case of Wireless Telegraphy Act licences, if they are providing a multiplex service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 53

Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2)

‘(1) A person (A) commits an offence if—

(a) A sends a communication by electronic means which consists of or includes flashing images (see subsection (13)),

(b) either condition 1 or condition 2 is met, and

(c) A has no reasonable excuse for sending the communication.

(2) Condition 1 is that—

(a) at the time the communication is sent, it is reasonably foreseeable that an individual with epilepsy would be among the individuals who would view it, and

(b) A sends the communication with the intention that such an individual will suffer harm as a result of viewing the flashing images.

(3) Condition 2 is that, when sending the communication—

(a) A believes that an individual (B)—

(i) whom A knows to be an individual with epilepsy, or

(ii) whom A suspects to be an individual with epilepsy,

will, or might, view it, and

(b) A intends that B will suffer harm as a result of viewing the flashing images.

(4) In subsections (2)(a) and (3)(a), references to viewing the communication are to be read as including references to viewing a subsequent communication forwarding or sharing the content of the communication.

(5) The exemptions contained in section (Exemptions from offence under section 152) apply to an offence under subsection (1) as they apply to an offence under section 152.

(6) For the purposes of subsection (1), a provider of an internet service by means of which a communication is sent is not to be regarded as a person who sends a communication.

(7) In the application of subsection (1) to a communication consisting of or including a hyperlink to other content, references to the communication are to be read as including references to content accessed directly via the hyperlink.

(8) A person (A) commits an offence if—

(a) A shows an individual (B) flashing images by means of an electronic communications device,

(b) when showing the images—

(i) A knows that B is an individual with epilepsy, or

(ii) A suspects that B is an individual with epilepsy,

(c) when showing the images, A intends that B will suffer harm as a result of viewing them, and

(d) A has no reasonable excuse for showing the images.

(9) An offence under subsection (1) or (8) cannot be committed by a healthcare professional acting in that capacity.

(10) A person who commits an offence under subsection (1) or (8) is liable—

(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);

(b) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding six months or a fine not exceeding the statutory maximum (or both);

(c) on conviction on indictment, to imprisonment for a term not exceeding five years or a fine (or both).

(11) It does not matter for the purposes of this section whether flashing images may be viewed at once (for example, a GIF that plays automatically) or only after some action is performed (for example, pressing play).

(12) In this section—

(a) references to sending a communication include references to causing a communication to be sent;

(b) references to showing flashing images include references to causing flashing images to be shown.

(13) In this section—

“electronic communications device” means equipment or a device that is capable of transmitting images by electronic means;

“flashing images” means images which carry a risk that an individual with photosensitive epilepsy who viewed them would suffer a seizure as a result;

“harm” means—

(a) a seizure, or

(b) alarm or distress;

“individual with epilepsy” includes, but is not limited to, an individual with photosensitive epilepsy;

“send” includes transmit and publish (and related expressions are to be read accordingly).

(14) This section extends to England and Wales and Northern Ireland.’—(Paul Scully.)

This new clause creates (for England and Wales and Northern Ireland) a new offence of what is sometimes known as “epilepsy trolling” - sending or showing flashing images electronically to people with epilepsy intending to cause them harm.

Brought up, read the First and Second time, and added to the Bill.

New Clause 16

Communication offence for encouraging or assisting self-harm

‘(1) In the Suicide Act 1961, after section 3 insert—

“3A Communication offence for encouraging or assisting self-harm

(1) A person (“D”) commits an offence if—

(a) D sends a message,

(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and

(c) D’s act was intended to encourage or assist the infliction of serious physical harm.

(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.

(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.

(4) A person guilty of an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;

(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.

(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.

(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.

(7) If D arranges for a person (“D2”) to do an Act and D2 does that Act, D is also to be treated as having done that Act for the purposes of subsection (1).

(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—

(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and

(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and

(c) the message was wholly motivated by compassion towards D or to promote the interests of P’s health or wellbeing.”’—(Mr Davis.)

This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.

Brought up, and read the First time.

Question put, That the clause be read a Second time.

20:49

Division 107

Ayes: 242

Noes: 308

21:03
Proceedings interrupted (Programme Order, 20 March).
The Deputy Speaker put forthwith the Questions necessary for the disposal of the business to be concluded at that time (Standing Order No. 83E).
New Clause 17
Liability of directors for compliance failure
‘(1) This section applies where OFCOM considers that there are reasonable grounds for believing that a provider of a regulated service has failed, or is failing, to comply with any enforceable requirement (see section 112) that applies in relation to the service.
(2) If OFCOM considers that the failure results from any—
(a) action,
(b) direction,
(c) neglect, or
(d) with the consent’—(Dame Margaret Hodge.)
This new clause would enable Ofcom to exercise its enforcement powers under Chapter 6, Part 7 of the Bill against individual directors, managers and other officers at a regulated service provider where it considers the provider has failed, or is failing, to comply with any enforceable requirement.
Brought up.
Question put, That the clause be added to the Bill.
21:03

Division 108

Ayes: 238

Noes: 311

New Clause 28
Establishment of Advocacy Body
(1) There is to be a body corporate (“the Advocacy Body”) to represent interests of child users of regulated services.
(2) A “child user”—
(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and
(b) includes both any existing child user and any future child user.
(3) The work of the Advocacy Body may include—
(a) representing the interests of child users;
(b) the protection and promotion of these interests;
(c) any other matter connected with those interests.
(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—
(a) safety duties about illegal content, in particular CSEA content;
(b) safety duties protecting children;
(c) “enforceable requirements” relating to children.
(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.
(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.
(7) The Advocacy Body must assess emerging threats to child users of regulated services and must bring information regarding these threats to OFCOM.
(8) The Advocacy Body may undertake research on their own account.
(9) The Secretary of State must either appoint an organisation known to represent children to be designated the functions under this Act, or create an organisation to carry out the designated functions.
(10) The budget of the Advocacy Body will be subject to annual approval by the board of OFCOM.
(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 71).”—(John Nicolson.)
Brought up.
Question put, That the clause be added to the Bill.
21:16

Division 109

Ayes: 240

Noes: 312

Clause 47
Duties and the first codes of practice
Amendment made: 234, page 45, line 2, at end insert—
“(9) This section is subject to Part 2 of Schedule (Video-sharing platform services: transitional provision etc) (video-sharing platform services: transitional provision etc).” —(Paul Scully.)
This amendment ensures that clause 47 is subject to Part 2 of the new transitional provisions Schedule (see NS3) - otherwise clause 47 might have the effect that a provider of a service currently regulated by Part 4B of the Communications Act 2003 must comply with a safety duty during the transitional period.
Clause 84
OFCOM’s register of risks, and risk profiles, of Part 3 services
Amendments made: 102, page 72, line 28, leave out paragraph (a) and insert—
“(a) the risks of harm to individuals in the United Kingdom presented by illegal content present on regulated user-to-user services and by the use of such services for the commission or facilitation of priority offences;
(aa) the risk of harm to individuals in the United Kingdom presented by search content of regulated search services that is illegal content;”
This amendment ensures that OFCOM must prepare risk profiles relating to the use of user-to-user services for the commission or facilitation of priority offences.
Amendment 103, page 72, line 40, leave out from the second “the” to end of line and insert
“risk of harm mentioned in subsection (1)(b)”.
This technical amendment is consequential on Amendment 102.
Amendment 104, page 73, line 23, leave out “(1)(c)” and insert “(1)(a) or (c)”.
This technical amendment is consequential on Amendment 102.
Amendment 105, page 73, line 24, at end insert—
“(c) in the case of a risk assessment or risk profiles which relate only to the risk of harm mentioned in subsection (1)(aa), are to be read as references to regulated search services.”
This technical amendment is consequential on Amendment 102.
Amendment 106, page 73, line 36, at end insert—
““priority offence” has the same meaning as in Part 3 (see section 52).”—(Paul Scully.)
This amendment inserts a definition of “priority offence” into clause 84.
Clause 85
OFCOM’s guidance about risk assessments
Amendments made: 107, page 73, line 38, leave out subsection (1) and insert—
“(1) As soon as reasonably practicable after OFCOM have published the first risk profiles relating to the illegality risks, OFCOM must produce guidance to assist providers of regulated user-to-user services in complying with their duties to carry out illegal content risk assessments under section 8.
(1A) As soon as reasonably practicable after OFCOM have published the first risk profiles relating to the risk of harm from illegal content, OFCOM must produce guidance to assist providers of regulated search services in complying with their duties to carry out illegal content risk assessments under section 23.”
This amendment splits up OFCOM’s duty to produce guidance for providers about illegal content risk assessments, since, for user-to-user services, the effect of Amendment 102 is that such a risk assessment must also consider risks around the use of such services for the commission or facilitation of priority offences.
Amendment 108, page 74, line 11, leave out “(1) or”.
This technical amendment is consequential on Amendment 107.
Amendment 109, page 74, line 12, leave out “those subsections are” and insert “that subsection is”.
This technical amendment is consequential on Amendment 107.
Amendment 110, page 74, line 15, leave out “subsection (7)” and insert “this section”.
This technical amendment is consequential on Amendment 107.
Amendment 111, page 74, line 17, at end insert—
““illegality risks” means the risks mentioned in section 84(1)(a);”.
This amendment inserts a definition of “illegality risks” which is now used in clause 85.
Amendment 112, page 74, line 19, leave out “84(1)(a)” and insert “84(1)(aa)”.—(Paul Scully.)
This technical amendment is consequential on Amendment 102.
Clause 86
Power to require information
Amendment made: 113, page 75, line 38, at end insert—
“(fa) the purpose of assessing whether to give a notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) relating to the development or sourcing of technology (see subsections (2)(b) and (3)(b) of that section);”.—(Paul Scully.)
This amendment makes it clear that OFCOM have the power to require information to decide whether to give a notice under the clause inserted by NC11 which requires a provider to develop or source technology to deal with CSEA content.
Clause 89
Report by skilled persons
Amendments made: 114, page 77, line 36, leave out “either or both” and insert “any”.
This amendment is consequential on Amendment 116.
Amendment 115, page 77, line 39, leave out “or”.
This amendment is consequential on Amendment 116.
Amendment 116, page 77, line 43, at end insert—
“(c) assisting OFCOM in deciding whether to give a provider of a Part 3 service a notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) requiring the provider to use their best endeavours to develop or source technology dealing with CSEA content (see subsections (2)(b) and (3)(b) of that section), or assisting OFCOM in deciding the requirements to be imposed by such a notice.”—(Paul Scully.)
This amendment extends OFCOM’s power to require a skilled person’s report to cover assistance in relation to a notice under NC11 which requires a provider to develop or source technology to deal with CSEA content.
Clause 104
Amendment made: 117, page 87, line 9, leave out clause 104.—(Paul Scully.)
This amendment leaves out existing clause 104, which is replaced by NC11 and NC12.
Clause 105
Matters relevant to a decision to give a notice under section 104(1)
Amendments made: 118, page 88, line 40, at beginning insert
“In the case of a notice requiring the use of accredited technology,”.
This amendment ensures that the matters listed in clause 105(2) which OFCOM have to take account of in deciding whether to give a notice under NC11 apply just to such notices which require the use of accredited technology.
Amendment 119, page 89, line 25, at end insert—
“(3A) In the case of a notice relating to the development or sourcing of technology, subsection (2) applies—
(a) as if references to relevant content were to CSEA content, and
(b) with the omission of paragraphs (h), (i) and (j).” —(Paul Scully.)
This amendment sets out how the matters listed in clause 105(2) which OFCOM have to take account of in deciding whether to give a notice under NC11 apply to such notices which require the development or sourcing of technology to deal with CSEA content.
Clause 106
Notices under section 104(1): supplementary
Amendments made: 120, page 89, line 47, at end insert—
“(4A) A notice given to a provider of a Part 3 service requiring the use of accredited technology is to be taken to require the provider to make such changes to the design or operation of the service as are necessary for the technology to be used effectively.”
This amendment makes it clear that if OFCOM give a notice under NC11 requiring a provider to use accredited technology, that encompasses necessary design changes to a service.
Amendment 121, page 90, line 1, after “notice” insert
“requiring the use of accredited technology”.
This amendment ensures that requirements listed in clause 106(5) about the contents of a notice given under NC11 apply just to such notices which require the use of accredited technology.
Amendment 122, page 90, line 15, after “notice” insert
“requiring the use of accredited technology”.
This amendment is consequential on Amendment 121.
Amendment 123, page 90, line 17, at end insert—
“(6A) A notice relating to the development or sourcing of technology must—
(a) give OFCOM’s reasons for their decision to give the notice,
(b) describe the purpose for which technology is required to be developed or sourced (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a)(iii) and (iv) and (3)(a)(ii)),
(c) specify steps that the provider is required to take (including steps relating to the use of a system or process) in order to comply with the requirement described in section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) or (3)(b), or both those requirements (as the case may be),
(d) specify a reasonable period within which each of the steps specified in the notice must be taken,
(e) contain details of any other requirements imposed by the notice,
(f) contain details of the rights of appeal under section 140,
(g) contain information about when OFCOM intend to review the notice (see section 107), and
(h) contain information about the consequences of not complying with the notice (including information about the further kinds of enforcement action that it would be open to OFCOM to take).
(6B) In deciding what period or periods to specify for steps to be taken in accordance with subsection (6A)(d), OFCOM must, in particular, consider—
(a) the size and capacity of the provider, and
(b) the state of development of technology capable of achieving the purpose described in the notice in accordance with subsection (6A)(b).”
This amendment sets out the requirements which apply regarding the contents of a notice given under the NC11 requiring the development or sourcing of technology to deal with CSEA content.
Amendment 124, page 90, line 18, after “the” insert “design and”.
This amendment makes it clear that a notice given under NC11 may impose requirements about design of a service.
Amendment 125, page 90, line 24, leave out
“section 104 and this section”
and insert “this Chapter”.—(Paul Scully.)
This amendment is consequential on NC12.
Clause 107
Review and further notice under section 104(1)
Amendments made: 126, page 90, line 42, leave out from “must” to end of line 44 and insert
“carry out a review of the provider’s compliance with the notice—
(a) in the case of a notice requiring the use of accredited technology, before the end of the period for which the notice has effect;
(b) in the case of a notice relating to the development or sourcing of technology, before the last date by which any step specified in the notice is required to be taken.”
This amendment is consequential on NC11.
Amendment 127, page 90, line 45, leave out “The” and insert
“In the case of a notice requiring the use of accredited technology, the”.
This amendment is needed because the matters listed in the provision which is amended can only relate to a notice given under NC11 which requires the use of accredited technology.
Amendment 128, page 91, line 10, leave out
“require the use of different accredited technology from”
and insert “impose different requirements from”.
This amendment is needed because the provision which is amended is relevant to all notices given under NC11 (not just those which require the use of accredited technology).
Amendment 129, page 91, line 12, leave out
“Section 104(7) to (10) (warning notice) do”
and insert
“Section (Warning notices) (warning notices) does”.—(Paul Scully.)
This amendment is consequential on the warning notice procedure now being contained in NC12.
Clause 112
Requirements enforceable by OFCOM against providers of regulated services
Amendment made: 174, page 93, line 38, at end insert—

“Section (Duties to protect news publisher content)

News publisher content”

This amendment ensures that Ofcom are able to use their enforcement powers in Chapter 6 of Part 7 in relation to a breach of any of the duties set out in NC19.
Clause 115
Confirmation decisions: risk assessments
Amendments made: 130, page 96, line 40, leave out “illegal content” and insert
“matters required to be covered by an illegal content risk assessment”.
This amendment ensures that clause 115, which relates to a confirmation decision that may be given where a risk assessment is defective, covers matters in a risk assessment relating to the use of a service for commission or facilitation of priority offences, not just illegal content.
Amendment 131, page 96, line 41, after “9(2)” insert “(b) or (c)”.
This technical amendment is consequential on Amendment 61.
Amendment 132, page 96, line 44, leave out
“content that is harmful to children”
and insert
“matters required to be covered by a children’s risk assessment”.
This amendment brings clause 115(2)(b) (children’s risk assessments) into line with clause 115(2)(a) (illegal content risk assessments).
Amendment 133, page 97, line 15, leave out the definition of
“content that is harmful to children”.
This technical amendment is consequential on Amendment 132.
Amendment 134, page 97, line 17, leave out the definition of “illegal content”.—(Paul Scully.)
This technical amendment is consequential on Amendment 130.
Clause 119
Penalty for failure to comply with confirmation decision
Amendments made: 212, page 101, line 16, leave out “intend” and insert “propose”.
This amendment is a technical amendment and ensures that clause 119 uses the same terminology as used in other clauses in Chapter 6 of Part 7.
Amendment 213, page 101, line 19, at end insert “(with any supporting evidence)”.—(Paul Scully.)
This amendment provides that where OFCOM propose to give a penalty notice to a person in connection with a failure to comply with a confirmation decision, the representations that may be made to OFCOM before that notice is given may include supporting evidence.
Clause 120
Penalty for failure to comply with notice under section 104(1)
Amendment made: 135, page 101, line 37, leave out from beginning to “OFCOM”.—(Paul Scully.)
This is about a penalty notice which OFCOM may give for failure to comply with a notice given under NC11. The amendment omits words which are not apt to cover such a notice which relates to the development or sourcing of technology to deal with CSEA content.
Clause 129
Publication of details of enforcement action
Amendment made: 214, page 113, line 3, after “person” insert “(and not withdrawn)”.—(Paul Scully.)
This amendment provides that OFCOM’s duty to publish information following the giving of a confirmation decision or penalty notice to a person does not apply where the decision or notice has been withdrawn.
Clause 138
OFCOM’s reports
Amendment made: 175, page 118, line 29, at end insert—
“(aa) a report under section (OFCOM’s reports about news publisher content and journalistic content) (report about news publisher content and journalistic content),”.—(Paul Scully.)
This amendment ensures that the provisions about excluding confidential information from a report before publication apply to the duty to publish the report produced under NC20.
Clause 150
Review
Amendment made: 176, page 126, line 36, at end insert—
“(5A) In carrying out the review, the Secretary of State must take into account any report published by OFCOM under section (OFCOM’s reports about news publisher content and journalistic content) (reports about news publisher content and journalistic content).”—(Paul Scully.)
This amendment ensures that the Secretary of State is required to take into account Ofcom’s reports published under NC20 when carrying out the review under clause 150.
Page 127
Amendment made: 239, page 127, line 11, leave out clause 151.—(Paul Scully.)
This amendment omits clause 151, which had introduced a new offence relating to harmful communications.
Clause 152
False communications offence
Amendments made: 138, page 128, line 22, leave out subsections (4) and (5).
This amendment leaves out material which now appears, with changes, in NC52.
Amendment 240, page 128, line 29, at end insert—
“(5A) See section (Exemptions from offence under section 152) for exemptions from the offence under this section.”—(Paul Scully.)
This amendment adds a signpost to NC52.
Clause 153
Threatening communications offence
Amendment made: 215, page 129, line 29, leave out
“maximum summary term for either-way offences”
and insert
“general limit in a magistrates’ court”.—(Paul Scully.)
This amendment relates to the maximum term of imprisonment on summary conviction of an either-way offence in England and Wales. The amendment inserts a reference to the general limit in a magistrates’ court, meaning the time limit in section 224(1) of the Sentencing Code, which, currently, is 12 months.
Clause 154
Interpretation of sections 151 to 153
Amendments made: 241, page 129, line 33, leave out “151” and insert “152”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 242, page 129, line 34, leave out “any of those sections” and insert “section 152 or 153”.
This is a technical amendment to correct a reference, taking into account NC52.
Amendment 217, page 129, line 38, after “sends” insert
“, or gives to an individual,”.
This amendment clarifies that the new communications offences cover cases of giving (a letter etc) to an individual.
Amendment 218, page 129, line 43, at end insert
“, or
(ii) given to an individual.”
This amendment clarifies that the new communications offences cover cases of causing a letter etc to be given to an individual.
Amendment 243, page 130, line 10, leave out “151” and insert “152”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 219, page 130, line 10, leave out “, transmission or publication”.
This is a technical drafting change reflecting the fact that the reference in this provision to sending a message already covers cases of transmission or publication.
Amendment 244, page 130, line 16, leave out “151 or”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 245, page 130, line 18, leave out “151” and insert “152”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 220, page 130, line 21, at end insert
“(and in this subsection “sending” includes “giving”, and “sender” is to be read accordingly)”.
This amendment ensures that references to sending in a technical provision relating to the new communications offences include giving.
Amendment 221, page 130, line 23, leave out “, transmitted or published”.
This is a technical drafting change reflecting the fact that the reference in this provision to sending a message already covers cases of transmission or publication.
Amendment 140, page 130, line 24, at end insert—
“(9A) “Recognised news publisher” has the meaning given by section 50.
(9B) “Multiplex licence” means a licence under section 8 of the Wireless Telegraphy Act 2006 which authorises the provision of a multiplex service within the meaning of section 42(6) of that Act.”—(Paul Scully.)
This amendment adds definitions of terms used in NC52.
Clause 155
Extra-territorial application and jurisdiction
Amendments made: 246, page 130, line 31, leave out “151(1),”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 222, page 130, line 32, leave out “United Kingdom person” and insert “person within subsection (2)”.
This is a technical drafting improvement resulting from the introduction of the new epilepsy trolling offence which extends to Northern Ireland as well as England and Wales (see NC53).
Amendment 223, page 130, leave out line 33 and insert
“A person is within this subsection if the person is—”.
This is a technical drafting improvement resulting from the introduction of the new epilepsy trolling offence which extends to Northern Ireland as well as England and Wales (see NC53).
Amendment 224, page 130, line 36, at end insert—
“(2A) Section (Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland)(1) applies to an act done outside the United Kingdom, but only if the act is done by a person within subsection (2B).
(2B) A person is within this subsection if the person is—
(a) an individual who is habitually resident in England and Wales or Northern Ireland, or
(b) a body incorporated or constituted under the law of England and Wales or Northern Ireland.”
This amendment provides for extra-territorial application of the offence of sending flashing images electronically under the new clause inserted by NC53.
Amendment 247, page 130, line 37, leave out “151,”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 225, page 130, line 39, at end insert—
“(4) Proceedings for an offence committed under section (Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland)(1) outside the United Kingdom may be taken, and the offence may for incidental purposes be treated as having been committed, at any place in England and Wales or Northern Ireland.
(5) This section extends to England and Wales and Northern Ireland.”—(Paul Scully.)
This amendment provides for courts in England and Wales or Northern Ireland to have jurisdiction over an offence of sending flashing images electronically (see NC53) that is committed outside the United Kingdom.
Clause 156
Liability of corporate officers
Amendments made: 248, page 130, line 41, leave out “151,”.
Clause 156 is about the liability of corporate officers etc for offences. This amendment removes a reference to clause 151 (the harmful communications offence omitted by Amendment 239).
Amendment 226, page 130, line 41, leave out “or 153” and insert
“, 153 or (Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland)”.
Clause 156 is about the liability of corporate officers etc for offences. This amendment ensures that the provision applies to the epilepsy trolling offence inserted by NC53.
Amendment 227, page 131, line 9, at end insert—
“(3) This section extends to England and Wales and Northern Ireland.”—(Paul Scully.)
This amendment states the extent of clause 156.
Clause 158
Repeals in connection with offences under sections 151, 152 and 153
Amendments made: 249, page 132, line 3, leave out from beginning to end of line 4 and insert
“Section 127(2)(a) and (b) of the Communications Act (false messages) is repealed so far as it extends”.
This amendment, together with Amendment 250, provides for the repeal of section 127(2)(a) and (b) of the Communications Act 2003 for England and Wales, but not (as previously) also the repeal of section 127(1) of that Act.
Amendment 250, page 132, line 6, leave out paragraphs (a) and (b).
This amendment, together with Amendment 249, provides for the repeal of section 127(2)(a) and (b) of the Communications Act 2003 for England and Wales, but not (as previously) also the repeal of section 127(1) of that Act.
Amendment 251, page 132, line 8, leave out subsection (2) and insert—
“(2) The following provisions of the Malicious Communications Act 1988 are repealed—
(a) section 1(1)(a)(ii),
(b) section 1(1)(a)(iii), and
(c) section 1(2).”—(Paul Scully.)
This amendment provides for the repeal of the specified provisions of the Malicious Communications Act 1988, but not (as previously) the whole of that Act.
Clause 159
Consequential amendments
Amendments made: 252, page 132, line 10, leave out “151,”.
Clause 159 introduces a Schedule of consequential amendments. This amendment omits the reference to clause 151 (consequential on the omission of clause 151 (see Amendment 239)).
Amendment 228, page 132, line 11, leave out “and 153” and insert
“, 153 and (Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland)”.—(Paul Scully.)
Clause 159 introduces a Schedule of consequential amendments. This amendment adds a reference to the new epilepsy trolling offence (see NC53).
Clause 172
Repeal of Part 4B of the Communications Act
Amendment made: 229, page 139, line 8, at end insert—
“(3) In this Act, omit—
(a) section (Amendments of Part 4B of the Communications Act), and
(b) Schedule (Amendments of Part 4B of the Communications Act).
(4) In the Audiovisual Media Services (Amendment) (EU Exit) Regulations 2020 (S.I. 2020/1536), omit regulation 4.”—(Paul Scully.)
This amendment revokes enactments which amend Part 4B of the Communications Act 2003, which is repealed by clause 172.
Clause 182
Parliamentary procedure for regulations
Amendments made: 235, page 147, line 1, at end insert—
“(ca) regulations under section (Repeal of Part 4B of the Communications Act: transitional provision etc)(2),”.
This amendment provides for the affirmative procedure to apply to regulations under the new clause inserted by NC44.
Amendment 236, page 147, line 42, at end insert—
“(da) regulations under paragraph 6B(1) of Schedule 3, or”.—(Paul Scully.)
This amendment provides for the negative procedure to apply to regulations under paragraph 6B(1) of Schedule 3 (regulations setting a date when the requirements to carry out risk assessments etc begin for providers of services currently regulated by Part 4B of the Communications Act 2003).
Clause 196
Commencement and transitional provision
Amendment made: 237, page 161, line 39, at end insert—
“(3A) Regulations under subsection (2) may not bring section 172 into force before the end of the period of six months beginning with the date specified in regulations under paragraph 6B(1) of Schedule 3.”—(Paul Scully.)
Regulations under paragraph 6B(1) of Schedule 3 will set a date when the requirements to carry out risk assessments etc begin for providers of services currently regulated by Part 4B of the Communications Act 2003. This amendment ensures that Part 4B may not be repealed until at least 6 months after the chosen date (to give providers time to do their assessments before they become subject to the safety duties).
New Schedule 2
Amendments of Part 4B of the Communications Act
“1 Part 4B of the Communications Act (video-sharing platform services) is amended in accordance with this Schedule.
2 In section 368U (maintenance of list of providers)—
(a) omit subsection (2);
(b) for subsection (3) substitute—
‘(3) OFCOM must publish the up to date list on a publicly accessible part of their website.’
3 In section 368V(4) (meaning of ‘significant differences’), for the words from ‘the determination of jurisdiction’ to the end substitute ‘whether or not the person has the required connection with the United Kingdom under section 368S(2)(d)’.
4 In section 368Y(2)(d) (information to be provided by providers of video-sharing platform services), for the words from ‘under the jurisdiction’ to the end substitute ‘subject to regulation under this Part in respect of the video-sharing platform service that P provides’.
5 In section 368Z1(3) (duty to take appropriate measures), for the words from ‘of the description’ to the end substitute ‘to monitor the information which they transmit or store, or actively to seek to discover facts or circumstances indicating illegal activity’.
6 In section 368Z10(3)(a) (power to demand information), for the words from ‘falls under’ to the end substitute ‘has the required connection with the United Kingdom under section 368S(2)(d)’.
7 For section 368Z12 (co-operation with member States and the European Commission) substitute—
‘368Z12 Co-operation with EEA States
OFCOM may co-operate with EEA states which are subject to the Audiovisual Media Services Directive, and with the national regulatory authorities of such EEA states, for the following purposes—
(a) facilitating the carrying out by OFCOM of any of their functions under this Part; or
(b) facilitating the carrying out by the national regulatory authorities of the EEA states of any of their functions in relation to video-sharing platform services under that Directive as it has effect in EU law as amended from time to time.’”—(Paul Scully.)
This new Schedule amends Part 4B of the Communications Act 2003, which regulates video-sharing platform services. The amendments, which will apply during a transitional period prior to the repeal of Part 4B, are made in connection with the United Kingdom’s exit from the European Union.
Brought up, and added to the Bill.
New Schedule 3
Video-sharing platform services: transitional provision etc
“Part 1
Interpretation
1 (1) In this Schedule, “pre-existing Part 4B service” means—
(a) an internet service which—
(i) is a video-sharing platform service by reason of the conditions in section 368S(1) and (2) of the Communications Act being met in relation to the service as a whole, and
(ii) was being provided immediately before this Schedule comes into force; or
(b) a dissociable section of an internet service, where that dissociable section—
(i) is a video-sharing platform service by reason of the conditions in section 368S(1)(a) and (2) of the Communications Act being met in relation to that dissociable section, and
(ii) was being provided immediately before this Schedule comes into force.
(2) In sub-paragraph (1), any reference to a service provided before this Schedule comes into force includes a reference to a service provided in breach of the requirement in section 368V of the Communications Act.
2 In this Schedule—
“the relevant day”, in relation to a pre-existing Part 4B service or to a service which includes a pre-existing Part 4B service, means—
(a) the date when section 172 comes into force (repeal of Part 4B of the Communications Act), or
(b) if the pre-existing Part 4B service ceases to be a video-sharing platform service before the date mentioned in paragraph (a), the date when that service ceases to be a video-sharing platform service;
“safety duties” means the duties mentioned in section 6(2), (4) and (5), except the duties set out in—
(a) section 8 (illegal content risk assessments),
(b) section 10 (children’s risk assessments),
(c) section 12 (adults’ risk assessments), and
(d) section 20(2) (records of risk assessments);
“the transitional period”, in relation to a pre-existing Part 4B service or to a service which includes a pre-existing Part 4B service, means the period—
(a) beginning with the date when this Schedule comes into force, and
(b) ending with the relevant day;
“video-sharing platform service” has the same meaning as in Part 4B of the Communications Act (see section 368S of that Act).
Part 2
During the transitional period
Pre-existing Part 4B services which are regulated user-to-user services
3 (1) This paragraph applies in relation to a pre-existing Part 4B service which—
(a) is within the definition in paragraph (a) of paragraph 1(1), and
(b) is also a regulated user-to-user service.
(2) Both this Act and Part 4B of the Communications Act apply in relation to the pre-existing Part 4B service during the transitional period.
(3) But that is subject to—
(a) sub-paragraph (4),
(b) sub-paragraph (5), and
(c) paragraph 4.
(4) The following duties and requirements under this Act do not apply during the transitional period in relation to the pre-existing Part 4B service—
(a) the safety duties;
(b) the duties set out in section 34 (fraudulent advertising);
(c) the duties set out in section 57 (user identity verification);
(d) the requirements under section 59(1) and (2) (reporting CSEA content to the NCA);
(e) the duty on OFCOM to give a notice under section 64(1) requiring information in a transparency report;
(f) the requirements to produce transparency reports under section 64(3) and (4).
(5) OFCOM’s powers under Schedule 12 to this Act (powers of entry, inspection and audit) do not apply during the transitional period in relation to the pre-existing Part 4B service.
(6) In sub-paragraph (2) the reference to this Act does not include a reference to Part 6 (fees); for the application of Part 6, see Part 3 of this Schedule.
Regulated user-to-user services that include regulated provider pornographic content
4 (1) The duties set out in section 68 of this Act do not apply during the transitional period in relation to any regulated provider pornographic content published or displayed on a pre-existing Part 4B service.
(2) In the case of a regulated user-to-user service which includes a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1), nothing in sub-paragraph (1) is to be taken to prevent the duties set out in section 68 from applying during the transitional period in relation to any regulated provider pornographic content published or displayed on any other part of the service.
(3) In this paragraph ‘regulated provider pornographic content’ and ‘published or displayed’ have the same meaning as in Part 5 of this Act (see section 66).
Pre-existing Part 4B services which form part of regulated user-to-user services
5 (1) During the transitional period, Part 4B of the Communications Act applies in relation to a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1).
(2) Sub-paragraph (3), and paragraphs 6 to 8, apply in relation to a regulated user-to-user service which includes a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1).
(3) During the transitional period, this Act applies in relation to the regulated user-to-user service with the modifications set out in paragraph 6, 7, or 8 (whichever applies).
(4) In paragraphs 6 to 8 the dissociable section of the service which is the pre-existing Part 4B service is referred to as ‘the Part 4B part’.
(5) In sub-paragraph (3) the reference to this Act does not include a reference to Part 6 (fees); for the application of Part 6, see Part 3 of this Schedule.
Regulated user-to-user services with a Part 4B part and another user-to-user part
6 (1) This paragraph applies in relation to a regulated user-to-user service described in paragraph 5(2) if the service would still be a regulated user-to-user service even if the Part 4B part were to be assumed not to be part of the service.
(2) During the transitional period—
(a) any duty or requirement mentioned in paragraph 3(4) which applies in relation to the regulated service is to be treated as applying only in relation to the rest of the service;
(b) the powers mentioned in paragraph 3(5) are to be treated as applying only in relation to the rest of the service.
(3) In this paragraph ‘the rest of the service’ means any user-to-user part of the regulated service other than the Part 4B part.
Regulated user-to-user services with a Part 4B part and a search engine
7 (1) This paragraph applies in relation to a regulated user-to-user service described in paragraph 5(2) if the service would be a regulated search service if the Part 4B part were to be assumed not to be part of the service.
(2) During the transitional period, no duty or requirement mentioned in paragraph 3(4) applies in relation to the Part 4B part of the service (but that is not to be taken to prevent any other duty or requirement under this Act from applying in relation to the search engine of the service during the transitional period).
(3) During the transitional period, the powers mentioned in paragraph 3(5) are to be treated as applying only in relation to the search engine of the service.
Regulated user-to-user services with a Part 4B part but no other user-to-user part or search engine
8 (1) This paragraph applies in relation to a regulated user-to-user service described in paragraph 5(2) if the service does not fall within paragraph 6 or 7.
(2) The duties, requirements and powers mentioned in paragraph 3(4) and (5) do not apply in relation to the regulated service during the transitional period.
Risk assessments and children’s access assessments of pre-existing Part 4B services or of services which include a pre-existing Part 4B service
9 See Part 2A of Schedule 3 for provision about—
(a) the timing of risk assessments and children’s access assessments of pre-existing Part 4B services, and
(b) modifications of Parts 1 and 2 of that Schedule in connection with risk assessments and children’s access assessments of services which include a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1).
Operation of section 368U of the Communications Act
10 During the transitional period, section 368U of the Communications Act has effect as a requirement to establish and maintain an up to date list of persons providing a video-sharing platform service to which Part 4B applies.
Video-sharing platform services which start up, or start up again, during the transitional period
11 Part 4B of the Communications Act does not apply in relation to a video-sharing platform service which is first provided on or after the date when this Schedule comes into force.
12 (1) Sub-paragraph (2) applies in relation to a pre-existing Part 4B service if—
(a) the service ceases to be a video-sharing platform service on a date within the transitional period, and
(b) the service begins again to be a video-sharing platform service on some later date within the transitional period.
(2) Part 4B of the Communications Act does not start applying again in relation to the service on the date mentioned in sub-paragraph (1)(b).
13 Paragraphs 11 and 12 apply regardless of whether, or when, a provider of a service has notified the appropriate regulatory authority in accordance with section 368V of the Communications Act.
Part 3
Application of Part 6 of this Act: fees
Introduction
14 This Part makes provision about the application of the following provisions of this Act in relation to a person who is the provider of a relevant regulated service—
(a) section 70 (duty to notify OFCOM in relation to the charging of fees);
(b) section 71 (payment of fees);
(c) Schedule 10 (additional fees).
15 In this Part ‘relevant regulated service’ means—
(a) a regulated user-to-user service which is a pre-existing Part 4B service within the definition in paragraph (a) of paragraph 1(1), or
(b) a regulated user-to-user service which includes a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1).
Application of section 70
16 (1) Sub-paragraph (2) applies in relation to a person who is the provider of a relevant regulated service, whether or not the person is the provider of any other regulated service.
(2) Section 70, which makes provision about the notification of OFCOM in relation to a charging year, applies to the provider in relation to every charging year, regardless of whether any part, or all, of a charging year falls within the transitional period.
17 (1) This paragraph applies in relation to a person who is the provider of a relevant regulated service, unless the person is an exempt provider (see paragraph 24).
(2) Sub-paragraph (3) applies in relation to the provider if—
(a) the provider is required by section 70 to give details to OFCOM of the provider’s qualifying worldwide revenue for the qualifying period that relates to a charging year,
(b) the provider gives such details in relation to that charging year at a time within the transitional period, and
(c) no regulations under section 196(2) have been made before that time specifying that section 172 is to come into force on or before the first day of that charging year.
(3) The provider’s notification under section 70 about qualifying worldwide revenue must include a breakdown indicating the amounts which are wholly referable to a relevant Part 4B service (if any).
Application of section 71: transitional charging year
18 If a person who is the provider of a relevant regulated service is an exempt provider, section 71 and Schedule 10 do not apply in relation to the provider in respect of a transitional charging year (see paragraph 23).
19 (1) If a person who is the provider of a relevant regulated service is not an exempt provider, section 71 and Schedule 10 apply in relation to the provider in respect of a transitional charging year.
(2) But sub-paragraphs (3) and (4) apply in relation to the provider in respect of a transitional charging year if the provider’s notification under section 70 in relation to that charging year has included details of amounts wholly referable to a relevant Part 4B service (as mentioned in paragraph 17(3)).
(3) For the purposes of the computation of the provider’s fee under section 71 in respect of the transitional charging year, references in that section to the provider’s qualifying worldwide revenue are to be taken to be references to the provider’s non-Part 4B qualifying worldwide revenue.
(4) OFCOM may not require the provider to pay a fee under section 71 in respect of the transitional charging year if the provider’s non-Part 4B qualifying worldwide revenue for the qualifying period that relates to that charging year is less than the threshold figure that has effect for that charging year.
(5) The amount of a provider’s ‘non-Part 4B qualifying worldwide revenue’ is the amount that would be the provider’s qualifying worldwide revenue (see section 72) if all amounts wholly referable to a relevant Part 4B service were left out of account.
Application of section 71: non-transitional charging year
20 (1) Sub-paragraph (2) applies in relation to a person who is the provider of a relevant regulated service, whether or not the person is the provider of any other regulated service.
(2) Section 71 and Schedule 10 apply without modification in relation to the provider in respect of a non-transitional charging year (even if the notification date in relation to such a charging year fell within the transitional period).
Amounts wholly referable to relevant Part 4B service
21 (1) For the purposes of this Part, OFCOM may produce a statement giving information about the circumstances in which amounts do, or do not, count as being wholly referable to a relevant Part 4B service.
(2) If OFCOM produce such a statement, they must publish it (and any revised or replacement statement).
Interpretation of this Part
22 In this Part—
“non-transitional charging year” means a charging year which is not a transitional charging year;
“notification date”, in relation to a charging year, means the latest date by which a notification under section 70 relating to that charging year is required to be given (see section 70(5));
“relevant Part 4B service” means—
(a) a regulated user-to-user service described in paragraph 15(a), or
(b) a pre-existing Part 4B service included in a regulated user-to-user service described in paragraph 15(b).
23 For the purposes of this Part a charging year is a “transitional charging year” if—
(a) the notification date in relation to that charging year fell within the transitional period, and
(b) no regulations under section 196(2) were made before the notification date specifying that section 172 was to come into force on or before the first day of that charging year.
24 (1) In this Part “exempt provider” means a person within sub-paragraph (2) or (3).
(2) A person is within this sub-paragraph if the person is the provider of only one regulated service, and that service is—
(a) a regulated user-to-user service which is a pre-existing Part 4B service within the definition in paragraph (a) of paragraph 1(1), or
(b) a regulated user-to-user service which—
(i) includes a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1), and
(ii) does not fall within paragraph 6 or 7.
(3) A person is within this sub-paragraph if the person is the provider of more than one regulated service, if each regulated service is of a kind described in sub-paragraph (2).
25 In this Part the following terms have the same meaning as in Part 6 of this Act—
“charging year”;
“qualifying period”;
“threshold figure”.
Part 4
After the end of the transitional period
Interpretation of this Part
26 In this Part of this Schedule—
(a) “the repeal time” means the time when section 172 of this Act comes into force (repeal of Part 4B of the Communications Act);
(b) (except in paragraph (a)) references to sections are to sections of the Communications Act.
27 For the purposes of this Part an investigation relating to a person begins when OFCOM notify the person to that effect.
OFCOM as appropriate regulatory authority
28 The repeal of section 368T does not affect OFCOM’s powers to act after the repeal time as the appropriate regulatory authority under Part 4B of the Communications Act as it has effect by virtue of this Part of this Schedule.
Duties of service providers to co-operate with investigations
29 The repeal of section 368Y(3)(c) (duty to co-operate) does not affect the application of that provision after the repeal time in relation to—
(a) an investigation as mentioned in section 368Z10(3)(f) begun before that time, or
(b) any demand for information for the purpose mentioned in section 368Z10(3)(i) resulting from such an investigation.
Demands for information, and enforcement of such demands
30 (1) The repeal of sections 368Y(3)(b) and 368Z10 (demands for information) does not affect the application of those provisions after the repeal time in a case in which—
(a) OFCOM require information after the repeal time for the purposes of an investigation as mentioned in section 368Z10(3)(f), and
(b) the investigation was begun before that time.
(2) The repeal of sections 368Z2, 368Z4 and 368Z10 does not affect the application of those sections after the repeal time in connection with—
(a) a failure to comply with a requirement under section 368Z10 imposed before that time, or
(b) a failure to comply with a requirement imposed after that time under section 368Z10 as it has effect in a case mentioned in sub-paragraph (1).
(3) In this paragraph—
(a) “the purposes of an investigation” include the purposes of any enforcement action or proceedings resulting from an investigation;
(b) references to sections 368Z2 and 368Z4 include references to those sections as modified by section 368Z10.
Enforcement notifications, financial penalties etc
31 (1) The repeal of sections 368W and 368Z4 (enforcement of section 368V) does not affect the application of those sections after the repeal time in a case in which OFCOM—
(a) made a determination as mentioned in section 368W(1) before that time, or
(b) began, before that time, to investigate whether they may have grounds to make such a determination.
(2) The repeal of sections 368Z2 and 368Z4 (enforcement of sections 368Y and 368Z1(6) and (7)) does not affect the application of those sections after the repeal time in a case in which OFCOM—
(a) made a determination as mentioned in section 368Z2(1) before that time, or
(b) began, before that time, to investigate whether they may have grounds to make such a determination.
(3) The repeal of sections 368Z3 and 368Z4 (enforcement of sections 368Z1(1) and (2)) does not affect the application of those sections after the repeal time in a case in which OFCOM—
(a) made a determination as mentioned in section 368Z3(1) before that time, or
(b) began, before that time, to investigate whether they may have grounds to make such a determination.
Suspension or restriction of service for contraventions or failures
32 (1) The repeal of section 368Z5 (suspension or restriction of service for contraventions or failures) does not affect the application of that section after the repeal time in a case in which OFCOM—
(a) made a determination as mentioned in section 368W(1), 368Z2(1) or 368Z3(1) before that time, or
(b) made such a determination after that time following an investigation begun before that time.
(2) The repeal of section 368Z5 does not affect the application of that section (as modified by section 368Z10) after the repeal time in a case in which—
(a) OFCOM are satisfied that a person failed to comply with a requirement under section 368Z10 imposed before that time, or
(b) OFCOM are satisfied that a person failed to comply with a requirement imposed after that time under section 368Z10 as it has effect in a case mentioned in paragraph 30(1).
(3) The repeal of sections 368Z7 (directions under sections 368Z5 and 368Z6) and 368Z8 (offence relating to such directions) does not affect the application of those sections after the repeal time in connection with a direction given under section 368Z5 as it has effect by virtue of this paragraph.”—(Paul Scully.)
Parts 2 and 3 of this new Schedule contain transitional provisions etc dealing with how services currently regulated by Part 4B of the Communications Act 2003 (“video-sharing platform services”) make the transition to regulation under the Online Safety Bill. Part 4 of this new Schedule contains saving provisions operating after the repeal of Part 4B.
Brought up, and added to the Bill.
Schedule 3
Timing of providers’ assessments
Amendment made: 238, page 175, line 11, at end insert—
“Part 2A
Pre-existing Part 4B Services
Interpretation of this Part
6A (1) In this Part, “pre-existing Part 4B service” means—
(a) an internet service which—
(i) is a video-sharing platform service by reason of the conditions in section 368S(1) and (2) of the Communications Act being met in relation to the service as a whole, and
(ii) was being provided immediately before Schedule (Video-sharing platform services: transitional provision etc) (video-sharing platform services: transitional provision etc) comes into force; or
(b) a dissociable section of an internet service, where that dissociable section—
(i) is a video-sharing platform service by reason of the conditions in section 368S(1)(a) and (2) of the Communications Act being met in relation to that dissociable section, and
(ii) was being provided immediately before Schedule (Video-sharing platform services: transitional provision etc) comes into force.
(2) In sub-paragraph (1), any reference to a service provided before Schedule (Video-sharing platform services: transitional provision etc) comes into force includes a reference to a service provided in breach of the requirement in section 368V of the Communications Act.
6B (1) In this Part, “assessment start day”, in relation to a pre-existing Part 4B service, means—
(a) the date specified in regulations made by the Secretary of State for the purposes of this Part of this Schedule, or
(b) if the pre-existing Part 4B service ceases to be a video-sharing platform service before the date specified in the regulations, the date when that service ceases to be a video-sharing platform service.
(2) But in respect of any period during which this Schedule is fully in force and no regulations under sub-paragraph (1) have yet been made, the definition in sub-paragraph (1) has effect as if—
(a) for paragraph (a) there were substituted “the date when section 172 comes into force”, and
(b) in paragraph (b), for “specified in the regulations” there were substituted “when section 172 comes into force”.
6C In this Part “video-sharing platform service” has the same meaning as in Part 4B of the Communications Act (see section 368S of that Act).
6D Any reference in this Part to the effect of Part 1 or 2 of this Schedule is a reference to the effect that Part 1 or 2 would have if this Part were disregarded.
Pre-existing Part 4B services which are regulated user-to-user services
Application of paragraphs 6F to 6H
6E (1) This paragraph and paragraphs 6F to 6H apply in relation to a pre-existing Part 4B service which—
(a) is within the definition in paragraph (a) of paragraph 6A(1), and
(b) is also a regulated user-to-user service.
(2) If the effect of Part 1 of this Schedule is that the period within which the first illegal content risk assessment or CAA of the service must be completed begins on a day before the assessment start day, the time for carrying out that assessment is extended as set out in paragraph 6F or 6G.
(3) If the effect of paragraph 6 is that the period within which the first adults’ risk assessment of the service must be completed begins on a day before the assessment start day, the time for carrying out that risk assessment is extended as set out in paragraph 6H.
(4) But paragraphs 6F to 6H do not apply if the service ceases to be a regulated user-to-user service on the assessment start day.
Illegal content risk assessments and children’s access assessments
6F (1) Sub-paragraphs (2) and (3) apply in relation to the service if, on the assessment start day, illegal content risk assessment guidance is available but the first CAA guidance has not yet been published.
(2) The first illegal content risk assessment of the service must be completed within the period of three months beginning with the assessment start day.
(3) The first CAA of the service must be completed within the period of three months beginning with the day on which the first CAA guidance is published.
6G If, on the assessment start day, illegal content risk assessment guidance and CAA guidance are both available, both of the following must be completed within the period of three months beginning with that day—
(a) the first illegal content risk assessment of the service, and
(b) the first CAA of the service.
Adults’ risk assessments
6H (1) If adults’ risk assessment guidance is available on the assessment start day, the first adults’ risk assessment of the service must be completed within the period of three months beginning with that day.
(2) If, on the assessment start day, the first adults’ risk assessment guidance has not yet been published, the first adults’ risk assessment of the service must be completed within the period of three months beginning with the day on which the first adults’ risk assessment guidance is published.
Regulated user-to-user services which include a pre-existing Part 4B service
Application of paragraphs 6J to 6N
6I (1) Paragraphs 6J to 6N make provision about the timing of assessments in the case of a regulated user-to-user service which includes a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 6A(1).
(2) In sub-paragraph (3) and paragraphs 6J to 6N—
(a) “the regulated service” means the regulated user-to-user service, and
(b) “the Part 4B part” means the pre-existing Part 4B service which is included in the regulated service.
(3) If the effect of Part 1 or paragraph 6 of this Schedule is that the period within which the first illegal content risk assessment, CAA or adults’ risk assessment of the regulated service must be completed begins on a day before the assessment start day—
(a) the time for carrying out the assessment in question in relation to the Part 4B part is extended as set out in paragraph 6J, 6K or 6L (whichever applies),
(b) Part 1 and paragraph 6 apply as set out in paragraph 6M, and
(c) paragraph 5 applies as set out in paragraph 6N.
(4) But paragraphs 6J to 6N do not apply if the service ceases to be a regulated user-to-user service on the assessment start day.
Illegal content risk assessments and children’s access assessments of Part 4B part
6J (1) Sub-paragraphs (2) and (3) apply in relation to the Part 4B part if, on the assessment start day, illegal content risk assessment guidance is available but the first CAA guidance has not yet been published.
(2) The first illegal content risk assessment of the Part 4B part must be completed within the period of three months beginning with the assessment start day.
(3) The first CAA of the Part 4B part must be completed within the period of three months beginning with the day on which the first CAA guidance is published.
6K If, on the assessment start day, illegal content risk assessment guidance and CAA guidance are both available, both of the following must be completed within the period of three months beginning with that day—
(a) an illegal content risk assessment of the Part 4B part, and
(b) a CAA of the Part 4B part.
Adults’ risk assessments of Part 4B part
6L (1) If adults’ risk assessment guidance is available on the assessment start day, an adults’ risk assessment of the Part 4B part must be completed within the period of three months beginning with that day.
(2) If, on the assessment start day, the first adults’ risk assessment guidance has not yet been published, an adults’ risk assessment of the Part 4B part must be completed within the period of three months beginning with the day on which the first adults’ risk assessment guidance is published.
Application of Part 1 and paragraph 6
6M (1) This paragraph applies in relation to—
(a) an illegal content risk assessment or a CAA of the regulated service if an assessment of that kind is due to be carried out in relation to the Part 4B part of the service in accordance with paragraph 6J or 6K;
(b) an adults’ risk assessment of the regulated service if an adults’ risk assessment is due to be carried out in relation to the Part 4B part of the service in accordance with paragraph 6L.
References in the rest of this paragraph to an illegal content risk assessment, a CAA or an adults’ risk assessment are to an assessment of that kind to which this paragraph applies.
(2) For the purposes of this paragraph—
(a) the regulated service is “type 1” if it would still be a regulated user-to-user service even if the Part 4B part were to be assumed not to be part of the service;
(b) the regulated service is “type 2” if it would be a regulated search service if the Part 4B part were to be assumed not to be part of the service;
(c) the regulated service is “type 3” if it does not fall within paragraph (a) or (b).
(3) If the regulated service is type 1, an illegal content risk assessment, a CAA or an adults’ risk assessment is to be treated as being due at the time provided for by Part 1 or paragraph 6 only in relation to the rest of the service.
(4) In sub-paragraph (3) “the rest of the service” means any user-to-user part of the regulated service other than the Part 4B part.
(5) If the regulated service is type 2—
(a) an illegal content risk assessment is not required to be carried out at the time provided for by Part 1, but that is not to be taken to prevent an illegal content risk assessment as defined by section 23 from being due in relation to the search engine of the service at the time provided for by Part 1;
(b) a CAA is to be treated as being due at the time provided for by Part 1 only in relation to the search engine of the service;
(c) an adults’ risk assessment is not required to be carried out at the time provided for by paragraph 6.
(6) If the regulated service is type 3, no illegal content risk assessment, CAA or adults’ risk assessment is required to be carried out at the time provided for by Part 1 or paragraph 6.
Application of paragraph 5
6N (1) This paragraph sets out how paragraph 5 (children’s risk assessments) is to apply if a CAA is required to be carried out in accordance with—
(a) paragraph 6J or 6K (CAA of Part 4B part of a service),
(b) paragraph 6M(3) (CAA of the rest of a service), or
(c) paragraph 6M(5)(b) (CAA of search engine of a service).
(2) The definition of “the relevant day” is to operate by reference to the CAA that was (or was required to be) carried out, and accordingly, references to the day on which the service is to be treated as likely to be accessed by children are to be read as references to the day on which the Part 4B part of the service, the rest of the service or the search engine of the service (as the case may be) is to be treated as likely to be accessed by children.
(3) References to a children’s risk assessment of the service are to a children’s risk assessment of the Part 4B part of the service, the rest of the service or the search engine of the service (as the case may be).”—(Paul Scully.)
This amendment deals with the timing of risk assessments etc to be carried out by providers of services currently regulated by Part 4B of the Communications Act 2003. The requirement to do the assessments is triggered on the date set in regulations under new paragraph 6B(1) of Schedule 3.
Schedule 13
Penalties imposed by OFCOM under Chapter 6 of Part 7
Amendment made: 230, page 212, leave out lines 13 to 18.—(Paul Scully.)
This amendment is consequential on NC42.
Schedule 14
Amendments consequential on offences in Part 10 of this Act
Amendments made: 253, page 212, line 36, at end insert—
“Football Spectators Act 1989
A1 In Schedule 1 to the Football Spectators Act 1989 (football banning orders: relevant offences), after paragraph 1(y) insert—
(z) any offence under section 152 (false communications) or 153 (threatening communications) of the Online Safety Act 2022—
(i) which does not fall within paragraph (d), (e), (m), (n), (r) or (s),
(ii) as respects which the court has stated that the offence is aggravated by hostility of any of the types mentioned in section 66(1) of the Sentencing Code (racial hostility etc), and
(iii) as respects which the court makes a declaration that the offence related to a football match, to a football organisation or to a person whom the accused knew or believed to have a prescribed connection with a football organisation.””
This amendment concerns offences relevant to the making of football banning orders. The new false and threatening communications offences under this Bill are added for that purpose.
Amendment 254, page 212, line 40, leave out paragraph (a).
This amendment has the effect of retaining a reference to section 127(1) of the Communications Act 2003 in the Sexual Offences Act 2003.
Amendment 255, page 213, leave out lines 2 and 3.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 256, page 213, line 4, leave out “63E” and insert “63D”.
This amendment is consequential on Amendment 255.
Amendment 257, page 213, line 4, leave out “that Act” and insert
“the Online Safety Act 2022”.
This amendment is consequential on Amendment 255.
Amendment 258, page 213, line 6, leave out “63F” and insert “63E”.
This amendment is consequential on Amendment 255.
Amendment 259, page 213, line 12, leave out paragraph (a).
This amendment has the effect of retaining a reference to the Malicious Communications Act 1988 in the Regulatory Enforcement and Sanctions Act 2008.
Amendment 260, page 213, line 15, leave out “151,”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 261, page 213, line 15, at end insert—
“Elections Act 2022
2A In Schedule 9 to the Elections Act 2022 (offences for purposes of Part 5), in Part 2, after paragraph 52 insert—
“Online Safety Act 2022
52A An offence under any of the following provisions of the Online Safety Act 2022—
(a) section 152 (false communications);
(b) section 153 (threatening communications);
(c) section (Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland) (sending flashing images).””
This amendment concerns offences relevant for Part 5 of the Elections Act 2022 (disqualification from holding elective office). The new false and threatening communications offences under this Bill, and the new epilepsy trolling offence (see NC53), are added for that purpose.
Amendment 233, page 214, line 23, at end insert—
“Elections Act 2022
9 In Schedule 9 to the Elections Act 2022 (offences for purposes of Part 5), after paragraph 47(f) insert—
“(i) section 66A (sending etc photograph or film of genitals).””—(Paul Scully.)
This amendment concerns offences relevant for Part 5 of the Elections Act 2022 (disqualification from holding elective office). The amendment adds a reference to the new offence (cyber-flashing) inserted into the Sexual Offences Act 2003 by clause 157 of this Bill.

Online Safety Bill (Programme) (No. 4)

21:30
The Secretary of State for Digital, Culture, Media and Sport (Michelle Donelan)

I beg to move,

That the following provisions shall apply to the Online Safety Bill for the purpose of varying and supplementing the Order of 19 April 2022 in the last session of Parliament (Online Safety Bill: Programme) as varied by the Orders of 12 July 2022 (Online Safety Bill: Programme (No.2)) and today (Online Safety Bill: Programme (No.3)).

Re-committal

(1) The Bill shall be re-committed to a Public Bill Committee in respect of the following Clauses and Schedules—

(a) in Part 3, Clauses 11 to 14, 17 to 20, 29, 45, 54 and 55 of the Bill as amended in Public Bill Committee;

(b) in Part 4, Clause 64 of, and Schedule 8 to, the Bill as amended in Public Bill Committee;

(c) in Part 7, Clauses 78, 81, 86, 89 and 112 of, and Schedule 11 to, the Bill as amended in Public Bill Committee;

(d) in Part 9, Clause 150 of the Bill as amended in Public Bill Committee;

(e) in Part 11, Clause 161 of the Bill as amended in Public Bill Committee;

(f) in Part 12, Clauses 192, 195 and 196 of the Bill as amended in Public Bill Committee;

(g) New Clause [Repeal of Part 4B of the Communications Act: transitional provision etc], if it has been added to the Bill, and New Schedule [Video-sharing platform services: transitional provision etc], if it has been added to the Bill.

Proceedings in Public Bill Committee on re-committal

(2) Proceedings in the Public Bill Committee on re-committal shall (so far as not previously concluded) be brought to a conclusion on Thursday 15 December 2022.

(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.

Consideration following re-committal and Third Reading

(4) Proceedings on Consideration following re-committal shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.

(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.

(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration following re-committal.

I know that colleagues across the House have dedicated a huge amount of time to getting the Bill to this point, especially my predecessor, my right hon. Friend the Member for Mid Bedfordshire (Ms Dorries), who unfortunately could not be with us today. I thank everybody for their contributions through the pre-legislative scrutiny and passage and for their engagement with me since I took office. Since then, the Bill has been my No. 1 priority.

Dame Margaret Hodge (Barking) (Lab)

Does the right hon. Member not agree that it is regrettable that her junior Minister—the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Sutton and Cheam (Paul Scully)—failed to acknowledge in his winding-up speech that there had been any contributions to the debate on Report from Labour Members?

Michelle Donelan

As the right hon. Member will note, the Minister had to stop at a certain point and he had spoken for 45 minutes in his opening remarks. I think that he gave a true reflection of many of the comments that were made tonight. The right hon. Member will also know that all the comments from Opposition Members are on the parliamentary record and were televised.

The sooner that we pass the Bill, the sooner we can start protecting children online. This is a groundbreaking piece of legislation that, as hon. Members have said, will need to evolve as technology changes.

Mrs Natalie Elphicke (Dover) (Con)

Will my right hon. Friend confirm that the Department will consider amendments, in relation to new clause 55, to stop the people smugglers who trade their wares on TikTok?

Michelle Donelan

I commit to my hon. Friend that we will consider those amendments and work very closely with her and other hon. Members.

We have to get this right, which is why we are adding a very short Committee stage to the Bill. We propose that there will be four sittings over two days. That is the right thing to do to allow scrutiny. It will not delay or derail the Bill, but Members deserve to discuss the changes.

With that in mind, I will briefly discuss the new changes that make recommittal necessary. Children are at the very heart of this piece of legislation. Parents, teachers, siblings and carers will look carefully at today’s proceedings, so for all those who are watching, let me be clear: not only have we kept every single protection for children intact, but we have worked with children’s organisations and parents to create new measures to protect children. Platforms will still have to shield children and young people from both illegal content and a whole range of other harmful content, including pornography, violent content and so on. However, they will also face new duties on age limits. No longer will social media companies be able to claim to ban users under 13 while quietly turning a blind eye to the estimated 1.6 million children who use their sites under age. They will also need to publish summaries of their risk assessments relating to illegal content and child safety in order to ensure that there is greater transparency for parents, and to ensure that the voice of children is injected directly into the Bill, Ofcom will consult the Children’s Commissioner in the development of codes of practice.

These changes, which come on top of all the original child protection measures in the Bill, are completely separate from the changes that we have made in respect of adults. For many people, myself included, the so-called “legal but harmful” provisions in the Bill prompted concerns. They would have meant that the Government were creating a quasi-legal category—a grey area—and would have raised the very real risk that to avoid sanctions, platforms would carry out sweeping take-downs of content, including legitimate posts, eroding free speech in the process.

Sarah Jones (Croydon Central) (Lab)

Will the Secretary of State join me in commending the work of the all-party parliamentary group against antisemitism? Does she agree with the group, and with us, that by removing parts of the Bill we are allowing the kind of holocaust denial that we all abhor to continue online?

Michelle Donelan

I have worked very closely with a range of groups backing the causes that the hon. Lady mentions in relation to cracking down on antisemitism, including the Board of Deputies, the Antisemitism Policy Trust and members of the APPG. [Hon. Members: “They don’t back it.”] They do indeed back the Bill. They have said that it is vital that we progress this further. We have adopted their clause in relation to breach notifications, to increase transparency, and we have injected a triple shield that will ensure that antisemitism does not remain on these platforms.

I return to the concerns around “legal but harmful”. Worryingly, it meant that users could run out of road. If a platform allowed legal but harmful material, users would therefore face a binary choice between not using the platform at all or facing abuse and harm that they did not want to see. We, however, have added a third shield that transfers power away from silicon valley algorithms to ordinary people. Our new triple shield mechanism puts accountability, transparency and choice at the heart of the way we interact with each other online. If it is illegal, it has to go. If it violates a company’s terms and conditions, it has to go. Under the third and final layer of the triple shield, platforms must offer users tools to allow them to choose what kind of content they want to see and engage with.

These are significant changes that I know are of great interest to hon. Members. As they were not in scope on Report, I propose that we recommit a selection of clauses for debate by a Public Bill Committee in a very short Committee stage, so that this House of Commons can scrutinise them line by line.

I assure hon. Members that the Bill is my absolute top priority. We are working closely with both Houses to ensure that it completes the remainder of its passage and reaches Royal Assent by the end of this parliamentary Session. It is absolutely essential that we get proper scrutiny. I commend the motion to the House.

21:37
Lucy Powell (Manchester Central) (Lab/Co-op)

There has been long-standing consensus since the Bill was first mooted more than four years ago—before anyone had even heard of TikTok—that online and social media needed regulating. Despite our concerns about both the previous drafting and the new amendments, we support the principle of the Online Safety Bill, but I take issue with the Secretary of State’s arguments today. [Interruption.] I think the hon. Member for Peterborough (Paul Bristow) is trying to correct my language from a sedentary position. Perhaps he wants to listen to the argument instead, because what he and the Secretary of State are doing today will take the Bill a massive step backwards, not forwards.

The consensus has not just been about protecting children online, although of course that is a vital part of the Bill; it is also about the need to tackle the harms that these powerful platforms present when they go unmitigated. As we have heard this evening, there is a cross-party desire to strengthen and broaden the Bill, not water it down, as we are now hearing. Alas, we are not there.

This is not a perfect Bill and was never going to be, but even since the last delay before the summer, we have had the coroner’s inquest into the tragic Molly Russell case, Russian disinformation campaigns and the takeover and ongoing implosion of Twitter. Yet the Government are now putting the entire Bill at risk. It has already been carried over once, so if we do not complete its passage before the end of this parliamentary Session, it will fall completely. The latest hold-up is to enable the Government to remove “legal but harmful” clauses. This goes against the very essence of the Bill, which was created to address the particular power of social media to share, to spread and to broadcast around the world very quickly.

Mr Peter Bone (Wellingborough) (Con)

I understand the shadow Minister’s concern about what the Government are trying to do, but I do not understand why she is speaking against a programme motion that gives the Opposition more time to scrutinise the Bill. It must be the first time I have heard a member of the Opposition demand less time in which to scrutinise a Bill.

Lucy Powell

I shall come on to that. It is we, on the Opposition side of the House, who are so determined to get the Bill on to the statute book that I find myself arguing against the Government’s further delay. Let us not forget that six months have passed between the first day on Report and the second, today—the longest ever gap between two days of Report in the history of the House—so it is delay after delay.

Disinformation, abuse, incel gangs, body shaming, covid denial, holocaust denial, scammers—the list goes on, all of it actively encouraged by unregulated engagement algorithms and business models that reward sensational, extreme, controversial and abusive behaviour. It is these powers and models that need regulating, for individuals on the receiving end of harm but also to deal with harms to society, democracy and our economy. The enormous number of amendments that have been tabled in the last week should be scrutinised, but we now face a real trade-off between the Bill not passing through the other place in time and the provision of more scrutiny. As I told the Secretary of State a couple of weeks ago in private, our judgment is this: get the Bill to the other place as soon as possible, and we will scrutinise it there.

Sara Britcliffe (Hyndburn) (Con)

Does the hon. Lady agree that what the Labour party did was initiate a vote of no confidence in the Prime Minister rather than making progress with the Bill—which she says is so important—at the time when it was needed?

Lucy Powell

The hon. Lady remembers incorrectly. It was members of her own party who tabled the motion of no confidence. Oh, I have just remembered: they did not have confidence in the Prime Minister at the time, did they? We have had two Prime Ministers since then, so I am not sure that they have much confidence—[Interruption.]

Lucy Powell

I will move on now, thank you.

We would not have been here at all if the Secretary of State had stuck to the guns of her predecessor, who, to be fair to her—I know she is not here today—saw off a raft of vested interests to enable the Bill to progress. The right hon. Member for Mid Bedfordshire (Ms Dorries) understood that this is not about thwarting the right to hold views that most of us find abhorrent, but about not allowing those views to be widely shared on a powerful platform that, in the offline world, just does not exist. She understood that the Online Safety Bill came from a fundamental recognition that the algorithms and the power of platforms push people towards content that, although it may not be illegal on its own, cumulatively causes significant harm. Replacing the prevention of harm with an emphasis on free speech lets the platforms off the hook, and the absence of duties to prevent harm and dangerous outcomes will allow them to focus on weak user controls.

Simply holding platforms to account for their own terms and conditions—the Secretary of State referred to that earlier—which, as we saw just this week at Twitter, can be rewritten or changed at whim, will not constitute robust enough regulation to deal with the threat that these platforms present. To protect children, the Government are relying on age verification, but as those with teenage children are well aware—including many of us in the House—most of them pass themselves off as older than they are, and verification is easy to get around. The proposed three shields for adults are just not workable and do not hold up to scrutiny. Let us be clear that the raft of new amendments that have been tabled by the Government this week are nothing more than a major weakening and narrowing of this long-awaited legislation.

This is not what Labour would do. We would tackle at root the power of the platforms to negatively shape all our lives. But we are where we are, and it is better to have the regulator in place with some powers than to have nothing at all. I fear that adding more weeks in Committee in the Commons, having already spent years and years debating this Bill, will not make it any better anyway. Going back into Committee is an unprecedented step, and where might that end? What is to prevent another new Minister or Secretary of State from changing their mind again in the new year, or to prevent there being another reshuffle or even another Prime Minister? That might happen! This is a complex and important Bill, but it is also long, long overdue. We therefore support the original programme motion to get the Bill into the other place immediately, and we will not be voting to put the Bill back into Committee.

21:45
John Nicolson (Ochil and South Perthshire) (SNP)

My hon. Friend the Member for Aberdeen North (Kirsty Blackman) would have been speaking in this debate but she is indisposed, so I am delighted to offer some of her bons mots to the House. The effect of this motion is to revive the Third Reading debate that was previously programmed to take place immediately after the Report stage ended. It is fair to say that there has been a bit of chaos in the UK Government in recent times, with a disastrous yet thankfully short prime ministerial period when it looked as if the Online Safety Bill might be scrapped altogether. We on the SNP Benches are glad to see the Bill return to finish its Report stage. Although we are not entirely happy with the contents of the Bill—as Members can see by the number of amendments we had rejected in Committee and the number of amendments we tabled on Report today—we strongly believe that this version is better than the version the Government are proposing to create by recommitting the Bill later today. If this programme motion were to fall, the Government might not be able to recommit the Bill.

During the progress of both the legislative and pre-legislative stages of the Bill, as well as in the Digital, Culture, Media and Sport Committee, we have heard from survivors who have been permanently scarred as a result of so-called legal but harmful content. We have heard from families whose loved ones have died as a result of accessing this content, as Members around the House well know. It is surely imperative that action is taken; otherwise, we will see more young people at risk. Having protections in place for children is a good step forward, but it is not sufficient. Therefore we will be voting against this programme motion, which creates the conditions for recommitting a Bill that—as I well know, having sat through it—has already had 50 hours of Committee scrutiny and countless hours in the pre-legislative Joint Committee.

21:48
Michelle Donelan

With the leave of the House, in making my closing remarks, I want to remind all Members and all those watching these proceedings exactly why we are here today. The children and families who have had their lives irreparably damaged by social media giants need to know that we are on their side, and that includes the families who sat in the Gallery here today and who I had the opportunity to talk to. I want to take this opportunity to pay tribute to the work they have done, including Ian Russell. They have shone a spotlight and campaigned on this issue. As many Members will know, in 2017, Ian’s 14-year-old daughter Molly took her own life after being bombarded by self-harm content on Instagram and Pinterest. She was a young and innocent girl.

To prevent other families from going through this horrendous ordeal, we must all move the Bill forward together. And we must work together to get the Bill on the statute book as soon as possible by making sure this historic legislation gets the proper scrutiny it deserves, so that we can start protecting children and young people online while also empowering adults.

For too long, the fierce debate surrounding the Bill has been framed by an assumption that protecting children online must come at the expense of free speech for adults. Today we can put an end to this dispute once and for all. Our common-sense amendments to the Bill overcome these barriers by strengthening the protections for children while simultaneously protecting free speech and choice for adults.

However, it is right that the House is allowed to scrutinise these changes in Committee, which is why we need to recommit a selection of clauses for a very short Committee stage. This will not, as the Opposition suggest, put the Bill at risk. I think it is really wrong to make such an assertion. As well as being deeply upsetting to the families who visited us this evening, it is a low blow by the Opposition to play politics with such an important Bill.

We will ensure the Bill completes all stages by the end of this Session, and we need to work together to ensure that children come first. We can then move the Bill forward, so that we can start holding tech companies to account for their actions and finally stop them putting profits before people and before our children.

Question put.

21:51

Division 110

Ayes: 314

Noes: 216

ONLINE SAFETY BILL (First sitting)

Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022 (13 Dec 2022)
The Committee consisted of the following Members:
Chairs: Dame Angela Eagle, † Sir Roger Gale
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Bhatti, Saqib (Meriden) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Bonnar, Steven (Coatbridge, Chryston and Bellshill) (SNP)
Bristow, Paul (Peterborough) (Con)
† Collins, Damian (Folkestone and Hythe) (Con)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Fletcher, Nick (Don Valley) (Con)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Maclean, Rachel (Redditch) (Con)
† Nichols, Charlotte (Warrington North) (Lab)
† Owen, Sarah (Luton North) (Lab)
Peacock, Stephanie (Barnsley East) (Lab)
† Russell, Dean (Watford) (Con)
† Scully, Paul (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Wood, Mike (Dudley South) (Con)
Kevin Maddison, Bethan Harding, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 13 December 2022
(Morning)
[Sir Roger Gale in the Chair]
Online Safety Bill
(Re-committed Clauses and Schedules: Clauses 11 to 14, 18 to 21, 30, 46, 55 and 65, Schedule 8, Clauses 79 and 82, Schedule 11, Clauses 87, 90, 115, 169 and 183, Schedule 17, Clauses 203, 206 and 207, new Clauses and new Schedules)
09:25
The Chair

Good morning, ladies and gentlemen. We are sitting in public and the proceedings are being broadcast. I have a few preliminary announcements. Hansard would be grateful if Members would provide speaking notes as and when they finish with them. Please also ensure that all mobile phones and other electronic devices are switched off.

I had better apologise at the start for the temperature in the room. For reasons that none of us can understand, two of the windows were left wide open all night. However, the Minister’s private office has managed to sort that out, because that is what it is there for. Hopefully the room will warm up. If any hon. Member has a problem with the temperature, they will have to tell the Chair. If necessary, I will suspend, but we are made of tough stuff, so we will try to bat on if we can.

We will first consider the programme motion, which can be debated for up to half an hour. I call the Minister to move the motion standing in his name, which was discussed yesterday by the Programming Sub-Committee.

Ordered,

That—

(1) the Committee shall (in addition to its first meeting at 9.25 am on Tuesday 13 December) meet—

(a) at 2.00 pm on Tuesday 13 December;

(b) at 11.30 am and 2.00 pm on Thursday 15 December;

(2) the proceedings shall be taken in the following order: Clauses 11 to 14, 18 to 21, 30, 46, 55, 56 and 65; Schedule 8; Clauses 79 and 82; Schedule 11; Clauses 87, 90, 115, 155, 169 and 183; Schedule 17; Clauses 203, 206 and 207; new Clauses; new Schedules; remaining proceedings on the Bill;

(3) the proceedings shall (so far as not previously concluded) be brought to a conclusion at 5.00 pm on Thursday 15 December. —(Paul Scully.)

Resolved,

That, subject to the discretion of the Chair, any written evidence received by the Committee shall be reported to the House for publication.—(Paul Scully.)

The Chair

We now begin line-by-line consideration of the Bill. Owing to the unusual nature of today’s proceedings on recommittal, which is exceptional, I need to make a few points.

Only the recommitted clauses and schedules, and amendments and new clauses relating to them, are in scope for consideration. The selection list, which has been circulated to Members and is available in the room, outlines which clauses and schedules those are. Any clause or schedule not on the list is not in scope for discussion. Basically, that means that we cannot have another Second Reading debate. Moreover, we cannot have a further debate on any issues that have been debated already on Report on the Floor of the House. As I say, this is unusual; in fact, I think it is probably a precedent—“Erskine May” will no doubt wish to take note.

The selection list also shows the selected amendments and how they have been grouped. Colleagues will by now be aware that we group amendments by subject for debate. They are not necessarily voted on at the time of the completion of the debate on that group, but as we reach their position in the Bill. Do not panic; we have expert advice to ensure that we do not miss anything—at least, I hope we have.

Finally, only the lead amendment is decided on at the end of the debate. If a Member wishes to move any other amendment in the group, please let the Chair know. Dame Angela or I will not necessarily select it for a Division, but we need to know if Members wish to press it to one. Otherwise, there will be no Division on the non-lead amendments.

Clause 11

Safety duties protecting children

Kirsty Blackman (Aberdeen North) (SNP)

I beg to move amendment 98, in clause 11, page 10, line 17, at end insert

“, and—

(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”

This amendment requires services to take or use proportionate measures to mitigate the harm to children caused by habit-forming features of a service.

Thank you, Sir Roger, for chairing this recommitted Bill Committee. I will not say that it is nice to be back discussing the Bill again; we had all hoped to have made more progress by now. If you will indulge me for a second, I would like to thank the Clerks, who have been massively helpful in ensuring that this quick turnaround could happen and that we could table the amendments in a sensible form.

Amendment 98 arose from comments and evidence from the Royal College of Psychiatrists highlighting that a number of platforms, and particularly social media platforms such as TikTok and Facebook, generally encourage habit-forming behaviour or have algorithms that encourage it. Such companies are there to make money—that is what companies do—so they want people to linger on their sites and to spend as much time there as possible.

I do not know how many hon. Members have spent time on TikTok, but if they do, and they enjoy some of the cat videos, for instance, the algorithm will know and will show them more videos of cats. They will sit there and think, “Gosh, where did the last half-hour go? I have been watching any number of 20-second videos about cats, because they constantly come up.” Social media sites work by encouraging people to linger on the site and to spend the time dawdling and looking at the advertisements, which make the company additional revenue.

That is good for capitalism and for the company’s ability to make money but the issue, particularly in relation to clause 11, is how that affects children. Children may not have the necessary filters; they may not have the ability that we have to put our phones down—not that we always manage to do so. That ability and decision-making process may not be as refined in children as in adults. Children can be sucked into the platforms by watching videos of cats or of something far more harmful.

Sarah Owen (Luton North) (Lab)

The hon. Member makes an excellent point about TikTok, but it also applies to YouTube. The platforms’ addictive nature has to do with the content. A platform does not just show a person a video of a cat, because that will not keep them hooked for half an hour. It has to show them a cat doing something extraordinary, and then a cat doing something even more extraordinary. That is why vulnerable people, especially children, get sucked into a dark hole. They click to see not just the same video but something more exciting, and then something even more exciting. That is the addictive nature of this.

Kirsty Blackman

That is absolutely the case. We are talking about cats because I chose them to illustrate the situation, but people may look at content about healthy eating, and that moves on to content that encourages them to be sick. The way the algorithms step it up is insidious; they get more and more extreme, so that the linger time is increased and people do not get bored. It is important that platforms look specifically at their habit-forming features.

Charlotte Nichols (Warrington North) (Lab)

A specific case on the platform TikTok relates to a misogynist who goes by the name of Andrew Tate, who has been banned from a number of social media platforms. However, because TikTok works by making clips shorter, which makes it more difficult for the company to identify some of this behaviour among users, young boys looking for videos of things that might interest them were very quickly shown misogynist content from Andrew Tate. Because they watched one video of him, they were then shown more and more. It is easy to see how the habit-forming behaviours built into platforms’ algorithms, which the hon. Lady identifies, can also be a means of quickly radicalising children into extreme ideologies.

The Chair

Order. I think we have the message. I have to say to all hon. Members that interventions are interventions, not speeches. If Members wish to make speeches, there is plenty of time.

Kirsty Blackman

Thank you, Sir Roger. I absolutely agree with the hon. Member for Warrington North. The platform works by stitching things together, so a video could have a bit of somebody else’s video in it, and that content ends up being shared and disseminated more widely.

This is not an attack on every algorithm. I am delighted to see lots of videos of cats—it is wonderful, and it suits me down to the ground—but the amendment asks platforms to analyse how those processes contribute to the development of habit-forming behaviour and to mitigate the harm caused to children by habit-forming features in the service. It is not saying, “You can’t use algorithms” or “You can’t use anything that may encourage people to linger on your site.” The specific issue is addiction—the fact that people will get sucked in and stay on platforms for hours longer than is healthy.

There is a demographic divide here. There is a significant issue when we compare children whose parents are engaged in these issues and spend time—and have the time to spend—assisting them to use the internet. There is a divide between the experiences of those children online and the experiences of children who are generally not nearly as well off, whose parents may be working two or three jobs to try to keep their homes warm and keep food on the table, so the level of supervision those children have may be far lower. We have a parental education gap, where parents are not able to instruct or teach their children a sensible way to use these things. A lot of parents have not used things such as TikTok and do not know how it works, so they are unable to teach their children.

Kim Leadbeater (Batley and Spen) (Lab)

Does the hon. Lady agree that this feeds into the problem we have with the lack of a digital media literacy strategy in the Bill, which we have, sadly, had to accept? However, that makes it even more important that we protect children wherever we have the opportunity to do so, and this amendment is a good example of where we can do that.

Kirsty Blackman

The hon. Lady makes an excellent point. This is not about mandating that platforms stop doing these things; it is about ensuring that they take this issue into account and that they agree—or that we as legislators agree—with the Royal College of Psychiatrists that we have a responsibility to tackle it. We have a responsibility to ask Ofcom to tackle it with platforms.

This comes back to the fact that we do not have a user advocacy panel, and groups representing children are not able to bring emerging issues forward adequately and effectively. Because of the many other inadequacies in the Bill, that is even more important than it was. I assume the Minister will not accept my amendment—that generally does not happen in Bill Committees—but if he does not, it would be helpful if he could give Ofcom some sort of direction of travel so that it knows it should take this issue into consideration when it deals with platforms. Ofcom should be talking to platforms about habit-forming features and considering the addictive nature of these things; it should be doing what it can to protect children. This threat has emerged only in recent years, and things will not get any better unless we take action.

Alex Davies-Jones (Pontypridd) (Lab)

It is a privilege to see you back in the Chair for round 2 of the Bill Committee, Sir Roger. It feels slightly like déjà vu to return to line-by-line scrutiny of the Bill, which, as you said, Sir Roger, is quite unusual and unprecedented. Seeing this Bill through Committee is the Christmas gift that keeps on giving. Sequels are rarely better than the original, but we will give it a go. I have made no secret of my plans, and my thoughts on the Minister’s plans, to bring forward significant changes to the Bill, which has already been long delayed. I am grateful that, as we progress through Committee, I will have the opportunity to put on record once again some of Labour’s long-held concerns with the direction of the Bill.

I will touch briefly on clause 11 specifically before addressing the amendments to the clause. Clause 11 covers safety duties to protect children, and it is a key part of the Bill—indeed, it is the key reason many of us have taken a keen interest in online safety more widely. Many of us, on both sides of the House, have been united in our frustrations with the business models of platform providers and search engines, which have paid little regard to the safety of children over the years in which the internet has expanded rapidly.

That is why Labour has worked with the Government. We want to see the legislation get over the line, and we recognise—as I have said in Committee previously—that the world is watching, so we need to get this right. The previous Minister characterised the social media platforms and providers as entirely driven by finance, but safety must be the No. 1 priority. Labour believes that that must apply to both adults and children, but that is an issue for debate on a subsequent clause, so I will keep my comments on this clause brief.

The clause and Government amendments 1, 2 and 3 address the thorny issue of age assurance measures. Labour has been clear that we have concerns that the Government are relying heavily on the ability of social media companies to distinguish between adults and children, but age verification processes remain fairly complex, and that clearly needs addressing. Indeed, Ofcom’s own research found that a third of children have false social media accounts claiming to be aged over 18. This is an area we certainly need to get right.

I am grateful to the many stakeholders, charities and groups working in this area. There are far too many to mention, but a special shout-out should go to Iain Corby from the Age Verification Providers Association, along with colleagues at the Centre to End All Sexual Exploitation and Barnardo’s, and the esteemed John Carr. They have all provided extremely useful briefings for my team and me as we have attempted to unpick this extremely complicated part of the Bill.

We accept that there are effective age checks out there, and many have substantial anti-evasion mechanisms, but it is the frustrating reality that this is the road the Government have decided to go down. As we have repeatedly placed on the record, the Government should have retained the “legal but harmful” provisions that were promised in the earlier iteration of the Bill. Despite that, we are where we are.

I will therefore put on the record some brief comments on the range of amendments on this clause. First, with your permission, Sir Roger, I will speak to amendments 98, 99—

The Chair

Order. No, you cannot. I am sorry. I am perfectly willing to allow—the hon. Lady has already done this—a stand part debate at the start of a group of selections, rather than at the end, but she cannot have it both ways. I equally understand the desire of an Opposition Front Bencher to make some opening remarks, which is perfectly in order. With respect, however, you may not then go through all the other amendments. We are dealing now with amendment 98. If the hon. Lady can confine her remarks to that at this stage, that would be helpful.

Alex Davies-Jones

Of course, Sir Roger. Without addressing the other amendments, I would like us to move away from the overly content-focused approach that the Government seem intent on taking in the Bill more widely. I will leave my comments there on the SNP amendment, but we support our SNP colleagues on it.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)

It is a pleasure to serve under your chairmanship, Sir Roger.

Being online can be a hugely positive experience for children and young people, but we recognise the challenge of habit-forming behaviour or designed addiction to some digital services. The Bill as drafted, however, would already deliver the intent of the amendment from the hon. Member for Aberdeen North. If service providers identify in their risk assessment that habit-forming or addictive-behaviour risks cause significant harm to an appreciable number of children on a service, the Bill will require them to put in place measures to mitigate and manage that risk under clause 11(2)(a).

To meet the child safety risk assessment duties under clause 10, services must assess the risk of harm to children from the different ways in which the service is used; the impact of such use; the level of risk of harm to children; how the design and operation of the service may increase the risks identified; and the functionalities that facilitate the presence or dissemination of content of harm to children. The definition of “functionality” at clause 200 already includes an expression of a view on content, such as applying a “like” or “dislike” button, as at subsection (2)(f)(i).

Sarah Owen

I thank the Minister for giving way so early on. He mentioned an “appreciable number”. Will he clarify what that is? Is it one, 10, 100 or 1,000?

Paul Scully

I do not think that a single number can be put on that, because it depends on the platform and the type of viewing. It is not easy to put a single number on that. An “appreciable number” is basically as identified by Ofcom, which will be the arbiter of all this. It comes back to what the hon. Member for Aberdeen North said about the direction that we, as she rightly said, want to give Ofcom. Ofcom has a range of powers already to help it assess whether companies are fulfilling their duties, including the power to require information about the operation of their algorithms. I would set the direction that the hon. Lady is looking for, to ensure that Ofcom uses those powers to the fullest and can look at the algorithms. We should bear in mind that social media platforms face criminal liability if they do not supply the information required by Ofcom to look under the bonnet.

Kirsty Blackman

If platforms do not recognise that they have an issue with habit-forming features, even though we know they have, will Ofcom say to them, “Your risk assessment is insufficient. We know that the habit-forming features are really causing a problem for children”?

09:45
Paul Scully

We do not want to wait for the Bill’s implementation to start those conversations with the platforms. We expect companies to be transparent about their design practices that encourage extended engagement and to engage with researchers to understand the impact of those practices on their users.

The child safety duties in clause 11 apply across all areas of a service, including the way it is operated and used by children and the content present on the service. Subsection (4)(b) specifically requires services to consider the

“design of functionalities, algorithms and other features”

when complying with the child safety duties. Given the direction I have suggested that Ofcom has, and the range of powers that it will already have under the Bill, I am unable to accept the hon. Member’s amendment, and I hope she will therefore withdraw it.

Kirsty Blackman

I would have preferred it had the Minister been slightly more explicit that habit-forming features are harmful. That would have been slightly more helpful.

Paul Scully

I will say that habit-forming features can be harmful.

Kirsty Blackman

I thank the Minister. Absolutely—they are not always harmful. With that clarification, I am happy to beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Paul Scully

I beg to move amendment 1, in clause 11, page 10, line 22, leave out

“, or another means of age assurance”.

This amendment omits words which are no longer necessary in subsection (3)(a) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.

The Chair

With this it will be convenient to discuss Government amendments 2 and 3.

Paul Scully

The Bill’s key objective, above everything else, is the safety of young people online. That is why the strongest protections in the Bill are for children. Providers of services that are likely to be accessed by children will need to provide safety measures to protect child users from harmful content, such as pornography, and from behaviour such as bullying. We expect companies to use age verification technologies to prevent children from accessing services that pose the highest risk of harm to them, and age assurance technologies and other measures to provide children with an age-appropriate experience.

The previous version of the Bill already focused on protecting children, but the Government are clear that the Bill must do more to achieve that and to ensure that the requirements on providers are as clear as possible. That is why we are strengthening the Bill and clarifying the responsibilities of providers to provide age-appropriate protections for children online. We are making it explicit that providers may need to use age assurance to identify the age of their users in order to meet the child safety duties for user-to-user services.

The Bill already set out that age assurance may be required to protect children from harmful content and activity, as part of meeting the duty in clause 11(3), but the Bill will now clarify that it may also be needed to meet the wider duty in subsection (2) to

“mitigate and manage the risks of harm to children”

and to manage

“the impact of harm to children”

on such services. That is important so that only children who are old enough are able to use functionalities on a service that poses a risk of harm to younger children. The changes will also ensure that children are signposted to support that is appropriate to their age if they have experienced harm. For those reasons, I recommend that the Committee accepts the amendments.

Alex Davies-Jones

I have a few questions regarding amendments 1 to 3, which as I mentioned relate to the thorny issue of age verification and age assurance, and I hope the Minister can clarify some of them.

We are unclear about why, in subsection (3)(a), the Government have retained the phrase

“for example, by using age verification, or another means of age assurance”.

Can that difference in wording be taken as confirmation that the Government want harder forms of age verification for primary priority content? The Minister will be aware that many in the sector are unclear about what that harder form of age verification may look like, so some clarity would be useful for all of us in the room and for those watching.

In addition, we would like to clarify the Minister’s understanding of the distinction between age verification and age assurance. They are very different concepts in reality, so we would appreciate it if he could be clear, particularly when we consider the different types of harm that the Bill will address and protect people from, how that will be achieved and what technology will be required for different types of platform and content. I look forward to clarity from the Minister on that point.

Paul Scully

That is a good point. In essence, age verification is the hard access to a service. Age assurance ensures that the person who uses the service is the same person whose age was verified. Someone could use their parent’s debit card or something like that, so it is not necessarily the same person using the service right the way through. If we are to protect children, in particular, we have to ensure that we know there is a child at the other end whom we can protect from the harm that they may see.

On the different technologies, we are clear that our approach to age assurance or verification is not technology-specific. Why? Because otherwise the Bill would be out of date within around six months. By the time the legislation was fully implemented it would clearly be out of date. That is why it is incumbent on the companies to be clear about the technology and processes they use. That information will be kept up to date, and Ofcom can then look at it.

The Chair

The Minister leapt to his feet before I had the opportunity to call any other Members. I call Kirsty Blackman.

Kirsty Blackman

Thank you, Sir Roger. It was helpful to hear the Minister’s clarification of age assurance and age verification, and it was useful for him to put on the record the difference between the two.

I have a couple of points. In respect of Ofcom keeping up to date with the types of age verification and the processes, new ones will come through and excellent new methods will appear in coming years. I welcome the Minister’s suggestion that Ofcom will keep up to date with that, because it is incredibly important that we do not rely on, say, the one provider that there is currently, when really good methods could come out. We need the legislation to ensure that we get the best possible service and the best possible verification to keep children away from content that is inappropriate for them.

This is one of the most important parts of the Bill for ensuring that we can continue to have adult sections of the internet—places where there is content that would be disturbing for children, as well as for some adults—and that an age-verification system is in place to ensure that that content can continue to be there. Websites that require a subscription, such as OnlyFans, need to continue to have in place the age-verification systems that they currently have. By writing into legislation the requirement for them to continue to have such systems in place, we can ensure that children cannot access such services but adults can continue to do so. This is not about what is banned online or about trying to make sure that this content does not exist anywhere; it is specifically about gatekeeping to ensure that no child, as far as we can possibly manage, can access content that is inappropriate for kids.

There was a briefing recently on children’s access to pornography, and we heard horrendous stories. It is horrendous that a significant number of children have seen inappropriate content online, and the damage that that has caused to so many young people cannot be overstated. Blocking access to adult parts of the internet is so important for the next generation, not just so that children are not disturbed by the content they see, but so that they learn that it is not okay and normal and understand that the depictions of relationships in pornography are not the way reality works, not the way reality should work and not how women should be treated. Having a situation in which Ofcom or anybody else is better able to take action to ensure that adult content is specifically accessed only by adults is really important for the protection of children and for protecting the next generation and their attitudes, particularly towards sex and relationships.

Rachel Maclean (Redditch) (Con)

I wish to add some brief words in support of the Government’s proposals and to build on the comments from Members of all parties.

We know that access to extreme and abusive pornography is a direct factor in violence against women and girls. We see that play out in the court system every day. People claim to have watched and become addicted to this type of pornography; they are put on trial because they seek to play that out in their relationships, which has resulted in the deaths of women. The platforms already have technology that allows them to figure out the age of people on their platforms. The Bill seeks to ensure that they use that for a good end, so I thoroughly support it. I thank the Minister.

Damian Collins (Folkestone and Hythe) (Con)

There are two very important and distinct issues here. One is age verification. The platforms ask adults who have identification to verify their age; if they cannot verify their age, they cannot access the service. Platforms have a choice within that. They can design their service so that it does not have adult content, in which case they may not need to build in verification systems—the platform polices itself. However, a platform such as Twitter, which allows adult content on an app that is open to children, has to build in those systems. As the hon. Member for Aberdeen North mentioned, people will also have to verify their identity to access a service such as OnlyFans, which is an adult-only service.

Kirsty Blackman

On that specific point, I searched on Twitter for the name—first name and surname—of a politician to see what people had been saying, because I knew that he was in the news. The pictures that I saw! That was purely by searching for the name of the politician; it is not as though people are necessarily seeking such stuff out.

Damian Collins

On these platforms, the age verification requirements are clear: they must age-gate the adult content or get rid of it. They must do one or the other. Rightly, the Bill does not specify technologies. Technologies are available. The point is that a company must demonstrate that it is using an existing and available technology or that it has some other policy in place to remedy the issue. It has a choice, but it cannot do nothing. It cannot say that it does not have a policy on it.

Age assurance is always more difficult for children, because they do not have the same sort of ID that adults have. However, technologies exist: for instance, Yoti uses facial scanning. Companies do not have to do that either; they have to demonstrate that they do something beyond self-certification at the point of signing up. That is right. Companies may also demonstrate what they do to take robust action to close the accounts of children they have identified on their platforms.

If a company’s terms of service state that people must be 13 or over to use the platform, the company is inherently stating that the platform is not safe for someone under 13. What does it do to identify people who sign up? What does it do to identify people once they are on the platform, and what action does it then take? The Bill gives Ofcom the powers to understand those things and to force a change of behaviour and action. That is why—to the point made by the hon. Member for Pontypridd—age assurance is a slightly broader term, but companies can still extract a lot of information to determine the likely age of a child and take the appropriate action.

Paul Scully

I think we are all in agreement, and I hope that the Committee will accept the amendments.

Amendment 1 agreed to.

Amendments made: 2, in clause 11, page 10, line 25, leave out

“(for example, by using age assurance)”.

This amendment omits words which are no longer necessary in subsection (3)(b) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.

Amendment 3, in clause 11, page 10, line 26, at end insert—

“(3A) Age assurance to identify who is a child user or which age group a child user is in is an example of a measure which may be taken or used (among others) for the purpose of compliance with a duty set out in subsection (2) or (3).”—(Paul Scully.)

This amendment makes it clear that age assurance measures may be used to comply with duties in clause 11(2) as well as (3) (safety duties protecting children).

Kirsty Blackman

I beg to move amendment 99, in clause 11, page 10, line 34, leave out paragraph (d) and insert—

“(d) policies on user access to the service, parts of the service, or to particular content present on the service, including blocking users from accessing the service, parts of the service, or particular content,”.

This amendment is intended to make clear that if it is proportionate to do so, services should have policies that include blocking access to parts of a service, rather than just the entire service or particular content on the service.

The Chair

With this it will be convenient to discuss the following:

Amendment 96, in clause 11, page 10, line 41, at end insert—

“(i) reducing or removing a user’s access to private messaging features”.

This amendment is intended to explicitly include removing or reducing access to private messaging features in the list of areas where proportionate measures can be taken to protect children.

Amendment 97, in clause 11, page 10, line 41, at end insert—

“(i) reducing or removing a user’s access to livestreaming features”.

This amendment is intended to explicitly include removing or reducing access to livestreaming features in the list of areas where proportionate measures can be taken to protect children.

Kirsty Blackman

I am glad that the three amendments are grouped, because they link together nicely. I am concerned that clause 11(4)(d) does not do exactly what the Government intend it to. It refers to

“policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content”.

There is a difference between content and parts of the service. It would be possible to block users from accessing some of the things that we have been talking about —for example, eating disorder content—on the basis of clause 11(4)(d). A platform would be able to take that action, provided that it had the architecture in place. However, on my reading, I do not think it would be possible to block a user from accessing, for example, private messaging or livestreaming features. Clause 11(4)(d) would allow a platform to block certain content, or access to the service, but it would not explicitly allow it to block users from using part of the service.

Let us think about platforms such as Discord and Roblox. I have an awful lot of issues with Roblox, but it can be a pretty fun place for people to spend time. However, if a child, or an adult, is inappropriately using its private messaging features, or somebody on Instagram is using the livestreaming features, there are massive potential risks of harm. Massive harm is happening on such platforms. That is not to say that Instagram is necessarily inherently harmful, but if it could block a child’s access to livestreaming features, that could have a massive impact in protecting them.

09:49
Damian Collins

Does the hon. Lady accept that the amendments would give people control over the bit of the service that they do not currently have control of? A user can choose what to search for and which members to engage with, and can block people. What they cannot do is stop the recommendation feeds recommending things to them. The shields intervene there, which gives user protection, enabling them to say, “I don’t want this sort of content recommended to me. On other things, I can either not search for them, or I can block and report offensive users.” Does she accept that that is what the amendment achieves?

Kirsty Blackman

I think that that is what the clause achieves, rather than the amendments that I have tabled. I recognise that the clause achieves that, and I have no concerns about it. It is good that the clause does that; my concern is that it does not take the second step of blocking access to certain features on the platform. For example, somebody could be having a great time on Instagram looking at various people’s pictures or whatever, but they may not want to be bombarded with private messages. They have no ability to turn off the private messaging section.

Damian Collins

They can disengage with the user who is sending the messages. On a Meta platform, often those messages will be from someone they are following or engaging with. They can block them, and the platforms have the ability, in most in-app messaging services, to see whether somebody is sending priority illegal content material to other users. They can scan for that and mitigate that as well.

Kirsty Blackman

That is exactly why users should be able to block private messaging in general. Someone on Twitter can say, “I’m not going to receive a direct message from anybody I don’t follow.” Twitter users have the opportunity to do that, but there is not necessarily that opportunity on all platforms. We are asking for those things to be included, so that the provider can say, “You’re using private messaging inappropriately. Therefore, we are blocking all your access to private messaging,” or, “You are being harmed as a result of accessing private messaging. Therefore, we are blocking your access to any private messaging. You can still see pictures on Instagram, but you can no longer receive any private messages, because we are blocking your access to that part of the site.” That is very different from blocking a user’s access to certain kinds of content, for example. I agree that that should happen, but it is about the functionalities and stopping access to some of them.

We are not asking Ofcom to mandate that platforms take this measure; they could still take the slightly more nuclear option of banning somebody entirely from their service. However, if this option is included, we could say, “Your service is doing pretty well, but we know there is an issue with private messaging. Could you please take action to ensure that those people who are using private messaging to harm children no longer have access to private messaging and are no longer able to use the part of the service that enables them to do these things?” Somebody might be doing a great job of making games in Roblox, but they may be saying inappropriate things. It may be proportionate to block that person entirely, but it may be more proportionate to block their access to voice chat, so that they can no longer say those things, or direct message or contact anybody. It is about proportionality and recognising that the service is not necessarily inherently harmful but that specific parts of it could be.

Sarah Owen

The hon. Member is making fantastic, salient points. The damage with private messaging is around phishing, as well as seeing a really harmful message and not being able to unsee it. Would she agree that it is about protecting the victim, not putting the onus on the victim to disengage from such conversations?

Kirsty Blackman

I completely agree. The hon. Member put that much better than I could. I was trying to formulate that point in my head, but had not quite got there, so I appreciate her intervention. She is right: we should not put the onus on a victim to deal with a situation. Once they have seen a message from someone, they can absolutely block that person, but that person could create another account and send them messages again. People could be able to choose, and to say, “No, I don’t want anyone to be able to send me private messages,” or “I don’t want any private messages from anyone I don’t know.” We could put in those safeguards.

I am talking about adding another layer to the clause, so that companies would not necessarily have to demonstrate that it was proportionate to ban a person from using their service, as that may be too high a bar—a concern I will come to later. They could, however, demonstrate that it was proportionate to ban a person from using private messaging services, or from accessing livestreaming features. There has been a massive increase in self-generated child sexual abuse images, and a huge amount has come from livestreaming. There are massive risks with livestreaming features on services.

Livestreaming is not always bad. Someone could livestream themselves showing how to make pancakes. There is no issue with that—that is grand—but livestreaming is being used by bad actors to manipulate children into sharing videos of themselves, and once they are on the internet, they are there forever. It cannot be undone. If we were able to ban vulnerable users—my preferred option would be all children—from accessing livestreaming services, they would be much safer.

Damian Collins

The hon. Lady is talking about extremely serious matters. My expectation is that Ofcom would look at all of a platform’s features when risk-assessing the platform and enforcing safety, and in-app messaging services would not be exempt. Platforms have to demonstrate what they would do to mitigate harmful and abusive behaviour, and that they would take action against the accounts responsible.

Kirsty Blackman

Absolutely, I agree, but the problem is with the way the Bill is written. It does not suggest that a platform could stop somebody accessing a certain part of a service. The Bill refers to content, and to the service as a whole, but it does not have that middle point that I am talking about.

Damian Collins

A platform is required to demonstrate to Ofcom what it would do to mitigate activity that would breach the safety duties. It could do that through a feature that it builds in, or it may take a more draconian stance and say, “Rather than turning off certain features, we will just suspend the account altogether.” That could be discussed in the risk assessments, and agreed in the codes of practice.

Kirsty Blackman

What I am saying is that the clause does not actually allow that middle step. It does not explicitly say that somebody could be stopped from accessing private messaging. The only options are being banned from certain content, or being banned from the entire platform.

I absolutely recognise the hard work that Ofcom has done, and I recognise that it will work very hard to ensure that risks are mitigated, but the amendment ensures what the Minister intended with this legislation. I am not convinced that he intended there to be just the two options that I outlined. I think he intended something more in line with what I am suggesting in the amendment. It would be very helpful if the Minister explicitly said something in this Committee that makes it clear that Ofcom has the power to say to platforms, “Your risk assessment says that there is a real risk from private messaging”—or from livestreaming—“so why don’t you turn that off for all users under 18?” Ofcom should be able to do that.

Could the Minister be clear that that is the direction of travel he is hoping and intending that Ofcom will take? If he could be clear on that, and will recognise that the clause could have been slightly better written to ensure Ofcom had that power, I would be quite happy to not push the amendment to a vote. Will the Minister be clear about the direction he hopes will be taken?

Alex Davies-Jones

I rise to support my SNP colleagues’ amendments 99, and 96 and 97, just as I supported amendment 98. The amendments are sensible and will ensure that service providers are empowered to take action to mitigate harms done through their services. In particular, we support amendment 99, which makes it clear that a service should be required to have the tools available to allow it to block access to parts of its service, if that is proportionate.

Amendments 96 and 97 would ensure that private messaging and livestreaming features were brought into scope, and that platforms and services could block access to them when that was proportionate, with the aim of protecting children, which is the ultimate aim of the Bill. Those are incredibly important points to raise.

In previous iterations of the Bill Committee, Labour too tabled a number of amendments to do with platforms’ responsibilities for livestreaming. I expressed concerns about how easy it is for platforms to host live content, and about how ready they were to screen that content for harm, illegal or not. I am therefore pleased to support our SNP colleagues. The amendments are sensible, will empower platforms and will keep children safe.

Charlotte Nichols

It is a pleasure to serve with you in the Chair, Sir Roger. I rise in support of amendments 99, and 96 and 97, as my hon. Friend the Member for Pontypridd did. I have an issue with the vagueness and ambiguity in the Bill. Ministerial direction is incredibly helpful, not only for Ofcom, but for the companies and providers that will use the Bill to make technologies available to do what we are asking them to do.

As the hon. Member for Aberdeen North said, if the Bill provided for that middle ground, that would be helpful for a number of purposes. Amendment 97 refers to livestreaming; in a number of cases around the world, people have livestreamed acts of terror, such as the shooting at the Christchurch mosque. Those offences were watched in real time, as they were perpetrated, by potentially hundreds of thousands of people. We have people on watch lists—people we are aware of. If we allowed them to use a social media platform but not the livestreaming parts, that could go some way to mitigating the risk of their livestreaming something like that. Their being on the site is perhaps less of a concern, as their general use of it could be monitored in real time. Under a risk analysis, we might be happy for people to be on a platform, but consider that the risk was too great to allow them to livestream. Having such a provision would be helpful.

My hon. Friend the Member for Luton North mentioned the onus always being on the victim. When we discuss online abuse, I really hate it when people say, “Well, just turn off your messages”, “Block them” or “Change your notification settings”, as though that were a panacea. Turning off the capacity to use direct messages is a much more effective way of addressing abuse by direct message than banning the person who sent it altogether—they might just make a new account—or than relying on the recipient of the message to take action when the platform has the capacity to take away the option of direct messaging. The adage is that sunlight is the best disinfectant. When people post in public and the post can be seen by anyone, they can be held accountable by anyone. That is less of a concern to me than what they send privately, which can be seen only by the recipient.

This group of amendments is reasonable and proportionate. They would not only give clear ministerial direction to Ofcom and the technology providers, and allow Ofcom to take the measures that we are discussing, but would pivot us away from placing the onus on the recipients of abusive behaviour, or people who might be exposed to it. Instead, the onus would be on platforms to make those risk assessments and take the middle ground, where that is a reasonable and proportionate step.

10:15
Kirsty Blackman

If someone on a PlayStation wants to play online games, they must sign up to PlayStation Plus—that is how the model works. Once they pay that subscription, they can access online games and play Fortnite or Rocket League or whatever they want online. They then also have access to a suite of communication features; they can private message people. It would be disproportionate to ban somebody from playing any PlayStation game online in order to stop them from being able to private message inappropriate things. That would be a disproportionate step. I do not want a situation in which PlayStation cannot act against somebody because banning them would be disproportionate, yet cannot switch off the direct messaging features because the clause does not allow it that flexibility. A person could continue to be in danger on the PlayStation platform as a result of private communications that they could receive. That is one example of how the provision would be key and important.

Paul Scully

Again, the Government recognise the intent behind amendment 99, which, as the hon. Member for Aberdeen North said, would require providers to be able to block children’s access to parts of a service, rather than the entire service. I very much get that. We recognise the nature and scale of the harm that can be caused to children through livestreaming and private messaging, as has been outlined, but the Bill already delivers what is intended by these amendments. Clause 11(4) sets out examples of areas in which providers will need to take measures, if proportionate, to meet the child safety duties. It is not an exhaustive list of every measure that a provider might be required to take. It would not be feasible or practical to list every type of measure that a provider could take to protect children from harm, because such a list could become out of date quickly as new technologies emerge, as the hon. Lady outlined with her PlayStation example.

Kirsty Blackman

I have a concern. The Minister’s phrasing was “to block children’s access”. Surely some of the issues would be around blocking adults’ access, because they are the ones causing risk to the children. From my reading of the clause, it does not suggest that the action could be taken only against child users; it could be taken against any user in order to protect children.

Paul Scully

I will come to that in a second. The hon. Member for Luton North talked about putting the onus on the victim. Any element of choice is there for adults; the children will be protected anyway, as I will outline in a second. We all agree that the primary purpose of the Bill is to be a children’s protection measure.

Ofcom will set out in codes of practice the specific steps that providers can take to protect children who are using their service, and the Government expect those to include steps relating to children’s access to high-risk features, such as livestreaming or private messaging. Clause 11(4)(d) sets out that providers may be required to take measures in the following areas:

“policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content”.

The other areas listed are intentionally broad categories that allow for providers to take specific measures. For example, a measure in the area of blocking user access to particular content could include specific measures that restrict children’s access to parts of a service, if that is a proportionate way to stop users accessing that type of content. It can also apply to any of the features of a service that enable children to access particular content, and could therefore include children’s access to livestreaming and private messaging features. In addition, the child safety duties make it clear that providers need to use proportionate systems and processes that prevent children from encountering primary priority content that is harmful to them, and protect children and age groups at risk of harm from other content that is harmful to them.

While Ofcom will set out in codes of practice the steps that providers can take to meet these duties, we expect those steps, as we have heard, to include the use of age verification to prevent children accessing content that poses the greatest risk of harm to them. To meet that duty, providers may use measures that restrict children from accessing parts of the service. The Bill therefore allows Ofcom to require providers to take that step where it is proportionate. I hope that that satisfies the hon. Member for Aberdeen North, and gives her the direction that she asked for—that is, a direction to be more specific that Ofcom does indeed have the powers that she seeks.

Charlotte Nichols

The Bill states that we can expect little impact on child protection before 2027-28 because of the enforcement road map and when Ofcom is planning to set that out. Does the Minister not think that in the meantime, that sort of ministerial direction would be helpful? It could make Ofcom’s job easier, and would mean that children could be protected online before 2027-28.

Paul Scully

The ministerial direction that the various platforms are receiving from the Dispatch Box, from our conversations with them and from the Bill’s progress as it goes through the House of Lords will be helpful to them. We do not expect providers to wait until the very last minute to implement the measures. They are starting to do so now, but we want them to go further, quicker.

Government amendment 4 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details of the measures that they use to restrict access in their terms of service and apply them consistently. Providers will also need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service, as well as reviewing children’s use of higher-risk features, as I have said.

To meet the child safety risk assessment duties in clause 10, providers must assess: the risk of harm to children from functionalities that facilitate the presence or dissemination of harmful content; the level of risk from different kinds of harmful content, giving separate consideration to children in different age groups; the different ways in which the service is used, and the impact of such use on the level of risk of harm; and how the design and operation of the service may increase the risks identified.

The child safety duties in clause 11 apply across all areas of the service, including the way it is operated and used by children, as well as the content present on the service. For the reasons I have set out, I am not able to accept the amendments, but I hope that the hon. Member for Aberdeen North will take on board my assurances.

Kirsty Blackman

That was quite helpful. I am slightly concerned about the Minister’s focus on reducing children’s access to the service or to parts of it. I appreciate that is part of what the clause is intended to do, but I would also expect platforms to be able to reduce the ability of adults to access parts of the service or content in order to protect children. Rather than just blocking children, blocking adults from accessing some features—whether that is certain adults or adults as a group—would indeed protect children. My reading of clause 11(4) was that users could be prevented from accessing some of this stuff, rather than just child users. Although the Minister has given me more questions, I do not intend to push the amendment to a vote.

May I ask a question of you, Sir Roger? I have not spoken about clause stand part. Are we still planning to have a clause stand part debate?

The Chair

No.

Kirsty Blackman

Thank you, Sir Roger; I appreciate the clarification. When I talk about Government amendment 4, I will also talk about clause stand part. I withdraw the amendment.

The Chair

That is up to the Committee.

Amendment, by leave, withdrawn.

Paul Scully

I beg to move amendment 4, in clause 11, page 11, line 9, at end insert—

“(6A) If a provider takes or uses a measure designed to prevent access to the whole of the service or a part of the service by children under a certain age, a duty to—

(a) include provisions in the terms of service specifying details about the operation of the measure, and

(b) apply those provisions consistently.”

This amendment requires providers to give details in their terms of service about any measures they use which prevent access to a service (or part of it) by children under a certain age, and to apply those terms consistently.

The Chair

With this it will be convenient to discuss the following:

Government amendment 5.

Amendment 100, in clause 11, page 11, line 15, after “accessible” insert “for child users.”

This amendment makes clear that the provisions of the terms of service have to be clear and accessible for child users.

Paul Scully

Although the previous version of the Bill already focused on protecting children, as I have said, the Government are clear that it must do more to achieve that and to ensure that requirements for providers are as clear as possible. That is why we are making changes to strengthen the Bill. Amendments 4 and 5 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details in their terms of service of the measures that they use to ensure that children below the minimum age are prevented from gaining access. Those terms must be applied consistently and be clear and accessible to users. The change will mean that providers can be held to account for what they say in their terms of service, and will no longer be able to do nothing to prevent underage access.

The Government recognise the intent behind amendment 100, which is to ensure that terms of service are clear and accessible for child users, but the Bill as drafted sets an appropriate standard for terms of service. The duty in clause 11(8) sets an objective standard for terms of service to be clear and accessible, rather than requiring them to be clear for particular users. Ofcom will produce codes of practice setting out how providers can meet that duty, which may include provisions about how to tailor the terms of service to the user base where appropriate.

The amendment would have the unintended consequence of limiting to children the current accessibility requirement for terms of service. As a result, any complicated and detailed information that would not be accessible for children—for example, how the provider uses proactive technology—would probably need to be left out of the terms of service, which would clearly conflict with the duty in clause 11(7) and other duties relating to the terms of service. It is more appropriate to have an objective standard of “clear and accessible” so that the terms of service can be tailored to provide the necessary level of information and be useful to other users such as parents and guardians, who are most likely to be able to engage with the more detailed information included in the terms of service and are involved in monitoring children’s online activities.

Ofcom will set out steps that providers can take to meet the duty and will have a tough suite of enforcement powers to take action against companies that do not meet their child safety duties, including if their terms of service are not clear and accessible. For the reasons I have set out, I am not able to accept the amendment tabled by the hon. Member for Aberdeen North and I hope she will withdraw it.

Kirsty Blackman

As I said, I will also talk about clause 11. I can understand why the Government are moving their amendments. It makes sense, particularly with things like complying with the provisions. I have had concerns all the way along—concerns that are particularly acute now that we are back in Committee with a slightly different Bill from the one that we were first presented with—about the reliance on terms of service. There is a major issue with choosing to go down that route, given that providers of services can choose what to put in their terms of service. They can choose to have very good terms of service that commit them to taking action on anything that is potentially an issue, and that are strong enough to allow them to apply proportionate measures and ban users who break those terms. Providers will have the ability to write terms of service like that, but not all providers will choose to do so. Not all providers will choose to write the gold-standard terms of service that the Minister expects everybody will write.

We have to remember that these companies’ and organisations’ No. 1 aim is not to protect children. If their No. 1 aim was to protect children, we would not be here. We would not need an Online Safety Bill because they would be putting protection front and centre of every decision they make. Their No. 1 aim is to increase the number of users so that they can get more money. That is the aim. They are companies that have a duty to their shareholders. They are trying to make money. That is the intention. They will not therefore necessarily draw up the best possible terms of service.

I heard an argument on Report that market forces will mean that companies that do not have strong enough terms of service, companies that have inherent risks in their platforms, will just not be used by people. If that were true, we would not be in the current situation. Instead, the platforms that are damaging people and causing harm—4chan, KiwiFarms or any of those places that cause horrendous difficulties—would not be used by people because market forces would have intervened. That approach does not work; it does not happen that the market will regulate itself and people will stay away from places that cause them or others harm. That is not how it works. I am concerned about the reliance on terms of service and requiring companies to stick to their own terms of service. They might stick to their own terms of service, but those terms of service might be utterly rubbish and might not protect people. Companies might not have in place what we need to ensure that children and adults are protected online.

10:30
Kim Leadbeater

Does the hon. Lady agree that people out there in the real world have absolutely no idea what a platform’s terms of service are, so we are being expected to make a judgment on something about which we have absolutely no knowledge?

Kirsty Blackman

Absolutely. The amendment I tabled regarding the accessibility of terms of service was designed to ensure that if the Government rely on terms of service, children can access those terms of service and are able to see what risks they are putting themselves at. We know that in reality children will not read these things. Adults do not read these things. I do not know what Twitter’s terms of service say, but I do know that Twitter managed to change its terms of service overnight, very easily and quickly. Companies could just say, “I’m a bit fed up with Ofcom breathing down my neck on this. I’m just going to change my terms of service, so that Ofcom will not take action on some of the egregious harm that has been done. If we just change our terms of service, we don’t need to bother. If we say that we are not going to ban transphobia on our platform—if we take that out of the terms of service—we do not need to worry about transphobia on our platform. We can just let it happen, because it is not in our terms of service.”

Damian Collins

Does the hon. Lady agree that the Government are not relying solely on terms of service, but are rightly saying, “If you say in your terms of service that this is what you will do, Ofcom will make sure that you do it”? Ofcom will take on that responsibility for people, making sure that these complex terms of service are understood and enforced, but the companies still have to meet all the priority illegal harms objectives that are set out in the legislation. Offences that exist in law are still enforced on platforms, and risk-assessed by Ofcom as well, so if a company does not have a policy on race hate, we have a law on race hate, and that will apply.

Kirsty Blackman

It is absolutely the case that those companies still have to do a risk assessment, and a child risk assessment if they meet the relevant criteria. The largest platforms, for example, will still have to do a significant amount of work on risk assessments. However, every time a Minister stands up and talks about what they are requiring platforms and companies to do, they say, “Companies must stick to their terms of service. They must ensure that they enforce things in line with their terms of service.” If a company is finding it too difficult, it will just take the tough things out of their terms of service. It will take out transphobia, it will take out abuse. Twitter does not ban anyone for abuse anyway, it seems, but it will be easier for Twitter to say, “Ofcom is going to try to hold us for account for the fact that we are not getting rid of people for abusive but not illegal messages, even though we say in our terms of service, ‘You must act with respect’, or ‘You must not abuse other users’. We will just take that out of our terms of service so that we are not held to account for the fact that we are not following our terms of service.” Then, because the abuse is not illegal—because it does not meet that bar—those places will end up being even less safe than they are right now.

For example, occasionally Twitter does act in line with its terms of service, which is quite nice: it does ban people who are behaving inappropriately, but not necessarily illegally, on its platform. However, if it is required to implement that across the board for everybody, it will be far easier for Twitter to say, “We’ve sacked all our moderators—we do not have enough people to be able to do this job—so we will just take it all out of the terms of service. The terms of service will say, ‘We will ban people for sharing illegal content, full stop.’” We will end up in a worse situation than we are currently in, so the reliance on terms of service causes me a big, big problem.

Turning to amendment 100, dealing specifically with the accessibility of this feature for child users, I appreciate the ministerial clarification, and agree that my amendment could have been better worded and potentially causes some problems. However, can the Minister talk more about the level of accessibility? I would like children to be able to see a version of the terms of service that is age-appropriate, so that they understand what is expected of them and others on the platform, and understand when and how they can make a report and how that report will be acted on. The kids who are using Discord, TikTok or YouTube are over 13—well, some of them are—so they are able to read and understand, and they want to know how to make reports and for the reporting functions to be there. One of the biggest complaints we hear from kids is that they do not know how to report things they see that are disturbing.

A requirement for children to have an understanding of how reporting functions work, particularly on social media platforms where people are interacting with each other, and of the behaviour that is expected of them, does not mean that there cannot be a more in-depth and detailed version of the terms of service, laying out potential punishments using language that children may not be able to understand. The amendment would specifically ensure that children have an understanding of that.

We want children to have a great time on the internet. There are so many ace things out there and wonderful places they can access. Lego has been in touch, for example; its website is really pretty cool. We want kids to be able to access that stuff and communicate with their friends, but we also want them to have access to features that allow them to make reports that will keep them safe. If children are making reports, then platforms will say, “Actually, there is real problem with this because we are getting loads of reports about it.” They will then be able to take action. They will be able to have proper risk assessments in place because they will be able to understand what is disturbing people and what is causing the problems.

I am glad to hear the Minister’s words. If he were even more clear about the fact that he would expect children to be able to understand and access information about keeping themselves safe on the platforms, then that would be even more helpful.

Paul Scully

On terms and conditions, it is clearly best practice to have a different level of explanation that ensures children can fully understand what they are getting into. The hon. Lady talked about the fact that children do not know how to report harm. Frankly, judging by a lot of conversations we have had in our debates, we do not know how to report harm because it is not transparent. On a number of platforms, how to do that is very opaque.

A wider aim of the Bill is to make sure that platforms have better reporting patterns. I encourage platforms to do exactly what the hon. Member for Aberdeen North says to engage children, and to engage parents. Parents are well placed to engage with reporting and it is important that we do not forget parenting in the equation of how Government and platforms are acting. I hope that is clear to the hon. Lady. We are mainly relying on terms and conditions for adults, but the Bill imposes a wider set of protections for children on the platforms.

Amendment 4 agreed to.

Amendment made: 5, in clause 11, page 11, line 15, after “(5)” insert “, (6A)”.—(Paul Scully.)

This amendment ensures that the duty in clause 11(8) to have clear and accessible terms of service applies to the terms of service mentioned in the new subsection inserted by Amendment 4.

Clause 11, as amended, ordered to stand part of the Bill.

Clause 12

Adults’ risk assessment duties

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss the following:

Clause 13 stand part.

Government amendments 18, 23 to 25, 32, 33 and 39.

Clause 55 stand part.

Government amendments 42 to 45, 61 to 66, 68 to 70, 74, 80, 85, 92, 51 and 52, 54, 94 and 60.

Paul Scully

To protect free speech and remove any possibility that the Bill could cause tech companies to censor legal content, I seek to remove the so-called “legal but harmful” duties from the Bill. These duties are currently set out in clauses 12 and 13 and apply to the largest in-scope services. They require services to undertake risk assessments for defined categories of harmful but legal content, before setting and enforcing clear terms of service for each category of content.

I share the concerns raised by Members of this House and more broadly that these provisions could have a detrimental effect on freedom of expression. It is not right that the Government define what legal content they consider harmful to adults and then require platforms to risk assess for that content. Doing so may encourage companies to remove legal speech, undermining this Government’s commitment to freedom of expression. That is why these provisions must be removed.

At the same time, I recognise the undue influence that the largest platforms have over our public discourse. These companies get to decide what we do and do not see online. They can arbitrarily remove a user’s content or ban them altogether without offering any real avenues of redress to users. On the flip side, even when companies have terms of service, these are often not enforced, as we have discussed. That was the case after the Euro 2020 final where footballers were subject to the most appalling abuse, despite most platforms clearly prohibiting that. That is why I am introducing duties to improve the transparency and accountability of platforms and to protect free speech through new clauses 3 and 4. Under these duties, category 1 platforms will only be allowed to remove or restrict access to content or ban or suspend users when this is in accordance with their terms of service or where they face another legal obligation. That protects against the arbitrary removal of content.

Companies must ensure that their terms of service are consistently enforced. If companies’ terms of service say that they will remove or restrict access to content, or will ban or suspend users in certain circumstances, they must put in place proper systems and processes to apply those terms. That will close the gap between what companies say they will do and what they do in practice. Services must ensure that their terms of service are easily understandable to users and that they operate effective reporting and redress mechanisms, enabling users to raise concerns about a company’s application of the terms of service. We will debate the substance of these changes later alongside clause 18.

Clause 55 currently defines

“content that is harmful to adults”,

including

“priority content that is harmful to adults”

for the purposes of this legislation. As this concept would be removed with the removal of the adult safety duties, this clause will also need to be removed.

Damian Collins

My hon. Friend mentioned earlier that companies will not be able to remove content if it is not part of their safety duties or if it was not a breach of their terms of service. I want to be sure that I heard that correctly and to ask whether Ofcom will be able to risk assess that process to ensure that companies are not over-removing content.

Paul Scully

Absolutely. I will come on to Ofcom in a second and respond directly to his question.

The removal of clauses 12, 13 and 55 from the Bill, if agreed by the Committee, will require a series of further amendments to remove references to the adult safety duties elsewhere in the Bill. These amendments are required to ensure that the legislation is consistent and, importantly, that platforms, Ofcom and the Secretary of State are not held to requirements relating to the adult safety duties that we intend to remove from the Bill. The amendments remove requirements on platforms and Ofcom relating to the adult safety duties. That includes references to the adult safety duties in the duties to provide content reporting and redress mechanisms and to keep records. They also remove references to content that is harmful to adults from the process for designating category 1, 2A and 2B companies. The amendments in this group relate mainly to the process for the category 2B companies.

I also seek to amend the process for designating category 1 services to ensure that they are identified based on their influence over public discourse, rather than with regard to the risk of harm posed by content that is harmful to adults. These changes will be discussed when we debate the relevant amendments alongside clause 82 and schedule 11. The amendments will remove powers that will no longer be required, such as the Secretary of State’s ability to designate priority content that is harmful to adults. As I have already indicated, we intend to remove the adult safety duties and introduce new duties on category 1 services relating to transparency, accountability and freedom of expression. While they will mostly be discussed alongside clause 18, amendments 61 to 66, 68 to 70 and 74 will add references to the transparency, accountability and freedom of expression duties to schedule 8. That will ensure that Ofcom can require providers of category 1 services to give details in their annual transparency reports about how they comply with the new duties. Those amendments define relevant content and consumer content for the purposes of the schedule.

We will discuss the proposed transparency and accountability duties that will replace the adult safety duties in more detail later in the Committee’s deliberations. For the reasons I have set out, I do not believe that the current adult safety duties with their risks to freedom of expression should be retained. I therefore urge the Committee that clauses 12, 13 and 55 do not stand part and instead recommend that the Government amendments in this group are accepted.

The Chair

Before we proceed, I emphasise that we are debating clause 13 stand part as well as the litany of Government amendments that I read out.

10:45
Alex Davies-Jones

Clause 12 is extremely important because it outlines the platforms’ duties in relation to keeping adults safe online. The Government’s attempts to remove the clause through an amendment that thankfully has not been selected are absolutely shocking. In addressing Government amendments 18, 23, 24, 25, 32, 33 and 39, I must ask the Minister: exactly how will this Bill do anything to keep adults safe online?

In the original clause 12, companies had to assess the risk of harm to adults and the original clause 13 outlined the means by which providers had to report these assessments back to Ofcom. This block of Government amendments will make it impossible for any of us—whether that is users of a platform or service, researchers or civil society experts—to understand the problems that arise on these platforms. Labour has repeatedly warned the Government that this Bill does not go far enough to consider the business models and product design of platforms and service providers that contribute to harm online. By tabling this group of amendments, the Government are once again making it incredibly difficult to fully understand the role of product design in perpetuating harm online.

We are not alone in our concerns. Colleagues from Carnegie UK Trust, who are a source of expertise to hon. Members across the House when it comes to internet regulation, have raised their concerns over this grouping of amendments too. They have raised specific concerns about the removal of the transparency obligation, which Labour has heavily pushed for in previous Bill Committees.

Previously, service providers had been required to inform customers of the harms their risk assessment had detected, but the removal of this risk assessment means that users and consumers will not have the information to assess the nature or risk on the platform. The Minister may point to the Government’s approach in relation to the new content duties in platforms’ and providers’ terms of service, but we know that there are risks arising from the fact that there is no minimum content specified for the terms of service for adults, although of course all providers will have to comply with the illegal content duties.

This approach, like the entire Bill, is already overly complex—that is widely recognised by colleagues across the House and is the view of many stakeholders too. In tabling this group of amendments, the Minister is showing his ignorance. Does he really think that all vulnerabilities to harm online simply disappear at the age of 18? By pushing these amendments, which seek to remove these protections from harmful but legal content to adults, the Minister is, in effect, suggesting that adults are not susceptible to harm and therefore risk assessments are simply not required. That is an extremely narrow-minded view to take, so I must push the Minister further. Does he recognise that many young, and older, adults are still highly likely to be impacted by suicide and self-harm messaging, eating disorder content, disinformation and abuse, which will all be untouched by these amendments?

Labour has been clear throughout the passage of the Bill that we need to see more, not less, transparency and protection from online harm for all of us—whether adults or children. These risk assessments are absolutely critical to the success of the Online Safety Bill and I cannot think of a good reason why the Minister would not support users in being able to make an assessment about their own safety online.

We have supported the passage of the Bill, as we know that keeping people safe online is a priority for us all and we know that the perfect cannot be the enemy of the good. The Government have made some progress towards keeping children safe, but they clearly do not consider it their responsibility to do the same for adults. Ultimately, platforms should be required to protect everyone: it does not matter whether they are a 17-year-old who falls short of being legally deemed an adult in this country, an 18-year-old or even an 80-year-old. Ultimately, we should all have the same protections and these risk assessments are critical to the online safety regime as a whole. That is why we cannot support these amendments. The Government have got this very wrong and we have genuine concerns that this wholesale approach will undermine how far the Bill will go to truly tackling harm online.

I will also make comments on clause 55 and the other associated amendments. I will keep my comments brief, as the Minister is already aware of my significant concerns over his Department’s intention to remove adult safety duties more widely. In the previous Bill Committee, Labour made it clear that it supports, and thinks it most important, that the Bill should clarify specific content that is deemed to be harmful to adults. We have repeatedly raised concerns about missing harms, including health misinformation and disinformation, but really this group of amendments, once again, will touch on widespread concerns that the Government’s new approach will see adults online worse off. The Government’s removal of the “legal but harmful” sections of the Online Safety Bill is a major weakening—not a strengthening—of the Bill. Does the Minister recognise that the only people celebrating these decisions will be the executives of big tech firms, and online abusers? Does he agree that this delay shows that the Government have bowed to vested interests over keeping users and consumers safe?

Labour is not alone in having these concerns. We are all pleased to see that child safety duties are still present in the Bill, but the NSPCC, among others, is concerned about the knock-on implications, which may introduce new risks to children. Without adult safety duties in place, children will be at greater risk of harm if platforms do not identify and protect them as children. In effect, these plans will now place a significantly greater burden on platforms to protect children than adults. As the Bill currently stands, there is a significant risk of splintering user protections that can expose children to adult-only spaces and harmful content, while forming grooming pathways for offenders, too.

The reality is that these proposals to deal with harms online for adults rely on the regulator ensuring that social media companies enforce their own terms and conditions. We already know and have heard that that can have an extremely damaging impact on online safety more widely, and we have only to consider the very obvious and well-reported case study involving Elon Musk’s takeover of Twitter to really get a sense of how damaging that approach is likely to be.

In late November, Twitter stopped taking action against tweets in violation of its coronavirus misinformation rules. The company had suspended at least 11,000 accounts under that policy, which was designed to remove accounts posting demonstrably false or misleading content relating to covid-19 that could lead to harm. The company operated a five-strike policy, and the impact on public health around the world of removing that policy will likely be tangible. The situation also raises questions about the platform’s other misinformation policies. As of December 2022, they remain active, but for how long remains unclear.

Does the Minister recognise that as soon as they are inconvenient, platforms will simply change their terms and conditions, and terms of service? We know that simply holding platforms to account for their terms and conditions will not constitute robust enough regulation to deal with the threat that these platforms present, and I must press the Minister further on this point.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

My hon. Friend is making an excellent speech. I share her deep concerns about the removal of these clauses. The Government have taken this tricky issue of the concept of “legal but harmful”—it is a tricky issue; we all acknowledge that—and have removed it from the Bill altogether. I do not think that is the answer. My hon. Friend makes an excellent point about children becoming 18; the day after they become 18, they are suddenly open to lots more harmful and dangerous content. Does she also share my concern about the risks of people being drawn towards extremism, as well as disinformation and misinformation?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

My hon. Friend makes a valid point. This is not just about misinformation and disinformation; it is about leading people to really extreme, vile content on the internet. As we all know, that is a rabbit warren. That situation does not change as soon as a 17-year-old turns 18 on their 18th birthday; they are not suddenly immune to this horrendous content. The rules need to be there to protect all of us.

As we have heard, terms and conditions can change overnight. Stakeholders have raised the concern that, if faced with a clearer focus on their terms of service, platforms and providers may choose to make their terms of service shorter, in an attempt to cut out harmful material that, if left undealt with, they may be held liable for.

In addition, the fact that there is no minimum requirement in the regime means that companies have complete freedom to set terms of service for adults, which may not reflect the risks to adults on that service. At present, service providers do not even have to include terms of service in relation to the list of harmful content proposed by the Government for the user empowerment duties—an area we will come on to in more detail shortly as we address clause 14. The Government’s approach and overreliance on terms of service, which as we know can be so susceptible to rapid change, is the wrong approach. For that reason, we cannot support these amendments.

I would just say, finally, that none of us was happy with the term “legal but harmful”. It was a phrase we all disliked, and it did not encapsulate exactly what the content is or includes. Throwing the baby out with the bathwater is not the way to tackle that situation. My hon. Friend the Member for Batley and Spen is right that this is a tricky area, and it is difficult to get it right. We need to protect free speech, which is sacrosanct, but we also need to recognise that there are so many users on the internet who do not have access to free speech as a result of being piled on or shouted down. Their free speech needs to be protected too. We believe that the clauses as they stand in the Bill go some way to making the Bill a meaningful piece of legislation. I urge the Minister not to strip them out, to do the right thing and to keep them in the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Throughout the consideration of the Bill, I have been clear that I do not want it to end up simply being the keep MPs safe on Twitter Bill. That is not what it should be about. I did not mean that we should therefore take out everything that protects adults; what I meant was that we need to have a big focus on protecting children in the Bill, which thankfully we still do. For all our concerns about the issues and inadequacies of the Bill, it will go some way to providing better protections for children online. But saying that it should not be the keep MPs safe on Twitter Bill does not mean that it should not keep MPs safe on Twitter.

I understand how we have got to this situation. What I cannot understand is the Minister’s being willing to stand up there and say, “We can’t have these clauses because they are a risk to freedom of speech.” Why are they in the Bill in the first place if they are such a big risk to freedom of speech? If the Government’s No. 1 priority is making sure that we do not have these clauses, why did they put them in it? Why did it go through pre-legislative scrutiny? Why were they in the draft Bill? Why were they in the Bill? Why did they agree with them in Committee? Why did they agree with them on Report? Why have we ended up in a situation where, suddenly, there is a massive epiphany that they are a threat to freedom of speech and therefore we cannot possibly have them?

What is it that people want to say that they will be banned from saying as a result of this Bill? What is it that freedom of speech campaigners are so desperate to say online? Do they want to promote self-harm on platforms? Is that what people want to do? Is that what freedom of speech campaigners are out for? They are now allowed to do that as a result of the Bill.

Nick Fletcher Portrait Nick Fletcher (Don Valley) (Con)
- Hansard - - - Excerpts

I believe that the triple shield being put in is in place of “legal but harmful”. That will enable users to put a layer of protection in so they can actually take control. But the illegal content still has to be taken down: anything that promotes self-harm is illegal content and would still have to be removed. The problem with the way it was before is that we had a Secretary of State telling us what could be said out there and what could not. What may offend the hon. Lady may not offend me, and vice versa. We have to be very careful of that. It is so important that we protect free speech. We are now giving control to each individual who uses the internet.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The promotion of self-harm is not illegal content; people are now able to do that online—congratulations, great! The promotion of incel culture is not illegal content, so this Bill will now allow people to do that online. It will allow terms of service that do not require people to be banned for promoting incel culture, self-harm, not wearing masks and not getting a covid vaccine. It will allow the platforms to allow people to say these things. That is what has been achieved by campaigners.

The Bill is making people less safe online. We will continue to have the same problems that we have with people being driven to suicide and radicalised online as a result of the changes being made in this Bill. I know the Government have been leaned on heavily by the free speech lobby. I still do not know what people want to say that they cannot say as a result of the Bill as it stands. I do not know. I cannot imagine that anybody is not offended by content online that drives people to hurt themselves. I cannot imagine anybody being okay and happy with that. Certainly, I imagine that nobody in this room is okay and happy with that.

These people have won this war on the attack on free speech. They have won a situation where they are able to promote misogynistic, incel culture and health disinformation, where they are able to say that the covid vaccine is entirely about putting microchips in people. People are allowed to say that now—great! That is what has been achieved, and it is a societal issue. We have a generational issue where people online are being exposed to harmful content. That will now continue.

It is not just a generational societal thing—it is not just an issue for society as a whole that these conspiracy theories are pervading. Some of the conspiracy theories around antisemitism are unbelievably horrific, but they do not step over into illegality, or David Icke would not be able to stand up and suggest that the world is run by lizard people—who happen to be Jewish. Previously, he would not have been allowed to say that, because it would have been considered harmful content; but now he is. That is apparently fine. He is allowed to say it because this Bill is refusing to take action on it.

11:00
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Can the hon. Lady tell me where in the Bill, as it is currently drafted—so, unamended—it requires platforms to remove legal speech?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

It allows the platforms to do that. It allows them, and requires legal but harmful stuff to be taken into account. It requires the platforms to act—to consider, through risk assessments, the harm done to adults by content that is legal but massively harmful.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The hon. Lady is right: the Bill does not require the removal of legal speech. Platforms must take the issue into account—it can be risk assessed—but it is ultimately their decision. I think the point has been massively overstated that, somehow, previously, Ofcom had the power to strike down legal but harmful speech that was not a breach of either terms of service or the law. It never had that power.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Why do the Government now think that there is a risk to free speech? If Ofcom never had that power, if it was never an issue, why are the Government bothered about that risk—it clearly was not a risk—to free speech? If that was never a consideration, it obviously was not a risk to free speech, so I am now even more confused as to why the Government have decided that they will have to strip this measure out of the Bill because of the risk to free speech, because clearly it was not a risk in this situation. This is some of the most important stuff in the Bill for the protection of adults, and the Government are keen to remove it.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

The hon. Member is making an excellent and very passionate speech, and I commend her for that. Would she agree with one of my concerns, which is about the message that this sends to the public? It is almost that the Government were acknowledging that there was a problem with legal but harmful content—we can all, hopefully, acknowledge that that is a problem, even though we know it is a tricky one to tackle—but, by removing these clauses from the Bill, are now sending the message that, “We were trying to clean up the wild west of the internet, but, actually, we are not that bothered anymore.”

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The hon. Lady is absolutely right. We have all heard from organisations and individuals who have had their lives destroyed as a result of “legal but harmful”—I don’t have a better phrase for it—content online and of being radicalised by being driven deeper and deeper into blacker and blacker Discord servers, for example, that are getting further and further right wing.

A number of the people who are radicalised—who are committing terror attacks, or being referred to the Prevent programme because they are at risk of committing terror attacks—are not so much on the far-right levels of extremism any more, or those with incredible levels of religious extremism, but are in a situation where they have got mixed up or unclear ideological drivers. It is not the same situation as it was before, because people are being radicalised by the stuff that they find online. They are being radicalised into situations where they “must do something”—they “must take some action”—because of the culture change in society.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The hon. Member is making a powerful point. Just a few weeks ago, I asked the Secretary of State for Digital, Culture, Media and Sport, at the Dispatch Box, whether the horrendous and horrific content that led a man to shoot and kill five people in Keyham—in the constituency of my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard)—would be allowed to remain and perpetuate online as a result of the removal of these clauses from the Bill. I did not get a substantial answer then, but we all know that the answer is yes.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That is the thing: this Bill is supposed to be the Online Safety Bill. It is supposed to be about protecting people from the harm that can be done to them by others. It is also supposed to be about protecting people from that radicalisation and that harm that they can end up in. It is supposed to make a difference. It is supposed to be a game changer and a world leader.

Although, absolutely, I recognise the importance of the child-safety duties in the clauses and the change that that will have, when people turn 18 they do not suddenly become different humans. They do not wake up on their 18th birthday as a different person from the one that they were before. They should not have to go from that level of protection, prior to 18, to being immediately exposed to comments and content encouraging them to self-harm, and to all of the negative things that we know are present online.

Nick Fletcher Portrait Nick Fletcher
- Hansard - - - Excerpts

I understand some of the arguments the hon. Lady is making, but that is a poor argument given that the day people turn 17 they can learn to drive or the day they turn 16 they can do something else. There are lots of these things, but we have to draw a line in the sand somewhere. Eighteen is when people become adults. If we do not like that, we can change the age, but there has to be a line in the sand. I agree with much of what the hon. Lady is saying, but that is a poor argument. I am sorry, but it is.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I do not disagree that overnight changes are involved, but the problem is that we are going from a certain level of protection to nothing; there will be a drastic, dramatic shift. We will end up with any vulnerable person who is over 18 being potentially subject to all this content online.

I still do not understand what people think they will have won as a result of having the provisions removed from the Bill. I do not understand how people can say, “This is now a substantially better Bill, and we are much freer and better off as a result of the changes.” That is not the case; removing the provisions will mean the internet continuing to be unsafe—much more unsafe than it would have been under the previous iteration of the Bill. It will ensure that more people are harmed as a result of online content. It will absolutely—

Nick Fletcher Portrait Nick Fletcher
- Hansard - - - Excerpts

Will the hon. Lady give way?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

No, I will not give way again. The change will ensure that people can absolutely say what they like online, but the damage and harm that it will cause are not balanced by the freedoms that have been won.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

As a Back-Bench Member of Parliament, I recommended that the “legal but harmful” provisions be removed from the Bill. When I chaired the Joint Committee of both Houses of Parliament that scrutinised the draft Bill, it was the unanimous recommendation of the Committee that the “legal but harmful” provisions be removed. As a Minister at the Dispatch Box, I said that I thought “legal but harmful” was a problematic term and we should not use it. The term “legal but harmful” does not exist in the Bill, and has never existed in the Bill, but it has provoked a debate that has caused huge confusion. There is a belief, which we have heard expressed in debate today, that somehow there are categories of content that Ofcom can deem should be removed, whether they are unlawful or not.

During the Bill’s journey from publication in draft to where we are today, it has become more specific. Rather than our relying on general duties of care, written into the Bill are areas of priority illegal activity that the companies must proactively look for, monitor and mitigate. In the original version of the Bill, that included only terrorist content and child sexual exploitation material, but on the recommendation of the Joint Committee, the Government moved in the direction of writing into the Bill at schedule 7 offences in law that will be the priority illegal offences.

The list of offences is quite wide, and it is more comprehensive than any other such list in the world in specifying exactly what offences are in scope. There is no ambiguity for the platforms as to what offences are in scope. Stalking, harassment and inciting violence, which are all serious offences, as well as the horrible abuse a person might receive as a consequence of their race or religious beliefs, are written into the Bill as priority illegal offences.

There has to be a risk assessment of whether such content exists on platforms and what action platforms should take. They are required to carry out such a risk assessment, although that was never part of the Bill before. The “legal but harmful” provisions in some ways predate that. Changes were made; the offences were written into the Bill, risk assessments were provided for, and Parliament was invited to create new offences and write them into the Bill, if there were categories of content that had not been captured. In some ways, that creates a democratic lock that says, “If we are going to start to regulate areas of speech, what is the legal reason for doing that? Where is the legal threshold? What are the grounds for us taking that decision, if it is something that is not already covered in platforms’ terms of service?”

We are moving in that direction. We have a schedule of offences that we are writing into the Bill, and those priority illegal offences cover most of the most serious behaviour and most of the concerns raised in today’s debate. On top of that, there is a risk assessment of platforms’ terms of service. When we look at the terms of service of the companies—the major platforms we have been discussing—we see that they set a higher bar again than the priority illegal harms. On the whole, platforms do not have policies that say, “We won’t do anything about this illegal activity, race hate, incitement to violence, or promotion or glorification of terrorism.” The problem is that although they have terms of service, they do not enforce them. Therefore, we are not relying on terms of service. What we are saying, and what the Bill says, is that the minimum safety standards are based on the offences written into the Bill. In addition, we have risk assessment, and we have enforcement based on the terms of service.

There may be a situation in which there is a category of content that is not in breach of a platform’s terms of service and not included in the priority areas of illegal harm. It is very difficult to think of what that could be—something that is not already covered, and over which Ofcom would not have power. There is the inclusion of the new offences of promoting self-harm and suicide. That captures not just an individual piece of content, but the systematic effect of a teenager like Molly Russell—or an adult of any age—being targeted with such content. There are also new offences for cyber-flashing, and there is Zach’s law, which was discussed in the Chamber on Report. We are creating and writing into the Bill these new priority areas of illegal harm.

Freedom of speech groups’ concern was that the Government could have a secret list of extra things that they also wanted risk-assessed, rather than enforcement being clearly based either on the law or on clear terms of service. It is difficult to think of categories of harm that are not already captured in terms of service or priority areas of illegal harm, and that would be on such a list. I think that is why the change was made. For freedom of speech campaigners, there was a concern about exactly what enforcement was based on: “Is it based on the law? Is it based on terms of service? Or is it based on something else?”

I personally believed that the “legal but harmful” provisions in the Bill, as far as they existed, were not an infringement on free speech, because there was never a requirement to remove legal speech. I do not think the removal of those clauses from the Bill suddenly creates a wild west in which no enforcement will take place at all. There will be very effective enforcement based on the terms of service, and on the schedule 7 offences, which deal with the worst kinds of illegal activity; there is a broad list. The changes make it much clearer to everybody—platforms and users alike, and Ofcom—exactly what the duties are, how they are enforced and what they are based on.

For future regulation, we have to use this framework, so that we can say that when we add new offences to the scope of the legislation, they are offences that have been approved by Parliament and have gone through a proper process, and are a necessary addition because terms of service do not cover them. That is a much clearer and better structure to follow, which is why I support the Government amendments.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I cannot help but see the Government’s planned removal of clauses 12 and 13 as essentially wrecking amendments to the Bill. Taking those provisions out of the Bill makes it a Bill not about online safety, but about child protection. We have not had five years or so of going backwards and forwards, and taken the Bill through Committee and then unprecedentedly recommitted it to Committee, in order to fundamentally change what the Bill set out to do. The fact that, at this late stage, the Government are trying to take out these aspects of the Bill melts my head, for want of a better way of putting it.

My hon. Friend the Member for Batley and Spen was absolutely right when she talked about what clauses 12 and 13 do. In effect, they are an acknowledgement that adults are also harmed online, and have different experiences online. I strongly agree with the hon. Member for Aberdeen North about this not being the protect MPs from being bullied on Twitter Bill, because obviously the provisions go much further than that, but it is worth noting, in the hope that it is illustrative to Committee members, the very different experience that the Minister and I have in using Twitter. I say that as a woman who is LGBT and Jewish—and although I would not suggest that it should be a protected characteristic, the fact that I am ginger probably plays a part as well. He and I could do the same things on Twitter on the same day and have two completely different experiences of that platform.

The risk-assessment duties set out in clause 12, particularly in subsection (5)(d) to (f), ask platforms to consider the different ways in which different adult users might experience them. Platforms have a duty to attempt to keep certain groups of people, and categories of user, safe. When we talk about free speech, the question is: freedom of speech for whom, and at what cost? Making it easier for people to perpetuate, for example, holocaust denial on the internet—a category of speech that is lawful but awful, as it is not against the law in this country to deny that the holocaust happened—makes it much less likely that I, or other Jewish people, will want to use the platform.

11:24
Our freedom of speech and ability to express ourselves on the platform is curtailed by the platform’s decision to prioritise the freedom of expression of people who would deny the holocaust over that of Jewish people who want to use the platform safely and not be bombarded by people making memes of their relatives getting thrown into gas chambers, of Jewish people with big noses, or of the Rothschild Zionist global control conspiracy nonsense that was alluded to earlier, which is encountered online constantly by Jewish users of social media platforms.

Organisations such as the Community Security Trust and the Antisemitism Policy Trust, which do excellent work in this area, have been very clear that someone’s right to be protected from that sort of content should not end the day they turn 18. Duties should remain on platforms to do risk assessments to protect certain groups of adults who may be at increased risk from such content, in order to protect their freedom of speech and expression.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The hon. Member makes a powerful point about the different ways in which people experience things. That tips over into real-life abusive interactions, and goes as far as terrorist incidents in some cases. Does she agree that protecting people’s freedom of expression and safety online also protects people in their real, day-to-day life?

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I could not agree more. I suppose that is why this aspect of the Bill is so important, not just to me but to all those categories of user. I mentioned paragraphs (d) to (f), which would require platforms to assess exactly that risk. This is not about being offended. Personally, I have the skin of a rhino. People can say most things to me and I am not particularly bothered by it. My concern is where things that are said online are transposed into real-life harms. I will use myself as an example. Online, we can see antisemitic and conspiratorial content, covid misinformation, and covid misinformation that meets with antisemitism and conspiracies. When people decide that I, as a Jewish Member of Parliament, am personally responsible for George Soros putting a 5G chip in their arm, or whatever other nonsense they have become persuaded by on the internet, that is exactly the kind of thing that has meant people coming to my office armed with a knife. The kind of content that they were radicalised by on the internet led to their perpetrating a real-life, in-person harm. Thank God—Baruch Hashem—neither I nor my staff were in the office that day, but that could have ended very differently, because of the sorts of content that the Bill is meant to protect online users from.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The hon. Lady is talking about an incredibly important issue, but the Bill covers such matters as credible threats to life, incitement to violence against an individual, and harassment and stalking—those patterns of behaviour. Those are public order offences, and they are in the Bill. I would absolutely expect companies to risk-assess for that sort of activity, and to be required by Ofcom to mitigate it. On her point about holocaust denial, first, the shield will mean that people can protect themselves from seeing stuff. The further question would be whether we create new offences in law, which can then be transposed across.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I accept the points that the hon. Member raised, but he is fundamentally missing the point. The categories of information and content that these people had seen and been radicalised by would not fall under the scope of public order offences or harassment. The person was not sending me harassing messages before they turned up at my office. Essentially, social media companies and other online platforms have to take measures to mitigate the risk of categories of offences that are illegal, whether or not they are in the Bill. I am talking about what clauses 12 and 13 covered, whether we call it the “legal but harmful” category or “lawful but awful”. Whatever we name those provisions, by taking out of the Bill clauses relating to the “legal but harmful” category, we are opening up an area of harm that already exists, that has a real-world impact, and that the Bill was meant to go some way towards addressing.

The Government’s amendments take out the risk assessments that need to be done. The Bill, as it stands, says,

“(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults”.

Again, the idea that we are talking about offence, and that the clauses need to be taken out to protect free speech, is fundamentally nonsense.

I have already mentioned holocaust denial, but it is also worth mentioning health-related disinformation. We have already seen real-world harms from some of the covid misinformation online. It led to people including Piers Corbyn turning up outside Parliament with a gallows, threatening to hang hon. Members for treason. Obviously, that was rightly dealt with by the police, but the kind of information and misinformation that he had been getting online and that led him to do that, which is legal but harmful, will now not be covered by the Bill.

I will also raise an issue I have heard about from a number of people dealing with cancer and conditions such as multiple sclerosis. People online try to discourage them from accessing the proper medical interventions for their illnesses, and instead encourage them to take more vitamin B or adopt a vegan diet. There are people who have died because they had cancer but were encouraged online to not access cancer treatment because they were subject to lawful but awful categories of harm.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I wonder if the hon. Member saw the story online about the couple in New Zealand who refused to let their child have a life-saving operation because they could not guarantee that the blood used would not be from vaccinated people? Is the hon. Member similarly concerned that this has caused real-life harm?

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I am aware of the case that the hon. Member mentioned. I appreciate that I am probably testing the patience of everybody in the Committee Room, but I want to be clear just how abhorrent I find it that these provisions are coming out of the Bill. I am trying to be restrained, measured and reasonably concise, but that is difficult when there are so many parts of the change that I find egregious.

My final point is on self-harm and suicide content. For men under the age of 45, suicide is the biggest killer. In the Bill, we are doing as much as we can to protect young people from that sort of content. My real concern is this: many young people are being protected by the Bill’s provisions relating to children. They are perhaps waiting for support from child and adolescent mental health services, which are massively oversubscribed. The minute they tick over into 18, fall off the CAMHS waiting list and go to the bottom of the adult mental health waiting list—they may have to wait years for treatment of various conditions—there is no requirement or duty on the social media companies and platforms to do risk assessments.

11:25
The Chair adjourned the Committee without Question put (Standing Order No. 88).
Adjourned till this day at Two o’clock.

ONLINE SAFETY BILL (Second sitting)

Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

(1 year, 4 months ago)

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
The Committee consisted of the following Members:
Chairs: † Dame Angela Eagle, Sir Roger Gale
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
Bhatti, Saqib (Meriden) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Bonnar, Steven (Coatbridge, Chryston and Bellshill) (SNP)
† Bristow, Paul (Peterborough) (Con)
† Collins, Damian (Folkestone and Hythe) (Con)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Fletcher, Nick (Don Valley) (Con)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Maclean, Rachel (Redditch) (Con)
† Nichols, Charlotte (Warrington North) (Lab)
† Owen, Sarah (Luton North) (Lab)
Peacock, Stephanie (Barnsley East) (Lab)
† Russell, Dean (Watford) (Con)
† Scully, Paul (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Wood, Mike (Dudley South) (Con)
Kevin Maddison, Bethan Harding, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 13 December 2022
(Afternoon)
[Dame Angela Eagle in the Chair]
Online Safety Bill
(Re-committed Clauses and Schedules: Clauses 11 to 14, 18 to 21, 30, 46, 55 and 65, Schedule 8, Clauses 79 and 82, Schedule 11, Clauses 87, 90, 115, 169 and 183, Schedule 17, Clauses 203, 206 and 207, new Clauses and new Schedules)
14:00
None Portrait The Chair
- Hansard -

Before we begin, I have a few preliminary announcements. Hansard colleagues would be grateful if Members could email their speaking notes to hansardnotes@parliament.uk. Please switch electronic devices to silent. Traditionally, the Chair of a Committee gives Members permission to take off their jackets, but given the temperature in this room, please understand that you do not need my permission to keep on your blankets or coats.

Clause 12

Adults’ risk assessment duties

Question (this day) again proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

I remind the Committee that with this we are discussing the following:

Clause 13 stand part.

Government amendments 18, 23 to 25, 32, 33 and 39.

Clause 55 stand part.

Government amendments 42 to 45, 61 to 66, 68 to 70, 74, 80, 85, 92, 51 and 52, 54, 94 and 60.

Charlotte Nichols Portrait Charlotte Nichols (Warrington North) (Lab)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairship, Dame Angela. I did not make a note of the specific word I was on when we adjourned, so I hope Hansard colleagues will forgive me if the flow between what I said previously and what I say now is somewhat stilted.

I will keep this brief, because I was—purposefully—testing the patience of the Minister with some of my contributions. However, I did so to hammer home the fact that the removal of clauses 12 and 13 from the Bill is a fatal error. If the recommittal of the Bill is not to fundamentally undermine what the Bill set out to do five years or so ago, their removal should urgently be reconsidered. We have spent five years debating the Bill to get it to this point.

As I said, there are forms of harm that are not illegal, but they are none the less harmful, and they should be legislated for. They should be in the Bill, as should specific protections for adults, not just children. I therefore urge the Minister to keep clauses 12 and 13 in the Bill so that we do not undermine what it set out to do and all the work that has been done up to this point. Inexplicably, the Government are trying to undo that work at this late stage before the Bill becomes law.

Sarah Owen Portrait Sarah Owen (Luton North) (Lab)
- Hansard - - - Excerpts

It is a pleasure to see you in the Chair, Dame Angela—I wish it was a toastier room. Let me add to the points that the shadow Minister, my hon. Friend the Member for Pontypridd, made so powerfully about vulnerable people. There is no cliff edge when such a person becomes 18. What thought have the Minister and the Department given to vulnerable young adults with learning disabilities or spectrum disorders? Frankly, the idea that, as soon as a person turns 18, they are magically no longer vulnerable is for the birds—particularly when it comes to eating disorders, suicide and self-harm.

Adults do not live in isolation, and they do not just live online. We have a duty of care to people. The perfect example is disinformation, particularly when it comes to its harmful impact on public health. We saw that with the pandemic and vaccine misinformation. We saw it with the harm done to children by the anti-vaccine movement’s myths about vaccines, children and babies. It causes greater harm than just having a conversation online.

People do not stay in one lane. Once people start being sucked into conspiracy myths, much as we discussed earlier around the algorithms that are used to keep people online, it has to keep ramping up. Social media and tech companies do that very well. They know how to do it. That is why I might start looking for something to do with ramen recipes and all of a sudden I am on to a cat that has decided to make noodles. It always ramps up. That is the fun end of it, but on the serious end somebody will start to have doubts about certain public health messages the Government are sending out. That then tips into other conspiracy theories that have really harmful, damaging consequences.

I saw that personally. My hon. Friend the Member for Warrington North eloquently put forward some really powerful examples of what she has been subjected to. With covid, some of the anti-vaccinators and anti-mask-wearers who targeted me quickly slipped into Sinophobia and racism. I was sent videos of people eating live animals, and being blamed for a global pandemic.

The people who have been targeted do not stay in one lane. The idea that adults are not vulnerable, and susceptible, to such targeting and do not need protection from it is frankly for the birds. We see that particularly with extremism, misogyny and the incel culture. I take the point from our earlier discussion about who determines what crosses the legal threshold, but why do we have to wait until somebody is physically hurt before the Government act?

That is really regrettable. So, too, is the fact that this is such a huge U-turn in policy, with 15% of the Bill coming back to Committee. As we have heard, that is unprecedented, and yet, on the most pivotal point, we were unable to hear expert advice, particularly from the National Society for the Prevention of Cruelty to Children, Barnardo’s and the Antisemitism Policy Trust. I was struggling to understand why we would not hear expert advice on such a drastic change to an important piece of legislation—until I heard the hon. Member for Don Valley talk about offence. This is not about offence; it is about harm.

The hon. Member’s comments highlighted perfectly the real reason we are all here in a freezing cold Bill Committee, rehashing work that has already been solved. The Bill was not perfect by any stretch of the imagination, but it was better than what we have today. The real reason we are here is the fight within the Conservative party.

Nick Fletcher Portrait Nick Fletcher (Don Valley) (Con)
- Hansard - - - Excerpts

No such fight has taken place. These are my personal views, and I genuinely believe that people have a right to say what they would like to say. That is free speech. There have been no fights whatever.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

In that case, I must have been mistaken in thinking that the hon. Member—who has probably said quite a lot of things, which is why his voice is as hoarse as it is—was criticising the former Minister for measures that were agreed in previous Committee sittings.

For me, the current proposals are a really disappointing, retrograde step. They will not protect the most vulnerable people in our communities, including offline—this harm is not just online, but stretches out across all our communities. What happens online does not take place, and stay, in an isolated space; people are influenced by it and take their cues from it. They do not just take their cues from what is said in Parliament; they see misogynists online and think that they can treat people like that. They see horrific abuses of power and extreme pornography and, as we heard from the hon. Member for Aberdeen North, take their cues from that. What happens online does not stay online.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

My hon. Friend makes an important point about what happens online and its influence on the outside world. We saw that most recently with Kanye West being reinstated to Twitter and allowed to spew his bile and abhorrent views about Jews. That antisemitism had a real-world impact in terms of the rise in antisemitism on the streets, particularly in the US. The direct impact of his being allowed to talk about that online was Jews being harmed in the real world. That is exactly what is happening.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

I thank the shadow Minister for that intervention. She is absolutely right. We have had a discussion about terms of reference and terms of service. Not only do most people not actually fully read them or understand them, but they are subject to change. The moment Elon Musk took over Twitter, everything changed. Not only have we got Donald Trump back, but Elon Musk also gave the keys to a mainstream social media platform to Kanye West. We have seen what happened then.

That is the situation the Government will now not shut the door on. That is regrettable. For all the reasons we have heard today, it is really damaging. It is really disappointing that we are not taking the opportunity to lead in this area.

Paul Scully Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Dame Angela.

A lot of the discussion has replayed the debate from day two on Report about the removal of “legal but harmful” measures. Some of the discussion this morning and this afternoon has covered really important issues such as self-harm on which, as we said on the Floor of the House, we will introduce measures at a later stage. I will not talk about those measures now, but I would just say that we have already said that if we agree that the promotion of things such as self-harm should be illegal, then it should be made illegal. Let us be very straight about how we deal with the promotion of self-harm.

The Bill will bring huge improvements for adult safety online. In addition to their duty to tackle illegal content, companies will have to provide adult users with tools to keep themselves safer. On some of the other clauses, we will talk about the triple shield that was mentioned earlier. If the content is illegal, it will still be illegal. If content does not adhere to the companies’ terms of service—that includes many of the issues that we have been debating for the last hour—it will have to be removed. We will come to user enforcement issues in further clauses.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

The Minister mentions tools for adults to keep themselves safe. Does he not think that that puts the onus on the users—the victims—to keep themselves safe? The measures as they stand in the Bill put the onus on the companies to be more proactive about how they keep people safe.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The onus on adults is very much a safety net—very much a catch-all, after we have put the onus on the social media companies and the platforms to adhere to their own terms and conditions.

We have heard a lot about Twitter and the changes to Twitter. We can see the commercial imperative for mainstream platforms, certainly the category 1 platforms, to have a wide enough catch-all in their terms of service—anything that an advertiser, for example, would see as reasonably sensible—to be able to remain a viable platform in the first place. When Elon Musk first started making changes at Twitter, a comment did the rounds: “How do you build a multimillion-dollar company? You sell it to Elon Musk for £44 billion.” He made that change. He has seen the bottom falling out of his market and has lost a lot of the cash he put into Twitter. That is the commercial impetus that underpins a lot of the changes we are making.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

Is the Minister really suggesting that it is reasonable for people to say, “Right, I am going to have to walk away from Facebook because I don’t agree with their terms of service,” to hold the platform to account? How does he expect people to keep in touch with each other if they have to walk away from social media platforms in order to try to hold them to account?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I do not think the hon. Lady is seriously suggesting that people can communicate only via Facebook—via one platform. The point is that there are a variety of methods of communication, of which Facebook has been a major one, although it is not one of the biggest now, with its share value having dropped 71% in the last year. That is, again, another commercial impetus for it to change its platform in other, usability-related ways.

14:15
People will have choice in terms of how they communicate, but we are saying that if something is illegal, it will need to be removed from the platform. The majority of the big communication platforms referred to by the hon. Member for Aberdeen North, which people are becoming increasingly committed and dedicated to as part of their lives and ways of communicating around the world, will need to keep their terms of service broad. We heard from my hon. Friend the Member for Folkestone and Hythe that their terms of service are largely a higher bar than what was in the original Bill, so it is about getting them to adhere to the terms of service. It is not about the measures we put in; it is about how things are enforced. If platforms cannot enforce their terms of service, there is a swingeing fine—£18 million or 10% of their qualifying global turnover. If they do not then put those things right or share with Ofcom their methods of risk assessment, age verification, age assurance, user enforcement and all those other areas, there is a criminal liability attached as well.

As we heard in this morning’s sitting, companies will clearly need to design their services to prevent the spread of illegal content and protect children. Ofcom will have that broad power to require information from companies to assess compliance with the rules. As I keep saying, platforms have that strong commercial incentive to tackle harmful content, and the major companies already prohibit most of the harmful and abusive content that we talked about this morning, but they are just not readily enforcing that. Their business model does not lend itself to enforcing that legislation, so we have to change that impetus so that they adhere to their moral, as well as their legal, duties.

For that reason, which has been well addressed in the main Chamber and which we will continue to talk about as the Bill continues its passage, this legislation finds the right balance between protecting free speech and freedom of expression, which are vital aims, and protecting vulnerable adults and particularly children. These user empowerment duties are about giving users greater control over their online experience, very much as a safety net, but understanding the risk assessments that each platform will have to provide. It is right that the Bill empowers vulnerable people who may find certain types of legal content unhelpful or harmful, depending on their personal circumstances. We heard from the hon. Member for Warrington North about people’s different experiences, so it is right that people can shape and enforce their own experience with that safety net.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

One of the examples I alluded to, which is particularly offensive for Jewish people, LGBT people and other people who were persecuted in the Nazi holocaust, is holocaust denial. Does the Minister seriously think that it is only Jewish people, LGBT people and other people who were persecuted in the holocaust who find holocaust denial offensive and objectionable and who do not want to see it as part of their online experience? Surely having these sorts of safety nets in place and saying that we do not think that certain kinds of content—although they may not be against the law—have a place online protects everyone’s experience, whether they are Jewish or not. Surely, no one wants to see holocaust denial online.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

No, but there is freedom of expression up to the point when it starts to reach into illegality. We have to have the balance right: someone can say something in public—in any setting offline—but what the hon. Lady is suggesting is that, as soon as they hit a keyboard or a smartphone, there are two totally different regimes. That is not getting the balance right.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

The Minister says that we should have freedom of speech up to a point. Does that point include holocaust denial? He has just suggested that if something is acceptable to say in person, which I do not think holocaust denial should be, it should be acceptable online. Surely holocaust denial is objectionable whenever it happens, in whatever context—online or offline.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I have been clear about where I set the line. [Interruption.] I have said that if something is illegal, it is illegal. The terms of service of the platforms largely cover the list that we are talking about. As my hon. Friend the Member for Folkestone and Hythe and I have both said, the terms of service of the vast majority of platforms—the big category 1 platforms—set a higher bar than was in our original Bill. The hon. Member for Luton North talked about whether we should have more evidence. I understand that the pre-legislative scrutiny committee heard evidence and came to a unanimous conclusion that the “legal but harmful” conditions should not be in the Bill.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

A few moments ago, the Minister compared the online world to the real world. Does he agree that they are not the same? Sadly, the sort of thing that someone says in the pub on a Friday night to two or three of their friends is very different from someone saying something dangerously harmful online that can reach millions and billions of people in a very short space of time. The person who spoke in the pub might get up the following morning and regret what they said, but no harm was done. Once something is out there in the online world, very serious damage can be done very quickly.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The hon. Lady makes a good point. I talked about the offline world rather than the real world, but clearly that can happen. That is where the balance has to be struck, as we heard from my hon. Friend the Member for Don Valley. It is not black and white; it is a spectrum of greys. Any sensible person can soon see when they stray into areas that we have talked about such as holocaust denial and extremism, but we do not want to penalise people who invariably are testing their freedom of expression.

It is a fine balance, but I think that we have reached the right balance between protecting freedom of expression and protecting vulnerable adults by having three layers of checks. The first is illegality. The second is enforcing the terms of service, which provide a higher bar than we had in the original Bill for the vast majority of platforms, so that we can see right at the beginning how they will be enforced by the platforms. If they change them and do not adhere to them, Ofcom can step in. Ofcom can step in at any point to ensure that they are being enforced. The third is a safety net.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

On illegal content, is the Minister proposing that the Government will introduce new legislation to make, for example, holocaust denial and eating disorder content illegal, whether it is online or offline? If he is saying that the bar in the online and offline worlds should be the same, will the Government introduce more hate crime legislation?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Hate crime legislation will always be considered by the Ministry of Justice, but I am not committing to any changes. That is beyond my reach, but the two shields that we talked about are underpinned by a safety net.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - - - Excerpts

Does my hon. Friend agree that the risk assessments that will be done on the priority illegal offences are very wide ranging, in addition to the risk assessments that will be done on meeting the terms of service? They will include racially and religiously motivated harassment, and putting people in fear of violence. A lot of the offences that have been discussed in the debate would already be covered by the adult safety risk assessments in the Bill.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I absolutely agree. As I said in my opening remarks about the racial abuse picked up in relation to the Euro 2020 football championship, that would have been against the terms and conditions of all those platforms, but it still happened as the platforms were not enforcing those terms and conditions. Whether we put them on a list in the Bill or talk about them in the terms of service, they need to be enforced, but the terms of service are there.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

On that point, does my hon. Friend also agree that the priority legal offences are important too? People were prosecuted for what they posted on Twitter and Instagram about the England footballers, so that shows that we understand what racially motivated offences are and that people are prosecuted for them. The Bill will require a minimum regulatory standard that meets that threshold and requires companies to act in cases such as that one, where we know what this content is, what people are posting and what is required. Not only will the companies have to act, but they will have to complete risk assessments to demonstrate how they will do that.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Indeed. I absolutely agree with my hon. Friend and that is a good example of enforcement being used. People can be prosecuted if such abuse appears on social media, but a black footballer, who would otherwise have seen that racial abuse, can choose in the user enforcement to turn that off so that he does not see it. That does not mean that we cannot pursue a prosecution for racial abuse via a third-party complaint or via the platform.

None Portrait The Chair
- Hansard -

Order. Could the Minister address his remarks through the Chair so that I do not have to look at his back?

Paul Scully

I apologise, Dame Angela. I will bring my remarks to a close by saying that with those triple shields, we have the protections and the fine balance that we need.

Question put, That the clause, as amended, stand part of the Bill.

Division 1

Ayes: 6

Noes: 9

Clause 13
Safety duties protecting adults
Question put, That the clause stand part of the Bill.

Division 2

Ayes: 6

Noes: 9

Clause 14
User empowerment duties
Paul Scully

I beg to move amendment 8, in clause 14, page 14, line 3, leave out “harmful content” and insert—

“content to which this subsection applies”.

This amendment, and Amendments 9 to 17, amend clause 14 (user empowerment) as the adult safety duties are removed (see Amendments 6, 7 and 41). New subsections (8B) to (8D) describe the kinds of content which are now relevant to the duty in clause 14(2) - see Amendment 15.

The Chair

With this it will be convenient to discuss the following:

Government amendments 9 to 14.

Government amendment 15, in clause 14, page 14, line 29, at end insert—

“(8A) Subsection (2) applies to content that—

(a) is regulated user-generated content in relation to the service in question, and

(b) is within subsection (8B), (8C) or (8D).

(8B) Content is within this subsection if it encourages, promotes or provides instructions for—

(a) suicide or an act of deliberate self-injury, or

(b) an eating disorder or behaviours associated with an eating disorder.

(8C) Content is within this subsection if it is abusive and the abuse targets any of the following characteristics—

(a) race,

(b) religion,

(c) sex,

(d) sexual orientation,

(e) disability, or

(f) gender reassignment.

(8D) Content is within this subsection if it incites hatred against people—

(a) of a particular race, religion, sex or sexual orientation,

(b) who have a disability, or

(c) who have the characteristic of gender reassignment.”

This amendment describes the content relevant to the duty in subsection (2) of clause 14. The effect is (broadly) that providers must offer users tools to reduce their exposure to these kinds of content.

Amendment (a), to Government amendment 15, at end insert—

“(8E) Content is within this subsection if it—

(a) incites hateful extremism,

(b) provides false information about climate change, or

(c) is harmful to health.”

Government amendment 16, in clause 14, page 14, line 30, leave out subsection (9) and insert—

“(9) In this section—

‘disability’ means any physical or mental impairment;

‘injury’ includes poisoning;

‘non-verified user’ means a user who has not verified their identity to the provider of a service (see section 58(1));

‘race’ includes colour, nationality, and ethnic or national origins.”

This amendment inserts definitions of terms now used in clause 14.

Amendment (a), to Government amendment 16, after “mental impairment;” insert—

“‘hateful extremism’ means activity or materials directed at an out-group who are perceived as a threat to an in-group motivated by or intended to advance a political, religious or racial supremacist ideology—

(a) to create a climate conducive to hate crime, terrorism or other violence, or

(b) to attempt to erode or destroy the rights and freedoms protected by article 17 (Prohibition of abuse of rights) of Schedule 1 of the Human Rights Act 1998.”

Government amendment 17.

14:30
Paul Scully

The Government recognise the importance of giving adult users greater choice about what they see online and who they interact with, while upholding users’ rights to free expression online. That is why we have removed the “legal but harmful” provisions from the Bill in relation to adults and replaced them with a fairer, simpler approach: the triple shield.

As I said earlier, the first shield will require all companies in scope to take preventive measures to tackle illegal content or activity. The second shield will place new duties on category 1 services to improve transparency and accountability, and protect free speech, by requiring them to adhere to their terms of service when restricting access to content or suspending or banning users. User empowerment is the key third shield, empowering adults with greater control over their exposure to legal forms of abuse or hatred, or content that encourages, promotes or provides instructions for suicide, self-harm or eating disorders. That has been done while upholding and protecting freedom of expression.

Amendments 9 and 12 will strengthen the user empowerment duty, so that the largest companies are required to ensure that those tools are effective in reducing the likelihood of encountering the listed content or alerting users to it, and are easy for users to access. That will provide adult users with greater control over their online experience.

We are also setting out the categories of content that those user empowerment tools apply to in the Bill, through amendment 15. Adult users will be given the choice of whether they want to take advantage of those tools to have greater control over content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders, and content that is abusive or incites hatred against people on the basis of race, religion, sex, sexual orientation, disability, or gender reassignment. This is a targeted approach, focused on areas where we know that adult users—particularly those who are vulnerable or disproportionately targeted by online hate and abuse—would benefit from having greater choice.

As I said, the Government remain committed to free speech, which is why we have made changes to the adult safety duties. By establishing high thresholds for inclusion in those content categories, we have ensured that legitimate debate online will not be affected by the user empowerment duties.

I want to emphasise that the user empowerment duties do not require companies to remove legal content from their services; they are about giving individual adult users the option to increase their control over those kinds of content. Platforms will still be required to provide users with the ability to filter out unverified users, if they so wish. That duty remains unchanged. For the reasons that I have set out, I hope that Members can support Government amendments 8 to 17.

I turn to the amendments in the name of the hon. Member for Pontypridd to Government amendments 15 and 16. As I have set out in relation to Government amendments 8 to 17, the Government recognise the intent behind the amendments—to apply the user empowerment tools in clause 14(2) to a greater range of content categories. As I have already set out, it is crucial that a tailored approach is taken, so that the user empowerment tools stay in balance with users’ rights to free expression online. I am sympathetic to the amendments, but they propose categories of content that risk being either unworkable for companies or duplicative of the approach already set out in amendment 15.

The category of

“content that is harmful to health”

sets an extremely broad scope. That risks requiring companies to apply the tools in clause 14(2) to an unfeasibly large volume of content. It is not a proportionate approach and would place an unreasonable burden on companies. It might also have concerning implications for freedom of expression, as it may capture important health advice. That risks, ultimately, undermining the intention behind the user empowerment tools in clause 14(2) by preventing users from accessing helpful content, and disincentivising users from using the features.

In addition, the category

“provides false information about climate change”

places a requirement on private companies to be the arbiters of truth on subjective and evolving issues. Those companies would be responsible for determining what types of legal content were considered false information, which poses a risk to freedom of expression and risks silencing genuine debate.

Kirsty Blackman

Did the Minister just say that climate change is subjective?

Paul Scully

No—not whether climate change is happening; we are talking about a wide range of content. “Provides false information”—how do the companies determine what is false? I am not talking about the binary question of whether climate change is happening, but climate change is a wide-ranging debate. “Provides false information” means that someone has to determine what is false and what is not. Basically, the amendment outsources that to the social media platforms. That is not appropriate.

Sarah Owen

Would that not also apply to vaccine efficacy? If we are talking about everything being up for debate and nothing being a hard fact, we are entering slightly strange worlds where we undo a huge amount of progress, in particular on health.

Paul Scully

The amendment does not talk about vaccine efficacy; it talks about content that is harmful to health. That is a wide-ranging thing.

The Chair

Order. I am getting increasingly confused. The Minister appears to be answering a debate on an amendment that has not yet been moved. It might be helpful to the Committee, for good debate, if the Minister were to come back with his arguments against the amendment not yet moved by the Opposition spokesperson, the hon. Member for Pontypridd, once she has actually moved it. We can then hear her reasons for it and he can reply.

Paul Scully

In that case, having moved my amendment, I close my remarks.

Alex Davies-Jones

It is a pleasure to serve under your chairship, Dame Angela. With your permission, I will take this opportunity to make some broad reflections on the Government’s approach to the new so-called triple-shield protection that we have heard so much about, before coming on to the amendment tabled in my name in the group.

Broadly, Labour is disappointed that the system-level approach to content that is harmful to adults is being stripped from the Bill and replaced with a duty that puts the onus on the user to keep themselves safe. As the Antisemitism Policy Trust among others has argued, the two should be able to work in tandem. The clause allows a user to manage what harmful material they see by requiring the largest or most risky service providers to provide tools to allow a person in effect to reduce their likelihood of encountering, or to alert them to, certain types of material. We have concerns about the overall approach of the Government, but Labour believes that important additions can be made to the list of content where user-empowerment tools must be in place, hence our amendment (a) to Government amendment 15.

In July, in a little-noticed written ministerial statement, the Government produced a prototype list of content that would be harmful to adults. The list included priority content that category 1 services need to address in their terms and conditions; online abuse and harassment—mere disagreement with another’s point of view would not reach the threshold for harmful content, and so would not be covered; circulation of real or manufactured intimate images without the subject’s consent; content promoting self-harm; content promoting eating disorders; legal suicide content; and harmful health content that is demonstrably false, such as urging people to drink bleach to cure cancer.

We have concerns about whether listing those harms in the Bill is the most effective mechanism, mostly because we feel that the list should be more flexible and able to change according to the issues of the day, but it is clear that the Government will continue to pursue this avenue despite some very worrying gaps. With that in mind, will the Minister clarify what exactly underpins that list if there have been no risk assessments? What was the basis for drawing up that specific list? Surely the Government should be implored to publish the research that determined the list, at the very least.

I recognise that the false communications offence has remained in the Bill, but the list in Government amendment 15 is not exhaustive. Without the additions outlined in our amendment (a) to amendment 15, the list will do little to tackle some of the most pressing harms of our time, some of which we have already heard about today.

I am pleased that the list from the written ministerial statement has more or less been reproduced in amendment 15, under subsection (2), but there is a key and unexplained omission that our amendment (a) to it seeks to correct: the absence of the last point, on harmful health content. Amendment (a) seeks to reinsert such important content into the Bill directly. It seems implausible that the Government failed to consider the dangerous harm that health misinformation can have online, especially given that back in July they seemed to have a grasp of its importance by including it in the original list.

We all know that health-related misinformation and disinformation can significantly undermine public health, as we have heard. We only have to cast our minds back to the height of the coronavirus pandemic to remind ourselves of how dangerous the online space was, with anti-vax scepticism being rife. Many groups were impacted, including pregnant women, who received mixed messages about the safety of covid vaccination, causing widespread confusion, fear and inaction. By tabling amendment (a) to amendment 15, we wanted to understand why the Government have dropped that from the list and on what exact grounds.

In addition to harmful health content, our amendment (a) to amendment 15 would also add to the list content that incites hateful extremism and provides false information about climate change, as we have heard. In early written evidence from Carnegie, it outlined how serious the threat of climate change disinformation is to the UK. Malicious actors spreading false information on social media could undermine collective action to combat the threats. At present, the Online Safety Bill is not designed to tackle those threats head on.

We all recognise that social media is an important source of news and information for many people, and evidence is emerging of its role in climate change disinformation. The Centre for Countering Digital Hate published a report in 2021 called “The Toxic Ten: How ten fringe publishers fuel 69% of digital climate change denial”, which explores the issue further. Further analysis of activity on Facebook around COP26 undertaken by the Institute for Strategic Dialogue demonstrates the scale of the challenge in dealing with climate change misinformation and disinformation. The research compared the levels of engagement generated by reliable, scientific organisations and climate-sceptic actors, and found that posts from the latter frequently received more traction and reach than the former, which is shocking. For example, in the fortnight in which COP26 took place, sceptic content garnered 12 times the level of engagement that authoritative sources did on the platform, and 60% of the sceptic posts analysed could be classified as actively and explicitly attacking efforts to curb climate change, which just goes to show the importance of ensuring that climate change disinformation is also included in the list in Government amendment 15.

Our two amendments—amendment (a) to amendment 15, and amendment (a) to amendment 16 —seek to ensure that the long-standing omission from the Bill of hateful extremism is put right here as a priority. There is increasing concern about extremism leading to violence and death that does not meet the definition for terrorism. The internet and user-to-user services play a central role in the radicalisation process, yet the Online Safety Bill does not cover extremism.

Colleagues may be aware that Sara Khan, the former lead commissioner for countering extremism, provided a definition of extremism for the Government in February 2021, but there has been no response. The issue has been raised repeatedly by Members across the House, including by my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), following the tragic murders carried out by a radicalised incel in his constituency.

Amendment (a) to amendment 16 seeks to bring a formal definition of hateful extremism into the Bill and supports amendment (a) to amendment 15. The definition, as proposed by Sara Khan, who was appointed as Britain’s first countering extremism commissioner in 2018, is an important first step in addressing the gaps that social media platforms and providers have left open for harm and radicalisation.

Social media platforms have often been ineffective in removing other hateful extremist content. In November 2020, The Guardian reported that research from the Centre for Countering Digital Hate had uncovered how extremist merchandise had been sold on Facebook and Instagram to help fund neo-Nazi groups. That is just one of a huge number of instances, and it goes some way to suggest that a repeatedly inconsistent and ineffective approach to regulating extremist content is the one favoured by some social media platforms.

I hope that the Minister will seriously consider the amendments and will see the merits in expanding the list in Government amendment 15 to include these additional important harms.

Kirsty Blackman

Thank you for chairing the meeting this afternoon, Dame Angela. I agree wholeheartedly with the amendments tabled by the Labour Front-Bench team. It is important that we talk about climate change denial and what we can do to ensure people are not exposed to that harmful conspiracy theory through content. It is also important that we do what we can to ensure that pregnant women, for example, are not told not to take the covid vaccine or that parents are not told not to vaccinate their children against measles, mumps and rubella. We need to do what we can to ensure measures are in place.

I appreciate the list in Government amendment 15, but I have real issues with this idea of a toggle system—of being able to switch off this stuff. Why do the Government think people should have to switch off the promotion of suicide content or content that promotes eating disorders? Why is it acceptable that people should have to make an active choice to switch that content off in order to not see it? People have to make an active choice to tick a box that says, “No, I don’t want to see content that is abusing me because of my religion,” or “No, I don’t want to see content that is abusing me because of my membership of the LGBT community.” We do not want people to have to look through the abuse they are receiving in order to press the right buttons to switch it off. As the hon. Member for Don Valley said, people should be allowed to say what they want online, but the reality is that the extremist content that we have seen published online is radicalising people and bringing them to the point that they are taking physical action against people in the real, offline world as well as taking action online.

14:45
There were many issues with the Bill before, but it was significantly better than it will be at the end of this Committee. I wholeheartedly support the Opposition amendments. They are particularly clever, in that they bring in that additional content and include that definition of extremism, and they would make a significant and positive difference to the Bill.
On clause 14 more generally, the user empowerment tools are important. It is important that we have user empowerment and that that is mandated. I agree that people should be able to fix their online lives in order to stay away from some of the stuff that they may not want to see. I am disappointed that gambling is not included in the list. It could have been included so that people have the opportunity to avoid it. In real life, if someone has an issue with gambling, they can go to their local betting shop and say, “I have a problem with gambling. Do not allow me to bet anymore.” The betting shop has to say, “Okay, we will not allow you to bet anymore.” That is how it works in real life, and not having that in the Bill, as I said at the previous Committee stage, is a concern, because we do not have parity between the online and offline worlds.
As a result of the Bill, people will be able to stop seeing content on YouTube, for example, promoting eating disorders, but they might not be able to stop seeing content promoting online poker sites, when that might be causing a significant issue for their health, so not including that is a bit of an oversight. As I say, user empowerment is important, but the Government have not implemented it in nearly as good a way as they should have done, and the Opposition amendments would make the Government amendments better.
Damian Collins

I rise briefly to say that the introduction of the shields is a significant additional safety measure in the Bill and shows that the Government have thought about how to improve certain safety features as the Bill has progressed.

In the previous version of the Bill, as we have discussed at length, there were the priority legal offences that companies had to proactively identify and mitigate, and there were the measures on transparency and accountability on the terms of service. However, if pieces of content fell below the threshold for the priority legal offences or were not covered, or if they were not addressed in the terms of service, the previous version of the Bill never required the companies to act in any particular way. Reports might be done by Ofcom raising concerns, but there was no requirement for further action to be taken if the content was not a breach of platform policies or the priority safety duties.

The additional measure before us says that there may be content where there is no legal basis for removal, but users nevertheless have the right to have that content blocked. Many platforms offer ad tools already—they are not perfect, but people can opt in to say that they do not want to see ads for particular types of content—but there was nothing for the types of content covered by the Online Safety Bill, where someone could say, “I want to make sure I protect myself from seeing this at all,” and then, for the more serious content, “I expect the platforms to take action to mitigate it.” So this measure is an important additional level of protection for adult users, which allows them to give themselves the certainty that they will not see certain types of content and puts an important, additional duty on the companies themselves.

Briefly, on the point about gambling, the hon. Member for Aberdeen North is quite right to say that someone can self-exclude from gambling at the betting shop, but the advertising code already requires that companies do not target people who have self-excluded with advertising messages. As the Government complete their online advertising review, which is a separate piece of work, it is important that that is effectively enforced on big platforms, such as Facebook and Google, to ensure that they do not allow companies to advertise to vulnerable users in breach of the code. However, that can be done outside the Bill.

Kirsty Blackman

My concern is not just about advertising content or stuff that is specifically considered as an advert. If someone put up a TikTok video about how to cheat an online poker system, that would not be classed as an advert and therefore would not be caught. People would still be able to see it, and could not opt out.

Damian Collins

I totally appreciate the point that the hon. Lady makes, which is a different one. For gambling, the inducement to act straightaway often comes in the form of advertising. It usually comes in the form of free bets and immediate inducements to act. People who have self-excluded should not be targeted in that way. We need to ensure that that is rigorously enforced on online platforms too.

Kim Leadbeater

It is a pleasure to serve under your chairship, Dame Angela. It is lovely to be back in a Public Bill Committee with many familiar faces—and a few new ones, including the Minister. However, after devoting many weeks earlier this year to the previous Committee, I must admit that it is with some frustration that we are back here with the Government intent on further weakening their Bill.

Throughout the passage of the Bill, I have raised a number of specific concerns, from democratic and journalistic exemptions, to age verification, recognised news publishers, advocacy bodies and media literacy. On clause 14, while I support the principles of Government amendments 15 and 16, I draw the Minister’s attention to the importance of amendment (a) to amendment 15 and amendment (a) to amendment 16. He has already said that he is sympathetic to those amendments. Let me try to convince him to turn that sympathy into action.

I will focus primarily on an issue that is extremely important to me and to many others: extremism and radicalisation. However, while I will focus on the dangers of extremism and radicalisation, be it right-wing, Islamist, incel or other, the dangers that I am about to set out—the chain of events that leads to considerable harm online—are the same for self-harm content, eating disorder content, health disinformation, climate change disinformation or any dangerous, hateful material directed at people based on their sex, sexual orientation, ethnicity, religion or other characteristics.

Such content is not just deeply offensive and often wholly inaccurate; it is dangerous and vile and serves only to spread harm, misinformation and conspiracy. To be clear, such content is not about a social media user stating how upset and angry they are about the football result, or somebody disagreeing legitimately and passionately about a political issue. It is not the normal, everyday social media content that most people see on their feeds.

This is content that is specifically, carefully and callously designed to sit just below the criminal threshold, yet that can still encourage violence, self-harm or worse. It is content used by extremists of all types that lures vulnerable people in, uses social media likes and comments to create the illusion of legitimacy and popularity, and then directly targets those most likely to be susceptible, encouraging them either to commit harm or to move on to smaller but high-harm platforms that may fall out of the scope of the Bill. This is not free speech; it is content that can act as a dangerous gateway to radicalisation and extremism. The Government know how dangerous it is because their own report from His Majesty’s Prison and Probation Service last year found:

“The Internet appears to be playing an increasingly prominent role in radicalisation processes of those convicted of extremist offences in England and Wales.”

Hon. Members will understand my deep and personal interest in this matter. Since the murder of my sister, a Member of this House, six and a half years ago by a far-right extremist, I have worked hard to bring communities and people together in the face of hatred. Some of that work has included meeting former extremists and discussing how they were radicalised. Those conversations were never easy, but what became very clear to me was that such people are not born extremists. Their radicalisation starts somewhere, and it is often somewhere that appears to be completely innocent, such as a Facebook group about issues or problems in their community, a Twitter discussion about current affairs or the state of the country, or even a page for supporters of their football team.

One day, a comment is posted that is not illegal and is not hate speech, but that references a conspiracy or a common trope. It is an ideological remark placed there to test the water. The conversation moves on and escalates. More disturbing or even violent comments start to be made. They might be accompanied by images or videos, leading those involved down a more sinister path. Nothing yet is illegal, but clearly—I hope we would all agree—it is unacceptable.

The number of contributors reduces, but a few remain. No warnings are presented, no flags are raised and it appears like normal social media content. However, the person reading it might be lonely or vulnerable, and now feels that they have found people to listen to them. They might be depressed or unhappy and looking to blame their situation on something or someone. They might feel that nobody understands them, but these people seem to.

The discussion is then taken to a more private place, to the smaller but more harmful platforms that may fall outside the scope of the Bill, but that will now become the go-to place for spreading extremism, misinformation and other harmful content. The radicalisation continues there—harder to track, harder to monitor and harder to stop. Let us remember, however, that all of that started with those legal but harmful comments being witnessed. They were clearly unacceptable, but mainstream social media give them legitimacy. The Online Safety Bill will do nothing to stop that.

Unfortunately, that chain of events occurs far too often. It is a story told many times, about how somebody vulnerable is lured in by those wishing to spread their hatred. It is hosted by major social media platforms. Hon. Members may remember the case of John, a teenager radicalised online and subsequently sentenced. His story was covered by The Guardian last year. John was feeling a sense of hopelessness, which left him susceptible to the messaging of the far right. Aged 15, he felt “written off”: he was in the bottom set at school, with zero exam expectations, and feeling that his life opportunities would be dismal. The far right, however, promised him a future. John became increasingly radicalised by an online barrage of far-right disinformation. He said:

“I was relying on the far right for a job. They were saying that when they got power they would be giving jobs to people like me”.

John now says:

“Now I know the posts were all fake, but the 15-year-old me didn’t bother to fact-check.”

For some people in the room, that might seem like a totally different world. Thankfully, for most of us, it is. However, if Members take the time to see some of that stuff online, it is extremely disturbing and alarming. It is a world that we do not understand, but we have to be aware that it exists. The truth, as we can see, is that such groups use popular online platforms to lure in young people and give them a sense of community. One white nationalist group actively targets younger recruits and recently started Call of Duty warcraft gaming tournaments for its supporters. Let us be clear: John was 15, but he could easily have been 18, 19 or indeed significantly older.

John was radicalised by the far right, but we know that similar methods are used by Islamist extremists. A 2020 report from New York University’s Centre for Global Affairs stated:

“The age of social media has allowed ISIS to connect with a large-scale global audience that it would not be able to reach without it...Through strategic targeting, ISIS selects those who are most vulnerable and susceptible to radicalization”.

That includes those who are

“searching for meaning or purpose in their life, feeling anger and…alienated from society”.

The ages that are most vulnerable are 15 to 25.

Social media platforms allow ISIS to present its propaganda as mainstream news at little to no cost. Preventing that harm and breaking those chains of radicalisation is, however, possible, and the Bill could go much further to put the responsibility not on the user, but on the platforms. I believe that those platforms need unique regulation, because social media interaction is fundamentally different from real-life social interaction.

Social media presents content to us as if it is the only voice and viewpoint. On social media, people are far more likely to say things that they never would in person. On social media, those views spread like wildfire in a way that they would not in real life. On social media, algorithms find such content and pump it towards us, in a way that can become overwhelming and that can provide validity and reassurance where doubt might otherwise set in.

Allowing that content to remain online without warnings, or allowing it to be visible to all users unless they go searching through their settings to turn it off—which is wholly unrealistic—is a dereliction of duty and a missed opportunity to clean up the platforms and break the chains of radicalisation. As I set out, the chain of events is not unique to one form of radicalisation or hateful content. The same online algorithms that present extremist content to users also promote negative body image, eating disorders, and self-harm and suicide content.

I hope the Committee realises why I am so impassioned about “legal but harmful” clauses, and why I am particularly upset that a few Conservative Members appear to believe that such content should remain unchecked online because of free speech, with full knowledge that it is exactly that content that serves as the gateway for people to self-harm and to be radicalised. That is not free speech.

15:00
There is broad consensus across the Committee that the Bill as a whole must do greater good than harm and lead the world in effectively regulating the internet for the benefit and safety of its users. However, there remain a number of considerable gaps that will allow harm to continue online. One small step that the Government could commit to today—I urge the Minister to do so—is to accept at least the Opposition amendment (a) to amendment 15 and amendment (a) to amendment 16, which would define and explicitly categorise content that incites hateful extremism as harmful content in the Bill, ensuring that platforms have a responsibility to find, label and hopefully hide that content from users by default. The Government can be assured of the Opposition’s support to strengthen the Bill further, including in the “legal but harmful” area, in the face of a very small number of Conservative Members who are resisting on the basis of ideological purity rather than preventing real-life harm.
Allowing such content freely on platforms and doing nothing to ensure that smaller but high-harm platforms are brought into the remit of this Bill is a backward step. We should be strengthening, not weakening, the Bill in this Committee. That is why I oppose the Government’s position and wholeheartedly support the Opposition’s amendments to clause 14.
Paul Scully

I have talked a little already about these amendments, so let me sum up where I think we are. I talked about harmful health content and why it is not included. The Online Safety Bill will force social media companies to tackle health misinformation and disinformation online, where it constitutes a criminal offence. It includes the communications offence, which would capture posts encouraging dangerous hoax cures, where the sender knows the information to be false and intends to cause harm, such as encouraging drinking bleach to cure cancer, which we heard about a little earlier.

The legislation is only one part of the wider Government approach to this issue. It includes the work of the counter-disinformation unit, which brings together cross-Government monitoring and analysis capabilities and engages with platforms directly to ensure that appropriate action is taken, in addition to the Government’s work to build users’ resilience to misinformation through media literacy.

Including harmful health content as a category risks requiring companies to apply the adult user empowerment tools to an unfeasibly large volume of content—way beyond just the vaccine efficacy that was mentioned. That has implications both for regulatory burden and for freedom of expression, as it may capture important health advice. Similarly, on climate change, the Online Safety Bill itself will introduce new transparency, accountability and free speech duties on category 1 services. If a platform says that certain types of content are not allowed, it will be held to account for their removal.

We recognised that there was a heightened risk of disinformation surrounding the COP26 summit. The counter-disinformation unit led by the Department for Digital, Culture, Media and Sport brought together monitoring and analysis capabilities across Government to understand disinformation that posed a risk to public safety or to delegates or that represented attempts at interference from malign actors. We are clear that free debate is essential to a democracy and that the counter-disinformation unit should not infringe upon political debate. The Government already work closely with the major social media platforms to encourage them to collaborate at speed to remove disinformation, as per their terms of service.

Amendment (a) to amendment 15 and amendment (a) to amendment 16 would create that new category of content that incites hateful extremism. That is closely aligned with the approach that the Government are already taking with amendment 15, specifically subsections (8C) and (8D), which create a category of content that is abusive or incites hate on the basis of race, religion, sex, sexual orientation, disability, or gender reassignment. Those conditions would likely capture the majority of the kinds of content that the hon. Members are seeking to capture through their hateful extremism category. For example, it would capture antisemitic abuse and conspiracy theories, racist abuse and promotion of racist ideologies.

Furthermore, where companies’ terms of service say they prohibit or apply restrictions to the kind of content listed in the Opposition amendments, companies must ensure that those terms are consistently enforced. It comes back so much to the enforcement. They must also ensure that the terms of service are easily understandable.

Charlotte Nichols

If this is about companies enforcing what is in their terms of service for the use of their platforms, could it not create a perverse incentive for them to have very little in their terms of service? If they will be punished for not enforcing their terms of service, surely they will want them to be as lax as possible in order to limit their legal liability for enforcing them. Does the Minister follow?

Paul Scully

I follow, but I do not agree. The categories of content in proposed new subsections (8C) and (8D), introduced by amendment 15, underpin a lot of this. I answered the question in an earlier debate when talking about the commercial impetus. I cannot imagine many mainstream advertisers wanting to advertise with a company that removed from its terms of service the exclusion of racial abuse, misogyny and general abuse. We have seen that commercial impetus really kicking in with certain platforms. For those reasons, I am unable to accept the amendments to the amendments, and I hope that the Opposition will not press them to a vote.

Alex Davies-Jones

I am grateful for the opportunity to push the Minister further. I asked him whether he could outline where the list in amendment 15 came from. Will he publish the research that led him to compile that specific list of priority harms?

Paul Scully

The definitions that we have taken are ones that strike the right balance and have a comparatively high threshold, so that they do not capture challenging and robust discussions on controversial topics.

Amendment 8 agreed to.

Amendments made: 9, in clause 14, page 14, line 5, after “to” insert “effectively”.

This amendment strengthens the duty in this clause by requiring that the systems or processes used to deal with the kinds of content described in subsections (8B) to (8D) (see Amendment 15) should be designed to effectively increase users’ control over such content.

Amendment 10, in clause 14, page 14, line 6, leave out from “encountering” to “the” in line 7 and insert

“content to which subsection (2) applies present on”.

This amendment inserts a reference to the kinds of content now relevant for this clause, instead of referring to priority content that is harmful to adults.

Amendment 11, in clause 14, page 14, line 9, leave out from “to” to end of line 10 and insert

“content present on the service that is a particular kind of content to which subsection (2) applies”.—(Paul Scully.)

This amendment inserts a reference to the kinds of content now relevant for this clause, instead of referring to priority content that is harmful to adults.

Alex Davies-Jones

I beg to move amendment 102, in clause 14, page 14, line 12, leave out “made available to” and insert “in operation for”.

This amendment, and Amendment 103, relate to the tools proposed in Clause 14 which will be available for individuals to use on platforms to protect themselves from harm. This amendment specifically forces platforms to have these safety tools “on” by default.

The Chair

With this it will be convenient to discuss amendment 103, in clause 14, page 14, line 15, leave out “take advantage of” and insert “disapply”.

This amendment relates to Amendment 102.

Alex Davies-Jones

The amendments relate to the tools proposed in clause 14, which as we know will be available for individuals to use on platforms to protect themselves from harm. As the Minister knows, Labour fundamentally disagrees with that approach, which will place the onus on the user, rather than the platform, to protect themselves from harmful content. It is widely recognised that the purpose of this week’s Committee proceedings is to allow the Government to remove the so-called “legal but harmful” clauses and replace them with the user empowerment tool option. Let us be clear that that goes against the very essence of the Bill, which was created to address the particular way in which social media allows content to be shared, spread and broadcast around the world at speed.

This approach could very well see a two-tier internet system develop, which leaves those of us who choose to utilise the user empowerment tools ignorant of harmful content perpetuated elsewhere for others to see. The tools proposed in clause 14, however, reflect something that we all know to be true: that there is some very harmful content out there for us all to see online. We can all agree that individuals should therefore have access to the appropriate tools to protect themselves. It is also right that providers will be required to ensure that adults have greater choice and control over the content that they see and engage with, but let us be clear that instead of focusing on defining exactly what content is or is not harmful, the Bill should focus on the processes by which harmful content is amplified on social media.

However, we are where we are, and Labour believes that it is better to have the Bill over the line, with a regulator in place with some powers, than simply to do nothing at all. With that in mind, we have tabled the amendment specifically to force platforms to have safety tools on by default. We believe that the user empowerment tools should be on by default and that they must be appropriately visible and easy to use. We must recognise that for people at a point of crisis—if a person is suffering with depressive or suicidal thoughts, or with significant personal isolation, for example—the tools may not be at the forefront of their minds if their mental state is severely impacted.

On a similar point, we must not patronise the public. Labour sees no rational argument why the Government would not support the amendment. We should all assume that if a rational adult is able to easily find and use these user empowerment tools, then they will be easily able to turn them off if they choose to do so.

The Minister knows that I am not in the habit of guessing but, judging from our private conversations, his rebuttal to my points may be because he believes it is not the Government’s role to impose rules directly on platforms, particularly when they impact their functionality. However, for Labour, the existence of harm and the importance of protecting people online tip the balance in favour of turning these user empowerment tools on by default. We see no downside to doing so, and we now have a simple amendment that could have a significantly positive impact.

I hope the Minister and colleagues will reflect strongly on these amendments, as we believe they are a reasonable and simple ask of platforms to do the right thing and have the user empowerment tools on by default.

Kirsty Blackman

Once again, this is a very smart amendment that I wish I had thought of myself, and I am happy to support it. The case made by those campaigning for freedom of speech at any cost is about people being able to say what they want to say, no matter how harmful that may be. It is not about requiring me, or anyone else, to read those things—the harmful bile, the Holocaust denial or the promotion of suicide that is spouted. It is not freedom of speech to require someone else to see and read such content, so I cannot see any potential argument that the Government could come up with against these amendments.

The amendments have nothing to do with freedom of speech or with limiting people’s ability to say whatever they want to say or to promote whatever untruths they want to promote. However, they are about making sure that people are protected and that they are starting from a position of having to opt in if they want to see harmful content. If I want to see content about Holocaust denial—I do not want to see that, but if I did—I should have to clearly tick a button that says, “Yes, I am pretty extreme in my views and I want to see things that are abusing people. I want to see that sort of content.” I should have to opt in to be able to see that.

There are a significant number of newspapers out there. I will not even pick up a lot of them because there is so much stuff in them with which I disagree, but I can choose not to pick them up. I do not have that newspaper served to me against my will because I have the opportunity to choose to opt out from buying it. I do not have to go into the supermarket and say, “No, please do not give me that newspaper!” I just do not pick it up. If we put the Government’s proposal on its head and do what has been suggested in the Opposition amendments, everyone would be in a much better position.

Charlotte Nichols

I note that many providers of 4G internet, including the one I have on my own phone, already block adult content. Essentially, if people want to look at pornography or other forms of content, they have to proactively opt in to be allowed to see it. Would it not make sense to make something as straightforward as that, which already exists, into the model that we want on the internet more widely, as opposed to leaving it to EE and others to do?

Kirsty Blackman

I absolutely agree. Another point that has been made is that this is not creating undue burden; the Government are already creating the burden for companies—I am not saying that it is a bad burden, but the Government are already creating it. We just want people to have the opportunity to opt into it, or out of it. That is the position that we are in.

15:19
My hon. Friend the Member for Coatbridge, Chryston and Bellshill and I were having a conversation earlier about how the terms of service might say that Holocaust denial was banned. Potentially, however, the terms of service could say, “You may see content that is about Holocaust denial on our platform, because we don’t ban it.” They could explicitly have to warn people about the presence of that content.
The Opposition are suggesting flipping the issue on its head. As the hon. Member for Batley and Spen said, there is no way that people go on to Facebook and imagine that they will see extremist content. Most people do not imagine that they will be led down this rabbit hole of increasing extremism on Facebook, because Facebook is where we go to speak to our friends, to see our auntie’s holiday photos or to communicate with people.
The Minister was making slightly light of the fact that there are other ways to communicate—yes, absolutely, but two of the communities that I spend a lot of time in and where I get an awful lot of support and friendship exist only on Facebook. That is the only place where I can have those relationships with friends who live all around the world, because that is where the conversation is taking place. I will not choose to opt out of that, because I would be cut off from two of my support networks. I do not think it is right that we should be told, “If you don’t want to see extremist content, just don’t be a member of Facebook”—or whatever platform it happens to be.
That is not the way to go; we should be writing in the protections. We should be starting from the point of view that no one wants to see content on the promotion of suicide; if they do, they can tick a box to see it. We should start from that point of view: allowing people to opt in if they want to see free speech in an untrammelled way on whatever platform it is.
Kim Leadbeater

I will speak briefly in favour of amendments 102 and 103. As I mentioned a few moments ago, legal but harmful content can act as the gateway to dangerous radicalisation and extremism. Such content, hosted by mainstream social media platforms, should not be permitted to go unchecked online. I appreciate that for children the content will be banned, but I strongly believe that the default position should be for such content to be hidden from all adult users, as the amendments would ensure.

The chain of events that leads to radicalisation, as I spelt out, relies on groups and individuals reaching people unaware that they are being radicalised. The content is posted in otherwise innocent Facebook groups, forums or Twitter threads. Adding a toggle, hidden somewhere in users’ settings, which few people know about or use, will do nothing to stop that. It will do nothing to stop the harmful content from reaching vulnerable and susceptible users.

We, as legislators, have an obligation to prevent, at root, that harmful content from reaching and drawing in those who are vulnerable and susceptible to the misinformation and conspiracy spouted by vile groups and individuals wishing to spread their harm. The only way that we can make meaningful progress is by putting the responsibility squarely on platforms, to ensure that, by default, users do not come across the content in the first place.

Paul Scully

In the previous debate, I talked about amendment 15, which brought in a lot of protections against content that encourages and promotes, or provides instructions for, self-harm, suicide or eating disorders, and against content that is abusive or incites hate on the basis of race, religion, disability, sex, gender reassignment or sexual orientation. We have also placed a duty on the largest platforms to offer adults the option to filter out unverified users if they so wish. That is a targeted approach that reflects areas where vulnerable users in particular could benefit from having greater choice and control. I come back to the fact that that is the third shield and an extra safety net. A lot of the extremes we have heard about, which have been used as debating points, as important as they are, should very much be wrapped up by the first two shields.

We have a targeted approach, but it is based on choice. It is right that adult users have a choice about what they see online and who they interact with. It is right that this choice lies in the hands of those adults. The Government mandating that these tools be on by default goes against the central aim of users being empowered to choose for themselves whether they want to reduce their engagement with some kinds of legal content.

We have been clear right from the beginning that it is not the Government’s role to say what legal content adults should or should not view online or to incentivise the removal of legal content. That is why we removed the adult legal but harmful duties in the first place. I believe we are striking the right balance between empowering adult users online and protecting freedom of expression. For that reason, I am not able to accept the amendments from the hon. Member for Pontypridd.

Alex Davies-Jones

It is disappointing that the Government are refusing to back these amendments to place the toggle as “on” by default. It is something that we see as a safety net, as the Minister described. Why would someone have to choose to have the safety net there? If someone does not want it, they can easily take it away. The choice should be that way around, because it is there to protect all of us.

Charlotte Nichols

I am sure that, like me, the shadow Minister will be baffled that the Government are against our proposals to have to opt out. Surely this is something that is of key concern to the Government, given that the former MP for Tiverton and Honiton might still be an MP if users had to opt in to watching pornography, rather than being accidentally shown it when innocently searching for tractors?

Alex Davies-Jones

My hon. Friend makes a very good point. It goes to show the nature of this as a protection for all of us, even MPs, from accessing content that could be harmful to our health or, indeed, profession. Given the nature of the amendment, we feel that this is a safety net that should be available to all. It should be on by default.

Paul Scully

I should say that in the spirit of choice, companies can also choose to default it to be switched off in the first place as well.

Alex Davies-Jones

The Minister makes the point that companies can choose to have it off by default, but we would not need this Bill in the first place if companies did the right thing. Let us be clear: we would not have had to be here debating this for the past five years—for me it has been 12 months—if companies were willing to do the right thing and protect people from harmful content online. On that basis, I will push the amendments to a vote.

Question put, That the amendment be made.

Division 3

Ayes: 6

Noes: 8

Amendment made: 12, in clause 14, page 14, line 12, at end insert
“and are easy to access”.—(Paul Scully.)
This amendment requires providers to ensure that features for users to increase their control over content described in subsections (8B) to (8D) (see Amendment 15) are easy to access.
Amendment proposed: 103, in clause 14, page 14, line 15, leave out “take advantage of” and insert “disapply”.—(Alex Davies-Jones.)
This amendment relates to Amendment 102.
Question put, That the amendment be made.

Division 4

Ayes: 6

Noes: 8

Kirsty Blackman

I beg to move amendment 101, in clause 14, page 14, line 17, at end insert—

“(6A) A duty to ensure features and provisions in subsections (2), (4) and (6) are accessible and understandable to adult users with learning disabilities.”

This amendment creates a duty that user empowerment functions must be accessible and understandable to adult users with learning disabilities.

This issue was originally brought to my attention by Mencap. It is incredibly important, and it has potentially not been covered adequately by either our previous discussions of the Bill or the Bill itself. The amendment is specifically about ensuring that available features are accessible to adult users with learning disabilities. An awful lot of people use the internet, and people should not be excluded from using it and having access to safety features because they have a learning disability. That should not be the case, for example, when someone is trying to find how to report something on a social media platform. I had an absolute nightmare trying to report a racist gif that was offered in the list of gifs that came up. There is no potential way to report that racist gif to Facebook because it does not take responsibility for it, and GIPHY does not take responsibility for it because it might not be a GIPHY gif.

It is difficult to find the ways to report some of this stuff and to find some of the privacy settings. Even when someone does find the privacy settings, on a significant number of these platforms they do not make much sense—they are not understandable. I am able to read fairly well, I would think, and I am able to speak in the House of Commons, but I still do not understand some of the stuff in the privacy features found on some social media sites. I cannot find how to toggle off things that I want to toggle off on the level of accessibility or privacy that I have, particularly on social media platforms; I will focus on those for the moment. The Bill will not achieve even its intended purpose if all people using these services cannot access or understand the safety features and user empowerment tools.

I am quite happy to talk about the difference between the real world and the online world. My online friends have no problem with me talking about the real world as if it is something different, because it is. In the real world, we have a situation where things such as cuckooing take place and people take advantage of vulnerable adults. Social services, the police and various organisations are on the lookout for that and try to do what they can to put protections in place. I am asking for more parity with the real world here. Let us ensure that we have the protections in place, and that people who are vulnerable and taken advantage of far too often have access to those tools in order to protect themselves. That is a particularly reasonable ask.

Let us say that somebody with a learning disability particularly likes cats; the Committee may have worked out that I also particularly like cats. Let us say that they want to go on TikTok or YouTube and look at videos of cats. They have to sign up to watch videos of cats. They may not have the capacity or understanding to know that there might be extreme content on those sites. They may not be able to grasp that. It may never cross their minds that there could be extreme content on that site. When they are signing up to TikTok, they should not have to go and find the specific toggle to switch off eating disorder content. All they had thought about was that this is a cool place to look at videos of cats.

15:30
I am asking the Minister to make it really clear that these tools should be available and accessible to everybody, and that Ofcom will look at that availability and accessibility and listen to experts who say that there is a real issue with a certain website because the tools are not as accessible as they should be. Would the Minister be kind enough to make that incredibly clear, so that platforms are aware of the direction and the intention? Ofcom also needs to be aware that this is a priority and that these tools should be available to everyone in order to provide that level of accessibility, and in order that everybody can enjoy cat videos.

Paul Scully

I am happy to do that. In the same way that we spoke this morning about children’s protection, I am very aware of the terms of service and what people are getting into by looking for cats or whatever they want to do.

The Bill requires providers to make all the user empowerment and protection tools available to all adults, including those with learning disabilities. Clause 14(4) makes it explicitly clear that features offered by providers, in compliance with the duty for users to be given greater control over the content that they see, must be made available to all adult users. Clause 14(5) further outlines that providers must have clear and accessible terms of service about what tools are offered in their service and how users may take advantage of them. We have strengthened the accessibility of the user empowerment duties through Government amendment 12 as well, to make sure that user empowerment tools and features are easy for users to access.

In addition, clause 58(1) says that providers must offer all adult users the option to verify themselves so that vulnerable users, including those with learning disabilities, are not at a disadvantage as a result of the user empowerment duties. Clause 59(2) and (3) further stipulate that in producing the guidance for providers about the user verification duty, Ofcom must have particular regard to the desirability of making identity verification available to vulnerable adult users, and must consult with persons who represent the interests of vulnerable adult users. That is about getting the thoughts of experts and advocates into their processes to make sure that they can enforce what is going on.

In addition, Ofcom is subject to the public sector equality duty, so it will have to take into account the ways in which people with disabilities may be impacted when performing its duties, such as writing its codes of practice for the user empowerment duty. I hope the hon. Member will appreciate the fact that, in a holistic way, that covers the essence of exactly what she is trying to do in her amendment, so I do not believe her amendment is necessary.

Kirsty Blackman

In view of the Minister’s statement, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendments made: 13, in clause 14, page 14, line 26, leave out paragraph (a) and insert—

“(a) the likelihood of adult users encountering content to which subsection (2) applies by means of the service, and”

This amendment is about factors relevant to the proportionality of measures to comply with the duty in subsection (2). The new wording replaces a reference to an adults’ risk assessment, as adults’ risk assessments are no longer required (see Amendment 6 which removes clause 12).

Amendment 14, in clause 14, page 14, line 29, leave out “a” and insert “the”.—(Paul Scully.)

This is a technical amendment consequential on Amendment 13.

Amendment (a) proposed to amendment 15: at end insert—

“(8E) Content is within this subsection if it—

(a) incites hateful extremism,

(b) provides false information about climate change, or

(c) is harmful to health.”—(Alex Davies-Jones.)

Question put, That the amendment be made.

Division 5

Ayes: 6

Noes: 8

Amendments made: 15, in clause 14, page 14, line 29, at end insert—
“(8A) Subsection (2) applies to content that—
(a) is regulated user-generated content in relation to the service in question, and
(b) is within subsection (8B), (8C) or (8D).
(8B) Content is within this subsection if it encourages, promotes or provides instructions for—
(a) suicide or an act of deliberate self-injury, or
(b) an eating disorder or behaviours associated with an eating disorder.
(8C) Content is within this subsection if it is abusive and the abuse targets any of the following characteristics—
(a) race,
(b) religion,
(c) sex,
(d) sexual orientation,
(e) disability, or
(f) gender reassignment.
(8D) Content is within this subsection if it incites hatred against people—
(a) of a particular race, religion, sex or sexual orientation,
(b) who have a disability, or
(c) who have the characteristic of gender reassignment.”
This amendment describes the content relevant to the duty in subsection (2) of clause 14. The effect is (broadly) that providers must offer users tools to reduce their exposure to these kinds of content.
Amendment 16, in clause 14, page 14, line 30, leave out subsection (9) and insert—
“(9) In this section—
‘disability’ means any physical or mental impairment;
‘injury’ includes poisoning;
‘non-verified user’ means a user who has not verified their identity to the provider of a service (see section 58(1));
‘race’ includes colour, nationality, and ethnic or national origins.”
This amendment inserts definitions of terms now used in clause 14.
Amendment 17, in clause 14, page 14, line 33, at end insert
“, and
(b) references to religion include references to a lack of religion.
(11) For the purposes of this section, a person has the characteristic of gender reassignment if the person is proposing to undergo, is undergoing or has undergone a process (or part of a process) for the purpose of reassigning the person’s sex by changing physiological or other attributes of sex, and the reference to gender reassignment in subsection (8C) is to be construed accordingly.” —(Paul Scully.)
This amendment clarifies the meaning of terms now used in clause 14.
Clause 14, as amended, ordered to stand part of the Bill.
Clause 18
Duty about content reporting
Amendment made: 18, in clause 18, page 19, line 15, leave out subsection (5).—(Paul Scully.)
This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).—(Paul Scully.)
Paul Scully

I beg to move amendment 19, in clause 18, page 19, line 32, leave out from “also” to second “section”.

This is a technical amendment relating to Amendment 20.

The Chair

With this it will be convenient to discuss the following:

Government amendments 20 and 21, 26 and 27, 30, 34 and 35, 67, 71, 46 and 47, 50, 53, 55 to 57, and 95.

Government new clause 3—Duty not to act against users except in accordance with terms of service.

Government new clause 4—Further duties about terms of service.

Government new clause 5—OFCOM’s guidance about duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service).

Government new clause 6—Interpretation of this Chapter.

Paul Scully

I am seeking to impose new duties on category 1 services to ensure that they are held accountable to their terms of service and to protect free speech. Under the status quo, companies get to decide what we do and do not see online. They can arbitrarily ban users or remove their content without offering any form of due process and with very few avenues for users to achieve effective redress. On the other hand, companies’ terms of service are often poorly enforced, if at all.

I have mentioned before the horrendous abuse suffered by footballers around the 2020 Euro final, despite most platforms’ terms and conditions clearly not allowing that sort of content. There are countless similar instances, for example, relating to antisemitic abuse—as we have heard—and other forms of hate speech, that fall below the criminal threshold.

This group of amendments relates to a series of new duties that will fundamentally reset the relationship between platforms and their users. The duties will prevent services from arbitrarily removing content or suspending users without offering users proper avenues to appeal. At the same time, they will stop companies making empty promises to their users about their terms of service. The duties will ensure that where companies say they will remove content or ban a user, they actually do.

Government new clause 3 is focused on protecting free speech. It would require providers of category 1 services to remove or restrict access to content, or ban or suspend users, only where this is consistent with their terms of service. Ofcom will oversee companies’ systems and processes for discharging those duties, rather than supervising individual decisions.

Damian Collins

I am grateful for what the Minister has said, and glad that Ofcom will have a role in seeing that companies do not remove content that is not in breach of terms of service where there is no legal requirement to do so. In other areas of the Bill where these duties exist, risk assessments are to be conducted and codes of practice are in place. Will there similarly be risk assessments and codes of practice to ensure that companies comply with their freedom of speech obligations?

Paul Scully

Absolutely. As I say, it is really important that people understand right at the beginning, through risk assessments, what they are signing up for and what they can expect. To come back to the point of whether someone is an adult or a child, it is really important that parents lean in when it comes to children’s protections; that is a very important tool in the armoury.

New clause 4 will require providers of category 1 services to ensure that what their terms of service say about their content moderation policies is clear and accessible. Those terms have to be easy for users to understand, and should have sufficient detail, so that users know what to expect, in relation to moderation actions. Providers of category 1 services must apply their terms of service consistently, and they must have in place systems and processes that enable them to enforce their terms of service consistently.

These duties will give users the ability to report any content or account that they suspect does not meet a platform’s terms of service. They will also give users the ability to make complaints about platforms’ moderation actions, and raise concerns if their content is removed in error. Providers will be required to take appropriate action in response to complaints. That could include removing content that they prohibit, or reinstating content removed in error. These duties ensure that providers are made aware of issues to do with their services and require them to take action to resolve them, to keep users safe, and to uphold users’ rights to free speech.

The duties set out in new clauses 3 and 4 will not apply to illegal content, content that is harmful to children or consumer content. That is because illegal content and content that is harmful to children are covered by existing duties in the Bill, and consumer content is already regulated under consumer protection legislation. Companies will also be able to remove any content where they have a legal obligation to do so, or where the user is committing a criminal offence, even if that is not covered in their terms of service.

New clause 5 will require Ofcom to publish guidance to help providers of category 1 services to understand what they need to do to comply with their new duties. That could include guidance on how to make their terms of service clear and easy for users to understand, and how to operate an effective reporting and redress mechanism. The guidance will not prescribe what types of content companies should include in their terms of service, or how they should treat such content. That will be for companies to decide, based on their knowledge of their users, and their brand and commercial incentives, and subject to their other legal obligations.

New clause 6 clarifies terms used in new clauses 3 and 4. It also includes a definition of “Consumer content”, which is excluded from the main duties in new clauses 3 and 4. This covers content that is already regulated by the Competition and Markets Authority and other consumer protection bodies, such as content that breaches the Consumer Protection from Unfair Trading Regulations 2008. These definitions are needed to provide clarity to companies seeking to comply with the duties set out in new clauses 3 and 4.

The remaining amendments to other provisions in the Bill are consequential on the insertion of these new transparency, accountability and free speech duties. They insert references to the new duties in, for example, the provisions about content reporting, enforcement, transparency and reviewing compliance. That will ensure that the duties apply properly to the new measure.

Amendment 30 removes the duty on platforms to include clear and accessible provisions in their terms of service informing users that they have a right of action in court for breach of contract if a platform removes or restricts access to their content in violation of its terms of service. This is so that the duty can be moved to new clause 4, which focuses on ensuring that platforms comply with their terms of service. The replacement duty in new clause 4 will go further than the original duty, in that it will cover suspensions and bans of users as well as restrictions on content.

Amendments 46 and 47 impose a new duty on Ofcom to have regard to the need for it to be clear to providers of category 1 services what they must do to comply with their new duties. These amendments will also require Ofcom to have regard to the extent to which providers of category 1 services are demonstrating, in a transparent and accountable way, how they are complying with their new duties.

Lastly, amendment 95 temporarily exempts video-sharing platforms that are category 1 services from the new terms of service duties, as set out in new clauses 3 and 4, until the Secretary of State agrees that the Online Safety Bill is sufficiently implemented. This approach simultaneously maximises user protections by the temporary continuation of the VSP regime and minimises burdens for services and Ofcom. The changes are central to the Government’s intention to hold companies accountable for their promises. They will protect users in a way that is in line with companies’ terms of service. They are a critical part of the triple shield, which aims to protect adults online. It ensures that users are safe by requiring companies to remove illegal content, enforce their terms of service and provide users with tools to control their online experiences. Equally, these changes prevent arbitrary or random content removal, which helps to protect pluralistic and robust debate online. For those reasons, I hope that Members can support the amendments.

15:45
Alex Davies-Jones

This is an extremely large grouping so, for the sake of the Committee, I will do my best to keep my comments focused and brief where possible. I begin by addressing Government new clauses 3 and 4 and the consequential amendments.

Government new clause 3 introduces new duties that aim to ensure that the largest or most risky online service providers design systems and processes that ensure they cannot take down or restrict content in a way that prevents a person from seeing it without further action by the user, or ban users, except in accordance with their own terms of service, or if the content breaks the law or contravenes the Online Safety Bill regime. This duty is referred to as the duty not to act against users except in accordance with terms of service. In reality, that will mean that the focus remains far too much on the banning, taking down and restriction of content, rather than our considering the systems and processes behind the platforms that perpetuate harm.

Labour has long held the view that the Government have gone down an unhelpful cul-de-sac on free speech. Instead of focusing on defining exactly which content is or is not harmful, the Bill should be focused on the processes by which harmful content is amplified on social media. We must recognise that a person posting a racist slur online that nobody notices, shares or reads is significantly less harmful than a post that can quickly go viral, and can within hours gain millions of views or shares. We have talked a lot in this place about Kanye West and the comments he has made on Twitter in the past few weeks. It is safe to say that a comment by Joe Bloggs in Hackney that glorifies Hitler does not have the same reach or produce the same harm as Kanye West saying exactly the same thing to his 30 million Twitter followers.

Our approach has the benefit of addressing the things that social media companies can control—for example, how content spreads—rather than the things they cannot control, such as what people say online. It reduces the risk to freedom of speech because it tackles how content is shared, rather than relying entirely on taking down harmful content. Government new clause 4 aims to improve the effectiveness of platforms’ terms of service in conjunction with the Government’s new triple shield, which the Committee has heard a lot about, but the reality is that they are ultimately seeking to place too much of the burden of protection on extremely flexible and changeable terms of service.

If a provider’s terms of service say that certain types of content are to be taken down or restricted, then providers must run systems and processes to ensure that that can happen. Moreover, people must be able to report breaches easily, through a complaints service that delivers appropriate action, including when the service receives complaints about the provider. This “effectiveness” duty is important but somewhat misguided.

The Government, having dropped some of the “harmful but legal” provisions, seem to expect that if large and risky services—the category 1 platforms—claim to be tackling such material, they must deliver on that promise to the customer and user. This reflects a widespread view that companies may pick and choose how to apply their terms of service, or implement them loosely and interchangeably, as we have heard. Those failings will lead to harm when people encounter things that they would not have thought would be there when they signed up. All the while, service providers that do not fall within category 1 need not enforce their terms of service, or may do so erratically or discriminatorily. That includes search engines, no matter how big.

This large bundle of amendments seems to do little to actually keep people safe online. I have already made my concerns about the Government’s so-called triple shield approach to internet safety clear, so I will not repeat myself. We fundamentally believe that the Government’s approach, which places too much of the onus on the user rather than the platform, is wrong. We therefore cannot support the approach that is taken in the amendments. That being said, the Minister can take some solace from knowing that we see the merits of Government new clause 5, which

“requires OFCOM to give guidance to providers about complying with the duties imposed by NC3 and NC4”.

If this is the avenue that the Government insist on going down, it is absolutely vital that providers are advised by Ofcom on the processes they will be required to take to comply with these new duties.

Amendment 19 agreed to.

Amendment made: 20, in clause 18, page 19, line 33, at end insert

“, and

(b) section (Further duties about terms of service)(5)(a) (reporting of content that terms of service allow to be taken down or restricted).”—(Paul Scully.)

This amendment inserts a signpost to the new provision about content reporting inserted by NC4.

Clause 18, as amended, ordered to stand part of the Bill.

Clause 19

Duties about complaints procedures

Amendment made: 21, in clause 19, page 20, line 15, leave out “, (3) or (4)” and insert “or (3)”.—(Paul Scully.)

This amendment removes a reference to clause 20(4), as that provision is moved to NC4.

Paul Scully

I beg to move amendment 22, in clause 19, page 20, line 27, leave out from “down” to “and” in line 28 and insert

“or access to it being restricted, or given a lower priority or otherwise becoming less likely to be encountered by other users,”.

NC2 states what is meant by restricting users’ access to content, and this amendment makes a change in line with that, to avoid any implication that downranking is a form of restriction on access to content.

The Chair

With this it will be convenient to discuss the following:

Government amendment 59.

Government new clause 2—Restricting users’ access to content.

Paul Scully

These amendments clarify the meaning of “restricting access to content” and “access to content” for the purposes of the Bill. Restricting access to content is an expression that is used in various provisions across the Bill, such as in new clause 2, under which providers of category 1 services will have a duty to ensure that they remove or restrict access to users’ content only where that is in accordance with their terms of service or another legal obligation. There are other such references in clauses 15, 16 and 17.

The amendments make it clear that the expression

“restricting users’ access to content”

covers cases where a provider prevents a user from accessing content without that user taking a prior step, or where content is temporarily hidden from a user. They also make it clear that this expression does not cover any restrictions that the provider puts in place to enable users to apply user empowerment tools to limit the content that they encounter, or cases where access to content is controlled by another user, rather than by the provider.

The amendments are largely technical, but they do cover things such as down-ranking. Amendment 22 is necessary because the previous wording of this provision wrongly suggested that down-ranking was covered by the expression “restricting access to content”. Down-ranking is the practice of giving content a lower priority on a user’s feed. The Government intend that users should be able to complain if they feel that their content has been inappropriately down-ranked as a result of the use of proactive technology. This amendment ensures consistency.

I hope that the amendments provide clarity as to the meaning of restricting access to content for those affected by the Bill, and assist providers with complying with their duties.

Alex Davies-Jones

Again, I will keep my comments on clause 19 brief, as we broadly support the intentions behind the clause and the associated measures in the grouping. In the previous Bill Committee, my hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) spoke at length about this important clause, which relates to the all-important complaints procedures of social media platforms and companies.

During the previous Committee, Labour tabled amendments that would have empowered more individuals to make a complaint about search content in the event of non-compliance. In addition, we wanted an external complaints option for individuals seeking redress. Sadly, all those amendments were voted down by the last Committee, but I must once again press the Minister on those points, particularly in the context of the new amendments that have been tabled.

Without redress for individual complaints, once internal mechanisms have been exhausted, victims of online abuse could be left with no further options. Consumer protections could be compromised and freedom of expression, with which the Government seem to be borderline obsessed, could be infringed for people who feel that their content has been unfairly removed.

Government new clause 2 deals with the meaning of references to

“restricting users’ access to content”,

in particular by excluding restrictions resulting from the use of user empowerment tools as described in clause 14. We see amendments 22 and 59 as important components of new clause 2, and are therefore more than happy to support them. However, I reiterate to the Minister and place on the record once again the importance of introducing an online safety ombudsman, which we feel is crucial to new clause 2. The Joint Committee recommended the introduction of such an ombudsman, who would consider complaints when internal routes of redress had not resulted in resolution, had failed to address risk and had led to significant and demonstrable harm. As new clause 2 relates to restricting users’ access to content, we must also ensure that there is an appropriate channel for complaints if there is an issue that users wish to take up around restrictions in accessing content.

By now, the Minister will be well versed in my thoughts on the Government’s approach, and on the reliance on the user empowerment tool approach more broadly. It is fundamentally an error to pursue a regime that is so content-focused. Despite those points, we see the merits in Government amendments 22 and 59, and in new clause 2, so have not sought to table any further amendments at this stage.

Kirsty Blackman

I am slightly confused, and would appreciate a little clarification from the Minister. I understand what new clause 2 means; if the hon. Member for Pontypridd says that she does not want to see content of a certain nature, and I put something of that nature online, I am not being unfairly discriminated against in any way because she has chosen to opt out of receiving that content. I am slightly confused about the downgrading bit.

I know that an awful lot of platforms use downgrading when there is content that they find problematic, or something that they feel is an issue. Rather than taking that content off the platform completely, they may just no longer put it in users’ feeds, for example; they may move it down the priority list, and that may be part of what they already do to keep people safe. I am not trying to criticise what the Government are doing, but I genuinely do not understand whether that downgrading would still be allowed, whether it would be an issue, and whether people could complain about their content being downgraded because the platform was a bit concerned about it, and needed to check it out and work out what was going on, or if it was taken off users’ feeds.

Some companies, if they think that videos have been uploaded by people who are too young to use the platform, or by a registered child user of the platform, will not serve that content to everybody’s feeds. I will not be able to see something in my TikTok feed that was published by a user who is 13, for example, because there are restrictions on how TikTok deals with and serves that content, in order to provide increased protection and the safety that they want on their services.

Will it still be acceptable for companies to have their own internal downgrading system, in order to keep people safe, when content does not necessarily meet an illegality bar or child safety duty bar? The Minister has not used the phrase “market forces”; I think he said “commercial imperative”, and he has talked a lot about that. Some companies and organisations use downgrading to improve the systems on their site and to improve the user experience on the platform. I would very much appreciate it if the Minister explained whether that will still be the case. If not, will we all have a worse online experience as a result?

Paul Scully

I will have a go at that, but I am happy to write to the hon. Lady if I do not respond as fully as she wants. Down-ranking content is a moderation action, as she says, but it is not always done just to restrict access to content; there are many reasons why people might want to do it. Through these changes, we are saying that the content is not actually being restricted; it can still be seen if it is searched for or otherwise encountered. That is consistent with the clarification.

Damian Collins

This is quite an important point. The hon. Member for Aberdeen North was talking about recommendation systems. If a platform chooses not to amplify content, that is presumably not covered. As long as the content is accessible, someone could search and find it. That does not inhibit a platform’s decision, for policy reasons or whatever, not to actively promote it.

15:09
Paul Scully

Absolutely. There are plenty of reasons why platforms will rank users’ content, including down-ranking it. Providing personal content recommendations will have that process in it as well. It is not practical to specify that restricting access includes down-ranking. That is why we made that change.

Amendment 22 agreed to.

Amendments made: 23, in clause 19, page 21, line 7, leave out from “The” to “complaints” in line 10 and insert

“relevant kind of complaint for Category 1 services is”.

This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 24, in clause 19, page 21, line 12, leave out sub-paragraph (i).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 25, in clause 19, page 21, line 18, leave out paragraphs (c) and (d).

This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 26, in clause 19, page 21, line 33, leave out from “also” to second “section”.

This is a technical amendment relating to Amendment 27.

Amendment 27, in clause 19, page 21, line 34, at end insert

“, and

(b) section (Further duties about terms of service)(6) (complaints procedure relating to content that terms of service allow to be taken down or restricted).”—(Paul Scully.)

This amendment inserts a signpost to the new provision about complaints procedures inserted by NC4.

Clause 19, as amended, ordered to stand part of the Bill.

Clause 20

Duties about freedom of expression and privacy

Paul Scully

I beg to move amendment 28, in clause 20, page 21, line 42, after “have” insert “particular”.

This amendment has the result that providers of regulated user-to-user services must have particular regard to freedom of expression when deciding on and implementing safety measures and policies.

The Chair

With this it will be convenient to discuss Government amendments 29, 31, 36 to 38 and 40.

Paul Scully

I will be brief. The rights to freedom of expression and privacy are essential to our democracy. We have long been clear that the Bill must not interfere with those rights. The amendments will further strengthen protections for freedom of expression and privacy and ensure consistency in the Bill. They require regulated user-to-user and search services to have particular regard to freedom of expression and privacy when deciding on and implementing their safety measures and policies.

Amendments 28, 29 and 31 mean that service providers will need to thoroughly consider the impact that their safety and user empowerment measures have on users’ freedom of expression and privacy. That could mean, for example, providing detailed guidance and training for human reviewers about content that is particularly difficult to assess. Amendments 36 and 37 apply that to search services in relation to their safety duties. Ofcom can take enforcement action against services that fail to comply with those duties and will set out steps that platforms can take to safeguard freedom of expression and privacy in their codes of practice.

Those changes will not detract from platforms’ illegal content and child protection duties. Companies must tackle illegal content and ensure that children are protected on their services, but the amendments will protect against platforms taking an over-zealous approach to removing content or undermining users’ privacy when complying with their duties. Amendments 38 and 40 ensure that the rest of the Bill is consistent with those changes. The new duties will therefore ensure that companies give proper consideration to users’ rights when complying with them, and that that is reflected in Ofcom’s codes, providing greater clarity to companies.

Amendment 28 agreed to.

Amendments made: 29, in clause 20, page 22, line 2, after “have” insert “particular”.

This amendment has the result that providers of regulated user-to-user services must have particular regard to users’ privacy when deciding on and implementing safety measures and policies.

Amendment 30, in clause 20, page 22, line 6, leave out subsection (4).

This amendment removes clause 20(4), as that provision is moved to NC4.

Amendment 31, in clause 20, page 22, line 37, leave out paragraph (c) and insert—

“(c) section 14 (user empowerment),”.—(Paul Scully.)

The main effect of this amendment is that providers must consider freedom of expression and privacy issues when deciding on measures and policies to comply with clause 14 (user empowerment). The reference to clause 14 replaces the previous reference to clause 13 (adults’ safety duties), which is now removed (see Amendment 7).

Question proposed, That the clause, as amended, stand part of the Bill.

The Chair

With this it will be convenient to discuss clause 30 stand part.

Alex Davies-Jones

I will speak broadly to clause 20, as it is an extremely important clause, before making remarks about the group of Government amendments we have just voted on.

Clause 20 is designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, as Labour has repeatedly argued, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulations in less substantive ways. That is a genuine fear here.

We all want to see a Bill in place that protects free speech, but that cannot come at the expense of safety online. The situation with regards to content that is harmful to adults has become even murkier with the Government’s attempts to water down the Bill and remove adult risk assessments entirely.

The Minister must acknowledge that there is a balance to be achieved. We all recognise that. The truth is—and this is something that his predecessor, or should I say his predecessor’s predecessor, touched on when we considered this clause in the previous Bill Committee—that at the moment platforms are extremely inconsistent in their approach to getting the balance right. Although Labour is broadly supportive of this clause and the group of amendments, we feel that now is an appropriate time to put on record our concerns over the important balance between safety, transparency and freedom of expression.

Labour has genuine concerns over the future of platforms’ commitment to retaining that balance, particularly if the behaviours following the recent takeover of Twitter by Elon Musk are anything to go by. Since Elon Musk took over ownership of the platform, he has repeatedly used Twitter polls, posted from his personal account, as metrics to determine public opinion on platform policy. The general amnesty policy and the reinstatement of Donald Trump both emerged from such polls.

According to former employees, those polls are not only inaccurate representations of the platform’s user base, but are actually

“designed to be spammed and gamed”.

The polls are magnets for bots and other inauthentic accounts. This approach and the reliance on polls have allowed Elon Musk to enact and dictate his platform’s policy on moderation and freedom of expression. Even if he is genuinely trusting the results of these polls and not gamifying them, they do not accurately represent the user base nor the best practices for confronting disinformation and harm online.

Elon Musk uses the results to claim that “the people have spoken”, but they have not. Research from leading anti-hate organisation the Anti-Defamation League shows that far-right extremists and neo-Nazis encouraged supporters to actively re-join Twitter to vote in these polls. The impacts of platforming neo-Nazis on Twitter do not need to be stated. Such users are explicitly trying to promote violent and hateful agendas, and they were banned initially for that exact reason. The bottom line is that those people were banned in line with Twitter’s terms of service at the time, and they should not be re-platformed just because of the findings of one Twitter poll.

These issues are at the very heart of Labour’s concerns in relation to the Bill—that the duties around freedom of expression and privacy will be different for those at the top of the platforms. We support the clause and the group of amendments, but I hope the Minister will be able to address those concerns in his remarks.

Paul Scully

I endorse the general approach set out by the hon. Member for Pontypridd. We do not want to define freedom of speech based on a personal poll carried out on one platform. That is exactly why we are enshrining it in this ground-breaking Bill.

We want to get the balance right. I have talked about the protections for children. We also want to protect adults and give them the power to understand the platforms they are on and the risks involved, while having regard for freedom of expression and privacy. That is a wider approach than one man’s Twitter feed. These clauses are important to ensure that the service providers interpret and implement their safety duties in a proportionate way that limits negative impact on users’ rights to freedom of expression. However, they also have to have regard to the wider definition of freedom of expression, while protecting users, which the rest of the Bill covers in a proportionate way.

Alex Davies-Jones

This goes to the heart of more than just one person’s Twitter feed, although we could say that that person is an incredibly powerful and influential figure on the platform. In the past 24 hours, Twitter has disbanded its trust and safety council. Members of that council included expert groups working to tackle harassment and child sexual exploitation, and to promote human rights. Does the Minister not feel that the council being disbanded goes to the heart of what we have been debating? It shows how a platform can remove its terms of service or change them at whim in order to prevent harm from being perpetrated on that platform.

Paul Scully

I will come back to some of the earlier points. At the end of the day, when platforms change their terms and conditions, which they are free to do, they will be judged by their users and indeed the advertisers from whom they make their money. There are market forces—I will use that phrase as well as “commercial imperative”, to get that one in there—that will drive behaviour. It may be the usability of Facebook, or Twitter’s terms and conditions and the approach of its new owner, that will drive users of those platforms to alternatives. I am old enough to remember Myspace, CompuServe and AOL, which tried to box people into their walled gardens. What happened to them? Only yesterday, someone from Google was saying that the new artificial intelligence chatbot—ChatGPT—may well disrupt Google. These companies, as big as they are, do not have a right to exist. They have to keep innovating. If they get it wrong, then they get it wrong.

Damian Collins

Does my hon. Friend agree that this is why the Bill is structured in the way it is? We have a wide range of priority illegal offences that companies have to meet, so it is not down to Elon Musk to determine whether he has a policy on race hate. They have to meet the legal standards set, and that is why it is so important to have that wide range of priority illegal offences. If companies go beyond that and have higher safety standards in their terms of service, that is checked as well. However, a company cannot avoid its obligations simply by changing its terms of service.

Paul Scully

My hon. Friend is absolutely right. We are putting in those protections, but we want companies to have due regard to freedom of speech.

I want to clarify a point that my hon. Friend made earlier about guidance on the new accountability, transparency and free speech duties. Companies will be free to set any terms of service that they want to, subject to their other legal obligations. That is related to the conversations that we have just been having. Those duties are there to ensure that companies properly enforce their terms of service, and do not remove content or ban users except in accordance with those terms. There will be no platform risk assessments or codes of practice associated with those new duties. Instead, Ofcom will issue guidance on how companies can comply with their duties rather than codes of practice. That will focus on how companies set their terms of service, but companies will not be required to set terms directly for specific types of content or cover risks. I hope that is clear.

To answer the point made by the hon. Member for Pontypridd, I agree with the overall sentiment about how we need to protect freedom of expression.

Damian Collins

I want to be clear on my point. My question was not related to how platforms set their terms of service, which is a matter for them and they are held to account for that. If we are now bringing in requirements to say that companies cannot go beyond terms of service or their duties in the Bill if they are going to moderate content, who will oversee that? Will Ofcom have a role in checking whether platforms are over-moderating, as the Minister referred to earlier? In that case, where those duties exist elsewhere in the Bill, we have codes of practice in place to make sure it is clear what companies should and should not do. We do not seem to be doing that with this issue.

Paul Scully

Absolutely. We have captured that in other parts of the Bill, but I wanted to make that specific bit clear because I am not sure whether I understood or answered my hon. Friend’s question correctly at the time.

Question put and agreed to.

Clause 20, as amended, accordingly ordered to stand part of the Bill.

Clause 21

Record-keeping and review duties

Amendments made: 32, in clause 21, page 23, line 5, leave out “, 10 or 12” and insert “or 10”.

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 33, in clause 21, page 23, line 45, leave out paragraph (c).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 34, in clause 21, page 24, line 6, leave out “section” and insert “sections”.

This amendment is consequential on Amendment 35.

Amendment 35, in clause 21, page 24, line 6, at end insert—

“, (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (duties about terms of service).”—(Paul Scully.)

This amendment ensures that providers have a duty to review compliance with the duties set out in NC3 and NC4 regularly, and after making any significant change to the design or operation of the service.

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones

Given that there are few changes to this clause from when the Bill was amended in the previous Public Bill Committee, I will be brief. We in the Opposition are clear that record-keeping and review duties on in-scope services make up an important function of the regulatory regime and sit at the very heart of the Online Safety Bill. We must push platforms to transparently report all harms identified and the action taken in response, in line with regulation.

16:15
The requirements to keep records of the action taken in response to harm will be vital in supporting the regulator in making effective decisions about regulatory breaches and on whether company responses are sufficient. They will also be vital to understanding the success of the regime once it is in place. We see the clause as central to addressing concerns over the under-reporting of harms to evade regulation.

We already know that under-reporting exists. We only have to turn to the testimony of many whistleblowers—colleagues will be aware of those who have bravely shared their concerns over the lack of transparency in this space—to know that we are often not presented with the full picture on the scale of the harm.

Labour has not sought to amend the clause, but once again I must reiterate a point that we have pushed on numerous occasions—namely, the importance of requiring in-scope services to publish their risk assessments. The Government have refused on a number of occasions to acknowledge the significance of this level of transparency, which could bring great benefits, as it would allow researchers and civil society to track harms and hold services to account. Again, I press the Minister and urge him to ensure that the risk assessments are published.

Kirsty Blackman

Specifically on the issue that was just raised, there were two written ministerial statements on the Online Safety Bill. The first specifically said that an amendment would

“require the largest platforms to publish summaries of their risk assessments for illegal content and material that is harmful to children, to allow users and empower parents to clearly understand the risks presented by these services and the approach platforms are taking to children’s safety”.—[Official Report, 29 November 2022; Vol. 723, c. 31WS.]

Unless I have completely missed an amendment that has been tabled for this Committee, my impression is that that amendment will be tabled in the Lords and that details will be made available about how exactly the publishing will work and which platforms will be required to publish.

I would appreciate it if the Minister could provide more clarity about what that might look like, and about which platforms might have to publish their assessments. I appreciate that that will be scrutinised in the Lords but, to be fair, this is the second time that the Bill has been in Committee in the Commons. It would be helpful if we could be a bit more sighted on what exactly the Government intend to do—meaning more than the handful of lines in a written ministerial statement—because then we would know whether the proposal is adequate, or whether we would have to ask further questions in order to draw it out and ensure that it is published in a certain form. The more information the Minister can provide, the better.

Paul Scully

I think we all agree that written records are hugely important. They are important as evidence in cases where Ofcom is considering enforcement action, and a company’s compliance review should be done regularly, especially before they make changes to their service.

The Bill does not intend to place excessive burdens on small and low-risk businesses. As such, clause 21 provides Ofcom with the power to exempt certain types of service from the record-keeping and review duties. However, the details of any exemptions must be published.

To half-answer the point made by the hon. Member for Aberdeen North, the measures will be brought to the Lords, but I will endeavour to keep her up to date as best we can so that we can continue the conversation. We have served together on several Bill Committees, including on technical Bills that required us to spend several days in Committee—although they did not come back for re-committal—so I will endeavour to keep her and, indeed, the hon. Member for Pontypridd, up to date with developments.

Question put and agreed to. 

Clause 21, as amended, accordingly ordered to stand part of the Bill.

Clause 30

Duties about freedom of expression and privacy

Amendments made: 36, in clause 30, page 31, line 31, after “have” insert “particular”.

This amendment has the result that providers of regulated search services must have particular regard to freedom of expression when deciding on and implementing safety measures and policies.

Amendment 37, in clause 30, page 31, line 34, after “have” insert “particular”.—(Paul Scully.)

This amendment has the result that providers of regulated search services must have particular regard to users’ privacy when deciding on and implementing safety measures and policies.

Clause 30, as amended, ordered to stand part of the Bill.

Clause 46

Relationship between duties and codes of practice

Amendments made: 38, in clause 46, page 44, line 27, after “have” insert “particular”.

This amendment has the result that providers of services who take measures other than those recommended in codes of practice in order to comply with safety duties must have particular regard to freedom of expression and users’ privacy.

Amendment 39, in clause 46, page 45, line 12, leave out paragraph (c).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 40, in clause 46, page 45, line 31, at end insert “, or

(ii) a duty set out in section 14 (user empowerment);”.—(Paul Scully.)

This amendment has the effect that measures recommended in codes of practice to comply with the duty in clause 14 are relevant to the question of whether a provider is complying with the duties in clause 20(2) and (3) (having regard to freedom of expression and users’ privacy).

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones

I do not wish to repeat myself and test the Committee’s patience, so I will keep my comments brief. As it stands, service providers would be treated as complying with their duties if they had followed the recommended measures set out in the relevant codes of practice, as set out in the Bill. However, providers could take alternative measures to comply, but as I said in previous Committee sittings, Labour remains concerned that the definition of “alternative measures” is far too broad. I would be grateful if the Minister elaborated on his assessment of the instances in which a service provider may seek to comply via alternative measures.

The codes of practice should be, for want of a better phrase, best practice. Labour is concerned that, to avoid the duties, providers may choose to take the “alternative measures” route as an easy way out. We agree that it is important to ensure that providers have a duty with regard to protecting users’ freedom of expression and personal privacy. As we have repeatedly said, the entire Online Safety Bill regime relies on that careful balance being at the forefront. We want to see safety at the forefront, but recognise the importance of freedom of expression and personal privacy, and it is right that those duties are central to the clause. For those reasons, Labour has not sought to amend this part of the Bill, but I want to press the Minister on exactly how he sees this route being used.

Paul Scully

It is important that service providers have flexibility, so that the Bill does not disincentivise innovation or force service providers to use measures that might not work for all business models or technological contexts. The tech sector is diverse and dynamic, and it is appropriate that companies can take innovative approaches to fulfilling their duties. In most circumstances, we expect companies to take the measures outlined in Ofcom’s code of practice as the easiest route to compliance. However, where a service provider takes alternative measures, Ofcom must consider whether those measures safeguard users’ privacy and freedom of expression appropriately. Ofcom must also consider whether they extend across all relevant areas of a service mentioned in the illegal content and children’s online safety duties, such as content moderation, staff policies and practices, design of functionalities, algorithms and other features. Ultimately, it will be for Ofcom to determine a company’s compliance with the duties, which are there to ensure users’ safety.

Question put and agreed to.

Clause 46, as amended, accordingly ordered to stand part of the Bill.

Clause 55 disagreed to.

Clause 56

Regulations under sections 54 and 55

Amendments made: 42, in clause 56, page 54, line 40, leave out subsection (3).

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 43, in clause 56, page 54, line 46, leave out “or 55”.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 44, in clause 56, page 55, line 8, leave out “or 55”.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 45, in clause 56, page 55, line 9, leave out

“or adults are to children or adults”

and insert “are to children”.—(Paul Scully.)

This amendment is consequential on Amendment 41 (removal of clause 55).

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones

As we know, the clause makes provision in relation to the making of regulations designating primary and priority content that is harmful to children, and priority content that is harmful to adults. The Secretary of State may specify a description of content in regulations only if they consider that there is a material risk of significant harm to an appreciable number of children or adults in the United Kingdom presented by user-generated or search content of that description, and must consult Ofcom before making such regulations.

In the last Bill Committee, Labour raised concerns that there were no duties that required the Secretary of State to consult others, including expert stakeholders, ahead of making these regulations. That decision cannot be for one person alone. When it comes to managing harmful content, unlike illegal content, we can all agree that it is about implementing systems that prevent people from encountering it, rather than removing it entirely.

Sarah Owen

The fact that we are here again to discuss what one Secretary of State wanted to put into law, and which another is now seeking to remove before the law has even been introduced, suggests that my hon. Friend’s point about protection and making sure that there are adequate measures within which the Secretary of State must operate is absolutely valid.

Alex Davies-Jones

I completely agree: we are now on our third Secretary of State, our third Minister and our third Prime Minister since we began considering this iteration of the Bill. It is vital that this does not come down to one person’s ideological beliefs. We have spoken at length about this issue; the hon. Member for Don Valley has spoken about his concerns that Parliament should be sovereign, and should make these decisions. It should not be for one individual or one stakeholder to make these determinations.

We also have issues with the Government’s chosen toggle approach—we see that as problematic. We have debated it at length, but our concerns regarding clause 56 are about the lack of consultation that the Secretary of State of the day, whoever that may be and whatever political party they belong to, will be forced to make before making widespread changes to a regime. I am afraid that those concerns still exist, and are not just held by us, but by stakeholders and by Members of all political persuasions across the House. However, since our proposed amendment was voted down in the previous Bill Committee, nothing has changed. I will spare colleagues from once again hearing my pleas about the importance of consultation when it comes to determining all things related to online safety, but while Labour Members do not formally oppose the clause, we hope that the Minister will address our widespread concerns about the powers of the Secretary of State in his remarks.

Paul Scully

I appreciate the hon. Lady’s remarks. We have tried to ensure that the Bill is proportionate, inasmuch as the Secretary of State can designate content if there is material risk of significant harm to an appreciable number of children in the United Kingdom. The Bill also requires the Secretary of State to consult Ofcom before making regulations on the priority categories of harm.

Charlotte Nichols

I appreciate that this point has been made about the same wording earlier today, but I really feel that the ambiguity of “appreciable number” is something that could do with being ironed out. The ambiguity and vagueness of that wording make it very difficult to enforce the provision. Does the Minister agree that “appreciable number” is too vague to be of real use in legislation such as this?

Paul Scully

The different platforms, approaches and conditions will necessitate different numbers; it would be hard to pin a number down. The wording is vague and wide-ranging because it is trying to capture any number of scenarios, many as yet unknown. However, the regulations designating priority harms will be made under the draft affirmative resolution procedure.

Sarah Owen

On that point, which we discussed earlier—my hon. Friend the Member for Warrington North discussed it—I am struggling to understand what is an acceptable level of harm, and what is the acceptable number of people to be harmed, before a platform has to act.

Paul Scully

It totally depends on the scenario. It is very difficult for me to stand here now and give a wide range of examples, but the Secretary of State will be reacting to given situations, rather than trying to predict them.

Alex Davies-Jones

The Minister has just outlined exactly what our concerns are. He is unable to give an exact number, figure or issue, but that is what the Secretary of State will have to do, without having to consult any stakeholders regarding that issue. There are many eyes on us around the world, with other legislatures looking at us and following suit, so we want the Bill to be world-leading. Many Governments across the world may deem that homosexuality, for example, is of harm to children. Because this piece of legislation creates precedent, a Secretary of State in such a Government could determine that any platform in that country should take down all that content. Does the Minister not see our concerns in that scenario?

Paul Scully

I was about to come on to the fact that the Secretary of State would be required to consult Ofcom before making regulations on the priority categories of harm. Indeed Ofcom, just like the Secretary of State, speaks to and engages with a number of stakeholders on this issue to gain a deeper understanding. Regulations designating priority harms would be made under the draft affirmative resolution procedure, but there is also provision for the Secretary of State to use the made affirmative resolution procedure in urgent scenarios, and this would be an urgent scenario. It is about getting the balance right.

16:30
Following amendments 42 to 45, the definition of priority harmful content to adults and the power for the Secretary of State to designate categories of priority harmful content to adults have been removed. These amendments update clause 56 to reflect the removal of the adult safety duties and the concept of legal but harmful content from the Bill.
Question put and agreed to.
Clause 56, as amended, accordingly ordered to stand part of the Bill.
Clause 65
Transparency reports about certain Part 3 services
Question proposed, That the clause stand part of the Bill.
Alex Davies-Jones

As we know, this clause requires providers of relevant services to publish annual transparency reports and sets out Ofcom’s powers in relation to those reports. The information set out in transparency reports is intended to help users to understand the steps that providers are taking to help keep them safe and to provide Ofcom with the information required to hold them to account.

These duties on regulated services are very welcome indeed. Labour has long held the view that mandatory transparency reporting and reporting mechanisms are vital to hold platforms to account, and to understand the true nature of how online harm is driven and perpetuated on the internet.

I will reiterate the points that were made in previous Committee sittings about our concerns about the regularity of these transparency reports. I note that, sadly, those reports remain unchanged and therefore they will only have to be submitted to Ofcom annually. It is important that the Minister truly considers the rapid rate at which the online world can change and develop, so I urge him to reconsider this point and to make these reports a biannual occurrence. Labour firmly believes that increasing the frequency of the transparency reports will ensure that platforms and services remain on the pulse, and are forced to be aware of and act on emergent risks. In turn, that would compel Ofcom to do the same in its role as an industry regulator.

I must also put on the record some of our concerns about subsections (12) and (13), which state that the Secretary of State of the day could amend by regulation the frequency of the transparency reporting, having consulted Ofcom first. I hope that the Minister can reassure us that this approach will not result in our ending up in a position where, perhaps because of Ofcom’s incredible workload, transparency reporting becomes even less frequent than an annual occurrence. We need to see more transparency, not less, so I really hope that he can reassure me on this particular point.

Kim Leadbeater

Does my hon. Friend agree that transparency should be at the heart of this Bill and that the Government have missed an opportunity to accelerate the inclusion of a provision in the Bill, namely the requirement to give researchers and academics access to platform data? Data access must be prioritised in the Bill and without such prioritisation the UK will fall behind the rest of Europe in safety, research and innovation. The accessibility and transparency of that data from a research perspective are really important.

Alex Davies-Jones

I completely agree with my hon. Friend. We both made the point at length in previous sittings of the Committee about the need to ensure transparency, access to the data, and access to reporting for academics, civil society and researchers.

That also goes to the point that it is not for this Committee or this Minister—it is not in his gift—to determine something that we have all discussed in this place at length, which is the potential requirement for a standalone Committee specifically to consider online harm. Such a Committee would look at whether this legislation is actively doing what we need it to do, whether it needs to be reviewed, whether it could look at the annual reports from Ofcom to determine the length and breadth of harm on the internet, and whether or not this legislation is actually having an impact. That all goes to the heart of transparency, openness and the review that we have been talking about.

I want to go further and raise concerns about how public the reports will be, as we have touched on. The Government claim that their so-called triple shield approach will give users of platforms and services more power and knowledge to understand the harms that they may discover online. That is in direct contradiction to the Bill’s current approach, which does not provide any clarity about exactly how the transparency reports will be made available to the public. In short, we feel that the Government are missing a significant opportunity. We have heard many warnings about what can happen when platforms are able to hide behind a veil of secrecy. I need only point to the revelations of whistleblowers, including Frances Haugen, to highlight the importance of that point.

As the Bill stands, once Ofcom has issued a notice, companies will have to produce a transparency report that

“must…be published in the manner and by the date specified in the notice”.

I want to press the Minister on that and ask him to clarify the wording. We are keen for the reports to be published publicly and in an accessible way, so that users, civil society, researchers and anyone else who wants to see them can make sense of them. The information contained in the transparency reports is critical to analysing trends and harms, so I hope that the Minister will clarify those points in his response.

Kim Leadbeater

Does my hon. Friend agree that if the Government are to achieve their objective—which we all share—for the Bill to be world-leading legislation, we cannot rely on whistleblowers to tell us what is really going on in the online space? That is why transparency is vital. This is the perfect opportunity to provide that transparency, so that we can do some proper research into what is going on out there. We cannot rely on whistleblowers to give us such information.

Alex Davies-Jones

My hon. Friend is absolutely right. We want the Bill to work. We have always wanted the Bill to work. We want it to achieve its aim of keeping children, adults and everyone who uses the internet safe from the harms that are perpetuated there. If there is no transparency, how will we know whether the platforms are covertly breaking the rules, hiding content and getting round them? That is what they do; we know it, because we have heard it from whistleblowers, but we cannot rely on whistleblowers alone to highlight exactly what happens behind the closed doors of the platforms.

We need the transparency and the reports to be made public, so that we can see whether the legislation is working. If that does not happen, although we have waited five years, we will need another piece of legislation to fix it. We know that the Bill is not perfect, and the Minister knows that—he has said so himself—but, ultimately, we need to know that it works. If it does not, we have a responsibility as legislators to put something in place that does. Transparency is the only way in which we will figure that out.

Sarah Owen

I want to add to the brilliant points made by my hon. Friend the shadow Minister, in particular on the continually changing nature of market forces, which the Minister himself referenced. We want innovation. We want the tech companies to innovate—preferably ones in the UK—but we do not want to be playing catch-up as we are now, making legislation retrospectively to right wrongs that have taken place because our legislative process has been too slow to deal with the technological changes and the changes in social media, in apps, and with how we access data and communicate with one another online. The bare minimum is a biannual report.

Within six months, if a new piece of technology comes up, it does not simply stay with one app or platform; that technology will be leapfrogged by others. Such technological advances can take place at a very rapid pace. The transparency aspect is important, because people should have a right to know what they are using and whether it is safe. We as policy makers should have a right to know clearly whether the legislation that we have introduced, or the legislation that we want to amend or update, is effective.

If we look at any other approach that we take to protect the health and safety of the people in our country—the people we all represent in our constituencies —we always say that prevention is better than cure. At the moment, without transparency and without researchers being able to update the information we need to see, we will constantly be playing catch-up with digital tech.

Kirsty Blackman

This may be the only place in the Bill where I do not necessarily agree wholeheartedly with the Labour Front Benchers. I agree with the vast majority of what was said, but I have some concerns about making mandatory the requirement for transparency reports to be public in all circumstances, because there are circumstances in which that would simply highlight loopholes, allowing people to exploit them in a way that we do not want them to do.

Specifically on the regularity of reporting and some level of transparency, given that the Minister is keen on the commercial imperative and ensuring that people are safe, we need a higher level of transparency than we currently see among the platforms. There is a very good case to be made for some of the transparency reporting to be made public, particularly for the very largest platforms to be required to make it public, or to make sections of it public.

I want to talk about the speed of change to the terms of service and about proportionality. If Ofcom could request transparency reporting only annually, imagine that it received transparency information three days before Elon Musk took over Twitter. Twitter would be a completely different place three days later, and Ofcom would be unable to ask for more transparency information for a whole year, by which point a significant amount of damage could have been done. We have seen that the terms of service can change quickly. Ofcom would not have the flexibility to ask for an updated transparency report, even if drastic changes were made to the services.

Another thing that slightly concerns me about doing this annually is the lack of flexibility. Let us say that a small platform that none of us has ever heard of, such as Mastodon, shoots to prominence overnight. Let us also say that, as a small platform, Mastodon was previously regulated, and Ofcom had made a request for transparency information shortly before Elon Musk took over Twitter and people migrated to Mastodon. Mastodon would now be facing very different issues from those it had when it had a small number of users, given the significant number that it has now. It would have changed dramatically, yet Ofcom would not have the flexibility to seek that information. We know that platforms in the online world can have sudden, stellar increases in popularity overnight. Some have been bubbling along for ages with nobody using them; not all of them are brand-new platforms that suddenly shoot to prominence. The lack of flexibility is a problem.

Lastly, I agree about researchers being able to access the transparency information provided. It is really important that we recognise that Ofcom is not the only expert. Ofcom has a huge amount of expertise, and it is massively increasing its staff numbers to cope with these issues, but the reality is that those staff are not academic researchers. They are unable to look at the issues and are not necessarily the most prominent experts in the field of child protection, for example. That is not to take away from the expertise in Ofcom, but we could allow it to ask a regulated group of researchers to look at the information and point out any issues that may not have been spotted, particularly given the volume of transparency reports that there are likely to be.

Kim Leadbeater

The hon. Lady makes an important point. In terms of transparency, the question for me is, what are the Government worried about? Surely part of the Bill is about finding out what is really going on, and the only way that we will do that is by having access to the information. The more transparency, the better. The hon. Lady is right that having experts who can research what is going on is fundamental. If there is a concern around the workload for Ofcom, that is a separate issue that the Minister needs to address, but surely the more work that is done in terms of research and transparency, the better.

Kirsty Blackman

We have seen that just from the people from external organisations who have contacted us about the Bill. The expertise that they have brought to the table—expertise that we do not have ourselves—has significantly improved the debate and, hopefully, the Bill. Even prior to the formal consultations, that input encouraged the Minister to make the Bill better. Surely that is why the pre-legislative scrutiny Committee looked at the Bill—in order to improve it and to get expert advice. I still think that specific access to expertise to analyse the transparency reports has not been covered adequately.

Damian Collins

Annual transparency reporting is an important part of how the system will work. Transparency is one of the most important aspects of how the Online Safety Bill works, because without it companies can hide behind the transparency reports they produce at the moment, which give no transparency at all. For example, Facebook and YouTube report annually that their AI finds 95% of the hate speech they remove, but Frances Haugen said that they removed only 5% of the hate speech. So the transparency report means that they remove 95% of 5%, and that is one of the fundamental problems. The Bill gives the regulator the power to know, and the regulator then has to make informed decisions based on the information it has access to.

16:45
Ofcom is also acting with statutory powers, which is different from how other researchers or organisations that might be appointed would work. The nature of the relationship between Ofcom and the regulated platforms is very different from that with a company that is open to independent scrutiny from independent researchers. Of course, the Bill does not limit Ofcom to just doing annual transparency reports. Ofcom can appoint what I think the Bill calls a “skilled person”, although I think an Ofcom special agent is a better description. At any time, Ofcom can appoint a skilled person—an expert—to go into the company and analyse particular problems. If it was a case of a change of ownership and new risks on the platform that were not previously foreseen, or a big concern about the platform’s performance, Ofcom can appoint that person.
Of course, Ofcom would be free to appoint outside experts, not just people from within the organisation. It could bring in a specialist with particular knowledge of an area where it had concerns. It could do that at any time and appoint as many people as it liked.
Charlotte Nichols

As much as I am keen on the idea of Ofcom special agents conceptually, my concern on the transparency front is that, to appoint a special agent and send them in to look at the data, Ofcom would have to have cause to believe that there was an issue of concern with the data, whereas if that data is more transparently available to the research community, they can then proactively identify things that they can flag to Ofcom as a concern. Without that, we are relying on an annual cycle of Ofcom being able to intervene only when they have a concern, rather than the research community, which is much better placed to make that determination, being able to keep a watching brief on the company.

Damian Collins

That concern would be triggered by Ofcom discovering things as a consequence of user complaints. Although Ofcom is not a complaint resolution body, users can complain to it. Independent academics and researchers may produce studies and reports highlighting problems at any time, so Ofcom does not have to wait through an annual cycle of transparency reporting. At any time, Ofcom can say, “We want to have a deeper look at this problem.” It could be something Ofcom or someone else has discovered, and Ofcom can either research that itself or appoint an outside expert.

As the hon. Member for Warrington North mentioned, very sensitive information might become apparent through the transparency reporting that one might not necessarily wish to make public because it requires further investigation and could highlight a particular flaw that could be exploited by bad actors. I would hope and expect, as I think we all would, that we would have the routine publication of transparency reporting to give people assurance that the platforms are meeting their obligations. Indeed, if Ofcom were to intervene against a platform, it would probably use information gathered and received to provide the rationale for why a fine has been issued or another intervention has been made. I am sure that Ofcom will draw all the time on information gathered through transparency reporting and, where relevant, share it.

Paul Scully

This has been a helpful debate. Everyone was right that transparency must be and is at the heart of the Bill. From when we were talking earlier today about how risk assessments and terms of service must be accessible to all, through to this transparency reporting section, it is important that we hold companies to account and that the reports play a key role in allowing users, Ofcom and civil society, including those in academia, to understand the steps that companies are taking to protect users.

Under clause 65, category 1 services, category 2A search services and category 2B user-to-user services need to publish transparency reports annually in accordance with the transparency report notice from Ofcom. That relates to the points about commerciality that my hon. Friend the Member for Folkestone and Hythe talked about. Ofcom will set out what information is required from companies in their notice, which will also specify the format, manner and deadline for the information to be provided to Ofcom. Clearly, it would not be proportionate to require every service provider within the scope of the overall regulatory framework to produce a transparency report—it is also important that we deal with capacity and proportionality—but those category threshold conditions will ensure that the framework is flexible and future-proofed.

Charlotte Nichols

I note what the Minister said about the commercial implications of some of these things, and some of those commercial implications might act as levers to push companies to do better on some things. By that same token, should this information not be more transparent and publicly available to give the user the choice he referred to earlier? That would mean that if a user’s data was not being properly protected and these companies were not taking the measures around safety that the public would expect, users can vote with their feet and go to a different platform. Surely that underpins a lot of what we have been talking about.

Paul Scully

Yes, and that is why Ofcom will be the one that decides which information should be published, and from whom, to ensure that it is proportionate. At the end of the day, I have talked about the fact that transparency is at the heart of the Bill and that the transparency reports are important. To go to the original point raised by the hon. Member for Pontypridd about when these reports will be published, they will indeed be published in accordance with subsection (3)(d) of the clause.

Question put and agreed to.

Clause 65 accordingly ordered to stand part of the Bill.

Schedule 8

Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services

Amendments made: 61, in schedule 8, page 203, line 13, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 62, in schedule 8, page 203, line 15, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 63, in schedule 8, page 203, line 17, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 64, in schedule 8, page 203, line 21, leave out from “or” to end of line 23 and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about user reporting of content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 65, in schedule 8, page 203, line 25, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 66, in schedule 8, page 203, line 29, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 67, in schedule 8, page 203, line 41, at end insert—

“11A Measures taken or in use by a provider to comply with any duty set out in section (Duty not to act against users except in accordance with terms of service) or (Further duties about terms of service) (terms of service).”

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about measures taken to comply with the new duties imposed by NC3 and NC4.

Amendment 68, in schedule 8, page 204, line 2, leave out from “illegal content” to end of line 3 and insert

“or content that is harmful to children—”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 69, in schedule 8, page 204, line 10, leave out from “illegal content” to “, and” in line 12 and insert

“and content that is harmful to children”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 70, in schedule 8, page 204, line 14, leave out from “illegal content” to “present” in line 15 and insert

“and content that is harmful to children”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 71, in schedule 8, page 205, line 38, after “Part 3” insert

“or Chapters 1 to 2A of Part 4”.—(Paul Scully.)

This amendment requires OFCOM, in considering which information to require from a provider in a transparency report, to consider whether the provider is subject to the duties imposed by Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6 (and Chapter 1 of Part 4).

Paul Scully

I beg to move amendment 72, in schedule 8, page 206, line 5, at end insert—

“35A (1) For the purposes of this Schedule, content of a particular kind is ‘relevant content’ if—

(a) a term of service, other than a term of service within sub-paragraph (2), states that a provider may or will take down content of that kind from the service or restrict users’ access to content of that kind, and

(b) it is regulated user-generated content.

(2) The terms of service within this sub-paragraph are as follows—

(a) terms of service which make provision of the kind mentioned in section 9(5) (protecting individuals from illegal content) or 11(5) (protecting children from content that is harmful to children);

(b) terms of service which deal with the treatment of consumer content.

(3) References in this Schedule to relevant content are to content that is relevant content in relation to the service in question.”

This amendment defines “relevant content” for the purposes of Schedule 8.

The Chair

With this it will be convenient to discuss Government amendments 73 and 75.

Paul Scully

The amendments to schedule 8 confirm that references to relevant content, consumer content and regulated user-generated content have the same meaning as established by other provisions of the Bill. Again, that ensures consistency, which will, in turn, support Ofcom in requiring providers of category 1 services to give details in their annual transparency reports of their compliance with the new transparency, accountability and freedom of expression duties.

Alex Davies-Jones

I will keep my comments on this grouping brief, because I have already raised our concerns and our overarching priorities on transparency reports in the previous debate, which was a good one, with all Members highlighting the need for transparency and reporting in the Bill. With the Chair’s permission, I will make some brief comments on Government amendment 72 before addressing Government amendments 73 and 75.

It will come as no surprise to the Minister that amendment 72, which defines relevant content for the purposes of schedule 8, has a key omission—specifying priority content harmful to adults. For reasons we have covered at length, we think that it is a gross mistake on the Government’s side to attempt to water down the Bill in this way. If the Minister is serious about keeping adults safe online, he must reconsider this approach. However, we are happy to see amendments 73 and 75, which define consumer content and regulated user-generated content. It is important for all of us—whether we are politicians, researchers, academics, civil society, stakeholders, platforms, users or anyone else—that these definitions are in the Bill so that, when it is passed, it can be applied properly and at pace. That is why we have not sought to amend this grouping.

I must press the Minister to respond on the issues around relevant content as outlined in amendment 72. We feel strongly that more needs to be done to address this type of content and its harm to adults, so I would be grateful to hear the Minister’s assessment of how exactly these transparency reports will report back on this type of harm, given its absence in this group of amendments and the lack of a definition.

Kirsty Blackman

I am pleased to see the list included and the number of things that Ofcom can ask for more information on. I have a specific question about amendment 75. Amendment 75 talks about regulated user-generated content and says it has the same meaning as it does in the interpretation of part 3 under clause 50. The Minister may or may not know that there are concerns about clause 50(5), which relates to

“One-to-one live aural communications”.

One-to-one live aural communications are exempted. I understand that that is because the Government do not believe that telephony services, for example, should be part of the Online Safety Bill—that is a pretty reasonable position for them to take. However, allowing one-to-one live aural communications not to be regulated means that if someone is using voice chat in Fortnite, for example, and there are only two people on the team that they are on, or if someone is using voice chat in Discord and there are only two people online on the channel at that time, that is completely unregulated and not taken into account by the Bill.

I know that that is not the intention of the Bill, which is intended to cover user-generated content online. The exemption is purely in place for telephony services, but it is far wider than the Government intend it to be. With the advent of more and more people using virtual reality technology, for example, we will have more and more aural communication between just two people, and that needs to be regulated by the Bill. We cannot just allow a free-for-all.

If we have child protection duties, for example, they need to apply to all user-generated content and not exempt it specifically because it is a live, one-to-one aural communication. Children are still at significant risk from this type of communication. The Government have put this exemption in because they consider such communication to be analogous to telephony services, but it is not. It is analogous to telephony services if we are talking about a voice call on Skype, WhatsApp or Signal—those are voice calls, just like telephone services—but we are talking about a voice chat that people can have with people who they do not know, whose phone number they do not know and who they have no sort of relationship with.

Some of the Discord servers are pretty horrendous, and some of the channels are created by social media influencers or people who have pretty extreme views in some cases. We could end up with a case where the Discord server and its chat functions are regulated, but if aural communication or a voice chat is happening on that server, and there are only two people online because it is 3 o’clock in the morning where most of the people live and lots of them are asleep, that would be exempted. That is not the intention of the Bill, but the Government have not yet fixed this. So I will make one more plea to the Government: will they please fix this unintended loophole, so that it does not exist? It is difficult to do, but it needs to be done, and I would appreciate it if the Minister could take that into consideration.

Paul Scully

I do not believe that the provisions in terms of Ofcom’s transparency powers have been watered down. It is really important that the Bill’s protection for adults strikes the right balance with its protections for free speech, which is why we have replaced the “legal but harmful” clause. I know we will not agree on that, but there are more new duties that will make platforms more accountable. Ofcom’s transparency powers will enable it to assess compliance with the new safety duties and hold platforms accountable for enforcing their terms of service to keep users safe. Companies will also have to report on the measures that they have in place to tackle illegal content or activity and content that is harmful for children, which includes proactive steps to address offences such as child sexual exploitation and abuse.

The legislation will set out high-level categories of information that companies may be required to include in their transparency reports, and Ofcom will then specify the information that service providers will need to include in those reports, in the form of a notice. Ofcom will consider companies’ resources and capacity, service type and audience in determining what information they will need to include. It is likely that the information that is most useful to the regulator and to users will vary between different services. To ensure that the transparency framework is proportionate and reflects the diversity of services in scope, the transparency reporting requirements set out in the Ofcom notice are likely to differ between those services, and the Secretary of State will have powers to update the list of information that Ofcom may require to reflect any changes of approach.

17:03
Let me address the interesting point about aural exemptions made by the hon. Member for Aberdeen North. As she says, the exemption is there to ensure that we do not capture traditional phone calls. Phones have moved from POTS to PANs—from the plain old telephone system to public access networks—and beyond over the last 20 years. Although one-to-one live aural communications are exempt, other types of interactions between adults and children, including in-game private messaging chat functions and video calls, are in scope. If there are unintended consequences—the hon. Lady will know that I was described as the Minister for unintended consequences when I was at the Department for Business, Energy and Industrial Strategy—I would be happy to continue chatting with her and others to ensure that we get that difficult position right.
Kirsty Blackman

The in-game chat that children use is overwhelmingly voice chat. Children do not type if they can possibly avoid it. I am sure that that is not the case for all children, but it is for most children. Aural communication is used if someone is playing Fortnite duos, for example, with somebody they do not know. That is why that needs to be included.

Paul Scully

I very much get that point. It is not something that I do, but I have certainly seen it myself. I am happy to chat to the hon. Lady to ensure that we get it right.

Amendment 72 agreed to.

Amendments made: 73, in schedule 8, page 206, line 6, at end insert—

“‘consumer content’ has the same meaning as in Chapter 2A of Part 4 (see section (Interpretation of this Chapter)(3));”.

This amendment defines “consumer content” for the purposes of Schedule 8.

Amendment 74, in schedule 8, page 206, leave out lines 7 and 8.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 75, in schedule 8, page 206, line 12, at end insert—

“‘regulated user-generated content’ has the same meaning as in Part 3 (see section 50), and references to such content are to content that is regulated user-generated content in relation to the service in question;”.—(Paul Scully.)

This amendment defines “regulated user-generated content” for the purposes of Schedule 8.

Schedule 8, as amended, agreed to.

Ordered, That further consideration be now adjourned. —(Mike Wood.)

17:02
Adjourned till Thursday 15 December at half-past Eleven o’clock.
Written evidence reported to the House
OSB101 Mencap
OSB102 News Media Association (NMA)
OSB103 Dr Edina Harbinja, Reader in law, Aston Law School, Aston Business School, and Deputy Editor of the Computer Law and Security Review
OSB104 Carnegie UK
OSB105 Full Fact
OSB106 Antisemitism Policy Trust
OSB107 Big Brother Watch
OSB108 Microsoft
OSB109 Internet Society
OSB110 Parent Zone
OSB111 Robin Wilton
OSB112 Wikimedia Foundation

ONLINE SAFETY BILL (Third sitting)

Committee stage (re-committed clauses and schedules)
Thursday 15th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 15 December 2022
The Committee consisted of the following Members:
Chairs: †Dame Angela Eagle, Sir Roger Gale
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
Bhatti, Saqib (Meriden) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Bonnar, Steven (Coatbridge, Chryston and Bellshill) (SNP)
† Bristow, Paul (Peterborough) (Con)
† Collins, Damian (Folkestone and Hythe) (Con)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Fletcher, Nick (Don Valley) (Con)
Leadbeater, Kim (Batley and Spen) (Lab)
† Maclean, Rachel (Redditch) (Con)
† Nichols, Charlotte (Warrington North) (Lab)
Owen, Sarah (Luton North) (Lab)
Peacock, Stephanie (Barnsley East) (Lab)
Russell, Dean (Watford) (Con)
† Scully, Paul (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Wood, Mike (Dudley South) (Con)
Kevin Maddison, Bethan Harding, Committee Clerks
† attended the Committee
Public Bill Committee
Thursday 15 December 2022
(Morning)
[Dame Angela Eagle in the Chair]
Online Safety Bill
(Re-committed Clauses and Schedules: Clauses 11 to 14, 18 to 21, 30, 46, 55 and 65, Schedule 8, Clauses 79 and 82, Schedule 11, Clauses 87, 90, 115, 169 and 183, Schedule 17, Clauses 203, 206 and 207, new Clauses and new Schedules)
Clause 79
General duties of OFCOM under section 3 of the Communications Act
11:30
Amendments made: 46, in clause 79, page 69, line 35, after “Chapter 1” insert “or 2A”.
Clause 79 is about OFCOM’s general duties. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6.
Amendment 47, in clause 79, page 70, line 9, after “Chapter 1” insert “or 2A”.
Clause 79 is about OFCOM’s general duties. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6.(Paul Scully.)
Clause 79, as amended, ordered to stand part of the Bill.
Clause 82
Meaning of threshold conditions etc
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)

I beg to move amendment 48, in clause 82, page 72, line 21, at end insert—

“(ca) a regulated user-to-user service meets the conditions in section (List of emerging Category 1 services)(2) if those conditions are met in relation to the user-to-user part of the service;”.

This is a technical amendment ensuring that references to user-to-user services in the new clause inserted by NC7 relate to the user-to-user part of the service.

The Chair

With this it will be convenient to discuss the following:

Government amendment 49.

Government new clause 7—List of emerging Category 1 services.

Paul Scully

These Government amendments confer a duty on Ofcom to create and publish a list of companies that are approaching the category 1 threshold to ensure that it proactively identifies emerging high-reach, high-influence companies and is ready to add them to the category 1 register without delay. That is being done in recognition of the rapid pace of change in the tech industry, in which companies can grow quickly. The changes mean that Ofcom can designate companies as category 1 at pace. That responds to concerns that platforms could be unexpectedly popular and quickly grow in size, and that there could be delays in capturing them as category 1 platforms. Amendments 48 and 49 are consequential on new clause 7, which confers a duty on Ofcom to create and publish a list of companies that are approaching the category 1 threshold. For those reasons, I recommend that the amendments be accepted.

Alex Davies-Jones (Pontypridd) (Lab)

It will come as no surprise to Members to hear that we have serious concerns about the system of categorisation and the threshold conditions for platforms and service providers, given our long-standing view that the approach taken is far too inflexible.

In previous sittings, we raised the concern that the Government have not provided enough clarity about what will happen if a service is required to shift from one category to another, and how long that will take. We remain unclear about that, about how shifting categories will work in practice, and about how long Ofcom will have to preside over such changes and decisions.

I have been following this Bill closely for just over a year, and I recognise that the online space is constantly changing and evolving. New technologies are popping up that will make this categorisation process even more difficult. The Government must know that their approach does not capture smaller, high-harm platforms, which we know—we have debated this several times—can be at the root of some of the most dangerous and harmful content out there. Will the Minister clarify whether the Government amendments will allow Ofcom to consider adding such small, high-harm platforms to category 1, given the risk of harm?

More broadly, we are pleased that the Government tabled new clause 7, which will require Ofcom to prepare and update a list of regulated user-to-user services that have 75% of the number of users of a category 1 service, and at least one functionality of a category 1 service, or one required combination of a functionality and another characteristic or factor of a category 1 service. It is absolutely vital that Ofcom, as the regulator, is sufficiently prepared, and that there is monitoring of regulated user-to-user services so that this regime is as flexible as possible and able to cope with the rapid changes in the online space. That is why the Opposition support new clause 7 and have not sought to amend it. Moreover, we also support Government amendments 48 and 49, which are technical amendments to ensure that new clause 7 references user-to-user services and assessments of those services appropriately. I want to press the Minister on how he thinks these categories will work, and on Ofcom’s role in that.

Kirsty Blackman (Aberdeen North) (SNP)

I agree with everything that the hon. Lady said. New clause 7 is important. It was missing from the earlier iterations of the Bill, and it makes sense to have it here, but it raises further concerns about the number of people who are required to use a service before it is classed as category 1. We will come later to our amendment 104 to schedule 11, which is about adding high-risk platforms to the categorisation.

I am still concerned that the numbers are a pretty blunt instrument for categorising something as category 1. The number may end up being particularly high. I think it would be very easy for the number to be wrong—for it to be too high or too low, and probably too high rather than too low.

If Twitter were to disappear, which, given the changing nature of the online world, is not outside the realms of possibility, we could see a significant number of other platforms picking up the slack. A lot of them might have fewer users, but the same level of risk as platforms such as Twitter and Facebook. I am still concerned that choosing a number is a very difficult thing to get right, and I am not totally convinced that the Government’s way of going about this is right.

Paul Scully

Ofcom will assess services that are close to meeting the threshold conditions of category 1 services and will publish a publicly available list of those emerging high-risk services. A service would have to meet two conditions to be added to the emerging services list: it would need at least 75% of the number of users specified in any category 1 threshold condition, and at least one functionality of a category 1 threshold condition, or one specified combination of a functionality and a characteristic or factor of a category 1 threshold condition.

Ofcom will monitor the emergence of new services. If it becomes apparent that a service has grown sufficiently to meet the threshold of becoming a category 1 service, Ofcom will be required to add that service to the register. The new clause and the consequential amendments take into account the possibility of quick growth.

Following the removal of “legal but harmful” duties, category 1 services will be subject to new transparency, accountability and free speech duties, as well as duties relating to protection for journalists and democratic content. Requiring all companies to comply with that full range of category 1 duties would pose a disproportionate regulatory burden on smaller companies that do not exert the same influence on public discourse, and that would possibly divert those companies’ resources away from tackling vital tasks.

Damian Collins (Folkestone and Hythe) (Con)

Will my hon. Friend confirm that the risk assessments for illegal content—the priority illegal offences; the worst kind of content—apply to all services, whether or not they are category 1?

Paul Scully

My hon. Friend is absolutely right. All companies will still have to tackle the risk assessment, and will have to remove illegal content. We are talking about the extra bits that could take a disproportionate amount of resource from core functions that we all want to see around child protection.

Alex Davies-Jones

I would push the Minister further. He mentioned that there will not be an onus on companies to tackle the “legal but harmful” duty now that it has been stripped from the Bill, but we know that disinformation, particularly around elections in this country, is widespread on these high-harm platforms, and they will not be in scope of category 2. We have debated that at length. We have debated the time it could take Ofcom to act and put those platforms into category 1. Given the potential risk of harm to our democracy as a result, will the Minister press Ofcom to act swiftly in that regard? We cannot put that in the Bill now, but time is of the essence.

Paul Scully

Absolutely. The Department has techniques for dealing with misinformation and disinformation as well, but we will absolutely push Ofcom to work as quickly as possible. As my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the former Secretary of State, has said, once an election is done, it is done and it cannot be undone.

Damian Collins

Could the Minister also confirm that the provisions of the National Security Bill read across to the Online Safety Bill? Where disinformation is disseminated by networks operated by hostile foreign states, particularly Russia, as has often been the case, that is still in scope. That will still require a risk assessment for all platforms, whether or not they are category 1.

Paul Scully

Indeed. We need to take a wide-ranging, holistic view of disinformation and misinformation, especially around election times. There is a suite of measures available to us, but it is still worth pushing Ofcom to make sure that it works as quickly as possible.

Amendment 48 agreed to.

Amendment made: 49, in clause 82, page 72, line 23, after “conditions” insert

“or the conditions in section (List of emerging Category 1 services)(2)”.—(Paul Scully.)

This is a technical amendment ensuring that references to assessments of user-to-user services in the new clause inserted by NC7 relate to the user-to-user part of the service.

Clause 82, as amended, ordered to stand part of the Bill.

Schedule 11

Categories of regulated user-to-user services and regulated search services: regulations

Paul Scully

I beg to move amendment 76, in schedule 11, page 213, line 11, at end insert

“, and

(c) any other characteristics of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant.”

This amendment provides that regulations specifying Category 1 threshold conditions for the user-to-user part of regulated user-to-user services must also include conditions relating to any other characteristics of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant.

The Chair

With this, it will be convenient to discuss Government amendments 77 to 79, 81 to 84, 86 to 91 and 93.

Paul Scully

These Government amendments seek to change the approach to category 1 designation, following the removal from the Bill of the adult safety duties and the concept of “legal but harmful” content. Through the proposed new duties on category 1 services, we aim to hold companies accountable to their terms of service, as we have said. I seek to remove all requirements on category 1 services relating to harmful content, so it is no longer appropriate to designate them with reference to harm. Consequently, the amendments in this group change the approach to designating category 1 services, to ensure that only the largest companies with the greatest influence over public discourse are designated as category 1 services.

Specifically, these amendments will ensure that category 1 services are designated where they have functionalities that enable the easy, quick and wide dissemination of user-generated content; the requirement for category 1 services to meet a number-of-users threshold remains unchanged.

The amendments also give the Secretary of State the flexibility to consider other characteristics of services, as well as other relevant factors. Those characteristics might include a service’s functionalities, the user base, the business model, governance, and other systems and processes. That gives the designation process greater flexibility to ensure that services are designated category 1 services only when they have significant influence over public discourse.

The amendments also seek to remove the use of criteria for content that is harmful to adults from category 2B, and we have made a series of consequential amendments to the designation process for categories 2A and 2B to ensure consistency.

Alex Davies-Jones

I have commented extensively on the flaws in the categorisation process in this and previous Committees, so I will not retread old ground. I accept the amendments in this grouping. They show that the Government are prepared to broaden the criteria for selecting which companies are likely to be in category 1. That is a very welcome, if not subtle, shift in the right direction.

The amendments bring the characteristics of a company’s service into consideration, which will be a slight improvement on the previous focus on size and functionality, so we welcome them. The distinction is important, because size and functionality alone are obviously very vague indicators of harm, or the threat of harm.

We are pleased to see that the Government have allowed for a list to be drawn up of companies that are close to the margins of category 1, or that are emerging as category 1 companies. This is a positive step for regulatory certainty, and I hope that the Minister will elaborate on exactly how the assessment will be made.

However, I draw the Minister’s attention to Labour’s long-held concern about the Bill’s over-reliance on powers afforded to the Secretary of State of the day. We debated this concern in a previous sitting. I press the Minister again on why these amendments, and the regulations around the threshold conditions, are ultimately only for the Secretary of State to consider, depending on characteristics or factors that only he or she, whoever they may be, deems relevant.

We appreciate that the regulations need some flexibility, but we have genuine concerns—indeed, colleagues from all parties have expressed such concerns—that the Bill will give the Secretary of State far too much power to determine how the entire online safety regime is imposed. I ask the Minister to give the Committee an example of a situation in which it would be appropriate for the Secretary of State to make such changes without any consultation with stakeholders or the House.

It is absolutely key for all of us that transparency should lie at the heart of the Bill. Once again, we fear that the amendments are a subtle attempt by the Government to impose on what is supposed to be an independent regulatory process the whim of one person. I would appreciate assurance on that point. The Minister knows that these concerns have long been held by me and colleagues from all parties, and we are not alone in those concerns. Civil society groups are also calling for clarity on exactly how decisions will be made, and particularly on what information will be used to determine a threshold. For example, do the Government plan on quantifying a user base, and will the Minister explain how the regime would work in practice, when we know that a platform’s user base can fluctuate rapidly? We have seen that already with Mastodon; the latter’s users have increased incredibly as a result of Elon Musk’s takeover of Twitter. I hope that the Minister can reassure me about those concerns. He will know that this is a point of contention for colleagues from across the House, and we want to get the Bill right before we progress to Report.

11:45
Kirsty Blackman

My understanding is that only a very small number of platforms will reach the category 1 threshold. We are talking about the platforms that everybody has heard of—Facebook, Twitter and so on—and not about the slightly smaller platforms that lots of people have heard of and use. We are probably not talking about platforms such as Twitch, which has a much smaller user base than Facebook and Twitter but has a massive reach. My concern continues to be that the number threshold does not take into account the significant risks of harm from some of those platforms.

I have a specific question about amendment 76. I agree with my Labour Front-Bench colleague, the hon. Member for Pontypridd, that it shows that the Government are willing to take into account other factors. However, I am concerned that the Secretary of State is somehow being seen as the arbiter of knowledge—the person who is best placed to make the decisions—when much more flexibility could have been given to Ofcom instead. From all the evidence I have heard and all the people I have spoken to, Ofcom seems much more expert in dealing with what is happening today than any Secretary of State could ever hope to be. There is no suggestion about how the Secretary of State will consult, get information and make decisions on how to change the threshold conditions.

It is important that other characteristics that may not relate to functionalities are included if we discover that there is an issue with them. For example, I have mentioned livestreaming on a number of occasions in Committee, and we know that livestreaming is inherently incredibly risky. The Secretary of State could designate livestreaming as a high-risk functionality, and it could be included, for example, in category 1. I do not know whether it will be, but we know that there are risks there. How will the Secretary of State get that information?

There is no agreement to set up a user advocacy board. The requirement for Ofcom to consult the Children’s Commissioner will be brought in later, but organisations such as the National Society for the Prevention of Cruelty to Children, which deals with phone calls from children asking for help, are most aware of emerging threats. My concern is that the Secretary of State cannot possibly be close enough to the issue to make decisions, unless they are required to consult and listen to organisations that are at the coal face and that regularly support people. I shall go into more detail about high-harm platforms when we come to amendment 104.

Paul Scully

The amendments give the Secretary of State the flexibility to consider other characteristics of services as well as other relevant factors, which include functionalities, user base, business model, governance, and other systems and processes. They effectively introduce greater flexibility into the designation process, so that category 1 services are designated only if they have significant influence over public discourse. Although the Secretary of State will make the regulations, Ofcom will carry out the objective and evidence-based process, which will be subject to parliamentary scrutiny via statutory instruments. The Secretary of State will have due consultation with Ofcom at every stage, but to ensure flexibility and the ability to move fast, it is important that the Secretary of State has those powers.

Amendment 76 agreed to.

Kirsty Blackman

I beg to move amendment 104, in schedule 11, page 213, line 11, at end insert—

“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”

This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.

I would say this, but I think that this is the most important amendment. The key area that the Government are getting wrong is the way in which platforms, providers or services will be categorised. The threshold is based on the number of users. It is the number of users “and” one of those other things, not the number of users “or” one of those other things; even that would make a significant difference.

The Secretary of State talked about the places that have a significant influence over public discourse. It is perfectly possible to have a significant influence over public discourse with a small number of users, or with a number of users that does not number into the millions. We have seen the spread of conspiracy theories that have originated and been perpetuated on very small platforms—very small, shady places on the internet that none of us has experienced or even heard of. Those are the places that have a massive impact and effect.

We know that one person can have a significant impact on the world and on people’s lives. We have heard about the physical harm that people can be incited to cause by the platforms they access, and the radicalisation and extremism they find themselves subject to. That can cause massive, damaging effects to anybody they choose to take physical action against, and to some of the most marginalised communities and groups in society. We are seeing an increase in the amount of hate crime and the number of people who believe conspiracy theories, and not all of that is because of the spread of those things on Facebook and Twitter. It is because of the breadcrumbing and the spread that there can be on smaller platforms.

The most extreme views do not necessarily tip over into “illegal” or “incitement”; they do not actually say, “Please go out and kill everybody in this particular group.” They say, “This particular group is responsible for all of the ills you feel and for every negative thing that is happening in your life”, and people are therefore driven to take extremist, terrorist action. That is a significant issue.

I want to talk about a couple of platforms. Kiwi Farms, which is no longer in existence and has been taken down, was a very small platform that dramatically damaged the lives of trans people in particular. It was a platform where people went to incite hatred and give out the addresses of folk who they knew were members of the trans community. Some of those people had to move to another continent to get away from the physical violence and attacks they faced as a result of the behaviour on that incredibly small platform, which very few people will have heard about.

Kiwi Farms has been taken down because the internet service providers decided that it was too extreme and they could not possibly host it any more. That was eventually recognised and change was made, but the influence that that small place had on lives—the difficulties and harm it caused—is untold. Some of that did tip over into illegality, but some did not.

I also want to talk about the places where there is a significant amount of pornography. I am not going to say that I have a problem with pornography online; the internet will always have pornography on it. It attracts a chunk of people to spend time online, and some of that pornography is on large mainstream sites. Searches for incest, underage girls, or black women being abused all get massive numbers of hits. There is a significant amount of pornography on these sites that is illegal, that pretends to be illegal or that acts against people with protected characteristics. Research has found that a significant proportion—significantly more than a half—of pornography on mainstream sites that involves black women also involves violence. That is completely and totally unacceptable, and has a massive negative impact on society, whereby it reinforces negativity and discrimination against groups that are already struggling with being discriminated against and that do not experience the privilege of a cis white man.

It is really grim that we are requiring a number of users to be specified, when we know the harm that is caused by platforms that do not have 10 million or 20 million United Kingdom users. I do not know what the threshold will be, but I know it will be too high to include a lot of platforms that have a massive effect. The amendment is designed specifically to give Ofcom the power to designate as category 1 any service that it thinks has a very high risk of harm; I have not set the bar particularly low. Now that the Minister has increased the levels of transparency that will be required for category 1 platforms, it is even more important that we subject extremist sites and platforms—the radicalising ones, which are perpetuating discrimination—to a higher bar and require them to have the transparency that they need as a category 1 service. This is a place where the Bill could really make a difference and change lives, and I am really concerned that it is massively failing to do so.

The reason I have said that it should be Ofcom’s responsibility to designate category 1 services is on the basis that it has the experts who will be looking at all the risk assessments, dealing with companies on a day-to-day basis, and seeing the harms and transparencies that the rest of us will not be able to see. The reporting mechanisms will be public for only some of the category 1 platforms, and we will not be able to find out the level of information that Ofcom has, so it is right that it should be responsible for designating sites as having a very high risk of harm. That is why I tabled the amendment, which would make a massive difference to people who are the most discriminated against as it is and who are the most at risk of harm from extremism. I urge the Minister to think again.

Alex Davies-Jones

I rise briefly to support everything the hon. Member for Aberdeen North just said. We have long called for the Bill to take a harm-led approach; indeed, the Government initially agreed with us, as when it was in its first iteration it was called the Online Harms Bill rather than the Online Safety Bill. Addressing harm must be a central focus of the Bill, as we know extremist content is perpetuated on smaller, high-harm platforms; this is something that the Antisemitism Policy Trust and Hope not Hate have long called for with regards to the Bill.

I want to put on the record our huge support for the amendment. Should the hon. Lady be willing to push it to a vote—I recognise that we are small in number—we will absolutely support her.

Damian Collins

I want to speak briefly to the amendment. I totally understand the reasons that the hon. Member for Aberdeen North has tabled it, but in reality, the kinds of activities she describes would be captured anyway, because most would fall within the remit of the priority illegal harms that all platforms and user-to-user services have to follow. If there were occasions when they did not, being included in category 1 would mean that they would be subject to the additional transparency of terms of service, but the smaller platforms that allow extremist behaviour are likely to have extremely limited terms of service. We would be relying on the priority illegal activity to set the minimum safety standards, which Ofcom would be able to do.

It would also be an area where we would want to move at pace. Even if we wanted to bring in extra risk assessments on terms of service that barely exist, the time it would take to do that would not give a speedy resolution. It is important that in the way Ofcom exercises its duties, it does not just focus on the biggest category 1 platforms but looks at how risk assessments for illegal activity are conducted across a wide range of services in scope, and that it has the resources needed to do that.

Even within category 1, it is important that that is done. We often cite TikTok, Instagram and Facebook as the biggest platforms, but I recently spoke to a teacher in a larger secondary school who said that by far the worst platform they have to deal with in terms of abuse, bullying, intimidation, and even sharing of intimate images between children, is Snapchat. We need to ensure that those services get the full scrutiny they should have, because they are operating at the moment well below their stated terms of service, and in contravention of the priority illegal areas of harm.

Paul Scully

As debated earlier, we are removing the adult safety duties from the Bill, which means that no company will face any duties related to legal but harmful content. In their place, the Government are introducing new transparency, accountability and free speech duties on category 1 services. They have been discussed in detail earlier this session.

It would not be proportionate to apply those new duties to smaller services, but, as we have heard from my hon. Friend the Member for Folkestone and Hythe, they will still have to comply with the illegal content and child safety duties if they are accessed by children. Those services have limited resources, and blanket applying additional duties on them would divert those resources away from complying with the illegal content and child safety duties. That would likely weaken the duties’ impact on tackling criminal activity and protecting children.

The new duties are about user choice and accountability on the largest platforms—if users do not want to use smaller harmful sites, they can choose not to—but, in recognition of the rapid pace with which companies can grow, I introduced an amendment earlier to create a watchlist of companies that are approaching the category 1 threshold, which will ensure that Ofcom can monitor rapidly scaling companies, reduce any delay in designating companies as category 1 services, and apply additional obligations on them.

The hon. Member for Aberdeen North talked about ISPs acting with respect to Kiwi Farms. I talked on Tuesday about the need for a holistic approach. There is not one silver bullet. It is important to look at Government, the platforms, parenting and ISPs, because that makes up a holistic view of how the internet works. It is the multi-stakeholder framework of governing the internet in its entirety, rather than the Government trying to do absolutely everything. We have talked a lot about illegality, and I think that a lot of the areas in that case were illegal; the hon. Lady described some very distasteful things. None the less, with the introduction of the watchlist, I do not believe amendment 104 is required.

Kirsty Blackman

The hon. Member for Folkestone and Hythe made a good point. I do not disagree that Ofcom will have a significant role in policing platforms that are below the category 1 threshold. I am sure it will be very hands on, particularly with platforms that have the highest risk and are causing the most harm.

I still do not think that is enough. I do not think that the Minister’s change with regard to emerging platforms should be based on user numbers. It is reasonable for us to require platforms that encourage extremism, spread conspiracy theories and have the most horrific pornography on them to meet a higher bar of transparency. I do not really care if they only have a handful of people working there. I am not fussed if they say, “Sorry, we can’t do this.” If they cannot keep people safe on their platform, they should have to meet a higher transparency bar, provide more information on how they are meeting their terms of service and provide toggles—all those things. It does not matter how small these platforms are. What matters is that they have massive risks and cause massive amounts of harm. It is completely reasonable that we hold them to a higher regulatory bar. On that basis, I will push the amendment to a vote.

Division 6

Ayes: 4

Noes: 8

Amendments made: 77, in schedule 11, page 213, line 16, after “other” insert
“characteristics of the search engine or”.
This amendment provides that regulations specifying Category 2A threshold conditions for the search engine of regulated search services must also include conditions relating to any other characteristics of the search engine that the Secretary of State considers relevant.
Amendment 78, in schedule 11, page 213, line 23, after “other” insert
“characteristics of that part of the service or”.
This amendment provides that regulations specifying Category 2B threshold conditions for the user-to-user part of regulated user-to-user services must also include conditions relating to any other characteristics of that part of the service that the Secretary of State considers relevant.
Amendment 79, in schedule 11, page 213, line 36, leave out from “on” to “disseminated” in line 37 and insert
“how easily, quickly and widely regulated user-generated content is”.
This amendment provides that in making regulations specifying Category 1 threshold conditions the Secretary of State must take into account the impact of certain matters in relation to which conditions must be specified on how easily, quickly and widely regulated user-generated content is disseminated by means of the service.
Amendment 80, in schedule 11, page 214, line 2, leave out from “illegal content” to “disseminated” in line 3 and insert
“and content that is harmful to children”.
This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 81, in schedule 11, page 214, line 12, leave out “the relationship between”.
This amendment is consequential on Amendment 83 (which provides for additional matters that OFCOM must carry out research into).
Amendment 82, in schedule 11, page 214, line 13, leave out from beginning to “by” and insert
“how easily, quickly and widely regulated user-generated content is disseminated”.
This amendment provides that research required to be carried out by OFCOM before regulations specifying Category 1 threshold conditions may be made must include research into how easily, quickly and widely regulated user-generated content is disseminated by means of regulated user-to-user services.
Amendment 83, in schedule 11, page 214, line 16, at end insert
“, and
(c) such other characteristics of that part of such services or factors relating to that part of such services as OFCOM consider to be relevant to specifying the Category 1 threshold conditions.”
This amendment provides that research required to be carried out by OFCOM before regulations specifying Category 1 threshold conditions may be made must include research into other characteristics or factors of the user-to-user part of regulated user-to-user services as OFCOM consider relevant to specifying the Category 1 threshold conditions.
Amendment 84, in schedule 11, page 214, line 24, after “other” insert “characteristics or”.
This amendment provides that research required to be carried out by OFCOM before regulations specifying Category 2A threshold conditions may be made must also include research into characteristics of the search engine of regulated search services and combined services as OFCOM consider relevant to specifying the Category 2A threshold conditions.
Amendment 85, in schedule 11, page 214, line 29, leave out from “illegal content” to “by” in line 30 and insert
“and content that is harmful to children”.
This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 86, in schedule 11, page 214, line 34, leave out “factors” and insert
“characteristics of that part of such services or factors relating to that part of such services”.
This amendment provides that research required to be carried out by OFCOM before regulations specifying Category 2B threshold conditions may be made must include research into such other characteristics of the user-to-user part of regulated user-to-user services as OFCOM consider relevant to specifying the Category 2B threshold conditions.
Amendment 87, in schedule 11, page 214, leave out lines 40 to 42.
This amendment and Amendments 88 to 90 (which provide that OFCOM’s advice as to what provision is appropriate for regulations under paragraph 1(1), (2) or (3) of Schedule 11 to make, may include advice that the regulations include other characteristics or factors) are consequential on Amendments 76 to 78.
Amendment 88, in schedule 11, page 214, line 44, at beginning insert “characteristic or”.
This amendment and Amendments 87, 89 and 90 (which provide that OFCOM’s advice as to what provision is appropriate for regulations under paragraph 1(1), (2) or (3) of Schedule 11 to make, may include advice that the regulations include other characteristics or factors) are consequential on Amendments 76 to 78.
Amendment 89, in schedule 11, page 214, line 45, leave out “1(3)” and insert “1(1) or (3)”.
This amendment and Amendments 87, 88 and 90 (which provide that OFCOM’s advice as to what provision is appropriate for regulations under paragraph 1(1), (2) or (3) of Schedule 11 to make, may include advice that the regulations include other characteristics or factors) are consequential on Amendments 76 to 78.
Amendment 90, in schedule 11, page 214, line 45, after “other” insert “characteristic or”.
This amendment and Amendments 87 to 89 (which provide that OFCOM’s advice as to what provision is appropriate for regulations under paragraph 1(1), (2) or (3) of Schedule 11 to make, may include advice that the regulations include other characteristics or factors) are consequential on Amendments 76 to 78.
Amendment 91, in schedule 11, page 216, line 38, at end insert—
“5A In this Schedule the ‘characteristics’ of a user-to-user part of a service or a search engine include its user base, business model, governance and other systems and processes.”
This amendment defines “characteristics” of a user-to-user part of a service or search engine for the purposes of Schedule 11.
Amendment 92, in schedule 11, page 216, leave out lines 43 and 44.
This amendment is consequential on Amendment 41 (removal of clause 55).
Amendment 93, in schedule 11, page 216, line 44, at end insert—
“‘regulated user-generated content’ has the same meaning as in Part 3 (see section 50);”—(Paul Scully.)
This amendment defines “regulated user-generated content” for the purposes of Schedule 11.
Schedule 11, as amended, agreed to.
Clause 87
Power to require information
Amendment made: 50, in clause 87, page 78, line 18, at end insert—
“(iiia) any duty set out in section (Duty not to act against users except in accordance with terms of service) or (Further duties about terms of service) (terms of service),”—(Paul Scully.)
This amendment mentions the new duties imposed by NC3 and NC4 in the clause that sets out the purposes for which OFCOM may require people to provide information.
Clause 87, as amended, ordered to stand part of the Bill.
Clause 90
Reports by skilled persons
Amendments made: 51, in clause 90, page 82, line 5, leave out “12,”.
This amendment is consequential on Amendment 6 (removal of clause 12).
Amendment 52, in clause 90, page 82, line 8, leave out sub-paragraph (iv).
This amendment is consequential on Amendment 7 (removal of clause 13).
Amendment 53, in clause 90, page 82, line 16, at end insert—
“(xiia) section (Duty not to act against users except in accordance with terms of service) or (Further duties about terms of service) (terms of service);”.—(Paul Scully.)
This amendment has the effect that OFCOM may require a skilled person’s report in relation to compliance with the new duties imposed by NC3 and NC4.
Clause 90, as amended, ordered to stand part of the Bill.
Clause 115
Requirements enforceable by OFCOM against providers of regulated services
The Chair

We now come to Government amendments 54 and 55 to clause 115.

Alex Davies-Jones

I do not wish to test the Committee’s patience. I know we need to get the Bill over the line quickly, so I do not wish to delay it by talking over old ground that we covered in the previous Public Bill Committee on clauses that we support. We do support the Government on this clause, but I will make some brief comments because, as we know, clause 115 is important. It lists the enforceable requirements for which failure to comply can trigger enforcement action.

The Chair

Order. I think the hon. Lady is speaking to clause 115. This is Government amendments 54 and 55 to clause 115. I will call you when we get to that place, which will be very soon, so stay alert.

Alex Davies-Jones

Apologies, Dame Angela. I got carried away.

Amendments made: 54, in clause 115, page 98, leave out lines 35 and 36.

This amendment is consequential on Amendments 6 and 7 (removal of clauses 12 and 13).

Amendment 55, in clause 115, page 99, line 19, at end insert—

“Section (Duty not to act against users except in accordance with terms of service)

Acting against users only in accordance with terms of service

Section (Further duties about terms of service)

Terms of service”

—(Paul Scully.)

This amendment ensures that OFCOM are able to use their enforcement powers in Chapter 6 of Part 7 in relation to a breach of any of the new duties imposed by NC3 and NC4.

Question proposed, That the clause, as amended, stand part of the Bill.

The Chair

We now come to clause 115 stand part.

Alex Davies-Jones

Thank you, Dame Angela—take 2.

Clause 115 focuses on the enforcement action that may be taken and will be triggered if a platform fails to comply. Given that the enforceable requirements may include, for example, duties to carry out and report on risk assessments and general safety duties, it is a shame that the Government have not seen the merits of going further with these provisions. I point the Minister to the previous Public Bill Committee, where Labour made some sensible suggestions for how to remedy the situation. Throughout the passage of the Bill, we have made it abundantly clear that more access to, and availability of, data and information about systems and processes would improve understanding of the online environment.

We cannot and should not rely solely on Ofcom to act as problems arise when they could be spotted earlier by experts somewhere else. We have already heard the Minister outline the immense task that Ofcom has ahead of it to monitor risk assessments and platforms, ensuring that platforms comply and taking action where there is illegal content and a risk to children. It is important that Ofcom has at its disposal all the help it needs.

It would be helpful if there were more transparency about how the enforcement provisions work in practice. We have repeatedly heard that without independent researchers accessing data on relevant harm, platforms will have no real accountability for how they tackle online harm. I hope that the Minister can clarify why, once again, the Government have not seen the merit of encouraging transparency in their approach. It would be extremely valuable and helpful to both the online safety regime and the regulator as a whole, and it would add merit to the clause.

Paul Scully

We have talked about the fact that Ofcom will have robust enforcement powers. It can direct companies to take specific steps to come into compliance or to remedy failure to comply, as well as issue fines and apply to the courts for business disruption measures. Indeed, Ofcom can institute criminal proceedings against senior managers who are responsible for compliance with an information notice, when they have failed to take all reasonable steps to ensure the company’s compliance with that notice. That criminal offence will commence two months after Royal Assent.

Ofcom will be required to produce enforcement guidelines, as it does in other areas that it regulates, explaining how it proposes to use its enforcement powers. It is important that Ofcom is open and transparent, and that companies and people using the services understand exactly how to comply. Ofcom will provide those guidelines. People will be able to see who are the users of the services. The pre-emptive work will come from the risk assessments that platforms themselves will need to produce.

We will take a phased approach to bringing the duties under the Bill into effect. Ofcom’s initial focus will be on illegal content, so that the most serious harms can be addressed as soon as possible. When those codes of practice and guidelines come into effect, the hon. Member for Pontypridd will see some of the transparency and openness that she is looking for.

Question put and agreed to.

Clause 115, as amended, accordingly ordered to stand part of the Bill.

Clause 155

Review

Amendment made: 56, in clause 155, page 133, line 27, after “Chapter 1” insert “or 2A”.—(Paul Scully.)

Clause 155 is about a review by the Secretary of State of the regulatory framework established by this Bill. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6.

12:14
Question proposed, That the clause, as amended, stand part of the Bill.
Kirsty Blackman

I am glad that there is a review function in the Bill. I have been a member of a lot of Bill Committees and Delegated Legislation Committees that have considered legislation that has no review function and that says, “This will be looked at in the normal course of departmental reviews.” We know that not all Departments always do such reviews. In fact, some Departments do under 50% of the reviews that they are supposed to do, and whether reviews take place is not checked. We therefore do not find out whether a piece of legislation has had the intended effect. I am sure some will have done, but some definitely will not.

If the Government do not internally review whether a Bill or piece of delegated legislation has had the effect it was supposed to have, they cannot say whether it has been a success and cannot make informed decisions about future legislation, so having a review function in this Bill is really good. However, that function is insufficient, as it is not enough for the Secretary of State to do the review, and we will not see enough outputs from Ofcom.

The Bill has dominated the lives of a significant number of parliamentarians for the past year—longer, in some cases—because it is so important and because it has required so much scrutiny, thinking and information gathering to get to this stage. That work will not go away once the Bill is enacted. Things will not change or move at once, and parts of the legislation will not work as effectively as they could, as is the case for any legislation, whether moved by my Government or somebody else’s. In every piece of legislation there will be things that do not pan out as intended, but a review by the Secretary of State and information from Ofcom about how things are working do not seem to be enough.

Committee members, including those on the Government Benches, have suggested having a committee to undertake the review or adding that function to the responsibilities of the Digital, Culture, Media and Sport Committee. We know that the DCMS Committee is busy and will be looking into a significant number of wide-ranging topics, so it would be difficult for it to keep a watching brief on the Online Safety Bill.

The previous Minister said that there will be some sort of reviewing mechanism, but I would like further commitment from the Government that the Bill will be kept under review and that the review process as set out will not be the only type of review that happens as things move and change and the internet develops. Many people talk about more widespread use of virtual reality, for example, but there could be other things that we have not even heard of yet. After the legislation is implemented, it will be years before every part of the Bill is in action and every requirement in the legislation is working. By the time we get to 2027-28—or whenever every part of the legislation is working—things could have changed again and be drastically different to today. Indeed, the legislation may not be fit for purpose when it first starts to work, so will the Minister provide more information about what the review process will look like on an ongoing basis? The Government say this is world-leading legislation, but how will we ensure that that is the case and that it makes a difference to the safety and experience of both children and adults online?

Paul Scully

I am glad that we are all in agreement on the need for a review. It is important that we have a comprehensive and timely review of the regulatory regime and how it is built into legislation. It is important that we understand that the legislation has the impact that we intend.

The legislation clearly sets out what the review must consider: how Ofcom is carrying out its role and whether the legislation is effective in dealing with child protection, which as the hon. Lady rightly says is its core purpose. We have struck the balance of specifying two to five years after the regime comes into force, because that provides a degree of flexibility for future Ministers to judge when it should happen. None the less, I take the hon. Lady’s point that technology is developing. That is why this legislation is a front-footed first move, with other countries looking at what we are doing; because of its less prescriptive approach to technologies, it can be flexible and adapt to emerging new technologies. Inevitably, this will not be the last word. Some of the things in the Digital Economy Act 2017, for example, are already out of date, as is some of the other legislation that was put in place in the early 2000s. We will inevitably come back to this, but I think we have the right balance at the moment in terms of the timing.

I do not think we need to bed in whom we consult, but wider consultation will none the less be necessary to ascertain the effectiveness of the legislation.

Damian Collins

I am following carefully what the Minister says, but I would say briefly that a lot of the debate we have had at all stages of the Bill has rested on how we believe Ofcom will use the powers it has been given, and we need to make sure that it does that. We need to ensure that it is effective and that it has the resources it needs. The hon. Member for Aberdeen North (Kirsty Blackman) makes an important point that it may not be enough to rely on a Select Committee of the Lords or the Commons having the time to do that in the detail we would want. We might need to consider either a post-legislative scrutiny Committee or some other mechanism to ensure that there is the necessary level of oversight.

Paul Scully

My hon. Friend is absolutely right. The report, as it stands, obviously has to be laid before Parliament and will form part of the package of parliamentary scrutiny. But, yes, we will consider how we can utilise the expertise of both Houses in post-legislative scrutiny. We will come back on that.

Question put and agreed to.

Clause 155, as amended, accordingly ordered to stand part of the Bill.

Clause 169

Individuals providing regulated services: liability

Amendment made: 57, in clause 169, page 143, line 15, at end insert—

“(fa) Chapter 2A of Part 4 (terms of service: transparency, accountability and freedom of expression);”.—(Paul Scully.)

Clause 169 is about liability of providers who are individuals. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6, so that individuals may be jointly and severally liable for the duties imposed by that Chapter.

Clause 169, as amended, ordered to stand part of the Bill.

Clause 183 ordered to stand part of the Bill.

Schedule 17

Video-sharing platform services: transitional provision etc

Amendments made: 94, in schedule 17, page 235, line 43, leave out paragraph (c).

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 95, in schedule 17, page 236, line 27, at end insert—

“(da) the duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (terms of service);”.—(Paul Scully.)

This amendment ensures that services already regulated under Part 4B of the Communications Act 2003 (video-sharing platform services) are not required to comply with the new duties imposed by NC3 and NC4 during the transitional period.

Question proposed, That the schedule, as amended, be the Seventeenth schedule to the Bill.

Alex Davies-Jones

Labour welcomes schedule 17, which the Government introduced on Report. We see this schedule as clarifying exactly how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework. The schedule is fundamentally important for both providers and users, as it establishes the formal requirements of these platforms as we move the requirement to this new legislation.

We welcome the clarification in paragraph 1(1) of the definition of a qualifying video-sharing service. On that point, I would be grateful if the Minister clarified the situation around livestreaming video platforms and whether this schedule would also apply to them. Throughout this Bill Committee, we have heard just how dangerous and harmful live video-sharing platforms can be, so this is an important point to clarify.

I have spoken at length about the importance of capturing the harms on these platforms, particularly in the context of child sexual exploitation being livestreamed online, which, thanks to the brilliant work of International Justice Mission, we know is a significant and widespread issue. I must make reference to the IJM’s findings from its recent White Paper, which highlighted the extent of the issue in the Philippines, which is widely recognised as a source country for livestreamed sexual exploitation of children. It found that traffickers often use cheap Android smartphones with pre-paid cellular data services to communicate with customers and produce and distribute explicit material. To reach the largest possible customer base, they often connect with sexually motivated offenders through everyday technology—the same platforms that the rest of us use to communicate with friends, family and co-workers.

One key issue in assessing the extent of online sexual exploitation of children is that we are entirely dependent on the detection of the crime, but the reality is that most current technologies that are widely used to detect various forms of online sexual exploitation of children are not designed to recognise livestreaming video services. This is an important and prolific issue, so I hope the Minister can assure me that the provisions in the schedule will apply to those platforms too.

Paul Scully

We are setting out in schedule 17 how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to these providers as they transition to the online safety framework. My understanding is that it does include livestreaming, but I will obviously write to the hon. Lady if I have got that wrong. I am not sure there is a significant legal effect here. To protect children and treat services fairly while avoiding unnecessary burdens on business, we are maintaining the current user protections in the VSP regime while the online safety framework is being implemented. That approach to transition avoids the duplication of regulation.

Question put and agreed to.

Schedule 17, as amended, accordingly agreed to.

Clause 203

Interpretation: general

Kirsty Blackman

I beg to move amendment 105, in clause 203, page 167, line 8, after “including” insert “but not limited to”.

This amendment makes clear that the definition provided for content is not exhaustive.

I am delighted that we have a new Minister, because I can make exactly the same speech as I made previously in Committee—don’t worry, I won’t—and he will not know.

I still have concerns about the definition of “content”. I appreciate that the Government have tried to include a number of things in the definition. It currently states:

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.

That is pretty wide-ranging, but I do not think it takes everything into account. I know that it uses the word “including”; it does not say “only limited to” or anything like that. If there is to be a list of stuff, it should be exhaustive. That is my idea of how the Bill should be.

I have suggested in amendment 105 that we add “not limited to” after “including” in order to be absolutely clear that the content that we are talking about includes anything. It may or may not be on this list. Something that is missing from the list is VR technology. If someone is using VR or immersive technology and is a character on the screen, they can see what the character is doing and move their body around as that character, and whatever they do is user-generated content. It is not explicitly included in the Bill, even though there is a list of things. I do not even know how that would be written down in any way that would make sense.

I have suggested adding “not limited to” to make it absolutely clear that this is not an exhaustive list of the things that could be considered to be user-generated content or content for the purposes of the Bill. It could be absolutely anything that is user-generated. If the Minister is able to make it absolutely clear that this is not an exhaustive list and that “content” could be anything that is user-generated, I will not press the amendment to a vote. I would be happy enough with that commitment.

Paul Scully

Indeed I can give that commitment. This is an indicative list, not an exhaustive list, for the reasons that the hon. Lady set out. Earlier, we discussed the fact that technology moves on, and she has come up with an interesting example. It is important to note that adding unnecessary words in legislation could lead to unforeseen outcomes when it is interpreted by courts, which is why we have taken this approach, but we think it does achieve the same thing.

Kirsty Blackman

On that basis, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendment proposed: 58, in clause 203, page 167, leave out lines 26 to 31. —(Paul Scully.)

This amendment removes the definition of the “maximum summary term for either-way offences”, as that term has been replaced by references to the general limit in a magistrates’ court.

12:30
Kirsty Blackman

I would like to ask the Minister why this amendment has been tabled. I am not entirely clear. Could he give us some explanation of the intention behind the amendment? I am pretty sure it will be fine but, if he could just let us know what it is for, that would be helpful.

Paul Scully

I am happy to do so. Clause 203 sets out the interpretation of the terms used throughout the Bill. Amendment 58 removes a definition that is no longer required because the term is no longer in the Bill. It is as simple as that. The definition of relevant crime penalties under the Bill now uses a definition that has been updated in the light of changes to sentencing powers in magistrates’ courts set out in the Judicial Review and Courts Act 2022. The new definition of

“general limit in a magistrates’ court”

is now included in the Interpretation Act 1978, so no definition is required in this Bill.

Question put and agreed to.

Amendment 58 accordingly agreed to.

Amendment made: 59, in clause 203, page 168, line 48, at end insert—

“and references to restrictions on access to a service or to content are to be read accordingly.” —(Paul Scully.)

NC2 states what is meant by restricting users’ access to content, and this amendment makes it clear that the propositions in clause 203 about access read across to references about restricting access.

Question proposed, That the clause, as amended, stand part of the Bill.

Kirsty Blackman

Once again, I will abuse the privilege of having a different Minister at the Dispatch Box and mention the fact that “oral communications” is mentioned in line 9, within the definition of “content” that we have already discussed. It is “oral communications” in this part of the Bill but “aural communications” in an earlier part of the Bill. I am still baffled as to why there is a difference. Perhaps both should be included in both of these sections, or perhaps there should be some level of consistency throughout the Bill.

The “aural communications” section that I mentioned earlier in clause 50 is one of the parts that I am particularly concerned about, because it could create a loophole. That is a different spelling of the word. I asked this last time. I am not convinced that the answer I got gave me any more clarity than I had previously. I would be keen to understand why there is a difference, whether the difference is intentional, and what the difference therefore is between “oral” and “aural” communications in terms of the Bill. My understanding is that oral communications are ones that are said and aural communications are ones that are heard. But, for the purposes of the Bill, those two things are really the same, unless user-generated content in which there is user-generated oral communication that no one can possibly hear is included. That surely does not fit into the definitions, because user-generated content is only considered if it is user-to-user—something that other people can see. Surely, oral communication would also be aural communication. In pretty much every instance that the Bill could possibly apply to, both definitions would mean the same thing. I understand the Minister may not have the answer to this at his fingertips, and I would be happy to hear from him later if that would suit him better.

Paul Scully

The clause provides legal certainty about the meaning of those terms as used in the Bill: things such as “content”, “encounter”, “taking down” and “terms of service”. That is what the clause is intended to do. It is intentional and is for the reasons the hon. Lady said. Oral means speech and speech only. Aural is speech and other sounds, which is what can be heard on voice calls. That includes music as well. One is speech. The other is the whole gamut.

Damian Collins

I am intrigued, because the hon. Member for Aberdeen North makes an interesting point. It is not one I have heard made before. Does the Minister think there is a distinction between oral and aural, where oral is live speech and aural is pre-recorded material that might be played back? Are those two considered distinct?

Paul Scully

My knowledge is being tested, so I will write to the hon. Member for Aberdeen North and make that available to the Committee. Coming back to the point about oral and aural that she made on Tuesday, in relation to another clause on the exclusions: as I said, we have a narrow exemption to ensure that traditional phone calls are not subject to regulation. But that does mean that if a service such as Fortnite, which she spoke about previously, enables adults and children to have one-to-one oral calls, companies will still need to address the surrounding functionality of how that happens, because enabling it might cause harm—for example, if an adult can contact an unknown child. That is still captured within the Bill.

Kirsty Blackman

Platforms will have to address, for example, the ways in which users can communicate with people who are not on their friends list. Things like that and other ways in which communication can be set up will have to be looked at in the risk assessment. With Discord, for instance, where two people can speak to each other, Discord will have to look at the way those people got into contact with each other and the risks associated with that, rather than the conversation itself, even though the conversation might be the only bit that involves illegality.

Paul Scully

It is the functionalities around it that enable the voice conversation to happen.

Question put and agreed to.

Clause 203, as amended, accordingly ordered to stand part of the Bill.

Clause 206

Extent

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

I would like to welcome the Government’s clarification, particularly as an MP representing a devolved nation within the UK. It is important to clarify the distinction between the jurisdictions, and I welcome that this clause does that.

Question put and agreed to.

Clause 206 accordingly ordered to stand part of the Bill.

Clause 207

Commencement and transitional provision

Amendment made: 60, in clause 207, page 173, line 15, leave out “to” and insert “and”.—(Paul Scully.)

This amendment is consequential on amendment 41 (removal of clause 55).

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones

Labour welcomes clause 207, which outlines the commencement and transitional provisions for the Bill to effectively come into existence. The Minister knows that Labour is concerned about the delays that have repeatedly held up the Bill’s progress, and I need not convince him of the urgent need for it to pass. I think contributions in Committee plus those from colleagues across the House as the Bill has progressed speak for themselves. The Government have repeatedly claimed they are committed to keeping children safe online, but have repeatedly failed to bring forward this legislation. We must now see commitments from the Minister that the Bill, once enacted, will make a difference right away.

Labour has specific concerns shared with stakeholders, from the Age Verification Providers Association to the Internet Watch Foundation, the NSPCC and many more, about the road map going forward. Ofcom’s plan for enforcement already states that it will not begin enforcement on harm to children from user-to-user content under part 3 of the Bill before 2025. Delays to the Bill as well as Ofcom’s somewhat delayed enforcement plans mean that we are concerned that little will change in the immediate future or even in the short term. I know the Minister will stand up and say that if the platforms want to do the right thing, there is nothing stopping them from doing so immediately, but as we have seen, they need convincing to take action when it counts, so I am not convinced that platforms will do the right thing.

Charlotte Nichols Portrait Charlotte Nichols (Warrington North) (Lab)
- Hansard - - - Excerpts

If the Government’s argument is that there is nothing to stop platforms taking such actions early, why are we discussing the Bill at all? Platforms have had many years to implement such changes, and the very reason we need this Bill is that they have not been.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Exactly. My hon. Friend makes an incredibly important point that goes to the heart of why we are here in the first place. If the platforms were not motivated by commercial interest and we could trust them to do the right thing on keeping children safe and reducing harm on their platforms, we would not require this legislation in the first place. But sadly, we are where we are, which is why it is even more imperative that we get on with the job, that Ofcom is given the tools to act swiftly, that the delay before those tools take effect is kept to a minimum, and that this legislation is enacted so that it actually makes a lasting difference.

Ofcom has already been responsible for regulating video-sharing platforms for two years, yet, despite being in year three, it is still only asking websites to provide a plan as to how they will be compliant. The reality is that we can expect little on child protection before 2027-28, which creates a massive gap compared with public expectations of when the Bill will be passed. We raised these concerns last time, and I took little assurance from the Minister then in post, so I wonder whether the current Minister can improve on his predecessor by ensuring a short timeline for when exactly the Bill can be implemented and Ofcom can act.

We all understand the need for the Bill, which my hon. Friend the Member for Warrington North just pointed out. That is why we have been supportive in Committee and throughout the passage of the Bill. But the measures that the Bill introduces must come into force as soon as is reasonably possible. Put simply, the industry is ready and users want to be protected online and are ready too. It is just the Government, sadly, and the regulator that would be potentially holding up implementation of the legislation.

The Minister has failed to concede on any of the issues that we have raised in Committee, despite being sympathetic and supportive. His predecessor was also incredibly supportive and sympathetic on everything we raised in Committee, yet failed to take into account a single amendment or issue that we raised. I therefore make a plea to this Minister at least to recognise the need to press matters forward on the timescale that is needed here. We have not sought to formally amend this clause, so I seek the Minister’s assurance that this legislation will be dealt with swiftly. I urge him to work with Labour, SNP colleagues and colleagues across the House to ensure that the legislation and the provisions in it are enacted and that there are no further unnecessary delays.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Our intention is absolutely to get this regime operational as soon as possible after Royal Assent. We have to get to Royal Assent first, so I am looking forward to working with all parties in the other House to get the legislation to that point. After that, we have to ensure that the necessary preparations are completed effectively and that service providers understand exactly what is expected of them. To answer the point made by the hon. Member for Warrington North about service providers, the key difference from what happened in the years that made this legislation necessary is that they will now know exactly what is expected of them—and it is literally expected of them, with legislation and with penalties coming down the line. They should not need to wait for the day one switch-on. They can be testing and working through things now to ensure that the system works on day one, months earlier.

The legislation does require some activity that can be carried out only after Royal Assent, such as public consultation or the laying of secondary legislation. The secondary legislation is important. We could have put more into primary legislation, but that would run counter to our aim of making this as flexible as possible, for the reasons that we have talked about, so that we do not have to keep coming back time and again for fear of the legislation being out of date almost before we get to implementation in the first place.

However, we are doing things at the moment. In November 2020, Ofcom began regulating harmful content online through the video-sharing platform regulatory regime. In December 2020, the Government published interim codes of practice on terrorist content and activity and on child sexual exploitation and abuse online. Those will help to bridge the gap until the regulator becomes operational. In June 2021, we published “safety by design” guidance and a one-stop shop of information for companies on protecting children online. In July 2021, we published the first Government online media literacy strategy. We encourage stakeholders, users and families to engage with and help to promote that wealth of material to minimise online harms and the threat of misinformation and disinformation. But clearly, we all want this measure to be on the statute book and implemented as soon as possible. We have talked a lot about child protection, and that is the core of what we are trying to do here.

Question put and agreed to.

Clause 207, as amended, accordingly ordered to stand part of the Bill.

New Clause 1

OFCOM’s guidance: content that is harmful to children and user empowerment

“(1) OFCOM must produce guidance for providers of Part 3 services which contains examples of content or kinds of content that OFCOM consider to be, or consider not to be—

(a) primary priority content that is harmful to children, or

(b) priority content that is harmful to children.

(2) OFCOM must produce guidance for providers of Category 1 services which contains examples of content or kinds of content that OFCOM consider to be, or consider not to be, content to which section 14(2) applies (see section 14(8A)).

(3) Before producing any guidance under this section (including revised or replacement guidance), OFCOM must consult such persons as they consider appropriate.

(4) OFCOM must publish guidance under this section (and any revised or replacement guidance).”—(Paul Scully.)

This new clause requires OFCOM to give guidance to providers in relation to the kinds of content that OFCOM consider to be content that is harmful to children and content relevant to the duty in clause 14(2) (user empowerment).

Brought up, and read the First time.

12:44
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

The Government are committed to empowering adults to have greater control over their online experience, and to protecting children from seeing harmful content online. New clause 1 places a new duty on Ofcom to produce and publish guidance for providers of regulated user-to-user services, in support of the crucial aims of empowering adults and of ensuring that providers have effective systems and processes in place. The guidance will provide further clarity, including through

“examples of content or kinds of content that OFCOM consider to be…primary priority”

or

“priority content that is harmful to children.”

Ofcom will also have to produce guidance that sets out examples of content that it considers to be relevant to the user empowerment duties, as set out in amendment 15 to clause 14.

It is really important that expert opinion is considered in the development of this guidance, and the new clause places a duty on Ofcom to consult with relevant persons when producing sets of guidance. That will ensure that the views of subject matter experts are reflected appropriately.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour is pleased to see the introduction of the new clause, which clarifies the role of Ofcom in delivering guidance to providers about their duties. Specifically, the new clause will require Ofcom to give guidance to providers on the kind of content that Ofcom considers to be harmful to children, or relevant to the user empowerment duty in clause 14. That is a very welcome addition indeed.

Labour remains concerned about exactly how these so-called user empowerment tools will work in practice—we have discussed that at length—and let us face it: we have had little assurance from the Minister on that point. We welcome the new clause, as it clarifies what guidance providers can expect to receive from Ofcom once the Bill is finally enacted. We can all recognise that Ofcom has a colossal task ahead of it—the Minister said so himself—so it is particularly welcome that the guidance will be subject to consultation with those that it deems appropriate. I can hope only that that will include the experts, and the many groups that provided expertise, support and guidance on internet regulation long before the Bill even received its First Reading, a long time ago. There are far too many of those experts and groups to list, but it is fundamental that the experts who often spot online harms before they properly emerge be consulted and included in this process if we are to truly capture the priority harms to children, as the new clause intends.

We also welcome the clarification in subsection (2) that Ofcom will be required to provide “examples of content” that would be considered to be—or not be—harmful. These examples will be key to ensuring that the platforms have nowhere to hide when it comes to deciding what is harmful; there will be no grey area. Ofcom will have the power to show them exact examples of what could be deemed harmful.

We recognise, however, that there is subjectivity to the work that Ofcom will have to do once the Bill passes. On priority content, it is most important that providers are clear about what is and is not acceptable; that is why we welcome the new clause, but we do of course wish that the Government applied the same logic to harm pertaining to adults online.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am also happy to support new clause 1, but I have a couple of questions. It mentions that “replacement guidance” may be provided, which is important because, as we have said a number of times, things will change, and we will end up with a different online experience; that can happen quickly. I am glad that Ofcom has the ability to refresh and update the guidance.

My question is about timelines. There do not seem to be any timelines in the new clause for when the guidance is required to be published. It is key that the guidance be published before companies and organisations have to comply with it. My preference would be for it to be published as early as possible. There may well need to be more work, and updated versions of the guidance may therefore need to be published, but I would rather companies had an idea of the direction of travel, and what they must comply with, as soon as possible, knowing that it might be tweaked. That would be better than waiting until the guidance was absolutely perfect and definitely the final version, but releasing it just before people had to start complying with it. I would like an assurance that Ofcom will make publishing the guidance a priority, so that there is enough time to ensure compliance. We want the Bill to work; it will not work if people do not know what they have to comply with. Assurance on that would be helpful.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I absolutely give that assurance to the hon. Lady; that is important. We all want the measures to be implemented, and the guidance to be out there, as soon as possible. Just now I talked about the platforms bringing in measures as soon as possible, without waiting for the implementation period. They can do that far better if they have the guidance. We are already working with Ofcom to ensure that the implementation period is as short as possible, and we will continue to do so.

Question put and agreed to.

New clause 1 accordingly read a Second time, and added to the Bill.

New Clause 2

Restricting users’ access to content

“(1) This section applies for the purposes of this Part.

(2) References to restricting users’ access to content, and related references, include any case where a provider takes or uses a measure which has the effect that—

(a) a user is unable to access content without taking a prior step (whether or not taking that step might result in access being denied), or

(b) content is temporarily hidden from a user.

(3) But such references do not include any case where—

(a) the effect mentioned in subsection (2) results from the use or application by a user of features, functionalities or settings which a provider includes in a service in compliance with the duty set out in section 14(2) (user empowerment), or

(b) access to content is controlled by another user, rather than the provider.

(4) See also section 203(5).”—(Paul Scully.)

This new clause deals with the meaning of references to restricting users’ access to content, in particular by excluding restrictions resulting from the use of user empowerment tools as described in clause 14.

Brought up, read the First and Second time, and added to the Bill.

New Clause 3

Duty not to act against users except in accordance with terms of service

“(1) A provider of a Category 1 service must operate the service using proportionate systems and processes designed to ensure that the provider does not—

(a) take down regulated user-generated content from the service,

(b) restrict users’ access to regulated user-generated content, or

(c) suspend or ban users from using the service,

except in accordance with the terms of service.

(2) Nothing in subsection (1) is to be read as preventing a provider from taking down content from a service or restricting users’ access to it, or suspending or banning a user, if such an action is taken—

(a) to comply with the duties set out in—

(i) section 9(2) or (3) (protecting individuals from illegal content), or

(ii) section 11(2) or (3) (protecting children from content that is harmful to children), or

(b) to avoid criminal or civil liability on the part of the provider that might reasonably be expected to arise if such an action were not taken.

(3) In addition, nothing in subsection (1) is to be read as preventing a provider from—

(a) taking down content from a service or restricting users’ access to it on the basis that a user has committed an offence in generating, uploading or sharing it on the service, or

(b) suspending or banning a user on the basis that—

(i) the user has committed an offence in generating, uploading or sharing content on the service, or

(ii) the user is responsible for, or has facilitated, the presence or attempted placement of a fraudulent advertisement on the service.

(4) The duty set out in subsection (1) does not apply in relation to—

(a) consumer content (see section (Interpretation of this Chapter));

(b) terms of service which deal with the treatment of consumer content.

(5) If a person is the provider of more than one Category 1 service, the duty set out in subsection (1) applies in relation to each such service.

(6) The duty set out in subsection (1) extends only to the design, operation and use of a service in the United Kingdom, and references in this section to users are to United Kingdom users of a service.

(7) In this section—

‘criminal or civil liability’ includes such a liability under the law of a country outside the United Kingdom;

‘fraudulent advertisement’ has the meaning given by section 35;

‘offence’ includes an offence under the law of a country outside the United Kingdom.

(8) See also section 16 (duties to protect news publisher content).”—(Paul Scully.)

This new clause imposes a duty on providers of Category 1 services to ensure that they do not take down content or restrict users’ access to it, or suspend or ban users, except in accordance with the terms of service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 4

Further duties about terms of service

All services

“(1) A provider of a regulated user-to-user service must include clear and accessible provisions in the terms of service informing users about their right to bring a claim for breach of contract if—

(a) regulated user-generated content which they generate, upload or share is taken down, or access to it is restricted, in breach of the terms of service, or

(b) they are suspended or banned from using the service in breach of the terms of service.

Category 1 services

(2) The duties set out in subsections (3) to (7) apply in relation to a Category 1 service, and references in subsections (3) to (9) to ‘provider’ and ‘service’ are to be read accordingly.

(3) A provider must operate a service using proportionate systems and processes designed to ensure that—

(a) if the terms of service state that the provider will take down a particular kind of regulated user-generated content from the service, the provider does take down such content;

(b) if the terms of service state that the provider will restrict users’ access to a particular kind of regulated user-generated content in a specified way, the provider does restrict users’ access to such content in that way;

(c) if the terms of service state cases in which the provider will suspend or ban a user from using the service, the provider does suspend or ban the user in those cases.

(4) A provider must ensure that—

(a) terms of service which make provision about the provider taking down regulated user-generated content from the service or restricting users’ access to such content, or suspending or banning a user from using the service, are—

(i) clear and accessible, and

(ii) written in sufficient detail to enable users to be reasonably certain whether the provider would be justified in taking the specified action in a particular case, and

(b) those terms of service are applied consistently.

(5) A provider must operate a service using systems and processes that allow users and affected persons to easily report—

(a) content which they consider to be relevant content (see section (Interpretation of this Chapter));

(b) a user who they consider should be suspended or banned from using the service in accordance with the terms of service.

(6) A provider must operate a complaints procedure in relation to a service that—

(a) allows for complaints of a kind mentioned in subsection (8) to be made,

(b) provides for appropriate action to be taken by the provider of the service in response to complaints of those kinds, and

(c) is easy to access, easy to use (including by children) and transparent.

(7) A provider must include in the terms of service provisions which are easily accessible (including to children) specifying the policies and processes that govern the handling and resolution of complaints of a kind mentioned in subsection (8).

(8) The kinds of complaints referred to in subsections (6) and (7) are—

(a) complaints by users and affected persons about content present on a service which they consider to be relevant content;

(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in any of subsections (1) or (3) to (5);

(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down, or access to it is restricted, on the basis that it is relevant content;

(d) complaints by users who have been suspended or banned from using a service.

(9) The duties set out in subsections (3) and (4) do not apply in relation to terms of service which—

(a) make provision of the kind mentioned in section 9(5) (protecting individuals from illegal content) or 11(5) (protecting children from content that is harmful to children), or

(b) deal with the treatment of consumer content.

Further provision

(10) If a person is the provider of more than one regulated user-to-user service or Category 1 service, the duties set out in this section apply in relation to each such service.

(11) The duties set out in this section extend only to the design, operation and use of a service in the United Kingdom, and references to users are to United Kingdom users of a service.

(12) See also section 16 (duties to protect news publisher content).”—(Paul Scully.)

Subsections (3) to (8) of this new clause impose new duties on providers of Category 1 services in relation to terms of service that allow a provider to take down content or restrict users’ access to it, or to suspend or ban users. Such terms of service must be clear and applied consistently. Subsection (1) of the clause contains a duty which, in part, was previously in clause 20 of the Bill.

Brought up, read the First and Second time, and added to the Bill.

New Clause 5

OFCOM’s guidance about duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service)

“(1) OFCOM must produce guidance for providers of Category 1 services to assist them in complying with their duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service)(3) to (7).

(2) OFCOM must publish the guidance (and any revised or replacement guidance).”—(Paul Scully.)

This new clause requires OFCOM to give guidance to providers about complying with the duties imposed by NC3 and NC4.

Brought up, read the First and Second time, and added to the Bill.

New Clause 6

Interpretation of this Chapter

“(1) This section applies for the purposes of this Chapter.

(2) “Regulated user-generated content” has the same meaning as in Part 3 (see section 50), and references to such content are to content that is regulated user-generated content in relation to the service in question.

(3) “Consumer content” means—

(a) regulated user-generated content that constitutes, or is directly connected with content that constitutes, an offer to sell goods or to supply services,

(b) regulated user-generated content that amounts to an offence under the Consumer Protection from Unfair Trading Regulations 2008 (S.I. 2008/1277) (construed in accordance with section 53: see subsections (3), (11) and (12) of that section), or

(c) any other regulated user-generated content in relation to which an enforcement authority has functions under those Regulations (see regulation 19 of those Regulations).

(4) References to restricting users’ access to content, and related references, are to be construed in accordance with sections (Restricting users’ access to content) and 203(5).

(5) Content of a particular kind is “relevant content” if—

(a) a term of service, other than a term of service mentioned in section (Further duties about terms of service)(9), states that a provider may or will take down content of that kind from the service or restrict users’ access to content of that kind, and

(b) it is regulated user-generated content.

References to relevant content are to content that is relevant content in relation to the service in question.

(6) “Affected person” means a person, other than a user of the service in question, who is in the United Kingdom and who is—

(a) the subject of the content,

(b) a member of a class or group of people with a certain characteristic targeted by the content,

(c) a parent of, or other adult with responsibility for, a child who is a user of the service or is the subject of the content, or

(d) an adult providing assistance in using the service to another adult who requires such assistance, where that other adult is a user of the service or is the subject of the content.

(7) In determining what is proportionate for the purposes of sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service), the size and capacity of the provider of a service is, in particular, relevant.

(8) For the meaning of “Category 1 service”, see section 83 (register of categories of services).”—(Paul Scully.)

This new clause gives the meaning of terms used in NC3 and NC4.

Brought up, read the First and Second time, and added to the Bill.

New Clause 7

List of emerging Category 1 services

“(1) As soon as reasonably practicable after the first regulations under paragraph 1(1) of Schedule 11 come into force (regulations specifying Category 1 threshold conditions), OFCOM must comply with subsections (2) and (3).

(2) OFCOM must assess each regulated user-to-user service which they consider is likely to meet each of the following conditions, to determine whether the service does, or does not, meet them—

(a) the first condition is that the number of United Kingdom users of the user-to-user part of the service is at least 75% of the figure specified in any of the Category 1 threshold conditions relating to number of users (calculating the number of users in accordance with the threshold condition in question);

(b) the second condition is that—

(i) at least one of the Category 1 threshold conditions relating to functionalities of the user-to-user part of the service is met, or

(ii) if the regulations under paragraph 1(1) of Schedule 11 specify that a Category 1 threshold condition relating to a functionality of the user-to-user part of the service must be met in combination with a Category 1 threshold condition relating to another characteristic of that part of the service or a factor relating to that part of the service (see paragraph 1(4) of Schedule 11), at least one of those combinations of conditions is met.

(3) OFCOM must prepare a list of regulated user-to-user services which meet the conditions in subsection (2).

(4) The list must contain the following details about a service included in it—

(a) the name of the service,

(b) a description of the service,

(c) the name of the provider of the service, and

(d) a description of the Category 1 threshold conditions by reference to which the conditions in subsection (2) are met.

(5) OFCOM must take appropriate steps to keep the list up to date, including by carrying out further assessments of regulated user-to-user services.

(6) OFCOM must publish the list when it is first prepared and each time it is revised.

(7) When assessing whether a service does, or does not, meet the conditions in subsection (2), OFCOM must take such steps as are reasonably practicable to obtain or generate information or evidence for the purposes of the assessment.

(8) An assessment for the purposes of this section may be included in an assessment under section 83 or 84 (as the case may be) or carried out separately.”—(Paul Scully.)

This new clause requires OFCOM to prepare and keep up to date a list of regulated user-to-user services that have 75% of the number of users of a Category 1 service, and at least one functionality of a Category 1 service or one required combination of a functionality and another characteristic or factor of a Category 1 service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 8

Child user empowerment duties

“(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section “non-verified user” means a user who has not verified their identity to the provider of a service (see section 58(1)).

(12) In this section references to features include references to functionalities and settings.”—(Kirsty Blackman.)

Brought up, and read the First time.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

That was some stretch of procedure, Dame Angela, but we got there in the end. This new clause is about child user empowerment duties. I am really pleased that the Government have user empowerment duties in the Bill—they are a good thing—but I am confused as to why they apply only to adult users, and why children do not deserve the same empowerment rights over what they access online.

In writing the new clause, I pretty much copied clause 14, before there were any amendments to it, and added a couple of extra bits: subsections (8) and (9). In subsection (8), I have included:

“A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.”

That would go a step further than the verification process and allow users to approve only people who are in their class at school, people with whom they are friends, or even certain people in their class at school, and to not have others on that list. I know that young people playing Fortnite—I have mentioned Fortnite a lot because people play it a lot—or Roblox are contacted by users whom they do not know, and there is no ability for young people to switch off some of the features while still being able to contact their friends. Users can either have no contact from anyone, or they can have a free-for-all. That is not the case for all platforms, but a chunk of them do not let users speak only to people on their friends list, or receive messages only from people on the list.

My proposed subsection (8) would ensure that children could have a “white list” of people who they believe are acceptable, and who they want to be contacted by, and could leave others off the list. That would help tackle not just online child exploitation, but the significant online bullying that teachers and children report. Children have spoken of the harms they experience as a result of people bullying them and causing trouble online; the perpetrators are mainly other children. Children would be able to remove such people from the list and so would not receive any content, messages or comments from those who make their lives more negative.

Subsection (9) is related to subsection (8); it would require a service to include

“features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.”

Adults looking to exploit children will use private messaging on platforms such as Instagram. Instagram has to know how old its users are, so anybody who is signed up to it will have had to provide it with their date of birth. It is completely reasonable for a child to say, “I want to filter out everything from an adult.” When we talk about children online, we are talking about anybody from zero to 18, which is a very wide age range. Some of those people will be working and paying bills, but will not have access to the empowerment features that adults have access to, because they have not yet reached that magical threshold. Some services may decide to give children access to user empowerment tools, but there is no requirement to. The only requirement in the Bill on user empowerment tools is for adults. That is not fair.

Children should have more control over the online environment. We know how many children feel sad as a result of their interactions online, and how many encounter content online that they wish they had never seen and cannot unsee. We should give them more power over that, and more power to say, “No, I don’t want to see that. I don’t want people I don’t know contacting me. I don’t want to get unsolicited messages. I don’t want somebody messaging me, pretending that they are my friend or that they go to another school, when they are in fact an adult, and I won’t realise until it is far too late.”

The Bill applies to people of all ages. All of us make pretty crappy decisions sometimes. That includes teenagers, but they also make great decisions. If there was a requirement for them to have these tools, they could choose to make their online experience better. I do not think this was an intentional oversight, or that the Government set out to disadvantage children when they wrote the adult user empowerment clauses. I think they thought that it would be really good to have those clauses in the Bill, in order to give users a measure of autonomy over their time and interactions online. However, they have failed to include the same thing for children. It is a gap.

I appreciate that there are child safety duties, and that there is a much higher bar for platforms that have child users, but children are allowed a level of autonomy; look at the UN convention on the rights of the child. We give children choices and flexibilities; we do not force them to do every single thing they do, all day every day. We recognise that children should be empowered to make decisions where they can.

I know the Government will not accept the provision—I am not an idiot. I have never moved a new clause in Committee that has been accepted, and I am pretty sure that it will not happen today. However, if the Government were to say that they would consider, or even look at the possibility of, adding child user empowerment duties to the Bill, the internet would be a more pleasant place for children. They are going to use it anyway; let us try to improve their online experience even more than the Bill does already.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The hon. Member for Aberdeen North has outlined the case for the new clause eloquently and powerfully. She may not press it to a Division, if the Minister can give her assurances, but if she did, she would have the wholehearted support of the Opposition.

We see new clause 8 as complementing the child safety duties in the legislation. We fully welcome provisions that provide children with greater power and autonomy in choosing to avoid exposure to certain types of content. We have concerns about how the provisions would work in practice, but that issue has more to do with the Government’s triple-shield protections than the new clause.

The Opposition support new clause 8 because it aims to provide further protections, in addition to the child safety duties, to fully protect children from harmful content and to empower them, enabling them to filter out private messages from adults or non-verified users. We also welcome the measures in the new clause that require platforms and service providers to design accessible terms of service. That is absolutely vital to best protect children online, which is why we are all here and what the legislation was designed for.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The aim of the user empowerment duty is to give adults more control over certain categories of legal content that some users will welcome greater choice over. Those duties also give adult users greater control over who they interact with online, but these provisions are not appropriate for children. As the hon. Member for Aberdeen North acknowledged, there are already separate duties on services likely to be accessed by children, in scope of part 3, to undertake comprehensive risk assessments and to comply with safety duties to protect children from harm. That includes requirements to assess how specific functionalities may facilitate the spread of harmful content, as outlined in clause 10(6)(e), and to protect children from harmful content, including content that has been designated as priority harmful content, by putting in place age-appropriate protections.

As such, children will not need to be provided with tools to control any harmful content they see, as the platform will need to put in place age-appropriate protections. We do not want to give children an option to choose to see content that is harmful to them. The Bill also outlines in clause 11(4)(f) that, where it is proportionate to do so, service providers will be required to take measures in certain areas to meet the child-safety duties. That includes functionalities allowing for control over content that is encountered. It would not be appropriate to require providers to offer children the option to verify their identity, due to the safeguarding and data protection risks that that would pose. Although we expect companies to use technologies such as age assurance to protect children on their service, they would only be used to establish age, not identity.

The new clause would create provisions to enable children to filter out private messages from adults and from users who are not on an approved list, but the Bill already contains provisions that address the risks of adults contacting children. There are also requirements on service providers to consider how their service could be used for grooming or child sexual exploitation and abuse, and to apply proportionate measures to mitigate those risks. Service providers already have to assess and mitigate the risks. They have to produce the risk assessment, and within it they could choose to mitigate risk by preventing unknown users from contacting children.

For the reasons I have set out, the Bill already provides strong protections for children on services that they are likely to access. I am therefore not able to accept the new clause, and I hope that the hon. Member for Aberdeen North will withdraw it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That was one of the more disappointing responses from the Minister, I am afraid. I would appreciate it if he could write to me to explain which part of the Bill provides protection to children from private messaging. I would be interested to have another look at that, so it would be helpful if he could provide details.

We do not want children to choose to see unsafe stuff, but the Bill is not strong enough on stuff like private messaging or the ability of unsolicited users to contact children, because it relies on the providers noticing that in their risk assessment, and putting in place mitigations after recognising the problem. It relies on the providers being willing to act to keep children safe in a way that they have not yet done.

When I am assisting my children online, and making rules about how they behave online, the thing I worry most about is unsolicited contact: what people might say to them online, and what they might hear from adults online. I am happy enough for them to talk to their friends online—I think that is grand—but I worry about what adults will say to them online, whether by private messaging through text or voice messages, or when they are playing a game online with the ability for a group of people working as a team together to broadcast their voices to the others and say whatever they want to say.

Lastly, one issue we have seen on Roblox, which is marketed as a children’s platform, is people creating games within it—people creating sex dungeons within a child’s game, or having conversations with children and asking the child to have their character take off their clothes. Those things have happened on that platform, and I am concerned that there is not enough protection in place, particularly to address that unsolicited contact. Given the disappointing response from the Minister, I am keen to push this clause to a vote.

Question put, That the clause be read a Second time.

Division 7

Ayes: 4

Noes: 8

New Clause 9

Offence of failing to comply with a relevant duty

“(1) The provider of a service to whom a relevant duty applies commits an offence if the provider fails to comply with the duty.

(2) Where the provider is an entity and the offence is proved to have been committed with the consent or connivance of—

(a) a senior manager or director of the entity, or

(b) a person purporting to act in such a capacity,

the senior manager, director or person (as well as the entity) is guilty of the offence and liable to be proceeded against and punished accordingly.

(3) A person who commits an offence under this section is liable on conviction on indictment to imprisonment for a term not exceeding two years or a fine (or both).

(4) In this section—

a ‘director’, in relation to a body corporate whose affairs are managed by its members, means a member of the body corporate;

‘relevant duty’ means a duty provided for by section 11, 14, 18, 19, 21 or 30 of this Act; and

‘senior manager’ has the meaning given in section 89(4) of this Act.”—(Nick Fletcher.)

Brought up, and read the First time.

Nick Fletcher Portrait Nick Fletcher (Don Valley) (Con)
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

It is a pleasure to serve under your chairmanship, Dame Angela. If you will allow, I want to apologise for comments made on the promotion of suicide and self-harm to adults. I believed that to be illegal, but apparently it is not. I am a free speech champion, but I do not agree with the promotion of this sort of information. I hope that the three shields will do much to stop those topics being shared.

I turn to new clause 9. I have done much while in this position to try to protect children, and that is why I followed the Bill as much as I could all the way through. Harmful content online is having tragic consequences for children. Cases such as that of Molly Russell demonstrate the incredible power of harmful material and dangerous algorithms. We know that the proliferation of online pornography is rewiring children’s brains and leading to horrendous consequences, such as child-on-child sexual abuse. This issue is of immense importance for the safety and protection of children, and for the future of our whole society.

Under the Bill, senior managers will not be personally liable for breaching the safety duties, and instead are liable only where they fail to comply with information requests or willingly seek to mislead the regulator. The Government must hardwire the safety duties to deliver a culture of compliance in regulated firms. The Bill must be strengthened to actively promote cultural change in companies and embed compliance with online safety regulations at board level.

We need a robust corporate and senior management liability scheme that imposes personal liability on directors whose actions consistently and significantly put children at risk. The Bill must learn lessons from other regulated sectors, principally financial services, where regulation imposes specific duties on the directors and senior managers of financial institutions, and those responsible individuals face regulatory enforcement if they act in breach of such duties.

The Joint Committee on the draft Online Safety Bill, which conducted pre-legislative scrutiny, recommended that a senior manager at or reporting to board level

“should be designated the ‘Safety Controller’ and made liable for a new offence: the failure to comply with their obligations as regulated service providers when there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users.”

Some 82% of UK adults would support the appointment of a senior manager to be held liable for children’s safety on social media sites, and I believe that the measure is also backed by the NSPCC.

There is no direct relationship in the Bill between senior management liability and the discharge by a platform of its safety duties. The Government have repeatedly argued against the designation of a specific individual as a safety controller for some understandable reasons: an offence could be committed by the company without the knowledge of the named individual, and the arrangement would allow many senior managers and directors to face no consequences. However, new clause 9 would take a different approach by deeming any senior employee or manager at the company to be a director for the purposes of the Bill.

The concept of consent or connivance is already used in other Acts of Parliament, such as the Theft Act 1968 and the Health and Safety at Work etc. Act 1974. In other words, if a tech platform is found to be in breach of the Online Safety Bill—once it has become an Act—with regard to its duties to children, and it can be proven that this breach occurred with the knowledge or consent of a senior person, that person could be held criminally liable for the breach.

I have been a director in the construction industry for many years. There is a phrase in the industry that the company can pay the fine, but it cannot do the time. I genuinely believe that holding directors criminally liable will ensure that the Bill, which is good legislation, will really be taken seriously. I hope the Minister will agree to meet me to discuss this further.

13:18
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I want to briefly speak on this amendment, particularly as my hon. Friend the Member for Don Valley referenced the report by the Joint Committee, which I chaired. As he said, the Joint Committee considered the question of systematic abuse. A similar provision exists in the data protection legislation, whereby any company that is consistently in breach could be considered to have failed in its duties under the legislation and there could be criminal liability. The Joint Committee considered whether that should also apply with the Online Safety Bill.

As the Bill has gone through its processes, the Government have brought forward the commencement of criminal liability for information offences, whereby if a company refuses to respond to requests for information or data from the regulator, that would be a breach of their duties; it would invoke criminal liability for a named individual. However, I think the question of a failure to meet the safety duty set out in the Bill really needs to be framed along the lines of being a systematic and persistent breach, as the Joint Committee recommended. If, for example, a company was prepared to ignore requests from Ofcom, use lawyers to evade liability for as long as possible and consistently pay fines for serious breaches without ever taking responsibility for them, what would we do then? Would there be some liability at that point?

The amendment drafted by my hon. Friend the Member for Stone (Sir William Cash) is based on other existing legislation, and on there being knowledge—with “consent or connivance”. We can see how that would apply in cases such as the diesel emissions concerns raised at Volkswagen, where there was criminal liability, or maybe the LIBOR bank rate rigging and the serious failures there. In those cases, what was discovered was senior management’s knowledge and connivance; they were part of a process that they knew was illegal.

With the amendment as drafted, the question we would have is: could it apply to any failure? Where management could say, “We have created a system to address this, but it has not worked on this occasion”, would that trigger it? Or is it something broader and more systematic? These failures will be more about the failure to design a regime that takes into account the required stated duties, rather than a particular individual act, such as the rigging of the LIBOR rates or the giving of false public information on diesel emissions—decisions that could be made only at a corporate level.

When I chaired the Joint Committee, we raised the question, “What about systematic failure, as we have that as an offence in data protection legislation?” I still think that would be an interesting question to consider when the Bill goes to another place. However, I have concerns that the current drafting would not fit quite as well in the online safety regime as it does in other industries. It would really need to reflect consistent, persistent failures on the part of a company that go beyond the criminal liabilities that already exist in the Bill around information offences.

None Portrait The Chair
- Hansard -

Just to be clear, it is new clause 9 that we are reading a Second time, not an amendment.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Forgive me, Dame Angela.

Caroline Ansell Portrait Caroline Ansell (Eastbourne) (Con)
- Hansard - - - Excerpts

I rise to recognise the spirit and principle behind new clause 9, while, of course, listening carefully to the comments made by my hon. Friend the Member for Folkestone and Hythe. He is right to raise those concerns, but my question is: is there an industry-specific way in which the same responsibility and liability could be delivered?

I recognise too that the Bill is hugely important. It is a good Bill that has child protection at its heart. It also contains far more significant financial penalties than we have previously seen—as I understand it, up to £18 million or 10% of qualifying worldwide revenue, whichever is greater. This will drive some change, but it comes against the backdrop of multi-billion-pound technology companies.

I would be interested to understand whether a double lock of board-level responsibility might further protect children from some of the harrowing and harmful content we see online. What we need is nothing short of transformation and significant culture change. Even today, The Guardian published an article about TikTok and a study by the Centre for Countering Digital Hate, which found that teenagers who demonstrated an interest in self-harm and eating disorders had that content pushed to them by algorithms within minutes. That is most troubling.

We need significant, serious and sustained culture change. There is precedent in other sectors, as has been mentioned, and there was a previous recommendation, so clearly there is merit in this. My understanding is that there is strong public support, because the public recognise that this new responsibility cannot be strengthened by anything other than liability. If there is board-level liability, that will drive priorities and resources, which will bring about the kind of change we are looking for. I look forward to what the Minister might share today, as this has been a good opportunity to bring these issues into further consideration, and they might then be carried over into subsequent stages of this excellent Bill.

Rachel Maclean Portrait Rachel Maclean (Redditch) (Con)
- Hansard - - - Excerpts

I would like to build on the excellent comments from my colleagues and to speak about child sexual abuse material. I thank my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Stone for tabling the amendment. I am very interested in how we can use the excellent provisions in the Bill to keep children safe from child sexual abuse material online. I am sure the Committee is aware of the devastating impact of such material.

Sexual abuse imagery—of girls in particular—is increasingly prevalent. We know that 97% of this material in 2021 showed female children. The Internet Watch Foundation took down a record-breaking 252,000 URLs that had images of children being raped, and seven in 10 of those images were of children aged 11 to 13. Unfortunately, the National Crime Agency estimates that between 550,000 and 850,000 people in the UK are searching for such material on the internet. They are actively looking for it, and at the moment they are able to find it.

My concern is with how we use what is in the Bill already to instil a top-down culture in companies, because this is about culture change in the boardroom, so that safety is considered with every decision. I have read the proceedings from previous sittings, and I recognise that the Government and Ministers have said that we have sufficient provisions to protect children, but I think there is a little bit of a grey area with tech companies.

I want to mention Apple and the update it had been planning for quite a few years, which would have automatically scanned for child sexual abuse material. Apple withdrew it following a backlash from encryption and privacy experts, who claimed it would undermine the privacy and security of iCloud users and make people less safe on the internet. Having previously said that it would pause the update to improve it, Apple now says that it has stopped it altogether and that it is vastly expanding its end-to-end encryption, even though law enforcement agencies around the world, including our own UK law enforcement agencies, have expressed serious concerns, because it makes investigations and prosecution more challenging. None of us is a technical expert; I do not believe that we are in a position to judge how legitimate it is for Apple to have this pause. What we do know is that while there is this pause, the risks for children are still there, proliferating online.

We understand completely that countering this material involves a complicated balance and that the tech giants need to walk a fine line between keeping users safe and keeping their data safe. But the question is this: if Apple and others continue to delay or backtrack, will merely failing to comply with an information request, which is what is in the Bill now, be enough to protect children from harm? Could they delay indefinitely and still be compliant with the Bill? That is what I am keen to hear from the Minister. I would be grateful if he could set out why he thinks that individuals who have the power to prevent the harmful content that has torn apart the lives of so many young people and their families should not face criminal consequences if they fail to do so. Can he reassure us as to how he thinks that the Bill can protect so many children—it is far too many children—from this material online?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour supports new clause 9, as liability is an issue that we have repeatedly raised throughout the passage of the Bill—most recently, on Report. As colleagues will be aware, the new clause would introduce criminal liability for directors who failed to comply with their duties. This would be an appropriate first step in ensuring a direct relationship between the senior management of platforms and companies and their responsibilities to protect children from significant harm. As we have heard, this measure would drive a more effective culture of awareness and accountability in relation to online safety at the top of, and throughout, the regulated firm. It would go some way towards ensuring that online safety is at the heart of internal governance structures. The Bill must go further to actively promote cultural change and put online safety at the forefront of business models; it must ensure that senior managers understand that keeping people safe must come before any profit. A robust corporate and senior management liability scheme is needed—one that imposes personal liability on directors when they put children at risk.

The Minister knows as well as I do that the benefits of doing so would be strong. We have only to turn to the coroner’s comments in the tragic case of Molly Russell’s death—which I know we are all mindful of as we debate this Bill—to fully understand the damaging impact of viewing harmful content online. I therefore urge the Minister to accept new clause 9, which we wholeheartedly support.

Paul Scully

The Government recognise that the intent behind the new clause is to create new criminal offences of non-compliance with selected duties. It would establish a framework for personal criminal offences punishable through fines or imprisonment. It would mean that providers committed a criminal offence if they did not comply with certain duties.

We all want this Bill to be effective. We want it to be on the statute book. It is a question of getting that fine balance right, so that we can properly hold companies to account for the safety of their users. The existing approach to enforcement and senior manager liability strikes the right balance between robust enforcement and deterrent, and ensuring that the UK remains an attractive place to do business. We are confident that the Bill as a whole will bring about the change necessary to ensure that users, especially younger users, are kept safe online.

This new clause tries to criminalise not complying with the Bill’s duties. Exactly what activity would be criminalised is not obvious from the new clause, so it could be difficult for individuals to foresee exactly what type of conduct would constitute an offence. That could lead to unintended consequences, with tech executives driving an over-zealous approach to content take-down for fear of imprisonment, and potentially removing large volumes of innocuous content and so affecting the ability for open debate to take place.

Kirsty Blackman

Does the Minister not think that the freedom of speech stuff and the requirement to stick to terms of service that he has put in as safeguards for that are strong enough, then?

Paul Scully

I come back to this point: I think that if people were threatened with personal legal liability, that would stifle innovation and make them over-cautious in their approach. That would disturb the balance that we have tried to achieve in this iteration of the Bill. Keeping internet users, particularly children, safe has to be achieved alongside free speech and not at its expense.

Further, the threat of criminal prosecution for failing to comply with numerous duties also runs a real risk of damaging the attractiveness of the UK as a place to start up and grow a digital business. I want internet users in the future to be able to access all the benefits of the internet safely, but we cannot achieve that if businesses avoid the UK because our enforcement regime is so far out of kilter with international comparators. Instead, the most effective way to ensure that services act to protect people online is through the existing framework and the civil enforcement options that are already provided for in the Bill, overseen by an expert regulator.

13:30
Going forward, companies will need to regularly assess the risks that their services pose to users, including ahead of any major design or functionality changes, and put in place proportionate systems and processes to mitigate those risks. It is only when companies thoroughly understand the risks arising from their services that they will be able to take proportionate action to keep users safe. This approach will fundamentally change the way tech services operate. It will mandate that services and tech executives properly consider risks and user safety from the get-go, rather than as an afterthought once a product is already open to users.
If platforms fail to comply with their enforceable requirements, Ofcom will be able to use its range of strong enforcement powers, including fines. By court order, it will be able to take business disruption measures and block sites from operating in the UK. Make no mistake about the substance of those fines: it is 10% of a company’s global turnover. No matter how big the company is, 10% is 10%; it is still a massive proportion of operating costs that will be removed. Our approach will ensure that providers are held to account and that swift action is taken to keep users safe, whether by bringing the platform into compliance or through stronger measures.
Senior tech executives can already be held criminally liable under the Bill for failing to take reasonable steps to ensure their company properly complies with Ofcom’s information requests. That includes failing to ensure that their company responds fully, accurately and on time; failing to ensure that their company does not provide false information; failing to ensure that their company does not provide encrypted information that Ofcom cannot understand; and failing to ensure that their company does not destroy or alter information required by Ofcom.
If we start to widen the scope of senior management liability in the Bill, we start to come up against problems quickly. For a criminal offence, a precise statement of the prohibited behaviour must clearly be set out—in other words, that a particular act or omission constitutes the criminal offence. In this case, a failure to comply with the relevant duties listed in the amendment would depend on a huge number of factors. That is because the Bill applies to providers of various sizes and types. In most areas, the framework is flexible, rather than prescriptive: it does not prescribe certain steps that providers must take. That means that it may be difficult for individuals to foresee exactly what type of conduct constitutes an offence, and that can easily lead to unintended consequences and to tech executives taking an over-zealous approach to content take-down for fear of imprisonment.
My hon. Friend the Member for Folkestone and Hythe talked about health and safety, LIBOR and diesel emissions, which have been raised here and in the main Chamber. There is a big difference between what we are talking about and those examples. With health and safety, LIBOR and the cover-up of diesel emissions, there is a far closer connection between the offence and personal conduct; the measures in this Bill are broader.
My hon. Friend the Member for Eastbourne talked about having an industry-specific way of delivering the responsibility and liability. This is the industry-specific way. We are making sure that the approach is proportionate and that executives have to co-operate with Ofcom at every stage. It is pre-emptive as well as reactive. It ensures that, when Ofcom assesses their risk assessments, their approaches to algorithms and so on, it has all the facilities it needs to check that what they are doing is the right approach. If there are complaints and systemic failings within the platform’s regime, they need to comply; they must not cover it up or hinder Ofcom’s investigation.
On the TikTok algorithm, the development of an algorithm is quite remote from personal conduct. It is not easy to make an individual criminally liable for it, not least because algorithms tend to be developed by hundreds if not thousands of people in different continents. To boil that down to one person is incredibly difficult.
We also heard about the example of Apple. There is no way that through this legislation, we are banning, or creating back doors in, end-to-end encryption; there is no safe back door, frankly, so if we did that, we could kiss goodbye to open banking and any number of things that we use daily. I may be wrong, but my understanding of the Apple product that was mentioned is that it would involve scanning pretty well everything that a person had in their iCloud, so it would be a sledgehammer to crack a nut, although clearly a really important nut. If Apple will not bring that forward, we would expect it and other platforms to bring forward something else that is effective specifically against terrorism content and child sexual exploitation and abuse.
For the reasons that I have given, I strongly believe that the Bill’s approach to enforcement will be effective. It will protect users without introducing incentives for managers to remove swathes of content out of fear of prosecution. I want to make sure that the legislation gets on the books and is proportionate, and that we do not start gold-plating it with these sorts of measures now, because we risk disrupting the balance that I think we have achieved in the Bill as amended.
Nick Fletcher

I appreciate the Minister’s comments, but from what my hon. Friends the Members for Folkestone and Hythe, for Eastbourne, and for Redditch said this morning about TikTok—these sorts of images get to children within two and a half minutes—it seems that there is a cultural issue, which the hon. Member for Pontypridd mentioned. Including new clause 9 in the Bill would really ram home the message that we are taking this seriously, that the culture needs to change, and that we need to do all that we can. I hope that the Minister will speak to his colleagues in the Ministry of Justice to see what, if anything, can be done.

Paul Scully

I forgot to respond to my hon. Friend’s question about whether I would meet him. I will happily meet him.

Nick Fletcher

I appreciate that. We will come back to this issue on Report, but I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

Question proposed, That the Chair do report the Bill, as amended, to the House.

The Chair

It is usual at this juncture for there to be a few thanks and niceties, if people wish to give them.

Paul Scully

I apologise, Dame Angela; I did not realise that I had that formal role, but you are absolutely right.

The Chair

If the Minister does not want niceties, that is up to him.

Paul Scully

Dame Angela, you know that I love niceties. It is Christmas—the festive season! It is a little bit warmer today because we changed room, but we remember the coldness; it reminds us that it is Christmas.

I thank you, Dame Angela, and thank all the Clerks in the House for bringing this unusual recommittal to us all, and schooling us in the recommittal process. I thank Members from all parts of the House for the constructive way in which the Bill has been debated over the two days of recommittal. I also thank the Doorkeepers and my team, many of whom are on the Benches here or in the Public Gallery. They are watching and WhatsApping—ironically, using end-to-end encryption.

The Chair

I was just about to say that encryption would be involved.

Paul Scully

I look forward to continuing the debate on Report.

Alex Davies-Jones

I thank you, too, Dame Angela. I echo the Minister’s sentiments, and thank all the Clerks, the Doorkeepers, the team, and all the stakeholders who have massively contributed, with very short turnarounds, to the scrutiny of this legislation. I have so appreciated all that assistance and expertise, which has helped me, as shadow Minister, to compile our comments on the Bill following the Government’s recommittal of it to Committee, which is an unusual step. Huge thanks to my colleagues who joined us today and in previous sittings, and to colleagues from across the House, and particularly from the SNP, a number of whose amendments we have supported. We look forward to scrutinising the Bill further when it comes back to the House in the new year.

Kirsty Blackman

I thank you, Dame Angela, as well as Sir Roger for chairing our debates. Recommittal has been a very odd and unusual process; it has been a bit like groundhog day, discussing things we have discussed previously. I very much appreciate the hard work of departmental and Ofcom staff that went into making this happen, as well as the work of the Clerks, the Doorkeepers, and the team who ensured that we have a room that is not freezing—that has been really helpful.

I thank colleagues from across the House, particularly the Labour Front-Bench spokespeople, who have been incredibly helpful in supporting our amendments. This has been a pretty good-tempered Committee and we have all got on fairly well, even though we have disagreed on a significant number of issues. I am sure we will have those arguments again on Report.

The Chair

There being no more obvious niceties, I add my thanks to everybody. I wish everybody season’s greetings and a happy Christmas.

Question put and agreed to.

Bill, as amended, accordingly to be reported.

13:41
Committee rose.
Written evidence reported to the House
OSB113 HOPE not hate
OSB114 Samaritans
OSB115 Jeffrey Howard, Associate Professor of Political Theory and Director of the Online Speech Project, School of Public Policy, University College London
OSB116 Open Rights Group
OSB117 Meta
Consideration of Bill, as amended, on re-committal, in the Public Bill Committee
[Relevant documents: Second Report of the Petitions Committee of Session 2021-22, Tackling Online Abuse, HC 766, and the Government response, HC 1224; Letter from the Chair of the Women and Equalities Committee to the Minister for Tech and the Digital Economy regarding Pornography and its impact on VAWG, dated 13 June 2022; Letter from the Minister for Tech and the Digital Economy to the Chair of the Women and Equalities Committee regarding Pornography and its impact on VAWG, dated 30 August 2022; e-petition 272087, Hold online trolls accountable for their online abuse via their IP address; e-petition 332315, Ban anonymous accounts on social media; e-petition 575833, Make verified ID a requirement for opening a social media account; e-petition 582423, Repeal Section 127 of the Communications Act 2003 and expunge all convictions; e-petition 601932, Do not restrict our right to freedom of expression online.]
Madam Deputy Speaker (Dame Rosie Winterton)

Before we open the debate, I want to make a brief comment about the scope of today’s debate. Today’s debate on consideration follows the re-committal of the Bill to a Public Bill Committee in December last year. We are therefore debating today only the new clauses and amendments listed on the selection paper issued today. These are either: new clauses relating to the re-committed clauses and schedules; amendments to those clauses and schedules; or amendments to other parts of the Bill consequential on changes made to the Bill on re-committal in the Public Bill Committee.

On 5 December, the House finished its consideration on report of other parts of the Bill. The scope of today’s report stage generally does not include those parts of the Bill that were not re-committed. The exception is where amendments on the selection paper are consequential to the changes made to re-committed clauses, and relate to clauses that were not re-committed. Should there be time for debate on Third Reading, it is of course permissible to speak then to any of the content of the Bill.

I should also remind the House that, because of the time taken for the emergency debate, proceedings on consideration are now scheduled to finish at 8.13 pm and proceedings on Third Reading at 9.13 pm.

New Clause 1

Report on redress for individual complaints

‘(1) The Secretary of State must publish a report assessing options for dealing with appeals about complaints made under section 17 of this Act.

(2) The report must—

(a) provide a general update on the fulfilment of duties about complaints procedures which apply in relation to all regulated user-to-user services;

(b) assess which body should be responsible for a system to deal with appeals in cases where a complainant considers that a complaint has not been satisfactorily dealt with; and

(c) provide options for how the system should be funded, including consideration of whether an annual surcharge could be imposed on user-to-user services.

(3) The report must be laid before Parliament within six months of the commencement of section 17.’—(Alex Davies-Jones.)

Brought up, and read the First time.

17:42
Alex Davies-Jones (Pontypridd) (Lab)

I beg to move, That the clause be read a Second time.

Madam Deputy Speaker

With this it will be convenient to discuss the following:

New clause 2—Offence of failing to comply with a relevant duty

‘(1) The provider of a service to whom a relevant duty applies commits an offence if the provider fails to comply with the duty.

(2) In the application of sections 178(2) and 179(5) to an offence under this section (where the offence has been committed with the consent or connivance of an officer of the entity or is attributable to any neglect on the part of an officer of the entity) the references in those provisions to an officer of an entity include references to any person who, at the time of the commission of the offence—

(a) was (within the meaning of section 93) a senior manager of the entity in relation to the activities of the entity in the course of which the offence was committed; or

(b) was a person purporting to act in such a capacity.

(3) A person who commits an offence under this section is liable on conviction on indictment to imprisonment for a term not exceeding two years or a fine (or both).

(4) In this section, “relevant duty” means a duty provided for by section 11 of this Act.’

This new clause makes it an offence for the provider of a user-to-user service not to comply with the safety duties protecting children set out in clause 11. Where the offence is committed with the consent or connivance of a senior manager or other officer of the provider, or is attributable to their neglect, the officer, as well as the entity, is guilty of the offence.

New clause 3—Child user empowerment duties

‘(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section “non-verified user” means a user who has not verified their identity to the provider of a service (see section 57(1)).

(12) In this section references to features include references to functionalities and settings.’

New clause 4—Safety duties protecting adults and society: minimum standards for terms of service

‘(1) OFCOM may set minimum standards for the provisions included in a provider’s terms of service as far as they relate to the duties under sections 11, [Harm to adults and society risk assessment duties], [Safety duties protecting adults and society], 12, 16 to 19 and 28 of this Act (“relevant duties”).

(2) Where a provider does not meet the minimum standards, OFCOM may direct the provider to amend its terms of service in order to ensure that the standards are met.

(3) OFCOM must, at least once a year, conduct a review of—

(a) the extent to which providers are meeting the minimum standards, and

(b) how the providers’ terms of service are enabling them to fulfil the relevant duties.

(4) The report must assess whether any provider has made changes to its terms of service that might affect the way it fulfils a relevant duty.

(5) OFCOM must lay a report on the first review before both Houses of Parliament within one year of this Act being passed.

(6) OFCOM must lay a report on each subsequent review at least once a year thereafter.’

New clause 5—Harm to adults and society risk assessment duties

‘(1) This section sets out the duties about risk assessments which apply in relation to Category 1 services (in addition to the duties about risk assessments set out in section 8 and, in the case of Category 1 services likely to be accessed by children, section 10).

(2) A duty to carry out a suitable and sufficient harm to adults and society risk assessment at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep an harm to adults and society risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient harm to adults and society risk assessment relating to the impacts of that proposed change.

(5) A “harm to adults and society risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults and society (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of harm to adults and society presented by different kinds of priority content that is harmful to adults and society;

(d) the level of risk of harm to adults and society presented by priority content that is harmful to adults and society which particularly affects individuals with a certain characteristic or members of a certain group;

(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults and society, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults and society;

(g) the nature, and severity, of the harm that might be suffered by adults and society from the matters identified in accordance with paragraphs (b) to (f);

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section references to risk profiles are to the risk profiles for the time being published under section 85 which relate to the risk of harm to adults and society presented by priority content that is harmful to adults and society.

(7) See also—

(a) section 19(2) (records of risk assessments), and

(b) Schedule 3 (timing of providers’ assessments).’

New clause 6—Safety duties protecting adults and society

‘(1) This section sets out the duties to prevent harms to adults and society which apply in relation to Category 1 services.

(2) A duty to summarise in the terms of service the findings of the most recent adults and society risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults and society).

(3) If a provider decides to treat a kind of priority content that is harmful to adults and society in a way described in subsection (4), a duty to include provisions in the terms of service specifying how that kind of content is to be treated (separately covering each kind of priority content that is harmful to adults and society which a provider decides to treat in one of those ways).

(4) These are the kinds of treatment of content referred to in subsection (3)—

(a) taking down the content;

(b) restricting users’ access to the content;

(c) limiting the recommendation or promotion of the content;

(d) recommending or promoting the content;

(e) allowing the content without treating it in a way described in any of paragraphs (a) to (d).

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults and society (as identified in the most recent adults and society risk assessment of the service), by reference to—

(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults and society present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) In this section—

“harm to adults and society risk assessment” has the meaning given by section [harm to adults and society risk assessment duties];

“non-designated content that is harmful to adults and society” means content that is harmful to adults and society other than priority content that is harmful to adults and society.

(9) See also, in relation to duties set out in this section, section 18 (duties about freedom of expression and privacy).’

New clause 7—“Content that is harmful to adults and society” etc

‘(1) This section applies for the purposes of this Part.

(2) “Priority content that is harmful to adults and society” means content of a description designated in regulations made by the Secretary of State as priority content that is harmful to adults and society.

(3) “Content that is harmful to adults and society” means—

(a) priority content that is harmful to adults and society, or

(b) content, not within paragraph (a), of a kind which presents a material risk of significant harm to an appreciable number of adults in the United Kingdom.

(4) For the purposes of this section—

(a) illegal content (see section 53) is not to be regarded as within subsection (3)(b), and

(b) content is not to be regarded as within subsection (3)(b) if the risk of harm flows from—

(i) the content’s potential financial impact,

(ii) the safety or quality of goods featured in the content, or

(iii) the way in which a service featured in the content may be performed (for example, in the case of the performance of a service by a person not qualified to perform it).

(5) References to “priority content that is harmful to adults and society” and “content that is harmful to adults and society” are to be read as—

(a) limited to content within the definition in question that is regulated user-generated content in relation to a regulated user-to-user service, and

(b) including material which, if it were present on a regulated user-to-user service, would be content within paragraph (a) (and this section is to be read with such modifications as may be necessary for the purpose of this paragraph).

(6) Sections 55 and 56 contain further provision about regulations made under this section.’

Government amendments 1 to 4.

Amendment 44, clause 11, page 10, line 17, at end insert ‘, and—

“(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”’

Amendment 82, page 10, line 25, at end insert—

‘(3A) Content under subsection (3) includes content that may result in serious harm or death to a child while crossing the English Channel with the aim of entering the United Kingdom in a vessel unsuited or unsafe for those purposes.’

This amendment would require proportionate systems and processes, including removal of content, to be in place to control the access by young people to material which encourages them to undertake dangerous Channel crossings where their lives could be lost.

Amendment 83, page 10, line 25, at end insert—

‘(3A) Content promoting self-harm, including content promoting eating disorders, must be considered as harmful.’

Amendment 84, page 10, line 25, at end insert—

‘(3A) Content which advertises or promotes the practice of so-called conversion practices of LGBTQ+ individuals must be considered as harmful for the purposes of this section.’

Amendment 45, page 10, line 36, leave out paragraph (d) and insert—

‘(d) policies on user access to the service, parts of the service, or to particular content present on the service, including blocking users from accessing the service, parts of the service, or particular content,’.

Amendment 47, page 10, line 43, at end insert ‘, and

“(i) reducing or removing a user’s access to livestreaming features.”’

Amendment 46, page 10, line 43, at end insert ‘, and

“(i) reducing or removing a user’s access to private messaging features.”’

Amendment 48, page 11, line 25, after ‘accessible’ insert ‘for child users.’

Amendment 43, clause 12, page 12, line 24, leave out ‘made available to’ and insert

‘in operation by default for’.

Amendment 52, page 12, line 30, after ‘non-verified users’ insert

‘and to enable them to see whether another user is verified or non-verified.’

This amendment would require Category 1 services to make visible to users whether another user is verified or non-verified.

Amendment 49, page 12, line 30, at end insert—

‘(6A) A duty to ensure features and provisions in subsections (2), (4) and (6) are accessible and understandable to adult users with learning disabilities.’

Amendment 53, page 12, line 32, after ‘to’ insert ‘effectively’.

This amendment would bring this subsection into line with subsection (3) by requiring that the systems or processes available to users for the purposes described in subsections (7)(a) and (7)(b) should be effective.

Amendment 55, page 18, line 15, at end insert—

‘(4A) Content that is harmful to adults and society.’

Amendment 56, clause 17, page 20, line 10, leave out subsection (6) and insert—

‘(6) The following kinds of complaint are relevant for Category 1 services—

(a) complaints by users and affected persons about content present on a service which they consider to be content that is harmful to adults and society;

(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in—

(i) section [adults and society online safety]

(ii) section 12 (user empowerment),

(iii) section 13 (content of democratic importance),

(iv) section 14 (news publisher content),

(v) section 15 (journalistic content), or

(vi) section 18(4), (6) or (7) (freedom of expression and privacy);

(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down, or access to it is restricted, on the basis that it is content that is harmful to adults and society;

(d) complaints by a user of a service if the provider has given a warning to the user, suspended or banned the user from using the service, or in any other way restricted the user’s ability to use the service, as a result of content generated, uploaded or shared by the user which the provider considers to be content that is harmful to adults and society.’

Amendment 57, clause 19, page 21, line 40, leave out ‘or 10’ and insert

‘, 10 or [harms to adults and society risk assessment duties]’.

Amendment 58, page 22, line 37, at end insert—

‘(ba) section [adults and society online safety] (adults and society online safety),’

Government amendment 5.

Amendment 59, clause 44, page 44, line 11, at end insert

‘or

(ba) section [adults and society online safety] (adults and society online safety);’

Government amendment 6.

Amendment 60, clause 55, page 53, line 43, at end insert—

‘(2A) The Secretary of State may specify a description of content in regulations under section [“Content that is harmful to adults and society” etc](2) (priority content that is harmful to adults and society) only if the Secretary of State considers that, in relation to regulated user-to-user services, there is a material risk of significant harm to an appreciable number of adults presented by content of that description that is regulated user-generated content.’

Amendment 61, page 53, line 45, after ‘54’ insert

‘or [“Content that is harmful to adults and society” etc]’.

Amendment 62, page 54, line 8, after ‘54’ insert

‘or [“Content that is harmful to adults and society” etc]’.

Amendment 63, page 54, line 9, leave out ‘are to children’ and insert

‘or adults are to children or adults and society’.

Government amendments 7 to 16.

Amendment 77, clause 94, page 85, line 42, after ‘10’ insert

‘, [Adults and society risk assessment duties]’.

Amendment 78, page 85, line 44, at end insert—

‘(iiia) section [Adults and society online safety] (adults and society online safety);’

Amendment 54, clause 119, page 102, line 22, at end insert—

‘Section [Safety duties protecting adults and society: minimum standards for terms of service]

Minimum standards for terms of service’



Amendment 79, page 102, line 22, at end insert—

‘Section [Harm to adults and society assessments]

Harm to adults and society risk assessments

Section [Adults and society online safety]

Adults and society online safety’



Government amendments 17 to 19.

Amendment 51, clause 207, page 170, line 42, after ‘including’ insert ‘but not limited to’.

Government amendments 20 to 23.

Amendment 81, clause 211, page 177, line 3, leave out ‘and 55’ and insert

‘, [“Content that is harmful to adults and society” etc] and 55’.

Government amendments 24 to 42.

Amendment 64, schedule 8, page 207, line 13, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 65, page 207, line 15, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 66, page 207, line 17, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 67, page 207, line 21, leave out ‘relevant content’ and insert

‘content that is harmful to adults and society, or other content which they consider breaches the terms of service.’

Amendment 68, page 207, line 23, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 69, page 207, line 26, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 70, page 208, line 2, leave out

‘or content that is harmful to children’

and insert

‘content that is harmful to children or priority content that is harmful to adults and society’.

Amendment 71, page 208, line 10, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 72, page 208, line 13, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 73, page 210, line 2, at end insert

‘“content that is harmful to adults and society” and “priority content that is harmful to adults and society” have the same meaning as in section [“Content that is harmful to adults and society” etc]’.

Amendment 50, schedule 11, page 217, line 31, at end insert—

‘(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.’

Amendment 74, page 218, line 24, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 75, page 219, line 6, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 76, page 221, line 24, at end insert—

‘“priority content that is harmful to adults and society” has the same meaning as in section [“Content that is harmful to adults and society” etc]’.

Amendment 80, page 240, line 35, in schedule 17, at end insert—

‘(ba) section [Harm to adults and society assessments] (Harm to adults and society assessments), and’.

Alex Davies-Jones

Once again, it is a privilege to be back in the Chamber opening this debate—the third Report stage debate in recent months—of this incredibly important and urgently needed piece of legislation. I speak on behalf of colleagues across the House when I say that the Bill is in a much worse position than when it was first introduced. It is therefore vital that it is now able to progress to the other place. Although we are all pleased to see the Bill return today, the Government’s delays have been incredibly costly and we still have a long way to go until we see meaningful change for the better.

In December, during the last Report stage debate, we had the immense privilege to be joined in the Public Gallery by a number of the families who have all lost children in connection with online harms. It is these families whom we must keep in our mind when we seek to get the Bill over the line once and for all. As ever, I pay tribute to their incredible efforts in the most difficult of all circumstances.

Today’s debate is also very timely in that, earlier today, the End Violence Against Women and Girls coalition and Glitch, a charity committed to ending online abuse, handed in their petition, which calls on the Prime Minister to protect women and girls online. The petition has amassed more than 90,000 signatures and rising, so we know there is strong support for improving internet safety across the board. I commend all those involved on their fantastic efforts in raising this important issue.

It would be remiss of me not to make a brief comment on the Government’s last-minute U-turns in their stance on criminal sanctions. The fact that we are seeing amendments withdrawn at the last minute goes to show that this Government have absolutely no idea where they truly stand on these issues and that they are ultimately too weak to stand up against vested interests, whereas Labour is on the side of the public and has consistently put safety at the forefront throughout the Bill’s passage.

More broadly, I made Labour’s feelings about the Government’s highly unusual decision to send part of this Bill back to Committee a second time very clear during the previous debate. I will spare colleagues by not repeating those frustrations here, but let me be clear: it is absolutely wrong that the Government chose to remove safety provisions relating to “legal but harmful” content in Committee. That is a major weakening, not strengthening, of the Bill; everyone online, including users and consumers, will be worse off without those provisions.

The Government’s alternative proposal, to introduce a toggle to filter out harmful content, is unworkable. Replacing the sections of this Bill that could have gone some way towards preventing harm with an emphasis on free speech instead undermines the very purpose of the Bill. It will embolden abusers, covid deniers, hoaxers and others, who will feel encouraged to thrive online.

In Committee, the Government also chose to remove important clauses from the Bill that were in place to keep adults safe online. Without the all-important risk assessments for adults, I must press the Minister on an important point: exactly how will this Bill do anything to keep adults safe online? The Government know all that, but have still pursued a course of action that will see the Bill watered down entirely.

Kim Leadbeater (Batley and Spen) (Lab)

Does my hon. Friend agree that, as we discussed in the Bill Committee, there is clear evidence that legal but harmful content is often the gateway to far more dangerous radicalisation and extremism, be it far-right, Islamist, incel or other? Will she therefore join me in supporting amendment 43 to ensure that by default such content is hidden from all adult users?

Alex Davies-Jones

I completely support my hon. Friend’s comments and I was pleased to see her champion that cause in the Bill Committee. Of course I support amendment 43, tabled in the names of SNP colleagues, to ensure that the toggle is on by default. Abhorrent material is being shared and amplified—that is the key point, amplified—online by algorithms and by the processes and systems in place. It is obvious that the Government just do not get that. That said, there is a majority in Parliament and in the country for strengthening the Online Safety Bill, and Labour has been on the front foot in arguing for a stronger Bill since First Reading last year.

It is also important to recognise the sheer number of amendments and changes we have seen to the Bill so far. Even today, there are many more amendments tabled by the Government. If that does not give an indication of the mess they have made of getting this legislation over the line in a fit and proper state, I do not know what does.

I have said it before, and I am certain I will say it again, but we need to move forward with this Bill, not backward. That is why, despite significant Government delay, we will support the Bill’s Third Reading, as each day of inaction allows more harm to spread online. With that in mind, I too will make some progress.

I will first address new clause 1, tabled in my name and that of my hon. Friend the Member for Manchester Central (Lucy Powell). This important addition to the Bill would go some way towards addressing the gaps around support for individual complaints. We in the Opposition have repeatedly queried Ministers and the Secretary of State about the mechanisms available to individuals who wish to appeal the outcome of a complaint. That is why new clause 1 is so important. It is vital that platforms’ complaints procedures are fit for purpose, and this new clause will finally see the Secretary of State publishing a report on the options available to individuals.

We already know that the Bill in its current form fails to consider an appropriate avenue for individual complaints. This is a classic case of David and Goliath, and it is about time those platforms went further in giving their users a transparent, effective complaints process. That substantial lack of transparency underpins so many of the issues Labour has with the way the Government have handled—or should I say mishandled—the Bill so far, and it makes the process by which the Government proceeded to remove the all-important clauses on legal but harmful content, in a quiet room on Committee Corridor just before Christmas, even more frustrating.

That move put the entire Bill at risk. Important sections that would have put protections in place to prevent content such as health and foreign-state disinformation, the promotion of self-harm, and online abuse and harassment from being actively pushed and promoted were rapidly removed by the Government. That is not good enough, and it is why Labour has tabled a series of amendments, including new clauses 4, 5, 6 and 7, that we think would go some way towards correcting the Government’s extremely damaging approach.

Under the terms of the Bill as currently drafted, platforms could set whatever terms and conditions they want and change them at will. We saw that in Elon Musk’s takeover at Twitter, when he lifted the ban on covid disinformation overnight because of his own personal views. Our intention in tabling new clause 4 is to ensure that platforms are not able to simply avoid safety duties by changing their terms and conditions whenever they see fit. This group of amendments would give Ofcom the power to set minimum standards for platforms’ terms and conditions, and to direct platforms to change them if they do not meet those standards.

Andrew Gwynne (Denton and Reddish) (Lab)

My hon. Friend is making an important point. She might not be aware of it, but I recently raised in the House the case of my constituents, whose 11-year-old daughter was groomed on the music streaming platform Spotify and was able to upload explicit photographs of herself on that platform. Thankfully, her parents found out and made several complaints to Spotify, which did not immediately remove that content. Is that not why we need the ombudsman?

Alex Davies-Jones

I am aware of that case, which is truly appalling and shocking. That is exactly why we need such protections in the Bill: to stop those cases proliferating online, to stop the platforms from choosing their own terms of service, and to give Ofcom real teeth, as a regulator, to take on those challenges.

Damian Collins (Folkestone and Hythe) (Con)

Does the hon. Lady accept that the Bill does give Ofcom the power to set minimum safety standards based on the priority legal offences written into the Bill? That would cover almost all the worst kinds of offences, including child sexual exploitation, inciting violence and racial hatred, and so on. Those are the minimum safety standards that are set, and the Bill guarantees them.

Alex Davies-Jones

What is not in those minimum safety standards is all the horrendous and harmful content that I have described: covid disinformation, harmful content from state actors, self-harm promotion, antisemitism, misogyny and the incel culture, all of which are proliferating online and being amplified by the algorithms. This set of minimum safety standards can be changed overnight.

Damian Collins

As the hon. Lady knows, foreign-state disinformation is covered because it is part of the priority offences listed in the National Security Bill, so those accounts can be disabled. Everything that meets the criminal threshold is in this Bill because it is in the National Security Bill, as she knows. The criminal thresholds for all the offences she lists are set in schedule 7 of this Bill.

Alex Davies-Jones

That is just the problem, though, isn’t it? A lot of those issues would not be covered by the minimum standards—that is why we have tabled new clause 4—because they do not currently meet the legal threshold. That is the problem. There is a grey area of incredibly harmful but legal content, which is proliferating online, being amplified by algorithms and by influencers—for want of a better word—and being fed to everybody online. That content is then shared incredibly widely, and that is what is causing harm and disinformation.

Damian Collins

Will the hon. Lady give way one more time?

Alex Davies-Jones

No, I will not. I need to make progress; we have a lot to cover and a lot of amendments, as I have outlined.

Under the terms of the Bill, platforms can issue whatever minimum standards they wish and then simply change them at will overnight. In tabling new clause 4, our intention is to ensure that the platforms are not able to avoid safety duties by changing their terms and conditions. As I have said, this group of amendments will give Ofcom the relevant teeth to act and keep everybody safe online.

We all recognise that there will be a learning curve for everyone involved once the legislation is enacted. We want to get that right, and the new clauses will ensure that platforms have specific duties to keep us safe. That is an important point, and I will continue to make it clear at every opportunity, because the platforms and providers have, for far too long, got away with zero regulation—nothing whatsoever—and enough is enough.

During the last Report stage, I made it clear that Labour considers individual liability essential to ensuring that online safety is taken seriously by online platforms. We have been calling for stronger criminal sanctions for months, and although we welcome some movement from the Government on that issue today, enforcement is now ultimately a narrower set of measures because the Government gutted much of the Bill before Christmas. That last-minute U-turn is another one to add to a long list, but to be frank, very little surprises me when it comes to this Government’s approach to law-making.

Sir John Hayes (South Holland and The Deepings) (Con)

I have to say to the hon. Lady that to describe it as a U-turn is not reasonable. The Government have interacted regularly with those who, like her, want to strengthen the Bill. There has been proper engagement and constructive conversation, and the Government have been persuaded by those who have made a similar case to the one she is making now. I think that warrants credit, rather than criticism.

Alex Davies-Jones

I completely disagree with the right hon. Member, because we voted on this exact amendment before Christmas in the previous Report stage. It was tabled in the name of my right hon. Friend the Member for Barking (Dame Margaret Hodge), and it was turned down. It was word for word exactly the same amendment. If that is not a U-turn, what is it?

I am pleased to support a number of important amendments in the names of the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I draw colleagues’ attention to new clause 3, which would improve the child empowerment duties in the Bill. The Government may think they are talking a good game on child safety, but it is clear to us all that some alarming gaps remain. The new clause would go some way to ensuring that the systems and processes behind platforms will go further in keeping children safe online.

In addition, we are pleased, as I have mentioned, to support amendment 43, which calls for the so-called safety toggle feature to be turned on by default. When the Government removed the clause relating to legal but harmful content in Committee, they instead introduced a requirement for platforms to give users the tools to reduce the likelihood of certain content appearing on their feeds. We have serious concerns about whether this approach is even workable, but if it is the route that the Government wish to take, we feel that these tools should at least be turned on by default.

Debbie Abrahams (Oldham East and Saddleworth) (Lab)

Since my hon. Friend is on the point of safeguarding children, will she support Baroness Kidron as the Bill progresses to the other House in ensuring that coroners have access to data where they suspect that social media may have played a part in the death of children?

Alex Davies-Jones

I can confirm that we will be supporting Baroness Kidron in her efforts. We will support a number of amendments that will be tabled in the Lords in the hope of strengthening this Bill further, because we have reached the limit of what we can do in this place. I commend the work that Baroness Kidron and the 5Rights Foundation have been doing to support children and to make this Bill work to keep everybody online as safe as possible.

Supporting amendment 43 would send a strong signal that our Government want to put online safety at the forefront of all our experiences when using the internet. For that reason, I look forward to the Minister seriously considering this amendment going forward. Scottish National party colleagues can be assured of our support, as I have previously outlined, should there be a vote on that.

More broadly, I highlight the series of amendments tabled in my name and that of my hon. Friend the Member for Manchester Central that ultimately aim to reverse out of the damaging avenue that the Government have chosen to go down in regulating so-called legal but harmful content. As I have already mentioned, the Government haphazardly chose to remove those important clauses in Committee. They have chopped and changed this Bill more times than any of us can remember, and we are now left with a piece of legislation that is even more difficult to follow and, importantly, implement than when it was first introduced. We can all recognise that there is a huge amount of work to be done in making the Bill fit for purpose. Labour has repeatedly worked to make meaningful improvements at every opportunity, and it will be on the Government’s hands if the Bill is subject to even more delay. The Minister knows that, and I sincerely hope that he will take these concerns seriously. After all, if he will not listen to me, he would do well to listen to the mounting concerns raised by Members on his own Benches instead.

Several hon. Members rose—

Madam Deputy Speaker (Dame Rosie Winterton)

I have noticed that some people are standing who may not have applied earlier. If anybody is aware of that, can they let me know, and I can adjust timings accordingly? At the moment, my estimate is that if everybody takes no longer than seven minutes, and perhaps more like six, we can get everybody in comfortably without having to impose a time limit.

Priti Patel (Witham) (Con)

I rise to speak to new clause 2 on the offence of failing to comply with a relevant duty. I pay tribute to my right hon. and hon. Friends who have championed new clause 2 to strengthen protections for children by introducing criminal liability for senior managers.

18:00
We have discussed this issue already in this Chamber. I thank charities and campaigners such as the National Society for the Prevention of Cruelty to Children for raising awareness and for being constructive and assiduous. I also thank the families who, through voicing their own pain and suffering, have given impetus to this issue. I thank those on the Front Bench; it is fair to say that I have had constructive dialogue with the Minister and the Secretary of State. They listened to our concerns and accepted that this issue had to be addressed.
As we debate this new clause and other aspects of the Bill, we should begin as we did last time by thinking of those who face tragedy and distress as a result of accessing inappropriate content online. Children and vulnerable people have been failed by tech companies and regulation. We have the duty and responsibility to step up and tighten the law, and protect children from online harms, exploitation and inappropriate content. That must be at the heart and centre of a lot of the legislation—not just this Bill but going forward. Throughout the various debates, and at Committee stage, we have touched on the fact that technology is evolving and changing constantly. With that, we must keep on building upon insights.
New clause 2 does simple and straightforward things. It makes senior managers liable and open to being prosecuted for failing to proactively promote and support the safety duties in clause 11. As it stands, the Bill’s criminal liability provisions fall short of what is expected or required. Criminal liability for failing to comply with an information notice from Ofcom is welcome. Ofcom has a very important role to play—I do not need to emphasise that any more. But the Bill does not go far enough, and Ministers have recognised that. We must ensure that all the flaws and failings are sanctionable and that the laws are changed in the right way. It is not just about the laws for the Government Department leading the Bill; it cuts across other Government Departments. We have touched on that many times before.
More than 80% of the public agree that senior tech managers should be held legally responsible, to prevent harm to children on social media. That is a statement of the obvious, as we have seen such abhorrent and appalling harms take place. Around two thirds want managers to be prosecuted when failures result in serious harm. But harm can happen prior to an information notice being issued by Ofcom—again, we have discussed that.
The public need assurances that these companies will have the frameworks and safeguards to act responsibly and be held to account so that children and vulnerable individuals are protected. That means meaningful actions, not warm words. We should have proactivity when developing the software, algorithms and technology to be responsive. We must ensure that measures are put in place to hold people to account, and that sanctions cover company law, accountability, health and safety and other areas. Ireland has been mentioned throughout the passage of this Bill. That is important. My colleagues who will speak shortly have also touched on similar provisions.
It is right that we put these measures in the Bill for the serious failures to protect children. This is a topical issue. In fact, a number of colleagues met tech companies and techUK yesterday, as did I. We have an opportunity to raise the bar in the United Kingdom so that technology investment still comes forward and the sector continues to grow and flourish in the right way and for the right reasons. We want to see that.
Jamie Stone (Caithness, Sutherland and Easter Ross) (LD)

The issues of evolving technology and holding people to account are hugely important. May I make the general point that digital education could underpin all those safeguards? The teaching of digital literacy should be conducted in parallel with all the other good efforts made across our schools.

Priti Patel

The hon. Member is absolutely right, and I do not think anyone in the House would disagree with that. We have to carry on learning in life, and that links to technology and other issues. That applies to all of us across the board, and we need people in positions of authority to ensure that the right kind of information is shared, to protect our young people.

I look forward to hearing from the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), who has been so good in engaging on this issue, and I thank him for the proactive way in which he has spent time with all of us. Will we see the Government’s amendment prior to the Bill going to the other place for its Second Reading there? It is vital for all colleagues who support new clause 2 to have clear assurances that the provisions we support, which could have passed through this House, will not be diluted in the other place by Ministers. Furthermore—we should discuss this today—what steps are the Government and Ofcom taking to secure the agreement of tech companies to work to ensure that senior managers are committed and proactive in meeting their duties under clause 11?

I recognise that a lot of things will flow through secondary legislation, but on top of that, engagement with tech companies is vital, so that they can prepare, be ready and know what duties will be upon them. We also need to know what further guidance and regulation will come forward to secure the delivery of clause 11 duties and hold tech companies to account.

In the interests of time, I will shorten my remarks. I trust and hope that Ministers will give those details. It is important to give those assurances before the Bill moves to the House of Lords. We need to know that those protections will not be diluted. This is such a sensitive issue. We have come a long way, and that is thanks to colleagues on both sides of the House. It is important that we get the right outcomes, because all of us want to make sure that children are protected from the dreadful harms that we have seen online.

Dame Margaret Hodge (Barking) (Lab)

This is a really important piece of legislation. As my hon. Friend the Member for Pontypridd (Alex Davies-Jones) said, it has taken far too long to get to this point. The Bill has been considered in a painstaking way by Members across the House. While today’s announcement that we will introduce senior manager and director liability is most welcome, the recent decisions to strip out vast chunks of the Bill—clauses that would have contributed to making online a safe place for us all—represent a tragic opportunity missed by the Government, and it will fall to a Labour Government to put things right. I know from the assurances given by those on our Front Bench that they will do just that.

I do not want to spend too much time on it, but in discussing the removal of provisions on “legal but harmful” content, I have to talk a little bit about the Jewish community. The hope that the Online Safety Bill would give us some respite from the torrent of antisemitic abuse that some of us have been subjected to has been thwarted. The Centre for Countering Digital Hate has conducted research in this area, and it found that nine out of 10 antisemitic posts on Facebook and Twitter stay there, despite requests to have them removed. Its analysis of 714 posts containing anti-Jewish hate found that they were viewed by more than 7.3 million people across the platforms, and that 80% of posts containing Holocaust denial and 70% of posts identified as neo-Nazi were not acted on, although they were in breach of the rules set by the platforms. People like me are left with a sense of bitterness that our suffering has to be tolerated because of some ideological, misplaced, flawed and ill-thought-out interpretation of freedom of speech.

I turn to new clause 2, tabled by the hon. Member for Stone (Sir William Cash) and the hon. Member for Penistone and Stocksbridge (Miriam Cates). I congratulate them on the work they have done in bringing this forward. I think they will probably agree with me that this issue should never have divided us as it did before Christmas, when I tabled a similar amendment. It is not a party political issue; it is a common-sense measure that best serves the national interest and will make online a safer place for children. I am pleased that the hon. Members for Stone and for Penistone and Stocksbridge have persuaded their colleagues of the justification and that the Government have listened to them—I am only sorry that I was not as successful.

This is an important measure. The business model that platforms operate encourages, not just passively but actively, the flourishing of abusive content online. They do not just fail to remove that content, but actively promote its inclusion through the algorithms that they employ. Sadly, people get a kick out of reading hateful, harmful and abusive content online, as the platform companies and their senior managers know. It is in their interest to encourage maximum traffic on their platforms, and if that means letting people post and see vile abuse, they will. The greater the traffic on such sites, the more attractive they become to advertisers and the more advertisers are willing to pay for the ads that they post on the sites. The platforms make money out of online abuse.

Originally, the Government wanted to deal with the problem by fining the companies, but companies would simply treat such fines as a cost to their business. It would not change their model or the platforms’ behaviour, although it might add to the charges for those who want to advertise on the platforms. Furthermore, we know that senior directors, owners and managers personally take decisions about the content that they allow to appear on their platforms and that their approach affects what people post.

Elon Musk’s controversial and aggressive takeover of Twitter, where he labelled the sensible moderation of content as a violation of freedom of speech, led to a 500% increase in the use of the N-word within 12 hours of his acquisition. Telegram, whose CEO is Pavel Durov, has become the app of choice of terror networks such as ISIS, according to research conducted by the Middle East Media Research Institute. When challenged about that, however, Durov refused to act on the intelligence to moderate content and said:

“You cannot make messaging technology secure for everybody except for terrorists.”

If senior managers have responsibility for the content on their platforms, they must be held to account, because we know that doing so will mean that online businesses become a safer place for our children.

We have to decide whose side we are on. Are we really putting our children’s wellbeing first, or are we putting the platforms’ interest first? Of course, everybody will claim that we are putting children’s interests first, but if we are, we have to put our money where our mouth is, which involves making the managers truly accountable for what appears on their platforms. We know that legislating for director liability works, because it has worked for health and safety on construction sites, in the Bribery Act 2010 and on tax evasion. I hope to move similar amendments when we consider the Economic Crime and Corporate Transparency Bill on Report next week.

This is not simply a punitive measure—in fact, the last thing we want to do is lock up a lot of platform owners—but a tool to transform behaviour. We will not be locking up the tech giants, but we will be ensuring that they moderate their content. Achieving this change shows the House truly working at its best, cross-party, and focusing on the merits of the argument rather than playing party politics with such a serious issue. I commend new clause 2 to the House.

Several hon. Members rose—

Madam Deputy Speaker (Dame Rosie Winterton)

I remind hon. Members about the six-minute advisory time limit.

Dame Caroline Dinenage (Gosport) (Con)

It is a great relief to see the Online Safety Bill finally reach this stage. It seems like a long time since my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) kicked it off with the ambitious aim of making the UK the safest place in the world to be online. Although other countries around the world had picked at the edges of it, we were truly the first country in the world to set out comprehensive online safety legislation. Since then, other jurisdictions have started and, in some cases, concluded this work. As one of the relay of Ministers who have carried this particular baton of legislation on its very long journey, I know we are tantalisingly close to getting to the finish line. That is why we need to focus on that today, and I am really grateful to the hon. Member for Pontypridd (Alex Davies-Jones) for confirming that the Opposition are going to support the Bill on Third Reading.

18:14
We know that the internet is magnificent and life-changing in so many ways, but the dark corners remain a serious concern, particularly with regard to children but also to scores of other vulnerable people. Of course, the priorities of this Bill must be to protect children, to root out illegal content, and to hold the online platforms to account and ensure they are actually doing what they say they are doing when it comes to the dangerous content on their sites. I warmly welcome the Minister and the Secretary of State’s engagement on these particular aspects of the Bill and how they have worked really hard to strengthen it.
This legislation is so vital for our children. The National Society for the Prevention of Cruelty to Children has estimated that more than 21,000 online child sex crimes have been recorded by the police just in the time this legislation has been delayed since last summer.
Richard Graham (Gloucester) (Con)

Does my hon. Friend agree that the new crime of cyber-flashing is one instance of how this Bill has been improved? It should also help to reduce some of the violence against women and girls, which is a major issue of our time.

Dame Caroline Dinenage

My hon. Friend is absolutely right to raise this, because we do need the Bill to be future-proofed to deal with some of the recently emerging threats to women and others that the online world has offered.

The potential threat of online harms is part of everyday life for most children in the modern world. Before Christmas, I received an email from my son’s school highlighting a TikTok challenge encouraging children to strangle each other until they passed out. This challenge probably did not start on TikTok, and it certainly is not exclusive to the platform, but when my children were born I never envisaged a day when I would have to sit them down and warn them about the potential dangers of allowing someone else to throttle them until they passed out. It is terrifying. Our children need this legislation.

I welcome the Government support for amendment 84 to clause 11, in the name of my hon. Friend the Member for Rutland and Melton (Alicia Kearns), to ban content that advertises so-called conversion therapies for LGBTQ+ people. Someone’s sexuality and who they love is not something to be cured, and unscrupulous crooks should not be able to profit from pushing young people towards potentially sinister and harmful treatments.

I really sympathise with the aims behind new clause 2, on senior executive liability. It is vital that this regime has the teeth to protect children and hold companies to account. I know the 10% of annual global turnover maximum fine is higher than some of the global comparisons, and certainly having clear personal consequences for those responsible for enforcing the law is an incentive for them to do it properly, but there is clearly a balance to strike. We must make sure that sanctions are proportionate and targeted, and do not make the UK a less attractive place to build a digital business. I am really pleased to hear Ministers’ commitment to a final amendment that will strike that really important balance.

I am concerned about the removal of measures on legal but harmful content. I understand the complexity of defining them, but other measures, including the so-called triple shield, do not offer the same protections for vulnerable adults or avoid the cliff edge when someone reaches the age of 18. That particularly concerns me for adults with special educational needs or disabilities. The key point here is that, if the tragic cases of Molly Russell and dozens of young people like her teach us anything, it is that dreadful, harmful online content cannot be defined strictly by what is illegal, because algorithms do not differentiate between harmful and harmless content. They see a pattern and they exploit it.

We often talk about the parallels between the online and offline world—we say that what is illegal online should be illegal offline, and vice versa—but in reality the two worlds are fundamentally different. In the real world, for a young person struggling with an eating disorder or at risk of radicalisation, their inner demons are not reinforced by everyone they meet on the street, but algorithms are echo chambers. They take our fears and our paranoia, and they surround us with unhealthy voices that normalise and validate them, however dangerous and however hateful, glamorising eating disorders, accelerating extremist, racist and antisemitic views and encouraging violent misogyny on incel sites.

That is why I worry that the opt-out option suggested in the Bill simply does not offer enough protection: the lines between what is legal and illegal are too opaque. Sadly, it feels as though this part of the Bill has become the lightning rod for those who think it will result in an overly censorious approach. However, we are where we are. As the Molly Rose Foundation said, the swift implementation of the Bill must now be the priority. Time is no longer on our side, and while we perfect this vast, complicated and inherently imperfect legislation, the most unspeakable content is allowed to proliferate in the online world every single day.

Finally, I put on record the exhaustive efforts made by the incredible team at the Department for Digital, Culture, Media and Sport and the Home Office, who brought this Bill to fruition. If there was ever an example of not letting the perfect be the enemy of the good, this is it, and right now we need to get this done. The stakes in human terms simply could not be any higher.

Madam Deputy Speaker (Dame Rosie Winterton)

I call the SNP spokesperson, Kirsty Blackman.

Kirsty Blackman (Aberdeen North) (SNP)

I congratulate the hon. Member for Gosport (Dame Caroline Dinenage) on what was one of the best speeches on this Bill—and we have heard quite a lot. It was excellent and very thoughtful. I will speak to a number of amendments. I will not cover the Labour amendments in any detail because, as ever, the Labour Front Benchers did an excellent job of that. The right hon. Member for Barking (Dame Margaret Hodge) covered nicely the amendment on liability, and brought up the issue of hate, particularly when pointed towards the Jewish community. I thank her for consistently bringing that up. It is important to hear her voice and others on this issue.

Amendment 43 was tabled by me and my hon. Friend the Member for Ochil and South Perthshire (John Nicolson) and it regards a default toggle for material that we all agree is unsafe or harmful. The Labour party has said that it agrees with the amendment, and the SNP believes that the safest option should be the default option. We should start from a point of view that if anybody wants to see eating disorder content, or racist or incredibly harmful content that does not meet the bar of illegality, they should have to opt in to receive it. They should not see it by default; they should have to make that choice to see such content.

Freedom of speech is written into the Bill. People can say whatever they want as long as it is below that bar of illegality, but we should not have to read it. We should not have to read abuse that is pointed toward minority groups. We should start from the position of having the safest option on. We are trying to improve the permissive approach that the Government have arrived at, and this simple change is not controversial. It would require users to flip a switch if they want to opt in to some of the worst and most dangerous content available online, including pro-suicide, pro-anorexia or pro-bulimia content, rather than leaving that switch on by default.

If the Government want the terms and conditions to be the place where things are excluded or included, I think platforms should have to say, “We are happy to have pro-bulimia or pro-anorexia content.” They should have to make that clear and explicit in their terms of service, rather than having to say, “We do not allow x, y and z.” They should have to be clear, up front and honest with people, because then people would know what they are signing up to when they sign up to a website.

Amendment 44 is on habit-forming features, and we have not spoken enough about the habit-forming nature of social media in particular. Sites such as TikTok, Instagram and Facebook are set up to encourage people to spend time on them. They make money by encouraging people to spend as much time on them as possible—that is the intention behind them. We know that 42% of respondents to a survey by YoungMinds reported displaying signs of addiction-like behaviour when questioned about their social media habits. Young people are worried about that, and they do not necessarily have the tools to avoid it. We therefore tabled amendment 44 to take that into account, and to require platforms to consider that important issue.

New clause 3, on child user empowerment, was mentioned earlier. There is a bizarre loophole in the Bill requiring user empowerment toggles for adults but not for children. It is really odd not to require them for children when we know that they will be able to see some of this content and access features that are much more inherently dangerous to them than to adults. That is why we tabled amendments on private messaging features and live streaming features.

Live streaming is a place where self-generated child sexual abuse has shot through the roof. With child user empowerment, children would have to opt in, and they would have empowerment tools to allow them opportunities to say, “No, I don’t want to be involved in live streaming,” or to allow their parents to say, “No, I don’t want my child to be able to do live streaming when they sign up to Instagram. I don’t want them able to share live photos and to speak to people they don’t know.” Amendment 46, on private messaging features, would allow children to say, “No, I don’t want to get any private messages from anyone I don’t know.” That is not written into terms of service or in the Bill as a potentially harmful thing, but children should be able to exclude themselves from having such conversations.

We have been talking about the relationship between real life and the online world. If a child is playing in a play park and some stranger comes up and talks to them, the child is perfectly within their rights to say, “No, I’m not speaking to strangers. My parents have told me that, and it is a good idea not to speak to strangers,” but they cannot do that in the online world. We are asking for that to be taken into account and for platforms to allow private messaging and live streaming features to be switched off for certain groups of people. If they were switched off for children under 13, that would make Roblox, for example, a far safer place than it currently is.

I turn to amendment 84, on conversion therapy. I am glad that the amendment was tabled and that there are moves by the UK Government to bring forward the conversion therapy ban. As far as I am aware—I have been in the Chamber all day—we have not yet seen that legislation, but I am told that it will be coming. I pay tribute to all those who have worked really hard to get us to the position where the Government have agreed to bring forward a Bill. They are to be commended on that. I am sorry that it has taken this long, but I am glad that we are in that position. The amendment was massively helpful in that.

Lastly, I turn to amendment 50, on the risk of harm. One of the biggest remaining issues with the Bill is about the categorisation of platforms, which is done on the basis of their size and the risk of their features. The size of the platform—the number of users on it—is the key thing, but that fails to take into account very small and incredibly harmful platforms. The amendment would give Ofcom the power to categorise platforms that are incredibly harmful—incel forums, for example, and Kiwi Farms, set up entirely to dox trans people and put their lives at risk—as category 1 platforms and require them to meet all the rules, risk assessments and things for those platforms.

We should be asking those platforms to answer for what they are doing, no matter how few members they have or how small their user base. One person being radicalised on such a platform is one person too many. Amendment 50 is not an extreme amendment saying that we should ban all those platforms, although we probably should. It would ask Ofcom to have a higher bar for them and require them to do more.

I cannot believe that we are here again and that the Bill has taken so long to get to this point. I agree that the Bill is far from perfect, but it is better than nothing. The SNP will therefore not be voting against its Third Reading, because it is marginally better than the situation that we have right now.

Sir Jeremy Wright (Kenilworth and Southam) (Con)

I want to say in passing that I support amendments 52 and 53, which stand in the name of my hon. Friend the Member for Stroud (Siobhan Baillie) and others. She will explain them fully so I do not need to, but they seem to be sensible clarifications that I hope the Government will consider favourably.

18:30
I want to focus on new clause 2. I have said before, and am happy to repeat it, that the individual criminal liability provided for in the Bill as it stands is too limited. Attaching it to information offences only means that, in effect, very bad behaviour cannot be penalised under the criminal law as long as the perpetrator is prepared to provide Ofcom with information about it. That cannot be sensible so there is a strong case for extending criminal liability, but new clause 2 goes too far. There are, fundamentally, two problems with new clause 2.
First, new clause 2 is drafted too broadly. It would potentially criminalise any breach of a safety duty under clause 11, the clause relating to children. We all, of course, think that keeping children safer online is a core mission of the Bill. I hope Ministers will consider favourably various other amendments that might achieve that, including the amendments in the name of the noble Baroness Kidron, which the hon. Member for Pontypridd (Alex Davies-Jones) mentioned earlier, in relation to coroners and all services likely to be accessed by children. Clause 11 covers a variety of different duties, including duties to incorporate certain provisions in terms of service and to ensure that terms of service are clear and accessible. Those are important duties no doubt, but I am not convinced that any and all failures to fulfil them should result in criminal prosecution.
Sir William Cash (Stone) (Con)

I thought I might mention to my right hon. and learned Friend that the written ministerial statement, which is now available to the public, makes it clear that useful and constructive discussions have taken place. Much of what he is saying is not necessarily applicable to the state of affairs we are now faced with.

Sir Jeremy Wright

I am grateful to my hon. Friend and I will come on to the written statement. I accept what he says. I think we are heading in the right direction, but since new clause 2 is before us at the moment, it seemed to me that I ought to address it, I hope in a helpful way.

There is nothing in the language of new clause 2 as it stands that requires a breach of the duties to be serious or even more than minimal. We should be more discriminating than that.

The second difficulty with new clause 2, which I hope the Government will pick up when they look at it again, is with prosecuting successfully the sorts of offences we may create. The more substantive and fundamental child safety duties in clause 11, which are to

“mitigate and manage the risks of harm”

and to prevent children encountering harmful content, are expressed in terms of the use of “proportionate measures” or “proportionate systems and processes”. The word “proportionate” is important and describes the need for balanced judgments to be made, including by taking into account freedom of expression and privacy as required by clause 11 itself. Aside from the challenges of obtaining evidence of what individual managers did or did not know, did or said, those balanced judgments could be very difficult for a prosecutor to assess and to demonstrate to a criminal court, to the required standard of proof, were deliberately or negligently wrong.

The consequences of that difficulty could either be that it becomes apparent that the cases are very hard to prosecute, and therefore criminal liability is not the deterrent we hoped for, or that wide criminal liability causes the sort of risk aversion and excessive take-down of material that I know worries my hon. Friend the Member for Stone (Sir William Cash) and others who support new clause 2. We therefore need to calibrate criminal liability appropriately.

It is also worth saying that if we are to pursue an extension of criminal liability, I am not sure that I see the logic of limiting that further criminal liability only to breaches of the child safety duties; I can envisage some breaches of safety duties in relation to illegal content that may also be deserving of such liability.

That leads me on to consider, as has been said, exactly how we might extend criminal liability differently. I appreciate that the Government will now be doing just that. Perhaps they can consider doing so in relation to serious or persistent breaches of the safety duties, rather than in relation to all breaches of safety duties.

Alternatively, or additionally, they could look at individual criminal liability for a failure to comply with a confirmed notice of contravention from Ofcom. I welcome the direction of travel set out in the written ministerial statement, which suggests that that is where the Government may go. As the statement says, the recent Irish legislation that has been prayed in aid does something very similar, and it is an approach with several advantages: it is easier to prove, we will know whether Ofcom has issued a notice requiring action to remedy a deficient approach to the safety duties, and we will know whether Ofcom believes that it has not been responded to adequately.

As we design a new system of regulation in this new era of regulation, we should want open conversations to take place between the regulator and the regulated as to how best to counter harms. Anything that discourages platforms and their directors from doing so may make the system we are designing work less well in promoting safety online. The approach that I think the Government will now consider is unlikely to do that.

Let me say one final thing. As my hon. Friend the Member for Gosport (Dame Caroline Dinenage) said, I have been involved in the progress of this Bill almost from the start, and I am delighted to see present my right hon. Friend the Member for Maidenhead (Mrs May), at whose instruction I started doing it. It has been tortuous progress, no doubt—to some extent that was inevitable because of the difficulty of the Bill and the territory in which we seek to legislate—but the hon. Member for Aberdeen North (Kirsty Blackman), who speaks for the SNP and for whom I have a good deal of respect, was probably a little grudging in suggesting that as it stands the Bill does only slightly better than the status quo. It does a lot more than that.

If we send the Bill to the other place this evening, as I hope we do, and if the other place considers it again with some thoroughness and seeks to improve it further, as I know it will, we will make the internet not a safe place—I do not believe that is achievable—but a significantly safer place. If we can do that, it will be the most important thing that most of us in this place have ever done.

Several hon. Members rose—

Madam Deputy Speaker (Dame Rosie Winterton)

Order. Things are not going quite according to plan, so colleagues might perhaps like to gear more towards five minutes as we move forward.

Luke Pollard (Plymouth, Sutton and Devonport) (Lab/Co-op)

I rise to speak in favour of new clause 4, on minimum standards. In particular, I shall restrict my remarks to minimum standards in respect of incel culture.

Colleagues will know of the tragedy that took place in Plymouth in 2021. Indeed, the former Home Secretary, the right hon. Member for Witham (Priti Patel), visited Plymouth to meet and have discussions with the people involved. I really want to rid the internet of the disgusting, festering incel culture that is capturing so many of our young people, especially young men. In particular, I want minimum standards to apply and to make sure that, on big and small platforms where there is a risk, those minimum standards include the recognition of incel content. At the moment, incel content is festering in the darkest corners of the internet, where young men are taught to channel their frustrations into an insidious hatred of women and to think of themselves as brothers in arms in a war against women. It is that serious.

In Parliament this morning I convened a group of expert stakeholders, including those from the Centre for Countering Digital Hate, Tech Against Terrorism, Moonshot, Girlguiding, the Antisemitism Policy Trust and the Internet Watch Foundation, to discuss the dangers of incel culture. I believe that incel culture is a growing threat online, with real-world consequences. Incels are targeting young men, young people and children to swell their numbers. Andrew Tate may not necessarily be an incel, but his type of hate and division is growing and is very popular online. He is not the only one, and the model of social media distribution that my right hon. Friend the Member for Barking (Dame Margaret Hodge) spoke about incentivises hate to be viewed, shared and indulged in.

This Bill does not remove incel content online and therefore does not prevent future tragedies. As chair of the all-party parliamentary group on social media, I want to see minimum standards to raise the internet out of the sewer. Where is the compulsion for online giants such as Facebook and YouTube to remove incel content? Five of the most popular incel channels on YouTube have racked up 140,000 subscribers and 24 million views between them, and YouTube is still platforming four of those five. Why? How can these channels apparently pass YouTube’s current terms and conditions? The content is truly harrowing. In these YouTube videos, men who have murdered women are described as saints and lauded in incel culture.

We know that incels use mainstream platforms such as YouTube to reel in unsuspecting young men—so-called normies—before linking them to their own small, specialist websites that show incel content. This is called breadcrumbing: driving traffic and audiences from mainstream platforms to smaller platforms—which will be outside the scope of category 1 provisions and therefore any minimum standards—where individuals start their journey to incel radicalisation.

I think we need to talk less about freedom of speech and more about freedom of reach. We need to talk about enabling fewer and fewer people to see that content, and about down-ranking sites with appalling content like this to increase friction and reduce audience reach. Incel content not only includes sexist and misogynist material; it also frequently includes antisemitic, racist, homophobic and transphobic items layered on top of one another. However, without a “legal but harmful” provision, the Bill does nothing to force search engines to downrate harmful content. If it is to be online, it needs to be harder and harder to find.

I do not believe that a toggle will be enough to deal with this. I agree with amendment 43—if we are to have a toggle, the default should be the norm—but I do not think a toggle will work because it will be possible to evade it with a simple Google Chrome extension that will auto-toggle and therefore make it almost redundant immediately. It will be a minor inconvenience, not a game changer. Some young men spend 10 hours a day looking at violent incel content online. Do we really think that a simple button, a General Data Protection Regulation annoyance button, will stop them from doing so? It will not, and it will not prevent future tragedies.

However, this is not just about the effect on other people; it is also about the increase in the number of suicides. One of the four largest incel forums is dedicated to suicide and self-harm. Suicide is normalised in the forum, and is often referred to as “catching the bus.” People get together to share practical advice on how they can take their own lives. That is not content to which we should be exposing our young people, but it is currently legal. It is harmful, but it will remain legal under the Bill because the terms and conditions of those sites are written by incels to promote incel content. Even if the sites were moved from category 2 to category 1, they would still pass the tests in the Bill, because the incels have written the terms and conditions to allow that content.

Why are smaller platforms not included in the Bill? Ofcom should have the power to bring category 2 sites into scope on the basis of risk. Analysis conducted by the Centre for Countering Digital Hate shows that on the largest incel website, rape is mentioned in posts every 29 minutes, with 89% of those posts referring to it in a positive sense. Moreover, 50% of users’ posts about child abuse on the same site are supportive of paedophilia. Indeed, the largest incel forum has recently changed its terms and conditions to allow mention of the sexualisation of pubescent minors—unlike pre-pubescent minors; it makes that distinction. This is disgusting and wrong, so why is it not covered in the Bill? I think there is a real opportunity to look at incel content, and I would be grateful if the Minister met the cross-party group again to discuss how we can ensure that it is harder and harder to find online and is ultimately removed, so that we can protect all our young people from going down this path.

Damian Collins

My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made an excellent speech about new clause 2, a clause with which I had some sympathy. Indeed, the Joint Committee that I chaired proposed that there should be criminal liability for failure to meet the safety duties set out in the Bill, and that that should apply not just to child safety measures, but to any such failure.

However, I agree with my right hon. and learned Friend that, as drafted, the new clause is too wide. If it is saying that the liability exists when the failure to meet the duties has occurred, who will be the determinant of that factor? Will it be determined when Ofcom has issued a notice, or when it has issued a fine? Will it be determined when guidance has been given and has not been followed? What we do not want to see is a parallel judicial system in which decisions are made that are different from those of the regulator in respect of when the safety duties had not been met.

I think it is when there are persistent breaches of the safety duties, when companies have probably already been fined and issued with guidance, and when it has been demonstrated that they are clearly in breach of the codes of practice and are refusing to abide by them, that the criminal liability should come in. Similar provisions already exist in the GDPR legislation for companies that are in persistent breach of their duties and obligations. The Joint Committee recommended that this should be included in the Bill, and throughout the journey of this legislation the inclusion of criminal liability has been consistently strengthened. When the draft Bill was published there was no immediate commencement of any criminal liability, even for not complying with the information notices given by Ofcom, but that was included when the Bill was presented for Second Reading. I am pleased that the Government are now going to consider how we can correctly define what a failure to meet the safety duties would be and therefore what the committal sanction that goes with it would be. That would be an important measure for companies that are in serial breach of their duties and obligations and have no desire to comply.

18:48
This issue is also relevant and linked to the wider debate around legal but harmful that we have had today and in the recommittal Committee, because if we are going to have criminal sanctions for non-compliance, we need to be really clear what companies are supposed to do. It needs to be really clear to them what they have to do as well. That is why, when the Joint Committee produced its report, we recommended that the legal but harmful provisions in the Bill should be changed. They do not do what many people in the House have asserted they do, which is to set standards and requirements for companies to remove legal content. They were never there to do that. They provided risk assessment for a wider range of content, and that may have been helpful, but they did not require the removal of content that was neither a breach of the community standards of the platform nor a breach of the legal threshold.
The changes to the Bill help in some ways with the idea of having criminal liability because written on to the face of the Bill are the offences that are within scope, what the companies have to do and also the requirement to enforce their own terms of service where the safety standards are defined not by law but by the platform. Safety standards are important, and there is sometimes a danger in this debate that we pretend they do not really exist. That is understandable, because companies are not very good at enforcing them. They are not very good at doing the things they say they will do. As a former board member of the Centre for Countering Digital Hate, I am pleased to hear that organisation being cited so often in the debate. Its chief executive gave evidence to the Joint Committee, in which he said that if there was one thing it could do, it would be to ensure that companies enforced their own terms of service. He said that if there were a legal power to make them do that, many of the problems we are discussing would go away. That is a very important sanction.
On the point around smaller platforms, in reality Ofcom has the power to enforce safety standards at the level set in the Bill on any platform of any size. On the question of smaller platforms being out of scope, they can only take enforcement based on the terms of service, and platforms like that are likely to have very weak or practically non-existent terms of service. That is why having the minimum safety standards based in law is so important.
With regard to advertising, my hon. Friend and constituency neighbour the Member for Dover (Mrs Elphicke) has an amendment relating to immigration offences that are promoted through advertising. The additional amendment that the Government are accepting, banning the advertising of so-called conversion therapy, is also important.
Mrs Theresa May (Maidenhead) (Con)

My hon. Friend has referenced the proposals from my hon. Friend the Member for Dover (Mrs Elphicke). I am grateful to the Minister and the Secretary of State for the discussions they have had with me on making modern slavery a specific priority offence, as well as illegal immigration. I think this is very important.

Damian Collins

I agree with my right hon. Friend; that is exactly right, and it is also right that we look at including additional offences on the face of the Bill in schedule 7 as offences that will be considered as part of the legislation.

Where this touches on advertising, the Government have already accepted, following the recommendation of the Joint Committee, that the promotion of fraud should be regulated in the Bill, even if it is in advertising. There are other aspects of this, too, including modern slavery and immigration, where we need to move at pace to close the loophole where consideration was to be given to advertising outside of the Bill through the online advertising review. The principle has already been accepted that illegal activity promoted through an advert on an online platform should be regulated as well as if it was an organic posting. That general provision does not yet exist, however. Given that the Government have considered these additional amendments, which was the right thing to do, they also need to look at the general presumption that any illegal activity that is a breach of the safety duties should be included and regulated, and that if somebody includes it in an advert it does not become exempt, when it would be regulated if it was in an organic posting.

Matt Rodda (Reading East) (Lab)

I would like to focus on new clause 1, dealing with redress, amendment 43, dealing with the toggle default, and new clause 4 on minimum standards. This Bill is a very important piece of legislation, but I am afraid that it has been seriously watered down by the Government. In particular, it has been seriously weakened by the removal of measures to tackle legal but harmful content. I acknowledge that some progress has been made recently, now that the Government have accepted the need for criminal sanctions for senior managers of tech companies. However, there are still many gaps in the Bill and I want to deal with some of them in the time available to me tonight.

First, I pay tribute to the families who have lost children due to issues related to social media. Some of those families are in the Public Gallery tonight. In particular, I want to mention the Stephens family from my Reading East constituency. Thirteen-year-old Olly Stephens was murdered in an horrific attack following a plot hatched on social media. The two boys who attacked Olly had both shared dozens of images of knives online, and they used 11 different social media platforms to do so. Sadly, none of the platforms took down the content, which is why these matters are so important to all of us and our communities.

Following this awful case, I support a number of new clauses that I believe would lead to a significant change in the law to prevent a similar tragedy. I stress the importance of new clause 1, which would help parents to make complaints. As Olly’s dad, Stuart, often says, “You simply cannot contact the tech companies. You send an email and get no reply.” It is important to tackle this matter, and I believe that new clause 1 would go some way towards doing that.

As others have said, surely it makes sense for parents to know their children have some protection from harmful content. Amendment 43 would provide reassurance by introducing a default position of protecting children. I urge Members on both sides of the House to support this amendment. Both children and vulnerable adults should be better protected from legal but harmful content, and further action should be taken. Amendment 43 would take clear steps in that direction.

I am aware of time, and I support many other important new clauses. I reiterate my support and backing for my Front-Bench colleague, my hon. Friend the Member for Pontypridd (Alex Davies-Jones). Thank you, Madam Deputy Speaker, for the opportunity to contribute to this debate.

Dame Andrea Leadsom (South Northamptonshire) (Con)

It is a pleasure to follow the hon. Member for Reading East (Matt Rodda). I congratulate him on his moving tribute to his constituent’s son. It is a terrible story.

This Bill will be life changing for many, but I am sorry to say that it has taken far too long to get to this point. The Government promised in 2015 to end children’s exposure to harmful online material, and in 2017 they committed to making the UK the safest place for children to be online. This morning, as I waited in the freezing cold on the station platform for a train that was late, a fellow passenger spoke to me about the Bill. He told me how happy he is that action is, at last, under way to protect children from the dangers of the internet. As a father of three young children, he told me that the internet is one of his greatest concerns.

I am afraid that, at the moment, the internet is as lawless as the wild west, and children are viewing images of abuse, addiction and self-harm on a daily basis. As others have said, the stats are shocking. Around 3,500 online child sex offences are recorded by police each month, and each month more than a million UK children access online pornography. It has been said that, in the time it takes to make a cup of tea, a person who joins certain popular social media platforms will have been introduced to suicidal content such as, “Go on, just kill yourself. You know you want to.”

I am incredibly proud that our Government have introduced a Bill that will change lives for the better, and I hope and expect it will be a “best in class” for other Governments to do likewise. I pay tribute to my right hon. Friend the Secretary of State for Digital, Culture, Media and Sport and her predecessors for their ruthless focus on making the online world a safer place. Ultimately, improving lives is what every MP is here to do, and on both sides of the House we should take great delight that, at last, this Bill will have its remaining Commons stages today.

I pay tribute to my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Stone (Sir William Cash) for their determination to give the Bill even more teeth, and I sincerely thank the Secretary of State for her willingness not only to listen but to take action.

New clause 2, tabled by my hon. Friends, will not be pressed because the Secretary of State has agreed to table a Government amendment when the Bill goes to the other place. New clause 2 sought to create a backstop so that, if a senior manager in a tech firm knowingly allows harm to be caused to a child that results in, for example, their abuse or suicide, the manager should be held accountable and a criminal prosecution, with up to two years in prison, should follow. I fully appreciate that many in the tech world say, first, that that will discourage people from taking on new senior roles and, secondly, that it will discourage inward investment in the UK tech sector. Those serious concerns deserve to be properly addressed.

First, with regard to the potential for senior tech staff to be unwilling to take on new roles where there is this accountability, I would argue that from my experience as City Minister in 2015 I can provide a good example of why that is an unnecessary concern. We were seeking to address the aftermath of the 2008 financial crisis and we established the possibility of criminal liability for senior financial services staff. It was argued at the time that that would be highly damaging to UK financial services and that people would be unwilling to take on directorships and risk roles. I think we can all see clearly that those concerns were unfounded. Some might even say, “Well, tech firms would say that, wouldn’t they?”. The likelihood of a criminal prosecution will always be low, but the key difference is that in the future tech managers, instead of waking up each day thinking only about business targets, will wake up thinking, “Have I done enough to protect children, as I meet my business targets?”. I am sure we can agree that that would be a very good thing.

Secondly, there are those who argue that inward investment to the UK’s tech sector would be killed off by this move, and that would indeed be a concern. The UK tech sector leads in Europe, and at the end of 2022 it retained its position as the main challenger to the US and China. Fast-growing UK tech companies have continued to raise near-record levels of investment—more than France and Germany combined. The sector employs 3 million people across the UK and continues to thrive. So it is absolutely right that Ministers take seriously the concerns of these major employers.

However, I think we can look to Ireland as a good example of a successful tech hub where investment has not stopped as a result of strong accountability laws. The Irish Online Safety and Media Regulation Act 2022 carries a similar criminal responsibility to the one proposed in new clause 2, yet Ireland remains a successful tech hub in the European Union.

Tim Loughton (East Worthing and Shoreham) (Con)

My right hon. Friend is rightly dispelling all these scare stories we have heard. One brief we had warned that if new clause 2 were to go through, it would portend the use of upload filters, where the system sweeps in and removes content before it has been posted. That would be a good thing, would it not? We need social media companies to be investing in more moderators in order to be more aware of the harmful stuff before it goes online and starts to do the damage. This should lead to more investment, but in the right part—in the employees of these social media companies. Facebook—Meta, as it now is—made $39 billion profit in 2021, so they are not short of money to do that, are they?

Dame Andrea Leadsom

My hon. Friend makes a good point. Of course, as I have said, tech managers who wake up trying to meet business targets will now look at meeting them in a way that also protects children. That is a good thing.

We will look back on this period since the real rise of social media and simply not be able to believe what millions of children have been subjected to every day. As the Government’s special adviser on early years, it seems to me that all the work we are doing to give every baby the best start for life will be in vain if we then subject them during their vulnerable childhood years to the daily onslaught of appalling vitriol, violence, abuse and sordid pornography that is happening right now. It is little wonder that the mental health of young people is so poor. So it is my hope that this Bill will truly support our attempts to build back better after the covid lockdown. The Government’s clear commitment to families and children, and the Prime Minister's own personal commitment to the vision for “The Best Start for Life” is apparent for all to see. Keeping children safe online will make a radical improvement to all their lives.

Several hon. Members rose—

Madam Deputy Speaker (Dame Rosie Winterton)

In order to ensure that we get everybody in, I am going to introduce a five-minute time limit. I call Richard Burgon.

Richard Burgon (Leeds East) (Lab)

I have listened with interest to all the powerful speeches that have been made today. As legislation moves through Parliament, it is meant to be improved, but the great pity with this Bill is that it has got worse, not better. It is a real tragedy that measures protecting adults from harmful but legal content have been watered down.

I rise to speak against the amendments that have come from the Government, including amendments 11 to 14 and 18 and 19, which relate to the removal of adult safety duties. I am also speaking in favour of new clause 4 from the Labour Front Bench team and amendment 43 from the SNP, which go at least some of the way to protect adults from harmful but legal content.

19:00
The reason I am keen to highlight these points today stems from a tragic case in my constituency, which I have raised in the House on more than one occasion. Joe Nihill, a popular former Army cadet, was aged 23 when he took his own life after accessing dangerous, suicide-related content online. As I have mentioned previously, his mother, Catherine, and his sister-in-law, Melanie, have run a courageous campaign to ensure that, when this legislation becomes law, what happened to Joe does not happen to others.
For much of the passage of the Bill, I have been heartened. In particular, speaking to the previous Front-Bench Government team, it felt like we were going in the right direction, but perhaps not as quickly as we would like. However, the Government amendments mean that we are now heading in the wrong direction. Joe’s mother and sister-in-law are heartbroken at the Government’s current direction of travel on the Bill in relation to protecting adults from harmful but legal content. I urge the Minister to think again, because Government amendments have gutted harmful but legal protections for adults. Reckless amendments mean that sites will not even have to consider the risk that harmful but legal content poses to adult users on their platform. As I have said, Bills are meant to get better as they go through Parliament. With the Government’s amendments, we have seen the opposite happen.
Research from the Samaritans shows that just 16% of people think that access to potentially harmful content on the internet should be restricted only for children. As I have said, my constituent, Joe, was 23. We all know that it is false to presume that people stop being vulnerable at the age of 18. There are so many vulnerable adults in our society, and there are also people who become vulnerable when they see these things online—when they are dragged down this online rabbit hole of dangerous, harmful content.
The importance of including harmful but legal content is clear. Content that is legal but undoubtedly harmful includes information, instructions and advice on methods of self-harm and suicide, and material that portrays self-harm and suicide as desirable. Crudely removing protections from harmful content at 18 years of age leaves vulnerable people exposed to potentially fatal content.
As we have heard today, individual filters are simply not enough to protect vulnerable people. The Government have set out that it is up to individuals to filter legal but harmful content, but, often, people experiencing suicidal thoughts will look for ways to take their own life and actively seek out harmful content.
In conclusion, the truth is that the Government have ignored the real-world expertise of groups such as the Samaritans and others in order to put the interests of tech giants first as well as those on the Tory Back Benches who put so-called freedom of speech ahead of the safety of people like Joe from my constituency who took his own life at the age of 23.
I hope to see further work on this Bill in the other place to ensure that vulnerable adults are given the protection that they deserve. That was Joe’s parting wish in the letter that he left to his family—that what happened to him would not happen to others. Let us not lose this opportunity. Let us improve the Bill. The other place has a vital role to play in ensuring that the Bill improves and protects everybody.
Sir John Hayes (South Holland and The Deepings) (Con)

One of the most noticeable changes in my lifetime has been the disheartening debasement of public discourse. The internet—a place for posturing, preening and posing, but rarely for genuine discussion or measured debate—must take much of the blame for that transformative decline, but, while the coarsening of the national conversation is among the most obvious examples of the harm being done by the internet, it is merely the tip of a very dangerous iceberg.

Beyond every superficial banality lurks a growing crisis of depression, decay, misery and malaise, of self-doubt and self-harm, all facilitated by tech companies that profit from exploiting insecurities, doubts and fears. Such companies do not exist simply to facilitate communication; rather, they control and manipulate virtual interaction in ways that play on innate fears.

The social media conglomerates’ entire business model relies on ruthlessly exploiting vast quantities of data harvested from their users. Driven by nothing beyond profit and growth, they have abandoned any notion of duty of care, because their business model depends on monetising information with little regard to how it is generated or how it is used, even when that puts children at deadly risk.

Perhaps that wilful ignorance is why social media consistently fails to police videos advertising and glamorising illegal channel crossings. The 1,400 minors accompanying the nearly 50,000 crossings last year had their images placed on the internet as poster children for that despicable trade. I am delighted that the work done by my hon. Friend the Member for Dover (Mrs Elphicke), and her amendment 82, now wisely accepted by the Government, will begin to address that particular wickedness. The amendment will wipe such material from the internet, requiring social media companies to face up to their responsibilities in plying this evil trade.

If drafted correctly, this Bill is an opportunity for Britain to lead the way in curbing the specious, sinister, spiteful excesses of the internet age. For all their virtue signalling, the tech giants’ lack of action speaks louder than words. Whether it is facilitating the promotion of deadly channel crossings or the day-to-day damage done to the mental health of Britain’s young people, let us be under no illusion: those at the top know exactly the harm they wreak.

Whistleblowing leaks by Frances Haugen last year revealed Mr Zuckerberg’s Meta as a company fully aware of the damage it does to the mental health of young people. In the face of its inaction, new clause 2, tabled by my hon. Friends the Members for Stone (Sir William Cash) and for Penistone and Stocksbridge (Miriam Cates), whom I was pleased and proud to support in doing so, makes tech directors personally legally liable for breaches of their child safety duties. No longer will those senior managers be able to wash their hands of the harm they do, and no longer will they be able to perpetuate those sinister algorithms, which, rather than merely reflecting harm, cause harm.

Strengthening the powers of Ofcom to enforce those duties will ensure that the buck stops with tech management. Like the American frontier of legend, the virtual world of the internet can be tamed—the beast can be caged—but, as G. K. Chesterton said:

“Unless a man becomes the enemy of an evil, he will not even become its slave but rather its champion.”

The greedy, careless tech conglomerates cannot be trusted to check themselves. This Bill is a welcome start, but in time to come, as the social media beast writhes and breathes, Parliament will need to take whatever action is necessary to protect our citizens by quenching its fearful fire.

Jim Shannon (Strangford) (DUP)

First and foremost, as we approach the remaining stages of this Bill, we must remember its importance. As MPs, we hear stories of the dangers of online harms that some would not believe. I think it is fair to say that those of my generation were very fortunate to grow up in a world where social media did not exist; as I just said to my hon. Friend the Member for South Antrim (Paul Girvan) a few minutes ago, I am really glad I did not have to go through that. Social media is so accessible nowadays and children are being socialised in that environment, so it is imperative that we do all we can to ensure that they are protected and looked after.

I will take a moment to discuss the importance of new clause 2. There are many ongoing discussions about where the responsibility lies when it comes to the regulation of online harms, but new clause 2 ultimately would make it an offence for service providers not to comply with their safety duties in protecting children.

The hon. Member for Penistone and Stocksbridge (Miriam Cates) has described the world of social media as

“a modern Wild West, a lawless and predatory environment”—

how true those words are. I put on record my thanks to her and to the hon. Member for Stone (Sir William Cash) for all their endeavours to deliver change—they have both been successful, and I say well done to them.

Some 3,500 online child sexual offences are recorded by the police every month. Every month, 1.4 million UK children access online porn, the majority of which is degrading, abusive and violent. As drafted, the Bill would not hold tech bosses individually liable for their own failure in child and public safety. New clause 2 must be supported, and I am very pleased that the Government are minded to accept it.

Fines are simply not enough. If we fail to address that in the Bill, this House will be liable, because senior tech bosses seem not to be. I am minded, as is my party, to support the official Opposition’s new clause 4, “Safety duties protecting adults and society: minimum standards for terms of service”.

New clause 8 is also important. Over the last couple of years, my office has received numerous stories from parents who have witnessed their children deal with the consequences of what an eating disorder can do. I have a very close friend whose 16-year-old daughter is experiencing that at the moment. It is very hard on the family. Social media pages are just brutal. I have heard of TikTok pages glorifying bulimia and anorexia, and Instagram pages providing tips for self-harm—that is horrendous. It is important that we do not pick and choose what forms of harm are written into the Bill. It is not fair that some forms of harm are addressed under the Bill or referred to Ofcom while others are just ignored.

Communication and engagement with third-party stakeholders is the way to tackle and deal with this matter. Let us take, for example, a social media page that was started to comment on eating disorders and is generally unsafe and unhelpful to young people who are struggling. Such a page should be flagged to healthcare professionals, including GPs and nurses, who know best. If we can do that through the Bill, it would be a step in the right direction. On balance, we argue that harmful content should be reserved for regulations, which should be informed by proper stakeholder engagement.

I will touch briefly on new clause 3, which would require providers to include features that child users may use or apply if they wish to increase their control over harmful content. Such features are currently restricted to adults. Although we understand the need to empower young people to be responsible for and knowledgeable about the decisions they make, we recognise the value of targeting such a duty at adults, many of whom hold their parental responsibilities very close to their hearts. More often than not, that is just as important as regulation.

To conclude, we have seen too many suicides and too much danger emerge from online and social media. Social media has the potential to be an educational and accessible space for all, including young people. However, there must be safety precautions for the sake of young people, who can very easily fall into traps, as we are all aware. In my constituency, we have had a spate of suicides among young people—it seems to be in a clique of friends, and that really worries me. This is all about regulation, and ensuring that harmful content is dealt with and removed, and that correct and informed individuals are making the decisions about what is and is not safe. I have faith that the Minister, the Government and the Bill will address the outstanding issues. The Bill will not stop every online evil, but it will, as the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) said, make being online safer. If the Bill does that, we can support it, because that would be truly good news.

Mrs Natalie Elphicke (Dover) (Con)

Thank you, Mr Deputy Speaker—if I may say so, it is a pleasure to see my east Kent neighbour in the Chair.

I will speak to amendment 82, which was tabled in my name, and in support of new clause 2 and amendment 83. At the last Report stage I spoke at some length on an associated amendment, and I am conscious that many Members wish to speak, so I will keep my comments brief.

I am grateful to the many right hon. and hon. Friends who supported my amendment, whether or not their names appear next to it on the amendment paper. I thank in particular my right hon. Friend the Member for South Holland and The Deepings (Sir John Hayes) for his considerable assistance in securing changes.

Amendment 82 sets out a requirement to remove content that may result in serious harm or death to a child while crossing the English channel in small boats. The risk of harm or death from channel crossings is very real. Four children have drowned in the past 15 months, with many more harmed through exposure to petrol and saltwater burns and put in danger here and abroad by organised crime and people traffickers. Social media is playing a direct role in this criminal enterprise. It must be brought to book, and the videos and other content that encourage such activity must be taken down.

19:15
There is an obligation on us to protect children, especially lone children who find themselves not in the protection of social services, either here or abroad, but in the hands of evil people smugglers and people traffickers. I hope that whatever our differences may be across this House on how open or otherwise our borders and migration system should be, we should be united in compassion, concern and action for children and young people in the snare of this wicked criminal activity. That is what my amendment 82 seeks to ensure.
Turning briefly to other amendments, new clause 2 seeks to hold senior managers to account. I am grateful to my hon. Friends the Members for Stone (Sir William Cash) and for Penistone and Stocksbridge (Miriam Cates) for their excellent work on this. I was somewhat disappointed to read the comments, repeated today by the hon. Member for Pontypridd (Alex Davies-Jones), that it is some kind of weakness for Government to agree to amendments. I particularly wanted to comment on that in relation to new clause 2.
In deciding to support new clause 2, I was persuaded by the remarks of the right hon. Member for Barking (Dame Margaret Hodge) in the previous Report stage. I am grateful to her for the strength of her comments and their persuasive nature. It is our job here in this House to make sure that we consider and make responsible amendments. That is what those of us on the Government Benches have sought to do. I am very pleased that the Government have moved in relation to new clause 2, and it is important to recognise that it shows the confidence and strength of leadership of the Prime Minister, his Ministers, the Culture Secretary and Ministers in her Department and the Home Office, as well as the Solicitor General, that they will work with us to ensure that the Bill is stronger yet.
Finally, I turn to amendment 83 in the same spirit. I was moved by the personal account and the comments made by my right hon. Friend the Member for Chelmsford (Vicky Ford) on Report, and that is why I lent my support to her amendment. She has made a powerful case that it is important to protect children, but also to recognise, as has been said, that as children turn 18 they may still be extremely vulnerable and in need of support. I thank her for that, and I know that a number of Members feel likewise.
In conclusion, I thank the Culture Secretary and the Minister, my hon. Friend the Member for Sutton and Cheam (Paul Scully), for their engagement to date and for the commitment made in the written ministerial statement to strengthen the Bill in relation to the prevention of modern slavery and illegal immigration, including for the protection of children. On that basis, I confirm that I will not be moving amendment 82 later today.
Sir William Cash

In a nutshell, we must be able to threaten tech bosses with jail. There is precedent for that—jail sentences for senior managers are commonplace for breaches of duties across a great range of UK legislation. That is absolutely and completely clear, and as a former shadow Attorney General, I know exactly what the law is on this subject. I can say this: we must protect our children and grandchildren from predatory platforms operating for financial gain on the internet. It is endemic throughout the world and in the UK, inducing suicide, self-harm and sexual abuse, and it is an assault on the minds of our young children and on those who are affected by it, including the families and such people as Ian Russell. He has shown great courage in coming out with the tragedy of his small child of 14 years old committing suicide as a result of such activities, as the coroner made clear. It is unthinkable that we will not deal with that. We are dealing with it now, and I thank the Secretary of State and the Minister for responding with constructive dialogue in the short space of time since we have got to grips with this issue.

The written ministerial statement is crystal clear. It says that

“where senior managers, or those purporting to act in that capacity, have consented or connived in ignoring enforceable requirements, risking serious harm to children. The criminal penalties, including imprisonment and fines, will be commensurate with similar offences.”

We can make a comparison, as the right hon. Member for Barking (Dame Margaret Hodge) made clear, with financial penalties in the financial services sector, which is also international. There is also the construction industry, as my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) just said. Those penalties are already on our statute book.

I do not care what the European Union is doing in its legislation. I am glad to know that the Irish legislation, which has been passed and is an Act, has been through different permutations and examinations. The Irish have come up with something that includes similar severe penalties. It can be done. But this is our legislation in this House. We will do it the way that we want to do it to protect our children and families. I am just about fed up with listening to the mealy-mouthed remarks from those who say, “You can’t do it. It’s not quite appropriate.” To hell with that. We are talking about our children.

On past record, which I just mentioned, in 1977-78, a great friend of mine, Cyril Townsend, the Member for Bexleyheath, introduced the first Protection of Children Bill. He asked me to help him, and I did. We got it through. That was incredibly difficult at the time. You have no idea, Mr Deputy Speaker, how much resistance was put up by certain Members of this House, including Ministers. I spoke to Jim Callaghan—I have been in this House so long that I was here with him after he had been Prime Minister—and asked, “How did you give us so much time to get the Bill through?” He said, “It’s very simple. I was sitting in bed with my wife in the flat upstairs at No. 10. She wasn’t talking to me. I said, ‘What’s wrong, darling?’ She replied, ‘If you don’t get that Protection of Children Bill through, I won’t speak to you for six months.’” And it went through, so there you go. There is a message there for all Secretaries of State, and even Prime Ministers.

I raised this issue with the Prime Minister in December in a question at the Liaison Committee. I invited him to consider it, and I am so glad that we have come to this point after very constructive discussion and dialogue. It needed that. It is a matter not of chariots of fire but of chariots on fire, because we have done all this in three weeks. I am extremely grateful to the 51 MPs who stood firm. I know the realities of this House, having been involved in one or two discussions in the past. As a rule, it is only when you have the numbers that the results start to come. I pay tribute to the Minister for the constructive dialogue.

The Irish legislation will provide a model, but this will be our legislation. It will be modelled on some of the things that have already been enacted there, but it is not simply a matter of their legislation being transformed into ours. It will be our legislation. In the European Parliament—

Sir William Cash

I know my time is up; I just want to say this.

Mr Deputy Speaker (Mr Nigel Evans)

No. I call Miriam Cates.

Miriam Cates (Penistone and Stocksbridge) (Con)

I too rise to speak to new clause 2, which seeks to introduce senior manager criminal liability to the Bill. As my hon. Friend the Member for Stone (Sir William Cash) set out, we will not push it to a vote as a result of the very welcome commitments that the Minister has made to introduce a similar amendment in the other place.

Protecting children is not just the role of parents but the responsibility of the whole of society, including our institutions and businesses that wish to trade here. That is the primary aim of this Bill, which I wholeheartedly support: to keep children safe online from horrendous and unspeakable harms, many of which were mentioned by my right hon. Friend the Member for South Northamptonshire (Dame Andrea Leadsom).

We look back in horror at children being forced to work down mines or neglected in Victorian orphanages, but I believe we will look back with similar outrage at online harms. What greater violation could there be of childhood than to entice a child to collaborate in their own sexual abuse in the privacy and supposed safety of their own bedroom? Yet this is one of the many crimes that are occurring on an industrial scale every day. Past horrors such as children down mines were tackled by robust legislation, and the Online Safety Bill must continue our Parliament’s proud tradition of taking on vested interests to defend the welfare of children.

The Bill must succeed in its mission, but in its present form, it does not have sufficient teeth to drive the determination that is needed in tech boardrooms to tackle the systemic issue of the malevolent algorithms that drive this sickening content to our children. There is no doubt that the potential fines in the Bill are significant, but many of these companies have deep pockets, and the only criminal sanctions are for failure to share data with Ofcom. The inquest following the tragic death of Molly Russell was an example of this, as no one could be held personally responsible for what happened to her. I pay tribute to Ian Russell, Molly’s father, whose courage in the face of such personal tragedy has made an enormous difference in bringing to light the extent of online harms.

Only personal criminal liability will drive proactive change, and we have seen this in other areas such as the financial services industry and the construction industry. I am delighted that the Government have recognised the necessity of senior manager liability for tech bosses, after much campaigning across the House, and committed to introducing it in the other place. I thank the Secretary of State and her team for the very constructive and positive way in which they have engaged with supporters of this measure.

Sir William Cash

Would my hon. Friend not also like to say that the NSPCC has been magnificent in supporting us?

Miriam Cates

I was coming on to that—absolutely.

The advantage of introducing this measure in the other place is that we can widen the scope to all appropriate child safety duties beyond clause 11 and perhaps tackle pornography and child sexual abuse material as well. We will have a groundbreaking Bill that will hold to account powerful executives who knowingly allow our children to be harmed.

There are those who say—not least the tech companies—that we should not be seeking to criminalise tech directors. There are those who worry that this will reduce tech investment, but that has not happened in Ireland. There are those who say that the senior manager liability amendment will put a great burden on tech companies to comply, to which I say, “Great!” There are those who are worried that this will set an international precedent, to which I say, “Even better!”

Nothing should cause greater outrage in our society than the harming of innocent children. In a just society founded on the rule of law, those who harm children or allow children to be harmed should expect to be punished by the law. That is what new clause 2 seeks to do, and I look forward to working with the Secretary of State and others to bring forward a suitable amendment in the other place.

I offer my sincere thanks to the NSPCC, especially Rich Collard, and the outstanding Charles Hymas of The Telegraph, who have so effectively supported this campaign. I also pay tribute to my hon. Friend the Member for Stone (Sir William Cash); without his determination, knowledge and experience, it would not have been possible to achieve this change. He has been known as Mr Brexit, but as he said, even before he was Mr Brexit, he was Mr Child Protection, having been involved with the Protection of Children Act 1978. It is certainly advantageous in negotiations to work with someone who knows vastly more about legislation than pretty much anyone else involved. He sat through the debate in December on the amendment tabled by the right hon. Member for Barking (Dame Margaret Hodge), and while the vote was taking place, he said, “I think we can do this.” He spent the next week in the Public Bill Office and most of his recess buried in legislation. I pay tribute to him for his outstanding work. Once again, I thank the Secretary of State for her commitment to this, and I think this will continue our Parliament’s proud history of protecting children.

Lia Nici (Great Grimsby) (Con)

I fully support the Bill and pay tribute to the work that Members have done over months and years to get us to where we are. I support the amendments tabled by my hon. Friends the Members for Dover (Mrs Elphicke), for Penistone and Stocksbridge (Miriam Cates) and for Stone (Sir William Cash), because these are the right things to do. We cannot have—effectively—illegal advertising for illegal activities on platforms. We would not allow it on television, so why would we allow it on other easily accessible platforms? With regard to content that is harmful to children, why should we not focus the minds of senior managers in those hugely rich organisations on the idea that, “If I do not do my job properly and protect children, I may go to prison.” I think that threat will focus those individuals’ minds.

19:30
Sir William Cash

Does my hon. Friend agree that it is an assault not just on the physical person, but on their minds? That is what is going on, and it is destroying them.

Lia Nici

My hon. Friend is correct. Often, senior managers are high-profile individuals with PR budgets that are probably larger than those of many countries. If we think about fines, they would just put those fines into their business plans, so fines would not effect a cultural change, as my hon. Friend the Member for Penistone and Stocksbridge has said on many occasions. We need cultural change to ensure that companies say, “What are we doing to make sure that children are being protected?” That is why I wholeheartedly support the new clause.

I also thank the Secretary of State, Ministers and officials, who have talked through issues with Back Benchers and taken them seriously. That means that we are where we need to be, which is fantastic. As a child of the 1970s and a parent, I never envisaged that we would have to be having these kinds of conversations with our children about what they are coming across: “Mum, what is this? Should I go and find a needle to inject this into myself?” That is the kind of horrifying content that parents and teachers come across. Schools do a fantastic job with their digital footprint training to ensure that we can start to have such conversations.

Sir John Hayes

The opponents of our cause claim that we are curbing freedom, but in fact, it is not freedom that these people offer. They turn their addicts into the slaves of cruel, callous conglomerates.

Lia Nici

I absolutely agree with my right hon. Friend. If freedom means that our children become collateral damage for harmful and dangerous people, we need to have some real conversations about what freedom is all about.

Thankfully, as a child of the 1970s, my only experience was of three television channels. My hon. Friends the Members for Stone and for Penistone and Stocksbridge are like Zorro and Tonto coming to save the villagers in a wild west town where all the baddies are waiting to annihilate them. I thank them for that and I look forward to supporting the Bill all the way.

Vicky Ford (Chelmsford) (Con)

Legislating in an online world is incredibly complex and full of pitfalls, because the digital world moves so fast that it is difficult to make effective and future-proof legislation. I do not want to wind up my hon. Friend the Member for Stone (Sir William Cash) by mentioning Europe, but I am proud to have worked alongside other British MEPs to introduce the GDPR, which the tech companies hated—especially the penalties.

The GDPR is not perfect legislation, but it fundamentally transformed how online actors think about the need to protect personal data, confidentiality and privacy. The Bill can do exactly the same and totally transform how online safety is treated, especially for children. I have been a proud champion of the Internet Watch Foundation for more than a decade and I have worked with it to tackle the hideous sexual abuse of children online. As a children’s Minister during the Bill’s passage, I am aware of the serious harms that the online world can and does pose, and I am proud that Ministers have put protecting children at the front of the Bill.

Along with other hon. Members, I have signed new clause 2. If, God forbid, hospital staff were constantly and repeatedly causing harm to children and the hospital boss was aware of it but turned a blind eye and condoned it, we would all expect that hospital boss to end up in the courts and, if necessary, in prison. Tech bosses should face the same. I thank the Government for saying that they will go along with the Irish-style legislation here, and I look forward to their doing so.

My amendments—amendment 83 and new clause 8, which was not in scope—relate to eating disorders. Amendment 83 is intended to make it very clear that eating disorders should be treated as seriously as other forms of self-harm. I would like to thank everybody in the Chamber who spoke to me so kindly after I spoke in the last debate about my own experience as a former anorexic and all those outside the Chamber who have since contacted me.

Anorexia is the biggest killer of all mental illnesses. It is a sickness that has a slow and long-burning fuse, but all too often that fuse is deadly. There has been a terrifying rise in the number of cases, and it is very clear that social media posts that glamorise eating disorders are helping to fuel this epidemic. I am talking not about content that advertises a diet, but egregious content that encourages viewers to starve themselves in some cases—too many cases—to death. Content promoting eating disorders is no less dangerous than other content promoting other forms of self-harm; in fact, given the huge numbers of people suffering from eating disorders—about 1.25 million people in this country—it may be considered the most dangerous. It is dangerous not only for children, but for vulnerable adults.

My amendment, as I have said, endeavours to make it clear that content promoting eating disorders should be treated in the same way and as seriously as content promoting other forms of self-harm. I thank all those who signed it, including former Health Ministers and Digital Ministers, the current Chair of the Health and Social Care Committee, my hon. Friend the Member for Winchester (Steve Brine) and the current and former Chairs of the Women and Equalities Committee, my right hon. Friends the Members for Romsey and Southampton North (Caroline Nokes) and for Basingstoke (Dame Maria Miller). I hope the fact that MPs of such experience have signed this amendment sends a clear message to those in the other place that we treat this issue very seriously.

My amendment 83 is not the clearest legal way in which to manage the issue, so I do not intend to press it today. I thank the Secretary of State, the Minister responsible for the Bill and the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar), who I know want to move on this, for meeting me earlier today and agreeing that we will find a way to help protect vulnerable adults as well as children from being constantly subjected to this type of killing content. I look forward to continuing to work with Ministers and Members of the other place to find the best legally watertight way forward.

Mr Marcus Fysh (Yeovil) (Con)

It is a pleasure to follow my right hon. Friend the Member for Chelmsford (Vicky Ford), who made a very powerful speech, and I completely agree with her about the importance of treating eating disorders as being of the same scale of harm as other things in the Bill.

I was the media analyst for Merrill Lynch about 22 years ago, and I made a speech about the future of media in which I mentioned the landscape changing towards one of self-generated media. However, I never thought we would get to where it is now and what the effect is. I was in the Pizza Express on Gloucester Road the other day at birthday party time, and an 11-year-old boy standing in the queue was doomscrolling TikTok videos rather than talking to his friends, which I just thought was a really tragic indication of where we have got to.

Digital platforms are also critical sources of information and our public discourse. Across the country, people gather up to 80% of information from such sources, but we should not have trust in them. Their algorithms, which promote and depromote, and their interfaces, which engage, are designed, as we have heard, to make people addicted to the peer validation and augmentation of particular points of view. They are driving people down tribal rabbit holes to the point where they cannot talk to each other or even listen to another point of view. It is no wonder that 50% of young people are unhappy or anxious when they use social media, and these algorithmic models are the problem. Trust in these platforms is wrong: their promotion or depromotion of messages and ideas is opaque, often subjective and subject to inappropriate influence.

It is right that we tackle illegal activity and that harms to children and the vulnerable are addressed, and I support the attempt to do that in the Bill. Those responsible for the big platforms must be held to account for how they operate them, but trusting in those platforms is wrong, and I worry that compliance with their terms of service might become a tick-box absolution of their responsibility for unhappiness, anxiety and harm.

What about harm to our public sphere, our discourse, and our processes of debate, policymaking and science? To trust the platforms in all that would be wrong. We know they have enabled censorship. Elon Musk’s release of the Twitter files has shown incontrovertibly that the big digital platforms actively censor people and ideas, and not always according to reasonable moderation. They censor people according to their company biases, by political request, or with and on behalf of the three-letter Government agencies. They censor them at the behest of private companies, or to control information on their products and the public policy debate around them. Censorship itself creates mistrust in our discourse. To trust the big platforms always to do the right thing is wrong. It is not right that they should be able to hide behind their terms of service, bury issues in the Ofcom processes in the Bill, or potentially pay lip service to a tick-box exercise of merely “having regard” to the importance of freedom of expression. They might think they can just write a report, hire a few overseers, and then get away scot-free with their cynical accumulation, and the sale of the data of their addicted users and the manipulation of their views.

The Government have rightly acknowledged that addressing such issues of online safety is a work in progress, but we must not think that the big platforms are that interested in helping. They and their misery models are the problem. I hope that the Government, and those in the other place, will include in the Bill stronger duties to stop things that are harmful, to promote freedom of expression properly, to ensure that people have ready and full access to the full range of ideas and opinions, and to be fully transparent in public and real time about the way that content is promoted or depromoted on their platforms. Just to trust in them is insufficient. I am afraid the precedent has been set that digital platforms can be used to censor ideas. That is not the future; that is happening right now, and when artificial intelligence comes, it will get even worse. I trust that my colleagues on the Front Bench and in the other place will work hard to improve the Bill as I know it can be improved.

Rachel Maclean (Redditch) (Con)

I strongly support the Bill. This landmark piece of legislation promises to put the UK at the front of the pack, and I am proud to see it there. We must tackle online abuse while protecting free speech, and I believe the Bill gets that balance right. I was pleased to serve on the Bill Committee in the last Session, and I am delighted to see it returning to the Chamber. The quicker it can get on to the statute book, the more children we can protect from devastating harm.

I particularly welcome the strengthened protections for children, which require platforms to clearly articulate in their terms of service what they are doing to enforce age requirements on their site. That will go some way to reassuring parents that their children’s developing brains will not be harmed by early exposure to toxic, degrading, and demeaning extreme forms of pornography. Evidence is clear that early exposure over time warps young girls’ views of what is normal in a relationship, with the result that they struggle to form healthy equal relationships. For boys, that type of sexual activity is how they learn about sex, and it normalises abusive, non-consensual and violent acts. Boys grow up into men whose neural circuits become habituated to that type of imagery. They actually require it, regardless of the boundaries of consent that they learn about in their sex education classes—I know this is a difficult and troubling subject, but we must not be afraid to tackle it, which is what we are doing with the Bill. It is well established that the rise of that type of pornography on the internet over time has driven the troubling and pernicious rise in violence against women and girls, perpetrated by men, as well as peer-on-peer child sexual abuse and exploitation.

During Committee we had a good debate about the need for greater criminal sanctions to hold directors individually to account and drive a more effective safety culture in the boardroom. I am proud to serve in the Chamber with my hon. Friends the Members for Stone (Sir William Cash) and for Penistone and Stocksbridge (Miriam Cates). I have heard about all their work on new clause 2 and commend them heartily for it. I listened carefully to the Minister’s remarks in Committee and thank him and the Secretary of State for their detailed engagement.

19:44
I really welcome the plans to introduce measures to strengthen individual criminal liability for directors of tech companies. The concerns raised by some that that will deter investment in the UK or result in a tech exodus are absolute nonsense. That has not taken place in Ireland, which still hosts many leading industry headquarters. I am a former tech entrepreneur, part of the founding team of one of the UK’s largest software publishers, and I assure the House that this forward-leaning legislation and regulation that requires innovation to solve compliance and user problems is exactly what drives the engineers to do what they do best: to solve problems and develop solutions, in the interests of their customers, which are valuable across a host of industries and sectors.
The new measures will cement the UK’s role as a world leader in this space and underpin our ability to continue to play a leading role in the software industry. Where we lead, others will follow. They will also give our thought leaders the opportunity to develop bespoke solutions as well as ensure that children’s ages are verified robustly and that disgusting child sex abuse material is removed and does not proliferate.
On violence against women and girls, sensible and workable plans have been set out to make coercive and controlling behaviour a priority offence and to make platforms take that stuff down without women and girls having to contact them every single time. I welcome the work on creating codes of practice.
Considering the challenges that surround this area, the Bill does a really good job of protecting and upholding the freedom of speech that we hold dear in our democracy. As a feminist, I need to be able to express my view, protected under the Equality Act, that biological sex is immutable. I should not be hounded off the internet or threatened with violence for stating that view. At the same time, we should all seek to support and improve the experiences of transgender people. We can do both at the same time. We must have a nuanced, balanced and compassionate debate.
We are in an era where our discussion forums have become polarised. We are crossing new frontiers but we cannot accept the status quo. Our democracy depends on this.
Dean Russell (Watford) (Con)

I rise to talk broadly about new clause 2, which I am pleased that the Government are engaging on. My right hon. and hon. Friends have done incredible work to make that happen. I share their elation. As—I think—the only Member who was on the Joint Committee under the fantastic Chair, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), and on both Committees, I have seen the Bill’s passage over the past year or so and been happy with how the Government have engaged with it. That includes on Zach’s law, which will ensure that trolls cannot send flashing images to people with epilepsy. I shared my colleagues’ elation with my hon. Friend the Member for Stourbridge (Suzanne Webb) when we were successful in convincing the Government to make that happen.

May I reiterate the learnings from the Joint Committee and from the Committee earlier last year? When we took evidence from the tech giants—they are giants—it was clear that, as giants do, they could not see the damage underfoot and the harm that they were doing because they are so big. They were also blind to the damage they were doing because they chose not to see it. I remember challenging a witness from one of the big tech giants about whether they had followed the Committee on the harms that they were causing to vulnerable children and adults. I was fascinated by how the witnesses just did not care. Their responses were, “Well, we are doing enough already. We are already trying. We are putting billions of pounds into supporting people who are being harmed.” They did not see the reality on the ground of young people being damaged.

When I interviewed my namesake, Ian Russell, I was heartbroken because we had children of a similar age. I just could not imagine having the conversations he must have had with his family and friends throughout that terrible tragedy.

Sir William Cash

Is my hon. Friend aware that Ian Russell has pointed out that 26% of young people who present at hospital with self-harm and suicide attempts have accessed such predatory, irresponsible and wilful online content?

Dean Russell

My hon. Friend is absolutely right. One of the real horrors is that, as I understand it, Facebook was not going to release—I do not want to break any rules here—the content that his daughter had been viewing, to help with the process of healing.

If I may, I want to touch on another point that has not been raised today, which is the role of a future Committee. I appreciate that is not part of the Bill, but I feel strongly that this House should have a separate new Committee for the Online Safety Bill. The internet and the world of social media is changing dramatically. The metaverse is approaching very rapidly, and we are seeing the rise of virtual reality and augmented reality. Artificial intelligence is even changing the way we believe what we see online and at a rate that we cannot imagine. I have a few predictions. I anticipate that in the next few years we will probably have the first No. 1 book and song written by AI. We can now hear online fake voices and impersonations of people by AI. We will have songs and so on created in ways that fool us and fool children even more. I have no doubt that in the coming months and years we will see the rise of children suing their parents for sharing content of them when they were younger without permission. We will see a changing dynamic in the way that young people engage with new content and what they anticipate from it.

Sir John Hayes

My hon. Friend is making a valuable contribution to the debate, as I expected he would having discussed it with him from the very beginning. What he describes is not only the combination of heartlessness and carelessness on the part of the tech companies, but the curious marriage of an anarchic future coupled with the tyranny of their control of that future. He is absolutely right that if we are to do anything about that in this place, we need an ongoing role for a Committee of the kind he recommends.

Dean Russell

I thank my right hon. Friend for those comments. I will wrap up shortly, Mr Deputy Speaker. On that point, I have said before that the use of algorithms on platforms is in my mind very similar to addictive drugs: they get people addicted and get them to change their behaviours. They get them to cut off from their friends and family, and then they direct them in ways that we would not allow if we could wrap our arms around them and stop it. But they are doing that in their own bedrooms, classrooms and playgrounds.

I applaud the work on the Bill. Yes, there are ways it could be improved and a committee that looks at ways to improve it as the dynamics of social media change will be essential. However, letting the Bill go to the other place will be a major shift forwards in protecting our young people both now and in the future.

Mr Deputy Speaker (Sir Roger Gale)

Thank you for your patience, Siobhan Baillie.

Siobhan Baillie (Stroud) (Con)

Thank you, Mr Deputy Speaker.

I rise to speak to amendments 52 and 53. As you know, Mr Deputy Speaker, I have been campaigning to tackle anonymous abuse for many years now. I have been working with the fantastic Clean Up The Internet organisation, Stroud residents and the brilliant Department for Digital, Culture, Media and Sport team. We have been focused on practical measures that will empower social media users to protect themselves from anonymous abuse. I am pleased to say that the Government accepted our campaign proposals to introduce verification options. They give people the option to be followed and to follow only verified accounts if that is what they choose, and to ensure that they know who is and who is not verified. That will also assist in ensuring that the positive parts of anonymity can continue online, as there are many. I respectfully think that that work is even more important now that we have seen the removal of the “legal but harmful” clauses, because we know what will be viewed by children and vulnerable adults who want to be protected online.

We are not resting on that campaign win, however. We want to see the verification measures really work in the real world and for social media companies to adopt them quickly without any confusion about their duties. Separately, clarity is the order of the day, because the regulator Ofcom is going to have an awful lot to do thanks to the excellent clauses throughout the legislation.

This issue is urgent. We must not forget that anonymous social media accounts are spewing out hateful bile every single minute of the day. Children and vulnerable adults are left terrified: it is much more scary for them to receive comments about suicide, self-harm and bullying, and from anorexia pushers, when they do not know who those people are.

Financial scammers tend to hide behind anonymity. Faceless bots cause mayhem and start nasty pile-ons. Perverts know that when they send a cyber-flashing dick pic to an unsuspecting woman, it is very unlikely, if it comes from an anonymous account, that it will be traced back to them. It is really powerful and important for people to have the tools to not see unverified nonsense or abuse, to be able to switch that off and to know that the people they follow are real.

I am keen for the Minister and the Government to adopt amendments 52 and 53. They are by no means the most sexy and jazzy amendments before the House; they are more tweaks than amendments. They would change the wording to bring the legislation up to date in the light of recent changes. They would also ensure that it is obvious if people are verified—blue ticks are a really good example of that—which was part of my campaign in the first place. I understand from discussions that the Government are considering adopting my amendments. I thank colleagues for calling them sensible and backing them. They are really important.

Finally, I have made the case many times that the public expect us to act and to be strong in this policy area, but they also expect things to happen very quickly. We have waited a very long time. It is incredibly important to give people the power and tools to protect themselves, whether by sliding a button or switching something off. My great hope from the campaigning that I have done is that young people and adults will follow unverified accounts only through an active choice.

Kirsty Blackman

On that specific point, does the hon. Lady realise that the empowerment duties in respect of verified and non-verified users apply only to adult users? Children will not have the option to toggle off unverified users, because the user empowerment duties do not allow that to happen.

Siobhan Baillie

The evidence we have received is that it is parents who need the powers. I want to normalise the ability to turn off anonymised accounts. I think we will see children do that very naturally. We should also try to persuade their parents to take those stances and to have those conversations in the home. I obviously need to take up the matter with the hon. Lady and think carefully about it as matters proceed through the other place.

We know that parents are very scared about what their children see online. I welcome what the Minister is trying to do with the Bill and I welcome the legislation and the openness to change it. These days, we are all called rebels whenever we do anything to improve legislation, but the reality is that that is our job. We are sending this legislation to the other House in a better shape.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)

There is a lot to cover in the short time I have, but first let me thank Members for their contributions to the debate. We had great contributions from the hon. Member for Pontypridd (Alex Davies-Jones), my right hon. Friend the Member for Witham (Priti Patel) and the right hon. Member for Barking (Dame Margaret Hodge)—I have to put that right, having not mentioned her last time—as well as from my hon. Friend the Member for Gosport (Dame Caroline Dinenage); the hon. Member for Aberdeen North (Kirsty Blackman); the former Secretary of State, my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright); and the hon. Members for Plymouth, Sutton and Devonport (Luke Pollard), for Reading East (Matt Rodda) and for Leeds East (Richard Burgon).

I would happily meet the hon. Member for Plymouth, Sutton and Devonport to talk about incel content, as he requested, and the hon. Members for Reading East and for Leeds East to talk about Olly Stephens and Joe Nihill. Those are two really tragic examples and it was good to hear the tributes to them and their being mentioned in this place in respect of the changes in the legislation.

We had great contributions from my right hon. Friend the Member for South Northamptonshire (Dame Andrea Leadsom), the hon. Member for Strangford (Jim Shannon) and my hon. Friend the Member for Dover (Mrs Elphicke). I am glad that my hon. Friend the Member for Stone (Sir William Cash) gave a three-Weetabix speech—I will have to look in the Tea Room for the Weetabix he has been eating.

There were great contributions from my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Great Grimsby (Lia Nici), from my right hon. Friend the Member for Chelmsford (Vicky Ford) and from my hon. Friend the Member for Yeovil (Mr Fysh). The latter talked about doom-scrolling; I recommend that he speaks to my right hon. Friend the Member for South Holland and The Deepings (Sir John Hayes), whose quoting of G. K. Chesterton shows the advantages of reading books rather than scrolling through a phone. I also thank my hon. Friends the Members for Redditch (Rachel Maclean), for Watford (Dean Russell) and for Stroud (Siobhan Baillie).

I am also grateful for the contributions during the recommittal process. The changes made to the Bill during that process have strengthened the protections that it can offer.

We reviewed new clause 2 carefully, and I am sympathetic to its aims. We have demonstrated our commitment to strengthening protections for children elsewhere in the Bill by tabling a series of amendments at previous stages, and the Bill already includes provisions to make senior managers liable for failing to prevent a provider from committing an offence and for failing to comply with information notices. We are committed to ensuring that children are safe online, so we will work with those Members and others to bring to the other place an effective amendment that delivers our shared aims of holding people accountable for their actions in a way that is effective and targeted at child safety, while ensuring that the UK remains an attractive place for technology companies to invest and grow.

We need to take time to get this right. We intend to base our amendments on the Irish Online Safety and Media Regulation Act 2022, which, ironically, was largely based on our work here, and which introduces individual criminal liability for failure to comply with the notice to end contravention. In line with that approach, the final Government amendment, at the end of the ping-pong between the other place and this place, will be carefully designed to capture instances in which senior managers, or those purporting to act in that capacity, have consented or connived in ignoring enforceable requirements, risking serious harm to children. The criminal penalties, including imprisonment or fines, will be commensurate with those applying to similar offences. While the amendment will not affect those who have acted in good faith to comply in a proportionate way, it will give the Act additional teeth—as we have heard—to deliver the change that we all want, and ensure that people are held to account if they fail to protect children properly.

As was made clear by my right hon. Friend the Member for Witham, child protection and strong implementation are at the heart of the Bill. Its strongest protections are for children, and companies will be held accountable for their safety. I cannot guarantee the timings for which my right hon. Friend asked, but we will not dilute our commitment. We have already started to speak to companies in this sphere, and I will also continue to work with her and others.

Sajid Javid (Bromsgrove) (Con)

My hon. Friend has rightly prioritised the protection of children. He will recall that throughout the debate, a number of Members have asked the Government to consider the amendment that will be tabled by Baroness Kidron, which will require coroners to have access to data in cases in which the tragic death of a child may be related to social media and other online activities. Is my hon. Friend able to give a commitment from the Dispatch Box that the Government will look favourably on that amendment?

Paul Scully

Coroners already have some powers in this area, but we are aware of instances raised by my right hon. Friend and others in which that has not been the case. We will happily work with Baroness Kidron, and others, and look favourably on changes where they are necessary.

Debbie Abrahams

I entirely agree that our focus has been on protecting children, but is the Minister as concerned as I am about the information and misinformation, and about the societal impacts on our democracy, not just in this country but elsewhere? The hon. Member for Watford suggested a Committee that could monitor such impacts. Is that something the Minister will reconsider?

Paul Scully

For the purpose of future-proofing, we have tried to make the Bill as flexible and as technologically neutral as possible so that it can adapt to changes. I think we will need to review it, and indeed I am sure that, as technology changes, we will come back with new legislation in the future to ensure that we continue to be world-beating—but let us see where we end up with that.

Damian Collins

May I follow up my hon. Friend’s response to our right hon. Friend the Member for Bromsgrove (Sajid Javid)? If it is the case that coroners cannot access data and information that they need in order to go about their duties—which was the frustrating element in the Molly Russell case—will the Government be prepared to close that loophole in the House of Lords?

Paul Scully

We will certainly work with others to address that, and if there is a loophole, we will seek to act, because we want to ensure—

Paul Scully

I will not give way for the moment. Oh, actually I will.

Priti Patel

I am grateful to the Minister for giving way. He was commenting on my earlier remarks about new clause 2 and the specifics around a timetable. I completely recognise that much of this work is under development. In my remarks, I asked for a timetable on engagement with the tech firms as well as transparency to this House on the progress being made on developing the regulations around criminal liability. It is important that this House sees that, and that we follow every single stage of that process.

Paul Scully

I thank my right hon. Friend for that intervention. We want to have as many conversations as possible in this area with Members on all sides, and I hope we can be as transparent as possible in that operation. We have already started the conversation. The Secretary of State and I met some of the big tech companies just yesterday to talk about exactly this area.

My hon. Friend the Member for Dover, my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead (Mrs May) and others are absolutely right to highlight concerns about illegal small boat crossings and the harm that can be caused to people crossing in dangerous situations. The use of highly dangerous methods to enter this country, including unseaworthy, small or overcrowded boats and refrigerated lorries, presents a huge challenge to us all. Like other forms of serious and organised crime, organised immigration crime endangers lives, has a corrosive effect on society, puts pressure on border security resources and diverts money from our economy.

As the Prime Minister has said, stopping these crossings is one of the Government’s top priorities for the next year. The situation needs to be resolved and we will not hesitate to take action wherever that can have the most effect, including through this Bill. Organised crime groups continue to facilitate most migrant journeys to the UK and have no respect for human life, exploiting vulnerable migrants, treating them as commodities and knowingly putting people in life-threatening situations. Organised crime gangs are increasingly using social media to facilitate migrant crossings and we need to do more to prevent and disrupt the crimes facilitated through these platforms. We need to share best practice, improve our detection methods and take steps to close illegal crossing routes as the behaviour and methods of organised crime groups evolve.

However, amendment 82 risks having unforeseen consequences for the Bill. It could bring into question the meaning of the term “content” elsewhere in the Bill, with unpredictable implications for how the courts and companies would interpret it. Following constructive discussions with my hon. Friend the Member for Dover and my right hon. Friend the Member for Maidenhead, I can now confirm that in order to better tackle illegal immigration encouraged by organised gangs, the Government will add section 2 of the Modern Slavery Act 2015 to the list of priority offences. Section 2 makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation.

We will also add section 24 of the Immigration Act to the priority offences list in schedule 7. Although the offences in section 24 cannot be carried out online, paragraph 33 of the schedule states that priority illegal content includes the inchoate offences relating to the offences listed. Therefore, aiding, abetting, counselling and conspiring in those offences by posting videos of people crossing the channel that show the activity in a positive light could be an offence that is committed online and therefore fall within what is priority illegal content. The result of this amendment would therefore be that platforms would have to remove that content proactively. I am grateful to my hon. Friend the Member for Dover and my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead for raising this important issue, and I would be happy to offer them a meeting with my officials to discuss the drafting of this amendment ahead of it being tabled in the other place.

We recognise the strength of feeling on the issue of harmful conversion practices and remain committed to protecting people from these practices and making sure that they can live their lives free from the threat of harm or abuse. We have had constructive engagement with my hon. Friend the Member for Rutland and Melton (Alicia Kearns) on her amendment 84, which seeks to prevent children from seeing harmful online content on conversion practices. It is right that this issue is tackled through a dedicated and tailored legislative approach, which is why we are announcing today that the Government will publish a draft Bill to set out a proposed approach to banning conversion practices. This will apply to England and Wales. The Bill will protect everybody, including those targeted on the basis of their sexuality or being transgender. The Government will publish the Bill shortly and will ask for pre-legislative scrutiny by a Joint Committee in this parliamentary Session.

This is a complex area, and pre-legislative scrutiny exists to help ensure that any Bill introduced to Parliament does not cause unintended consequences. It will also ensure that the Bill benefits from stakeholder expertise and input from parliamentarians. The legislation must not, through a lack of clarity, harm the growing number of children and young adults experiencing gender-related distress by inadvertently criminalising or chilling legitimate conversations that parents or clinicians may have with children. This is an important issue, and it needs the targeted and robust approach that a dedicated Bill would provide.

Crispin Blunt (Reigate) (Con)

Will the Minister give way?

Paul Scully

I am afraid I have only three minutes, so I am not able to give way.

The Government cannot accept the Labour amendments that would re-add the adult safety duties and the concept of content that is harmful to adults. These duties and the definition of harmful content were removed from the Bill in Committee to protect free speech and to ensure that the Bill does not incentivise tech companies to censor legal content. It is not appropriate for the Government to decide whether legal content is harmful to adult users, and then to require companies to risk assess and set terms for such content. Many stakeholders and parliamentarians are justifiably concerned about the consequences of doing so, and I share those concerns. However, the Government recognise the importance of giving users the tools and information they need to keep themselves safe online, which is why we have introduced to the Bill a fairer, simpler approach for adults—the triple shield.

Members have talked a little about user empowerment. I will not have time to cover all of that, but the Government believe we have struck the right balance of empowering adult users on the content they see and engage with online while upholding the right to free expression. For those reasons, I am not able to accept these amendments, and I hope the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson) will not press them to a vote.

The Government amendments are consequential on removing the “legal but harmful” sections, which were debated extensively in Committee.

The Government recognise the concern of my hon. Friend the Member for Stroud about anonymous online abuse, and I applaud her important campaigning in this area. We expect Ofcom to recommend effective tools for compliance, with the requirement that these tools can be applied by users who wish to filter out non-verified users. I agree that the issue covered by amendment 52 is important, and I am happy to continue working with her to deliver her objectives in this area.

My right hon. Friend the Member for Chelmsford spoke powerfully, and we take the issue incredibly seriously. We are committed to introducing a new communications offence of intentional encouragement and assistance of self-harm, which will apply whether the victim is a child or an adult.

Vicky Ford

Will my hon. Friend give way?

Paul Scully

I do not have time, but I thank all Members who contributed to today’s debate. I pay tribute to my officials and to all the Ministers who have worked on this Bill over such a long time.

Alex Davies-Jones

I beg to ask leave to withdraw the clause.

Clause, by leave, withdrawn.

20:13
Debate interrupted (Programme Order, 5 December, and Standing Order No. 24(7)),
The Deputy Speaker put forthwith the Questions necessary for the disposal of the business to be concluded at that time (Standing Order No. 83E).
New Clause 4
Safety duties protecting adults and society: minimum standards for terms of service
“(1) OFCOM may set minimum standards for the provisions included in a provider’s terms of service as far as they relate to the duties under sections 11, [Harm to adults and society risk assessment duties], [Safety duties protecting adults and society], 12, 16 to 19 and 28 of this Act (“relevant duties”).
(2) Where a provider does not meet the minimum standards, OFCOM may direct the provider to amend its terms of service in order to ensure that the standards are met.
(3) OFCOM must, at least once a year, conduct a review of—
(a) the extent to which providers are meeting the minimum standards, and
(b) how the providers’ terms of service are enabling them to fulfil the relevant duties.
(4) The report must assess whether any provider has made changes to its terms of service that might affect the way it fulfils a relevant duty.
(5) OFCOM must lay a report on the first review before both Houses of Parliament within one year of this Act being passed.
(6) OFCOM must lay a report on each subsequent review at least once a year thereafter.”—(Alex Davies-Jones.)
Brought up.
Question put, That the clause be added to the Bill.
20:13

Division 144

Ayes: 242

Noes: 310

Clause 5
Overview of Part 3
Amendment made: 1, in clause 5, page 4, leave out lines 41 and 42.—(Paul Scully.)
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Clause 6
Providers of user-to-user services: duties of care
Amendments made: 2, in clause 6, page 5, line 15, leave out “, (3) and (4)” and insert “and (3)”.
This amendment removes a reference to what was subsection (4) of clause 18, as that provision has been moved to clause 65.
Amendment 3, in clause 6, page 5, line 26, leave out paragraphs (a) and (b).—(Paul Scully.)
This amendment is consequential on the removal of clauses 12 and 13 of the Bill as amended on Report.
Clause 10
Children’s risk assessment duties
Amendment made: 4, in clause 10, page 8, line 38, leave out from “8” to “)” in line 40.—(Paul Scully.)
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Clause 12
User empowerment duties
Amendment proposed: 43, in clause 12, page 12, line 24, leave out “made available to” and insert
“in operation by default for”.—(Kirsty Blackman.)
Question put, That the amendment be made.
20:26

Division 145

Ayes: 237

Noes: 316

Clause 36
Codes of practice about duties
Amendment made: 5, page 38, line 6, leave out paragraph (c).—(Paul Scully.)
This amendment is consequential on the removal of clause 13 of the Bill as amended on Report.
Clause 46
Duties and the first codes of practice
Amendment made: 6, page 45, line 23, leave out paragraph (c).—(Paul Scully.)
This amendment is consequential on the removal of clause 13 of the Bill as amended on Report.
Clause 56
Regulations under sections 54 and 55: OFCOM’s review and report
Amendments made: 7, page 54, line 11, leave out ‘or 55’.
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Amendment 8, page 54, line 15, leave out sub-paragraph (ii).
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Amendment 9, page 54, line 18, leave out ‘individuals’ and insert ‘children’.—(Paul Scully.)
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Clause 89
OFCOM’s register of risks, and risk profiles, of Part 3 services
Amendments made: 10, page 79, line 14, leave out paragraph (d).
This amendment is consequential on the removal of the adult safety duties.
Amendment 11, page 79, line 31, leave out subsection (6).
This amendment is consequential on the removal of the adult safety duties.
Amendment 12, page 80, line 5, leave out ‘or (d)’.
This amendment is consequential on the removal of the adult safety duties.
Amendment 13, page 80, leave out lines 15 and 16.
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Amendment 14, page 80, leave out lines 20 and 21.—(Paul Scully.)
This amendment is consequential on the removal of the adult safety duties.
Clause 90
OFCOM’s guidance about risk assessments
Amendments made: 15, page 80, line 36, leave out subsection (4).
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 16, page 81, leave out lines 13 and 14.—(Paul Scully.)
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Clause 197
Parliamentary procedure for regulations
Amendment made: 17, page 162, line 26, leave out paragraph (b).—(Paul Scully.)
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Clause 205
“Harm” etc
Amendments made: 18, page 169, line 35, leave out ‘or adults’.
This amendment is consequential on the removal of the adult safety duties.
Amendment 19, page 169, line 35, leave out ‘or adults (as the case may be)’.—(Paul Scully.)
This amendment is consequential on the removal of the adult safety duties.
Clause 208
Index of defined terms
Amendments made: 20, page 173, leave out line 16.
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Amendment 21, page 174, leave out lines 6 and 7.
This amendment removes the reference in the index to the “maximum summary term for either-way offences”, as that term no longer appears in the Bill.
Amendment 22, page 174, leave out lines 24 and 25.
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Amendment 23, page 175, line 13, at end insert—
“restricting users’ access to content (in Part 3)
section 52”.—(Paul Scully.)
This amendment adds a definition of “restricting users’ access to content” to the index of defined terms.
Schedule 3
Timing of providers’ assessments
Amendments made: 24, page 189, line 37, leave out paragraph 6.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 25, page 191, line 10, leave out ‘to 14’ and insert ‘and 13’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 26, page 191, line 18, leave out sub-paragraph (3).
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 27, page 191, line 22, leave out ‘to 14’ and insert ‘and 13’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 28, page 191, line 39, leave out paragraph 14.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 29, page 192, line 14, leave out ‘or paragraph 6’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 30, page 192, line 15, leave out
‘, CAA or adults’ risk assessment’
and insert ‘or CAA’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 31, page 192, line 19, leave out ‘, 17 or 18’ and insert ‘or 17’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 32, page 192, line 21, leave out ‘and paragraph 6 apply’ and insert ‘applies’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 33, page 192, line 41, leave out paragraph 18.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 34, page 193, line 10, leave out paragraph (b).
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 35, page 193, line 13, leave out
‘, a CAA or an adults’ risk assessment’
and insert ‘or a CAA’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 36, page 193, line 25, leave out
‘, a CAA or an adults’ risk assessment’
and insert ‘or a CAA’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 37, page 193, line 27, leave out ‘or paragraph 6.’
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 38, page 193, line 39, leave out paragraph (c).
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 39, page 193, line 41, leave out
‘, CAA or adults’ risk assessment’
and insert ‘or CAA’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 40, page 193, line 43, leave out ‘or paragraph 6’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 41, page 194, leave out lines 17 and 18.—(Paul Scully.)
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Schedule 4
Codes of practice under section 36: principles, objectives, content
Amendment made: 42, page 198, line 19, leave out paragraph (c).—(Paul Scully.)
This amendment is consequential on the removal of clause 13 of the Bill as amended on Report.
Third Reading
20:37
The Secretary of State for Digital, Culture, Media and Sport (Michelle Donelan)

I beg to move, That the Bill be now read the Third time.

It has been a long road to get here, and it has required a huge team effort that has included Members from across the House, the Joint Committee, Public Bill Committees, the Ministers who worked on this over the years in the Department for Digital, Culture, Media and Sport and my predecessors as Secretaries of State. Together, we have had some robust and forthright debates, and it is thanks to Members’ determination, expertise and genuine passion on this issue that we have been able to get to this point today. Our differences of opinion across the House have been dwarfed by the fact that we are united in one single goal: protecting children online.

I have been clear since becoming Secretary of State that protecting children is the very reason that this Bill exists, and the safety of every child up and down the UK has driven this legislation from the start. After years of inaction, we want to hold social media companies to account and make sure that they are keeping their promises to their own users and to parents. No Bill in the world has gone as far as this one to protect children online. Since this legislation was introduced last year, the Government have gone even further and made a number of changes to enhance and broaden the protections in the Bill while also securing legal free speech. If something should be illegal, we should have the courage of our convictions to make it illegal, rather than creating a quasi-legal category. That is why my predecessor’s change that will render epilepsy trolling illegal is so important, and why I was determined to ensure that the promotion of self-harm, cyber-flashing and intimate image abuse are also made illegal once and for all in this Bill.

Vicky Ford

Will my right hon. Friend make it clear, when the Bill gets to the other place, that content that glamorises eating disorders will be treated as seriously as content glamorising other forms of self-harm?

Michelle Donelan

I met my right hon. Friend today to discuss that very point, which is particularly important and powerful. I look forward to continuing to work with her and the Ministry of Justice as we progress this Bill through the other place.

The changes are balanced with new protections for free speech and journalism—two of the core pillars of our democratic society. There are amendments to the definition of recognised news publishers to ensure that sanctioned outlets such as RT must not benefit.

Since becoming Secretary of State I have made a number of my own changes to the Bill. First and foremost, we have gone even further to boost protections for children. Social media companies will face a new duty on age limits so they can no longer turn a blind eye to the estimated 1.6 million underage children who currently use their sites. The largest platforms will also have to publish summaries of their risk assessments for illegal content and material that is harmful for children—finally putting transparency for parents into law.

I believe it is blindingly obvious and morally right that we should have a higher bar of protection when it comes to children. Things such as cyber-bullying, pornography and posts that depict violence do enormous damage. They scar our children and rob them of their right to a childhood. These measures are all reinforced by children and parents, who are given a real voice in the legislation by the inclusion of the Children’s Commissioner as a statutory consultee. The Bill already included provisions to make senior managers liable for failure to comply with information notices, but we have now gone further. Senior managers who deliberately fail children will face criminal liability. Today, we are drawing our line in the sand and declaring that the UK will be the world’s first country to comprehensively protect children online.

Those changes are completely separate to the changes I have made for adults. Many Members and stakeholders had concerns over the “legal but harmful” section of the Bill. They were concerned that it would be a serious threat to legal free speech and would set up a quasi-legal grey area where tech companies would be encouraged to take down content that is perfectly legal to say on our streets. I shared those concerns, so we have removed “legal but harmful” for adults. We have replaced it with a much simpler and fairer and, crucially, much more effective mechanism that gives adults a triple shield of protection. If it is illegal, it has to go. If it is banned under the company’s terms and conditions, it has to go.

Lastly, social media companies will now offer adults a range of tools to give them more control over what they see and interact with on their own feeds.

Damian Collins

My right hon. Friend makes an important point about things that are illegal offline but legal online. The Bill has still not defined a lot of content that could be illegal and yet promoted through advertising. As part of their ongoing work on the Bill and the online advertising review, will the Government establish the general principle that content that is illegal will be regulated whether it is an ad or a post?

Michelle Donelan

I completely agree with my hon. Friend on the importance of this topic. That is exactly why we have the online advertising review, a piece of work we will be progressing to tackle the nub of the problem he identifies. We are protecting free speech while putting adults in the driving seat of their own online experience. The result is today’s Bill.

I thank hon. Members for their hard work on this Bill, including my predecessors, especially my right hon. Friend the Member for Mid Bedfordshire (Ms Dorries). I thank all those I have worked with constructively on amendments, including my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates), for Stone (Sir William Cash), for Dover (Mrs Elphicke), for Rutland and Melton (Alicia Kearns), and my right hon. Friends the Members for South Holland and The Deepings (Sir John Hayes), for Chelmsford (Vicky Ford), for Basingstoke (Dame Maria Miller) and for Romsey and Southampton North (Caroline Nokes).

I would like to put on record my gratitude for the hard work of my incredibly dedicated officials—in particular, Sarah Connolly, Orla MacRae and Emma Hindley, along with a number of others; I cannot name them all today, but I note their tremendous and relentless work on the Bill. Crucially, I thank the charities and devoted campaigners, such as Ian Russell, who have guided us and pushed the Bill forward in the face of their own tragic loss. Thanks to all those people, we now have a Bill that works.

Legislating online was never going to be easy, but it is necessary. It is necessary if we want to protect our values —the values that we protect in the real world every single day. In fact, the NSPCC called this Bill “a national priority”. The Children’s Commissioner called it

“a once-in-a-lifetime opportunity to protect all children”.

But it is not just children’s organisations that are watching. Every parent across the country will know at first hand just how difficult it is to shield their children from inappropriate material when social media giants consistently put profit above children’s safety. This legislation finally puts it right.

20:46
Lucy Powell (Manchester Central) (Lab/Co-op)

I am relieved to finally speak on Third Reading of this important Bill. We have had a few false dawns along the way, but we are almost there. The Bill has seen parliamentary dramas, arcane procedures and a revolving door of Ministers. Every passing week throws up another example of why stronger online regulation is urgently needed, from the vile Andrew Tate and the damning Molly Russell inquest to threats to democracy and, most recently, Elon Musk’s takeover of Twitter and ripping up of its rules.

The power of the broadcast media in the past was that it reached into everybody’s living rooms. Today, in the digital age, social media is in every room in our home, in every workplace, in every school, at every event and, with the rise of virtual reality, also in our heads. It is hard to escape. What began as ideas on student campuses to join up networks of old friends are now multibillion-pound businesses that attract global advertising budgets and hold hugely valuable data and information on every aspect of our lives.

In the digital age, social media is a central influence on what we buy, often on what we think, how we interact and how we behave. The power and the money at stake are enormous, yet the responsibilities are minimal and the accountability non-existent. The need to constantly drive engagement and growth has brought with it real and actual harms to individuals, democracy, our economy, society and public health, with abusers and predators finding a new profitable home online. These harms are driven by business models and engagement algorithms that actively promote harmful content. The impact on children and young people can be particularly acute, even life-threatening.

It is for those reasons and others that, as a country and on a cross-party basis, we embarked many years ago on bringing communications from the analogue era into the digital age. Since the Bill was first mooted, we have had multiple Select Committee reports, a Joint Committee and even two Public Bill Committees. During that time, the pace of change has continued. Nobody had even heard of TikTok when we first discussed the Bill. Today, it is one of the main ways that young people get their news. It is a stark reminder of just how slow-moving Government legislation is and how we will probably need to return to these issues once again very soon—I am sorry to break that to everybody—but we have got there for now. We will at least establish a regulator with some tough powers, albeit with a much narrower scope than was originally conceived.

Sir George Howarth (Knowsley) (Lab)

I warmly endorse what my hon. Friend is saying. Does she agree with the right hon. Member for Chelmsford (Vicky Ford), who intervened on the Secretary of State, that further work is needed to prevent platforms from promoting different forms of eating disorders?

Lucy Powell

I absolutely endorse those comments and I will come on to that briefly.

We never thought that the Online Safety Bill was perfect and we have been trying to work with the Government to improve it at every stage. Some of that has paid off and I put on record my thanks to my hon. Friend the Member for Pontypridd (Alex Davies-Jones) for her truly brilliant work, which has been ably supported by my hon. Friend the Member for Worsley and Eccles South (Barbara Keeley). I thank the various Ministers for listening to our proposals on scam ads, epilepsy trolling and dealing with small but high harm platforms, and I thank the various Secretaries of State for their constructive approaches. Most of all, I, too, thank the campaigners, charities and families who have been most affected by the Bill.

I welcome today’s last-minute concessions. We have been calling for criminal liability from the start as a means to drive culture change, and we look forward to seeing the detail of the measure when it is tabled in the other place. I also welcome that the Bill will finally outlaw conversion practices, including for trans people, and will take tougher action on people traffickers who advertise online.

On major aspects, however, the Government have moved in the wrong direction. They seem to have lost their mettle and watered down the Bill significantly by dumping whole swathes of it, including many of the harms that it was originally designed to deal with. There are still protections for children, albeit that age verification is difficult and many children pass themselves off as older online, but all the previous work on tackling wider harms has been dropped.

In failing to reconcile harms that are not individually illegal with the nature of powerful platforms that promote engagement and outcomes that are harmful, the Government have let the big tech companies off the hook and left us all more at risk. Online hate, disinformation, sensationalism, abuse, terrorism, racism, self-harm, eating disorders, incels, misogyny, antisemitism, and many other things, are now completely out of scope of the Bill and will continue to proliferate. That is a major loophole that massively falls short of the Bill’s original intention.

I hope that the other place will return to some of the core principles of the duty of care, giving the regulator wider powers to direct terms and conditions, and getting transparency and accountability for the engagement algorithms and economic business models that monetise misery, as Ian Russell described it. I am confident that the other place will consider those issues carefully, sensitively and intelligently. As I have said, if the Bill is not strengthened, it will fall to the next Labour Government to bring in further legislation. For now, I am pleased to finally be able to support the Online Safety Bill to pass its Third Reading.

20:52
Kirsty Blackman

It has taken a while to get to this point; there have been hours and hours of scrutiny and so much time has been spent by campaigners and external organisations. I have received more correspondence on this Bill from people who really know what they are talking about than on any other I have worked on during my time in the House. I specifically thank the NSPCC and the Mental Health Foundation, which have provided me with a lot of information and advice about the amendments that we have tabled.

The internet is wonderful, exciting and incredibly useful, but it is also harmful, damaging and scary. The Bill is about trying to make it less harmful, damaging and scary while allowing people to still experience the wonderful, exciting and useful parts of it. The SNP will not vote against the Bill on Third Reading, but it would be remiss of me not to mention the issues that we still have with it.

I am concerned that the Government keep saying “the Children’s Commissioner” when there are a number of Children’s Commissioners, and it is the Children’s Commissioner for England who has been added as a consultee, not the others. That is the decision that they have made, but they need to be clear when they are talking about it.

On protecting children, I am still concerned that in some areas the Bill is a little too social media-centric and does not necessarily take into account some of the ways that children generally interact with the internet, such as talking to their friends on Fortnite, talking to people they do not know on Fortnite and talking to people on Roblox. Services that are not social media, and that work differently, are not covered by this as well as I would like. I am also concerned that children have less ability to opt out of risky features—switching off private messaging and livestreaming, for example—than they have to switch off particular types of content.

Lastly, on the changes that have been made, I do not know what people want to say that they felt they could not say as a result of the previous version of the Bill. I do not know why the Government feel it is okay to say, “Right, we’re concerned about ‘legal but harmful’, because we want people to be able to promote eating disorder content or because we want people to be able to promote self-harm content.” I am sure they do not—I am sure no Ministers want people to be able to promote that—so why have they made this change? Not one person has been able to tell me what they believe they would not be able to say under the previous iteration of the Bill. I have not had one person be able to say that. Ministers can just say “free speech” however much they like, but it does not mean anything if they cannot provide examples of what exactly it is that they believe somebody should be able to say that they could not under the previous iteration of the Bill.

I am glad that we have a Bill and I am glad to hear that a future Labour Government might change the legislation to make it better. I hope this will provide a safer environment for children online, and I hope we can get the Bill implemented as soon as it is through the Lords.

20:56
Andrew Percy (Brigg and Goole) (Con)

I know how unpopular it can be at 9 o’clock at night to detain the House further. However, I did speak on previous stages of the Bill, and I want to speak about a couple of issues this evening.

I thank the Secretary of State for her meetings with me and members of some of our Jewish community groups about the change to “legal but harmful”. She knows we were not particularly happy when we heard the first iteration of what was proposed, but I think we have got to a position where Jewish community groups have been able to row in behind this Bill. It may be imperfect in some ways, but it is certainly a lot better than the starting point we were coming from. I also pay tribute to the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), who has also worked very hard, particularly on the issue of antisemitism. As I say, I thank the Secretary of State for getting us into a position, through her hard work, where we and groups such as the Centre for Countering Digital Hate are very supportive of the Bill.

The hon. Member for Croydon Central (Sarah Jones) and I were in Washington with the Antisemitism Policy Trust just before Christmas, when we met Members of Congress and Senators, who told us that this piece of legislation is very much world-leading and an indicator of where they intend, or would like, to get to, although things are a little bit more complicated there because of the constitutional issues. This is indeed a world-beating piece of legislation. As with all legislation, it is imperfect, but it is a piece of legislation of which we can still be very proud.

I am pleased that we have dispensed with some of the nonsense about free speech arguments, because some of those put forward were nonsense. There is a misunderstanding by some people—I have to say, sadly, on my side of politics—that free speech is presented as an ability to say anything without consequences, but that is not what free speech is. We should always remember that there are consequences to some of the things we say, and there should be consequences.

I want to speak briefly about the issue of conspiracy theories and this legislation, particularly antisemitic conspiracy theories. I am sorry to detain the House, but I do think this is an important issue at the moment, given that we have had a Member of this House in recent times promoting anti-vaccine conspiracy theories. The juxtaposition of covid conspiracy theories and anti-vaccine conspiracy theories with antisemitism is, I am afraid, one that we see all too often in the online space. The Bill will do something to address that, but we have to do more.

I want to give a couple of examples in the few minutes I have of what coronavirus conspiracy theories and antisemitism have looked like. We have had huge amounts of online material produced that suggests everything from “covid is not real and is a Jewish conspiracy” to “covid is real and was designed and spread by Jews”. We have had a celebration of Jewish deaths through conspiracy theories, and even the promotion of conspiracy theories around vaccines and the role of Jews. The Antisemitism Policy Trust, and the CST in its briefing “Covid, conspiracies & Jew-hate”, highlight the anti-vaccine element of antisemitism. We have seen gratuitous online content of Jews being presented as scientists holding syringes, and Jews who work as senior executives in various pharmaceutical companies have been targeted because of their faith. We have even seen the menorah presented with lots of syringes on it. All that is deeply antisemitic, conspiracy theory hate, based around the vaccine and the antivaxxer movement.

A colleague of ours recently found himself in trouble, quite rightly—I praise our Chief Whip for acting so swiftly on this—for promoting a tweet that likened the covid vaccine to the holocaust. Although that in and of itself is not necessarily antisemitic, we have seen anti-covid groups using gratuitous holocaust imagery in their campaign against the vaccines and the promotion of other covid conspiracies. It is not a very big step from promoting a holocaust image to entering into deep and dangerous antisemitism, and I am afraid that a lot of the anti-covid and anti-vaxxer movement find themselves in that space. It is vital that people in government and across this House call that nonsense out for what it is, which is dangerous, anti-science crap.

The Bill will go some way to addressing that, particularly the elements that are related to antisemitism and illegal content, but we need to do a lot more in the future. I am a big supporter of the Bill, and pay particular tribute to the Secretary of State, her officials and ministerial team for getting it to this point, but there will be a lot more to do. I am afraid this hate is there and is not going away. Since I called out what happened last week, my inbox has exploded with all sorts of conspiracy theory nonsense, threats, and antisemitic emails and calls to the office. I know the Chief Whip has suffered the same. There is a lot more to do. I hope I have not detained the House for too long, and I support the Bill. It is a good start and it is world-leading, but we will have to come back to the issue as technology develops, because there will be more to do in this space. I end by associating myself with the calls with regard to advertising. The amount of advertising money in some of these hate sites is staggering and frightening, and we will have to do more on that.

Question put and agreed to.

Bill accordingly read the Third time and passed.

Speaker’s Committee for the Independent Parliamentary Standards Authority

Ordered,

That Mrs Heather Wheeler be appointed to the Speaker’s Committee for the Independent Parliamentary Standards Authority, until the end of the present Parliament, in pursuance of paragraph 1(d) of Schedule 3 to the Parliamentary Standards Act 2009, as amended. —(Penny Mordaunt.)

Online Safety Bill

1st reading
Wednesday 18th January 2023

Lords Chamber
First Reading
15:52
The Bill was brought from the Commons, read a first time and ordered to be printed.
Second Reading
16:57
Moved by
Lord Parkinson of Whitley Bay

That the Bill be now read a second time.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I am very glad to be here to move the Second Reading of the Online Safety Bill. I know that this is a moment which has been long awaited in your Lordships’ House and noble Lords from across the House share the Government’s determination to make the online realm safer.

That is what this Bill seeks to do. As it stands, over three quarters of adults in this country express a concern about going online; similarly, the number of parents who feel the benefits outweigh the risks of their children being online has decreased rather than increased in recent years, falling from two-thirds in 2015 to barely over half in 2019. This is a terrible indictment of a means through which people of all ages are living increasing proportions of their lives, and it must change.

All of us have heard the horrific stories of children who have been exposed to dangerous and deeply harmful content online, and the tragic consequences of such experiences both for them and their families. I am very grateful to the noble Baroness, Lady Kidron, who arranged for a number of noble Lords, including me, to see some of the material which was pushed relentlessly at Molly Russell whose family have campaigned bravely and tirelessly to ensure that what happened to their daughter cannot happen to other young people. It is with that in mind, at the very outset of our scrutiny of this Bill, that I would like to express my gratitude to all those families who continue to fight for change and a safer, healthier online realm. Their work has been central to the development of this Bill. I am confident that, through it, the Government’s manifesto commitment to make the UK the safest place in the world to be online will be delivered.

This legislation establishes a regulatory regime which has safety at its heart. It is intended to change the mindset of technology companies so that they are forced to consider safety and risk mitigation when they begin to design their products, rather than as an afterthought.

All companies in scope will be required to tackle criminal content and activity online. If it is illegal offline, it is illegal online. All in-scope platforms and search services will need to consider in risk assessments the likelihood of illegal content or activity taking place on their site and put in place proportionate systems and processes to mitigate those risks. Companies will also have to take proactive measures against priority offences. This means platforms will be required to take proportionate steps to prevent people from encountering such content.

Not only that, but platforms will also need to mitigate the risk of the platform being used to facilitate or commit such an offence. Priority offences include, inter alia: terrorist material, child sexual abuse and exploitation, so-called revenge pornography and material encouraging or assisting suicide. In practice, this means that all in-scope platforms will have to remove this material quickly and will not be allowed to promote it in their algorithms.

Furthermore, for non-priority illegal content, platforms must have effective systems in place for its swift removal once this content has been flagged to them. Gone will be the days of lengthy and arduous complaints processes and platforms feigning ignorance of such content. They can and will be held to account.

As I have previously mentioned, the safety of children is of paramount importance in this Bill. While all users will be protected from illegal material, some types of legal content and activity are not suitable for children and can have a deeply damaging impact on their mental health and their developing sense of the world around them.

All in-scope services which are likely to be accessed by children will therefore be required to assess the risks to children on their service and put in place safety measures to protect child users from harmful and age-inappropriate content. This includes content such as that promoting suicide, self-harm or eating disorders which does not meet a criminal threshold; pornography; and damaging behaviour such as bullying.

The Bill will require providers specifically to consider a number of risk factors as part of their risk assessments. These factors include how functionalities such as algorithms could affect children’s exposure to content harmful to children on their service, as well as children’s use of higher risk features on the service such as livestreaming or private messaging. Providers will need to take robust steps to mitigate and effectively manage any risks identified.

Companies will need to use measures such as age verification to prevent children from accessing content which poses the highest risk of harm to them, such as online pornography. Ofcom will be able to set out its expectations about the use of age assurance solutions, including age verification tools, through guidance. This guidance will also be able to refer to relevant standards. The Bill also now makes it clear that providers may need to use age assurance to identify the age of their users to meet the necessary child safety duties and effectively enforce age restrictions on their service.

The Government will set out in secondary legislation the priority categories of content harmful to children so that all companies are clear on what they need to protect children from. Our intention is to have the regime in place as soon as possible after Royal Assent, while ensuring the necessary preparations are completed effectively and service providers understand clearly what is expected. We are working closely with Ofcom and I will keep noble Lords appraised.

My ministerial colleagues in another place worked hard to strengthen these provisions and made commitments to introduce further provisions in your Lordships’ House. With regard to increased protections for children specifically, the Government will bring forward amendments at Committee stage to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it is preparing a code of practice, ensuring that the experience of children and young people is accounted for during implementation.

We will also bring forward amendments to specify that category 1 companies—the largest and most risky platforms—will be required to publish a summary of their risk assessments for both illegal content and material that is harmful to children. This will increase transparency about illegal and harmful content on in-scope services and ensure that Ofcom can do its job regulating effectively.

We recognise the great suffering experienced by many families linked to children’s exposure to harmful content and the importance of this Bill in ending that. We must learn from the horrific events from the past to secure a safe future for children online.

We also understand that, unfortunately, people of any age may experience online abuse. For many adults, the internet is a positive source of entertainment and information and a way to connect with others; for some, however, it can be an arena for awful abuse. The Bill will therefore offer adult users a triple shield of protection when online, striking the right balance between protecting the right of adult users to access legal content freely, and empowering adults with the information and tools to manage their own online experience.

First, as I have outlined, all social media firms and search services will need to tackle illegal content and activity on their sites. Secondly, the Bill will require category 1 services to set clear terms of service regarding the user-generated content they prohibit and/or restrict access to, and to enforce those terms of service effectively. All the major social media platforms such as Meta, Twitter and TikTok say that they ban abuse and harassment online. They all say they ban the promotion of violence and violent threats, yet this content is still easily visible on those sites. People sign up to these platforms expecting one environment, and are presented with something completely different. This must stop.

As well as ensuring the platforms have proper systems to remove banned content, the Bill will also put an end to services arbitrarily removing legal content. The largest platforms, the category 1 services, must ensure that they remove or restrict access to content or ban or suspend users only where that is expressly allowed in their terms of service, or where they otherwise have a legal obligation to do so.

This Bill will make sure that adults have the information they need to make informed decisions about the sites they visit, and that platforms are held to their promises to users. Ofcom will have the power to hold platforms to their terms of service, creating a safer and more transparent environment for all.

Thirdly, category 1 services will have a duty to provide adults with tools they can use to reduce the likelihood that they encounter certain categories of content, if they so choose, or to alert them to the nature of that content. This includes content which encourages, promotes, or provides instructions for suicide, self-harm or eating disorders. People will also have the ability to filter out content from unverified users if they so wish. This Bill will mean that adult users will be empowered to make more informed choices about what services they use, and to have greater control over whom and what they engage with online.

It is impossible to speak about the aspects of the Bill which protect adults without, of course, mentioning freedom of expression. The Bill needs to strike a careful balance between protecting users online, while maintaining adults’ ability to have robust—even uncomfortable or unpleasant—conversations within the law if they so choose. Freedom of expression within the law is fundamental to our democracy, and it would not be right for the Government to interfere with what legal speech is permitted on private platforms. Instead, we have developed an approach based on choice and transparency for adult users, bounded by major platforms’ clear commercial incentives to provide a positive experience for their users.

Of course, we cannot have robust debate without being accurately informed of the current global and national landscape. That is why the Bill includes particular protections for recognised news publishers, content of democratic importance, and journalistic content. We have been clear that sanctioned news outlets such as RT, formerly Russia Today, must not benefit from these protections. We will therefore bring forward an amendment in your Lordships’ House explicitly to exclude entities subject to sanctions from the definition of a recognised news publisher.

Alongside the safety duties for children and the empowerment tools for adults, platforms must also have effective reporting and redress mechanisms in place. They will need to provide accessible and effective mechanisms for users to report content which is illegal or harmful, or where it breaches terms and conditions. Users will need to be given access to effective mechanisms to complain if content is removed without good reason.

The Bill will place a duty on platforms to ensure that those reporting mechanisms are backed up by timely and appropriate redress mechanisms. Currently, internet users often do not bother to report harmful content they encounter online, because they do not feel that their reports will be followed up. That too must change. If content has been unfairly removed, it should be reinstated. If content should not have been on the site in question, it should be taken down. If a complaint is not upheld, the reasons should be made clear to the person who made the report.

There have been calls—including from the noble Lord, Lord Stevenson of Balmacara, with whom I look forward to working constructively, as we have done heretofore—to use the Bill to create an online safety ombudsman. We will listen to all suggestions put forward to improve the Bill and the regime it ushers in with an open mind, but as he knows from our discussions, of this suggestion we are presently unconvinced. Ombudsman services in other sectors are expensive, often underused and primarily relate to complaints which result in financial compensation. We find it difficult to envisage how an ombudsman service could function in this area, where user complaints are likely to be complex and, in many cases, do not have the impetus of financial compensation behind them. Instead, the Bill ensures that, where providers’ user-reporting and redress mechanisms are not sufficient, Ofcom will have the power to take enforcement action and require the provider to improve its user-redress provisions to meet the standard required of them. I look forward to probing elements of the Bill such as this in Committee.

This regulatory framework could not be effective if Ofcom, as the independent regulator, did not have a robust suite of powers to take enforcement action against companies which do not comply with their new duties or fail to take the appropriate steps to protect people from harm. I believe the chairman of Ofcom, the noble Lord, Lord Grade of Yarmouth, is in his place. I am glad that he has been and will be following our debates on this important matter.

Through the Bill, Ofcom will have wide-ranging information-gathering powers to request any information from companies which is relevant to its safety functions. Where necessary, it will be able to ask a suitably skilled person to undertake a report on a company’s activity—for example, on its use of algorithms. If Ofcom decides to take enforcement action, it can require companies to take specific steps to come back into compliance.

Ofcom will also have the power to impose substantial fines of up to £18 million, or 10% of annual qualifying worldwide revenue, whichever is higher. For the biggest technology companies, this could easily amount to billions of pounds. These are significant measures, and we have heard directly from companies that are already changing their safety procedures to ensure they comply with these regulations.

If fines are not sufficient, or not deemed appropriate because of the severity of the breach, Ofcom will be able to apply for a court order allowing it to undertake business disruption measures. This could be blocking access to a website or preventing it making money via payment or advertising services. Of course, Ofcom will be able to take enforcement action against any company that provides services to people in the UK, wherever that company is located. This is important, given the global nature of the internet.

As the Bill stands, individual senior managers can be held criminally liable and face a fine for failing to ensure their platform complies with Ofcom’s information notice. Further, individual senior managers can face jail, a fine or both for failing to prevent the platform committing the offences of providing false information, encrypting information or destroying information in response to an information notice.

The Government have also listened to and acknowledged the need for senior managers to be made personally liable for a wider range of failures of compliance. We have therefore committed to tabling an amendment in your Lordships’ House to capture instances where senior managers have consented to or connived in ignoring enforceable requirements, risking serious harm to children. We are carefully designing this amendment to ensure that it can hold senior managers to account for their actions regarding the safety of children, without jeopardising the UK’s attractiveness as a place for technology companies to invest in and grow. We intend to base our offence on similar legislation recently passed in the Republic of Ireland, as well as looking carefully at relevant precedent in other sectors in the United Kingdom.

I have discussed the safety of children, adults, and everyone’s right to free speech. It is not possible to talk about this Bill without also discussing its protections for women and girls, who we know are disproportionately affected by online abuse. As I mentioned, all services in scope will need to seek out and remove priority illegal content proactively. There are a number of offences which disproportionately affect women and girls, such as revenge pornography and cyberstalking, which the Bill requires companies to tackle as a priority.

To strengthen protections for women in particular, we will be listing controlling or coercive behaviour as a priority offence. Companies will have to take proactive measures to tackle this type of illegal content. We will also bring forward an amendment to name the Victims’ Commissioner and the domestic abuse commissioner as statutory consultees for the codes of practice. This means there will be a requirement for Ofcom to consult both commissioners ahead of drafting and amending the codes of practice, ensuring that victims, particularly victims and survivors of domestic abuse, are better protected. The Secretary of State and our colleagues have been clear that women’s and girls’ voices must be heard clearly in developing this legislation.

I also want to take this opportunity to acknowledge the concerns voiced over the Secretary of State’s powers of direction in relation to codes of practice that currently appear in the Bill. That is a matter on which my honourable friend Paul Scully and I were pressed by your Lordships’ Communications and Digital Committee when we appeared before it last week. As we explained then, we remain committed to ensuring that Ofcom maintains its regulatory independence, which is vital to the success of this framework. As we are introducing ground-breaking regulation, our aim is to balance the need for the regulator’s independence with appropriate oversight by Parliament and the elected Government.

We intend to bring forward two changes to the existing power: first, replacing the “public policy” wording with a defined list of reasons that a direction can be made; and secondly, making it clear that this element of the power can only be used in exceptional circumstances. I would like to reassure noble Lords—as I sought to reassure the Select Committee—that the framework ensures that Parliament will always have the final say on codes of practice, and that strong safeguards are in place to ensure that the use of this power is transparent and proportionate.

Before we begin our scrutiny in earnest, it is also necessary to recognise that this Bill is not just establishing a regulatory framework. It also updates the criminal law concerning communication offences. I want to thank the Law Commission for its important work in helping to strengthen criminal law for victims. The inclusion of the new offences for false and threatening communications offers further necessary protections for those who need it most. In addition, the Bill includes new offences to criminalise cyberflashing and epilepsy trolling. We firmly believe that these new offences will make a substantive difference to the victims of such behaviour. The Government have also committed to adding an additional offence to address the encouragement or assistance of self-harm communications and offences addressing intimate image abuse online, including deepfake pornography. Once these offences are introduced, all companies will need to treat this content as illegal under the framework and take action to prevent users from encountering it. These new offences will apply in respect of all victims of such activity, children as well as adults.

This Bill has been years in the making. I am proud to be standing here today as the debate begins in your Lordships’ House. I realise that noble Lords have been waiting long and patiently for this moment, but I know that they also appreciate that considerable work has already been done to ensure that this Bill is proportionate and fair, and that it provides the change that is needed.

A key part of that work was conducted by the Joint Committee, which conducted pre-legislative scrutiny of the Bill, drawing on expertise from across both Houses of Parliament, from all parties and none. I am very glad that all the Members of your Lordships’ House who served on that committee are speaking in today’s debate: the noble Baroness, Lady Kidron; the noble Lords, Lord Stevenson of Balmacara and Lord Knight of Weymouth, who have very helpfully been called to service on the Opposition Front Bench; the noble Lord, Lord Clement-Jones, who speaks for the Liberal Democrats; as well as my noble friends Lord Black of Brentwood and Lord Gilbert of Panteg.

While I look forward to the contributions of all Members of your Lordships’ House, and will continue the open-minded, collaborative approach established by my right honourable friend the Secretary of State and her predecessors—listening to all ideas which are advanced to make this Bill as effective as it can be—I urge noble Lords who are not yet so well-versed in its many clauses and provisions, or who might be disinclined to accept at first utterance the points I make from this Dispatch Box, to consult those noble Lords before bringing forward their amendments in later stages of the Bill. I say that not to discourage noble Lords from doing so, but in the spirit of ensuring that what they do bring forward, and our deliberations on them, will be pithy, focused, and conducive to making this Bill law as swiftly as possible. In that spirit, I shall draw my already too lengthy remarks to a close. I beg to move.

17:19
Baroness Merron (Lab)

My Lords, like many in your Lordships’ House, I am relieved to be finally speaking on the Second Reading of this important Bill. I am very grateful to the Minister for his introduction. Despite the Bill being central to a recent manifesto and having all-party support, it has taken nearly six years to get us to this moment, as the Minister alluded to. A revolving door of four Prime Ministers and seven changes of Secretary of State has not exactly been conducive to this process.

But it is also fair to say that the Bill has been strengthened by consultation and by the detailed pre-legislative scrutiny carried out by the Joint Committee, to whom I pay tribute. It means that this version of the Bill bears a very welcome resemblance to the Joint Committee’s report. I also thank the Communications and Digital Select Committee for its ongoing work and warmly acknowledge the long-term campaigning work of the noble Baroness, Lady Kidron, and others in and outside this House.

It seems that every passing week reminds us why stronger online regulation is needed. Just today, we read that the influence of Andrew Tate, despite his being in custody in Romania, has whipped up a storm of rape and death threats directed at my colleague in the other place, Alex Davies-Jones. And writ large is the damning verdict of the inquest into Molly Russell’s death. I want to pay tribute to the determination of her father, Ian, who is present with us today.

In today’s digital age, social media is everywhere: in our homes, workplaces and schools. With the rise of virtual reality, it is also in our heads. It is a central influence on what we buy and think, and how we interact and behave. The power and money at stake are enormous, yet the responsibilities are minimal and accountability lacking.

The focus of this long and complex Bill is on reducing the seemingly ever-increasing harms caused by social media services and search engines, whose algorithms generate detailed pictures of who we are and push us towards certain types of content, even if it impacts on our physical and mental health. As we know, Molly Russell tragically took her own life after having been bombarded with material relating to depression, self-harm and suicide.

Many platforms have upped their game since, but the need for this legislation has not diminished: there remain too many cases of children and vulnerable adults being exposed to digital content that is simply not appropriate. I welcome the arrival of the Bill, but it is too late and, due to recent changes, arguably too narrow. We must now do what we can to get it on the statute book as soon as possible.

The Government have committed to changes in your Lordships’ House, but we need to see the detail, and soon, not least because of the significant public and stakeholder interest. It has become fashionable to leave major changes to legislation until Report stage, leaving noble Lords unsighted and limiting the scope for improvement. I hope the Minister will commit to bucking this trend and give noble Lords early sight of the Government’s thinking.

On these Benches, we will, as always, work constructively with colleagues across the House, and hopefully with the Minister too, as we have already been doing. But, in so doing, we must acknowledge that this Bill is unlikely to be the last word. A future Labour Government will want to return to these issues, to tidy up any deficiencies that are identified once the Bill becomes law.

I now turn to some of our priorities. I am in no doubt that other noble Lords will add to this list. There is a legitimate concern around the decision of Ministers to take powers of direction over what is supposed to be an independent regulator and to leave so much to secondary legislation. The need for flexibility is indeed understood, but Parliament must have an active role, rather than being sidelined.

On the protection of children, despite notable progress by many platforms, too many failings exist. Several children’s charities have put forward important recommendations. The NSPCC has called for user advocacy to influence future regulation, while Barnardo’s wants restrictions on access to online pornography, holding the Government to their previous promises.

The scrapping of legal but harmful provisions means a lack of protection for vulnerable adults. The Samaritans, for example, is keen to ensure that self-harm provisions properly capture vulnerable adults as well as children. We understand that defining the term is difficult, but a solution has to be found.

On anti-Semitism, racism and general abuse, the Government shifted policy in response to a former Conservative leadership hopeful who said that we cannot legislate for hurt feelings. We believe in free speech, but it is not clear that DCMS has found the right balance with its triple shield. The toggle system may prevent users from seeing categories of harmful material, but it will still exist and influence others unless the Government compel an auto-on setting.

On violence against women and girls, I welcome the commitments made in relation to cyberflashing and making controlling behaviour a priority offence. I hope the Minister confirms that there will be work with an extensive range of relevant stakeholders to build on the amendments already made, and to identify and close potential loopholes in forthcoming text.

We find it unacceptable that the Government have stripped back the Bill’s media literacy provisions at a time when these skills are more important than ever. I am grateful to organisations such as Full Fact for highlighting the need to equip people of all ages, but particularly children, with the skills necessary to identify misinformation and disinformation. We have all seen the damage caused by vaccine disinformation, not only on Covid but on HPV. This extends to other areas; social media is awash with misleading material on nutrition, breastfeeding and natural health remedies, to name but a few. Once again, we acknowledge that some platforms perform well in response to such issues, but the recent takeover of Twitter has highlighted how swiftly and radically that can change.

I know that the Minister has been working on this agenda for some time and that he wants to get it right. We can all share our own experiences or those of friends or family in respect of online harm and abuse. We can also all cite ways in which technological innovation has improved our lives. We therefore all have a stake in improving this legislation. We have a long and complex process ahead of us, but uniquely there is no political divide on the Bill. Therefore I hope that in the finest traditions of your Lordships’ House we will work together to improve what is before us, while recognising that this is unlikely to be the last word.

17:28
Lord McNally (LD)

My Lords, it is a pleasure to follow the noble Lord, Lord Parkinson, and the noble Baroness, Lady Merron, and the spirit of co-operation they have both shown in introducing the Bill. On our side we will be led by my noble friend Lord Clement-Jones, who is keeping his powder dry for the summing up.

I was pleased that there was praise for the pre-legislative scrutiny, which is a very useful tool in our locker. I was a member of the Puttnam committee, which in 2002 looked at what became the last Communications Act, and I took two lessons from that. The first was the creation of Ofcom as a regulator with teeth; it is important that we go forward with that. The other was the Puttnam amendment adding the protection of citizens’ interests to that of consumer interests as part of its responsibilities. Those twin responsibilities—to the consumer and the citizen—are valuable when addressing this Bill.

It is worth remembering that, although it may be a future Labour Government who deal with this, my experience is that this is not a dress rehearsal; this is the main event and we should seize the day. It has been 20 years since the last Bill, six years since the Green Paper, and five years since the White Paper, with a cavalcade of Secretaries of State. This House is entitled to stress-test and kick tyres in today’s debate and in Committee to see if the powers and scope meet the threats, challenges and opportunities posed by this technology.

We will play our part in delivering a Bill which is fit for purpose, but the Government must play theirs by being flexible in their approach in response to legitimate concerns and sensible amendments addressing them. The noble Baroness, Lady Merron, has already voiced concerns about powers left in the hands of future Secretaries of State. We will study what has been said this afternoon on those matters.

We welcome the Bill’s focus on protecting children. I do not think anybody who went to the presentation on the evidence in the Molly Russell inquest could have left with anything other than a determination that something must be done about this. Equally, the concerns of End Violence Against Women and other groups pose questions on whether this legislation goes far enough in the protections needed, which will have to be tested. There are real worries about the lack of minimum requirements for terms of service and the removal of risk assessment for adults. The noble Lord, Lord Bethell, has been raising very pertinent questions about age verification and access to pornography. The noble Lord, Lord Lipsey, and I intend to raise questions in Committee about the free pass given to newspapers by this legislation, although much of their activity is now online. There is no specific commitment, as has been said, to expand media literacy, despite it being a major recommendation of the Puttnam committee 20 years ago.

The internet has been an amazing catalyst for change, innovation and creativity. But those benefits have come at a price of targeted actions designed to cause harms to individuals and institutions. On all Benches we believe that freedom of expression is important, but liberal democracies have a right to provide a framework of protection against those who seek to harm it. Much will depend on the response to legislation and regulation by the internet companies. The public are not stupid; they can differentiate between tick-box exercises and compliance, between profit maximisation and social responsibility. The noble Lord, Lord Grade, is also not stupid and I wish him well as chair of Ofcom.

My work on the Puttnam committee 20 years ago was among the most satisfying of my parliamentary life. I hope we will all have similar feelings when we complete our work on this Bill.

17:33
Baroness Kidron (CB)

My Lords, I declare my interests as chair of 5Rights Foundation and the Digital Futures Commission, my positions at Oxford and LSE and at the UN Broadband Commission and the Institute for Ethics in AI, as deputy chair of the APPG on digital regulation and as a member of the Joint Committee on this Bill.

As has already been mentioned, on Monday I hosted the saddest of events, at which Ian Russell and Merry Varney, the Russell family’s solicitor, showed parliamentarians images and posts that had been algorithmically recommended to Molly in the lead-up to her death. These were images so horrible that they cannot be shown in the media, so numerous that we could see only a fraction, and so full of despair and violence that many of the adult professionals involved in the inquest had to seek counselling. Yet in court, much of this material was defended by two tech companies as being suitable for a 14 year-old. Something has gone terribly wrong. The question is: is this Bill sufficient to fix it?

At the heart of our debates should not be content but the power of algorithms that shape our experiences online. Those algorithms could be designed for any number of purposes, including offering a less toxic digital environment, but they are instead fixed on ranking, nudging, promoting and amplifying anything to keep our attention, whatever the societal cost. It does not need to be like that. Nothing about the digital world is a given; it is 100% engineered and almost all privately owned; it can be designed for any outcome. Now is the time to end the era of tech exceptionality and to mandate a level of product safety so that the sector, just like any other sector, does not put its users at foreseeable risk of harm. As Meta’s corporate advertising adorning bus stops across the capital says:

“The metaverse may be virtual, but the impact will be real.”


I very much welcome the Bill, but there are still matters to discuss. The Government have chosen to take out many of the protections for adults, which raises questions about the value and practicality of what remains. In Committee, it will be important to understand how enforcement of a raft of new offences will be resourced and to question the oversight and efficacy of the remaining adult provisions. Relying primarily on companies to be author, judge and jury of their own terms of service may well be a race to the bottom.

I regret that Parliament has been denied the proper opportunity to determine what kind of online world we want for adults, which, I believe, we will regret as technology enters its next phase of intelligence and automation. However, my particular concern is the fate of children, whose well-being is collateral damage to a profitable business model. Changes to the Bill will mean that child safety duties are no longer an add-on to a generally safer world; they are now the first and only line of defence. I have given the Secretary of State sight of my amendments, and I inform the House that they are not probing amendments; they are necessary to fill the gaps and loopholes in the Bill as it now stands. In short, we need to ensure that child safety duties apply to all services likely to be accessed by children. We must ensure the quality control of all age-assurance systems. Age checking must not focus on a particular harm, but on the child; it needs to be secure, privacy-preserving and proportionate, and it must work. The children’s risk assessment and the list of harms must cover each of the four Cs: content harm, conduct harm, contact harm and commercial harm, such as the recommendation loops of violence and self-hatred that push thousands of children into states of misery. Those harms must be in the Bill.

Coroners and bereaved parents must have access to data relevant to the death of a child to end the current inhumane arrangement whereby bereaved families facing the devastating loss of their child are forced to battle, unsuccessfully, with tech behemoths for years. I hope that the Minister will reiterate commitments made in the other place to close that loophole.

Children’s rights must be in the Bill. An unintended consequence of removing protections for adults is that children will now cost companies vastly more developer time, more content moderation and more legal costs than adults. The digital world is the organising technology of our society, and children need to be online for their education and information to participate in civic society—they must not be kicked out.

I thank all those who have indicated their support, and the Secretary of State, the Minister and officials for the considerable time they have given me. However, I ask the Minister to listen very carefully to the mood of the House this evening; the matters I have raised are desperately urgent and long-promised, and must now be delivered unequivocally.

While millions of children suffer from the negative effects of the online world, some pay with their lives. I am a proud supporter of a group of bereaved parents for online safety, and I put on the record that we remember Molly, Frankie, Olly, Breck, Sophie and all the others who have lost their lives. I hope that the whole House will join me in not resting until we have a Bill fit for their memory.

17:39
The Lord Bishop of Manchester

My Lords, that is not an easy speech to follow, but I begin by declaring my interest as a Church Commissioner, as set out in the register. We have substantial holdings in many of the big tech companies. I am also vice-chair of the Church of England Ethical Investment Advisory Group. I commend to the attention of noble Lords our recent report on big tech, published last September. There, we set out five core principles that we believe should guide our investment in and engagement with big tech companies: flourishing as persons, flourishing in relationships, standing with the marginalised, caring for creation and serving the common good. If we apply those principles to our scrutiny of this Bill, we will not only improve lives but save lives.

I will focus my remaining remarks on three areas. First, as the noble Baroness, Lady Merron, and the noble Lord, Lord McNally, have noted, the powers granted to the Secretary of State to direct Ofcom on its codes of practice and provide tactical and strategic guidance put Ofcom’s independence at risk. While I recognise that the Government have sought to address these concerns, more is required—Clauses 39 and 157 are not fit for purpose in their present form. We also need clear safeguards and parliamentary scrutiny for Secretary of State powers in the Bill that will allow them to direct Ofcom to direct companies in whatever we mean by “special circumstances”. Maintaining Ofcom’s autonomy in decision-making is critical to preserving freedom of expression more broadly. While the pace of technological innovation sometimes requires a very timely response, the Bill places far too much power in the hands of the Secretary of State.

Secondly, while the Bill encompasses activity within the remit of regulators beyond Ofcom, it is largely silent on formal co-operation. I encourage the Government to introduce a general duty to co-operate with other regulators to ensure good and effective enforcement of the various regulatory regimes. I would be grateful if the Minister could confirm whether the Government will commit to looking at this once more.

Finally, I turn, as others have done, to the protection of children. The noble Baroness, Lady Kidron, has just spoken powerfully. Can we really claim that this Bill serves to mitigate the harm that children face online when consultation of children has so far been lacking? I welcome the Minister’s remarks about the Children’s Commissioner in this regard, but we can and should go further. In particular, we should centre our decisions on promoting children’s well-being rather than on simply minimising harm. My right reverend friend the Bishop of Durham regrets that he is unable to be in his place today. I know he plans to raise these questions as the Bill progresses.

Related to this, we must ensure that any activity online through which children are groomed for criminal exploitation is monitored. A reporting mechanism should be brought in so that such information is shared with the police. My right reverend friend the Bishop of Derby is unable to speak today, but as vice-chair of the Children’s Society, she will follow these issues closely.

This Bill has arrived with us so late and so overcrowded that I had begun to think it was being managed by my good friends at Avanti trains. However, here at last it is. I look forward to working with noble Lords to improve this important and welcome legislation. It is my hope that, as we continue to scrutinise and improve the Bill, we will move ever closer to fulfilling those five core principles I set out: flourishing as persons, flourishing in relationships, standing with the marginalised, caring for creation and serving the common good.

17:43
Baroness Morgan of Cotes (Con)

My Lords, I draw attention to my interests as a trustee of the Loughborough Wellbeing Centre, director of Santander and the Financial Services Compensation Scheme, chair of the Association of British Insurers and board member at Grayling. In fact, I could draw attention to all my interests, because what we are debating today, with online search engines and online platforms, are organisations that reach into every corner of our lives now. I want to thank current Ministers for getting us to this stage. We have heard that this is long overdue regulation. I plead guilty to being one of the “cavalcade” of previous Secretaries of State mentioned by the noble Lord, Lord McNally, but I am pleased that I have played my part in keeping this Bill on the road.

When we have passed this legislation, the UK will be world leading. That needs to be recognised, but it also means that this legislation is new and not easy, as we have heard. Polling from More in Common has found that, in a list of six comparable European countries, the British are the most likely to say that the Government are not doing enough to regulate social media platforms. In the brief time available, I want to set out some key themes and amendments which I hope to raise in Committee.

I welcome the criminal offences relating to violence against women and girls added to the Bill, but the whole environment of these platforms, where such online violence has become normalised and misogyny allowed to flourish unchecked, needs to change. I am afraid that adding selected offences is insufficient, and I will be calling for a specific code of practice, to be drafted by Ofcom, that the platforms and search engines will need to follow to show that they are taking the proliferation of violence against women and girls seriously.

We will hear today many arguments about freedom of speech and expression, but what about the right to access and participation online without being abused and harassed? Online violence against women and girls curtails women’s freedom of expression. The advice to avoid social media—which I myself, as a Member of Parliament, received from the authorities and the police—respects no one’s freedoms. As we have heard, women and girls are 27 times more likely to experience harassment online.

We have also heard from Luke Pollard in the other place a mention of incels. While this is a complicated topic, unfortunately what is true is that data from the Center for Countering Digital Hate has found that visits to incel websites are only increasing every day, and the content on them is getting more extreme. Many small platforms hosting incels set their own terms and conditions, allowing for violent and misogynistic discussions. How the Bill tackles those issues will be of great importance and a subject of discussion in this House.

I was disappointed that the legal but harmful restrictions were dropped, but I understand why Ministers chose to do so. However, I agree that, as we have already heard, the user empowerment toggle should be set to “on” by default. Just because a user decides not to see abusive and harmful content does not mean that it is not there, either influencing others or, where it is unfortunately necessary, for the user to see so that they can provide evidence to the authorities, including the police. I include my own experience of having seen that abuse, gathering it and then sending it to the authorities. If we have the toggle set to “off”, in relation to violence against women and girls the onus will yet again be on women to protect themselves, rather than the abuser being compelled to cease their abuse. Related themes to explore in Committee will be the minimum standards needed for risk assessments, as well as minimum standards for platforms’ terms and conditions; the publication of risk assessments to create a culture of transparency on the part of service providers; and further detail on how the information gathered by Ofcom under Clause 68 is to be used.

We will hear discussion—we already have—about the welcome creation of the offence of sending communication which encourages serious self-harm. However, as we have heard, Samaritans has pointed out that all such content needs to be regulated across all platforms for all users. Turning 18 does not stop young people being vulnerable to suicide or self-harm content. I also support the calls by Vicky Ford and others to specifically include eating disorders within the self-harm clause.

It was my pleasure last year to chair this House’s special committee on the Fraud Act 2006 and digital fraud. Time is short, but there will be more to say on the issues of fraud, as well as independent researchers’ access to information. My noble friend the Minister has mentioned senior manager liability. We will wait to see what the clause says when it is introduced, but it needs to be sufficiently tough to change the culture.

I will absolutely support the amendment proposed by the noble Baroness, Lady Kidron, and that proposed by my noble friend Lord Bethell, on age verification for online pornography.

I was recently at an event in this building with tech companies, including a major search engine, who complained that, via the Bill, the Government are experimenting on them. I put it to them then, and I say now, that these companies have experimented on us, particularly our children and vulnerable adults, for years without facing the consequences of the illegal and harmful material across their platforms and search engines. The Bill is long overdue. I look forward to the debates and amendments.

17:49
Baroness Anderson of Stoke-on-Trent (Lab)

My Lords, it is a privilege to follow the noble Baroness, Lady Morgan—and slightly intimidating. I draw the House’s attention to my register of interests: I am a director of the Antisemitism Policy Trust and a director of HOPE not hate, and I remain the chief executive of Index on Censorship. I have also had appalling experiences online. In all these capacities I have been intimately involved with the passage of this legislation over the last two years. Like every one of your Lordships, I desperately want to see a better and safer internet for all users, especially children and the most vulnerable, but I worry about the unintended consequences of certain clauses, particularly for our collective and legal right of freedom of expression.

There are certain core premises that should guide our approach to online regulation. What is legal offline should be legal online. We need secure and safe communication channels to protect all of us, but especially dissidents and journalists, so end-to-end encryption needs to be safeguarded. Our ability to protect our identities online can be life-saving, for domestic violence victims as much as for political dissidents, so we need to ensure that the principle of online anonymity is protected. Each of these principles is undermined by the current detail of the Bill, and I hope to work with many of your Lordships in the weeks ahead to add additional safeguards.

However, some of my greatest concerns about the current proposals relate to illegal content: the definition of what is illegal, the arbiters of illegality and, in turn, what happens to the content. The current proposals require the platforms to determine what is illegal content and then delete it. In theory this seems completely reasonable, but the reality will be more complicated.

I fear what a combination of algorithms and corporate prosecution may mean for freedom of expression online. The risk appetite of the platforms is likely to be severely reduced by this legislation. Therefore, I believe that they are likely to err on the side of caution when considering where the illegality threshold falls, leading to over-deletion. This will be compounded by the use of algorithms rather than people to detect nuance and illegal content.

I will give your Lordships an example of an unintended consequence this has already led to. A video of anti-government protests in Lebanon was deleted on some current platforms because an algorithm picked up only one word of the Arabic chants: Hezbollah, an organisation rightly proscribed in the UK. But the video actually featured anti-Hezbollah chants. It was an anti-extremism demonstration and, I would speculate, contained anti-extremist messaging that many of us would like to see go viral rather than be deleted.

Something is already twice as likely to be deleted from a platform by an algorithm if it is in Urdu or Arabic, rather than English. This will become even more common unless we tighten the definition of illegality and provide platforms with a digital evidence locker where content can be stored before a final decision on deletion is made, thus protecting our speech online.

The issue of deletion is deeply personal for me. Many of your Lordships may be aware that, as a female Jewish Labour Member of the other place, I was subjected to regular and vicious anti-Semitic and misogynist online abuse—abuse that too often became threats of violence and death. Unfortunately, these threats continue and have a direct effect on my personal security. I know when I am most vulnerable because I see a spike in my comments online. These comments are monitored—thankfully not by me—and, when necessary, are referred to the police, with the relevant evidence chain, so that people can be prosecuted.

Can the Minister explain how these people will be prosecuted for harassment, or worse, if the content is automatically deleted? How will I know if someone is threatening to kill me if the threat has already gone? I genuinely believe that the Government wish to make people safer online, as do we all, but I fear that this Bill will not only curtail free speech online but make me and others much less safe offline. There is significant work to do to make sure that is not the case.

17:53
Baroness Campbell of Surbiton (CB) [V]

My Lords, the internet is a double-edged sword. It enables people to connect with work, education, information and social activities. It gives visibility to those often hidden from society. But it can be a dangerous place for many, especially disabled people, many of whom are vulnerable to attack merely for who they are. I want to focus on the indiscriminate abuse that disabled people face online.

In January 2019, the Petitions Committee published its report Online Abuse and the Experience of Disabled People, following a petition by Katie Price about her son Harvey. The committee heard evidence of extreme levels of abuse, not only on social media but in online games, web forums and in media website comments. As one disabled poet and writer wrote:

“I’ve been called an ‘it’ many times—‘What is IT doing?’ … I’ve had remarks about how I look in my wheelchair, and a few times the statements, ‘You should have been aborted’, and, ‘You don’t deserve to live’”,


and, “Why are you online?” The committee rightly concluded that the law was not fit for purpose.

The Bill does not do enough to address such abuse. The other place recently weakened the protections for disabled people, replacing the provisions on legal but harmful content with a triple shield of duties to remove illegal content for adults and harmful content for under-18s, and to empower adult users.

Under Clause 12, social media companies must now tackle content which is abusive or incites hatred towards disabled people. That is encouraging, but it is the companies that decide that, so in practice it may not change anything. We know that moderating social media is the Wild West. There is no consistency between platforms. It depends on the algorithms they use and the discretion of their moderators.

Clause 18 adds to those problems, requiring platforms also to consider freedom of expression and privacy issues. They will be in an impossible position, caught between competing claims for protection from abuse and freedom of speech. At the very least, the legal but harmful provisions must be restored.

Greater control for disabled people using social media is laudable. They must be consulted on the best way to achieve that. The Bill says that terms of service must be “clear and accessible”. It should provide for Ofcom to give guidance with input from disabled people. It should not be left to social media services to set their own standards.

Consistency is also vital for the way the verification process works. Clause 57 refers to “verification … of any kind” and “clear and accessible” explanations. Ofcom’s guidance will be crucial on both issues, with disabled people’s input essential. It should be mandatory to follow the guidance.

Will the Minister assure me that he will address these matters before Committee? Will he meet me and disability organisations which have expertise in this field for guidance? This is a landmark Bill and very welcome. Let us ensure that it works for everybody, especially those who need it most.

17:58
The Lord Bishop of Oxford

My Lords, it is an honour and privilege to follow the noble Baroness, Lady Campbell, and all those who have spoken in this debate. As a member of your Lordships’ Committee on Artificial Intelligence and a founding member of the Centre for Data Ethics and Innovation, I have followed the slow progress of this Bill since the original White Paper. We have seen increasing evidence that many social media platforms are unwilling to acknowledge, let alone prevent, harms of the kind this vital Bill addresses. We know that there is an all too porous frontier between the virtual world and the physical world. The resulting harms damage real lives, real families, and real children, as we have heard.

There is a growing list of priority harms and now there is concern, as well as excitement, over new AIs such as ChatGPT; they demonstrate yet again that technology has no inherent precautionary principles. Without systemic checks and balances, AI in every field develops faster than society can respond. We are and for ever will be catching up with the technology.

The Bill is very welcome, marking as it does a belated but important step towards rebalancing a complex but vital aspect of public life. I pay tribute to the Government and to civil servants for their patient efforts to address a complex set of ethical and practical issues in a proportionate way. But the job is not yet fully done.

I will concentrate on three particular areas of concern with the draft Bill. First, removal of risk assessments regarding harm to adults is concerning. Surely every company has a basic moral duty to assess the risk of its products or services to customers and consumers. Removal can only undermine a risk-based approach to regulation. Can the Minister explain how conducting a risk assessment erodes or threatens freedom of speech? My second concern, mentioned by others, is the Secretary of State’s powers in relation to Ofcom. This country has a record of independence of our own media regulators. Others have touched on that, so I will not elaborate. The third area of concern I wish to raise is the Bill’s provision—or rather lack of provision—over disinformation of various kinds. I currently serve on your Lordships’ Environment and Climate Change Committee; climate disinformation and medical disinformation inflict substantial harms on society and must be included in user empowerment tools.

Other right reverend Prelates will raise their own concerns in the forthcoming Committee. My right reverend friend the Bishop of Gloucester believes that it is imperative that we prevent technology-facilitated domestic abuse, as well as bring in a code of practice to keep women and girls safe online. To help young people flourish, we should look at controlling algorithmically served content, restrictions on face and body-editing apps, as well as improving media literacy overall. She is unable to speak today, but will follow these issues closely.

The Bill is vital for the health of children and adults, and the flourishing of our whole society. I look forward to progress being made in this House.

18:02
Lord Vaizey of Didcot (Con)

My Lords, I refer to my registered interests, in particular my work with Common Sense Media, a US not-for-profit that is focused on internet safety for children. What a pleasure it is to follow the right reverend Prelate the Bishop of Oxford—my local bishop, no less. I always find it a great thing that it is our Bishops who read their speeches from iPads; we have iBishops in this Chamber who are far more technologically advanced than the rest of us. What a pleasure it is to see our national treasure the Arts Minister on the Front Bench; yesterday he launched the 2021 report of the Portable Antiquities Scheme, which displays ancient treasures dug up from many centuries ago. I thought he might be presented with the first consultation paper on the Online Safety Bill, because it has taken so long to get to the stage where we are today.

A dozen years ago, when we talked about the impact of the internet, we were actually focused on copyright infringement; that was the big issue of the day. It is quite instructive to think about what happened there; it was a combination of technology, but also business solutions, licensing and the creation of companies such as Spotify that had an impact. But piracy remains with us, and will continue to remain with us because of the internet.

I like to think that the Jurassic journey of the Online Safety Bill began with an Adjournment debate secured by the then Member for Devizes, Claire Perry, on protecting children from adult content on the internet, which is one of the most important issues. That led to her being commissioned to do a review by the then Prime Minister, David Cameron, and that got the ball rolling. But Prime Minister David Cameron’s biggest intervention, which I remember well, was to tackle Google on the issue of child sex abuse. At the time the prevailing mood, which still prevails, was that politicians do not understand technology—you cannot regulate the internet, “Get your tanks off our lawn”. But Cameron said, “We will legislate unless you do something”, and Google, which said it was impossible, eventually came up with something like 150,000 search terms which would return no search results and instead refer the searcher to sources of help—that is what the page would come up with.

That was instructive because it was a combination of government action and, in tackling child sexual abuse, reliance on not-for-profits such as the Internet Watch Foundation. As we debate a piece of legislation and call on the Government to do this or that, it is important to remember that the internet has always had many governors, if you like—civic society, business, not-for-profits and charities—all of which must continue to play an important role in internet policing, as must the platforms themselves, where technology has improved in leaps and bounds. We have heard some of the criticisms of the technology they use and the impact it has on the people who are relied on by some of these technology companies to police content. Nevertheless, they have made progress. We must also remember that the platforms are not publishers or broadcasters; they are still new technology.

I unequivocally support the Bill—frankly, in whatever form it takes once your Lordships have fully considered it. It must be passed because it is time to regulate the internet. Ofcom is absolutely the right regulator to do this. I have been hugely impressed by the amount of work it has put into preparing for this role. The overall approach taken in the Bill is the right one: to police not every piece of content but the terms and conditions. This week, Ofcom published a very important document pointing out that transparency, holding the platforms to account and exposing how they regulate their content will make a massive difference.

The Government have made the right compromise on legal but harmful. I counsel against the Christmas tree effect of wanting to hang every single different concern on to the Bill; let us keep our eye on the prize. Having said that, I will fully support my noble friend Lord Bethell in his points on age verification and the noble Baroness, Lady Kidron, with her amendment.

This is the end of the beginning. The Bill will not eradicate all the nasty things we see on the internet but, for the first time, the platforms will be accountable. It is very important to support this legislation. The Minister did not mention the European Union’s important legislation on this issue, but we are beginning to make progress across the world.

18:07
Baroness Ritchie of Downpatrick (Lab)

My Lords, it is a pleasure to follow other noble Lords on this issue. This legislation is undoubtedly long overdue. Without doubt, the internet has changed the way in which we live our lives. For many this change has been positive. However, the internet, in particular social media, has created a toxic online world. We have only to listen to the noble Baroness, Lady Kidron, and my noble friend Lady Anderson to realise that. As a result, the internet has become abusive, misogynistic and dangerous. Many noble Lords from across the House have personal experience of this toxic world of online abuse. Any measures that seek to place curbs and limits on that type of content are to be welcomed.

While it is important to protect adults from abuse online, it is more important that we get the Bill’s protections right for children. I welcome its provisions in respect of age verification, but for many across the House it is a surprise that we are even debating age verification. Legislation was passed in 2017 but inexplicably not implemented by the Government. That legislation would have ensured that age verification was in place to protect children over five years ago. While the Bill includes age assurance measures, it is disappointing that its provisions are not as robust as those passed in 2017. Also, it is concerning that age verification is not uniformly applied across Parts 3 and 5. What actions and steps will the Minister and his colleagues take in Committee with government amendments on this issue?

As this Bill makes progress through this House, it will be important to ensure that age verification is robust and consistent, but we must also ensure that what happened to the Digital Economy Act cannot be allowed to happen to this legislation. The Government cannot be allowed to slow down or even abandon age verification measures. This Bill, while welcome, needs to be amended to ensure that age verification is actually implemented and enforced. This must happen as quickly as possible after the Bill becomes law. I believe that age verification should be in place no later than six months after this Bill is passed.

The need for robust age verification is beyond any reasonable argument. Children should be protected from viewing harmful content online. The law in this regard should be simple. If a platform contains pornographic content, children should be prevented from viewing it. More than that, pornography that is prohibited offline should be prohibited online. Reading the provisions of this Bill carefully, it is my belief that the Bill falls short in both regards.

I look forward to the passage of this Bill through the House and, while it is a very welcome development to be discussing and having this Bill, it is important that the provisions and clauses within it are totally strengthened.

18:11
Lord Allan of Hallam (LD)

My Lords, I have two observations, two pleas, one offer of help and four minutes to deliver all this, so here goes.

Observation one is that this Bill is our answer to the age-old question of “quis custodiet ipsos custodes?” or, in the vernacular, “Who watches the watchmen?” With several thousand strokes of the pen, Parliament is granting to itself the power to tell tens of thousands of online services how they should manage their platforms if they wish to access the UK market. Parliament will give directions to Ofcom about the outcomes it wants to see and Ofcom will translate these into detailed instructions and ensure compliance through a team of several hundred people that the platforms will pay for. In-scope services will be given a choice—pay up and follow Ofcom’s instructions or get out of the UK market. We are awarding ourselves significant superpowers in this Bill, and with power comes great scrutiny as I am sure will happen in this House.

My second observation is that regulating online content is hard. It is hard because of scale. If regulating traditional media is like air traffic controllers managing a few thousand flights passing over the UK each day, then regulating social media is more like trying to control all the 30 million private cars that have access to UK roads. It is hard because it requires judgment. For many types of speech there is not a bright line between what is legal and illegal so you have to work on the basis of likelihoods and not certainties. It is hard because it requires trade-offs—processes designed to remove “bad” content will invariably catch some “good” content and you have to decide on the right balance between precision and recall for any particular system, and the noble Baroness, Lady Anderson of Stoke-on-Trent, has already referred to some of these challenges with specific examples.

I make this observation not to try and elicit any sympathy for online services, but rather some sympathy for Ofcom as we assign it the most challenging of tasks. This brings me to my first plea, which is that we allow Ofcom to make decisions about what constitutes compliance with the duties of care in the Bill without others second-guessing it. Because judgments and trade-offs are a necessary part of content moderation, there will always be people who take opposing views on where lines should have been drawn. These views may come from individuals, civil society or even Ministers and may form important and valuable input for Ofcom’s deliberations. But we should avoid creating mechanisms that would lead to competing and potentially conflicting definitions of compliance emerging. One chain of command—Parliament to Ofcom to the platforms—is best for accountability and effective regulation.

My second plea is for us to avoid cookie banner syndrome. The pop-ups that we all click on when visiting websites are not there for any technical reason but because of a regulatory requirement. Their origins lie in a last-minute amendment to the e-privacy directive from Members of the European Parliament who had concerns about online behavioural advertising. In practice, they have had little impact on advertising while costing many millions and leaving most users at best mildly irritated and at worst at greater risk as they learn to click through anything to close banners and get to websites.

There are several elements in this Bill that are at risk of cookie banner syndrome. Measures such as age and identity verification and content controls can be useful if done well but could also be expensive and ineffective if we mandate solutions that look good on paper but do not work in practice. If you see me mouthing “cookies” at you as we discuss the Bill, please do not see it as an offer of American biscuits but as a flag that we may be about to make an expensive mistake.

This brings me to my final point, which is an offer of technical advice for any noble Lords trying to understand how the Bill will work in practice: my door and inbox are always open. I have spent 25 years working on internet regulation as poacher turned gamekeeper, turned poacher, turned gamekeeper. I may have a little more sympathy with the poachers than most politicians, but I am all gamekeeper now and keen to see this Bill become law. For those who like this kind of thing, I share more extensive thoughts on the Bill than I can get into four minutes in a blog and podcast called “Regulate Tech”.

18:16
Baroness Hollins (CB)

My Lords, I thank Mencap and the Royal College of Psychiatrists for their briefings. I will speak against the change in the other place which waters down the protections offered to adults, and focus in particular on adults without capacity.

The original Bill included protections for adults under the umbrella of “legal but harmful”, which gave robust directions to platforms on what content to remove. These protections must be reinstated; the triple shield is not enough. Your Lordships are presented with a system where social media platforms must filter only

“to the extent that it is proportionate to do so”,

assuming that all adults are capacitous all of the time and that they will be responsible for making their own choices to avoid seeing harmful content.

I recognise that there is an intended new duty for services to undertake a risk assessment on the impact of certain material on children, and to tackle the promotion of sites which share harmful content and to prevent children witnessing it, but this applies just to children. I agree with my noble friend Lady Kidron that tech companies must design for safety, just as we expect in the physical environment.

My main point is that there is no clear distinction between childhood and adulthood when it comes to mental health. I am concerned about the mental health consequences for anybody, whether child or adult, of seeing some of the images, messaging and push notifications which relentlessly pursue anyone who has ever engaged with one of the horrific sites like those seen by 14-year-old Molly Russell. These images are harmful to 14-year-olds; they are harmful to 24-year-olds; and they are harmful to 74-year-olds. Once seen, it is very hard to unsee them.

Misinformation and negative messaging are harmful to anyone who may struggle to belong and feel valued, whether at a vulnerable moment in their lives or as part of an ongoing struggle with depression. One in 20 Google searches is for health-related information. People in the UK apparently make 27 searches a minute for “depression”, 22 a minute for “stress”, and 21 a minute for “anxiety”. Given the waiting times for mental health support in the community, perhaps it is unsurprising that people seek help online. This Bill must have an emphasis on prevention. The Bill places duties on regulated providers but, as of June 2022, more than 500 hours of video were uploaded to YouTube every minute. This is content created and viewed by its users at a rate where any purely reactive approach is doomed to fall quickly behind.

As legislators we must think of society as a whole, not just those who are fully engaged and economically productive citizens who currently feel invulnerable. Making sure that legislation works for people with a learning disability and those who may not have the understanding needed to protect themselves from harmful content should not be an add-on. Could the Minister suggest how the Bill could deliver greater protections to people with a learning disability or other cognitive or mental health reason for increased risk of online harm?

As I have said before, if we could get it right for people with learning disabilities, we could actually get it right for everyone.

18:19
Baroness Stowell of Beeston (Con)

My Lords, I am humbled to speak in this debate among many noble Lords who have spent years involved in or campaigning for this landmark legislation. I salute all of them and their work.

Like many, I support some parts of this Bill and am sceptical about others. The tension between free speech, privacy and online safety is not an easy one to resolve. We all accept, however reluctantly, that one Bill cannot cure all social ills—indeed, neither should it try. In fact, when it comes to online regulation, this is not the only legislation that is urgent and necessary: the digital markets, competition and consumer Bill is a critical, yet still missing, piece of the jigsaw in achieving a strong regulatory framework. I hope the Government will bring it forward swiftly.

As my noble friend Lord Vaizey has already said, I see this Bill as the beginning of online regulation and not the end. I see it as our opportunity to make a strong start. For me, the top priority is to get the regulatory fundamentals right and to ensure we can keep updating the regime as needed in the years ahead. With my chair of the Communications and Digital Committee hat on, I will focus on key changes we believe are needed to achieve that. As I cannot do that justice in the time available, I direct any keen readers to our committee’s website, where my letter to the Secretary of State is available.

First, the regulator’s independence is of fundamental importance, as the noble Baroness, Lady Merron, and others have already mentioned. The separation of powers between the Executive and the regulator is the cornerstone of media regulation in western Europe. Any government powers to direct or give guidance should be clearly defined, justified and limited in scope. The Online Safety Bill, as it stands, gives us the opposite. Future Governments will have sweeping powers to direct and interfere with Ofcom’s implementation of the regulations.

I will come, in a moment, to my noble friend the Minister’s proposed remedy, which he mentioned in his opening remarks, but I stress that this is not a general complaint from me or the committee about executive overreach. Many of the Bill’s executive powers are key to ensuring the regime is responsive to changing needs, but there are some powers that are excessive and troubling. Clause 39 allows the Secretary of State to direct Ofcom to change its codes of practice on regulating social media firms. That is not about setting priorities; it is direct and unnecessary interference. In our view, the Government’s proposed amendment to clarify this clause, as my noble friend described, remains inadequate and does not respect the regulator’s independence. Clause 39 also empowers the Secretary of State to direct Ofcom in a private form of ping-pong as it develops codes of practice. This process could in theory go on for ever before any parliamentary oversight comes into play. Other powers are equally unnecessary. Clause 157 contains unconstrained powers to give “guidance” to Ofcom about any part of its work, to which it must have regard. Again, I fail to see the need, especially since the Government can already set strategic priorities and write to Ofcom.

Moving on, my committee is also calling for risk assessments for adult users to be reinstated, and this has already been mentioned by other noble Lords. That would have value for both supporters and critics of “legal but harmful”, by requiring platforms to be transparent about striking the balance between allowing adult users to filter out harmful content and protecting freedom of speech and privacy.

Finally, given the novel nature of the Bill, I hope the Government will reconsider their unwillingness to support the setting up of a Joint Committee of Parliament to scrutinise digital regulation across the board. This would address many general and specific concerns about implementation and keeping pace with digital developments that have been raised recently. Parliament needs to properly discharge its responsibilities, and fragmented oversight via a range of committees will not be good enough in this new, modern world.

Overall, and with all that said, I commend my noble friend and his colleagues for getting us to this point. I look forward to, and will support him in, completing the passage of this legislation in good order.

18:25
Lord Knight of Weymouth (Lab)

My Lords, it is a pleasure to follow the noble Baroness, Lady Stowell, and so many other fine speeches today. I should remind your Lordships of my interests. In particular, I have been working with GoBubble, which provides social media filtering technology. I was also a member of the Joint Committee on this Bill and was previously on the Select Committee on Democracy and Digital Technologies, chaired by the noble Lord, Lord Puttnam.

Right at the heart of this Bill are just two interrelated factors. First, there are bad actors: people who deliberately or carelessly do harm to others both in the real world and virtually, both physically and mentally. Our problem is how content from these bad actors interacts with the systems and processes in the online world that personalise and amplify that content. In 2021, 44% of all global spending on advertising was with Meta and Alphabet-owned businesses. Their platforms, such as Facebook, Instagram and YouTube, are machines with the objective of maximising engagement time on the platform in order to sell more advertising.

The machines have no ethics; they have business objectives. If that means feeding outrageous, disturbing or harmful content, so be it. If that means pushing at Molly Russell content that has now been implicated by the coroner in her death, so be it. If that means the corruption of children, self-harm or fraud, so be it. Whatever turns you on, keeps you engaged and keeps you on the platform is what the machines will push your way. This week, the Children’s Commissioner for England reported that one in five boys watch porn at least every day; that more than half of frequent users seek out violent sex acts; and that Twitter is the site where the highest proportion report seeing explicit sexual content.

The platforms are not all bad but the harms of manipulation and corruption are real and urgent. We must, and will, work together to get this Bill improved and passed by the summer. In doing so, our job with this Bill is to impose ethics on the algorithms used by platforms. This is less about bad content and more about systems. It is about content takedown and content suppression. It is as much about freedom of reach as freedom of speech. For too many people—especially women and girls, as the noble Baroness, Lady Morgan, mentioned—their freedom of expression is constrained by platforms because they are shouted down and abused. They need better protection.

Without change, vulnerable adults with learning difficulties will not be protected by this Bill. Without change, the corruption of truth and democracy by the likes of Trump and Putin will continue. Without change, the journalistic and democratic exemptions in the Bill will be exploited by the likes of Tommy Robinson to spread bile. Without change, content from the likes of Andrew Tate will continue to be amplified. His videos have been viewed more than 13 billion times on TikTok alone, including by any of our children whom we have allowed an account. Teachers, parents and grandparents cannot keep up with what is going on with children online; they need ongoing education and help. I am afraid that Ofcom is not cutting through with its media literacy duty. We must use this Bill to change that. We need to constrain the Secretary of State’s powers over Ofcom so that it is properly independent and give young people themselves more influence over the regulator.

There is much to do. This is as important a job of work as any I have been a part of during my 22 years in Parliament. I look forward to working with all Peers to deliver a Bill that prevents harm, criminalises abusers and overlays human ethics on to these machines of mass manipulation.

18:29
Baroness Finlay of Llandaff (CB)

My Lords, I thank my noble friend Lady Kidron for her tenacious moral leadership on this issue. I remind noble Lords that, when we passed the Tobacco Advertising and Promotion Act, none of us predicted tobacco companies’ development and marketing of vapes with higher and more addictive nicotine content than that in cigarettes. It was a simple lesson.

A gap now in this Bill is the difficult issue of “legal but harmful”. We should not focus on the difficulty of defining this, but rather on the design and standards of algorithms that internet platforms use to commercial advantage, dodging any responsibility for what happens and blaming the end user.

Before the Government amended Clauses 12 and 13, category 1 service providers would have been forced to risk-assess across their sites and provide information on this in their terms of service, including how harmful content was to be managed. But this is now gone and as a result, the digital environment will not be detoxified as originally intended. What pressures, if any, were exerted on government by commercial and other sources to amend these clauses?

It matters that the Bill now treats people under 18 and over 18 very differently, because the brain’s development and peak addictive potential from puberty do not stop at 18. Those in their 20s are at particular risk.

The social media platforms act commercially, pushing out more content, including online challenges, as their algorithms pick up a keyword—whether spelled correctly or incorrectly—a mouse hovering over an image or a like response. Currently, platforms judge addiction and profit by the time spent on a platform, but that is not how addictions work. Addiction is the reward-reinforcing behaviour that evokes a chemical response in the brain that makes you want more. Hence the alcoholic, the gambling addict, the drug addict and so on keep going back for more; the sex addict requires ever more extreme images to gain stimulation; the user will not switch off access.

Those whose emotional expression is through abuse and violent behaviour find more ways to abuse to meet their urge to control and vent feelings, often when adverse childhood experiences were the antecedent to disastrous destructive behaviour. The unhappy young adult becomes hooked in by the images pushed to them after an internet search about depression, anorexia, suicidal ideation and so on. The algorithm-pushed images become compulsive viewing, as ever more are pushed out, unasked for and unsearched for, entrapping them into escalating harms.

Now, the duties in Clause 12 are too vague to protect wider society. The user should be required to opt in to content so that it can be followed, not opt out. The people controlling all this are the platform companies. They commission the algorithms that push content out. These could be written completely differently: they could push sources of support in response to searches for gambling, eating disorders, suicidal ideation, dangerously extreme sex and so on. Amending the Bill to avoid escalating harms is essential. Some of the harms are ones we have not yet imagined.

The platform companies are responsible for their algorithms. They must be made responsible for taking a more sophisticated, balanced-risk approach: the new technology of artificial intelligence could detect those users of their platforms who are at particular risk. In daily life offline, we weigh up risk, assessing harms and benefits in everything, filtering what we say or do. Risk assessment is part of life. That does not threaten freedom of speech, but it would allow “legal but harmful” to be addressed.

The Bill presents a fantastic opportunity. We must not throw it away.

18:33
Lord Frost (Con)

My Lords, I declare my interest, as set out in the register, as a member of the advisory council of the Free Speech Union.

This is an important Bill. It has taken time to get to us, and rightly so. Many important requirements have to be balanced in it—the removal of illegal material, and the protection of children, as we have heard so movingly already today. But, as legislators, we must also have an eye on all elements of public policy. We cannot eliminate every evil entirely, except at unacceptable cost to other objectives and, notably, to free speech.

The Bill, as it was developing last summer, was damaging in many ways to that objective. At times I was quite critical of it, so I welcome the efforts that have been made by the new broom and new team at DCMS to put it in a better place. It is not perfect, but is considerably better and less damaging to the free speech objective. In particular, I welcome the removal of the so-called legal but harmful provisions, their replacement with a duty to empower users and the decision to list out the areas that this provision applies to, rather than leaving it to secondary legislation. I also welcome the strengthening of provisions to protect the right to free speech and democratic debate more broadly, although I will come on to a couple of concerns, and the dropping of the new harmful communications offence in the original Bill. It is clear, from what we have heard so far today, that there will be proposals to move backwards—as I would see it—to the original version of the Bill. I hope that the Government will be robust on that, having taken the position that they have.

Although the Bill is less damaging, it must still be fit for purpose. With 25,000 companies in its scope, it also affects virtually every individual in the country, so it is important that it is clear and usable and does not encourage companies to be too risk averse. With that in mind, there are areas for improvement. Given the time constraints, I will focus on free speech.

I believe that in a free society, adults—not children but adults—should be able to cope with free debate, if they are given the tools to do so. Noble Lords have spoken already about the abuse that they get online, and we all do. I am sure I am not unique in that; some of it drifts into the real world as well, from time to time. However, I do not look to the Government to defend me from it. I already have most of the tools to turn that off when I want to, which I think is the right approach. It is the one that the Government are pursuing. Free speech is the best way of dealing with controversial issues, as we have seen in the last few weeks, and it is right for the Government to err on the side of caution and not allow a chilling effect in practice.

With this in mind, there are a couple of improvements that I hope the Government might consider. For example, they could require an opt-out from seeing the relevant “legal but harmful” content, rather than an opt-in to see it, and ensure those tools are easy to use. There is otherwise a risk that risk-averse providers will block controversial content and people will not even know about it. It could be useful to require providers to say how they intend to protect freedom of speech, just as they are required to say explicitly how they will manage the Clause 12 provisions. Without that, there is some risk that freedom of speech may become a secondary objective.

To repeat, there has been considerable improvement overall. I welcome my noble friend the Minister’s commitment to listen carefully to all proposals as we take the Bill through in this House. I am happy to support him in enabling the passage of this legislation in good order soon.

18:38
Baroness Healy of Primrose Hill (Lab)

My Lords, I welcome the Bill, but regret the time it has taken to arrive. To make the UK the safest place in the world to be online, it must be strengthened, and I will support amendments that would ensure greater protection for children through proper age assurance. The damage to children from exploitation by social media cannot continue. The state must regulate, using severe penalties, to force platforms to behave with greater responsibility as they cannot be trusted to self-regulate. The rise in suicide and self-harm and the loss of self-esteem are ruining young lives. The platforms must take greater responsibility; they have the money and the technology to do this but need stronger incentives to act, such as the promised executive criminal liability amendment.

Ofcom faces a formidable challenge in policing companies to ensure they adhere to their terms and conditions about content moderation. Heavy fines are not enough. Ofcom will need guidance in setting codes of practice from not only the three commissioners but NGOs, such as the Internet Watch Foundation, and an advocacy body for children to continually advise on emerging harms. A new regulatory regime to address illegal and harmful content online is essential but, having removed “legal but harmful” from the original Bill, we lost the opportunity to detoxify the internet.

Concentrating on the big platforms will miss the growth of bespoke platforms that promote other harms such as incel culture, a threat to women but also to young men. Incels, involuntary celibates, use mainstream platforms such as YouTube to reel in unsuspecting young men before linking them to their own small, specialist websites, but these are outside the scope of category 1 provision and therefore any minimum standards. These sites include not only sexist and misogynistic material but anti-Semitic, racist, homophobic and transphobic items, and even paedophilia. One of the four largest incel forums is dedicated to suicide and self-harm. HOPE not hate, the anti-fascist campaign, has warned that smaller platforms used by the far right to organise and radicalise should be under the same level of scrutiny as category 1 platforms.

User empowerment features, part of the triple shield, such as options to filter out content from unverified users and abusive content, put the onus on the user to filter out material rather than filters being turned on by default. Ofcom must ensure a statutory duty to promote media literacy by the largest platforms as part of their conditions of service. The Bill should make children’s risk assessment consistent across all services, and should tackle the drivers of harm and the design of the service, not just the content.

I welcome the new offences targeting harmful behaviour, including epilepsy trolling, cyber flashing and the sending of manufactured deepfake intimate images without consent. Despite the Bill adding controlling or coercive behaviour to the list of priority offences, more needs to be done to protect women, one in three of whom has experienced online abuse. Ofcom must add a mandatory code of practice regarding violence against women and girls so that tech companies understand they have a duty to prioritise their safety.

The Bill must prevent the relentless promotion of suicide and self-harm that has destroyed the lives of young people and their families. I commend the bravery of Ian Russell, who is campaigning to prevent other deaths following the tragic suicide of his daughter, Molly. I back the amendments from the noble Baroness, Lady Kidron, to ensure that coroners and bereaved families can access social media content. I applaud all those campaigners who want to see the Bill implemented urgently, and I will work with other noble Lords to strengthen it.

18:42
Baroness Benjamin (LD)

My Lords, I support this important Bill, but with some concerns. As drafted, it does not go far enough to fully protect children and young people online. The consequences of the policies we decide in this Bill will affect the whole of society in decades to come.

I have been working on the online pornography issue for the last 10 years. In April 2017, this House passed legislation that required age verification for pornography websites to prevent children accessing them. We were promised that social media platforms would be included later on, but that did not happen. It is hard to believe that almost six years ago this House passed the Digital Economy Act, whose Part 3 was never implemented by this Government. So here we are, still debating age verification for pornography. This is simply unacceptable—a shocking failure of society. It is now time to act fast, and we must make sure that we do it right.

I am concerned that the Bill does not go as far as what was passed in 2017. Even if the Bill is passed, I do not believe that it will deliver age verification quickly. If Ofcom’s road map on the implementation of the Bill is to be believed, it could be three years before enforcement proceedings are issued against pornography websites that allow children to access them.

Research by the BBFC found that children as young as seven are innocently stumbling across pornography online and that 51% of all children aged 11 to 13 have watched pornography online—according to Barnardo’s, 54 million times. We are creating a conveyor belt of children addicted to porn, which will affect their long-term well-being and sexual behaviour.

A fundamental problem with the Bill is that it does not deal with pornography as a harm. The Government state that it is designed to ensure that what is lawfully unacceptable offline would also be unacceptable online. However, in respect of pornographic content, the Bill as drafted does not meet that goal. Material that is extreme and prohibited offline is widely available online. Evidence shows that consumption of extreme and prohibited material, such as content that sexualises children—and that includes adults dressing up as children—can lead on to the viewing of illegal child sexual abuse material and an interest in child sex abuse. It is not only children who are at risk: men who watch extreme and prohibited material online are more likely to be abusive towards women and girls.

What is needed is a stand-alone part of the Bill that deals with all pornographic content and sets out a clear definition of what pornography is. Once defined, the Bill should require any website or social media platform with content that meets that definition to ensure that children cannot access that material, because porn can be a gateway to other harms. Contrary to what some people believe, technology exists that can accurately age-verify a user without compromising that person’s privacy. The groundwork is done, and as more countries implement this type of legislation, the industry is becoming increasingly equipped to deal with age verification. France and Germany are already taking legal action to enforce their own laws on the largest adult websites, with several already applying age checks. There is no reason why this cannot be implemented and enforced within six months of the Bill becoming law. If that is too hard for the social media platforms, they can simply remove porn from their pages until they are ready to keep that harm away from our kids.

Childhood lasts a lifetime, and we have the opportunity to ensure that pornography is not a harm inflicted on our children. We owe it to them. I declare an interest as vice-president of Barnardo’s.

18:47
Viscount Colville of Culross (CB)

My Lords, I declare an interest as a series producer of online and linear content. I, like many noble Lords, can hardly believe that this Bill has finally come before your Lordships’ House. It was in 2017, when I first joined the Communications and Digital Committee, that we started to look at online advertising. We went on to look at regulating the internet in three separate inquiries. I am pleased to see some of those recommendations in the Bill.

It is not surprising that I support the words of the present chair of the committee, the noble Baroness, Lady Stowell, when she said that the Secretary of State still has far too many powers over the regulator. Draft codes of practice, in which Ofcom can give the parameters and direction for the tech companies, and the review of their implementation, are going to be central in shaping its terms of service. Generally, in democracies, we are seeing regulators of the media given increasing independence, with Governments limiting themselves to setting up their framework and then allowing them to get on with the task at hand. I fear the Bill is not doing that. I understand that the codes will be laid before Parliament, but I would support Parliament having a much stronger power over the shaping of those regulations.

I know that Labour supports a Select Committee having the power to scrutinise this work, but having served on the Communications and Digital Committee, I fear that the examination of consultations from Ofcom would monopolise its entire work. I support the pre-legislative committee’s suggestion of a Joint Committee of Parliament, whose sole job would be to examine regulations and give input. I will support amendments to this effect.

I am also worried about Clauses 156 and 157. I listened to the Minister when he said that amendments to the Secretary of State’s powers of guidance will be brought before the House and that they will be used only in exceptional circumstances. However, the list of subjects on which I understand the Minister will then be able to intervene is still substantial, ranging from public safety through economic policy and burdens to business. Are the Government prepared to consider further limiting these powers to intervene?

I will also look at risk assessments in the Bill. They need to go further than illegal content and child safety. The empowerment lists in Clause 12 are not risk assessed and do not seem to have enough flexibility for what noble Lords know is an ever-changing world of harms. The volume of online content means that moderation is carried out by algorithms. During the inquiries in which I was involved, we were told repeatedly that algorithms are very bad at distinguishing humour and context when deciding on harmful content. Ensuring that the platforms’ systems moderate correctly is difficult. There was a recent case of that: the farcical blocking by Twitter of the astronomer Dr Mary McIntyre, whose account was suspended because her six-second video of a meteor shower was mistaken by the Twitter algorithms for a porn video. For weeks, she was unable to get any response from Twitter. Such mistakes happen only too frequently. Dr McIntyre’s complaint is only one of millions made every year against the tech companies, for being either too keen or not keen enough to take down content and, in some cases, to block accounts. So the Bill needs to include a risk assessment which looks at the threat to free speech from any changes in those systems. Ofcom needs to be able to create those risk assessments and to produce annual reports which can then be laid before a Joint Committee for Parliament’s consideration. That should be supported by an ombudsman.

I would also like to see the definition of safety duties on platforms to take down illegal content changed from “reasonable grounds” to the platform being aware that the content is “manifestly illegal”—and, if possible, for third parties, such as the NCA, to be involved in the process. That will reduce the chance of chilling free speech online as much as possible.

I am also aware that there has been concern over the duties to protect news publishers and journalistic content. Like other noble Lords, I am worried that the scope in respect of the latter is drawn too widely in the Bill, and that it covers all content. I would support amendments which concentrate on protecting journalism in the public interest. The term “in the public interest” is well known to the courts, is present in Section 4 of the Defamation Act, and is used to great effect to protect journalism which is judged to be in the public interest.

I welcome the Bill after its long journey to this House. I am sure that the hard work of fellow Peers and collaboration with the Minister will ensure that it leaves this House in a clearer, more comprehensive and safer state. The well-being of future generations of internet users in this country depends on us getting it right.

18:51
Baroness Harding of Winscombe (Con)

My Lords, it is an enormous privilege to follow so many powerful speeches. My second daughter was born in the year Facebook launched in the UK and Apple sold its first iPhone. Today she is 15; she has lived her whole life in a digitally enabled world. She has undoubtedly benefited from the great things that digital technology brings, but, throughout that life, she has had no meaningful legal protection from its harms.

A number of noble Lords have referenced the extraordinarily moving and disturbing briefing that Ian Russell and his lawyer, Merry Varney, gave us on Monday. When I went home from that briefing, first, I hugged my two teenage girls really close, and then I talked to them about it. My 15 year-old daughter said, “Mum, of course, I know about Molly Russell and all the awful content there is on social media. Didn’t you realise? When are all you adults going to realise what’s going on and do something about it?” The Bill is important, because it is the beginning of us doing something about it.

It is also a huge Bill, so we need to be careful not to let perfect be the enemy of the good. Like other noble Lords, I urge this House to focus on the critical areas where we can improve this already much debated and discussed Bill and try to resist the temptation to attach so many baubles to it that it no longer delivers on its core purpose of protecting our children online. So, like others, I will focus my remarks on three structural changes that I hope will help make the Bill more effective at driving the positive changes that, I think, everyone in this House intends: first, the consequences for senior managers of not complying with the legislation; secondly, how compliance is defined and by whom; and, finally, which services are included.

To change digital platforms and services to protect children is not impossible—but it is hard, and it will not happen by itself. Tech business models are simply too driven by other things; development road maps are always too contested with revenue-raising projects, and competition for clicks is just too intense. So we need to ask ourselves whether the incentives in the Bill to drive compliance are strong enough to counter the very strong incentives not to.

It is clear that self-regulation will not work, and relying on corporate fines is also not enough. We have learned in other safety-critical industries and sectors that have needed dramatic culture change, such as financial services, that fines alone do not drive change. However, once you name an individual as responsible for something, with serious consequences if they fail, change happens. I look forward to the government amendment that I hope will clearly set out the consequences for named senior managers who do not deliver on their overall online safety responsibilities.

The second area I highlight is how compliance is defined. Specifically, the powers that the Bill grants the Secretary of State to amend Ofcom’s proposed code of conduct are far too wide. Just as with senior tech managers, the political incentives not to focus on safety are too strong. Almost every Minister I have ever met is keen to support tech sector growth. Giving the Secretary of State the ability to change codes of conduct for economic reasons is asking them to trade off economic growth against children’s safety—the same trade-off that tech companies have failed to make over the last 15 years. That is not right, it is not fair on the Ministers themselves, and it will not deliver the child protections we are looking for.

The third area I will cover—I will be very brief—has been highlighted by the noble Baroness, Lady Kidron. It is important that we capture all the services that are accessed by children. If not, we risk creating a dangerous false sense of security. Specifically, I am worried about why app stores are not covered. In the physical world—I say this as an erstwhile retailer—retailers have long come to terms with the responsibilities they bear for ensuring that they do not sell age-restricted products to children. Why are we shying away from the same thing in the digital world?

There are many other things I would support, not least the amendments proposed by the noble Baroness, Lady Kidron. I finish by simply saying that the most important thing is that the Bill is here. We need to do this work—our children and grandchildren have waited far too long.

18:56
Baroness D'Souza (CB)

My Lords, this is indeed a huge, complex and courageous Bill which deserves widespread support. Despite some welcome government amendments during its passage in the other place, there are residual concerns about guarantees of freedom of expression and access to information, as well as the degree to which the regulator, Ofcom, is independent of government control.

It is widely acknowledged by the Government themselves and the majority of those who have spoken to the Bill that the right to free speech is a fundamental aspect of our democracy, and that any restriction must be fully justified in the public interest. Public interest includes the freedom to access unwelcome, unpopular and even offensive material, if only to be able to refute it. It is also accepted that a functioning democracy needs new ideas and robust debate. That said, it is a fine and difficult line to draw between offensive material and illegal content. In their efforts, the Government have sought to protect above all the safety of children.

I start with a presumption in favour of free speech and a multiplicity of voices. Clauses 18 and 28 state that providers must

“have particular regard to the … users’ right to freedom of expression”

and to protecting users from breaches of any laws relating to privacy. This would be achieved by rigorous impact assessments of safety measures and policies, any infringements of which must be made publicly available. However, the definition of democratically important material as information

“specifically intended to contribute to democratic political debate in the United Kingdom”

remains vague, and other strict requirements on protecting children in the Bill could condemn offensive but necessary democratic content.

Clause 160 refers to false information intended

“to cause non-trivial psychological or physical harm”.

It may, in many cases, be entirely obvious when such harm is intended, but not in all cases. On whom does the burden of proof lie and what recourse does an individual have to appeal false accusations?

The stricture that democratically important content be preserved is by no means fully guaranteed by the following powers set out in the Bill. There is a potential danger of undue restriction that lies in the degree of control from the Secretary of State and his or her relationship with Ofcom; the terms and conditions of service for category 1 providers; the options, or lack of them, for user control of online material; and the role of Parliament.

Draft codes of practice are to be submitted to the Secretary of State, who could require Ofcom to modify codes in the interests of national security or public safety. The Secretary of State will pass any statement on strategic priorities to Ofcom, but parliamentary approval would be by means only of the negative resolution procedure.

The Secretary of State can issue guidance and directions to Ofcom, which in turn has a crucial role in acting against a provider that is not complying with the requirement to fulfil duties under the Act, including the imposition of fines of up to £18 million and “business disruption measures”—in other words, outright censorship. Although such drastic action could occur only in the case of a breach of the terms of service, there would be no restriction on taking down content to comply with other duties—for example, if it was judged that the content might be “likely” to be accessed by children. This, it is feared, would encourage providers to play safe. Furthermore, the terms and conditions can be altered at will by the provider.

The age verification process would necessarily require the user to register with a provider, preventing any casual access by adults. Furthermore, to remove unnecessary barriers to information, the controls available to the user should be a genuine option and not imposed by default.

This is a truly important Bill and I congratulate the authors and campaigners, as well as the Government, on bringing it to this advanced stage. I nevertheless believe that it could be further improved to ensure that the most liberal interpretations of online freedom of expression remain at the heart of our democracy.

19:01
Baroness Fox of Buckley (Non-Afl)

My Lords, the Secretary of State, Michelle Donelan, has acknowledged that protecting children is the very reason that this Bill exists. If only the Government had confined themselves to that crucial task. Instead, I worry that the Bill has ballooned and still could be a major threat to the free expression of adults. I agreed with much of what the noble Baroness, Lady D’Souza, just spoke about.

Like some other noble Lords here, I am delighted that the Government have dropped the censorious “legal but harmful” clauses. It was disappointing to hear Labour MPs in the other place keen to see them restored. In this place, I have admired opposition resistance to assaults on civil liberties in, for example, the Public Order Bill. Perhaps I can appeal for consistency to be just as zealous on free speech as a foundational civil liberty. I urge those pushing versions of censoring “legal but harmful” for adults to think again.

The Government’s counter to many freedom of expression concerns is that free speech is protected in various clauses, but stating that service providers must have regard to the importance of protecting users’ rights of freedom of speech is incredibly weak and woolly, giving it second-class status when contrasted with the operational safety duties that compel companies to remove material. Instead, we need a single comprehensive and robust statutory duty in favour of freedom of expression that requires providers to ensure that free speech is not infringed on by measures taken to comply with other duties. Also, free speech should be listed as a relevant duty for which Ofcom has to develop a code of practice.

The Bill requires providers to include safety provisions for content in their terms of service. However, no similar requirement for free speech exists. It seems ironic that a Bill that claims to be clipping the power of big tech could actually empower companies to police and censor legal material in the name of safety, via the commercial route of terms and conditions.

The Government brush off worries that big tech is being encouraged to limit what UK citizens say or read online by glibly asserting that these are private companies and that they must be free to develop their own terms of service. Surely that is disingenuous. The whole purpose of the legislation is to interfere in private companies, compelling them to adhere to duties or face huge penalties. If the Government do not trust big tech with users’ safety, why do they trust them with UK citizens’ free speech rights? Similarly, consider the user empowerment duties. If users ask that certain specified types of legal content are blocked or filtered out, such as hate or abuse, it is big tech that has the power to decide what is categorised under those headings.

Only last year, amendments put forward in this House on placing convicted sex-offending trans prisoners on the female estate were labelled online as hate-fuelled, transphobic abuse. However, with the ability to hear all sides of the debate online, and especially in the light of recent events in Scotland around the Gender Recognition Act, more and more people realise that such views are not hate but driven by concerns about safeguarding women’s rights. Would such a debate be filtered out online by overcautious labelling by big tech and the safety duties in its Ts and Cs?

Finally, like others, I am worried that the Secretary of State is given too much power—for example, to shape Ofcom’s codes of practice, which is a potential route for political interference. My concerns are fuelled by recent revelations. In the US, Elon Musk’s leaked Twitter files prove that, in the run-up to the 2020 election, Joe Biden’s presidential campaign routinely flagged up tweets and accounts that it wanted removed, influencing the suppression of the New York Post’s Hunter Biden laptop exposé. Here in the UK, only this week, a shocking Big Brother Watch report reveals that military operatives reported on online dissenting views on official Covid lockdown policies to No. 10 and the DCMS’s counter-disinformation unit, allowing Whitehall’s hotlines to giant media companies to suppress this legal content. Even the phrase “illegal” in the Bill can be politically weaponised, such as with the proposal to censor content allegedly promoting small boat crossings.

Free speech matters to democracy, and huge swathes of this Bill could threaten both unless we amend it appropriately.

19:06
Lord Kamall (Con)

My Lords, I begin by thanking the House of Lords Library and various organisations for their briefings on the Bill. One of the ways I want to approach this discussion is to talk about where I think there is consensus and where there will need to be further debate. Of course, as many noble Lords have said, there will be incredible trade-offs, and there are many issues people feel strongly about.

There is consensus on the issue of protecting children, and I pay tribute to the noble Baroness, Lady Kidron, for her work over many years on this, as well as that of other noble Lords. There is consensus on making sure that, where companies have terms and conditions, they actually enforce them. We have to be aware of that. There is obviously consensus on tackling sites promoting suicide and other self-harm measures.

Where I do have concerns is around freedom of expression. Quite often, everyone says that they are in favour of freedom of expression until they are offended, and then they find a reason not to be. There are also concerns about the Secretary of State’s power to intervene and influence the online safety regime. I agree with other noble Lords that Ofcom should remain independent from the Secretary of State but I am aware of public choice theory; institutions could be captured by political bias, so we have to be careful about that.

Noble Lords will submit amendments to bring back into the Bill the issue of harm to adults, but I would add a note of caution: how subjective is “harm”? A quick example is how Muslims reacted to the Danish cartoons. Some would have found them distasteful; some would have said they were harmed by them. Does that mean they should have been banned or taken down? How do we face these challenges in a free society? Can we be as technologically neutral as possible? Can we be careful of rent-seeking by organisations that will peddle their products and claim that they have the best age-assurance technology or something like that? Although we want the solution, let us make sure there is a thriving market to ensure that we get the better solutions. Regulation always lags behind developing technology; we will want this Bill to be as dynamic as possible, but that may require some secondary legislation, which I know many noble Lords are often sceptical about.

I really want to focus on unintended consequences, not because I am against the Bill but to warn of the difficult issues we are going to have to look at. First, companies will be acting as police but may take an overcautious approach. In the other place, and here, people talked about criminal liability with some of the directives, but think about the impact of criminal liability on other legislation—for example, financial companies when it comes to politically exposed persons. We all know the unintended consequences of that from being overcautious.

Adult verification is another issue. Whatever we think about pornography, it is legal. What people will be concerned about is whether they can verify their age in an anonymous way. They will be concerned whether their data will be used later to blackmail them; will verification drive users to the dark web? Not everything on the dark web is illegal. Some authoritarian regimes such as Russia, China, Saudi, Iran and Venezuela have tried to ban the Tor Browser, but are we going to follow them? There are also ways around it. One way that terrorists have been known to share information was to create an email account, share the password and username, and leave messages for each other in the drafts folder. How do we tackle that without impacting on all users of the internet? How do we also make sure that firms enforce their terms and conditions and, in doing so, do not water them down?

I know that there are many questions, but I hope that we will work through them, and others that have been raised, so that we have a Bill that is proportionate, workable and effective, and that protects children, women and girls, and vulnerable adults.

19:10
Lord Brooke of Alverthorpe (Lab)

My Lords, I generally welcome the Bill and I pay tribute to the noble Baroness, Lady Kidron, for the great work she has done. In the Bill, I particularly welcome the movement towards greater protection for children than we have had hitherto. I share the concern of the noble Baroness, Lady Benjamin, that there may be difficulties, including the age-verification system, which was raised by the noble Lord, Lord Kamall. I am in favour of age verification and I would like to see it implemented quickly. I would also like the Minister to assure us that, having waited so long, if we find that there are loopholes in it, we can find some mechanism to fill those loopholes fairly quickly—perhaps a commitment to using secondary legislation rather than having to wait for so long, as we have done in the past.

My second concern relates to Clause 12, which the right reverend Prelate the Bishop of Oxford raised and which the nobles Baronesses, Lady Hollins and Lady Finlay, also spoke to, on the protection of adults from risk and harm. I do not think enough attention has been paid to what is happening with pornography and with mental health. Here I declare an interest as the founder and vice-chair of an All-party Group for the Twelve Steps Recovery Programme from Addiction. Addiction is not just about alcohol. AA started the 12-step programme but it has been extended over the years to a whole range of other addictions—not least drugs, gambling and overeating, and in particular it is growing quite extensively in the sexual field. We have a range of 12-step programmes operating, including for SLA—sex and love addiction—and sexual addiction. As to the latter, an ever-increasing number of people are in grave trouble due to the effects of pornography, not just solely on themselves but consequently the rest of their family in a whole range of different ways.

It is quite interesting that, among those watching pornography (mainly men), the period between midnight and 4 am is when most porn sites are visited. This is affecting people mentally, affecting their work and affecting their relationships. The Bill as it stands does not address that issue sufficiently well. They had a go at it in the Commons and were persuaded that the approach was incorrect. Pornography is growing. We must protect the freedom of speech and what we circulate, but equally we must protect standards. In turn, we must make sure that we are not creating in certain areas a decadence that we have not had before that is damaging to society.

I hope that we might look again at Clause 12 and try to find a way for some accommodation to be found between the Government’s viewpoint and the views being expressed by people such as the noble Baroness, Lady Finlay. It is important that we do so; if not, we will have to start campaigning privately. If we cannot get it through law, we will have to bring together those concerned about pornography and look for ways to bring to the attention of people that it must be drawn to a halt or at least diminished, given the extent and pace at which it is growing at present. I think it can be done. We have a dry January; why should we not, in the month of December, encourage people not to engage in pornography? At least it would capture attention. If we want to have a better society, we should be diminishing this practice rather than growing it.

19:14
Lord Hastings of Scarisbrick (CB)

My Lords, it is an honour to follow the intriguing suggestion of the noble Lord, Lord Brooke, about December—which I will not repeat at this moment. I declare my interest as a former head of public affairs at the BBC who heavily lobbied this House in 1995 and 1996 to bring about the Broadcasting Act which set BBC online on its way. I am proud to say that BBC online remains a beacon of responsible content to show the rest of the world. I am also co-chair of the all-party group on media literacy and patron of Student View, which works in over 100 schools around the country to deliver media literacy.

In the original draft Bill, media literacy was not a central point but an important point of commitment. It has since been removed from the final legislation in front of us. As the Minister said in his introduction, there are multiple provisions in the legislation which cater for enabling adults to make sensible use of their media journey. However, there is very little, other than protections for children, to enable children to develop an intelligent understanding of their media journey.

According to the National Literacy Trust, in its assessment a few years ago, only 2% of children had the critical thinking skills necessary to be able to distinguish between fact and fiction online, and 90% of teachers say they are in favour of media literacy but feel that they do not have the skills to be able to teach it. They also feel that the vast majority of children they teach who discuss media issues consistently in the classroom do not understand the difference between truth and misinformation.

I want to keep it simple and say two things to the Minister and one to the Opposition. First, to the Minister, given the level of fines which should become apparent as a response to abuse of this legislation, money will be available to empower media literacy programmes inside and outside of schools. There should be no excuse that there is no money; the money in fines should go not just towards Ofcom’s costs but towards improving the capability of the next generation to navigate the media landscape. Will the Minister and the Government consider that?

It is obvious that media literacy is not in this Bill now because the Government argued it was essentially an education matter. In that case, will the Minister commit the Government—as he speaks for the Government—to bringing forward a media literacy education Bill before the next election? If it is not possible and there is to be a Labour Government after the next election, will the Labour Front Bench commit to bringing forward a media literacy education Bill, rather than simply letting this issue drift into the long grass? The noble Lord, Lord Stevenson, can answer that directly at the end and make a commitment on behalf of the Labour Front Bench to which we can all hold him.

There also needs to be substantial support for teaching teachers to understand and navigate a forest that they do not necessarily know how to enter or exit. That should be part of teacher development and support. Can we also consider the costs of misinformation and how it is damaging our social fabric? Can the Minister request of the Treasury that it brings forward cost assessments of the damage of misinformation?

19:18
Baroness Chakrabarti (Lab)

My Lords, the internet is in so many ways a wonderful new continent, discovered only in my adult lifetime. But like older territories it has not been the unadulterated bastion of freedom and equality that its pilgrim and founding mothers and fathers would have dreamt of. While it has created enormous opportunities for expression, interconnection and learning, it has also allowed the monetising of hate and abuses of power up to and including serious criminal offences to the detriment of children and other vulnerable people.

To a large extent, big tech corporations with monopolistic power have become the new imperium, colonising this new continent without the desire, expertise, independence or accountability to properly regulate or police it. Further, as the technology has moved at a breath-taking pace, national Parliaments and Governments have lagged behind in even fulfilling their basic duties to resource the enforcement of existing criminal law online or, indeed, to ensure sufficient tax raising from the new emperors who can employ former senior politicians for their lobbying, influence national elections via their products and seek to further their hegemony even beyond our shrinking, burning planet.

Alongside corporate and governmental neglect, there have been abuses of people’s rights and freedoms by state and non-state entities around the world. It is very possible to be too permissive in allowing private abuse and simultaneously too interventionist so as to abuse political power. Noble Lords would be wise to hold on to that duality as they undertake the most anxious line-by-line scrutiny of this Bill. With that in mind, given the length, novelty and complexity of this draft legislation, I regret the short time allocated today. The sheer number of speakers should have justified two days of Second Reading, if only to prevent de facto Second Reading speeches in Committee.

Legislation is required and the perfect should not be the enemy of a first attempt at the possible. However, given the fast developing and global landscape, further legislation will no doubt follow. Ultimately, I believe that His Majesty’s Government should seek to pioneer a global internet and AI treaty in due course—or at least, a Labour Government should. For one thing, the black boxes of advanced algorithms must be made transparent and subject to legal control so as not to entrench inequality, discrimination and hate.

That may sound ambitious, but it will take that kind of ambition—the kind of ambition that we saw in the post-war era to establish some notion of an international rule of law and fundamental rights and freedoms in the real world—truly to establish a proper rule of law with protected human rights in the virtual one. At the very least, what is already criminal should be policed online. However, we should be wary of outsourcing too much of that policing role to corporations without at least binding them more directly to the free expression and personal privacy protection duties that bind Ofcom, police and prosecutors under the Convention on Human Rights.

Furthermore, we should look again at tightening up over-broad public order offences, such as causing alarm or distress under Section 5 of the Public Order Act, before allowing them to constitute priority illegal content for proactive removal. Conversely, will the Minister confirm that, for example, euphemistic “sex for rent” adverts targeting poor, vulnerable women in particular will be a priority under Section 52 of the Sexual Offences Act? As this experiment in national regulation of an international phenomenon develops, the powers of the Executive to direct Ofcom set a dangerous politicising precedent for regimes elsewhere. They should be removed.

19:23
Baroness Newlove (Con)

My Lords, I am pleased to add my name to the Second Reading of such an important but complex Bill. There is very little time to speak on such positive and necessary legislation—200-plus clauses and 17 Schedules. But I know from experience of this Chamber that we will scrutinise every full stop to make it far better than when we received it.

While we must recognise that companies should have safeguarding policies and penalties in place, we should also never forget the lives of our young children, those who have been taken and the voices of bereaved families. They should be in the veins of this Bill right through to the end.

I say this as I remember that, in the trial following my husband Gary’s murder 15 years ago, some of the evidence shown was horrific violence downloaded on the offenders’ phones. The content was so horrific that the judge laid it on file for whenever they had parole hearings. It showed injuries identical to those Gary received—kicking and punching injuries that those on trial thought were very funny, even when they watched it in the courtroom from the dock. I now have three daughters who suffer from post-traumatic stress disorder. I have to ensure that they never forget their father, and do not just remember him lying on the ground that August evening.

In my role as Victims’ Commissioner, for seven years I had the pleasure and honour of listening to many victims and survivors of horrific crimes. Time is short but I would like to mention the mother of Breck. Her son was beautiful, bright and bubbly, only to become removed from any emotion and from his family. Breck was groomed online by an 18 year-old man who ran the internet gaming server that Breck and his schoolfriends used every day. Our children are most likely using Xbox consoles and have contact with these people from their own homes. The groomer used lies, manipulation and false promises to gain Breck’s trust. Despite many attempts by the family to stop Breck’s communication with his groomer, he ignored the safety advice he had been given by his family and was sadly lured to the groomer’s flat. On 17 February 2014, Breck was brutally murdered by this online groomer. So, the noble Baroness, Lady Kidron, and all those bereaved families who have worked tirelessly to make sure that the Bill has teeth and power to protect their loved ones, have my full support.

I thank Barnardo’s, the NSPCC, Refuge and the Centre for Women’s Justice for their briefing. My interest will be in the work and roles of the Victims’ Commissioner and the Domestic Abuse Commissioner, and the code of practice to protect the VAWG sector in light of women being 27 times more likely to be abused and harassed. I will be supporting my noble friend Lord Bethell’s amendment on age verification, regarding pornographic content that children can access. While this is for the professionals and is absolutely about penalising the guilty, we must never forget the families who have to live, every day, through the hardship and heartbreak of losing a loved one. We must ensure that there is a channel to protect those families and support them to have a better life in memory of their loved ones.

19:27
Lord Morrow (DUP)

My Lords, I welcome the Bill, but it is very long overdue. The Second Reading of my Private Member’s Bill was on 28 January 2022. It sought to commence Part 3 of the Digital Economy Act 2017, which would have ensured that age verification was applied to pornographic websites. It is disappointing that my Bill has not progressed and that Part 3 of the Digital Economy Act—a vital tool that could have prevented children accessing online pornography—is still not being implemented.

I remind your Lordships that in February 2016—now seven years ago—the Government said:

“Pornography has never been more easily accessible online, and material that would previously have been considered extreme has become part of mainstream online pornography. When young people access this material it risks normalising behaviour that might be harmful to their future emotional and psychological development.”


Nothing has changed in seven years; the threat is still as real today as it was then. All that has changed is that, during that seven-year delay, more children’s lives have been harmed. This cannot be allowed to continue.

I welcome that the Government have listened to the concerns about access to commercial pornographic websites and have, as a result, introduced Part 5 of the Bill. However, I believe more changes are needed to make it effective. Today, I raise only three of them. First, the Bill needs a more robust definition of pornography, based on the 2017 Act. Secondly, the Bill needs to cover all pornography services. Clause 71 says that only if “a service has links” with the UK will it be required to comply with the duties in Part 5, where “links with” means only pornographic websites which have a significant number of UK users or have the UK as a target market.

I ask the Minister: what will be considered significant? Is it significant in terms of the total UK adult users who could use the service, or significant in terms of potential global users? Either way, it seems to me that there could be pornographic websites accessed in the UK that are not required to have age verification to protect those aged under 18 from accessing this content. I doubt that this is what parents expect from this flagship Bill.

Finally, the Bill needs a commencement clause for age verification. Far too many young people have grown up without the protection that age verification could have brought, had the 2017 Act been implemented. We have heard others refer to this. There should be no further delay, and the Government should demonstrate the urgency that they spoke of when they announced in October 2019 that they would not be implementing the 2017 Act. Age verification needs to be implemented as soon as possible, and that is why a commencement date clause is needed in the Bill.

We cannot countenance these measures not being brought into force, or even a long delay of three or more years. The children’s charity Barnardo’s, which has already been referred to, has estimated that children have accessed pornographic content almost 55 million times since the Government announced in 2019 that they would be bringing forward the Online Safety Bill as an alternative to Part 3 of the Digital Economy Act. This cannot be allowed to continue. That is why we need to get the Bill right and ensure that robust age verification, that applies to all websites and social media accessed in the UK, is brought in as quickly as possible. I look forward to exploring these issues further in Committee.

19:31
Baroness Bennett of Manor Castle (GP)

My Lords, I begin with a brief reflection on my 26 years or thereabouts on the internet, which saw me hand-coding my first website in 1999 and sees me now, I believe, as one of the few Members of your Lordships’ House with a TikTok account. I have had a lot of good times on the internet; I have learned a lot, made a lot of friends and built political communities that stretch around the world in ways that were entirely impossible before it arrived. That tells you, perhaps, that I think we should be careful in this debate about the diagnosis of the source of the undoubted issues that the Bill seeks to address. It appears that some would like to wave a magic wand and shut it all down if they could—to return to some imagined golden age of the past, perhaps when your Lordships’ House was harrumphing loudly about the damaging effects of this new-fangled television.

While we are talking about young people, I have serious questions about the capacity of this House to engage with this debate. Yes, we did well in getting online during lockdown, even if we sometimes caught a glimpse of the grandchildren or great-grandchildren pressing the buttons so that their elders could speak in the House. They are the same generation over whose online activity we are now looking to take control. I invite noble Lords to keep that in mind as this debate proceeds.

I put it very seriously to your Lordships’ House that, before we proceed further, we should invite a youth parliament into this very Chamber. We should listen to that debate on this Bill very carefully. On few subjects is the need for votes at 16, or even younger, more obvious—the need for the experts by experience to be heard. They have the capacity to be the agents and to shape their own world, if their elders get out of the road.

I have no doubt that those young people would tell us that they suffer harm on the internet, with awful violent pornography and dangerous encouragements to self-harm and suicide. There need to be protections, while acknowledging that young people cannot be walled off into a little garden of their own. But I am sure young people would also say we need to address much wider issues, to build resilience and provide an education that encourages critical thinking rather than polished regurgitation of the facts. I would associate myself with the remarks of the noble Baroness, Lady Merron, and, indeed, the noble Lord, Lord Hastings of Scarisbrick, among others, about the need for media education. But how do we encourage critical thinking about the media when we are also encouraging regurgitation of the right results for the exam—that you have to repeat these 10 points? The two things do not fit together.

In a stairwell discussion with a Member of your Lordships’ House who is not a digital native—and I point out that nobody in this debate is a digital native—but is certainly someone with much experience over decades, they reflected on the early hopes of the internet for democracy, for access to information and for community. They suggested it was inevitably a lost age; I do not agree. Political decisions and choices allowed a handful of multinational companies—mostly tax dodging, unaccountable to shareholders, now immensely rich—to dominate. That is not unique to the internet; that is what the political decisions of neoliberalism over the past decades have done to our food supplies, our retailing systems, our energy, our medicines and, increasingly, our education system. Far right, misogynistic, racist, homophobic and transphobic voices have been allowed to take hold and operate without challenge in our mainstream media, our communities, our politics and on the internet.

Financial fraud is a huge problem on the internet and, hopefully, this Bill might address it; but financial fraud and corruption is a huge problem across our financial sector, as indeed is the all-pervading one of gambling. The internet is a mirror to our society, as well as a theatre of interaction. The idea that we can fix our societies by fixing the internet is a fallacy; for many with commercial and political interests, it is a comfortable one that deflects political challenges they would rather not face.

19:36
Lord Bethell (Con)

My Lords, if a child goes to the Windmill club, the most famous strip club in Soho, the bouncers will rightly turn them away, no ifs, no buts: no entry, full stop. If a child tries to buy a knife on Amazon or to place a bet on Bet365.com, it will be the same story: you need proof of age. But every day, millions of children in this country watch pornography in their homes, at schools, on the bus, on devices of all kinds, without any hindrance at all. The Children’s Commissioner makes it really clear that this is not just raunchy pornography like in the old days of Razzle magazine. These are depictions of degradation, sexual coercion, aggression and exploitation, disproportionately targeted at teenage girls. As Dame Rachel de Souza said:

“Most of it is just plain abuse”.


The effects of this failed experiment are absolutely disastrous. The British Board of Film Classification says that half of 11 year-olds have seen porn, and according to the NSPCC, a third of child abuse offences are now committed by children. The answer is straightforward in principle: we need to apply the rules on age verification for porn that exist in the real world to the online world. We need to address this harm immediately, before any more damage is done—before there is any metaverse or any more technology to spread it further.

I know that the Minister, the Secretary of State and the Prime Minister all broadly agree with this sentiment, and that is why the Bill has:

“A duty to ensure that children are not normally able to encounter content that is regulated provider pornographic content in relation to the service (for example, by using age verification).”


But this vague power simply starts a long process of negotiation with the porn industry and with tech. At a very minimum, it will require a child protection consultation, a children’s access assessment, a guidance statement, an agreement on child protection guidance and codes, secondary legislation, parliamentary approval of the Ofcom child protection code, monitoring and engagement, engagement on the enforcement regime, test cases in the courts—and so on.

I appreciate that we are creating laws flexible enough to cope with technological evolution and I totally support that principle, but we should not reinvent the wheel. We tried that 30 years ago when the online porn industry started, and it failed. We need one regime for the real world and for the online world. This is an opportunity to send a message to the tech industries and to the British people that we mean business about protecting children, and to put Britain at the vanguard of child protection regulation.

I want to see this Bill on the statute book, and I am very grateful for engagement with the Minister, the Bill team and all those supporting the Bill. I look forward to suggestions on how we can close this gap. But if we cannot, I will table amendments that replace Part 5 of the Online Safety Bill with Part 3 of the Digital Economy Act 2017—a measure that has considerable support in another place.

19:40
Baroness Grey-Thompson (CB)

My Lords, I draw attention to my interests as in the register, and I thank all those who have sent briefing notes. I do not think any of us underestimates the scale of what we have to achieve in the coming weeks.

Just this morning, I read an article in which Dame Rachel de Souza was quoted as saying that this Bill is an “urgent priority”. The article described a 12-year-old girl being strangled by her boyfriend during her first kiss:

“He had seen it in pornography and thought it normal.”


This afternoon, many figures have been quoted on children’s access to pornography, and each figure is deeply disturbing. I listened very carefully to the words of the noble Lord, Lord Bethell; he made a compelling argument, and I will strongly support any amendments he brings forward.

Along with age verification we need better education for children on the use of the internet, and on appropriate relationships. We have to be very aware of content that pushes weight loss, body image and appearance, appearance-improving ads, and images that have been altered.

I would like to concentrate on violence against women and girls, and I thank all the women who have been in touch with me. We must recognise the threat that women are under. Women are 27 times more likely to experience abuse—that is one in three women. Some 62% of young women have experienced abuse. Four out of five cases of online grooming involve girls, and 120 cases are being reported every week. To bring that closer to home, 93% of female MPs have experienced online abuse just for doing their job or having an opinion. I am not trying to stifle free speech. Yes, we have to accept criticism and challenge, but not abuse and threats. I really worry about us developing a social norm of trying to shut down women’s voices. I am mindful that we in this Chamber and in another place have a high degree of protection that women in the outside world do not. We live in a world where a rape threat against a woman can potentially remain online, but a woman talking about menstruation can be told that it breaches guidelines. The balance is not yet right.

I offer my support to my noble friends Lady Hollins and Lady Finlay regarding vulnerability; it does not end at the age of 18. We have to think about those who are vulnerable. The empowerment tools do not go far enough, and we need to explore that in more detail in Committee.

Finally, I pay tribute to my noble friend Lady Kidron. I thank her for her work and for arranging a meeting with the Russell family, and I thank Ian Russell for being here today. That meeting fundamentally strengthened my view on what we need to do. It was shocking to hear what various platforms deemed to be acceptable. I naively expected them to be better. It completely ignores those who are in a vulnerable position, who can be constantly bombarded with abusive images. I have spent the last couple of days trying to put into words my feelings on listening to what Molly went through. It is horrendous, and while we applaud the resilience and bravery of the Russell family, this is our chance to do so much more and to protect internet users.

19:44
Lord Davies of Brixton (Lab)

My Lords, I want to talk about the link between online financial scams and mental health. People who have problems with their mental health are, for a variety of reasons, more vulnerable to such scams. They are three times more likely to be the victims of online financial scams than people without such problems and, conversely, people who are victims of online scams are much more at risk of developing mental health problems.

I understand and have been impressed by the contributions to this debate about the problems faced by children and women, but I think, given the opportunity of the Bill, it is important that this issue is addressed. Such scams lead to much misery. They destroy families and, in all too many cases, lives. So the question is: can, and how should, the Bill address this problem? This is the Bill on the stocks and the one in which we must address this issue.

There is no doubt that scams are a big and growing problem. Anyone can fall victim to such a scam, but people with mental health problems are more at risk than others, so we have to do what we can, first, to improve scam prevention and, secondly, to ensure that when people fall victim they get the support that they need.

I have to pay tribute to the work being undertaken by the Money and Mental Health Policy Institute. It has drawn attention to how online harm can arise in a variety of areas: gambling, retail and financial offers. A number of recurring themes have emerged where action is needed, such as where people all too easily lose control of their transactions. There is also advertising and the way in which tools and techniques are developed that pressurise people into falling victim. The institute has concluded and demonstrated how, all too often, this behaviour goes unchecked, with regulation lacking or being poorly matched to what actually happens online.

While I understand the other issues that need to be addressed in the Bill and that led to it, the problem of online financial scams is serious enough to deserve attention in the Bill too.

19:47
Lord Browne of Belmont (DUP)

My Lords, it is beyond any doubt that an Online Safety Bill is needed. The internet has been left uncontrolled and unfettered for too long. While the Bill is indeed welcome, it is clear that more work needs to be done to ensure that it adequately protects children online.

There is a substantial body of evidence suggesting that exposure to pornography is harmful to children and young people. Many have spoken in this debate already about the harm of easy access to pornography, which is carried into adult life and has a damaging impact on young people’s views of sex and relationships. For many young men addiction to pornography, which starts in teenage years, can often lead to the belief that women should be dehumanised and objectified. Pornography is becoming a young person’s main reference point for sex and there is no conversation about important issues such as consent. That is why the Bill needs to have proper and robust age verification measures to ensure that children cannot access online pornography and are protected from the obvious harms.

Even if the Bill is enacted with robust age verification, experience tells us this is no guarantee that age verification will be implemented. Parliament passed Part 3 of the Digital Economy Act in 2017, yet the Government chose not to implement the will of this House. That cannot be allowed to be repeated. Not only must robust age verification be in the Bill, but a commencement date must be added to the Bill to ensure that what happened in the past cannot be allowed to happen again.

I know that some Members of the House are still fearful that age verification presents an insurmountable threat to privacy: that those who choose to view pornography will have to provide their ID documents to those sites and that their interests may be tracked and exposed or used for blackmail purposes. We live in an age where there is little that technology cannot deliver. Verifying your age without disclosing who you are is not a complex problem. Indeed, it has been central to the age verification industry since it first began to prepare for the Digital Economy Act, because neither consumers nor the sites they access would risk working with an age verification provider who could not provide strong reassurance and protection for privacy.

The age verification sector is built on privacy by design and data minimisation principles, which are at the heart of our data protection law. The solutions are created on what the industry calls a double-blind basis. By this, I mean that the adult websites can never know the identity of their users, and the age verification providers do not keep any records of which sites ask them to confirm the age of any particular user. To use the technical terms, it is an anonymised, tokenised solution.

The Government should place into the Bill provisions to ensure robust age verification is put in place, along with a clear time-limited commencement clause to ensure that, on this occasion, age verification is brought in and enforced. I support the Bill, but I trust that, as it makes its way through the House, provisions in it can be strengthened.

19:51
Lord Black of Brentwood (Con)

My Lords, I declare my interests as deputy chairman of the Telegraph Media Group and director of the Regulatory Funding Company, and I note my other interests in the register.

I welcome the Bill as the first rung on the ladder, ensuring that the unregulated, untransparent and unaccountable platforms begin finally to be subject to the legal strictures of regulation, accountability and transparency. In 1931, Baldwin famously said the press exercised power without responsibility. Now, the press is subject to intense regulation and tough competition laws, and it is the platforms exercising power without responsibility. This vital Bill begins the journey to rectify that.

It was an honour to sit on the Joint Committee and a huge pleasure to work with colleagues from across the House under the exceptional chairmanship of Damian Collins. In particular, the noble Baroness, Lady Kidron, brought such insight and energy to our work. I believe that, as a result of its work, the Bill strikes an appropriate balance between platform regulation, freedom of expression and the protection of quality journalism.

I will make just two points about the policy backdrop to this legislation. While regulation is crucially important, it is just one side of the coin: it must go hand in hand with competition. What is vital is that legislation to deal with digital markets and consumer protection follows swiftly. It is time—to coin a phrase—to level up the playing field between platforms and publishers.

For years, news publishers have operated in a deeply dysfunctional digital market, hampering efforts to realise fair returns for their content. Local and regional publishers continue to be hardest hit. Platforms generate a huge portion of advertising revenue from news media content: figures calculated by Cambridge professor Matt Elliott estimate UK publishers generate £1 billion in UK revenues for Google, Facebook, Apple and others each year.

The news consumption trend from print to digital means digital markets must function in a fair and transparent way to secure the sustainability of quality journalism. Google has more than a 90% share of the £7.3 billion UK search advertising market. That means platforms take news content for free while, at the same time, taking the bulk of the advertising revenue that would have paid for it in the analogue world.

I welcome the fact that the Government will bring forward legislation to deal with this by giving the Digital Markets Unit statutory powers and tough competition tools. It will be a world-leading digital regulator alongside this world first in online safety, paving the way for a sea change in how platforms operate and ensuring the sustainability of journalism.

As a new age of regulation dawns, I join my noble friend Lady Stowell in urging the Minister to ensure speedy implementation of changes that are the vital other side of the coin. The Joint Committee said in its report that this should happen as soon as possible. Indeed, these two pieces of legislation will feed off each other. As a joint report by the CMA and Ofcom concluded:

“Competition interventions can … improve online safety outcomes.”


My other point is the fluid nature of the legal ecosystem surrounding the platforms, which the noble Baroness, Lady Chakrabarti, mentioned. For almost 30 years the US tech giants have benefited from the protection of Section 230 of the Telecommunications Act of 1996. Passed while the internet was in its infancy, it provided platforms with safe harbours in which to operate as intermediaries of content without fear of being liable for it, which is why we now have the manifold, terrible problems of social media we have heard about today, which the Bill is rightly addressing. But times have changed, and that backbone of internet law is under intense scrutiny, above all from the US Supreme Court, which has for the first time in quarter of a century agreed to hear a case, Gonzalez v Google, challenging the immunity of companies that host user content online. The court’s decision will have a significant impact on the internet ecosystem, especially taken alongside anti-trust legal actions in the US and the EU. They are issues to which we will inevitably have to return.

The Bill—along with many other developments that will have a profound effect on competition, on regulation and on the protection of children—ushers in an era of radical change, but is, as we have heard a number of times today, only part of the journey. Let us now move forward swiftly to finish that job.

19:56
Baroness Featherstone (LD)

My Lords, the power to amplify, together with the volume and speed of the online world, has put power in the hands of individuals and organisations, for better or for worse. While we seek to control the worst, we also have to be aware that we now have the most extraordinary communication tool for ideas, gathering others to our cause and getting information around the world in a flash, as well as providing avenues for those in countries that do not have the miracle of free speech to contact the outside world, because their media, and they, are state-controlled.

Of course, what is illegal offline is illegal online. That is the easy bit, and where my preference undeniably lies. The new offences, dealing with what were some of the “legal but harmful” issues, cover off some of the most egregious of them.

In our last debate on freedom of expression, I said:

“I want maximum controls in my own home. Put power in my hands”.—[Official Report, 27/10/22; col. 1626.]


The user empowerment now in the Bill will target things such as suicide content, eating disorder content, abuse targeting race, religion, sex, sexual orientation, disability and gender reassignment, and the incitement of hatred against people with those characteristics. But I will argue, as others have, that a default setting must be in place so that such material is not available unless chosen. Thus the algorithmic onslaught of content that follows a single search can be averted. More importantly, vulnerable adults, who may not be capable of selection and exclusion, need that protection. We do not have to view what we do not want to see, but let that be our choice before we are fed it.

Equally absent, with the removal of the “legal but harmful” provisions, is violence against women. The onslaught of misogyny, bullying and worse aimed at women is dangerous and totally unacceptable. A whole raft of organisations are behind this push to amend Clause 36 to require Ofcom to develop a VAWG—violence against women and girls—code of practice. I hope and trust that noble Lords across the House will be in support of this.

I want the cyberflashing offence—sending pictures of genitals, which thankfully is now covered in the Bill—to be amended so that it turns not on whether the sender intended to cause harm, as in the Bill now, but on whether the sender had consent. Women are sick and tired of being made responsible for male misbehaviour. This time, let it be on the men to have that responsibility.

On children, age verification is nowhere near strong enough in the Bill in its current form. I trust that this will change during the Bill’s passage. Like probably everyone in this House, I pay tribute to the noble Baroness, Lady Kidron, for all the work she does.

In our legislative endeavour, we must guard against authoritarian creep, where the prohibition against what is truly harmful oversteps itself into a world where we are to be protected from absolutely anything that we do not like or agree with—or, worse, that the Government do not like or agree with. As others have said, the powers of the Secretary of State in the Bill are Orwellian and need to be pushed back.

Free speech presents challenges—that is the point—but the best way to challenge ideas with which you disagree is to confront them by marshalling better ethics, reason and evidence. Life can be dangerous, and ideas can be challenging. While we must not submit our intellect and freedoms to the mob, we must protect the vulnerable from that mob. That is the dividing line we must achieve in the Bill.

19:59
Lord Inglewood (Non-Afl)

My Lords, like other contributors to the debate, I support the Bill, but that does not mean that I think that it is perfect; we must be aware of letting the best be the enemy of the good. I declare my interests as a trustee of Full Fact and the Public Interest News Foundation.

I was very glad that the noble Lord, Lord Hastings, referred to the Broadcasting Act 1996, because, during its passage as a Bill, I was fulfilling the role that the Minister is performing today. I remember that, before coming to address your Lordships, I looked at the draft speech that had been prepared and which described the Bill at length. It was incredibly boring, and I said, “No, I am not going to do that; I want to describe to the House what the world that the Bill will bring into effect will look like”. I told your Lordships that I was taking them into a world of science fiction. In fact, I may have misled your Lordships on that occasion, because I underestimated the impact of the technology that was evolving. Also, I do not think that anybody realised quite to the extent that we do now that you cannot disinvent technology: things have happened which are here for ever from here on out.

While technology has changed, sadly one thing has not changed: human wickedness. Rather, human wickedness has been innovative. The Government tell us that they are great believers in innovation, but I do not think that they believe in innovation in this context. History suggests, and the contemporary world corroborates, that countering wickedness and vice is never easy, particularly when it is complicated by issues of jurisdiction, geography and technology.

My view is that this simply cannot be done by primary law or, indeed, secondary legislation. As the noble Baroness, Lady Stowell, touched on, we need all kinds of soft law and codes of conduct to complement that. She was right that we have to move on from the kind of legislative approach we have now, which I call “stop and start”. We have a period of intense debate in Parliament about a piece of legislation and then, as has been heard this evening, it is all forgotten for five years—and then you find that the piece of legislation you passed does not really meet the problems of the day. We must find a way of passing what I like to describe as “living legislation”, so that it is possible, in an ongoing way, to allow those things to evolve in response to the problems that the world is presenting. It is not simply a matter of a cosy relationship between the Government, the regulator, media companies, pressure groups, charities and so on; Parliament must be involved in doing what is, after all, its real job: law-making. I think that the public, too, need to know what is going on.

If I am right in saying so, and I think I am, this kind of static approach to law-making cannot really be what is needed in circumstances of the kinds we are talking about now. Parliament, this House and the other place together, should somehow take the metaphorical bull by the horns and evolve ongoing procedures to complement the technological evolution of the internet, which changes every day—indeed, things will have changed during the very debate we are having. I dare say that the same is true elsewhere, including in other sectors about which I know very little. If we, as parliamentarians, do not grasp this particular nettle, the consequence will be that the citizens of this country will materially lose control over quite a lot of what surrounds their daily lives.

20:03
Baroness Wyld (Con)

My Lords, I am sure I have been annoying my noble friend the Minister for the last year by asking him when the Bill is coming. Today is one of those days when everything happens at once: two of my daughters are out of school because of industrial action, and I spent most of this morning arguing with them about whether they could go on the internet and how long they could spend on it. They said it was for homework, and I said that it was not and that they should read a book; you can imagine it.

There is a point to that slightly grumpy anecdote. First, I take issue with the suggestion by the noble Baroness, Lady Bennett, that your Lordships’ House does not engage with the next generation. More to the point, there is a fundamental tension that millions of parents up and down the country face. Our children are online a lot and sometimes we want them to be online. Do not underestimate the way that lockdown accelerated their online lives through home-schooling, necessarily—I declare my interest as a non-executive at Ofsted. Sometimes this was to their advantage, but I suspect on the whole it was probably not.

My concern is that while children should be able to get on and do their homework, we have allowed big tech to mark its own homework. The really appalling evidence that we have heard today underlines the urgency to get this Bill right.

The noble Lord, Lord Knight of Weymouth, hit the nail on the head—he usually does—about the speed and complexity of the technology; it is just so fast. Most parents that I know certainly do their best to keep their children safe. It is a bit like Sisyphus rolling the boulder up the hill; it just comes back down, because it is so much easier now for our children to be deceived, abused and bullied and to view the stuff of nightmares. When this includes pornography sites, which many others have talked about, with characters from children’s TV such as “Frozen” and “Scooby-Doo”, I do not think it is particularly dramatic to wonder what we have become as a society to allow this sort of thing to happen. I welcome the consensus that we have heard around the need to protect our children, although it tragically is too late for many. I am sorry that the process has dragged.

I will work across the House at Committee stage and beyond to make sure that the Bill is sufficiently stringent, that the scope is correct and that it is workable, because we cannot risk giving parents and young people false reassurance or weak new systems. The noble Baroness, Lady Harding, was very clear on this and I share her concerns about app stores not being in scope.

Going back to pornography, I know my noble friend the Minister takes these things extremely seriously, but I do not see how anybody can feel reassured unless the Government commit to robust age verification, as set out by my noble friend Lord Bethell.

In the time I have left, I want to address cyber flashing. I am very glad the noble Baroness, Lady Featherstone, did so too. I completely agree that it should be based on consent. I felt weary having to have these sorts of conversations again: about victims having to somehow prove that they are not overreacting, or that, if it was a bit of a laugh, it does not really matter. It makes no difference to their experience. I do not want to be presumptuous, but I think there is cross-party impetus to ensure that the new offence is based on a principle of non-consent, and I hope the Government will be prepared to listen. This is no criticism of my noble friend the Minister, who is an excellent Minister with an excellent team at DCMS, but it seems to me that these issues have been left in the “too difficult” pile for far too long and we must not miss our chance now that it is here.

20:08
Baroness Kennedy of The Shaws (Lab)

My Lords, I do not think anyone in this House would disagree with the idea that freedom of expression is a very precious freedom. We have only to look around the world to see that authoritarian Governments almost invariably go after free speech as one of the first things that they do. We know that media freedom is a vital part of any democracy, as indeed is the rule of law, but as the noble Lord, Lord Black and the noble Baroness, Lady Chakrabarti, said, law has been pretty absent in this whole arena, even where it could have been used. I am glad that we are now addressing the complicated issue of regulating the internet and these platforms.

I do not want to see journalists’ privacy invaded so that their sources are exposed. I do not want any possible chilling effect on investigative journalism exposing corruption and abuse of power. It is vital to our democracy. However, we have to think very seriously about the kind of regulation that we have been discussing in this House, because it has been part of our tradition. Unlike the United States, we have not fetishised freedom of expression. We have seen that there have to be occasions when we restrict freedom of speech to protect people from serious harm. That is what this discussion today is really about and will be in the course of the Bill.

I declare that I am a trustee of 5Rights, which is the foundation created by the redoubtable noble Baroness, Lady Kidron. As a lawyer who is pretty well versed in the need for law, I have learned so much from her, and I believe that the major priority of this Bill has to be the protection of children. There are still gaps, and when the noble Baroness comes to put her amendments through, I will be there speaking in support of them. I hope that all noble Lords will come on board, because those gaps definitely still exist.

I want to speak to your Lordships about women, because last year I chaired an inquiry in Scotland into misogyny. It was a very powerful experience to hear from women and women’s organisations about the extent to which women are abused on the internet. What was absolutely overwhelming was that these were not only women in councils or parliaments, or women who were journalists or campaigners; in schools and universities, too, women were being traduced and abused. Threats to rape, sodomise or sexually assault women, and to facially disfigure them with acid, would take place online and then you would find people piling in. The pile-on is something this House should know about. It is where, because of algorithms and because of people having followers, huge numbers of people then jump on the bandwagon and add their bit of insult and abuse to what has gone before. Or you get “likes”. I once saw a television documentary saying that the man who invented the thumbs-up “like” regrets it to this day because, of course, he now has children and knows how painful that can be. Also, that business of liking is telling women that there are hundreds and thousands of people out there who think that these things should be done to them.

I really regret to say that, of course, it is not policed. There are not prosecutions, or only very rarely, because of the cover of anonymity, which is problematic. We are going to have to discuss this during the course of this Bill because it gives a veil over those who do it. As well as the pile-on, one of the difficulties is—and I say this as a lawyer—the thresholds you have to pass for criminal prosecution. People have learned that you do not say, “I’m going to come and rape you”; they say, “Somebody should rape you. You deserve to be raped.” The message to women, therefore, is not, “I’m coming to get you”, but “Somebody out there just might”. It has an incredible effect on women.

We have to have that in mind when we come to Committee. We have to recognise the urgency, in relation to children particularly, but we also have to be alert to the ways in which women and girls are finding their lives made wretched. They are made fearful because of threats. Prosecutions—criminal prosecutions—should be brought more regularly, because if there is anything that will stop this, it will be that. We have to be very vigilant about media freedom—I agree entirely—but we also have to make sure that we keep the Secretary of State out of this. I do not want to see politicians having their fingerprints on it, but the idea of a Joint Committee to monitor the way in which regulation takes place and to watch developments, because technological developments happen so quickly, is a good one.

We have to address algorithms. We heard from the Russell family that, even after Molly Russell had died, there on her technology she was receiving—it was being pushed at her—stuff about suicide, and the child was no longer alive. This is not about soliciting information; this is it being pushed in the direction of people. I urge this House, with all its usual great expertise, to make this Bill the best we can make it, certainly just now; but the priority first and foremost must be children.

20:14
Baroness Gohir (CB)

My Lords, it is a huge honour to speak immediately after the noble Baroness, Lady Kennedy. She is one of my sheroes; she did not know that but she does now—and it will be recorded in Hansard. I declare my interest as CEO of the Muslim Women’s Network UK. Let me start by saying that the speech from the noble Baroness, Lady Kidron, was heartfelt; I will support the amendments that she plans to put forward.

I will focus on four areas of concern: the abuse of women and girls; pornography; extremist and misogynistic content; and digitally altered body images. First, I share the concerns that have been raised many times today by noble Lords on the gaps in this Bill to tackle the online abuse and harassment of women and girls sufficiently. I therefore support the call from the noble Baroness, Lady Morgan, to introduce a code of practice.

Secondly, on pornography, I strongly support the recommendations from the noble Lord, Lord Bethell. I will also support any amendments that he plans to table. Inaction by successive Governments to tackle easy access to pornography by children has led to harmful sexual behaviour towards women and girls. This Government must go further to strengthen age verification. There is plenty of technology to do this. It can and should be implemented without delay.

Thirdly, there is a lack of accountability when it comes to publishing extremist and misogynistic online content. I am concerned that, according to the vague definition in the Bill, any online platform can call itself a recognised news publisher and then be exempt from complying with any requirement in the Bill. This will result in online platforms being free to promote harmful hate speech, including misogynistic content, and not having to remove it.

Finally, another urgent concern is the digital alteration of body images and sizes in advertising. Although boys are exposed to digitally altered images of men, girls are exposed to a far greater number of images of women that are highly manipulated and altered. Editing images of models involves taking inches off bodies and faces. The manipulation of images in this way is causing serious long-term harm, contributing to low self-esteem, anxiety, depression and self-harm and driving young people to cosmetic surgery. Given that advertisers are promoting an unattainable body size, this type of online communication is fraudulent and harmful; it therefore can and should be addressed in this Bill.

Earlier, the noble Baroness, Lady Merron, raised concerns about disinformation and misleading material being widely available and causing harm. This is a prime example of that, but it is often overlooked. I know that Luke Evans has introduced a Private Member’s Bill in the other place; however, this Online Safety Bill provides a prime opportunity to tackle this issue now. I urge the Government to listen to the serious concerns being raised by many campaigners, including Suzanne Samaka, founder of the campaign #HonestyAboutEditing. Other countries, such as Israel, France and Norway, have already taken decisive action by legally requiring altered images to carry a label. The UK has been left behind the curve. How will advertisers be held accountable? Will the Government consider legally requiring advertisers to label digitally altered images? Can the Minister inform the House of any alternative plans to tackle this harmful practice by advertisers, such as introducing a code of practice?

There is a common thread in all the concerns that I have shared today: how the weaknesses in this Bill will have a disproportionately negative and harmful impact on the lives of women and girls. If this Government are serious about protecting women and girls from harm, they must take a more holistic, robust approach to their safety.

20:18
Baroness Fall (Con)

My Lords, finally, the long-awaited Online Safety Bill arrives. The noise preceding it has been deafening. It is noise that we should be proud of, because it is the sound of a healthy democracy deliberating on some of the most crucial issues in our society: the balance between privacy and security, and between sensitivity and freedom of speech, goes to the integrity of our democracy.

These are not new issues at all, but the context is. Online safety is as broad as the landscape it inhabits, making this Bill of great complexity. I support it. Most of us in this Chamber grew up without the internet—something that our children find a total anathema. Now it is as common as the air we breathe. However, it is not as universally available, for access is controlled by a small number of tech companies that have for years declared themselves platforms and dodged responsibility for content. So a sort of terrifying social anarchy seems to have emerged, where no one is accountable or responsible for anything. This offers a free space for terrorists, easy access to pornography, hate speech and bullying. Social media is available 24/7, 365 days a year, which has driven some of our children to despair. We face growing concerns about how our democracy is being undermined and manipulated—about what is real and what is a Russian bot. Regulation was always coming, but the question is: what sort? We should always be mindful that we do not want the sort of highly censored internet we see in China.

How do we effectively regulate something like the net, which shifts like sand? I have a few points. First, I support the establishment of a duty of care for legal but harmful content for children. In my mind, restricting intervention only to content that is actually illegal falls woefully short of creating the sort of nurturing and safe environment we strive to create elsewhere in society for our children, whether in family units, at school or within the wider community. It is said that it takes a village to bring up a child, but now that village is online. However, we must be transparent about how we do this.

That brings me to my second point: we must avoid censorship with no transparency—whether it is by a government or a tech company—for it is only transparency that guarantees accountability.

Next, I turn to the point about anonymity, which the noble Baroness, Lady Kennedy, also raised, among others. It is my belief that the assumption in favour of anonymity on the web encourages people to be the worst, not the best, version of themselves. It gives disguise to trolls and bullies, and allows no off button and no shame. I support steps to encourage platforms to verify users’ identity. I understand that there will be some who cannot, such as victims or dissidents, but they can be drawn to sites that are known to protect them. Then there are those who will not, who can seek less mainstream sites, which we, as users, can choose not to use.

Fourthly, we should be doing more to address the challenge of the health of our democracy and the quality of discourse that underpins it. The insidious power of algorithms is driving us to echo chambers and polarising debate. We have lost a sense of a common truth, and with it what forms a lie. This is especially concerning around election campaigns, where fraudulent advertising or disinformation may be difficult to judge and may sometimes come from foreign agents. And what of spending limits? We carefully constructed these through Electoral Commission rules, yet there is a free-for-all on the web. I believe there is more we should do to secure the integrity of the poll online.

Some of the smartest people in the world created the internet; there is no reason why they cannot fix some of its worst characteristics. This is the first of what will surely be many Bills about online safety and how we regulate the internet. While we must strive to protect, we must also be mindful of the boundaries between privacy and security, and between freedom of speech and censorship. These are questions which have run for generations through our democracy and always will. We must understand and be honest with ourselves that, while this is a battle worth fighting, it is a battle we will never entirely win.

20:22
Lord Storey (LD)

My Lords, I speak in this Second Reading debate with little detailed knowledge of the digital world. I will probably be taking up my noble friend Lord Allan’s offer. I am not on Facebook, TikTok, Instagram or Snapchat; I have occasionally dabbled on Twitter. What I do have is 40-plus years’ experience as a teacher and head teacher. I have seen first-hand how children can have their lives turned upside down and how they have been physically and emotionally scarred by the effects of social media and the online world.

Yesterday, we heard from a study by the Children’s Commissioner for England how children as young as nine are being exposed to online pornography; how a quarter of 16 to 21 year-olds saw pornography while still at primary school; and how, by the age of 13, 50% had been exposed to it. You might say, “So what?” Do we want to hear that 79% of 18 to 21 year-olds have seen pornography involving sexual violence while they were still children? Do we want to hear that a 12-year-old boy had strangled a girl during a kiss because he thought that was normal? Do we want to hear that half of young people say girls expect sex to involve physical aggression? This all comes, by the way, from the Children’s Commissioner’s report.

The Online Safety Bill, as we have heard, has been a long time coming. The Government’s aim in introducing the Bill is to make Britain the best place in the world to set up and run a digital business, while simultaneously ensuring that Britain is the safest place in the world to be online. But does the Bill really achieve that for children? Childhood is about loving and learning. It is about innocence and enjoying the wonders of life. It is not about having that innocence and wonder shattered by some perverse online content.

My interest in this Bill is how we as a society can restore childhood to our children. The Bill, as the noble Baroness, Lady Kidron, said, must cite the UN Convention on the Rights of the Child, and General Comment 25 on children’s rights in relation to the digital environment. Citing this in the Bill would mean that regulated services would have regard to children’s existing rights. The limited scope of the Bill means that, as the 5Rights Foundation points out, children will still be exposed to harmful systems and processes, including blogs and websites that promote and encourage disordered eating, online games which promote violence, financial harms such as gambling, and parts of the metaverse which have yet to be developed. The Bill will not be future-proofed. Regulating only certain services means that online environments and services which are not yet built or developed are likely not to be subject to safety duties, which will quickly make the Bill out of date.

Turning to age verification, as a teacher it always worries me that children as young as seven or eight are on Facebook. In fact, 60% of UK children aged eight to 12 have a profile on at least one social media service. Almost half of children aged eight to 15 with a social media profile have a user age of 16 plus, and 32% of children aged eight to 17 have a user age of 18. Without age assurance, children cannot be given the protections needed to have an age-appropriate experience online. Some 90% of parents think that social media platforms should enforce minimum age requirements. We should do whatever we can to protect children from harm. The Bill will establish different types of content which could be harmful to children:

“primary priority content that is harmful to children … ‘priority content that is harmful to children’ and ‘content that is harmful to children’”.

I say that any content that is harmful to children should be dealt with.

As the noble Lord, Lord Hastings, has said, media literacy is hugely important to this Bill and should be included. Media literacy allows children to question the intent of media and protect themselves from negative impacts, be it fake news, media bias, mental health concerns or internet and media access. Media literacy helps children and young people safely navigate the digital world. I was a bit disappointed that the noble Lord, Lord Hastings, did not ask what a Liberal Government would do, but I can tell him that we would be dealing with this issue.

Yesterday, the Princess of Wales launched a campaign to highlight the importance of childhood. Children need to enjoy their childhood and grow up in a supportive, caring environment. They need good role models, not influencers. Children are very vulnerable, innocent and susceptible. We must do all in our power to ensure that online is a safe place for them, and to be able to say to the daughter of the noble Baroness, Lady Harding, that we did finally do something about it.

20:27
Lord Austin of Dudley (Non-Afl)

My Lords, it is clear from the last speech that we must do much more to protect impressionable young people from the torrent of racism, extremism and dangerous conspiracy theories online. Sites like Facebook and Twitter fuel division, anger and extremism, which can lead to threats and violence. Small sites like 4Chan, Odysee and Minds do not even have the third layer of the so-called triple shield. People are routinely targeted, intimidated, bullied and harassed, as we have heard in so many speeches during this debate. This has a terrible impact on public debate, let alone mental health.

Research by the Antisemitism Policy Trust revealed that there are two anti-Semitic tweets per year for every Jewish person in the UK. That report was before Elon Musk’s takeover relaxed the rules. As we have heard from the noble Baroness, Lady Anderson, women get it worse, with all sorts of disgusting abuse and even rape threats. Yet the Government have so far not accepted calls for an Ofcom code on violence against women and girls. Anyone of any age can set up a Twitter account with just an email address, giving them access to hardcore pornography. Future generations will be amazed that we allowed this lawless wild west to develop.

Sites like Twitter allow the repeated publication of completely false, defamatory and made-up images, making unfounded allegations of the most vile behaviour. Twitter ignores complaints, and even when you try to take them up and can show clearly how posts break its rules, it will not do anything about it. Twitter’s entire business model is based on fuelling argument, controversy and anger, which obviously leads to abuse and in some cases threats of violence. This can become addictive, leading to a terrible impact on people’s mental health.

People abroad are making billions out of poisoning public debate and making the mental health of vulnerable people worse. Imagine it: who would be allowed to set up a business delivering anonymous hate mail about other members of the public through people’s front doors? Yet that is essentially what Twitter is able to do. Why are we allowing billionaires abroad to decide what young people in the UK are subjected to, instead of Parliament, which is accountable to the public, setting rules that are properly understood?

You do not need to be paranoid to ask why hostile countries might use social media to undermine western societies with extremism and violent argument. This is not about limiting free speech or censorship—remember, these sites already curate what we see anyway—but about implementing proper systems of age verification and holding the executives to account when they break the rules.

I share the concerns of the noble Baroness, Lady Fall, about whether we really need anonymity on social media in the UK. Freedom of speech should not allow threats of violence or rape, or disgusting abuse. In any event, people have the freedom to say what they like, within the bounds of the law, but that does not mean they should not be held responsible for it. Nor is it true to say that this would affect whistleblowers in countries like ours. The people who make rape threats or publish violent abuse are not whistleblowers.

Finally, as the noble Lord, Lord Black, said, the boundaries between newspapers, broadcasters and social media companies are becoming more blurred all the time. Twitter and the rest of them are clearly publishers. They should be held to account for the material on their sites, in the same way as newspapers.

We need to see small, high-harm platforms brought into the scope of category 1 platforms; the re-introduction of risk assessments for legal harms; and a reversal of the current fudge on anonymity, with at the very least fines for platforms that are unable to know who their customers are. We need to look again at the status of these companies as publishers. Finally, we need to see action on search engines, including Google, which largely escape any actions in this Bill.

20:32
Baroness Prashar (CB)

My Lords, we are privileged to live in an age of internet technology, which gives us greater access to information and means of communication than at any point in human history. But to get the most out of this online world it must be safe and effectively regulated to counter harm and misinformation. I fully support the Bill. It is a good start, but it needs to be improved in a number of areas.

I begin by paying tribute to the noble Baroness, Lady Kidron, for her tireless work in this area, and for educating and helping us to focus on some of the core and fundamental issues. I will underline some of the amendments she proposes that I intend to support.

As the noble Baroness said, the Bill primarily focuses on user-to-user services and search engines, as defined in Part 2, but harmful content published on websites such as blogs falls outside the Bill’s scope. The noble Baroness’s amendment to include within the Bill’s scope any internet service likely to be accessed by a child is crucial. I strongly support it.

The Bill must also address business models that drive users to this content. As we know, this occurs through platforms, algorithms and push notifications, which amplify and perpetuate access to this content, as illustrated by the tragic death of Molly Russell. I support the noble Baroness’s amendment to Clause 10, which would ensure tough regulation in this area and assessments to tackle drivers of harm, including the design and features of the platform.

Furthermore, online safety should apply not just to children. The 2019 White Paper said that content that actively harmed any user should be tackled. It is deeply regrettable that the Government removed adult safety duties from the Bill, arguing that this would undermine free speech. On the contrary: online safety for all has the potential to enhance free speech, as people can engage on platforms without being exposed to harmful content. I urge the Government to reverse this decision.

The importance of balancing privacy online with the need for public safety is of course crucial. Encrypted messaging services such as Facebook Messenger or WhatsApp are right to keep private messages confidential, but the Government have argued that there are situations where law enforcement agencies must have access to messages on these platforms. Can the Minister explain how they intend to balance privacy and online safety with regard to encrypted messaging services?

Age-verification regimes need to be strengthened to ensure that children are not exposed to pornography. The noble Lord, Lord Bethell, made a very powerful case for that, and I strongly support the amendment which he will bring forward. His proposed amendment, he said, would bring Part 3 of the Digital Economy Act into Part 5 of the Online Safety Bill. As he said, this offers a very neat solution to addressing the significant gap in the Bill, and would make the definition of pornography online consistent with regulation of content in the offline world. I also support the amendment of the noble Baroness, Lady Kidron, to Part 4, which would task Ofcom with producing statutory guidance for age assurance. I also support her amendment to Part 7, requiring platforms to provide a point of contact to bereaved families or coroners when they have reason to suspect that a regulated service holds relevant information on a child’s death, and an amendment requiring social media platforms to share information with coroners in cases like Molly Russell’s.

While this Bill is about the online safety of children, it is also an opportunity to include provisions on online fraud, which predominantly affects the elderly. We need a regime in which law enforcement, financial services and tech platforms collaborate to reduce online fraud. Would the Government be willing to entertain an amendment that encouraged such collaboration, to ensure that user-to-user platforms and search engines are accountable for fraudulent advertising on their platforms?

The Government have signalled that they will put forward an amendment targeting videos of people crossing the Channel which show the activity in a positive light, which I of course support. Can the Minister assure the House that this amendment, intended to target those who encourage people smugglers, will not criminalise those who show sympathy online for asylum seekers?

Furthermore, civil liberties groups have described social media as a modern town square. To make sure that this town square is used positively, we need robust provisions for media literacy. The new media literacy duty in the draft Bill has been dropped; media literacy is now mentioned only in the context of risk assessment, and there is no active requirement for internet companies to promote it. There is a wide media literacy gap which leaves many at risk of harm. I agree with Full Fact that a stronger media literacy duty should be reinstated in this legislation, with Ofcom required to produce a statutory strategy.

Finally, this is a fast-changing area, as others have said. While we can improve this Bill, we cannot make it perfect. I therefore strongly urge that a commitment is given by the Government to subject this legislation to post-legislative scrutiny after three years.

20:38
Lord Balfe (Con)

My Lords, this debate has attracted a lot of attention: some 60 speakers, nearly all of whom have run over their time. I will just make one or two observations. First, we have been waiting a long time for this Bill, so we had better make a good job of it, because I doubt that the Government will let legislation through again for a good five or six years. The second point—I pick up something that my noble friend Lord Inglewood said—is that we need more flexibility in the law. Our procedures cannot keep pace with the speed at which the internet develops. It is no good saying that you can have a Henry VIII power, give it to a Minister and then forget it; we need to devise a method of reviewing laws on a regular rolling basis, such as they have in the United States, because the law will be out of date whatever we do.

I am fully behind the amendments of the noble Baroness, Lady Kidron, and my noble friend Lord Bethell. I think that they are excellent amendments, and I look forward to us discussing them. We do not need to do that now.

I would add into the procedures that we need to give careful thought to the idea of anonymity on the internet. I am against it, personally. I am a member of the Conservative Home page and I am there as “Richard Balfe”. Some people are there with very odd names, such as “Brussels Hater” and other handles which do not reveal who they are. I notice that the more obscure the name is, the more violent the contribution is. We need to look very carefully at anonymity; the people who need to hide behind anonymity are probably not the sort of people that we, in considering this Bill, would see as the best people to do things.

My next point is about penalties. The penalties look fine—for example, 10% of world turnover—but of course these are not penalties on the firms; they are business expenses, and that is how they will be seen. I am not a great admirer of the American system but I will say one thing that came out of a visit I paid to Washington. I talked to legislators about how they enforced legislation—in this case it was against financial firms—and the Congressman I was speaking to said, “It is very simple: you imprison them”. He said that if a Bill has a possibility of imprisonment, it puts the fear of God into directors in a way that no fine, however big, does, because that is a business expense and can be planned for. We need to look carefully at whether there should be a custodial element in the Bill for severe breaches. I think that would help to get it implemented. Otherwise, the danger I see is that we are in competition with lawyers based in Hollywood rather than with people based in London.

I look forward to the Bill passing; I hope we will do it carefully and considerately—I am sure we will—and take on board the amendments of my noble friend Lord Bethell and the noble Baroness, Lady Kidron, and the other improvements which have been mentioned.

20:42
Lord Dodds of Duncairn (DUP)

My Lords, I want to focus primarily on the safeguarding of children. I support the general provisions and intent of this Bill; it is clearly going to be a very important tool in keeping children safe online.

While the internet has so many benefits, it exposes children to myriad harmful content, such as pornography and content promoting self-harm and suicide, as well as targeted abuse and grooming. Molly Russell’s name has become synonymous with the Bill, and it is important that we get this legislation right so that harms online, such as those Molly encountered, are not just reduced but eliminated. We need to make the online world as safe as it can be for our children.

We know that young children are able to sign up for accounts on social media platforms with little or no protection from the harms they face; they are able to freely access pornography without restriction. It is shocking that over 60% of children under 13 have accessed harmful content online by accident. To safeguard children and young people thoroughly, we need to ensure that the protections for children offline are mirrored online. I fully endorse what the noble Lord, Lord Bethell, said and I will be supporting him in the amendments he brings forward. I also support those that will be brought forward by the noble Baroness, Lady Kidron.

In the offline sphere, under the Video Recordings Act, the British Board of Film Classification, for example, is responsible for classifying pornographic content to ensure not only that it is not illegal but that it meets established standards. None of these offline standards is applied online at all.

The online pornography industry has developed and evolved without any—never mind robust—regulatory oversight. But, given what is available online, much of which is illegal, oversight is greatly needed and overdue. The Bill provides the opportunity to put that right, and we must not miss this opportunity because, as we have heard, this may not return for some years.

Age verification was supposed to be implemented under the Digital Economy Act. As a result of the Government’s decision not to implement Part 3 of that Act, children have had unfettered access to pornographic content. Therefore, in my view, age verification needs to be implemented as swiftly as possible. A coalition of charities are proposing that Ofcom must prepare and issue a code of practice within four months of Royal Assent and that age verification should be implemented within six months. That is the very minimum that we should expect. We owe it to our children that they are not exposed to any more harm than they have been already.

Much of the debate in the other place on the issue of free speech focused on the Bill’s provisions to regulate what is legal but harmful. It is important that we ensure that the provisions of this Bill protect free speech, while at the same time protecting vulnerable people against deeply damaging material and content. The Bill now places a duty on user-to-user services

“to have particular regard to the importance of protecting users’ right to freedom of expression within the law”.

We need to examine the operation of this duty very carefully. Both the Bill and the law must reflect the principle that whatever you can say offline, on the street, should be protected online. Large internet companies should not have the power to decide what is and is not said online. If a company removes speech that would be legal offline, it must be placed under an obligation to give reasons why that speech was removed and be held to the highest standard of accountability for removing it.

Social media companies are enormous cartels that dominate our culture. The Government, in bringing forward this Bill, have concluded that they cannot be trusted with users’ safety. They cannot be trusted to keep their platforms safe, and equally they should not be trusted with free speech. I want also to endorse those noble Lords and Baronesses who have called for action to be taken against the awful abuse and trolling of women and girls online, and particularly the use of anonymous accounts. This issue needs to be tackled, and I look forward to working with others in Committee to strengthen the Bill in all these safeguards.

20:47
Lord Mitchell (Lab)

My Lords, they say that there is no such thing as a free lunch. When it comes to the social media companies, that is certainly true. Google Search is free, as are Facebook, Twitter, Instagram, WhatsApp, YouTube, TikTok and a host of other online services. All of them are great products, hugely popular and used by billions of people every day throughout the world. So it begs the question: why are they free? It is because the mass of data that the internet companies hoover up on their billions of users is a treasure trove. They collect data such as location, shopping, searches, medical records, employment, hobbies and opinions. It is said that Google alone has more than 7,000 data points on each one of us. In our innocence, we all thought that we were searching Google; little did we realise that Google was searching us.

What do they do with this hoard of data? They synthesise it through algorithms. They sell their results to advertisers. Traditionally, advertisers spent huge amounts on newspapers, television and other media, struggling to target their markets. It was imprecise. Today, using the data provided by the social media companies, advertisers can personalise their message and pinpoint it accurately. It is hugely cost effective and it generates hundreds of billions in revenue. Data truly has become the new oil.

Of the five largest companies in the world by market value, four are big tech: Apple, Microsoft, Alphabet/Google and Amazon. Indeed, Apple alone has a market value equal to the combined value of all the companies on the FTSE 100 Index. Big tech is bigger than most countries. The big tech companies are richer than us, they move faster than we do, they are aggressive, they are litigious, they are accountable to no-one, they have enormous power, and they make their own rules. They employ the smartest people in the world, even including a previous Deputy Prime Minister of our country.

The Zuckerberg shilling can buy a lot of influence. Let us take a look at Facebook. Its platform has allowed the most unspeakable acts of violence, hate and perversion to go viral, pretty much unchecked. It says that it moderates content, but it is not enough, and usually too late. Now we learn that Mr Zuckerberg is spending $10 billion a year on developing his metaverse. Already we have read of examples of virtual reality sex orgies, and participation in gruesome violence, all viewed through a Meta headset, where the avatars are quasi-people and it becomes almost impossible to distinguish reality from fiction. Imagine where that is all going. Frances Haugen, the Facebook whistleblower, had it right when she said that only profit motivates the company.

This is a landmark Bill. We have to get it right, and we have to make it tough, with no room for loopholes or ambiguities. I have tried to paint a picture of the participants. I have worked and been involved in the digital industry for over 50 years. I know the nature of the beast. They will fight to the last to preserve their business model. Do not underestimate them. These people are not our friends.

20:51
The Earl of Erroll (CB)

My Lords, several Peers have mentioned the Digital Economy Act 2017 and the sadness of the constitutional impropriety when the Executive refused to implement the will of Parliament. That really concerned me because, if it had been implemented, so many children would have been protected, for several years by now. We learned some useful things during its passage that could very much be applied in this Bill.

The first was on enforcement. This is always the big problem: how do you make them comply? One of the things that will work is the withdrawal of credit card facilities. If a Government or authority ask credit card companies to withdraw facilities from a company, they will, probably internationally. In fact, this happened only a few months ago to one of the big porn sites. It soon fell into line, so we know it works.

The other thing is that anonymous age verification is possible. The British Standards Institution issued PAS 1296 on how to do it—work I chaired at the time—and several companies implemented it. The website itself does not check; the check is done by an external company to make sure that it is right. The noble Lord, Lord Browne, has just explained exactly how it works. It was a very good explanation of the whole thing. About a year ago, they were intending to elevate it to an international standard because other countries wanted to use it. Certain European countries were very keen on it and are already putting things in place.

The other thing that struck me is this: what is meant by “legal but harmful”? It is an expression that has grown up over time, and I am not sure whether it means the same thing to everybody. In terms of pornography, which I and many of us are worried about, we do not want to be a modern Mary Whitehouse on the one hand, so we do not want to regulate for adults. But the noble Baroness, Lady Benjamin, who worked on this, explained all the dangers very well, as did several others. It is not just that children get addicted; they also do not learn how to treat each other and get completely the wrong impression of what they should do. In fact, horrifyingly, I heard that throttling, for instance, is on the increase because it has apparently been appearing on porn sites recently. It does not take long to corrupt the next generation, and that is my real concern: we are destroying the future.

To future-proof it—because that is the other worry—I would suggest quite simply that any website, regardless of size, that carries any pornography must require anonymous age verification for access. It is very simple. We may not want to prosecute the small ones or those that do not matter, but this approach allows us to adapt to whoever is successful tomorrow—because today’s success may disappear tomorrow, and a new website may come up that would not otherwise fall within scope.

The other thing I want to mention quickly is that anonymity is necessary because it is not illegal, for instance, for any Member of your Lordships’ House to go and access pornography, but it is severely career limiting if anyone gets to know about it—and that is the trouble. The same thing applies if you are a Muslim leader and wish to buy some alcohol online. That is why we need to have this. It is perfectly possible, it is out there and lots of companies can do it.

Finally, what is misinformation? Too often it is simply an opinion opposite to what you yourself think, and I think there are huge dangers in how we define it.

20:55
Lord Farmer (Con)

My Lords, like every noble Lord today I welcome the arrival of this Bill after such a long wait. Like many others, I will focus my remarks on pornography. Standing back and looking at what has shaped and is shaping our society, we cannot ignore the fact that never in the history of humankind have we been so deluged by pornography. Graphic sexual activity is accessed through the internet at the push of a button and is almost impossible not to stumble across.

What we have been missing until recently is the data on what this deluge is doing to us all. We have heard a lot about the Children’s Commissioner and her report yesterday. Her research found that most young people have seen pornography on Twitter, Instagram or Snapchat. Moreover, online pornography is not the same as the blue magazines previously available only by reaching up to the top shelf of the newsagents, for those who had the chutzpah in those days to do it.

The adult content accessible in our youth is, she says, “quaint” compared to today’s online pornography displays. Pouting page three-type nude stills have given way to video portrayals of degrading, sexually coercive, aggressive, violent, pain-inducing and exploitative acts being perpetrated particularly against teenage girls and, of course, younger children. The title of her report published this week—‘A Lot of it is Actually Just Abuse’—says it all. She highlights the dangers of the normalisation of sexual violence and the template this provides for children’s understanding and expectations of sex and relationships.

Her stats are a litany of innocence despoiled. Half of children have seen pornography by age 13; some 10% by age nine and more than a quarter by age 11. Some 79% see violent pornography before age 18 and frequent users are more likely to engage, as we have heard, in physically aggressive sex acts.

While I am most concerned about the impact of pornography on children and young people, we cannot ignore its prolific use by adults. International studies show that high-frequency pornography use is associated with poor semen quality and reproductive hormone quantity, as well as erectile dysfunction with flesh-and-blood partners. Meta-analyses show pornography use is never positively associated with relationship quality.

So, while the Bill has been much strengthened, I will support noble Lords, such as the noble Lord, Lord Bethell, who table amendments requiring: first, that all pornography websites and social media platforms implement third-party age verification; secondly, that there is a Bill-wide definition of pornographic content; thirdly, that online pornographic content is regulated in the same way as offline content; and, fourthly, that all pornographic sites ensure that actors are genuinely over 18, so that they are not facilitating child sex abuse.

To reiterate, the pornification of society is skewing our values and practices towards cruelty and selfish gratification in intimate relationships. It is undermining efforts to tackle abuse and violence, particularly against women and girls. Not bringing Part 3 of the Digital Economy Act 2017 into force was a dereliction of duty. I was involved at Report stage, and strong forces were clearly at work to preclude hampering adult access to pornography—even to material that would have been illegal offline. The priority then seemed to be securing adults’ continued access to violent, misogynist, racist and degrading material and protecting their privacy.

Almost six years on, my and others’ plea is that we strike a better balance: introduce an age-verification regime at the speed befitting this public and mental health emergency. Third-party providers can give adults the privacy they crave and children the protection to which they are entitled in a civilised society. For too long, we have bowed a knee to cyber libertarian ideology that says internet regulation is impossible, unworkable and unwanted. This Bill must take big, bold, well-evidenced steps to reverse the decades of harm this ideology has caused.

21:00
Baroness Foster of Aghadrumsee (Non-Afl)

My Lords, I am very thankful to be in this House to discuss this Bill. I know many Lords have commented on the Bill being rather late but, being a relatively new Peer, I am pleased to be able to contribute to this debate—it is something I have been active on, in another place, for quite some time. So I congratulate the Minister on bringing the Bill to the House.

Everything that a person sees on social media is there as a result of a decision taken by the platform that runs it—a point very powerfully made by the noble Baroness, Lady Kidron—and we have heard the tragic outworking of that for children in this debate. In an article in the Daily Telegraph on 13 December last year, it was reported that Meta knew it was promoting content harmful to teenagers—that was in an internal document leaked to CBS News. It suggested that Meta knew Instagram was pushing girls towards dangerous content.

I will not repeat the many valuable points that have been made on the safety of children—I support them all and will be supporting the amendments from the noble Baroness, Lady Kidron—but I want to make a number of further points, some of which are unfortunately born from personal experience, somewhat like those the noble Baroness, Lady Anderson, made earlier. Women and girls are disproportionately affected by abuse online. While I do acknowledge the user empowerment duties in Clause 12 and the triple lock, I am concerned that the Government’s proposals do not go far enough to protect women and girls. They put an onus on individual users to protect themselves, and while the individual can choose to opt out, that does not protect the millions of others who can still see the content.

As well as fearing for vulnerable women and girls who see such content, I am concerned at the chill factor to women and girls getting involved in public life. Many potential political candidates have said to me that they could not go through what I endure online, and so they do not. That is not good for democracy and not good for encouraging women to come forward. Therefore, I support the proposal to produce a code of practice on violence against women and girls modelled on Carnegie UK’s previous work on hate speech, and that it should be introduced as an amendment to Clause 36. I thank Carnegie UK for its work, over a long period of time, on these issues.

Additionally, it has to be said that some of the trolling against politicians and people who speak out on issues is undoubtedly orchestrated. I hope that that level of orchestration by vicious online mobs—the pile-on that the noble Baroness, Lady Kennedy, referred to—can be looked into as well. I hope the Minister will be cognisant of that point.

I am pleased that anonymity has been raised in the Chamber this evening. The argument goes that if everyone had to be identified and verified online, this would prevent whistleblowers and others, such as the victims of violence, coming forward and speaking out, so they need anonymity. I understand that argument but, given that the majority of abuse and criminal activity comes from anonymous accounts, surely there could be a way to protect genuine free speech users from those who overstep the line and threaten violence. I believe this could be achieved by platforms holding the ID of users behind a firewall that could be breached only if there were reasonable grounds to suspect that a criminal offence had been committed. There are those who use anonymity as a cloak of protection from criminal law. That needs to be challenged. I recognise that this is a cross-jurisdictional issue. However, it is one we need to tackle in this House.

Finally, I support and endorse the amendments being brought forward by the noble Lord, Lord Bethell, on those under 18 accessing pornography, particularly on robust age verification and a clear definition of pornographic content. I commend the work of the noble Lord and the coalition of NGOs that have been working with him. I thank them for their clear papers on this issue.

I support the principle of the Bill, but we will have a lot of work to do to strengthen it. I look forward to taking part in that.

21:05
Lord Russell of Liverpool (CB)

My Lords, I shall attempt to be brief but, based on previous experience with other speakers, that may be difficult. At least it gives the Whip on the Front Bench the chance to do some agile body moves.

I welcome this overdue Bill. I think the Minister got it slightly wrong when he congratulated us on waiting patiently for it. Judging by every single contribution around the entire House today, patience has been rather wanting. We want to get on with it. Like many government Bills, this has grown like Topsy. It has grown sideways, downwards and upwards. We need to beware of going around in circles. Above all, we need to expedite this and get it on the statute book.

I will focus on three key areas. Unsurprisingly, the first will be children. Here I declare that I am a governor of Coram, the oldest children’s charity in the United Kingdom. I will certainly support amendments such as those that the noble Lord, Lord Bethell, was talking about to try to bring in proper age verification.

Like many other noble Lords, on Monday I had the privilege of sitting in on the briefing that the noble Baroness, Lady Kidron, arranged. Ian Russell, the father of Molly Russell, was present, together with one of her sisters. What we saw was truly shocking. In some ways it was particularly shocking to me because, as Ian shared some of his daughter’s diary—what she had actually written in the days and weeks before she died—I had a sudden jolt of recognition. What 14 year-old Molly was saying was almost identical to the transcript of the suicide note that my father wrote to my mother, which I have in my desk at home. It has the same self-loathing, the feeling of worthlessness and the belief—completely wrong—that you would better serve those you love and live with by departing from this life. My father was a Second World War veteran who had won the Military Cross. He was suffering from manic depression and was clearly in a depressed state, but I cannot even begin to imagine the effect it must have had on Molly to be deluged, 24 hours a day, with the filthy, negative, awful, harmful content that she was served. Perversely, the more she looked at it, the more excited the algorithm got and the more she received.

Particularly disgraceful is that it took no less than five years for the family and their lawyer finally to get some of the platforms Molly had been watching to disgorge and show some of the content she had been viewing. Five years is wholly and utterly unacceptable.

I take the point that the noble Baroness, Lady Bennett, made about young people being involved. It would be a good idea for Ofcom in some way, shape or form to have access to young people advising it. I support in principle the idea of a Joint Committee of Parliament. Again, I think it would be very helpful to have young people advising that.

The second area is supporting the wonderful noble Baroness, Lady Kidron. I declare quite openly that I am a Beebanite. I think there are quite a few of us in the House, and we will do everything we can to support the wonderful noble Baroness in everything she does.

Lastly, I come to the companies. I speak as somebody who was a head-hunter for 30 years. A large part of our business was in North America and—surprise, surprise—a lot of our most wonderful clients were some of these new tech giants. Because of that, I know a lot about what I would call the psychology of attraction and repulsion. I can tell the House that for many years, on going to a candidate and saying, “Would you like to join Facebook? Would you like to join one of these other companies?”, they would get pretty excited, because it is new technology, there is a lot of money, it is sexy, it is probably in California—what could be better?

We have to change the paradigm in which people look at potentially being employed by those companies. We have to create a frisson of fear and forethought that, if they do join forces with those companies, not only might their personal reputation suffer but the reputation of the company will suffer, shareholders will suffer, and those who provide services to that company, be they banks or lawyers, will also suffer. That is what we need to change. I will do everything I can, working with others who probably know rather more about this than I do, to concentrate on getting into the minds of those companies, which have huge resources, legal and financial, to resist whatever we do. We have to get inside their minds, find their weak points and go for the jugular.

21:11
Baroness Fraser of Craigmaddie (Con)

An offence in this Bill is an offence under the law of any part of the UK. There is a complex interplay between online safety, which is reserved, and devolved matters such as child and adult protection, education, justice and policing. I realise that the legislative differences between Scotland and England are quite topical. The offence protecting people with epilepsy, for example, does not cover Scotland, as Scottish law already covers this behaviour; the same is true of the new cyberflashing offence.

However, the Bill does give Scottish Ministers the powers to amend regulations relating to priority offences in Part 2 of Schedule 6. I think government amendments in the other place mean that Scotland’s hate crime Act will not affect what people can and cannot say online in the rest of the UK, since it was passed by a devolved authority without the Government’s consent. But I believe a loophole remains whereby a future Government could simply approve that or any other law that has been passed in Holyrood, so Nicola Sturgeon could still become the content moderator for the whole of the UK. How should online providers therefore respond where there are differences in legislation across the four nations?

Access to data is clearly essential to ensure that the dynamic landscape of online harms is understood in the Scottish context. I am thinking of issues for rural and remote communities, how online platforms respond to sectarian content, or understanding the online experiences of people with drug or gambling addictions. Are there any differences across the UK? In terms of the transparency reports required by the Bill, will Ofcom be able to see that data in a nation-specific way?

Scotland has a thriving gaming industry, but it is unclear if there is industry awareness or involvement in this Bill and its implications for gaming platforms. I declare an interest as a board member of Creative Scotland. Will the Minister elaborate on what consultation there has been with gaming companies across the UK, including in Scotland?

The Bill rightly recognises that children are a vulnerable group, but has thought been given to the definition of a child throughout the United Kingdom, because in Scotland it varies. The 2014 Act includes all children up to the age of 18, but there are instances where someone aged 16 may legally be treated as an adult, and other circumstances where disabled or care-experienced children can be included in children’s services until their 26th birthday. As other noble Lords have mentioned, people with physical disabilities, learning disabilities or mental health issues, people in care, people with addictions and many more of all ages could be classed as being vulnerable online. What is the data on looking at online harms from purely an age perspective?

I note that there is an obligation to consult disabled people on decision-making, but should not all those within the CRPD definition of disabled be within the scope of the consultation requirements of the Bill? I would like to see the consultation duties under Clauses 36 and 69 strengthened. I also support calls from other noble Lords for requirements to be placed on providers to risk-assess their customer base, and to provide basic safety settings set to “on” by default.

However, I do welcome the Bill. It is, as others have said, a landmark piece of legislation. We will be far better off with it on the statute book than we are now, but I hope we can get some of the details right as it makes its way through your Lordships’ House.

21:15
Lord Hay of Ballyore (DUP)

My Lords, I very much welcome the Bill to the House, late as it may be. Like the noble Lord, Lord Storey, I know very little about the internet. I certainly know less about the sites we are talking about tonight, but I know that some of those sites are destroying our young people and poisoning their minds.

Age verification to keep children safe online was first debated in 2016. It is remarkable that a child who was eight years old when this proposal was first put forward will be an adult by the time the protections that they deserve are finally in place. Many children will have been allowed to live through their formative years exposed to untold harm online. A child who was eight in 2016 could potentially be in the grip of addiction by the time age verification is made a legal requirement. This did not need to be the case. The harms suffered by many teenagers over the last seven or eight years could have been avoided. As the noble Lord, Lord Dodds, indicated, if the Government had only done what they were supposed to do and implemented age verification through Part 3 of the Digital Economy Act, children could have been protected.

According to research by DCMS, 80% of children aged six to 12 have viewed something harmful online, while over 50% of teenagers believe that they have accessed illegal content online. We cannot allow children to continue to be let down. We need to ensure that robust age verification is in place, but, more than that, we need to get it right. While the Bill is a step in the right direction, I think there is a lot more work to be done. This is an important Bill, but it is also important for this House to get it right.

First, we need to ensure that age verification on pornography sites will be brought in on this occasion. The Government cannot be allowed to sidestep this issue. A clear commencement clause needs to be placed into the Bill.

Secondly, we need to ensure that age verification is in place not just for children accessing pornography; the age of those acting in the content must also be verified. User-to-user pornography websites are simply a hotbed of illegal material, including child sexual abuse content, and that needs to be stopped by the Bill. If it includes clear age verification for those involved in the content, it will be a valuable tool in ensuring that children are not exploited online.

Thirdly, we need to move to protect women and girls from the effects of online pornography. Harmful pornography content promotes violence against women and girls. Evidence shows that excessive consumption of some legal pornography material can result in offenders viewing illegal child sexual abuse material. As increasingly extreme pornography becomes available on mainstream sites, the threshold of what is acceptable is very much lowered.

There is much to support in this legislation: it offers an opportunity to ensure that we can protect women and children. I look forward to working with others to ensure that we can deliver on these important protections.

21:19
Baroness Jenkin of Kennington (Con)

My Lords, I welcome the Bill’s commitment to protecting children online, yet, like many noble Lords, I fear that it is not yet robust enough. I am extremely concerned about the current unfettered access that children have to online pornography—pornography that is violent, misogynistic, racist and deeply disturbing in its content. For example, analysis of videos recommended to first-time users on three of the most popular porn sites, Pornhub, Xvideos, and xHamster, found that one in every eight titles described sexual activities that constitute sexual violence as defined by the WHO. In most cases, that violence is perpetrated against women, and, in those videos, the women respond to that violence either with pleasure or with neutrality. Incest was the most frequent form of sexual violence recommended to users. The second most common category recommended was that of physical aggression and sexual assault. This is not the dark web, or some far corner of the internet; these are mainstream porn sites, and they are currently accessed every month by 1.4 million UK children.

Research released yesterday by the Children’s Commissioner states that the average age at which children first see pornography is 13. Accessing this brutal and degrading content has a devastating impact on their psychological, emotional, neurological and sexual well-being. I recommend a YouTube video called “Raised on Porn”, if noble Lords want to see the damage it can do. Boys grow up to believe that girls must enjoy violent sex acts, and girls are growing up to believe that they must enjoy painful and humiliating acts, such as anal sex and strangulation. Anecdotal evidence shows that the 5,000% increase in the number of girls going through puberty now wishing to identify as male is at least partly driven by seeing this vile porn and coming to the conclusion that they would rather not be women if that is what sex involves. Yet the Online Safety Bill does little to address this. While it includes regulations on age verification, pornography will not be defined as primary priority content until secondary legislation is made. Furthermore, according to the Ofcom implementation road map, multiple consultations and processes also need to be undertaken. As we have heard from other noble Lords, it may not be until 2027 or 2028 that we see robust age verification. We cannot wait that long.

Mainstream porn consists of acutely hardcore content, which, although it does not meet the narrow definition of illegal content, is none the less extremely harmful, especially when viewed by children. Depictions of sexual coercion, abuse and exploitation of vulnerable women and children, the incest porn I have already mentioned, humiliation, punishment, torture and pain, and child sexual abuse are commonplace. In the offline world, that content would be prohibited under the British Board of Film Classification guidelines, yet it remains online with no provisions in the Bill to address the staggering gap between the online and offline worlds. That is despite the Government recognising in their own research that

“there is substantial evidence of an association between the use of pornography and harmful sexual attitudes and behaviours towards women.”

Amending the Bill to protect women and children need not be a difficult task. As many noble Lords have mentioned, provisions were made to address those issues in the Digital Economy Act, although they were not implemented. We must not make those mistakes again and allow the Bill to pass without ensuring robust protections for children and society at large.

21:22
Baroness Parminter (LD)

My Lords, like other noble Lords, I welcome the Bill and the opportunity it presents, if it is strengthened, to address the many online harms which have been so eloquently outlined by colleagues around the Chamber. My starting point is ensuring that we do all we can to minimise the harms to those at risk of, or with, eating disorders. I declare an interest as the mother of a young adult daughter with anorexia, which is, as many noble Lords will know, the deadliest of all mental illnesses.

The evidence is clear about the harm that online content can do to people at risk of, or living with, eating disorders, and how it can exacerbate their conditions. Beat, the leading eating disorder charity, undertook research last year with 255 people with lived experience of eating disorders and their carers, which found that 91% of those with lived experience had encountered content which was harmful to their eating disorder condition. This includes sites that are innocuously called “pro-ana” and “pro-mia”, which encourage extreme starvation and extreme bulimic behaviours, and content that carries no warning when you see an image or a video of body checking or of people being fed by naso-gastric tubes, as though that were something to be applauded.

As the noble Baroness, Lady Gohir, said, there are images which have been digitally enhanced to present pictures of people’s bodies that are completely unrealistic but are not labelled as digitally retouched—unlike in France, where the law requires commercial images that have been digitally retouched to be labelled as such. It was good that the celebrity influencer Kylie Jenner, who may not be known to all noble Lords in this place, was called out in the media last week for digitally editing pictures of her body on social media. Calling that out is the right thing to do, and this Government should be doing more on it, including in the Bill.

It is not just that those images are out there. Other noble Lords have made the point that there are algorithms which constantly pump them at people. People with eating disorders feel bombarded by a constant stream of triggering images, content and advertising which feeds eating disorder behaviours and conditions. Obviously, you can recover from eating disorders; that is good news for those of us who know sufferers. But having talked to my daughter Rose about it, I know that what happens on TikTok is that your feed page—I think it is called a “for you” page—is based on the content you have been looking at recently. It will suck you back down into an eating disorder, just when people with these disorders are trying to get out. For the reasons given so well by other noble Members, algorithms need to be addressed.

I fully support what the noble Baroness, Lady Hollins, said about the insufficiency of the protections for adults. I cannot get my daughter to put food in her mouth to nourish her; how on earth am I going to get her or other vulnerable people to opt out in a different way from the social media content which is harming them?

It was excellent that Vicky Ford promoted this issue in the other place, as the noble Baroness, Lady Morgan, mentioned. She had some suggestions about ensuring that eating disorders were treated on a par with other harms and that the obligations on social media companies applied in respect of those disorders. I support that entirely and hope that I can work with other Members from around the House to ensure that we can shut that loophole down, so that people with eating disorders, and their carers, are given another tool in the fight against these vicious and deadly diseases.

21:27
Viscount Camrose (Con)

My Lords, no Bill that we can devise now can ever offer a complete solution to every online risk while balancing all the competing priorities. But I welcome this Bill as a critical early step down a hard road, because it sets up an adaptive structure to respond to emerging technologies and needs. We heard the phrase “living legislation” earlier, and that expresses it very well.

I would like to offer three examples of what some of our future challenges in this space are going to be. The first is AI: given the sheer quantity of content and genuine difficulty of some decisions that have to be made about that content, no platform can make the delicate judgments at the huge speed and scale we are looking for without automated algorithmic solutions. That inevitably comes to mean AI overseeing our activity and, given the vast behaviour-modification capabilities of the large platforms, AI coming to modify our collective behaviour in ways we are unlikely to understand or control. However benignly intended, the results of such developments are far-reaching and unknowable.

Secondly, there is digital identity. We have heard some brilliant contributions about this and I think we can all agree that a cornerstone of dangerous behaviour online is anonymity. Age-verification checks are easily circumvented today and I wholly support, of course, the analysis and proposals of my noble friend Lord Bethell in this area. There is a broad principle here: that online behaviour should be guided by the same constraints as behaviour in real life. In my view, the only real way to bring that about is by requiring a digital identity for everyone. That is not to say that everybody has to identify themselves at all times, but they should be identifiable if the need arises and should criminal or dangerous behaviour take place.

Thirdly, and lastly, there is the issue of enforcement, particularly in Web 3.0. We can foresee the enforcement of compliance by well-known platforms led and owned by household names, but we are increasingly going to see more and more online services provided by much larger numbers of decentralised platforms, run by so-called DAOs—decentralised autonomous organisations. These are organisations without boards and managers; they do not necessarily have employees or even bank accounts. They are going to require very different levers of enforcement. Put simply, you cannot easily apply criminal sanctions with neither owners to arrest nor real assets to seize. I am pleased that the Minister and his team have already started thinking about these organisations, as discussed at the briefing that he kindly arranged last week.

Of course, worrying about these future problems in no way diminishes the very real challenges of the present, which have been covered so movingly in our debate today. However, none of the risks to online safety is going to get any easier to manage. The growth of malicious activity and extremism will be multiplied by the greater emotional intensity of the immersive experience that will be enabled by some of the virtual reality technologies that we are now starting to see come on to the market. With this Bill, we are making a bold and important start, which I welcome, but I fear that the harder part of our journey lies ahead of us.

21:31
Lord Londesborough (CB)

My Lords, as a former journalist and online publisher, I welcome this Bill. It is imperfect, of course, but it is much needed, as can be seen by the deeply disturbing data around online media and its impact on the young and vulnerable.

I believe that the free-for-all nature of the digital age requires us to build far more rigorous layers of protection and regulation than ever before. I say this having benefited myself hugely as an entrepreneur both from freedom of expression and information and from the extraordinary reach of online media. However, in this digital era of business to consumer as well as consumer to consumer—whether via social media or user-generated content—we cannot let freedom of expression trump all else. Users need protection from not just unscrupulous organisations but each other.

This is about addressing damaging behaviour and unhealthy lifestyles that the digital world has engendered, especially among the young—and not just in the well-documented areas of online hate, abuse and bullying but around increasing obesity, falling levels of exercise, declining levels of academic performance and, some argue, lower economic productivity. The need for teaching media literacy could not be clearer.

As the noble Baroness, Lady Benjamin, pointed out, children come across pornography online from as young as the age of seven and more than 50% of 11 to 13 year-olds in the UK have accessed pornography. Even more staggering to me is that, by the age of 18, 79% of young people have been exposed to violent porn. Such exposure has contributed to surging increases in mental ill-health, child abuse, bullying, violence and sexual assault. The evidence is overwhelming—just read the research from the NSPCC, Barnardo’s, Parent Zone and many others.

This issue is so serious and widespread that, like the noble Lord, Lord Bethell, the noble Baroness, Lady Ritchie, and many others, I believe that, although it is well intentioned, the tightening regulation and guidance in Part 5 of the Bill do not go far enough. We must grasp the nettle and insist that all pornography sites, without exception, adopt robust, and ideally standardised, age-verification technology, as we have for online gambling. Given the nature of many of these sites, can we really trust them to abide by a new code of practice and expect Ofcom to enforce it effectively?

I accept that social media is a much more complex beast, but here too I believe the time has come for age verification. TikTok claims to have a minimum age requirement of 13, yet Ofcom reports that 42% of our eight to 12 year-olds are on that platform. Much of the content is unsuitable for children, but TikTok monetises traffic whatever your age. Elon Musk, take note: more than 40% of young people in this country have accessed porn via Twitter.

The majority of our children and grandchildren are being exposed to a barrage of disturbing content at the most formative stages of their lives. They need protection. Yes, the implementation of mandatory AV will depress audiences and revenues. It will raise privacy issues and there will be loopholes. But in my view the social benefits far outweigh the costs.

21:35
Lord Sarfraz (Con)

My Lords, I declare an interest as an investor, adviser and entrepreneur in the technology industry, as set out in the register. I welcome the Bill, although it is big, complicated and difficult to understand. However, there is a real risk that we are regulating the past instead of thinking about the imminent threats of the future. I will focus on two very narrow issues.

First, as immersive environments—metaverses—become more and more popular, we have the issue of actions in these environments. This is not content, photos, videos and texts but actions. Anyone can buy a haptic glove and inappropriately touch a child in a metaverse. The child would not even know that he or she was being abused. In fact, you can buy a vest with 30 different sensors so that it feels real.

There is a whole community around age play, where adults play the role of children. This is happening right now. There are virtual reality brothels with child avatars. What if that avatar has the likeness of a real child? How much of a likeness is a likeness? What if it has the name of a real child?

This industry, particularly the immersive industry, needs guidance, and it has said that it does. I hope the Minister can elaborate on the guidance that will be provided to it. On this note, I express my support for my noble friend Lord Bethell’s amendment on mandatory age verification. The technology exists and it works. There is no reason why it should not be implemented.

Secondly, the Bill defines user-generated content very clearly, but it is completely silent on machine-generated content. What if an AI chatbot was to groom or abuse a child? Who is responsible: the owner of the dataset on which that AI has been trained or the server on which that data has been transmitted? I thought: why not ask a chatbot? I did. It said, “Yes, an AI bot can abuse a child but liability for abuse by AI bots is a complex issue.” So AI bots are already trying to get out of liability for future abuse. That is what the machines are telling us today.

There are a lot of great things in the Bill and I support it very much, but we cannot always play catch-up with technology. I hope the Minister will tell us how this guidance will be provided as it relates to emerging technologies.

21:38
Baroness Uddin (Non-Afl)

My Lords, what a privilege it is to follow so many distinguished noble Lords, and in particular the speech of my noble friend Lord Sarfraz.

Our deliberation is ever more imperative, given the latest heightened and explicit concerns stated in the Children’s Commissioner’s report on young people and pornography. As a social worker, I have witnessed first hand the devastating aftermath and the lifelong impact of child sexual abuse and violence, long before children possessed the internet in their hands and pockets, and big tech companies used algorithms for content, evidently enticing children towards dangerous cycles of harm.

The backdrop of this Bill is the aim to make Britain a global leader for digital business while ensuring that it is the safest place online, and to navigate the balance between protecting consumers and stimulating innovation in a fast-moving digital world that can preserve safety and enhance freedom of speech without compromising one or the other. At a time of deepening and detrimental public services cuts, achieving the best outcomes for the legislation will require considerable financial resources, impactful monitoring and skilled oversight. The Bill will address many of the anomalies and flaws that plague the current system and stop it from preventing harms, as authoritatively detailed by my noble friend Lady Kidron. I salute her and acknowledge the presence of Mr Ian Russell. I too was horrified on hearing the briefing.

I welcome this opportunity to ensure that platforms are held accountable for their interactions with users, even chatbots. I also value innovations, emerging technologies and the right to freedom of expression, but, cognisant of the evident danger presented by many platforms, government cannot be the protector of profits to the detriment of young minds and lives. Big tech platforms have resisted remedies, including identity assurance and age verification. Therefore, I will definitely be supporting my noble friend Lady Kidron and the noble Lord, Lord Bethell—unless the Government concede beforehand. I cannot support preserving anonymity as a shield of protection for any subscribers, content-makers and users. If we end anonymity, it will be a huge leap forward in monitoring harmful content and in traceability.

As co-chair of the APPG on the metaverse and web 3.0, working with stakeholders in this space, I recognise the power of innovative technology as a force for good. At the same time, as a social worker, I want to scream out loud its threat. If we do not address the gravity of harmful content that normalises children viewing extreme material on violent pornography, diet, sexual exploitation, self-harm and revenge porn that shapes their young minds, we will have abdicated our role as protector of standards. Statistics from the NSPCC, Barnardo’s, Big Brother Watch and the Internet Watch Foundation, on unprecedented and worsening levels of online access to material on grooming, sexual abuse, self-harm, bulimia and millions of unfiltered pieces of content, make horrific reading.

Many NGOs are fearful that Ofcom is not fit to address these complex matters without incorporating children’s views into regulatory decision-making and, more importantly, to counterbalance the big tech lobbyists, their infinite resources and proficiency at skewing available data on child safety. I agree that Ofcom needs strengthening and must work with safeguarding experts to uphold standards, but it must also identify and respond to the evolving nature of harms across multidimensional interconnected platforms and a plethora of small, less well-moderated operators to ensure that children’s safety and voices are not drowned out by large tech companies whose business models are not predicated on protection and thorough risk assessment.

The APPG on the metaverse and web 3.0 wants to see children’s views prioritised, and we intend to incorporate them into our reports and programmes. Our partners are also considering the balance between safeguarding and the opportunity for increasing diversity within social media companies, recognising the historical disfranchisement and exclusion prevalent within the first wave of the social media revolution platforms. There is promise on the horizon from the newcomers—the smaller, emerging generation of conscientious organisations and companies that are proactive in engaging local communities, and inclusive in their approach. Widening participation will require institutions to consider workforce training in this sector.

Finally, the Online Safety Bill may not prevent all children from accessing harmful content as this new virtual space becomes more sophisticated within the infinite metaverse and artificial intelligence space. We will need to respond smartly to this rapidly shifting national and international digital environment of emerging technology, placing the safety of children at the forefront of our consideration.

21:44
Baroness Bull (CB)

My Lords, no one who has heard Molly Russell’s story can be in any doubt about the need to better protect young people online, and I join others in paying tribute to her family for their tireless campaign.

As we have heard, vulnerability online does not evaporate on turning 18. Some adults will be at risk because mental illness, disability, autism, learning disabilities or even age leaves them unable to protect themselves from harm. Others will be vulnerable only at certain times, or in relation to specific issues. The “legal but harmful” provisions were not perfect, but stripping out adult safety duties—when, as the Minister himself said, three-quarters of adults are fearful of going online—is a backward step.

With category 1 services no longer required to assess risks to adults, it is hard to agree when the Minister says this will be

“a regulatory regime which has safety at its heart”.

Without risk assessments, how will platforms work out what they need to include in their terms and conditions? How will users make informed choices? How will the effectiveness of user empowerment tools be measured? Without the real-time information that risk assessments provide, how will the regulator stay on top of new risks, and advise the Secretary of State accordingly?

Instead, the Bill sets out duties for category 1 services to write and enforce their own terms and conditions—they will be “author, judge and jury”, to quote my noble friend Lady Kidron—and to provide tools that empower adult users to increase control over the types of content listed at Clause 12. Harms arise and spread quickly online, yet this list is static, and it has significant gaps already. Harmful or false health content is missing, as are harms relating to body image, despite evidence linking body shaming to eating disorders, self-harm and suicide ideation. Smaller sites that target specific vulnerabilities, including suicide forums, would fall outside the scope of these duties.

Describing this list as “content over which users may wish to increase control” is euphemism at its best. This is not content some might consider in poor taste, or a bit off-colour. This is content encouraging or promoting suicide, self-harm and eating disorders. It is content that is abusive or incites hate on the basis of race, ethnicity, religion, disability, sex, gender, sexual orientation and misogyny, which evidence connects directly to violence against women and girls.

And yet tools to hide this content will be off by default, meaning that people at the point of crisis, those seeking advice on self-harm or starvation, will need to find and activate those settings when they may well be in an affected mental state that leaves them unable to self-protect. The complexities of addiction and eating disorders disempower choice, undermining the very basis on which Clause 12 is built.

We heard it said today that all adults, given the tools, are capable of protecting themselves from online abuse and harm. This is just not true. Of course, many adults are fortunate to be able to do so, but as my noble and expert friends Lady Hollins and Lady Finlay explained, there are many adults who, for reasons of vulnerability or capacity, cannot do so. Requiring the tools to be on by default would protect adults at risk and cause no hardship whatever to those who are not: a rational adult will be as capable of finding the off button as the one that turns them on.

Last week, Ministers defended the current approach on the basis that failing to give all users equal access to all material constitutes a chilling effect on freedom of expression. It is surely more chilling that this Bill introduces a regime in which content promoting suicide, self-harm, or racist and misogynistic abuse is deemed acceptable, and is openly available, harming some but influencing many, as long as the platform in question gives users an option to turn it off. This cannot be right, and I very much hope Ministers will go back and reconsider.

When the Government committed to making the UK the safest place in the world to be online, I find it hard to believe that this is the environment that they had in mind.

21:48
Lord Moylan (Con)

My Lords, it is hard to think of something new to say at the end of such a long debate, but I am going to try. I am helped by the fact that I find myself, very unusually, somewhat out of harmony with the temper of the debate in your Lordships’ House over the course of this afternoon and evening. I rather felt at some points that I had wandered into a conference of medieval clerics trying to work out what measures to take to mitigate the harmful effects of the invention of moveable type.

In fact, it probably does require an almost religious level of faith to believe that the measures we are discussing are actually going to work, given what my noble friends Lord Camrose and Lord Sarfraz have said about the agility of the cyber world and the avidity of its users for content. Now, we all want to protect children, and if what had come forward had been a Bill which made it a criminal offence to display or allow to be displayed to children specified harmful content—with condign punishment—we would all, I am sure, have rallied around that and rejoiced. That is how we would have dealt with this 50 years ago. But instead we have this; this is not a short Bill doing that.

Let me make three brief points about the Bill in the time we have available. The first is a general one about public administration. We seem to be wedded to the notion that the way in which we should be running large parts of the life of the country is through regulators rather than law, and that the independence of those regulators must be sacrosanct. In a different part of your Lordships’ House, there has been discussion in the last few days of the Financial Services and Markets Bill in Committee. There, of course, we have been discussing the systemic failures of regulators—that is, the box ticking, the legalism, the regulatory capture and the emergence of the regulator’s own interests and the way those interests motivate it. None the less, we carry on giving regulators more and more powers. Ofcom is going to be one of the largest regulators and one of the most important in our lives, and it is going to be wholly unaccountable. We are not going to be happy about that.

The second point I want to make is that the Bill represents a serious threat to freedom of speech. This is not contentious; the Front Bench admits it. The Minister says that it is going to strike the right balance. I have seen very little evidence in the Bill, or indeed in the course of the day’s debate, that that balance is going to be struck at all, let alone in what I might consider the right place—and what I might consider the right place might not be what others consider it to be. These are highly contentious issues; we will be hiving them off to an unaccountable regulator, in effect, at the end.

The third point that I want to make, because I think that I am possibly going to come in under my four minutes, is that I did vote Conservative at the last general election; I always have. But that does not mean that I subscribe to every jot and tittle of the manifesto; in particular, I do not think that I ever signed up to live in a country that was the safest place in the world to be on the internet. If I had, I would have moved to China already, where nothing is ever out of place on the internet. That is all I have to say, and I shall be supporting amendments that move in the general direction that I have indicated.

21:53
Lord Weir of Ballyholme (DUP)

My Lords, I also welcome this belated Bill, particularly its protections for children. All of us, I think, have very sadly witnessed over the last number of years the outcome of inquiries into a litany of horrific crimes against children committed through decades of historic institutional abuse. That abuse, sadly, was facilitated by inaction. That inaction might have been motivated by ignorance and complacency rather than complicity, but nevertheless society as a whole let down those generations of children. We must make sure that history does not repeat itself.

I am the first to admit that the internet can be a tool of great value. We saw during the recent pandemic, for example, the contribution that the internet was able to make to education, in a way that would have been inconceivable a decade ago. But there is also no doubt that there is a very negative side to the internet, through body-shaming, trolling, misogyny, anti-Semitism, racism and incitement to violence—among many other things—and most particularly the damage that occurs to our young people and the tragic loss of life in cases such as that of Molly Russell. That is why I particularly support the amendments that will be brought forward by the noble Baroness, Lady Kidron, and by the noble Lord, Lord Bethell.

We know that early exposure to pornography, particularly violent pornography, leads to degrading and destructive attitudes and actions, especially towards women, as has been highlighted by the Government themselves in their reports on violence against women and girls. Therefore, we must take definitive action to be able to counteract that.

As the noble Lord, Lord Bethell, has indicated, there are three particular areas on which we have to intervene when it comes to amendments. First, we need robust age verification, both for users and—as has been highlighted by a previous speaker—for those who are involved in the porn industry itself and who are producing it. We know that the porn industry, and many within it, are not exactly protective of those whom they employ, and we must make sure that everything is done to protect everyone who is underage.

Secondly, I believe that we need regulation that is clear and consistent: consistent in a single definition of pornography; consistent in ensuring that what is illegal offline is mirrored by what is illegal online; and consistent in ensuring that high standards apply across all platforms. I join with a number of speakers today who have been highly critical of large, conglomerate tech companies and the approach that they take, but that should not blind us to the fact that some of the vilest imagery, some of the vilest abuse and some of the vilest actions happen on small platforms as well. We must make sure that we hold all platforms equally to a high standard.

Thirdly, we must ensure, particularly in terms of age verification, that we see swift and early implementation. I agree that, in terms of the detail of regulation, Ofcom is best placed to be able to deliver that. However, we also know that the full package of regulations that Ofcom will produce might be three, four or five years away. We cannot allow that level of destruction to take place in the meantime. That means, particularly in regard to age verification, that we need to see early and swift intervention.

In conclusion, I think we have a good Bill, but it could be a better Bill. Collectively, we must ensure that it is the best Bill possible, so that we do not let down families and children—either of this generation or of future ones—in the way that previous generations were let down.

21:57
Lord Strathcarron (Con)

My Lords, it has been well observed that the social media companies and YouTube are now the public square—only, of course, they are not public at all but privately owned companies whose primary concern is to earn profits for their shareholders in the normal way. Against this, the reality is that we have effectively outsourced our censorship to Silicon Valley AI bots, and, faced with the prospect of enormous fines for breaching the new laws, these private companies are going to programme the AI bots to err on the side of caution. The bots, after all, have no way of knowing the legal cut-off point between mature teenagers and immature adults, and, of course, the censoring bot has no sense of irony or satire or parody or context.

The threat to free speech will therefore now come from two sources. First, as we have seen from the Twitter files, from Big Brother Watch’s Ministry of Truth report and from Matt Hancock’s diaries, Governments covertly lean on the platforms to suppress dissent from the official line. Secondly, the threat will come from these private companies instructing the bots not to go anywhere near anything that might upset the Governments. In this sense, both have crossed the line between attacking disinformation and attacking dissent, and the ability to express dissent is at the core of freedom of speech. We therefore now have the reality of big government and big tech working together to suppress freedom of expression.

I am looking forward to initiating or supporting any amendments that will check the power of government or big tech to shut down legitimate questioning voices, which, from the Great Barrington declaration to the Wuhan lab-leak theory to the ineffectiveness of masks to the collateral damage caused by the lockdowns, over and over again have often proved to be closer to the truth than the official government line at the time.

I would like to use the few moments left to support resistance to restricting end-to-end encryption, to support the initiatives of the noble Lord, Lord Bethell, on age verification, and to follow the lead of the noble Baroness, Lady Kidron, on child safety initiatives.

22:00
Lord Clement-Jones (LD)

My Lords, I thank the Minister for his detailed introduction and his considerable engagement on the Bill to date. This has been a comprehensive, heartfelt and moving debate, with a great deal of cross-party agreement about how we must regulate social media going forward. With 66 speakers, however, I sadly will not be able to mention many significant contributors by name.

It has been a long and winding road to get to this point, as noble Lords have pointed out. As the Minister pointed out, along with a number of other noble Lords today, I sat on the Joint Committee which reported as far back as December 2021. I share the disappointment of many that we are not further along with the Bill. It is still a huge matter of regret that the Government chose not to implement Part 3 of the Digital Economy Act in 2019. Not only, as mentioned by many, have we had a cavalcade of five Culture Secretaries, but we have also diverged a long way from the 2019 White Paper with its concept of the overarching duty of care. I share the regret that the Government have chosen to inflict last-minute radical surgery on the Bill to satisfy the, in my view, unjustified concerns of a very small number in their own party.

Ian Russell—I pay tribute to him, like other noble Lords—and the Samaritans are right that this is a major watering down of the Bill. Mr Russell showed us just this week how Molly had received thousands and thousands of posts, driven at her by the tech firms’ algorithms, which were harmful but would still be classed as legal. The noble Lord, Lord Russell, graphically described some of that material. As he said, if the regulator does not have powers around that content, there will be more tragedies like Molly’s.

The case for proper regulation of harms on social media was made eloquently to us in the Joint Committee by Ian and by witnesses such as Edleen John of the FA and Frances Haugen, the Facebook whistleblower. The introduction to our report makes it clear that the key issue is the business model of the platforms, as described by the noble Lords, Lord Knight and Lord Mitchell, and the behaviour of their algorithms, which personalise and can amplify harmful content. A long line of reports by Select Committees and all-party groups has rightly concluded that regulation is absolutely necessary given the failure of the platforms even today to address these systemic issues. I am afraid I do not agree with the noble Baroness, Lady Bennett; being a digital native is absolutely no protection—if indeed there is such a thing as a digital native.

We will be examining the Bill and amendments proposed to it in a cross-party spirit of constructive criticism on these Benches. I hope the Government will respond likewise. The tests we will apply include: effective protections for children and vulnerable adults; transparency of systems and power for Ofcom to get to grips with the algorithms underlying them; that regulation is practical and privacy protecting; that online behaviour is treated on all fours with offline; and that there is a limitation of powers of the Secretary of State. We recognise the theme which has come through very strongly today: the importance of media literacy.

Given that there is, as a result of the changes to the Bill, increased emphasis on illegal content, we welcome the new offences, recommended in the main by the Law Commission, such as hate and communication crimes. We welcome Zach’s law, against sending flashing images or “epilepsy trolling”, as it is called, campaigned for by the Epilepsy Society, which is now in Clause 164 of the Bill. We welcome too the proposal to make an offence of encouraging self-harm. I hope that more is to come along the lines requested by my noble friend Lady Parminter.

There are many other forms of behaviour which are not and will not be illegal, and which may, according to terms of service, be entirely legal, but are in fact harmful. The terms of service of a platform acquire great importance as a result of these changes. Without “legal but harmful” regulation, platforms’ terms of service may not reflect the risks to adults on that service, and I was delighted to hear what the noble Baroness, Lady Stowell, had to say on this. That is why there must be a duty on platforms to undertake and publish risk and impact assessments on the outcomes of their terms of service and the use of their user empowerment tools, so that Ofcom can clearly evaluate the impact of their design and insist on changes or adherence to terms of service, issue revised codes or argue for more powers as necessary, for all the reasons set out by the noble Baroness, Lady Gohir, and my noble friend Lady Parminter.

The provisions around user empowerment tools have now become of the utmost importance as a result of these changes. However, as Carnegie, the Antisemitism Policy Trust, and many noble Lords today have said, these should be on by default to protect those suffering from poor mental health or who might lack faculty to turn them on.

Time is short today, so I can give only a snapshot of where else we on these Benches—and those on others, I hope—will be focusing in Committee. The current wording around “content of democratic importance” and “journalistic content” creates a lack of clarity for moderation processes. As recommended by the Joint Committee, these definitions should be replaced with a single statutory requirement to protect content where there are reasonable grounds to believe it will be in the public interest, as supported by the Equality and Human Rights Commission.

There has been a considerable amount of focus on children today, and there are a number of amendments that have clearly gained a huge amount of support around the House, and from the Children’s Charities’ Coalition on Internet Safety. They were so well articulated by the noble Baroness, Lady Kidron. I will not adumbrate them, but they include that children’s harms should be specified in the Bill, that we should include reference to the UN convention, and that there should be provisions to prevent online grooming. Particularly in the light of what we heard this week, we absolutely support those campaigning to ensure that the Bill provides for coroners to have access to children’s social media accounts after their deaths. We want to see Minister Scully’s promise to look at this translate into a firm government amendment.

We also need to expressly future-proof the Bill. It is not at all clear whether the Bill will be adequate to regulate and keep safe children in the metaverse. One has only to read the recent Institution of Engineering and Technology report, Safeguarding the Metaverse, and the report of the online CSA covert intelligence team, to realise that it is a real problem. We really need to make sure that we get the Bill right from this point of view.

As far as pornography is concerned, if we needed any more convincing of the issues surrounding children’s access to pornography, the recent research by the Children’s Commissioner, mentioned by several noble Lords, is the absolute clincher. It underlines the importance of the concerns of the coalition of charities, the noble Lord, Lord Bethell, and many other speakers today, who believe that the Online Safety Bill does not go far enough to prevent children accessing harmful pornographic content. We look forward to debating those amendments when they are put forward by the noble Lord, Lord Bethell.

We need to move swiftly on Part 5 in particular. The call to have a clear time limit to bring it in within six months of the Bill becoming law is an absolutely reasonable and essential demand.

We need to enshrine age-assurance principles in the Bill. The Minister is very well aware of issues relating to the Secretary of State’s powers. They have been mentioned by a number of noble Lords, and we need to get them right. Some can be mitigated by further and better parliamentary scrutiny, but many should simply be omitted from the Bill.

As has been mentioned by a number of noble Lords, there is huge regret around media literacy. We need to ensure that there is a whole-of-government approach to media literacy, with specific objectives set for not only Ofcom but the Government itself. I am sure that the noble Lord, Lord Stevenson, will be talking about an independent ombudsman.

End-to-end encryption has also come up; of course, that needs protecting. Clause 110 on the requirement by Ofcom to use accredited technology could lead to a requirement for continual surveillance. We need to correct that as well.

There is a lot in the Bill. We need to debate and tackle the issue of misinformation in due course, but this may not be the Bill for it. There are issues around what we know about the solutions to misinformation and disinformation and the operation of algorithmic amplification.

The code for violence against women and girls has been mentioned. I look forward to debating that and making sure that Ofcom has the power and the duty to produce a code which will protect women and girls against that kind of abuse online. We will no doubt consider criminal sanctions against senior managers as well. A Joint Committee, modelled on the Joint Committee on Human Rights, to ensure that the Bill is future-proofed along the lines that the noble Lords, Lord Inglewood and Lord Balfe, talked about is highly desirable.

The Minister was very clear in his opening remarks about what amendments he intends to table in Committee. I hope that he has others under consideration and that he will be in listening mode with regard to the changes that the House has said it wants to see today. Subject to getting the Bill in the right shape, these Benches are very keen to see early implementation of its provisions. I hope that the Ofcom implementation road map will be revised, and that the Minister can say something about that. It is clearly the desire of noble Lords all around the House to improve the Bill, but we also want to see it safely through the House so that the long-delayed implementation can start.

This Bill is almost certainly not going to be the last word on the subject, as the noble Baroness, Lady Merron, very clearly said at the beginning of this debate, but it is a vital start. I am glad to say that today we have started in a very effective way.

22:12
Lord Stevenson of Balmacara (Lab)

My Lords, I start by apologising for having absented myself during part of the debate. I promise those noble Lords whose speeches I missed that I will read them very carefully. The reason is slightly self-serving: I decided to tear up my speech, for two reasons. First, I suddenly realised that the noble Lord, Lord Clement-Jones, being the brilliant lawyer he has been and still is, would probably say everything I was going to say but better—and indeed that has proved to be the case. There is not much point in me boring noble Lords by trying to repeat what he said. The list of items I had is almost exactly identical. I did not give it to him, but we had an exchange of views before the debate, so I was not surprised by that. I will come on to that point.

Secondly, I want to deal with the noble Lord, Lord Hastings, who challenged me in my very junior position as an acting Front-Bencher to commit the Labour Government to a future policy on media education. I am sure the noble Lord opposite will not out-trump me on this one, but I cannot do that. I will, however, get back at him, because I will say that the BBC has never been in better shape than when he was the PR person operating at the front of it. In fact, I do not think it has recovered since he left, so there you are. I think that what he said was quite important.

One of the big, strange things about media education—in fact, this is true of most education policy—is that it is very hard to get changes in the education system. That is partly because it is now so disparate and uncoordinated in many ways, through policy, that you cannot say that there is a core curriculum, or that it will include media education and that that will be examined on the following days, as they might do in other countries such as France. The Government should think very hard about how they might take forward the idea from the noble Lord, Lord Hastings. My answer is that you have to examine media education or assess it in some way, otherwise schools will not care about it. This is really a question for Ofsted, not Ofcom. In a sense, the Government have got it right there, but if we could put some pressure on Ofsted to include in its assessment of all schools—indeed, all education at that level—some form of ability to assess whether media education is meeting the needs of Ofcom or the needs of society, we might make some progress. Let us work on that together.

I declare an interest as a member of the Joint Committee on the pre-legislative scrutiny of the Bill. That was a wonderful experience and has been mentioned by others. I am also a former member of the Communications and Digital Committee. I should also drop in that I am a veteran of the Digital Economy Act—much mentioned today—so I have been there, got the scars and am aware of the issues very clearly.

The second reason why I wanted to tear up my speech was that it seemed to me that, as the noble Lord, Lord Clement-Jones, said, there has been an extraordinary amount of agreement on the issues facing the House in trying to get this Bill right. They are not fuelled in any sense by party-political points, because we have no political issue in this, and I do not think the Liberal Democrats or Cross Benches have. We are talking about an issue on which we want to work together. I will come back at the end with a proposal, which I think is slightly novel, for how we might take advantage of that. I do not think we want to get ourselves into a situation of antagonism—firing amendments across the Dispatch Box during Committee—because we are broadly agreed about where we want to go. Yes, there are differences of detail, but we have to think about it. I want to come back to that as an issue—and that was what I was doing while I was away.

I want to go back to the introduction to the Joint Committee report, as I would have done in my original speech, because it says so much about what we have been doing in the last two or three years. Self-regulation of online services had failed. While the online world has revolutionised our lives and created many benefits, underlying systems designed to service business models based on data harvesting and micro-targeted advertising shape the way we experience it. Algorithms, invisible to the public, decide what we see, hear and experience. For some service providers, this means valuing the engagement of users at all cost, regardless of what holds their attention. This can result in amplifying the false over the true, the extreme over the considered, and the harmful over the benign. The human cost can be counted in mass murders in Myanmar, intensive care beds full of unvaccinated Covid-19 patients, insurrection at the US Capitol, and teenagers sent down rabbit holes of content promoting self-harm, eating disorders and suicide. As we have learned, we do not just mean teenagers—there are others involved in that. As the noble Baroness, Lady Kidron, and others have reminded us, too many children have suffered from infractions of this type. I pay tribute, again, to Ian Russell—who is still with us—for his campaign and for his extraordinary willingness to share his story. We all owe him a great debt.

These points, already made in other speeches, are important; they are at the heart of what this is about. This is about finding a way of organising what we all value, want and need, in a way that will allow us to get the benefits from it without paying the price that we already are. This debate, in the best traditions of this House, has brought a lot of views to bear on this, but, as I have tried to explain, it seems to me that a lot of them are very similar. There are differences and one or two outliers, but the points made broadly point in one direction: that the Bill is nearly there. It needs a little work and a bit of polishing and it will get over the finishing line.

The Bill needs to be in its best shape—there is no doubt about that—but we could identify alongside it the other issues that we will need to return to in future. We should not worry about that; I think we have all agreed that there will be other opportunities to do so. As we were reminded by the noble Lord, Lord Black, and others, there are other elements that also need to go ahead, and we should be thinking harder about them—the DMU and the need for competition in this whole area. As I said, the noble Lord, Lord Clement-Jones, gave a very good summary of all the issues; I will not run through them again because it was exactly what I would have said myself.

We are in a very strange situation. There is no political divide and we all want the same things: we want the Bill improved and we want to see it pass as soon as possible. I am assuming that the Government will work with us on that—that is an assumption, because that is not the normal way it goes. I am assuming also that they recognise that there are one or two quite sensible compromises to be made—again, that is not a given, but I am getting a few nods that suggest that it might be the case. From this side, I cannot think of any issue that I have heard today, or in any of the discussions we have had recently about this Bill—and they have gone on for a number of years—that we would push to ping-pong. That is very unusual.

I suggest that we try to work together on getting the best Bill we can—while, of course, going through the various stages, because these things all eventually have to go back into the Bill—avoiding the war of attrition approach that so often bedevils the work we do here. Such an approach is important when there are big political issues at stake, but there are not, so let us use that and try to move forward. I would like to get together quite quickly and identify the policies we can move on together, and to take a route forward which will minimise the votes and the dissent and yet deliver the Bill, let us hope, by Report. That is a big ask; I do not think it has been done, except during wartime. But we are at war—at war with these people who are trying to run our lives, and we should try to get together and defeat them. It is unusual, but we live in unusual times. I look forward to hearing from the Minister.

22:20
Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful to the very many noble Lords who have spoken this afternoon and this evening. They have spoken with passion—we heard that in the voices of so many—about their own experiences, the experiences of their families and the experiences of far too many of our fellow subjects, whose harrowing experiences demonstrate the need for this Bill. But noble Lords have also spoken with cool-headed precision and forensic care about the aspects of the Bill that demand our careful scrutiny. Both hearts and heads are needed to make this Bill worth the wait.

I am very grateful for the strong consensus that has come through in noble Lords’ speeches on the need to make this Bill law and to do so quickly, and therefore to do our work of scrutiny diligently and speedily. I am grateful for the very generous and public-spirited offer the noble Lord, Lord Stevenson, has just issued. I, too, would like to make this not a party-political matter; it is not and has not been in the speeches we have heard today. The work of your Lordships’ House is to consider these matters in detail and without party politics intruding, and it would be very good if we could proceed on the basis of collaboration, co-operation and, on occasion, compromise.

In that spirit, I should say at the outset that I share the challenge faced by the noble Lords, Lord Clement-Jones and Lord Stevenson. Given that so many speakers have chosen to contribute, I will not be able to cover or acknowledge everyone who has spoken. I shall undoubtedly have to write on many of the issues to provide the technical detail that the matters they have raised deserve. It is my intention to write to noble Lords and invite them to join a series of meetings to look in depth at some of the themes and areas between now and Committee, so that as a group we can have well-informed discussions in Committee. I shall write with details suggesting some of those themes, and if noble Lords feel that I have missed any, or particular areas they would like to continue to talk about, please let me know and I will be happy to facilitate those.

I want to touch on a few of the issues raised today. I shall not repeat some of the points I made in my opening speech, given the hour. Many noble Lords raised the very troubling issue of children accessing pornography online, and I want to talk about that initially. The Government share the concerns raised about the lack of protections for children from this harmful and deeply unsuitable content. That is why the Bill introduces world-leading protections for children from online pornography. The Bill will cover all online sites offering pornography, including commercial pornography sites, social media, video-sharing platforms and fora, as well as search engines, which play a significant role in enabling children to access harmful and age-inappropriate content online. These companies will have to prevent children accessing pornography or face huge fines. To ensure that children are protected from this content, companies will need to put in place measures such as age verification, or demonstrate that the approach they are taking delivers the same level of protection for children.

While the Bill does not mandate that companies use specific technologies to comply with these new duties, in order to ensure that the Bill is properly future-proofed, we expect Ofcom to take a robust approach to sites which pose the highest risk of harm to children, including sites hosting online pornography. That may include directing the use of age verification technologies. Age verification is also referred to in the Bill. This is to make clear that these are measures that the Government expect to be used for complying with the duties under Part 3 and Part 5 to protect children from online pornography. Our intention is to have the regime operational as soon as possible after Royal Assent, while ensuring that the necessary preparations are completed effectively and that service providers understand what is expected of them. We are working very closely with Ofcom to ensure this.

The noble Lord, Lord Morrow, and others asked about putting age verification in the Bill more clearly, as was the case with the Digital Economy Act. The Online Safety Bill includes references to age assurance and age verification in the way I have just set out. That is to make clear that these are measures which the Government expect to be used for complying with the duties where proportionate to do so. While age assurance and age verification are referred to in the Bill, the Government do not mandate the use of specific approaches or technologies. That is similar to the approach taken in the Digital Economy Act, which did not mandate the use of a particular technology either.

I think my noble friend Lord Bethell prefers the definition of pornography in Part 3 of the Digital Economy Act. There is already a robust definition of “pornographic content” in this Bill which is more straightforward for providers and Ofcom to apply. That is important. The definition we have used is similar to the definition of pornographic content used in existing legislation such as the Coroners and Justice Act 2009. It is also in line with the approach being taken by Ofcom to regulate UK-established video-sharing platforms, meaning that the industry will already have familiarity with this definition and that Ofcom will already have experience in regulating content which meets this definition. That means it can take action more swiftly. However, I have heard the very large number of noble Lords who are inclined to support the work that my noble friend is doing in the amendments he has proposed. I am grateful for the time he has already dedicated to conversations with the Secretary of State and me on this and look forward to discussing it in more detail with him between now and Committee.

A number of noble Lords, including the noble Baronesses, Lady Finlay of Llandaff and Lady Kennedy of The Shaws, talked about algorithms. All platforms will need to undertake risk assessments for illegal content. Services likely to be accessed by children will need to undertake a children’s risk assessment to ensure they understand the risks associated with their services. That includes taking into account in particular the risk of algorithms used by their service. In addition, the Bill includes powers to ensure that Ofcom is able effectively to assess whether companies are fulfilling their regulatory requirements, including in relation to the operating of their algorithms. Ofcom will have the power to require information from companies about the operation of their algorithms and the power to investigate non-compliance as well as the power to interview employees. It will have the power to require regulated service providers to undergo a skilled persons report and to audit company systems and processes, including in relation to their algorithms.

The noble Baroness, Lady Kidron, rightly received many tributes for her years of work in relation to so many aspects of this Bill. She pressed me on bereaved parents’ access to data and, as she knows, it is a complex issue. I am very grateful to her for the time she has given to the meetings that the Secretary of State and I have had with her and with colleagues from the Ministry of Justice on this issue, which we continue to look at very carefully. We acknowledge the distress that some parents have indeed experienced in situations such as this and we will continue to work with her and the Ministry of Justice very carefully to assess this matter, mindful of its complexities which, of course, were something the Joint Committee grappled with as well.

The noble Baroness, Lady Featherstone, my noble friend Lady Wyld and others focused on the new cyberflashing offence and suggested that a consent-based approach would be preferable. The Law Commission looked at that in drawing up its proposals for action in this area. The Law Commission’s report raised concerns about the nature of consent in instant messaging conversations, particularly where there are misjudged attempts at humour or intimacy that could particularly affect young people. There is a risk, which we will want to explore in Committee, of overcriminalising young people. That is why the Government have brought forward proposals based on the Law Commission’s work. If noble Lords are finding it difficult to see the Law Commission’s reports, I am very happy to draw them to their attention so that they can benefit from the consultation and thought it conducted on this difficult issue.

The noble Baroness, Lady Gohir, talked about the impact on body image of edited images in advertising. Through its work on the online advertising programme, DCMS is considering how the Government should approach advertisements that contribute to body image concerns. A consultation on this programme closed in June 2022. We are currently analysing the responses to the consultation and developing policy. Where there is harmful user-generated content related to body image that risks having an adverse physical or psychological impact on children, the Online Safety Bill will require platforms to take action against that. Under the Bill’s existing risk assessment duties, regulated services are required to consider how media literacy can be used to mitigate harm for child users. That could include using content provenance technology, which can empower people to identify when content has been digitally altered in ways such as the noble Baroness mentioned.

A number of noble Lords focused on the changes made in relation to the so-called “legal but harmful” measures to ensure that adults have the tools they need to curate and control their experience online. In particular, noble Lords suggested that removing the requirement for companies to conduct risk assessments in relation to a list of priority content harmful to adults would reduce protections available for users. I do not agree with that assessment. The new duties will empower adult users to make informed choices about the services they use and to protect themselves on the largest platforms. The new duties will require the largest platforms to enforce all their terms of service regarding the moderation of user-generated content, not just the categories of content covered in a list in secondary legislation. The largest platforms already prohibit the most abusive and harmful content. Under the new duties, platforms will be required to keep their promises to users and take action to remove it.

There was rightly particular focus on vulnerable adult users. The noble Baronesses, Lady Hollins and Lady Campbell of Surbiton, and others spoke powerfully about that. The Bill will give vulnerable adult users, including people with disabilities, greater control over their online experience too. When using a category 1 service, they will be able to reduce their exposure to online abuse and hatred by having tools to limit the likelihood of their encountering such content or to alert them to the nature of it. They will also have greater control over content that promotes, encourages or provides instructions for suicide, self-harm and eating disorders. User reporting and redress provisions must be easy to access by all users, including people with a disability and adults with caring responsibilities who are providing assistance. Ofcom is of course subject to the public sector equality duty as well, so when performing its duties, including writing its codes of practice, it will need to take into account the ways in which people with protected characteristics, including people with disabilities, can be affected. I would be very happy to meet the noble Baronesses and others on this important matter.

The noble Lords, Lord Hastings of Scarisbrick and Lord Londesborough, and others talked about media literacy. The Government fully recognise the importance of that in achieving online safety. As well as ensuring that companies take action to keep users safe through this Bill, we are taking steps to educate and empower them to make safe and informed choices online. First, the Bill strengthens Ofcom’s existing media literacy functions. Media literacy is included in Ofcom’s new transparency reporting and information-gathering powers. In response to recommendations from the Joint Committee, the legislation also now specifies media literacy in the risk-assessment duties. In July 2021, DCMS published the online media literacy strategy, which sets out our ambition to improve national media literacy. We have committed to publishing annual action plans in each financial year until 2024-25, setting out our plans to deliver that. Furthermore, in December of that year, Ofcom published Ofcom’s Approach to Online Media Literacy, which includes an ambitious range of work focusing on media literacy.

Your Lordships’ House is, understandably, not generally enthusiastic about secondary legislation and secondary legislative powers, so I was grateful for the recognition by many tonight of the importance of providing for them in certain specific instances through this Bill. As the noble Lord, Lord Brooke of Alverthorpe, put it, there may be loopholes that Parliament wishes to close, and quickly. My noble friend Lord Inglewood spoke of the need for “living legislation”, and it is important to stress, as many have, that this Bill seeks to be technology-neutral—not specifying particular technological approaches that may quickly become obsolete—in order to cater for new threats and challenges as yet not envisaged. Some of those threats and challenges were alluded to in the powerful speech of my noble friend Lord Sarfraz. I know noble Lords will scrutinise those secondary powers carefully. I can tell my noble friend that the Bill does apply to companies that enable users to share content online or interact with each other, as well as to search services. That includes a broad range of services, including the metaverse. Where haptics enable user interaction, companies must take action. The Bill is also clear that content generated by bots is in scope where it interacts with user-generated content such as on Twitter, but not if the bot is controlled by or on behalf of the service, such as providing customer services for a particular site.

Given the range of secondary powers and the changing technological landscape, a number of noble Lords understandably focused on the need for post-legislative scrutiny. The Bill has undoubtedly benefited from pre-legislative scrutiny. As I said to my noble friend Lady Stowell of Beeston in her committee last week, we remain open-minded on the best way of doing that. We must ensure that once this regime is in force, it has the impact we all want it to have. Ongoing parliamentary scrutiny will be vital in ensuring that is the case. We do not intend to legislate for a new committee, not least because it is for Parliament itself to decide what committees it sets up. But I welcome further views on how we ensure that we have effective parliamentary scrutiny, and I look forward to discussing that in Committee. We have also made it very clear that the Secretary of State will undertake a review of the effectiveness of the regime between two and five years after it comes into force, producing a report that will then be laid in Parliament, thus providing a statutory opportunity for Parliament to scrutinise the effectiveness of the legislation.

My noble friend and other members of her committee followed up with a letter to me about the Secretary of State’s powers. I shall reply to that letter in detail and make that available to all noble Lords to see ahead of Committee. This is ground-breaking legislation, and we have to balance the need for regulatory independence with the appropriate oversight for Parliament and the Government. In particular, concerns were raised about the Secretary of State’s power of direction in Clause 39. Ofcom’s independence and expertise will be of utmost importance here, but the very broad nature of online harms means that there may be subjects that go beyond its expertise and remit as a regulator. That was echoed by Ofcom itself when giving evidence to the Joint Committee: it noted that there will clearly be some issues in respect of which the Government have access to expertise and information that the regulator does not, such as national security.

The framework in the Bill ensures that Parliament will always have the final say on codes of practice, and the use of the affirmative procedure will further ensure that there is an increased level of scrutiny in the exceptional cases where that element of the power is used. As I said, I know that we will look at that in detail in Committee.

My noble friend Lord Black of Brentwood, quoting Stanley Baldwin, talked about the protections for journalistic content. He and others are right that the free press is a cornerstone of British democracy; that is why the Bill has been designed to protect press and media freedom and why it includes robust provisions to ensure that people can continue to access diverse news sources online. Category 1 companies will have a new duty to safeguard all journalistic content shared on their platform, which includes citizen journalism. Platforms will need to put systems and processes in place to protect journalistic content, and they must enforce their terms of service consistently across all moderation and in relation to journalistic content. They will also need to put in place expedited appeals processes for producers of journalistic content.

The noble Baroness, Lady Anderson of Stoke-on-Trent, spoke powerfully about the appalling abuse and threats of violence she sustained while carrying out her democratic duties, and the noble Baroness, Lady Foster, spoke powerfully of the way in which that is putting people, particularly women, off going into public life. The noble Baroness, Lady Anderson, asked about a specific issue: the automatic deletion of material and the implications for prosecution. We have been mindful of the scenario where malicious users post threatening content which they then delete themselves, and of the burden on services that retaining that information in bulk would cause. We have also been mindful of the imperative to ensure that illegal content cannot be shared and amplified online by being left there. The retention of data for law enforcement purposes is strictly regulated, particularly through the Investigatory Powers Act, which the noble Lord, Lord Anderson of Ipswich, is reviewing at the request of the Home Secretary. I suggest that the noble Baroness and I meet to speak about that in detail, mindful of that ongoing review and the need to bring people to justice.

The noble Baroness, Lady Chakrabarti, asked about sex for rent. Existing offences can be used to prosecute that practice, including Sections 52 and 53 of the Sexual Offences Act 2003, both of which are listed as priority offences in Schedule 7 to the Bill. As a result, all in-scope services must take proactive measures to prevent people being exposed to such content.

The noble Lord, Lord Davies of Brixton, and others talked about scams. The largest and most popular platforms and search engines—category 1 and category 2A services in the Bill—will have a duty to prevent paid-for fraudulent adverts appearing on their services, making it harder for fraudsters to advertise scams online. We know that that can be a particularly devastating crime. The online advertising programme builds on this duty in the Bill and will look at the role of the whole advertising system in relation to fraud, as well as the full gamut of other harms which are caused.

My noble friend Lady Fraser talked about the devolution aspects, which we will certainly look at. Internet services are a reserved matter for the UK Government. The list of priority offences in Schedule 7 can be updated only by the Secretary of State, subject to approval by this Parliament.

The right reverend Prelate the Bishop of Manchester asked about regulatory co-operation, and we recognise the importance of that. Ofcom has existing and strong relationships with other regulators, such as the ICO and the CMA, which have been supported and strengthened by the establishment of the Digital Regulation Cooperation Forum in 2020. We have used the Bill to strengthen Ofcom’s ability to work closely with, and to disclose information to, other regulatory bodies. Clause 104 ensures that Ofcom can do that, and the Bill also requires Ofcom to consult the Information Commissioner.

I do not want to go on at undue length—I am mindful of the fact that we will have detailed debates on all these issues and many more in Committee—but I wish to conclude by reiterating my thanks to all noble Lords, including the many who were not able to speak today but to whom I have already spoken outside the Chamber. They all continue to engage constructively with this legislation to ensure that it meets our shared objectives of protecting children and giving people a safe experience online. I look forward to working with noble Lords in that continued spirit.

My noble friend Lady Morgan of Cotes admitted to being one of the cavalcade of Secretaries of State who have worked on this Bill; I pay tribute to her work both in and out of office. I am pleased that my right honourable friend the Secretary of State was here to observe part of our debate today and, like all noble Lords, I am humbled that Ian Russell has been here to follow our debate in its entirety. The experience of his family and too many others must remain uppermost in our minds as we carry out our duty on the Bill before us; I know that it will be. We have an important task before us, and I look forward to getting to it.

Bill read a second time.
Committee (1st Day)
Relevant document: 28th Report from the Delegated Powers Committee
16:49
Clause 1 agreed.
Amendment 1
Moved by
1: After Clause 1, insert the following new Clause—
“Purposes of Act
(1) This Act has the following purposes—
(a) to secure that regulated internet services comply with UK law and do not endanger public health or national security,
(b) to provide a higher level of protection for children than for adults in respect of regulated internet services,
(c) to identify and mitigate the risk of reasonably foreseeable harm arising from the operation and design of regulated internet services,
(d) to recognise and respond to the disproportionate level of harms experienced in relation to regulated internet services by people on the basis of one or more protected characteristic,
(e) to apply the overarching principle that regulated internet services should be safe by design,
(f) to safeguard freedom of expression within the law and privacy in relation to regulated internet services, and
(g) to secure that regulated internet services operate with transparency and accountability in respect of online safety.
(2) The Secretary of State and OFCOM must have regard to the need to fulfil these purposes in exercising functions under this Act.”
Member’s explanatory statement
This new Clause would implement a recommendation of the Joint Committee which carried out pre-legislative scrutiny of the Bill, setting out a range of purposes for the legislation and making clear that both the Secretary of State and OFCOM must have regard to those purposes when exercising their statutory functions.
Lord Stevenson of Balmacara (Lab)

I must say I am quite relieved that so many noble Lords have stayed; I thought that a single group with a single amendment on a sunny afternoon might have been enough to drive most noble Lords away. I take it as a thoroughly good-going sign that this will be a useful debate for us to have in Committee. I am privileged, and it is a great honour, to open this Committee stage with Amendment 1—at last.

Amendment 1 is in my name and those of the noble Baroness, Lady Kidron, and the noble Lords, Lord Clement-Jones and Lord Gilbert of Panteg. The noble Lord, Lord Gilbert, has let me know that he would have liked to have been present today and had intended to speak but, unfortunately, he has a hospital appointment. As noble Lords will be aware, he was recently a distinguished chair of your Lordships’ Communications and Digital Committee and would, I think, have had a lot to say about some of the issues that we are going to discuss this afternoon. I had the pleasure of working with him there, and he has kindly agreed that I can mention a couple of the points that he would have liked to make had he been present; I will be delighted to do so.

I am grateful to the noble Baroness and noble Lords for signing this amendment; that highlights the all-party support for ensuring that the Bill will achieve the high hopes that we all have for it. It also points to the fact that all the signatories were members of the Joint Committee of both Houses which undertook comprehensive pre-legislative scrutiny of the Bill 18 months ago—a process that I thoroughly endorse and count as one of the highlights of my time in your Lordships’ House.

I observe in passing that this amendment, based as it is on a recommendation from that Joint Committee, represents one of the few recommendations not yet implemented in the Bill before us today—just saying, Minister. I got that phrase from my kids; I am not quite sure what it means but they use it a lot, so I think it must have some commonality.

This amendment is intended to be declaratory, although it is also what the Public Bill Office—it has done a great job for us, we should all say—says is purposive. I had to look that one up, I confess; I discovered that it means “having or tending to fulfil a conscious purpose or design”. So this is a purposive amendment—indeed, it does what it says on the tin.

As the noble Lord, Lord Gilbert, would have said had he been present, the Bill is very difficult to understand, in part because of its innate complexity and in part because it has been revised so often. A simple statement of its purpose will help us all. I agree.

I stress at the outset that the amendment on its own does not seek to add anything to the considerable detail already in the Bill. However, it does five important things. It says up front what the Government are trying to achieve with this legislation and highlights what those companies within the scope of the Bill will need to bear in mind when they prepare for the new regime. It makes it clear that the new regime is centred on ensuring that the duties of care are placed on the companies that are in scope

“to identify and mitigate the risk of reasonably foreseeable harm arising from the operation and design of”

their services. It calls for “transparency and accountability” from all concerned in respect of online safety.

Had he been present, the noble Lord, Lord Gilbert, would have added that the amendment also sets out a few important principles that Ministers claim are fundamental to the way in which the Bill works but are absent from the detailed provisions when one comes to read them—such as, for example, that this Bill is about systems, not content. We will have to keep reminding ourselves of those words as we go through the Bill: it is about the systems that deliver the content but not the content itself.

Finally, this amendment would send a clear message about the trust that we in Parliament are placing in our independent regulator, Ofcom. That is a very important point. The amendment leads with a requirement that regulated services comply with UK law and do not endanger public health or national security. National security and public health are of course topical issues, but even if we were not in the midst of a storm about USA national security leaks shared on a Minecraft Discord server, which is certainly a user-to-user service that is widely accessed in the United Kingdom, it is probably wise to stress early on how vital it is for leaks of this nature to be at the forefront of regulated companies’ approach to the Bill. Today’s warnings by a Cabinet Minister and former Secretary of State at DCMS about cybersecurity affecting our national infrastructure are relevant here—likewise for public health.

I will not go through the amendment line by line. I am sure that others will want to comment on how it is laid out, the order of it and other matters, which are relevant but do not capture what the amendment is trying to do. However, I will focus on one: the reference to regulated companies having to have regard to reasonably foreseeable harm, as outlined in proposed new subsection (1)(c). I regret that the term “reasonably foreseeable harms” is absent from the Bill, although of course it featured heavily in the preceding White Paper when Sir Jeremy Wright was Secretary of State. The dropping of the “legal but harmful” category raises the question of how Ofcom will future-proof the system. Now that a wide-ranging risk assessment is no longer required by Ofcom, it will be hard to see what harms are coming down the track that might affect children in the future, and the regime risks being hobbled because the ability to look forward, with the full resources of Ofcom and the companies working in concert, has been undermined. There are amendments on this issue which we will come to later, including one tabled by the right reverend Prelate the Bishop of Oxford that may test this point.

The Government confirmed in a Written Answer to me of 8 February that AI products in a user-to-user or search engine service would be covered by the Bill, but the sudden recent explosion of AI products is a very good example of why a more general sense of foreseeability of harms may be required, rather than simply relying, as I think we will have to, on a list of things that we currently know about.

Our Joint Committee report made clear that the inclusion of this overarching objectives amendment would help all of us to ensure that the Online Safety Bill will be easy to understand, not just for service providers but for the public. Its inclusion would mean that we would be able to get into the detail of the Bill with a much better understanding of what the Government are seeking. I see the flow, which the committee was very clear about—having clear objectives that lead into precise duties on the regulated providers, robust powers for the regulator to act when the platforms fail to meet those legal and regulatory requirements, and a continuing role for Parliament, which is something that we will come to in future debates.

The internet is a wonderful invention. The major online services have become central to how people around the world access news and information, do business, play games, and keep in touch with family and friends, and the internet is free to use. But is it free? These services are highly profitable businesses. Where does that money come from? It is a commercial model based on selling targeted advertising. User data—our data—is collected and used to train algorithms to maximise engagement and users’ attention. The length of time and the frequency with which users engage on the platforms increase their value. More time spent online means that more advertising reaches users, which leads to more revenue for the companies. It is a vicious circle.

However, we are where we are. Actively seeking to increase engagement through personalisation has the power to create more harmful user experiences for vulnerable people and children, who are more likely to see content which will increase their vulnerabilities or do them harm. The more that people interact with conspiracy theories, for example, the more of them they will see. The grouping together of users with similar interests can create environments which normalise hate speech and extremism. Design features that favour the spread of information over safety facilitate the targeting and amplification of abuse, as we have seen.

There is no doubt that this Online Safety Bill is a key step forward for our citizens and consumers. I have made it absolutely clear that I support the Government in their Bill and that we will do what we can to make sure that it reaches the statute book as quickly as possible. It is also important to remember that it is showing other democratic societies that want to bring accountability and responsibility to the internet how it can be done, and I believe that this Bill will do it very well. However, it will only be effective if online services are held accountable for the design and operation of their systems by the regulations introduced by this Bill—and of course its successors, because this is the first of a number of Bills which we know we will be seeing in this area. There are very important points here about how we approach this, the need to maintain the will of Parliament throughout these areas, and the appointment of an independent regulator rather than those who happen to reside in Silicon Valley.

17:00
In conclusion, this is a very long and complex Bill, so it is a very fair question to ask: why on earth are we adding to it? I strongly believe that the impact of the Bill will be lessened if there is no clear, concise statement of what the Bill is designed to do and what it will change once it is up and running. Rather than critiquing the rationale of moving this amendment, therefore, I hope the debate this afternoon will focus on ensuring that it does what the Joint Committee wanted. Does it make clear what the Bill is trying to achieve? After the Bill receives Royal Assent, will we see a reduction in the seemingly ever-increasing harms caused by social media services and search engines? Is the balance between privacy and freedom of expression correct? Is the priority need to protect children secure and well embedded in the Bill? Will the system as a whole operate with transparency and accountability? Does it foreground what citizens and consumers want and need? Will they be able to get redress if their interests are harmed?
The Government will likely respond by saying that everything that I have set out in this amendment is already in the Bill. I agree. However, while the words are all there, they are scattered about in various places, with most of the substance being in Schedule 4, which, of course, will have limited application. It is very hard to get a sense of what the overarching priorities are: hence the amendment. I also agree with the Government when they say that legislating for a largely unprecedented, comprehensive, future-proofed and enforceable framework has required the creation of a range of targeted duties and a series of new definitions; and that an overall objective might risk confusing rather than clarifying. That is a fair point.
It is correct to say, however, as I am sure the Minister will say in his response, that the fact that the Bill is complicated—and, boy, it is—does not necessarily mean that the framework itself will, in practice, be overly complicated for services to comply with and for Ofcom to enforce. My argument today is not about a concern that Ofcom’s codes of practice and any associated guidance might not provide detail and clarity for services as to what steps they need to take to comply with their legislative duties, taking into account their risk profiles. I have confidence that Ofcom will do that very well indeed, and recent discussions on that have confirmed my impression. It will be independent, it will be important for it to be seen as independent, and it will act in the best interests of what we require in this space.
My concern is that, without also asserting a clear vision of what we want to happen as a result of passing this Bill, Parliament, and the people who use the social media and search resources of the internet, will have no framework by which to judge the success or otherwise of this important Bill. This amendment, as I said, is primarily declaratory, but I hope I have proved that it is also purposive. I think that the Government should look at it very carefully and hope that they will accept it. I beg to move.
Baroness Kidron (CB)

My Lords, I draw attention to my interests in the register, which I declared in full at Second Reading. It is an absolute pleasure to follow the noble Lord, Lord Stevenson, and, indeed, to have my name on this amendment, along with those of fellow members of the pre-legislative committee. It has been so long that it almost qualifies as a reunion tour.

This is a fortuitous amendment on which to start our deliberations, as it sets out the very purpose of the Bill—a North Star. I want to make three observations, each of which underlines its importance. First, as the pre-legislative committee took evidence, it was frequently remarked by both critics and supporters that it was a complicated Bill. We have had many technical briefings from DSIT and Ofcom, and they too refer to the Bill as “complicated”. As we took advice from colleagues in the other place, expert NGOs, the tech sector, academics and, in my own case, the 5Rights young advisory group, the word “complicated” repeatedly reared its head. This is a complex and ground-breaking area of policy, but there were other, simpler structures and approaches that have been discarded.

Over the five years with ever-changing leadership and political pressures, the Bill has ballooned with caveats and a series of very specific, and in some cases peculiar, clauses—so much so that today we start with a Bill that even those of us who are paying very close attention are often told that we do not understand. That should make the House very nervous.

It is a complicated Bill with intersecting and dependent clauses—grey areas from which loopholes emerge—and it is probably a big win for the deepest pockets. The more complicated the Bill is, the more it becomes a bonanza for the legal profession. As the noble Lord, Lord Stevenson, suggests, the Minister is likely to argue that the contents of the amendment are already in the Bill, but the fact that the word “complicated” is firmly stuck to its reputation and structure is the very reason to set out its purpose at the outset, simply and unequivocally.

Secondly, the OSB is a framework Bill, with vast amounts of secondary legislation and a great deal of work to be implemented by the regulator. At a later date we will discuss whether the balance between the Executive, the regulator and Parliament is exactly as it should be, but as the Bill stands it envisages a very limited future role for Parliament. If I might borrow an analogy from my previous profession, Parliament’s role is little more than that of a background extra.

I have some experience of this. In my determination to follow all stages of the age-appropriate design code, I found myself earlier this week in the Public Gallery of the other place to hear DSIT Minister Paul Scully, at Second Reading of the Data Protection and Digital Information (No. 2) Bill, pledge to uphold the AADC and its provisions. I mention this in part to embed it on the record—that is true—but primarily to make this point: over six years, there have been two Information Commissioners and double figures of Secretaries of State and Ministers. There have been many moments at which the interpretation, status and purpose of the code has been put at risk, at least once to a degree that might have undermined it altogether. At these moments, each time the issue was resolved by establishing the intention of Parliament beyond doubt. Amendment 1 moves Parliament from background extra to star of the show. It puts the intention of Parliament front and centre for the days, weeks, months and years ahead in which the work will still be ongoing—and all of us will have moved on.

The Bill has been through a long and fractured process in which the pre-legislative committee had a unique role. Many attacks on the Bill have been made by people who have not read it. Child safety was incorrectly cast as the enemy of adult freedom. While some wanted to apply the existing and known concepts and terms of public interest, protecting the vulnerable, product safety and the established rights and freedoms of UK citizens, intense lobbying has seen them replaced by untested concepts and untried language over which the tech sector has once again emerged as judge and jury. This has further divided opinion.

In spite of all the controversy, when published, the recommendations of the committee report received almost universal support from all sides of the debate. So I ask the Minister not only to accept the committee’s view that the Bill needs a statement of purpose, the shadow of which will provide shelter for the Bill long into the future, but to undertake to look again at the committee report in full. In its pages lies a landing strip of agreement for many of the things that still divide us.

This is a sector that is 100% engineered and almost all privately owned, and within it lie solutions to some of the greatest problems of our age. It does not have to be as miserable, divisive and exploitative as this era of exceptionalism has allowed it to be. As the Minister is well aware, I have quite a lot to say about proposed new subsection (1)(b),

“to provide a higher level of protection for children than for adults”,

but today I ask the Minister to tell us which of these paragraphs (a) to (g) are not the purpose of the Bill and, if they are not, what is.

Lord Allan of Hallam (LD)

My Lords, I am pleased that we are starting our Committee debate on this amendment. It is a pleasure to follow the noble Lord, Lord Stevenson, and the noble Baroness, Lady Kidron.

In this Bill, as has already been said, we are building a new and complex system and we can learn some lessons from designing information systems more generally. There are three classic mistakes that you can make. First, you can build systems to fit particular tools. Secondly, you can overcommit beyond what you can actually achieve. Thirdly, there is feature creep, through which you keep adding things on as you develop a new system. A key defence against these mistakes is to invest up front in producing a really good statement of requirements, which I see in Amendment 1.

On the first risk, as we go through the debate, there is a genuine risk that we get bogged down in the details of specific measures that the regulator might or might not include in its rules and guidance, and that we lose sight of our goals. Developing a computer system around a particular tool—for example, building everything with Excel macros or with Salesforce—invariably ends in disaster. If we can agree on the goals in Amendment 1 and on what we are trying to achieve, that will provide a sound framework for our later debates as we try to consider the right regulatory technologies that will deliver those goals.

The second cardinal error is overcommitting and underdelivering. Again, it is very tempting when building a new system to promise the customer that it will be all-singing, all-dancing and can be delivered in the blink of an eye. Of course, the reality is that in many cases, things prove to be more complex than anticipated, and features sometimes have to be removed while timescales for delivering what is left are extended. A wise developer will instead aim to undercommit and overdeliver, promising to produce a core set of realistic functions and hoping that, if things go well, they will be able to add in some extra features that will delight the customer as an unexpected bonus.

This lesson is also highly relevant to the Bill, as there is a risk of giving the impression to the public that more can be done quicker than may in fact be possible. Again, Amendment 1 helps us to stay grounded in a realistic set of goals once we put those core systems in place. The fundamental and revolutionary change here is that we will be insisting that platforms carry out risk assessments and share them with a regulator, who will then look to them to implement actions to mitigate those risks. That is fundamental. We must not lose sight of that core function and get distracted by some of the bells and whistles that are interesting, but which may take the regulator’s attention away from its core work.

We also need to consider what we mean by “safe” in the context of the Bill and the internet. An analogy that I have used in this context, which may be helpful, is to consider how we regulate travel by car and aeroplane. Our goal for air travel is zero accidents, and we regulate everything down to the nth degree: from the steps we need to take as passengers, such as passing through security and presenting identity documents, to detailed and exacting safety rules for the planes and pilots. With car travel, we have a much higher degree of freedom, being able to jump in our private vehicles and go where we want, when we want, pretty much without restrictions. Our goal for car travel is to make it incrementally safer over time; we can look back and see how regulation has evolved to make vehicles, roads and drivers safer year on year, and it continues to do so. Crucially, we do not expect car travel to be 100% safe, and we accept that there is a cost to this freedom to travel that, sadly, affects thousands of people each year, including my own family and, I am sure, many others in the House. There are lots of things we could do to make car travel even safer that we do not put into regulation, because we accept that the cost of restricting freedom to travel is too high.

Without over-labouring this analogy, I ask that we keep it in mind as we move through Committee—whether we are asking Ofcom to implement a car-like regime whereby it is expected to make continual improvements year on year as the state of online safety evolves, or we are advocating an aeroplane-like regime whereby any instance of harm will be seen as a failure by the regulator. The language in Amendment 1 points more towards a regime of incremental improvements, which I believe is the right one. It is in the public interest: people want to be safer online, but they also want the freedom to use a wide range of internet services without excessive government restriction, and they accept some risk in doing so.

I hope that the Minister will respond positively to the intent of Amendment 1 and that we can explore in this debate whether there is broad consensus on what we hope the Bill will achieve and how we expect Ofcom to go about its work. If there is not, then we should flush that out now to avoid later creating confused or contradictory rules based on different understandings of the Bill’s purpose. I will keep arguing throughout our proceedings for us to remain focused on giving the right goals to Ofcom and allowing it considerable discretion over the specific tools it needs, and for us to be realistic in our aims so that we do not overcommit and underdeliver.

Finally, the question of feature creep is very much up to us. There will be a temptation to add things into the Bill as it goes through. Some of those things are essential; I know that the noble Baroness, Lady Kidron, has some measures that I would also support. This is the right time to do that, but there will be other things that would be “nice to have”, and the risk of putting them in might detract from those core mechanisms. I hope we are able to maintain our discipline as we go through these proceedings to ensure we deliver the right objectives, which are incredibly well set out in Amendment 1, which I support.

17:15
The Lord Bishop of Oxford

My Lords, it is a pleasure to follow other noble Lords who have spoken. I too support this key first amendment. Clarity of purpose is essential in any endeavour. The amendment overall sets out the Bill’s aims and enhances what will be vital legislation for the world, I hope, as well as for the United Kingdom. The Government have the very welcome ambition of making Britain the safest country in the world to go online. The OSB is a giant step in that direction.

As has been said, there has been remarkable consensus across the Committee on what further measures may still be needed to improve the Bill and on this first amendment, setting out these seven key purposes. Noble Lords may be aware that in the Christian tradition the number seven is significant: in the medieval period the Church taught the dangers of the seven deadly sins, the merits of the seven virtues and the seven acts of mercy. Please speak to me later if a refresher course is needed.

Amendment 1 identifies seven deadly dangers—I think they are really deadly. They are key risks which we all acknowledge are unwelcome and destructive companions of the new technologies which bring so many benefits: risks to public health or national security; the risk of serious harm to children; the risk of new developments and technologies not currently in scope; the disproportionate risk to those who manifest one or more protected characteristics; risks that occur through poor design; risks to freedom of expression and privacy; and risks that come with low transparency and low accountability. Safety and security are surely one of the primary duties of government, especially the safety and security of children and the vulnerable. There is much that is good and helpful in new technology but much that can be oppressive and destructive. These seven risks are real and present dangers. The Bill is needed because of actual and devastating harm caused to people and communities.

As we have heard, we are living through a period of rapid acceleration in the development of AI. Two days ago, CBS broadcast a remarkable documentary on the latest breakthroughs by Google and Microsoft. The legislation we craft in these weeks needs future-proofing. That can happen only through a clear articulation of purpose so that the framework provided by the Bill continues to evolve under the stewardship of the Secretary of State and of Ofcom.

I have been in dialogue over the past five years with tech companies in a variety of contexts and I have seen a variety of approaches, from the highly responsible in some companies to the frankly cavalier. Good practice, especially in design, needs stronger regulation to become uniform. I really enjoyed the analogy from the noble Lord, Lord Allan, a few minutes ago. We would not tolerate for a moment design and safety standards in aeroplanes, cars or washing machines which had the capacity to cause harm to people, least of all to children. We should not tolerate lesser standards in our algorithms and technologies.

There is no map for the future of technology and its use, even over the rest of this decade, but this amendment provides a compass—a fixed point for navigation in the future, for which future generations will thank this Government and this House. These seven deadly dangers need to be stated clearly in the Bill and, as the noble Baroness, Lady Kidron, said, to be a North Star for both the Secretary of State and Ofcom. I support the amendment.

Baroness Harding of Winscombe (Con)

My Lords, I too support this amendment. I was at a dinner last night in the City for a group of tech founders and investors—about 500 people in a big hotel ballroom, all focused on driving the sort of positive technology growth in this country that I think everyone wants to see. The guest speaker runs a large UK tech business. He commented in his speech that tech companies need to engage with government because—he said this as if it was a revelation—all Governments turned out not to speak with one voice and that understanding what was required of tech companies by Governments is not always easy. Business needs clarity, and anyone who has run a large or small business knows that it is not really the clarity in the detail that matters but the clarity of purpose that enables you to lead change, because then your people understand why they need to change, and if they understand why, then in each of the micro-decisions they take each day they can adjust those decisions to fit with the intent behind your purpose. That is why this amendment is so important.

I have worked in this space of online safety for more than a decade, both as a technology leader and in this House. I genuinely do not believe that business is wicked and evil, but what it lacks is clear direction. The Bill is so important in setting those guardrails that if we do not make its purpose clear, we should not be surprised if the very businesses which really do want Governments to be clear do not know what we intend.

I suspect that my noble friend the Minister might object to this amendment and say that it is already in the Bill. As others have already said, I actually hope it is. If it is not, we have a different problem. The point of an upfront summary of purpose is to do precisely that: to summarise what is in what a number of noble Lords have already said is a very complicated Bill. The easier and clearer we can make it for every stakeholder to engage in the Bill, the better. If alternatively my noble friend the Minister objects to the detailed wording of this amendment, I argue that that simply makes getting this amendment right even more important. If the four noble Lords, who know far more about this subject than I will ever do in a lifetime, and the joint scrutiny committee, which has done such an outstanding job at working through this, have got the purposes of the Bill wrong, then what hope for the rest of us, let alone those business leaders trying to interpret what the Government want?

That is why it is so important that we put the purposes of the Bill absolutely at the front of the Bill, as in this amendment. If we have misunderstood that in the wording, I urge my noble friend the Minister to come back with wording on Report that truly encapsulates what the Government want.

Baroness Fox of Buckley (Non-Afl)

My Lords, I welcome this opportunity to clarify the purposes of the Bill, but I am not sure that the amendment helps as my North Star. Like the Bill, it throws up as many questions as answers, and I found myself reading it and thinking “What does that word mean?”, so I am not sure that clarity was where I ended up.

It is not a matter of semantics, but in some ways you could say—and certainly this is as publicly understood—that the name of the Bill, the Online Safety Bill, gives it its chief purpose. Yet however well-intentioned, and whatever the press releases say or the headlines print, even a word such as “safety” is slippery, because safety as an end can be problematic in a free society. My worry about the Bill is unintended consequences, and that is not rectified by the amendment. As the Bill assumes safety as the ultimate goal, we as legislators face a dilemma. We have the responsibility of weighing up the balance between safety and freedom, but the scales in the Bill are well and truly weighted towards safety at the expense of freedom before we start, and I am again not convinced the amendment weights them back again.

Of course, freedom is a risky business, and I always like the opportunity to quote Karl Marx, who said:

“You cannot pluck the rose without its thorns!”


However, it is important to recognise that “freedom” is not a dirty word, and we should avoid saying that risk-free safety is more important than freedom. How would that conversation go with the Ukrainian people who risk their safety daily for freedom? Also, even the language of safety, or indeed what constitutes the harms that the Bill and the amendments promise to keep the public safe from, need to be considered in the cultural and social context of the norms of 2023. A new therapeutic ethos now posits safety in ever-expanding pseudo-psychological and subjective terms, and this can be a serious threat to free speech. We know that some activists often exploit that concept of safety to claim harm when they merely encounter views they disagree with. The language of safety and harm is regularly used to cancel and censor opponents—and the Government know that, so much so that they considered it necessary to introduce the Higher Education (Freedom of Speech) Bill to secure academic freedom against an escalating grievance culture that feigns harm.

Part of the triple shield is a safety duty to remove illegal content, and the amendment talks about speech within the law. That sounds unobjectionable—in my mind it is far better than “legal but harmful”, which has gone—but, while illegality might sound clear and obvious, in some circumstances it is not always clear. That is especially true in any legal limitations of speech. We all know about the debates around hate speech, for example. These things are contentious offline and even the police, in particular the College of Policing, seem to find the concept of that kind of illegality confusing and, at the moment, are in a dispute with the Home Secretary over just that.

Is it really appropriate that this Bill enlists and mandates private social media companies to judge criminality using the incredibly low bar of “reasonable grounds to infer”? It gets even murkier when the legal standard for permissible speech online will be set partly by compelling platforms to remove content that contravenes their terms and conditions, even if these terms of service restrict speech far more than domestic UK law does. Big tech is being incited to censor whatever content it wishes as long as it fits in with their Ts & Cs. Between this and determining, for example, what is in filters—a whole different issue—one huge irony here, which challenges one of the purposes of the Bill, is that despite the Government and many of us thinking that this legislation will de-fang and regulate big tech’s powers, actually the legislation could inadvertently give those same corporates more control of what UK citizens read and view.

Another related irony is that the Bill was, no doubt, designed with Facebook, YouTube, Twitter, Google, TikTok and WhatsApp in mind. However, as the Bill’s own impact assessment notes, 80% of impacted entities have fewer than 10 employees. Many sites, from Wikipedia to Mumsnet, are non-profit or empower their own users to make moderation or policy decisions. These sites, and tens of thousands of British businesses of varying sizes, perhaps unintentionally, now face an extraordinary amount of regulatory red tape. These onerous duties and requirements might be actionable if not desirable for larger platforms, but for smaller ones with limited compliance budgets they could prove a significant if not fatal burden. I do not think that is the purpose of the Bill, but it could be an unintended outcome. This also means that regulation could, inadvertently, act as a barrier to entry for new SMEs, creating an ever more monopolistic stronghold for big tech, at the expense of trialling innovations or allowing start-ups to emerge.

I want to finish with the thorny issue of child protection. I have said from the beginning—I mean over the many years since the Bill’s inception—that I would have been much happier if it was more narrowly titled as the Children’s Online Safety Bill, to indicate that protecting children was its sole purpose. That in itself would have been very challenging. Of course, I totally agree with Amendment 1’s intention

“to provide a higher level of protection for children than for adults”.

That is how we treat children and adults offline.

17:30
However, even then there are dilemmas. For example, if a filter for suicide might prevent a teenage user seeing some of the most awful, hideous and nihilistic images—those that we have in mind that the Bill’s purpose is to get rid of—how do we ensure it does not also reduce that teenager’s exposure to help, which they might want if they are feeling suicidal? How do we ensure that they are not denied valuable news items, debate and discussion for educational merit? Parents and society have those sorts of cost-benefit analysis challenges every day. Everyone wants their own children, indeed wants all children, to be kept safe from harms. But we do not lock children in their bedroom 24/7 just in case they encounter risk. We know that that would deprive them of crucial developmental opportunities to grow and learn, and to manage risk. A whole body of educational scholarship exists looking at some of the downsides of adult fears creating a generation of cotton-wool kids. That has been detrimental to children’s resilience, and children are often victims when adults overprotect. So I would just warn against overselling the Bill as a guarantee of risk-free safety for the young online, at any cost.
The whole issue of children is a difficult area. I know to my cost, from a rather ill-chosen way in which I expressed myself in a newspaper interview some years ago on the dilemmas of child protection versus free speech, that mis-speaking can mean being branded as complacent or even as an apologist for the most heinous horrors that can be inflicted on the young, from grooming to access to pornography. However, when people say “Think of the children”, or when we are rightly reminded to consider the tragedy of Molly Russell, for example, we can find ourselves chilled into walking on eggshells and not saying what we think.
We need to be bravely dispassionate in our discussions on protecting children online, and to scrutinise the Bill carefully for unintended consequences for children. But we must also avoid allowing our concern for children to spill over into infantilising adults and treating adult British citizens as though they are children who need protection from speech. There is a lot to get through in the Bill but the amendment, despite its good intentions, does not resolve the dilemmas we are likely to face in the following weeks.
Lord Allan of Hallam (LD)

My Lords, I have had a helpful reminder about declarations of interest. I once worked for Facebook; I divested myself of any financial interest back in 2020, but of course a person out there may think that what I say today is influenced by the fact that I previously took the Facebook shilling. I want that to be on record as we debate the Bill.

Baroness Stowell of Beeston (Con)

My Lords, I have not engaged with this amendment in any particular detail—until the last 24 hours, in fact. I thought that I would come to listen to the debate today and see if there was anything that I could usefully contribute. I have been interested in the different points that have been raised so far. I find myself agreeing with some points that are perhaps in tension or conflict with each other. I emphasise from the start, though, my complete respect for the Joint Committee and the work that it did in the pre-legislative scrutiny of the Bill. I cannot compare my knowledge and wisdom on the Bill with those who, as has already been said, have spent so much intensive time thinking about it in the way that they did at that stage.

Like my noble friend Lady Harding, I always have a desire for clarity of purpose. It is critical for the success of any organisation, or anything that we are trying to do. As a point of principle, I like the idea of setting out at the start of this Bill its purpose. When I looked through the Bill again over the last couple of weeks in preparation for Committee, it was striking just how complicated and disjointed a piece of work it is and so very difficult to follow.

There are many reasons why I am sympathetic towards the amendment. I can see why bringing together at the beginning of the Bill what are currently described as “Purposes” might be for it to meet its overall aims. But that brings me to some of the points that the noble Baroness, Lady Fox, has just made. The Joint Committee’s report recommends that the objectives of the Bill

“should be that Ofcom should aim to improve online safety for UK citizens by ensuring that service providers”—

it then set out objectives aimed at Ofcom rather than them actually being the purposes of the Bill.

I was also struck by what the noble Lord, Lord Allan, said about what we are looking for. Are we looking for regulation of the type that we would expect of airlines, or of the kind we would expect from the car industry? If we are still asking that question, that is very worrying. I think we are looking for something akin to the car industry model as opposed to the airline model. I would be very grateful if my noble friend the Minister was at least able to give us some assurance on that point.

If I were to set out a purpose of the Bill at the beginning of the document, I would limit myself to what is currently in proposed new subsection (1)(g), which is

“to secure that regulated internet services operate with transparency and accountability in respect of online safety”.

That is all I would say, because that, to me, is what this Bill is trying to do.

The other thing that struck me when I looked at this—I know that there has been an approach to this legislation that sought to adopt regulation that applies to the broadcasting world—was the thought, “Somebody’s looked at the BBC charter and thought, well, they’ve got purposes and we might adopt a similar sort of approach here.” The BBC charter and the purposes set out in it are important and give structure to the way the BBC operates, but they do not give the kind of clarity of purpose that my noble friend Lady Harding is seeking—which I too very much support and want to see—because there is almost too much there. That is my view on what the place to start would be when setting out a very simple statement of purpose for this Bill.

Baroness Benjamin (LD)

My Lords, this day has not come early enough for me. I am pleased to join others in embarking on the Committee stage of the elusive Online Safety Bill, where we will be going on an intrepid journey, as we have heard so far. Twenty years ago, while I was on the Ofcom content board, I pleaded for the internet to be regulated, but was told that it was mission impossible. So this is a day I feared might not happen, and I thank the Government for making it possible.

I welcome Amendment 1, in the names of the noble Lords, Lord Stevenson, Lord Clement-Jones, and others. It does indeed encapsulate the overarching purpose of the Bill. But it also sets out the focus of what other amendments will be needed if the Bill is to achieve the purpose set out in that amendment.

The Bill offers a landmark opportunity to protect children online, and it is up to us to make sure that it is robust, effective and evolvable for years to come. In particular, I welcome subsection (1)(a) and (b) of the new clause proposed by Amendment 1. Those paragraphs highlight an omission in the Bill. If the purposes set out in them are to be met, the Bill needs to go much further than it currently does.

Yes, the Bill does not go far enough on pornography. The amendment sets out a critical purpose for the Bill: children need a “higher level of protection”. The impact that pornography has on children is known. It poses a serious risk to their mental health and their understanding of consent, healthy sex and relationships. We know that children as young as seven are accessing pornographic content. Their formative years are being influenced by hardcore, abusive pornography.

As I keep saying, childhood lasts a lifetime, so we need to put children first. This is why I have dedicated my life to the protection of children and their well-being. This includes protection from pornography, where I have spent over a decade campaigning to prevent children easily accessing online pornographic content.

I know that others have proposed amendments that will be debated in due course which meet this purpose. I particularly support the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. Those amendments meet the purpose of the Bill by ensuring that children are protected from pornographic content wherever it is found through robust, anonymous age verification that proves the user’s age beyond reasonable doubt.

Online pornographic content normalises abusive sexual acts, with the Government’s own research finding

“substantial evidence of an association between the use of pornography and harmful sexual attitudes and behaviours towards women”

and children. This problem is driven largely by the types of content that are easily available online. Pornography is no longer the stereotype that we might imagine from the 1970s and 1980s. It is now vicious, violent and pervasive. Content that would be prohibited offline is readily available online for free with just a few clicks. The Online Safety Bill comes at a crucial moment to regulate online pornography. That is why I welcome the amendment introducing a purpose to the Bill that ensures that internet companies “comply with UK law”.

We have the Obscene Publications Act 1959 and UK law does not allow the offline distribution of material that sexualises children—such as “barely legal” pornography, where petite-looking adult actors are made to look like children—content which depicts incest and content which depicts sexual violence, including strangulation. That is why it is important that the Bill makes that type of material illegal online as well. Such content poses a high risk to children as well as women and girls. There is evidence that such content acts as a gateway to more hardcore material, including illegal child sexual abuse material. Some users spiral out of control, viewing content that is more and more extreme, until the next click is illegal child sexual abuse material, or even going on to contact and abuse children online and offline.

My amendment would require service providers to exclude from online video on-demand services any pornographic content that would be classified as more extreme than R18 and that would be prohibited offline. This would address the inconsistency between online and offline regulation of pornographic content—

17:45
Lord Harlech (Con)

My Lords, we have had a good-natured and informative opening debate, but we should keep our remarks to this particular amendment, in the knowledge that all future amendments will have their rightful discussion in due course.

Baroness Benjamin (LD)

I thank the noble Lord. I hope that the amendments I support will be supported by CEASE, Refuge and Barnardo’s—I declare an interest here. Let us not let the chance of creating a robust Online Safety Bill slip through our fingers. It is now time to act with boldness, vision, morality and determination. I trust that we will continue to focus on the purpose of the Bill: to make the online world safer, especially for our children. They are relying on us to do the right thing, so let us do so.

Lord Knight of Weymouth (Lab)

I strongly support my noble friend in his amendment. I clarify that, in doing so, I am occupying a guest slot on the Front Bench: I do so as a member of his team but also as a member of the former Joint Committee. As my noble friend set out, this reflects where we got to in our thinking as a Joint Committee all that time ago. My noble friend said “at last”, and I echo that and what others said. I am grateful for the many briefings and conversations that we have had in the run-up to Committee, but it is good to finally be able to get on with it and start to clear some of these things out of my head, if nothing else.

In the end, as everyone has said, this is a highly complex Bill. Like the noble Baroness, Lady Stowell, in preparation for this I had another go at trying to read the blooming thing, and it is pretty much unreadable—it is very challenging. That is right at the heart of why I think this amendment is so important. Like the noble Baroness, Lady Kidron, I worry that this will be a bonanza for the legal profession, because it is almost impenetrable when you work your way through the wiring of the Bill. I am sure that, in trying to amend it, some of us will have made errors. We have been helped by the Public Bill Office, but we will have missed things and got things the wrong way around.

It is important to have something purposive, as the Joint Committee wanted, and to have clarity of intent for Ofcom, including that this is so much more about systems than about content. Unlike the noble Baroness, Lady Stowell—clearly, we all respect her work chairing the communications committee and the insights she brings to the House—I think that a very simple statement, restricting it just to proposed new paragraph (g), is not enough. It would almost be the same as the description at the beginning of the Bill, before Clause 1. We need to go beyond that to get the most from having a clear statement of how we want Ofcom to do its job and the Secretary of State to support Ofcom.

I like what the noble Lord, Lord Allan, said about the risk of overcommitment and underdelivery. When the right reverend Prelate the Bishop of Oxford talked about being the safest place in the world to go online, which is the claim that has been made about the Bill from the beginning, I was reminded again of the difficulty of overcommitting and underdelivering. The Bill is not perfect, and I do not believe that it will be when this Committee and this House have finished their work; we will need to keep coming back and legislating and regulating in this area as we pursue the goal of being the safest place in the world to go online—but it will not be any time soon.

I say to the noble Baroness, Lady Fox, whom I respect, that I understand what she is saying about some of her concerns about a risk-free child safety regime and the unintended consequences that may come in this legislation. But at its heart, what motivates us and makes us believe that getting the Bill right is one of the most important things we will do in all our time in this Parliament are the unintended consequences of the algorithms that these tech companies have created in pushing content at children that they do not want to hear. I see the noble Baroness, Lady Kidron, wanting to comment.

Baroness Kidron (CB)

I just want to say to the noble Baroness, Lady Fox, that we are not looking to mollycoddle children or put them in cotton wool; we are asking for a system where they are not systematically exploited by major companies.

Lord Knight of Weymouth (Lab)

I very much agree. The core of what I want to say in supporting this amendment is that in Committee we will do what we are here to do. There are a lot of amendments to what is a very long and complicated Bill: we will test the Minister and his team on what the Government are trying to achieve and whether they have things exactly right in order to give Ofcom the best possible chance to make it work. But when push comes to shove at the end of the process, at its heart we need to build trust in Ofcom and give it the flexibility to be able to respond to the changing online world and the changing threats to children and adults in that online world. To do that, we need to ensure that we have the right amount of transparency.

I was particularly pleased to see proposed new paragraph (g) in the amendment, on transparency, as referenced by the noble Baroness, Lady Stowell. It is important that we have independence for Ofcom; we will come to that later in Committee. It is important that Parliament has a better role in terms of accountability so that we can hold Ofcom to account, having given it trust and flexibility. I see this amendment as fundamental to that, because it sets the framework for the flexibility that we then might want to be able to give Ofcom over time. I argue that this is about transparency of purpose, and it is a fundamental addition to the Bill to make it the success that we want.

Lord Inglewood (Non-Afl)

My Lords, the noble Baroness, Lady Harding, made possibly one of the truest statements that have ever been uttered in this House when she told us that this is a very complicated Bill. It is complicated to the extent that I have no confidence that I fully understand it and all its ramifications, and a number of other speakers have said the same. For that reason—because I am aware of my own limitations, and I am pretty sure they are shared by others—it is important to have a statement of purpose at the outset to provide the co-ordinates for the discussion we are going to have; I concur with the approach of the noble Lord, Lord Allan, because there is then a framework within which we can be sure, we hope, that we will manage to achieve an outcome that is both comprehensive and coherent. As a number of noble Lords have said, there are a number of completely different, or nearly different, aspects to what we are discussing, yet the whole lot have to link together. In the words of E. M. Forster, we have to

“connect the prose and the passion”.

The Minister may say, “We can’t do that at the outset”. I am not so sure. If necessary, we should actually draft this opening section, or any successor to it, as the last amendment to the Bill, because then we would be able to provide an overview. That overview will be important because, just as I am prepared to concede that I do not think I understand it all now, there is a very real chance that I will not understand it all then either. If we have this at the head of the Bill, I think that will be a great help not only to us but to all those who are subsequently going to have to make use of it.

Lord Griffiths of Burry Port (Lab)

My Lords, I want to say something simple in support of what has already been said. If it is true that the Bill’s purposes are already scattered in the course of the Bill and throughout its substance, I cannot see what possible objection there can be to having them extracted and put at the beginning. They are not contentious—they are there already—so let us have them at the beginning to set a direction of travel. It seems so obvious to me.

It is an important Bill. I thank the Minister and his colleagues because they have put an enormous amount of work into this, and of course the Joint Committee has done its work. We have all been sent I cannot say how many briefing papers from interested bodies and so on. It is vital that, as we try to hold as much of this together as we possibly can in taking this very important Bill forward, we should have a sense of purpose and criteria against which we can measure what we eventually go on to discuss, make decisions about and introduce into the body of the Bill. I cannot see that the logic of all that can possibly be faulted.

Of course, there will be words that are slippery, as has been said. I cannot think of a single word, and I have been a lexicographer in my life, that does not lend itself to slipperiness. I could use words that everybody thinks we have in common in a way that would befuddle noble Lords in two minutes. It seems to me self-evident that these purposes, as stated here at the outset of our consideration in Committee, are logical and sensible. I will be hoping, as the Bill proceeds, to contribute to and build on the astounding work that the noble Baroness, Lady Kidron, has laid before us, with prodigious energy, in alerting all kinds of people, not just in your Lordships’ House but across the country, to the issues at stake here. I hope that she will sense that the Committee is rallying behind her in the astute way that she is bringing this matter before us. But again, I will judge outcomes against the provisions in this opening statement, a criterion for judging even the things that I feel passionate about.

The noble Baroness, Lady Morgan, and I have been in our own discussions about different parts of the Bill, about things such as suicide and self-harm. That is content. There are amendments. We will discuss them. Again, we can hold our own decisions about those matters against what we are seeking to achieve as stated so clearly at the outset of the Bill.

I remember working with the noble Lord, Lord Stevenson. It is so fabulous to have him back; the place feels right when he is here. When I was a bit of a greenhorn—he was the organ grinder and I was the monkey—I remember him pleading at the beginning of what was at that time the Data Protection Bill to have a statement like this at the beginning of that Bill. We were told, “Oh, but it is all in the Bill; all the words are there”. Then why not put them at the beginning, so that we can see them clearly and have something against which to measure our progress?

With all these things said, I hope we will not spend too much time on this. I hope we will nod it through, and then I hope we will remind ourselves of what it seeks to achieve as we go on in the interminable days that lie ahead of us. I have one last word as an old, old preacher remembering what I was told when I started preaching: “First, you tell ‘em what you’re gonna tell ‘em; then you tell ‘em; and then you tell ‘em what you’ve told ‘em”. Let us take at least the first of those steps now.

18:00
The Lord Bishop of Leeds

My Lords, first, I am relieved to hear that I am not the only thick person in this Committee, because I have struggled to understand and follow the detail and interconnectedness of everything in the Bill. The maxim that you need simplicity and clarity, especially if the Bill is going to be effective, is really important. That is why I think this amendment is a no-brainer: just set it out at the front.

Secondly, the amendment provides a guideline, or a lens through which we read the complexity of what follows. That might even lead us, as we go through some of the detail, to strip stuff out and make it simpler for everybody to understand. It does not have to grow the extent of the Bill. It might help us to be—I think this is the most important word I have heard—disciplined as we proceed. I support the amendment.

Lord Russell of Liverpool (CB)

My Lords, I suggest, very briefly, that we look at this amendment in a slightly different way. Understandably, we have a tendency in Parliament to look at things through our own lens, and perhaps some of us are viewing this amendment as a reminder of what the Bill is about.

The noble Baroness, Lady Harding, made a very good point about clarity. I suggest we imagine that we are one of the companies that the Bill is designed to try to better manage. Imagine you are in the boardroom, or on the executive management team, and you are either already doing business in the United Kingdom or are considering entering the UK market. You know there is an enormous piece of legislation that is designed to try to bring some order to the area your business is in. At the moment, without this amendment, the Bill is a lawyer’s paradise, because it can be looked at in a multitude of ways. I put it to the Minister and the Bill team that it would be extremely helpful to have something in the Bill that makes it completely clear, to any business thinking of engaging in any online activities in the United Kingdom, what this legislation is about.

Lord Cormack (Con)

My Lords, I am one of those who found the Bill extremely complicated, but I do not find this amendment extremely complicated. It is precise, simple, articulate and to the point, and I think it gives us a good beginning for debating what is an extremely complex Bill.

I support this amendment because I believe, and have done so for a very long time, that social media has done a great deal more harm than good, even though it is capable of doing great good. Whether advertently or inadvertently, the worst of all things it has done is to destroy childhood innocence. We are often reminded in this House that the prime duty of any Government is to protect the realm, and of course it is. But that is a very broad statement. We can protect the realm only if we protect those within it. Our greatest obligation is to protect children—to allow them to grow up, so far as possible, uncorrupted by the wicked ways of a wicked world and with standards and beliefs that they can measure actions against. Complex as it is, the Bill is a good beginning, and its prime purpose must be the protection and safeguarding of childhood innocence.

The noble Lord, Lord Griffiths of Burry Port, spoke a few moments ago about the instructions he was given as a young preacher. I remember when I was training to be a lay reader in the Church of England, 60 or more years ago, being told that if you had been speaking for eight minutes and had not struck oil, stop boring. I think that too is a good maxim.

We have got to try to make the Bill comprehensible to those around the country whom it will affect. The worst thing we do, and I have mentioned this in connection with other Bills, is to produce laws that are unintelligible to the people in the country; that is why I was very sympathetic to the remarks of my noble friend Lord Inglewood. This amendment is a very good beginning. It is clear and precise. I think nearly all of us who have spoken so far would like to see it in the Bill. I see the noble Baroness, Lady Fox, rising—does she wish to intervene?

Baroness Fox of Buckley (Non-Afl)

I want to explain more broadly that I am all for clarifying what the law is about and for simplicity, but that ship has sailed. We have all read the Bill. It is not simple. I do not want this amendment to somehow console us, so that we can say to the public, “This is what the Bill is about”, because it is not what the Bill is about. It is about a range of things that are not contained within the amendment—I would wish them to be removed from the Bill. I am concerned that we think this amendment will resolve a far deeper and greater problem of a complicated Bill that very few of us can grasp in its entirety. We should not con the public that it is a simple Bill; it is not.

Lord Cormack (Con)

Of course we should not. What I am saying is that this amendment is simple. If it is in the Bill, it should then be what we are aiming to create as the Bill goes through this House, with our hours of scrutiny. I shall not take part in many parts of this Bill, as I am not equipped to do so, but there are many in this House who are. Having been set the benchmark of this amendment, they can seek to make the Bill comprehensible to those of us—and that seems to include the noble Baroness, Lady Fox—who at the moment find it incomprehensible.

In a way, we are dealing with the most important subject of all: the protection of childhood innocence. We have got to err in that direction. Although I yield to no one in my passionate belief in the freedom of speech, it must have respect for the decencies of life and not be a propagator of the profanities of life.

Lord Clement-Jones (LD)

My Lords, I think we need to move now to closing speeches, if that seems appropriate—

Baroness Chakrabarti (Lab)

I have tried to be patient, and I will be very brief. A lot has been said about a lawyer’s paradise. At the moment, the lawyers are over here and paradise is over there and there is a gulf between us. Like the noble Lord, Lord Allan of Hallam, I declare my former interest. I did not get any shillings from Facebook or any other big tech empires, but I was a government lawyer for some years, and it is in that vein that I may have a small contribution to make, if the noble Lord, Lord Clement-Jones, does not mind.

There can be a real benefit to an amendment such as this. I want to explain why, not by repeating anything that I said at Second Reading on the substance of the Bill but by speaking from the perspective of legislative drafting and its policy. I will confine my short remarks to that.

In my view, length is always an issue. My noble friend was quite right when he moved his amendment to say that the burden was on him because he was going to add to the length of a very long Bill. In my experience as a government lawyer for about five and a half years, with the mixed privilege of sitting over there through many Bills, sometimes counterintuitively a little extra length can actually aid clarity. Sometimes, a very tightly drafted Bill that is complex can be more difficult to read if, for example, it has many schedules and you need a number of copies open at any one time in order to make reference to what will be substantive sections and subsections of the Act. Ironically, it is sometimes beneficial to add a clause of this kind.

There are, I would argue, three potential reasons why Governments sometimes want to do this in relation to legislative policy. One reason is accessibility, and that has been mentioned by a number of noble Lords today. That is, I think, generally a good thing. It is not easy to achieve; I do not blame any colleagues in the Box or the Office of the Parliamentary Counsel, or Ministers, for the challenge of legislating in a complex, fast-developing area that is only going to change over time. But accessibility can be aided at times by a provision of the kind that my noble friend Lord Stevenson of Balmacara, the noble Baroness, Lady Kidron, and others are proposing.

A second possible reason is to aid interpretation, which can be very beneficial as well. That is not just interpretation for judges, litigators and these wicked barracuda lawyers that everyone is so concerned about. Interpretation is important in practice when people are having to deal on a day-to-day basis with the functioning of contentious and important legislation; that is when they have executive, regulatory and legislative functions under a measure of this kind. It is to aid their interpretation—a point made rather well, if I may say so, by the noble Baroness, Lady Harding.

So, it is not just about interpretation for lawyers, in order to sue based on what things mean; it is to aid regulators of those in the regulated sector and, potentially, members of the public and pressure groups, with some advice. As a lawyer, I consider myself a half-decent legislative professional, and this is a complex Bill for me. It would be aided by a provision of the kind my noble friends are proposing. I am saying this, really, to tempt the Minister seriously to consider something like it. I suppose I am partly trying to pre-empt what I suspect is in his brief to say by way of rebuttal in just a moment.

The third potential reason to have a provision like this at the beginning of the Bill is pure politics, and we sometimes see that in Bills: it is total flummery, and just a way of making a big political statement of intent. That is never, in my view, a good enough reason by itself. But that is not what is happening or what is suggested in my noble friend’s amendment.

I now come to complexity and the benefits of a purposive provision in this Bill. Before the Minister says that it is not appropriate, not what we do and not what parliamentary counsel does, may I remind noble Lords of another Bill going through Parliament at the moment? In contrast to this Bill, which consists of 247 pages, 212 clauses and 17 schedules, we are going to have another controversial—more controversial, I would argue—Bill in due course with a mere 59 pages, 58 clauses and one schedule, which is just a list of countries. That Illegal Migration Bill has, in fact, a purposive provision right at the beginning, in the first subsection of Clause 1. I am not making a point about the substance of that legislation; I am just pre-empting any argument that this is not what we do and not how we draft Bills. Sometimes, it appears, it is. As I say, it is a much shorter, much simpler, dare I say even more controversial Bill, and perhaps there is more politics there than accessibility of interpretation.

That was my cheap point. What I really want to say to all noble Lords in this Committee is that for the purposes of debating this amendment, let us put to one side what we think about the Bill and the various clauses and amendments we would like to see or not see. Let us just ask: is this amendment as drafted and the approach recommended by my noble friend going to aid accessibility and interpretation—not litigation and lawyers and those wicked people in my profession, but the people who, day to day, will have to live and work with the proposed new regime? Whatever one’s views—be they those of the noble Baroness, Lady Fox, or others—about the Bill as it stands or as it should or should not stand, as amended, something like Amendment 1, in my submission, is a very good idea.

18:15
Viscount Stansgate (Lab)

If I may, I will prevail upon the noble Lord, Lord Clement-Jones, to wait just another few seconds before beginning his winding-up speech. I have found this an extremely interesting and worthwhile debate, and there seems to be an enormous amount of consensus that the amendment is a good thing to try to achieve. It is also true that this is a very complex Bill. My only point in rising is to say to the Minister—who is himself about to speak, telling us why the Government are not going to accept Amendment 1—that, as a result of the very long series of debates we are going to have on this Bill over a number of days, perhaps the Government might still be able, at the end of this very long process, to rethink the benefits of having an amendment of this kind at the beginning of the Bill. I hope that, just because he is going to ask that the amendment be withdrawn today, he will not lose sight of the benefits of such an amendment.

Baroness Stowell of Beeston (Con)

My Lords, just before the noble Lord, Lord Clement-Jones gets to wind up, I wanted to ask a question and make a point of clarification. I am grateful for the contribution from the noble Baroness, Lady Chakrabarti; that was a helpful point to make.

My question, which I was going to direct to the noble Lord, Lord Stevenson—although it may be one that the noble Lord, Lord Clement-Jones, wants to respond to if the noble Lord, Lord Stevenson, is not coming back—is about the use of the word “purpose” versus “objective”. The point I was trying to make in referring to the Joint Committee’s report was that, when it set out the limbs of this amendment, it was referring to them as objectives for Ofcom. What we have here is an amendment that is talking about purposes of the Bill, and in the course of this debate we have been talking about the need for clarity of purpose. The point I was trying to make was not that I object to the contents of this amendment, but that if we are looking for clarity of purpose to inform the way we want people to behave as a result of this legislation, I would make it much shorter and simpler, which is why I pointed to subsection (g) of the proposed clause.

It may be that the content of this amendment—and this is where I pick up the point the noble Baroness, Lady Chakrabarti, was making—is not objectionable, although I take the point made by the noble Baroness, Lady Fox. However, the noble Baroness, Lady Chakrabarti, is right: at the moment, let us worry less about the specifics. Then, we can be clearer about what bits of the amendment are meant to be doing what, rather than trying to get all of them to offer clarity of purpose. That is my problem with it: there are purposes, which, as I say, are helpful structurally in terms of how an organisation might go about its work, and there is then the clarity of purpose that should be driving everything. The shorter, simpler and more to the point we can make that, the better.

Lord Clement-Jones (LD)

My Lords, I thank the noble Baroness. I hope I have not appeared to rush the proceedings, but I am conscious that there are three Statements after the Bill. I thank the noble Lord, Lord Stevenson, for tabling this amendment, speaking so cogently to it and inspiring so many interesting and thoughtful speeches today. He and I have worked on many Bills together over the years, and it has been a real pleasure to see him back in harness on the Opposition Front Bench, both in the Joint Committee and on this Bill. Long may that last.

It has been quite some journey to get to this stage of the Bill; I think we have had four Digital Ministers and five Prime Ministers since we started. It is pretty clear that Bismarck never said, “Laws are like sausages: it’s best not to see them being made”, but whoever did say it still made a very good point. The process leading to today’s Bill has been particularly messy, with Green and White Papers; a draft Bill; reports from the Joint Committee and Lords and Commons Select Committees; several versions of the Bill itself; and several government amendments anticipated to come. Obviously, the fact that the Government chose to inflict last-minute radical surgery on the Bill to satisfy what I believe are the rather unjustified concerns of a small number in the Government’s own party made it even messier.

It is extremely refreshing, therefore, to start at first principles, as the noble Lord, Lord Stevenson, has done. He has outlined them and the context in which we should see them—namely, we should focus essentially on the systems, what is readily enforceable and where safety by design and transparency are absolutely the essence of the purpose of the Bill. I share his confidence in Ofcom and its ability to interpret those purposes. I say to the noble Baroness, Lady Stowell, that I am not going to dance on the heads of too many pins about the difference between “purpose” and “objective”. I think it is pretty clear what the amendment intends, but I do have a certain humility about drafting; the noble Baroness, Lady Chakrabarti, reminded us of that. Of course, one should always be open to change and condensation of wording if we need to do that. But we are only at Amendment 1 in Committee, so there is quite a lot of water to flow under the bridge.

It is very heartening that there is a great deal of cross-party agreement about how we must regulate social media going forward. These Benches—and others, I am sure—will examine the Bill extremely carefully and will do so in a cross-party spirit of constructive criticism, as we explained at Second Reading. Our Joint Committee on the draft Bill exemplified that cross-party spirit, and I am extremely pleased that all four signatories to this amendment served on the Joint Committee and readily signed up to its conclusions.

Right at the start of our report, we made a strong case for the Bill to set out these core objectives, as the noble Lord, Lord Stevenson, has explained, so as to provide clarity—that word has been used around the Committee this afternoon—for users and regulators about what the Bill is trying to achieve and to inform the detailed duties set out in the legislation. In fact, I believe that the noble Lord, Lord Stevenson, has improved on that wording by including a duty on the Secretary of State, as well as Ofcom, to have regard to the purposes.

We have heard some very passionate speeches around the Committee for proper regulation of harms on social media. The case for that was made eloquently to the Joint Committee by Ian Russell and by witnesses such as Edleen John of the FA and Frances Haugen, the Facebook whistleblower. A long line of reports by Select Committees and all-party groups have rightly concluded that regulation is absolutely necessary given the failure of the platforms even today to address the systemic issues inherent in their services and business models.

The introduction to our Joint Committee report makes it clear that without the original architecture of a duty of care, as the White Paper originally proposed, we need an explicit set of objectives to ensure clarity for Ofcom when drawing up the codes and when the provisions of the Bill are tested in court, as they inevitably will be. Indeed, in practice, the tests that many of us will use when judging whether to support amendments as the Bill passes through the House are inherently bound up with these purposes, several of which many of us mentioned at Second Reading. Decisions may need to be made on balancing some of these objectives and purposes, but that is the nature of regulation. I have considerable confidence, as I mentioned earlier, in Ofcom’s ability to do this, and those seven objectives—as the right reverend Prelate reminded us, the rule of seven is important in other contexts—set that out.

In their response to the report published more than a year ago, the Government repeated at least half of these objectives in stating their own intentions for the Bill. Indeed, they said:

“We are pleased to agree with the Joint Committee on the core objectives of the Bill”,


and, later:

“We agree with all of the objectives the Joint Committee has set out, and believe that the Bill already encapsulates and should achieve these objectives”.


That is exactly the point of dispute: we need this to be explicit, and the Government seem to believe that it is implicit. Despite agreeing with those objectives, at paragraph 21 of their response the Government say:

“In terms of the specific restructure that the Committee suggested, we believe that using these objectives as the basis for Ofcom’s regulation would delegate unprecedented power to a regulator. We do not believe that reformulating this regulatory framework in this way would be desirable or effective. In particular, the proposal would leave Ofcom with a series of high-level duties, which would likely create an uncertain and unclear operating environment”.


That is exactly the opposite of what most noble Lords have been saying today.

It has been an absolute pleasure to listen to so many noble Lords across the Committee set out their ambitions for the Bill and their support for this amendment. It started with the noble Baroness, Lady Kidron, talking about this set of purposes being the “North Star”. I pay tribute to her tireless work, which drove all of us in the Joint Committee on in an extremely positive way. I am not going to go through a summing-up process, but what my noble friend had to say about the nature of the risk we are undertaking and the fact that we need to be clear about it was very important. The whole question of clarity and certainty for business and the platforms, in terms of making sure that they understand the purpose of the Bill—as the noble Baroness, Lady Harding, and many other noble Lords mentioned—is utterly crucial.

If noble Lords look at the impact assessment, they will see that the Government seem to think the cost of compliance is a bagatelle—but, believe me, it will not be. It will be a pretty expensive undertaking to train people in those platforms, across social media start-ups and so on to understand the nature of their duties.

Lord Allan of Hallam (LD)

I was just refreshing myself on what the impact assessment says. It says that the cost of reading and understanding the regulations will range from £177 for a small business to £2,694 for a large category 1 service provider. To reinforce my noble friend’s point: it says it will cost £177 to read and understand the Bill. I am not sure that will be what happens in practice.

Lord Clement-Jones (LD)

I thank my noble friend for having the impact assessment so close to hand; that is absolutely correct.

The noble Baroness, Lady Fox, talked about unintended consequences—apart from bringing the people of Ukraine into the argument, which I thought was slightly extraneous. I think we need a certain degree of humility about the Bill. As the noble Lord, Lord Knight, said, this may well be part 1; we may need to keep iterating to make sure that this is effective for child safety and for the various purposes set out in the Bill. The Government have stated that this amendment would create greater uncertainty, but that is exactly the opposite of what our committee concluded. I believe, as many of us do, that the Government are wrong in taking the view that they have; I certainly hope that they will reconsider.

At Second Reading, the noble Lord, Lord Stevenson, made something that he probably would not want, given the antecedents of the phrase, to characterise as a big open offer to the Minister to work on a cross-party basis to improve the Bill. We on these Benches absolutely agree with that approach. We look forward to the debates in Committee in that spirit. We are all clearly working towards the same objective, so I hope the Government will respond in kind. Today is the first opportunity to do so—I set out that challenge to the Minister.

18:30
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, let me start by saying how pleased I, too, am that we are now in Committee. I thank all noble Lords for giving up their time to attend the technical briefings that officials in my department and I have held since Second Reading and for the collaborative and constructive nature of their contributions in those discussions.

In particular, not least because today is his birthday, I pay tribute to the noble Lord, Lord Stevenson of Balmacara, for his tireless work on the Bill—from his involvement in its pre-legislative scrutiny to his recall to the Front Bench in order to see the job through. We are grateful for his diligence and, if I may say so, the constructive and collaborative way in which he has gone about it. He was right to pay tribute both to my noble friend Lord Gilbert of Panteg, who chaired the Joint Committee, and to the committee’s other members, including all the other signatories to this amendment. The Bill is a better one for their work, and I repeat my thanks to them for it. In that spirit, I am grateful to the noble Lord for bringing forward this philosophical opening amendment. As noble Lords have said, it is a helpful place for us to start and refocus our thoughts as we begin our line-by-line scrutiny of this Bill.

Although I agree with the noble Lord’s broad description of his amendment’s objectives, I am happy to respond to the challenge that lies behind it and put the objectives of this important legislation clearly on the record at the outset of our scrutiny. The Online Safety Bill seeks to bring about a significant change in online safety. The main purposes of the Bill are: to give the highest levels of protection to children; to protect users of all ages from being exposed to illegal content; to ensure that companies’ approach focuses on proactive risk management and safety by design; to protect people who face disproportionate harm online including, for instance, because of their sex or their ethnicity or because they are disabled; to maintain robust protections for freedom of expression and privacy; and to ensure that services are transparent and accountable.

The Bill will require companies to take stringent measures to tackle illegal content and protect children, with the highest protections in the Bill devoted to protecting children; as the noble Baroness, Lady Benjamin, my noble friend Lord Cormack and others have again reminded us today, that is paramount. Children’s safety is prioritised throughout this Bill. Not only will children be protected from illegal content through its illegal content duties but its child safety duties add an additional layer of protection so that children are protected from harmful or inappropriate content such as grooming, pornography and bullying. I look forward to contributions from the noble Baroness, Lady Kidron, and others who will, I know, make sure that our debates are properly focused on that.

Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure both that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does.

Regulated services will need to prioritise responding to online content and activity that present the highest risk of harm to users, including where this is linked to something classified as a protected characteristic under the terms of the Equality Act 2010. This will ensure that platforms protect users who are disproportionately affected by online abuse—for example, women and girls. When undertaking child safety and illegal content risk assessments, providers must consider whether certain people face a greater risk of harm online and ensure that those risks are addressed and mitigated.

The Bill will place duties relating to freedom of expression and privacy on both Ofcom and all in-scope companies. Those companies will have to consider and implement safeguards for freedom of expression when fulfilling their duties. Ofcom will need to carry out its new duties in a way that protects freedom of expression. The largest services will also have specific duties to protect democratic and journalistic content.

Ensuring that services are transparent about the risks on their services and the actions they are taking to address them is integral to this Bill. User-to-user services must set out in their terms of service how they are complying with their illegal and child safety duties. Search services must do the same in public statements. In addition, government amendments that we tabled yesterday will require the biggest platforms to publish summaries of their illegal and their child safety risk assessments, increasing transparency and accountability, and Ofcom will have a power to require information from companies to assess their compliance with providers’ duties.

Finally, the Bill will also increase transparency and accountability relating to platforms with the greatest influence over public discourse. They will be required to ensure that their terms of service are clear and properly enforced. Users will be able to hold platforms accountable if they fail to enforce those terms.

The noble Baroness, Lady Kidron, asked me to say which of the proposed new paragraphs (a) to (g), to be inserted by Amendment 1, are not the objectives of this Bill. Paragraph (a) sets out that the Bill must ensure that services

“do not endanger public health or national security”.

The Bill will certainly have a positive impact on national security, and a core objective of the Bill is to ensure that platforms are not used to facilitate terrorism. Ofcom will issue a stand-alone code on terrorism, setting out how companies can reduce the risk of their services being used to facilitate terrorist offences, and remove such content swiftly if it appears. Companies will also need to tackle the new foreign interference offence as a priority offence. This will ensure that the Bill captures state-sponsored disinformation, which is of most concern—that is, attempts by foreign state actors to manipulate information to interfere in our society and undermine our democratic, political and legal processes.

The Bill will also have a positive impact on public health but I must respectfully say that that is not a primary objective of the legislation. In circumstances where there is a significant threat to public health, the Bill already provides powers for the Secretary of State both to require Ofcom to prioritise specified objectives when carrying out its media literacy activity and to require companies to report on the action they are taking to address the threat. Although the Bill may lead to additional improvements—I am sure that we all want to see them—for instance, by increasing transparency about platforms’ terms of service relating to public health issues, making this a primary objective on a par with the others mentioned in the noble Lord’s amendment risks making the Bill much broader and more unmanageable. It is also extremely challenging to prohibit such content, where it is viewed by adults, without inadvertently capturing useful health advice or legitimate debate and undermining the fundamental objective of protecting freedom of expression online—a point to which I am sure we will return.

The noble Lord’s amendment therefore reiterates many objectives that are interwoven throughout the legislation. I am happy to say again on the record that I agree with the general aims it proposes, but I must say that accepting it would be more difficult than the noble Lord and others who have spoken to it have set out. Accepting this amendment, or one like it, would create legal uncertainty. I have discussed with the officials sitting in the Box—the noble Baroness, Lady Chakrabarti, rightly paid tribute to them—the ways in which such a purposive statement, as the noble Lord suggests, could be made; we discussed it between Second Reading and now.

I appreciate the care and thought with which the noble Lord has gone about this—mindful of international good practice in legislation and through discussion with the Public Bill Office and others, to whom he rightly paid tribute—but any deviation from the substantive provisions of the Bill and the injection of new terminology risk creating uncertainty about the proper interpretation and application of those provisions. We have heard that again today; for example, the noble Baroness, Lady Fox, said that she was not clear what the meaning of certain words may be while my noble friend Lady Stowell made a plea for simplicity in legislation. The noble Lord, Lord Griffiths, also gave an eloquent exposition of the lexicographical befuddlement that can ensue when new words are added. All pointed to some confusion; indeed, there have been areas of disagreement even in what I am sure the noble Lord, Lord Stevenson, thinks was a very consensual summary of the purposes of the Bill.

That legal uncertainty could provide the basis for an increased number of judicial reviews or challenges to the decisions taken under the Bill and its framework, creating significant obstacles to the swift and effective implementation of the new regulatory framework, which I know is not something that he or other noble Lords would want. As noble Lords have noted, this is a complicated Bill, but adding further statements and new terminology to it, for however laudable a reason, risks adding to that complication, which can only benefit those with, as the noble Baroness, Lady Kidron, put it, the deepest pockets.

However, lest he think that I and the Government have not listened to his pleas or those of the Joint Committee, I highlight, as my noble friend Lady Stowell did, that the Joint Committee’s original recommendation was that these objectives

“should be for Ofcom”.

The Government took that up in Schedule 4 to the Bill, and in Clause 82(4), which set out objectives for the codes and for Ofcom respectively. At Clause 82(4) the noble Lord will see the reference to

“the risk of harm to citizens presented by content on regulated services”

and

“the need for a higher level of protection for children than for adults”.

I agree with the noble Baroness, Lady Chakrabarti, that it is not impossible to add purposive statements to Bills and nor is it unprecedented. I echo her tribute to the officials and lawyers in government who have worked on this Bill and given considerable thought to it. She has had the benefit of sharing their experience and the difficulties of writing tightly worded legislation. In different moments of her career, she has also had the benefit of picking at the loose threads in legislation and poking at the holes in it. That is the purpose of lawyers who question the thoroughness with which we have all done our work. I will not call them “pesky lawyers”, as she did—but I did hear her say it. I understand the point that she was making in anticipation but reassure her that she has not pre-empted the points that I was going to make.

To the layperson, legislation is difficult to understand, which is why we publish Explanatory Notes, on which the noble Baroness and others may have had experience of working before. I encourage noble Lords, not just today but as we go through our deliberations, to consult those as well. I hope that noble Lords will agree that they are more easily understood, but if they do not do what they say and provide explanation, I will be very willing to listen to their thoughts on it.

So, while I am not going to give the noble Lord, Lord Stevenson, the birthday present of accepting his amendment, I hope that the clear statement that I gave at the outset from this Dispatch Box, which is purposive as well, about the objectives of the Bill, and my outline of how it tries to achieve them, is a sufficient public statement of our intent, and that it achieves what I hope he was intending to get on the record today. I invite him to withdraw his amendment.

Lord Stevenson of Balmacara (Lab)

Well, my Lords, it has been a very good debate, and we should be grateful for that. In some senses, I should bank that; we have got ourselves off to a good start for the subsequent debates and discussions that we will have on the nearly 310 amendments that we must get through before the end of the process that we have set out on.

However, let us pause for a second. I very much appreciated the response, not least because it was very sharp and very focused on the amendment. It would have been tempting to go wider and wider, and I am sure that the Minister had that in mind at some point, but he has not done that. The first substantial point that he made seemed to be a one-pager about what this Bill is about. Suitably edited and brought down to manageable size, it would fit quite well into the Bill. I am therefore a bit puzzled as to why he cannot make the jump, intellectually or otherwise, from having that written for him and presumably working on it late at night with candles so that it was perfect—because it was pretty good; I will read it very carefully in Hansard, but it seemed to say everything that I wanted to say and covered most of the points that everybody else thought of to say, in a way that would provide clarity for those seeking it.

The issue we are left with was touched on by the noble Baroness, Lady Stowell, in her very perceptive remarks. Have we got this pointing in the right direction? We should think about it as a way for the Government to get out of this slightly ridiculous shorthand of the safest place to be online, to a statement to themselves about what they are trying to do, rather than an instruction to Ofcom—because that is where it gets difficult and causes problems with the later stages. This is really Parliament and government agreeing to say this, in print, rather than just through reading Hansard. That then reaches back to where my noble friend Lady Chakrabarti is, and it helps the noble Baroness, Lady Harding, with her very good point, that this will not work if people do not even bother to get through the first page.

18:45
My noble friend Lord Knight mentioned the first page and the opening statement, which the Minister nearly touched on himself in his excellent speech but did not quite. This is a Bill to:
“Make provision for and in connection with the regulation by OFCOM of certain internet services; for and in connection with communications offences; and for connected purposes”.
Really? We can do better than that. Yes, of course there are Explanatory Notes, but it is the Bill that matters and the Bill that Parliament will sign on to, and there is a gap. I understand the downside of this and am not in any sense trying to force us down a road which will lead to unfortunate consequences—although probably not the same ones as the noble Baroness, Lady Fox, talked about. However, seven deadly sins stalk us as we go down this road. Surely between now and the end of Committee we can find a package that would work and cover us. I will leave it there at this stage because we have talked at length. It has been a very good debate. It is my birthday and I want to go and celebrate, but the Minister did not share the real killer, which is that it is my wedding anniversary; I must go.
I beg leave to withdraw the amendment.
Amendment 1 withdrawn.
Clause 2 agreed.
House resumed.

Online Safety Bill

Committee (2nd Day)
Relevant document: 28th Report from the Delegated Powers Committee
15:46
Clause 3: “Regulated service”, “Part 3 service” etc
Amendment 2
Moved by
2: Clause 3, page 3, line 14, at end insert—
“(d) an internet service, other than a regulated user-to-user service or search service, that meets the child user condition and enables or promotes harmful activity and content as set out in Schedule (Online harms to children).”
Member's explanatory statement
This amendment would mean any service that meets the 'child user condition' and enables or promotes harmful activity and content to children, as per a new Schedule, would be in scope of the regulation of the Bill.
Baroness Kidron (CB)

My Lords, I refer the Committee to my interests as put in the register and declared in full at Second Reading. I will speak to Amendment 2 in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Harding, to Amendments 3 and 5 in my name, and briefly to Amendments 19, 22, 298 and 299 in the name of the noble Baroness, Lady Harding.

The digital world does not have boundaries in the way that the Bill does. It is an ecosystem of services and products that are interdependent. A user journey is made up of incremental signals, nudges and enticements that mean that, when we use our devices, very often we do not end up where we intended to start. The current scope covers user-to-user, search and commercial porn services, but a blog or website that valorises self-harm and depression or suggests starving yourself to death is still exempt because it has limited functionality. So too are games without a user-to-user function, in spite of the known harm associated with game addiction highlighted recently by Professor Henrietta Bowden-Jones, national expert adviser on gambling harms, and the World Health Organization in 2019 when it designated gaming disorder as a behavioural addiction.

There is also an open question about immersive technologies, whose protocols are still very much in flux. I am concerned that the Government are willing to assert that these environments will meet the bar of user-to-user when those that are still building immersive environments make quite clear that that is not a given. Indeed, later in Committee I will be able to demonstrate that already the very worst harms are happening in environments that are not clearly covered by the Bill.

Another unintended consequence of the current drafting is that the task of working out whether you are on a regulated or unregulated service is left entirely to children. That is not what we had been promised. In December the Secretary of State wrote in a public letter to parents,

“I want to reassure every person reading this letter that the onus for keeping young people safe online will sit squarely on the tech companies’ shoulders”.

It is likely that the Minister will suggest that the limited-functionality services will be caught by the gatekeepers. But, as in the case of immersive technology, it is dangerous to suggest that, just because search and user-to-user are the primary access points in 2023, that will remain the case. We must be more forward-thinking and ensure that services likely to be accessed that promote harm are in scope by default.

Amendments 3 and 5 are consequential, so I will not debate them now. I have listened to the Government and come back with a reasonable and implementable amendment that applies only to services that are likely to be accessed by children and that enable harm. I now ask the Government to listen and do likewise.

Amendments 92 and 193 cover the child user condition. The phrase “likely to be accessed”, introduced in this House into what became the Data Protection Act 2018, is one of the most unlikely successful British exports. Both the phrase and its definition, set out by the ICO, have been embedded in regulations in countries the world over—yet the Bill replaces this established language while significantly watering down the definition.

The Bill requires

“a significant number of children”

to use the service, or for the service to be

“likely to attract a significant number of users who are children”.

“Significant” in the Bill is defined relative to the overall UK user base, which means that extremely large platforms could deem a few thousand child users not significant compared with the several million-strong user base. Since only services that cross this threshold need comply with the child safety duties, thousands of children will not benefit from the safety duties that the Minister told us last week were at the heart of the Bill.

Amendment 92 would put the ICO’s existing and much-copied definition into the Bill. It says a service is

“likely to be accessed by children”

if

“the service is designed or intended for use by children … children form a substantive and identifiable user group … the possibility of a child accessing the service is more probable than not, taking into consideration … the nature and content of the service and whether that has particular appeal for children … the way in which the service is accessed and any measures in place to prevent children gaining access … market research, current evidence on user behaviour, the user base of similar or existing services”

that are likely to be accessed.

Having two phrases and definitions is bad for business and even worse for regulators. The ICO has first-mover advantage and a more robust test. It is my contention that parents, media and perhaps even our own colleagues would be very shocked to know that the definition in the Bill has the potential for many thousands, and possibly tens of thousands, of children to be left without the protections that the Bill brings forward. Perhaps the Minister could explain why the Government have not chosen regulatory alignment, which is good practice.

Finally, I will speak briefly in support of Amendments 19, 22, 298 and 299. I am certain that the noble Baroness, Lady Harding, will spell out how the app stores of Google and Apple are simply a subset of “search”, in that they are gatekeepers to accessing more than 5 million apps worldwide and the first page of each is indeed a search function. Their inclusion should be obvious, but I will add a specific issue about which I have spoken directly with both companies and about which the 5Rights Foundation, of which I am chair, has written to the ICO.

When we looked at the age ratings of apps across Google Play Store and Apple, four things emerged. First, apps are routinely rated much lower than their own terms and conditions suggest: for example, Amazon Shopping says 18 but has an age rating of 4 on Apple. This pattern goes across both platforms, covering social sites, gaming, shopping, et cetera.

Secondly, the same apps and services did not have the same age rating across both services, which, between them, are gatekeepers for more than 95% of the app market. In one extreme case, an app rated four on one of them was rated 16 on the other, with other significant anomalies being extremely frequent.

Thirdly, almost none of the apps considered their data protection duties in coming to a decision on their age rating, which is a problem, since privacy and safety are inextricably linked.

Finally, in the case of Apple, using a device registered to a 15 year-old, we were able to download age-restricted apps including a dozen or more 18-plus dating sites. In fairness, I give a shoutout to Google, which, because of the age-appropriate design code, chose more than a year ago not to show 18-plus content to children in its Play Store. So this is indeed a political and business choice and not a question of technology. Millions of services are accessed via the App Store. Given the Government’s position—that gatekeepers have specific responsibilities in relation to harmful content and activity—surely the amendments in the name of the noble Baroness, Lady Harding, are necessary.

My preference was for a less complicated Bill based on principles and judged on outcomes. I understand that that ship has sailed, but it is not acceptable for the Government now to use the length and complexity of the Bill as a reason not to accept amendments that would fill loopholes where harm has been proven. It is time to deliver on the promises made to parents and children, and to put the onus for keeping young people safe online squarely on tech companies’ shoulders. I beg to move.

Baroness Harding of Winscombe (Con)

My Lords, I rise to speak to Amendments 19, 22, 298 and 299 in my name and those of the noble Baroness, Lady Stowell, and the noble Lords, Lord Knight and Lord Clement-Jones. I will also briefly add at the end of my speech my support for the amendments in the name of my friend, the noble Baroness, Lady Kidron. It has been a huge privilege to be her support act all the way from the beginnings of the age-appropriate design code; it feels comfortable to speak after her.

I want briefly to set out what my amendments would do. Their purpose is to bring app stores into the child protection elements of the Bill. Amendment 19 would require app stores to prepare

“risk assessments equal to user-to-user services due to their role in distributing online content through apps to children and as a primary facilitator of user-to-user”

services reaching children. Amendment 22 would mandate app stores

“to use proportionate and proactive measures, such as age assurance, to prevent children”

coming into contact with

“primary priority content that is harmful to children”.

Amendments 298 and 299 would simply define “app” and “app stores”.

Let us be clear what app stores do. They enable customers to buy apps and user-to-user services. They enable customers to download free apps. They offer up curated content in the app store itself and decide what apps someone would like to see. They enable customers to search for apps for user-to-user content. They provide age ratings; as the noble Baroness, Lady Kidron, said, there may be different age ratings in different app stores for the same app. They sometimes block the download of apps based on the age rating and their assessment of someone’s age, but not always, and it is different for different app stores.

Why should they be included in this Bill—if it is not obvious from what I have already said? First, two companies are profiting from selling user-to-user products to children. Two app stores account for some 98%-plus of all downloads of user-to-user services, with no requirements to assess the risk of selling those products to children or to mitigate those risks. We do not allow that in the physical world so we should not allow it in the digital world.

Secondly, parents and teenagers tell us that this measure would help. A number of different studies have been done; I will reference just two. One was by FOSI, the Family Online Safety Institute, which conducted an international research project in which parents consistently said that having age assurance at the app store level would make things simpler and more effective for them; ironically, the FOSI research was conducted with Google.

16:00
A second research study, conducted by Internet Matters and TikTok, unambiguously shows that teenagers themselves would prefer having app store age assurance. Neither of those research projects suggests that the age assurance should be instead of age assurance in the apps themselves. They view it as additive, as an addition that would make it simpler for them and ensure that fewer children reach the point of downloading apps that they should not.
The third reason why this is necessary is that, as the noble Baroness, Lady Kidron, said, Google and Apple are already doing some of this. They are doing it differently and should be commended, to some extent, for the progress that they have made over the past five years. Google Family Link and the family functionality on the Apple store are better than they were five years ago. However, we should be troubled that this is currently not regulated. They are age-rating apps differently. Can you imagine, in the physical world, Sainsbury’s deciding that alcohol was suitable for 17 year-olds and above, Tesco deciding that it was suitable for 18 year-olds and above, and government not being able to intervene? That is the world which we are in with access to pornography today.
I am the mother of a 17 year-old girl. I went into her iPhone last night and searched on the Apple App Store. Pornography apps come up as age appropriate for 17+. This is the consequence of an unregulated app store world. Today, as I said, the vast majority of the app market is with Google and Apple. On the day that the Government launch their digital competition Bill, we should hope that over time there will be further app stores. Who is to say that those app stores will do anything to protect children as they try to compete with Google and Apple?
The final reason why we should do this is that a number of app developers, particularly small ones, have expressed to me a concern that app stores might abuse their power of age-gating the internet to block apps that compete with their own. That is exactly why we should regulate this space, rather than leaving it for Google and Apple to decide what an age gate should or should not look like. Self-regulation has failed to protect children online over the past 15 years. Many of us in the Chamber today have been working in this space for at least that long. There is no reason to believe that self-regulation would be any more successful for app stores than it has been for the rest of the internet.
I have tabled these amendments and ask my noble friend the Minister to recognise that I have done so in the spirit of starting the conversation on how we regulate app stores. It is unambiguously clear that we should regulate them. The last thing that I would want to do is have my amendment slow down the progress of this Bill. The last thing that I would want is to slow down Ofcom’s implementation of the Bill. However, we keep being told that this is a framework Bill to focus on systems and processes, and it is an essential part of that framework that app stores are included.
Very briefly, I will speak in support of the amendments tabled by the noble Baroness, Lady Kidron, by telling you a story. One of my first jobs in the retail world was as the commercial director for Woolworths—we are all old enough in this Chamber to remember Woolworths—which was the leading retailer of toys. One of my first category directors for the toy category had come from outside the toy industry. I will never forget the morning when he came to tell me that an own-label Woolworths toy had caused a near-fatal accident with a child. He was new to the industry and had not worked in toys before. He said, “It’s only one child; don’t worry, it’ll be okay”. I remember saying, “That is not how health and safety with children works. This is one incident; we need to delist the product immediately; we need to treat this incredibly seriously. Imagine if that was your child”. I do not begrudge his reaction; he had never worked in that sector before.
However, the reality is that if we do not look at the impact of the digital world on every child, then we are adopting a different standard in the digital world than we do in the physical world. That is why the “likely to be accessed by children” definition that has been tried and tested, not just in this House but in legislatures around the world, should be what is used in this Bill.
The Lord Bishop of Oxford

My Lords, it is a pleasure to follow the two noble Baronesses. I remind the Committee of my background as a board member of the Centre for Data Ethics and Innovation. I also declare an indirect interest, as my oldest son is the founder and studio head of Mediatonic, which is now part of Epic Games and is the maker of “Fall Guys”, which I am sure is familiar to your Lordships.

I speak today in support of Amendments 2 and 92 and the consequent amendments in this group. I also support the various app store amendments proposed by the noble Baroness, Lady Harding, but I will not address them directly in these remarks.

I was remarkably encouraged on Wednesday by the Minister’s reply to the debate on the purposes of the Bill, especially by the priority that he and the Government gave to the safety of children as its primary purpose. The Minister underlined this point in three different ways:

“The main purposes of the Bill are: to give the highest levels of protection to children … The Bill will require companies to take stringent measures to tackle illegal content and protect children, with the highest protections in the Bill devoted to protecting children … Children’s safety is prioritised throughout this Bill”.—[Official Report, 19/4/23; col. 724.]


The purpose of Amendments 2 and 92 and consequent amendments is to extend and deepen the provisions in the Bill to protect children against a range of harms. This is necessary for both the present and the future. It is necessary in the present because of the harms to which children are exposed through a broad range of services, many of which are not currently in the Bill’s scope. Amendment 2 expands the scope to include any internet service that meets the child user condition and enables or promotes harmful activity and content as set out in the schedule provided. Why would the Government not take this step, given the aims and purposes of the Bill to give the highest protection to children?

Every day, the diocese of Oxford educates some 60,000 children in our primary and secondary schools. Almost all of them have or will have access to a smartphone, either late in primary, hopefully, or early in secondary school. The smartphone is a wonderful tool to access educational content, entertainment and friendship networks, but it is also a potential gateway for companies, children and individuals to access children’s inner lives, in secret, in the dead of night and without robust regulation. It therefore exposes them to harm. Sometimes that harm is deliberate and sometimes unintentional. This power for harm will only increase in the coming years without these provisions.

The Committee needs to be alert to generational changes in technology. When I was 16 in secondary school in Halifax, I did a computer course in the sixth form. We had to take a long bus ride to the computer building in Huddersfield University. The computer filled several rooms in the basement. The class learned how to program using punch cards. The answers to our questions came back days later, on long screeds of printed paper.

When my own children were teenagers and my oldest was 16, we had one family computer in the main living room of the house. The family was able to monitor usage. Access to the internet was possible, but only through a dial-up modem. The oldest of my grandchildren is now seven, and many of his friends already have smartphones. In a few years, he will certainly carry a connected device in his pocket and, potentially, have access to the entire internet 24/7.

I want him and millions of other children to have the same protection online as he enjoys offline. That means recognising that harms come in a variety of shapes and sizes. Some are easy to spot, such as pornography. We know the terrible damage that porn inflicts on young lives. Some are more insidious and gradual: addictive behaviours, the promotion of gambling, the erosion of confidence, grooming, self-harm and suicidal thoughts, encouraging eating disorders, fostering addiction through algorithms and eroding the barriers of the person.

The NSPCC describes many harms to children on social networks that we are all now familiar with, but it also highlights online chat, comments on livestream sites, voice chat in games and private messaging among the vectors for harm. According to Ofcom, nine in 10 children in the UK play video games, and they do so on devices ranging from computers to mobile phones to consoles. Internet Matters says that most children’s first interaction with someone they do not know online is now more likely to be in a video game such as “Roblox” than anywhere else. It also found that parents underestimate the frequency with which their children are contacted by strangers online.

The Gambling Commission has estimated that 25,000 children in the UK aged between 11 and 16 are problem gamblers, with many of them introduced to betting via computer games and social media. Families have been left with bills, sometimes of more than £3,000, after uncontrolled spending on loot boxes.

Online companies, we know, design their products with psychological principles of engagement firmly in view, and then refine their products by scraping data from users. According to the Information Commissioner, more than 1 million underage children could have been on TikTok alone, with the platform collecting and using their personal data.

As the noble Baroness, Lady Kidron, has said, we already have robust and tested definitions of scope in the ICO’s age-appropriate design code—definitions increasingly taken up in other jurisdictions. To give the highest protection to children, we need to build on these secure definitions in this Bill and find the courage to extend robust protection across the internet now.

We also need to future-proof this Bill. These key amendments would ensure that any development, any new kind of service not yet imagined which meets the child user condition and enables or promotes harmful activity and content, would be in scope. This would give Ofcom the power to develop new guidance and accountabilities for the applications that are certain to come in the coming years.

We have an opportunity and a responsibility, as the Minister has said, to build the highest protection into this Bill. I support the key amendments standing in my name.

Baroness Stowell of Beeston (Con)

My Lords, first, I beg the indulgence of the Committee to speak briefly at this juncture. I know that no one from the Lib Dem or Labour Benches has spoken yet, but I need to dash over to the Moses Room to speak to some amendments I am moving on the Bill being considered there. Secondly, I also ask the Committee that, if I do not get back in time for the wind-ups, I be forgiven on this occasion.

I simply wanted to say something briefly in support of Amendments 19, 22, 298 and 299, to which I have added my name. My noble friend Lady Harding has already spoken to them comprehensively, so there is little I want to add; I just want to emphasise a couple of points. But first, if I may, I will pick up on something the right reverend Prelate said. I think I am right in saying that the most recent Ofcom research shows that 57% of 7 year-olds such as his grandchild have their own phone, and by the time children reach the age of 12 they pretty much all have their own phone. One can only imagine that the age at which children possess their own device is going to get lower.

Turning to app stores, with which these amendments are concerned, currently it is the responsibility of parents and developers to make sure that children are prevented from accessing inappropriate content. My noble friend’s amendments do not dilute in any way the responsibility that should be held by those two very important constituent groups. All we are seeking to do is ensure that app stores, which are currently completely unregulated, take their share of responsibility for making sure that those seeking to download and then use such apps are in the age group the apps are designed for.

As has already been very powerfully explained by my noble friend and by the noble Baroness, Lady Kidron, different age ratings are being given by the two different app stores right now. It is important for us to understand, in the context of the digital markets and competition Bill, which is being introduced to Parliament today—I cannot tell noble Lords how long we have waited for that legislation and how important it is, not least because it will open up competition, particularly in app stores—that the more competition there will be across app stores and the doorways through which children can go to purchase or download apps, the more important it is that there is consistency and some regulation. That is why I support my noble friend and was very happy to add my name to her amendments.

16:15
Lord Allan of Hallam (LD)

My Lords, it falls to me to inject some grit into what has so far been a very harmonious debate, as I will raise some concerns about Amendments 2 and 22.

I again declare my interest: I spent 10 years working for Facebook, doing the kind of work that we will regulate in this Bill. At this point noble Lords are probably thinking, “So it’s his fault”. I want to stress that, if I raise concerns about the way the regulation is going, it is not that I hold those views because I used to work for the industry; rather, I felt comfortable working in the industry because I always had those views, back to 2003 when we set up Ofcom. I checked the record, and I said things then that are remarkably consistent with how I feel today about how we need to strike the balance between the power of the state and the power of the citizen to use the internet.

I also should declare an interest in respect of Amendment 2, in that I run a blog called regulate.tech. I am not sure how many children are queueing up to read my thoughts about regulation of the tech industry, but they would be welcome to do so. The blog’s strapline is:

“How to regulate the internet without breaking it”.


It is very much in that spirit that I raise concerns about these two amendments.

I certainly understand the challenges for content that is outside of the user-to-user or search spaces. I understand entirely why the noble Baroness, Lady Kidron, feels that something needs to be done about that content. However, I am not sure that this Bill is the right vehicle to address that kind of content. There are principled and practical reasons why it might be a mistake to extend the remit here.

The principle is that the Bill’s fundamental purpose is to restrict access to speech by people in the United Kingdom. That is what legislation such as this does: it restricts speech. We have a framework in the Human Rights Act, which tells us that when we restrict speech we have to pass a rigorous test to show that those restrictions are necessary and proportionate to the objective we are trying to achieve. Clearly, when dealing with children, we weight very heavily in that test whether something is necessary and proportionate in favour of the interest of the welfare of the children, but we cannot do away with the test altogether.

It is clear that the Government have applied that test over the years that they have been preparing this Bill and determined that there is a rationale for intervention in the context of user-to-user services and search services. At the same time, we see in the Bill that the Government’s decision is that intervention is not justified in all sorts of other contexts. Email and SMS are excluded. First-party publisher content is excluded, so none of the media houses will be included. We have a Bill that is very tightly and specifically framed around dealing with intermediaries, whether that is user-to-user intermediaries who intermediate in user-generated content, or search as an intermediary, which scoops up content from across the internet and presents it to you.

This Bill is about regulating the regulators; it is not about regulating first-party speakers. A whole world of issues will come into play if we move into that space. It does not mean that it is not important, just that it is different. There is a common saying that people are now bandying around, which is that freedom of speech is not freedom of reach. To apply a twist to that, restrictions on reach are not the same as restrictions on speech. When we talk about restricting intermediaries, we are talking about restricting reach. If I have something I want to say and Facebook or Twitter will not let me say it, that is a problem and I will get upset, but it is not the same as being told that I cannot say it anywhere on the internet.

My concern about Amendment 2 is that it could lead us into a space where we are restricting speech across the internet. If we are going to do that—there may be a rationale for doing it—we will need to go back and look at our necessity and proportionality test. It may play out differently in that context from user-to-user or intermediary-based services.

From a practical point of view, we have a Bill that, we are told, will give Ofcom the responsibility of regulating 25,000 more or less different entities. They will all be asked to pay money to Ofcom and will all be given a bunch of guidance and duties that they have to fulfil. Again, those duties, as set out in painful length in the Bill, are very specifically about the kind of things that an intermediary should do to its users. If we were to be regulating blogs or people’s first-party speech, or publishers, or the Daily Telegraph, or whoever else, I think we would come up with a very different set of duties from the duties laid out in the Bill. I worry that, however well-motivated, Amendment 2 leads us into a space for which this Bill is not prepared.

I have a lot of sympathy with the views of the noble Baroness, Lady Harding, around the app stores. They are absolutely more like intermediaries, or search, but again the tools in the Bill are not necessarily dedicated to how one would deal with app stores. I was interested in the comments of the noble Baroness, Lady Stowell, on what will be happening to our competition authorities; a lot will be happening in that space. On app stores, I worry about what is in Amendment 22: we do not want app stores to think that it is their job to police the content of third-party services. That is Ofcom’s job. We do not want the app stores to get in the middle, not least because of these commercial considerations. We do not want Apple, for instance, thinking that, to comply with UK legislation, it might determine that WhatsApp is unsafe while iMessage is safe. We do not want Google, which operates Play Store, to think that it would have a legal rationale for determining that TikTok is unsafe while YouTube is safe. Again, I know that this is not the noble Baroness’s intention or aim, but clearly there is a risk that we open that up.

There is something to be done about app stores but I do not think that we can roll over the powers in the Bill. When we talk about intermediaries such as user-to-user services and search, we absolutely want them to block bad content. The whole thrust of the Bill is about forcing them to restrict bad content. When it comes to app stores, the noble Baroness set out some of her concerns, but I think we want something quite different. I hesitate to say this, as I know that my noble friend is supportive of it, but I think that it is important as we debate these issues that we hear some of those concerns.

Lord Knight of Weymouth (Lab)

Could it not be argued that the noble Lord is making a case for regulation of app stores? Let us take the example of Apple’s dispute with “Fortnite”, where Apple is deciding how it wants to police things. Perhaps if this became a more regulated space Ofcom could help make sure that there was freedom of access to some of those different products, regardless of the commercial interests of the people who own the app stores.

Lord Allan of Hallam (LD)

The noble Lord makes a good point. I certainly think we are heading into a world where there will be more regulation of app stores. Google and Apple are commercial competitors with some of the people who are present in their stores. A lot of the people in their stores are in dispute with them over things such as the fees that they have to pay. It is precisely for that reason that I do not think we should be throwing online safety into the mix.

There is a role for regulating app stores, which primarily focuses on these commercial considerations and their position in the market. There may be something to be done around age-rating; the noble Baroness made a very good point about how age-rating works in app stores. However, if we look at the range of responsibilities that we are describing in this Bill and the tools that we are giving to intermediaries, we see that they are the wrong, or inappropriate, set of tools.

Baroness Harding of Winscombe (Con)

Would the noble Lord acknowledge that app stores are already undertaking these age-rating and blocking decisions? Google has unilaterally decided that, if it assesses that you are under 18, it will not serve up over-18 apps. My concern is that this is already happening but it is happening indiscriminately. How would the noble Lord address that?

Lord Allan of Hallam (LD)

The noble Baroness makes a very good point; they are making efforts. There is a role for app stores to play but I hope she would accept that it is qualitatively different from that played by a search engine or a user-to-user service. If we were to decide, in both instances, that we want app stores to have a greater role in online safety and a framework that allows us to look at blogs and other forms of content, we should go ahead and do that. All I am arguing is that we have a Bill that is carefully constructed around two particular concepts, a user-to-user service and a search engine, and I am not sure it will stretch that far.

Baroness Kidron (CB)

I want to reassure the noble Lord: I have his blog in front of me and he was quite right—there were not a lot of children on that site. It is a very good blog, which I read frequently.

I want to make two points. First, age-rating and age-gating are two different things, and I think the noble Lord has conflated them. There is a lot of age-rating going on, and it is false information. We need good information, and we have not managed to get it by asking nicely. Secondly, I slightly dispute his idea that we have a very structured Bill regarding user-to-user and so on. We have a very structured Bill from a harms perspective that describes the harms that must be prevented—and then we got to commercial porn, and we can also get to these other things.

I agree with the noble Lord’s point about freedom of speech, but we are talking about a fixed set of harms that will, I hope, be in the Bill by the end. We can then say that if a service is likely to be accessed by children under this test, and known harm is there, that is what we are looking at. We are certainly not looking at the noble Lord’s blog.

Lord Allan of Hallam (LD)

I appreciate the intervention by the noble Baroness; I hope through this grit we may conjure up a pearl of some sort. The original concept of the Bill, as championed by the noble Baroness, would have been a generalised set of duties of care which could have stretched much more broadly. It has evolved in a particular direction and become ever more specific and tailored to those three services: user-to-user, search, and pornography services. Having arrived at that point, it is difficult to then open it back up and stretch it to reach other forms of service.

My intention in intervening in this debate is to raise some of those concerns because I think they are legitimate. I may be at the more sceptical end of the political world, but I am at the more regulation-friendly end of the tech community. This is said in a spirit of trying to create a Bill that will actually work. I have done the work, and I know how hard Ofcom’s job will be. That sums up what I am trying to say: my concern is that we should not give Ofcom an impossible job. We have defined something quite tight—many people still object to it, think it is too loose and do not agree with it—but I think we have something reasonably workable. I am concerned that, however tempting it is, by re-opening Pandora’s box we may end up creating something less workable.

That does not mean we should forget about app stores and non-user-to-user content, but we need to think of a way of dealing with those which does not necessarily just roll over the mechanism we have created in the Online Safety Bill to other forms of application.

Baroness Healy of Primrose Hill (Lab)

I strongly support the amendments in the name of the noble Baroness, Lady Kidron, because I want to see this Bill implemented but strengthened in order to fulfil the admirable intention that children must be safe wherever they are online. This will not be the case unless child safety duties are applicable in all digital environments likely to be accessed by children. This is not overly ambitious or unrealistic; the platforms need clarity as to these new responsibilities and Ofcom must be properly empowered to enforce the rules without worrying about endless legal challenges. These amendments will give that much-needed clarity in this complex area.

As the Joint Committee recommended, this regulatory alignment would simplify compliance for businesses while giving greater clarity to people who use the service and greater protection for children. It would give confidence to parents and children that they need not work out if they are in a regulated or unregulated service while online. The Government promised that the onus for keeping young people safe online would sit squarely on the tech companies’ shoulders.

Without these amendments, there is a real danger that a loophole will remain whereby some services, even those that are known to harm, are exempt, leaving thousands of children exposed to harm. They would also help to future-proof the Bill. For example, some parts of the metaverse as yet undeveloped may be out of scope, but already specialist police units have raised concerns that abuse rooms, limited to one user, are being used to practise violence and sexual violence against women and girls.

We can and must make this good Bill even better and support all the amendments in this group.

16:30
Lord Russell of Liverpool (CB)

My Lords, as I listen to the words echoing around the Chamber, I try to put myself in the shoes of parents or children who, in one way or another, have suffered as a result of exposure to things happening online. Essentially, the world that we are talking about has been allowed to grow like Topsy, largely unregulated, at a global level and at a furious pace, and that is still happening as we do this. The horses have not just bolted the stable; they are out of sight and across the ocean. We are talking about controlling and understanding an environment that is moving so quickly that, however fast we move, we will be behind it. Whatever mousetraps we put in place to try to protect children, we know there are going to be loopholes, not least because children individually are probably smarter than we are collectively at knowing how to get around well-meaning safeguards.

There are ways of testing what is happening. Certain organisations have used what they term avatars. Essentially, you create mythical profiles of children, which are clearly stated as being children, and effectively let them loose in the online world in various directions on various platforms and observe what happens. The tests that have been done on this—we will go into this in more detail on Thursday when we talk about safety by design—are pretty eye-watering. The speed with which these avatars, despite being openly stated as being profiles of children, are deluged by a variety of content that should be nowhere near children is dramatic and incredibly effective.

I put it to the Minister and the Bill team that one of the challenges for Ofcom will be not to be so far behind the curve that it is always trying to catch up. It is like being a surfer: if you are going to keep going then you have to keep on the front side of the wave. The minute you fall behind it, you are never going to catch up. I fear that, however well-intentioned so much of the Bill is, unless and until His Majesty’s Government and Ofcom recognise that we are probably already slightly behind the crest of the wave, whatever we try to do and whatever safeguards we put in place are not necessarily going to work.

One way we can try to make what we do more effective is the clever, forensic use of approaches such as avatars, not least because I suspect their efficacy will be dramatically increased by the advent and use of AI.

Lord Bethell (Con)

Tim Cook, the CEO of Apple, put it very well:

“Kids are born digital, they’re digital kids now … And it is, I think, really important to set some hard rails around it”.


The truth is that in the area of app stores, Google and Apple, which, as we have heard, have a more than 95% share of the market, are just not voluntarily upholding their responsibilities in making the UK a safe place for children online. There is an air of exceptionalism about the way they behave that suggests they think the digital world is somehow different from the real world. I do not accept that, which is why I support the amendments in the name of my noble friend Lady Harding and others—Amendments 19, 22, 298, 299 and other connected amendments.

There are major holes in the app stores’ child safety measures, which mean that young teens can access adult apps that offer dating, random chats, casual sex and gambling, even when Apple and Google emphatically know that the user is a minor. I will give an example. Using an Apple ID for a simulated 14 year-old, the Tech Transparency Project looked at 80 apps in the App Store that are theoretically limited to 17 and older. It found that underage users could very easily evade age restrictions in the vast majority of cases. There is a dating app that opens directly into pornography before ever asking the user’s age; adult chat apps filled with explicit images that never ask the user’s age, and a gambling app that lets the minor account deposit and withdraw money.

What kind of apps are we talking about here? We are talking about apps such as UberHoney; Eros, the hook-up and adult chat app; Hahanono—Chat & Get Naughty, and Cash Clash Games: Win Money. The investigation found that Apple and the apps essentially pass the buck to each other when it comes to blocking underage users, making it easy for young teens to slip through the system. My day-to-day experience as a parent of four children completely echoes that investigation, and it is clear to me that Apple and Google just do not share age data with the apps in their app stores, or else children would not be able to download those apps.

There is a wilful blindness to minors tweaking their age. Parental controls on mobile phones are, to put it politely, a joke. It takes a child a matter of minutes to circumvent them—I know from my experience—and I have wasted many hours fruitlessly trying to control these arrangements. That is just not good enough for any business. It is not good enough because so many teenagers have mobile phones, as we discussed—two-thirds of children have a smartphone by the age of 10. Moreover, it is not good enough because they are accessing huge amounts of filthy content, dodgy services and predatory adults, things that would never be allowed in the real world. The Office of the Children’s Commissioner for England revealed that one in 10 children had viewed pornography by the time they were nine years old. The impact on their lives is profound: just read the testimony on the recent Mumsnet forums about the awful impact of pornography on their children’s lives.

To prevent minors from accessing adult-only apps, the most efficient measure would be, as my noble friend Lady Harding pointed out, to check users’ ages during the distribution step, which means directly in the app store or on the web browser, prior to the app store or the internet browser initiating the app or the platform download. This can be done without the developer knowing the user’s specific age. Developing a reliable age-verification regime applied at that “distribution layer” of the internet supply chain would significantly advance the UK’s objective of creating a safer online experience and set a precedent that Governments around the world could follow. It would apply real-world principles to the internet.

This would not absolve any developer, app or platform of their responsibilities under existing legislation—not at all: it would build on that. Instead, it would simply mandate that every player in the ecosystem, right from the app store distribution layer, was legally obliged to promote a safer experience online. That is completely consistent with the principles and aims of the Online Safety Bill.

These amendments would subject two of the biggest tech corporations to the same duties regarding their app stores as we do the wider digital ecosystem and the real world. It is all about age assurance and protecting children. To the noble Lord, Lord Allan, I say that I cannot understand why my corner shop requires proof of age to buy cigarettes, pornography or booze, but Apple and Google think it is okay to sell apps with inappropriate content and services without proper age-verification measures and with systems that are wilfully unreliable.

There is a tremendous amount that is very good about Tim Cook’s commitment to privacy and his objections to the data industrial complex; but in this matter of the app stores, the big tech companies have had a blind spot to child safety for decades and a feeling of exceptionalism that is just no longer relevant. These amendments are an important step in requiring that app store owners step up to their responsibilities and that we apply the same standards to shopkeepers in the digital world as we would to shopkeepers in the real world.

Lord Storey (LD)

My Lords, I enter this Committee debate with great trepidation. I do not have the knowledge and expertise of many of your Lordships, who I have listened to with great interest. What I do have is experience working with children, for over 40 years, and as a parent myself. I want to make what are perhaps some innocent remarks.

I was glad that the right reverend Prelate the Bishop of Oxford raised the issue of online gaming. I should perhaps declare an interest, in that I think Liverpool is the third-largest centre of online gaming in terms of developing those games. It is interesting to note that over 40% of the entertainment industry’s global revenue comes from gaming, and it is steadily growing year on year.

If I am an innocent or struggle with some of these issues, imagine how parents must feel when they try to cope every single day. I suppose that the only support they currently have, other than their own common sense of course, are rating verifications or parental controls. Even the age ratings confuse them, because there are different ratings for different situations. We know that films are rated by the British Board of Film Classification, which also rates Netflix and now Amazon. But it does not rate Disney, which has its own ratings system.

We also know that the gaming industry has a different ratings system: the PEGI system, which has a number linked to an age. For example, a PEGI 16 rating, if a parent knew this, is applied when the depiction of violence or sexual activity reaches a stage where it looks realistic. The PEGI system also has pictures showing that.

Thanks to the Video Recordings Act 1984, the PEGI 12, PEGI 16 and PEGI 18 ratings became legally enforceable in the UK, meaning that retailers cannot sell those video games to those below those ages. If a child or young person goes in, they could not be sold those games. However, the Video Recordings Act does not currently apply to online games, meaning that children’s safety in online gaming relies primarily on parents setting up parental controls.

I will listen with great interest to the tussles between various learned Lords, as all these issues show to me that perhaps the most important issue will come several Committee days down the path, when we talk about media literacy. That is because it is not just about enforcement, regulation or ratings; it is about making sure that parents have the understanding and the capacity. Let us not forget this about young people: noble Lords have talked about them all having a phone and wanting to go on pornographic sites, but I do not think that is the case at all. Often, young people, because of peer pressure and because of their innocence, are drawn into unwise situations. Then there are the risks that gaming can lead to: for example, gaming addiction was mentioned by the right reverend Prelate the Bishop of Oxford. There is also the health impact and maybe a link with violent behaviour. There is the interactive nature of video game players, cyber bullying and the lack of a feeling of well-being. All these things can happen, which is why we need media literacy to ensure that young people know of those risks and how to cope with them.

The other thing that we perhaps need to look at is standardising some of the simple gateposts that we currently have, hence the amendment.

Baroness Wyld (Con)

My Lords, it is a pleasure to follow the noble Lord, Lord Storey. I support Amendments 19, 22 and so on in the name of my noble friend Lady Harding, on app stores. She set them out so comprehensively that I am not sure there is much I can add. I simply want to thank her for her patience as she led me through the technical arguments.

16:45
I support these amendments as I accept, reluctantly, that children are becoming more and more independent on the internet. I have ummed and ahhed about where parental responsibility starts and ends. I have a seven year-old, a 10 year-old and a 12 year-old. I do not see why any seven year-old, frankly, should have a smartphone. I do not know why any parent would think that is a good idea. It might make me unpopular, but there we are. I accept that a 12 year-old, realistically, has to have a smartphone in this day and age.
I said at Second Reading that Covid escalated digital engagement. It had to, because children had to go onto “Seesaw” and various other apps to access education. As a result, their social lives changed. They became faster and more digital. It seems to be customary to stand up and say that this Bill is very complicated, but at the end, when it passes after all this time, the Government will rightly want to go to parents and say, “We’ve done it; we’ve made this the safest place in the world to be online”.
Unless we support my noble friend’s amendments and can say to parents that we have been holistic about this and recognised a degree of parental responsibility but also the world that children will go into and how it may change—we have heard about the possibility of more app stores, creating a more confusing environment for parents and young people—I do not think we can confidently, hand on heart, say that we achieved what this Bill set out to achieve. On that note, I wholeheartedly support my noble friend’s amendments.
The Lord Bishop of Guildford

My Lords, one of our clergy in the diocese of Guildford has been campaigning for more than a decade, as have others in this Committee, on children’s access to online pornography. With her, I support the amendments in the names of the noble Baronesses, Lady Kidron and Lady Harding.

Her concerns eventually made their way to the floor of the General Synod of the Church of England in a powerful debate in July last year. The synod voted overwhelmingly in favour of a motion, which said that we

“acknowledge that our children and young people are suffering grave harm from free access to online pornography”

and urged us to

“have in place age verification systems to prevent children from having access to those sites”.

It asked Her Majesty’s Government to use their best endeavours to secure the passage and coming into force of legislation requiring age-verification systems preventing access by people under the age of 18. It also recommended more social and educational programmes to increase awareness of the harms of pornography, including self-generated sexually explicit images.

Introducing the motion, my chaplain, Reverend Jo Winn-Smith, said that age verification

“ought to be a no-brainer … Exposure to sexualised material is more likely to lead to young people engaging in more sexualised behaviour and to feel social pressure to have sex”,

as well as normalising sexual violence against girls and women. A speech from the chaplain-general of the Prison Service towards the end of the debate highlighted just where such behaviours and pressures could lead in extreme circumstances.

One major theme that emerged during the debate is highlighted by the amendments this afternoon: that access to online pornography goes far beyond materials that fall into what the Bill defines as Part 5 services. Another is highlighted in a further group of amendments: age assurance needs to be both mandatory and effective beyond reasonable doubt.

There was also comment on how this whole area has taken such an age to get onto the statute book, given David Cameron’s proposals way back in 2013 and further legislation proposed in 2018 that was never enacted. Talk of secondary legislation to define harmful content in that regard is alarming, as a further amendment indicates, given the dragging of feet that has now been perpetuated for more than a decade. That is a whole generation of children and young people.

In an imaginative speech in the synod debate, the most reverend Primate the Archbishop of York, Archbishop Stephen, reminded us that the internet is not a platform; it is a public space, where all the rights and norms you would expect in public should apply. In the 1970s, he continued, we famously put fluoride in the water supply, because we knew it would be great for dental health; now is the opportunity to put some fluoride into the internet. I add only this: let us not water down the fluoride to a point where it becomes feeble and ineffective.

Baroness Benjamin (LD)

My Lords, I will speak in support of the amendments in this group in the names of the intrepid noble Baroness, Lady Kidron, the noble Baroness, Lady Harding, and my noble friend Lord Storey—we are kindred spirits.

As my noble friend said, the expectations of parents are clear: they expect the Bill to protect their children from all harm online, wherever it is encountered. The vast majority of parents do not distinguish between the different content types. To restrict regulation to user-to-user services, as in Part 3, would leave a great many websites and content providers, which are accessed by children, standing outside the scope of the Bill. This is a flagship piece of legislation; there cannot be any loopholes leaving any part of the internet unregulated. If there is a website, app, online game, educational platform or blog—indeed, any content that contains harmful material—it must be in the scope of the Bill.

The noble Baroness, Lady Kidron, seeks to amend the Bill to ensure that it aligns with the Information Commissioner’s age-appropriate design code—it is a welcome amendment. As the Bill is currently drafted, the threshold for risk assessment is too high. It is important that the greatest number of children and young people are protected from harmful content online. The amendments achieve that to a greater degree than the protection already in the Bill.

While the proposal to align with the age-appropriate design code is welcome, I have one reservation. Up until recently, it appears that the ICO was reluctant to take action against pornography platforms that process children’s data. It has perhaps been deemed that pornographic websites are unlikely to be accessed by children. Over the years, I have shared with this House the statistics of how children are accessing pornography and the harm it causes. The Children’s Commissioner also recently highlighted the issue and concerns. Pornography is being accessed by our children, and we must ensure that the provisions of the Bill are the most robust they can be to ensure that children are protected online.

I am concerned with ensuring two things: first, that any platform that contains harmful material falls under the scope of the Bill and is regulated to ensure that children are kept safe; and, secondly, that, as far as possible, what is harmful offline is regulated in the same way online. The amendments in the name of my noble friend Lord Storey raise the important question of online-offline equality. Amendments 33A and 217A seek to regulate online video games to ensure they meet the same BBFC ratings as would be expected offline, and I agree with that approach. Later in Committee, I will raise this issue in relation to pornographic content and how online content should be subject to the same BBFC guidance as content offline. I agree with what my noble friend proposes: namely, that this should extend to video game content as well. Video games can be violent and sexualised in nature, and controls should be in place to ensure that children are protected. The BBFC guidelines used offline appear to be the best way to regulate online as well.

Children must be kept safe wherever they are online. This Bill must have the widest scope possible to keep children safe, but ensuring online/offline alignment is crucial. The best way to keep children safe is to legislate for regulation that is as far reaching as possible but consistently applied across the online/offline world. These are the reasons why I support the amendments in this group.

Baroness Berridge (Con)

My Lords, I will lend my support to Amendments 19 and 22. It is a pleasure to speak after the noble Baroness, Lady Benjamin. I may be one of those people in your Lordships’ House who relies significantly on the British Board of Film Classification for movie watching, as I am one of the faint-hearted.

In relation to app stores, it is not just children under 18 for whom parents need the age verification. If you are a parent of a child who has significant learning delay, the internet is a wonderful place where they can get access to material and have development that they might not ordinarily have had. But, of course, turning 17 or 18 is not the threshold for them. I have friends who have children with significant learning delay. Having that assurance, so they know which apps are which in the app store, goes well beyond 18 for them. Obviously it will not be a numerical equivalent for their child—now a young adult—but it is important to them to know that the content they get on a free app or an app purchased from the app store is suitable.

I just wanted to raise that with noble Lords, as children and some vulnerable adults—not all—would benefit from the kind of age verification that we have talked about. I appreciate the points that the noble Lord, Lord Allan, raised about where the Bill has ended up conceptually and the framework that Ofcom will rely on. Like him, I am a purist sometimes but, pragmatically, I think that the third concept raised by the noble Baroness, Lady Kidron, about protection and putting this in the app store and bringing it parallel with things such as classification for films and other video games is really important.

Lord Clement-Jones (LD)

My Lords, this has been a really fascinating debate and I need to put a stake in the ground pretty early on by saying that, although my noble friend Lord Allan has raised some important points and stimulated an important debate, I absolutely agree with the vast majority of noble Lords who have spoken in favour of the amendment so cogently put forward by the noble Baronesses, Lady Kidron and Lady Harding.

Particularly as a result of the Bill’s being the subject of a Joint Committee, it has changed considerably over time in response to comment, pressure, discussion and debate and I believe very much that during Committee stage we will be able to make changes, and I hope the Minister will be flexible enough. I do not believe that the framework of the Bill is set in concrete. There are many things we can do as we go through, particularly in the field of making children safer, if we take some of the amendments that have been put forward on board. In particular, the noble Baroness, Lady Kidron, set out why the current scope of the Bill will fail to protect children if it is kept to user-to-user and search services. She talked about blogs with limited functionalities, gaming without user functionalities and mentioned the whole immersive environment, which the noble Lord, Lord Russell, described as eye-watering. As she said, it is not fair to leave parents or children to work out whether they are on a regulated service. Children must be safe wherever they are online.

As someone who worked with the noble Baroness, Lady Kidron, in putting the appropriate design code in place in that original Data Protection Act, I am a fervent believer that it is perfectly appropriate to extend in the way that is proposed today. I also support her second amendment, which would bring the Bill’s child user condition in line with the threshold of the age-appropriate design code. It is the expectation—I do not think it an unfair expectation—of parents, teachers and children themselves that the Bill will apply to children wherever they are online. Regulating only certain services will mean that emerging technologies that do not fit the rather narrow categories will not be subject to safety duties.

17:00
The noble Baroness talked about thousands of children being potentially at risk of not having the protection of the Bill. That is absolutely fair comment. Our Joint Committee report said:
“We recommend that the ‘likely to be accessed by children’ test in the draft Online Safety Bill should be the same as the test underpinning the Age Appropriate Design Code”.
The Government responded:
“The government considers that the approach taken in the Bill is aligned with the Age Appropriate Design Code and will ensure consistency for businesses. In addition, the status of the legislative test in the Online Safety Bill is binding in a way that the test in the Age Appropriate Design Code is not”.
In that case, since both of those statements are patently not the case in the current Bill, it is incumbent on the Government to change the Bill in the direction that the noble Baroness has asked for.
My noble friend stimulated a very important debate about the amendments of the noble Baroness, Lady Harding, in particular. That is another major potential omission in the Bill. The tech giants responsible for the distribution of nearly all apps connecting smartphone users to the internet are not currently covered in the scope of the Bill. She said that the online safety regime must look at this whole area much more broadly. App stores should be added to the list of service providers who will be mandated by the Bill to protect children and all users online. I am not going to go into all the arguments that have been made so well by noble Lords today, but of course the Google and Apple app stores have a monopoly on app distribution, yet they do not verify users’ ages. They have the technical ability to prevent minors from accessing certain applications reserved for adults, as evidenced by the existing parental control functions on both smartphone operating systems and their corresponding app stores, and of course, as the noble Baroness, Lady Berridge, said, this applies not just to children but to vulnerable adults as well.
I thought the noble Lord, Lord Bethell, put it very well: other sectors of the economy have already implemented such control in the distribution of goods and services in the offline world; alcohol consumption provides a good example for understanding those issues. Why cannot Google and Apple have duties that a corner store can adhere to? App stores do not have age assurance systems in place and do not actually seem to wish to take any responsibility for the part they can play in permitting harms. I say to my noble friend that the word “store” is the clue: these are products being sold through the app store and there should be age-gating on those apps. The only way to improve safety is to make sure that app developers and companies that distribute these apps do more to ensure that children and vulnerable adults are appropriately kept away from adult applications and content. That is an entirely reasonable duty to place on them: it is an essential part, I think, of the framework of the Bill that we should take these sets of amendments on board.
The right reverend Prelate the Bishop of Oxford talked about the fact that harms will only increase in coming years, particularly, as he said, with ever younger children having access to mobile technology. Of course, I agree with my noble friend about the question of media literacy. This goes hand in hand with regulation, as we will discover when we talk about this later on. These amendments will not, in the words of my noble friend, break the internet: I think they will add substantially and beneficially to regulation.
I say to my noble friend Lord Storey that I support his amendments too; they are more like probing amendments. There is a genuine gap that I think many of us were not totally aware of. I assumed that, in some way, the PEGI classifications applied here, but if age ratings do not apply to online games, that is a major gap. We need to look at that very carefully, alongside these amendments, which I very much hope the Minister will accept.
Lord Knight of Weymouth (Lab)

My Lords, I echo the comments of the noble Lord, Lord Clement-Jones. This is an important group of amendments, and it has been a useful debate. I was slightly concerned when I heard the noble Baroness, Lady Harding, talk about using her daughter’s device to see whether it could access porn sites, given what that will do to her daughter’s algorithm and what it will now feed her. I will put that concern to one side, but any future report on that would be most welcome.

Amendments 2, 3 and 5, introduced so well by the noble Baroness, Lady Kidron, test what should be in scope to protect children. Clearly, we have a Bill that has evolved over some time, with many Ministers, to cover unambiguously social media, as user-to-user content, and search. I suspect that we will spend a lot more time discussing social media than search, but I get the rationale that those are perhaps the two main access points for a lot of the content we are concerned about. However, I would argue that apps are also main access points. I will come on to discuss the amendments in the name of the noble Baroness, Lady Harding, which I have also signed. If we are going to go with access points, it is worth probing and testing the Government’s intent in excluding some of these other things. The noble Lord, Lord Storey, raises in his amendments the issue of games, as others have done. Games are clearly a point of access for lots of children, as well as adults, and there is plenty of harm that can be created as a result of consuming them.

Along with some other noble Lords, some time ago I attended an all-party group which looked at the problems related to incel harm online and how people are breadcrumbed from mainstream sites to quite small websites to access the really problematic, most hateful and most dangerous content. Those small websites, as far as I can see, are currently excluded from the regime in the Bill, but the amendments in the name of the noble Baroness, Lady Kidron, would potentially bring them into scope. That meeting also discussed cloud services and the supply chain of technical infrastructure that such communities, including incels and others, use. Why are cloud services not included in some context in terms of the harms that might be created?

Questions have been asked about large language model AIs such as ChatGPT. These are future technologies that have now arrived, which lots of people are talking about and variously freaking out about or getting excited by. There is an important need to bring those quite quickly into the scope of regulation by Ofcom. ChatGPT is a privately owned platform—a privately owned technology—that is offering up not only access to the range of knowledge that is online but, essentially, the range of human concepts that are online in interaction with that knowledge—privately owned versions of truth.

What is to stop any very rich individual deciding to start their own large language model with their own version of the truth, perhaps using their own platform? Former President Trump comes to mind as someone who could do that and I suggest that, if truth is now a privatised thing, we might want to have some regulation here.

The future-proofing issues are why we should be looking very seriously at the amendments in the name of the noble Baroness, Lady Kidron. I listened carefully to the noble Lord, Lord Allan, as always, and I have reflected a lot on his very useful car safety and plane safety regulation analogy from our previous day in Committee. The proportionality issue that he raised in his useful contribution this time is potentially addressed by the proposed new clause we discussed last time. If the Bill sets out quite clearly the aim of the legislation, that would set the frame for the regulator and for how it would regulate proportionately the range of internet services that might be brought into scope by this set of amendments.

I also support Amendment 92, on bringing in safety by design and the regime that has been so successful in respect of the age-appropriate design code and the probability of access by children, rather than what is set out in the Bill.

I turn to Amendments 19, 22, 298 and 299 in the names of the noble Baronesses, Lady Harding and Lady Stowell, the noble Lord, Lord Clement-Jones, and myself. Others, too, have drawn the analogy between app stores and corner shops selling alcohol, and it makes sense to think about the distribution points in the system—the pinch points that all users go through—and to see whether there is a viable way of protecting people and regulating through those pinch points. The Bill seeks to protect us via the platforms that host and promote content having regulation imposed on them, and risk assessments and so on, but it makes a lot of sense to add app stores, given how we now consume the internet.

I remember, all those years ago, having CD drives—floppy disk drives, even—in computers, and going off to buy software from a retail store and having to install it. I do not go quite as far back as the right reverend Prelate the Bishop of Oxford, but I remember those days well. Nowadays as consumers almost all of us access our software through app stores, be it software for our phones or software for our laptops. That is the distribution point for mobiles and essentially it is, as others have said, a duopoly that we hope will be addressed by the Digital Markets, Competition and Consumers Bill.

As others have said, 50% of children under 10 in this country use smartphones and tablets. When you get to the 12 to 15 bracket, you find that 97% of them use mobile phones and tablets. We have, as noble Lords have also said, Google Family Link and the Apple Family Sharing function. That is something we use in my family. My stepdaughter is 11—she will be 12 in June—and I appear to be in most cases the regulator who has to give her the Family Link code to go on to Google Classroom when she does her homework, and who has to allow her to download an app or add another contact—there is a whole range of things on her phone for which I provide the gatekeeper function. But you have to be relatively technically competent and confident to do all those things, and to manage her screen time, and I would like to see more protection for those who do not have that confidence—and indeed for myself as well, because maybe I would not have to be bothered quite as often.

It is worth noting that the vast majority of children in this country who have smartphones—the last time I looked at the stats, it was around 80%—have iPhones; there must be a lot of old iPhones that have been recycled down the family. To have an iCloud account, if you are under 13, you have to go through a parent or other suitable adult. However, if you are over 13, you can get on with it; that raises a whole set of issues and potential harms for children over the age of 13.

17:15
I am less familiar with the user journey and how it works on Google Play—we are more of an Apple family—but my understanding is that, for both Google Play and the Apple App Store, in order to set up an account you need credit card billing information. This creates ID verification, and the assurance that many of us are looking for is then provided as an additional safeguard for children. This is not something that anyone is arguing should replace the responsibilities set out in the Bill for internet service providers—for example, that they should carry out risk assessments and be regulated. This is about having additional safeguards at the point of distribution. We are not asking Apple and Google, in this case, to police the apps. We are asking them to ensure that the publishers of the applications set an age limit and then facilitate ensuring that that age limit is adhered to, according to everything that they know about the user of that device and their age. I am grateful to the noble Baroness, Lady Harding, for her amendments on this important issue.
Finally, let me say this in anticipation of the Minister perhaps suggesting that this might be a good idea but that we are too far down the road with the Bill, that Ofcom is ready to go and that we want to get on with implementing it, so perhaps we should not do this now but leave it to another piece of legislation. Personally, I am interested in having a conversation about the sequence of implementation. It might be that we can implement the regime that Ofcom is good to go on but with the powers there in the Bill for it to cover app stores and some other wider internet services, according to a road map that it sets out and that we in Parliament can scrutinise. However, my general message is, as the noble Baroness, Lady Kidron, said, that we should get this right in this legislation and grab the opportunity, particularly with app stores, to bring other internet services in—given that we consume so much through applications—and to provide a safer environment for our children.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I share noble Lords’ determination to deliver the strongest protections for children and to develop a robust and future-proofed regulatory regime. However, it will not be possible to solve every problem on the internet through this Bill, nor through any piece of legislation, flagship or otherwise. The Bill has been designed to confer duties on the services that pose the greatest risk of harm—user-to-user services and search services—and where there are proportionate measures that companies can take to protect their users.

As the noble Baroness, Lady Kidron, and others anticipated, I must say that these services act as a gateway for users to discover and access other online content through search results and links shared on social media. Conferring duties on these services will therefore significantly reduce the risk of users going on to access illegal or harmful content on non-regulated services, while keeping the scope of the Bill manageable and enforceable.

As noble Lords anticipated, there is also a practical consideration for Ofcom in all this. I know that many noble Lords are extremely keen to see this Bill implemented as swiftly as possible; so am I. However, as the noble Lord, Lord Allan, rightly pointed out, making major changes to the Bill’s scope at this stage would have significant implications for Ofcom’s implementation timelines. I say this at the outset because I want to make sure that noble Lords are aware of those implications as we look at these issues.

I turn first to Amendments 2, 3, 5, 92 and 193, tabled by the noble Baroness, Lady Kidron. These aim to expand the number of services covered by the Bill to incorporate a broader range of services accessed by children and a broader range of harms. I will cover the broader range of harms more fully in a separate debate when we come to Amendment 93, but I am very grateful to the noble Baroness for her constructive and detailed discussions on these issues over the past few weeks and months.

These amendments would bring new services into scope of the duties beyond user-to-user and search services. This could include services which enable or promote commercial harms, including consumer businesses such as online retailers. As I have just mentioned in relation to the previous amendments, bringing many more services into scope would delay the implementation of Ofcom’s priorities and risk detracting from its work overseeing existing regulated services where the greatest risk of harm exists—we are talking here about the services run by about 2.5 million businesses in the UK alone. I hope noble Lords will appreciate from the recent communications from Ofcom how challenging the implementation timelines already are, without adding further complication.

Amendment 92 seeks to change the child-user condition in the children’s access assessment to the test in the age-appropriate design code. The test in the Bill is already aligned with the test in that code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for providers. The current child-user condition determines that a service is likely to be accessed by children where it has a significant number or proportion of child users, or where it is of a kind likely to attract a significant number or proportion of child users. This will already bring into scope services of the kind set out in this amendment, such as those which are designed or intended for use by children, or where children form a—

Baroness Kidron (CB)

I am sorry to interrupt. Will the Minister take the opportunity to say what “significant” means, because that is not aligned with the ICO code, which has different criteria?

Lord Parkinson of Whitley Bay (Con)

If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantial and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.

On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantive” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so this amendment would give rise to the risk that the condition is more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.

The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill and emerging technologies. As he knows, the Bill has been designed to be technology neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a child access test and comply with the child safety duties if it is likely to be accessed by children.

Lord Knight of Weymouth (Lab)

I know it has been said that the large language models, such as that used by ChatGPT, will be in scope when they are embedded in search, but are they in scope generally?

Lord Parkinson of Whitley Bay (Con)

They are in scope when they form part of services which enable users to share content online and interact with each other, or which provide search. They apply in the context of the other duties set out in the Bill.

Amendments 19, 22, 298 and 299, tabled by my noble friend Lady Harding of Winscombe, seek to impose child safety duties on application stores. I am grateful to my noble friend and others for the collaborative approach that they have shown and for the time that they have dedicated to discussing this issue since Second Reading. I appreciate that she has tabled these amendments in the spirit of facilitating a conversation, which I am willing to continue to have as the Bill progresses.

As my noble friend knows from our discussions, there are challenges with bringing application stores—or “app stores” as they are popularly called—into the scope of the Bill. Introducing new duties on such stores at this stage risks slowing the implementation of the existing child safety duties, in the way that I have just outlined. App stores operate differently from user-to-user and search services; they pose different levels of risk and play a different role in users’ experiences online. Ofcom would therefore need to recruit different people, or bring in new expertise, to supervise effectively a substantially different regime. That would take time and resources away from its existing priorities.

We do not think that that would be a worthwhile new route for Ofcom, given that placing child safety duties on app stores is unlikely to deliver any additional protections for children using services that are already in the scope of the Bill. Those services must already comply with their duties to keep children safe or will face enforcement action if they do not. If companies do not comply, Ofcom can rely on its existing enforcement powers to require app stores to remove applications that are harmful to children. I am happy to continue to discuss this matter with my noble friend and the noble Lord, Lord Knight, in the context of the differing implementation timelines, as he has asked.

Lord Stevenson of Balmacara (Lab)

The Minister just said something that was material to this debate. He said that Ofcom has existing powers to prevent app stores from providing material that would have caused problems for the services to which they allow access. Can he confirm that?

Lord Parkinson of Whitley Bay (Con)

Perhaps the noble Lord could clarify his question; I was too busy finishing my answer to the noble Lord, Lord Knight.

Lord Stevenson of Balmacara (Lab)

It is a continuation of the point raised by the noble Baroness, Lady Harding, and it seems that it will go part of the way towards resolving the differences that remain between the Minister and the noble Baroness, which I hope can be bridged. Let me put it this way: is it the case that Ofcom either now has powers or will have powers, as a result of the Bill, to require app stores to stop supplying children with material that is deemed in breach of the law? That may be the basis for understanding how you can get through this. Is that right?

Lord Parkinson of Whitley Bay (Con)

Services already have to comply with their duties to keep children safe. If they do not comply, Ofcom has powers of enforcement set out, which require app stores to remove applications that are harmful to children. We think this already addresses the point, but I am happy to continue discussing it offline with the noble Lord, my noble friend and others who want to explore how. As I say, we think this is already covered. A more general duty here would risk distracting from Ofcom’s existing priorities.

Lord Allan of Hallam (LD)

My Lords, on that point, my reading of Clauses 131 to 135, where the Bill sets out the business disruption measures, is that they could be used precisely in that way. It would be helpful for the Minister responding later to clarify that Ofcom would use those business disruption measures, as the Government explicitly anticipate, were an app store, in a rogue way, to continue to list a service that Ofcom has said should not be made available to people in the United Kingdom.

Lord Parkinson of Whitley Bay (Con)

I will be very happy to set that out in more detail.

Amendments 33A and 217A in the name of the noble Lord, Lord Storey, would place a new duty on user-to-user services that predominantly enable online gaming. Specifically, they would require them to have a classification certificate stating the age group for which they are suitable. We do not think that is necessary, given that there is already widespread, voluntary uptake of age classification systems in online gaming.

17:30
The Government work closely with the industry and with the Video Standards Council to promote and encourage the displaying of Pan-European Games Information—PEGI—age ratings online. That has contributed to almost 3 million online games being given such ratings, including Roblox, mentioned by the right reverend Prelate the Bishop of Oxford. Most major online storefronts have made it mandatory for game developers supplying digital products on their platforms to obtain and display PEGI ratings. These include Google Play, Microsoft, PlayStation, Nintendo, Amazon Luna and Epic. Apple uses its own age ratings, rather than PEGI ratings, on all the video games available on its App Store.
Online games in the UK can obtain PEGI ratings by applying directly to the Video Standards Council or via the International Age Rating Coalition system, which provides ratings based on answers to a questionnaire when a game is uploaded. That system ensures that, with unprecedented volumes of online video games—and the noble Lord is right to point to the importance of our creative industries—all digital content across most major digital storefronts can carry a PEGI rating. These ratings are regularly reviewed by international regulators, including our own Video Standards Council, and adjusted within hours if found to be incorrect.
I hope that gives the noble Lord the reassurance that the points he is exploring through his amendments are covered. I invite him not to press them and, with a promise to continue discussions on the other amendments in this group, I invite their proposers to do the same.
Baroness Kidron (CB)

I thank the Minister for an excellent debate; I will make two points. First, I think the Minister was perhaps answering on my original amendment, which I have narrowed considerably to services

“likely to be accessed by children”

and with proven harm on the basis of the harms described by the Bill. It is an “and”, not an “or”, allowing Ofcom to go after places that have proven to be harmful.

Secondly, I am not sure the Government can have it both ways—that it is the same as the age-appropriate design code but different in these ways—because it is exactly in the ways that it is different that I am suggesting the Government might improve. We will come back to both those things.

Finally, what are we asking here? We are asking for a risk assessment. The Government say there is no risk assessment, no harm, no mitigation, nothing to do. This is a major principle of the conversations we will have going forward over a number of days. I also believe in proportionality. It is basic product safety; you have a look, you have standards, and if there is nothing to do, let us not make people do silly things. I think we will return to these issues, because they are clearly deeply felt, and they are very practical, and my own feeling is that we cannot risk thousands of children not benefiting from all the work that Ofcom is going to do. With that, I beg leave to withdraw.

Amendment 2 withdrawn.
Amendment 3 not moved.
Amendment 4
Moved by
4: Clause 3, page 3, line 17, leave out paragraphs (a) and (b) and insert “the service has at least one million monthly United Kingdom users.”
Member’s explanatory statement
This amendment replaces the two tests currently set out in subsection (5) of clause 3, relating to a service’s links with the United Kingdom, with a requirement that the service have at least a million monthly United Kingdom users.
Baroness Fox of Buckley (Non-Afl)

My Lords, in moving Amendment 4, I will also speak to Amendments 6 to 8 and 12 and consequential Amendments 288 and 305, largely grouped under the heading “exemptions”. In this group I am also particularly sympathetic to Amendment 9 in the names of the noble Lords, Lord Moylan and Lord Vaizey, and I will leave them to motivate that. I look forward to hearing from the noble Lord, Lord Knight, an explanation for his Amendment 9A.

Last Wednesday we discussed the purposes of the Bill, and there was much agreement across the Chamber on one issue at least: that we need to stay focused and make sure that an already highly complex piece of legislation does not become even more unwieldy. My concern in general is that the Bill already suffers throughout from being overly broad in its aims, resulting in restricting the online experience and expressions of everyone. This series of amendments is about trying to rein in the scope, allowing us to focus on clear targets rather than a one-size-fits-all Bill that sweeps all in its wake with perhaps unintended and damaging consequences.

The Bill creates an extraordinary set of regulatory burdens on tens of thousands of British businesses, micro-communities and tech platforms, no matter the size. The impact assessment claims that 25,000 businesses are in scope, and that is considered a conservative estimate. This implies that an extraordinary range of platforms, from Mumsnet and Wikipedia to whisky-tasting forums and Reddit, will be caught up in this Bill. Can we find a way of removing the smaller platforms from scope? It will destroy too many of them if they have to comply with the regulatory burden created with huge Silicon Valley behemoths in mind.

Let us consider some of the regulatory duties that these entities are expected to comply with. They will need to undertake extensive assessments that must be repeated whenever a product changes. They will need to proactively remove certain types of content, involving assessing the risk of users encountering each type of illegal content, the speed of dissemination and functionality, the design of the platform and the nature and severity of the risk of harms presented to individual users. This will mean assessing their user base and implementing what are effectively surveillance systems to monitor all activity on their platforms.

Let us consider what a phrase such as “prevent from encountering” would mean to a web host such as Wikipedia. It would mean that it would need to scan and proactively analyse millions of edits across 250 languages for illegality under UK-specific law and then block content in defiance of the wishes of its own user community. There is much more, of course. Rest assured, Ofcom’s guidance and risk assessment will, over time, increase the regulatory complexity and the burdens involved.

Those technological challenges do not even consider the mountain of paperwork and administrative obligations that will be hugely costly and time consuming. All that might be achievable, if onerous, for larger platforms. But for smaller ones it could prove a significant problem, with SMEs and organisations working with a public benefit remit particularly vulnerable. Platforms with the largest profits and the most staff dedicated to compliance will, as a consequence, dominate at the expense of start-ups, small companies and community-run platforms.

No doubt the Government and the Minister will assure us that the duties are not so onerous and that they are manageable and proportionate. The impact assessment estimates that implementing the Bill will cost businesses £2.5 billion over the first 10 years, but all the commentators I have read think this is likely to be a substantial underestimate, especially when we are told in the same impact assessment that the legal advice is estimated to cost £39.23 per hour. I do not know what lawyers the Government hang out with, but they appear not to have a clue about the going rate for specialist law firms.

Also, what about the internal staff time? Again, the impact assessment assumes that staff will require only 30 minutes to familiarise themselves with the requirements of the legislation and 90 minutes to read, assess and change the terms and conditions in response to the requirements. Is this remotely serious? Even working through the groups of amendments has taken me hours. It has been like doing one of those 1,000-piece jigsaws, but at least at the end of those you get to see the complete picture. Instead, I felt as though somebody had come in and thrown all the pieces into the air again. I was as confused as ever.

If dealing with groups of amendments to this Bill is complex, that is nothing on the Bill itself, which is dense and often impenetrable. Last week, the Minister helpfully kept telling us to read the Explanatory Notes. I have done that several times and I am still in a muddle, yet somehow the staff of small tech companies will conquer all this and the associated regulatory changes in an hour and a half.

Many fear that this will replicate the worst horrors of GDPR, which, according to some estimates, led to an 8% reduction in the profits of smaller firms while it had little or no effect on the profits of large tech companies. That does not even take into account the cost of the near nervous breakdowns that GDPR caused small organisations, as I know from my colleagues at the Academy of Ideas.

These amendments try to tackle this disproportionate burden on smaller platforms—those companies that are, ironically, often useful challenges and antidotes to big tech’s dominance. The amendments would exempt them unless there is a good reason for specific platforms to be in scope. Of course, cutting out those in scope may not appeal to everyone here. From looking at the ever-increasing amendments list, it seems that some noble Lords have an appetite for expanding the number of services the legislation will apply to; we have already heard the discussion about app stores and online gaming. But we should note that the Government have carved out other exemptions for certain services that are excluded from the new regulatory system. They have not included emails, SMS messages, one-to-one oral communications and so on. I am suggesting some extra exemptions and that we remove services with fewer than 1 million monthly UK users. Ofcom would have the power to issue the provider with a notice bringing them into scope, but only based on reasonable grounds, having identified a safety risk and with 30 days’ notice.

If we do not tackle this, I fear that there is a substantial, serious and meaningful risk that smaller platforms based outside and inside the UK will become inaccessible to British users. It is notable that over 1,000 US news websites blocked European users during the EU’s introduction of GDPR, if noble Lords remember. Will there be a similar response to this law? What, for example, will the US search engine DuckDuckGo conclude? The search engine emphasises privacy and refuses to gather information on its users, meaning that it will be unable to fulfil the duties contained in the Bill of identifying or tailoring search results to users based on their age. Are we happy for it to go?

I fear that this Bill will reduce the number of tech platforms operating in the UK. This is anti-competitive. I do not say that because I have a particular commitment to competition and the free market, by the way. I do so because competition is essential and important for users’ choice and empowerment, and for free speech—something I fear the Bill is threatening. Indeed, the Lords’ Communications and Digital Committee’s extensive inquiry into the implications of giving large tech companies what is effectively a monopoly on defining which speech is free concluded:

“Increasing competition is crucial to promoting freedom of expression online. In a more competitive market, platforms would have to be more responsive to users’ concerns about freedom of expression and other rights”.


That is right. If users are concerned that a platform is failing to uphold their freedom of expression, they can join a different platform with greater ease if there is a wide choice. Conversely, users who are concerned that they do not want to view certain types of material would be more easily able to choose another platform that proscribes said material in its terms and conditions.

I beg to move the amendment as a way of defending diversity, choice and innovation—and as a feeble attempt to make the Bill proportionate.

17:45
Lord Moylan (Con)

My Lords, before I speak to my Amendment 9, which I will be able to do fairly briefly because a great deal of the material on which my case rests has already been given to the Committee by the noble Baroness, Lady Fox of Buckley, I will make the more general and reflective point that there are two different views in the Committee that somehow need to be reconciled over the next few weeks. There is a group of noble Lords who are understandably and passionately concerned about child safety. In fact, we all share that concern. There are others of us who believe that this Bill, its approach and the measures being inserted into it will have massive ramifications outside the field of child safety, for adults, of course, but also for businesses, as the noble Baroness explained. The noble Baroness and I, and others like us, believe that these are not sufficiently taken into account either by the Bill or by those pressing for measures to be harsher and more restrictive.

Some sort of balance needs to be found. At Second Reading my noble friend the Minister said that the balance had been struck in the right place. It is quite clear that nobody really agrees with that, except on the principle, which I think is always a cop-out, that if everyone disagrees with you, you must be right, which I have never logically understood in any sense at all. I hope my noble friend will not resort to claiming that he has got it right simply because everyone disagrees with him in different ways.

My amendment is motivated by the considerations set out by the noble Baroness, which I therefore do not need to repeat. It is the Government’s own assessment that between 20,000 and 25,000 businesses will be affected by the measures in this Bill. A great number of those—some four-fifths—are small businesses or micro-businesses. The Government appear to think in their assessment that only 120 of those are high risk. The reason they think they are high risk is not that they are engaged in unpleasant activities but simply that they are engaged in livestreaming and contacting new people. That might be for nefarious purposes but equally, it might not, so the 120 we need to worry about could actually be a very small number. We handle this already through our own laws; all these businesses would still be subject to existing data protection laws and complying with the law generally on what they are allowed to publish and broadcast. It would not be a free-for-all or a wild west, even among that very small number of businesses.

My Amendment 9 takes a slightly different approach to dealing with this. I do not in any way disagree with or denigrate the approach taken by the noble Baroness, Lady Fox, but my approach would be to add two categories to the list of exemptions in the schedules. The first of these is services provided by small and medium-sized enterprises. We do not have to define those because there is already a law that helps define them for us: Section 33 of the Small Business, Enterprise and Employment Act 2015. My proposal is that we take that definition, and that those businesses that comply with it be outside the scope of the Bill.

The second area that I would propose exempting was also referred to by the noble Baroness, Lady Fox of Buckley: community-based services. The largest of these, and the one that frequently annoys us because it gets things wrong, is Wikipedia. I am a great user of Wikipedia but I acknowledge that it does make errors. Of course, most of the errors it makes, such as saying, “Lord Moylan has a wart on the end of his nose”, would not be covered by the Bill anyway. Nothing in the Bill will force people to correct factual statements that have been got wrong—my year of birth or country of birth, or whatever. That is not covered. Those are the things they usually get wrong and that normally annoy us when we see them.

However, I do think that these services are extremely valuable. Wikipedia is an immense achievement and a tremendous source of knowledge and information for people. The fact that it has been put together in this organic, community-led way over a number of years, in so many languages, is a tremendous advantage and a great human advance. Yet, under the proposed changes, Wikipedia would not be able to operate its existing model of people posting their comments.

Currently, you go on Wikipedia and you can edit it. Now, I know this would not apply to any noble Lords but, in the other place, it has been suggested that MPs have discovered how to do this. They illicitly and secretly go on and edit their own pages, usually in a flattering way, so it is possible to do this. There is no prior restraint, and no checking in advance. There are moderators at Wikipedia—I do not know whether they are employed—who review what has been done over a period, but they do not do what this Bill requires, which is checking in advance.

It is not simply about Wikipedia; there are other community sites. Is it sensible that Facebook should be responsible if a little old lady alters the information on a community Facebook page about what is happening in the local parish? Why should Facebook be held responsible for that? Why would we want it to be responsible for it—and how could it do it without effectively censoring ordinary activities that people want to carry out, using the advantages of the internet that have been so very great?

What I am asking is not dramatic. We have many laws in which we very sensibly create exemptions for small and medium-sized enterprises. I am simply asking that this law be considered under that heading as well, and similarly for Wikipedia and community-based sites. It is slightly unusual that we have had to consider that; it is not normal, but it is very relevant to this Bill and I very much hope the Government will agree to it.

The answer that I would not find satisfactory—I say this in advance for the benefit of my noble friend the Minister, in relation to this and a number of other amendments I shall be moving in Committee—is that it will all be dealt with by Ofcom. That would not be good enough. We are the legislators and we want to know how these issues will be dealt with, so that the legitimate objectives of the Bill can be achieved without causing massive disruption, cost and disbenefit to adults.

Lord Allan of Hallam (LD)

My Lords, I rise to speak in support of Amendment 9, tabled by the noble Lord, Lord Moylan, and in particular the proposed new paragraph 10A to Schedule 1. I hope I will find myself more in tune with the mood of the Committee on this amendment than on previous ones. I would be interested to know whether any noble Lords believe that Ofcom should be spending its limited resources supervising a site like Wikipedia under the new regime, as it seems to me patently obvious that that is not what we intend; it is not the purpose of the legislation.

The noble Lord, Lord Moylan, is right to remind us that one of the joys of the internet is that you buy an internet connection, plug it in and there is a vast array of free-to-use services which are a community benefit, produced by the community for the community, with no harm within them. What we do not want to do is interfere with or somehow disrupt that ecosystem. The noble Baroness, Lady Fox, is right to remind us that there is a genuine risk of people withdrawing from the UK market. We should not sidestep that. People who try to be law-abiding will look at these requirements and ask themselves, “Can I meet them?” If the Wikimedia Foundation that runs Wikipedia does not think it can offer its service in a lawful way, it will have to withdraw from the UK market. That would be to the detriment of children in the United Kingdom, and certainly not to their benefit.

There are principle-based and practical reasons why we do not want Ofcom to be operating in this space. The principle-based one is that it makes me uncomfortable that a Government would effectively tell their regulator how to manage neutral information sites such as Wikipedia. There are Governments around the world who seek to do that; we do not want to be one of those.

The amendment attempts to define this public interest, neutral, informational service. It happens to be user-to-user but it is not like Facebook, Instagram or anything similar. I would feel much more comfortable making it clear in law that we are not asking Ofcom to interfere with those kinds of services. The practical reason is the limited time Ofcom will have available. We do not want it to be spending time on things that are not important.

Definitions are another example of how, with the internet, it can often be extremely hard to draw bright lines. Functionalities bleed into each other. That is not necessarily a problem, until you try to write something into law; then, you find that your definition unintentionally captures a service that you did not intend to capture, or unintentionally misses out a service that you did intend to be in scope. I am sure the Minister will reject the amendment because that is what Ministers do; but I hope that, if he is not willing to accept it, he will at least look at whether there is scope within the Bill to make it clear that Wikipedia is intended to be outside it.

Paragraph 4 of Schedule 1 refers to “limited functionality services”. That is a rich vein to mine. It is clear that the intention is to exclude mainstream media, for example. It refers to “provider content”. In this context, Encyclopaedia Britannica is not in scope but Wikipedia is, the difference being that Wikipedia is constructed by users, while Encyclopaedia Britannica is regarded as being constructed by a provider. The Daily Mail is outside scope; indeed, all mainstream media are outside scope. Anyone who declares themselves to be media—we will debate this later on—is likely to be outside scope.

Such provider exemption should be offered to other, similar services, even if they happen to be constructed from the good will of users as opposed to a single professional author. I hope the Minister will be able to indicate that the political intent is not that we should ask Ofcom to spend time and energy regulating Wikipedia-like services. If so, can he point to where in the legislation we might get that helpful interpretation, in order to ensure that Ofcom is focused on what we want it to be focused on and not on much lower priority issues?

Baroness Kidron (CB)

I will speak to a couple of the amendments in this group. First, small is not safe, and you cannot necessarily see these platforms in isolation. For example, there is an incel group that has only 4,000 active users, but it posts a great deal on YouTube and has 24.2 million users in that context. So we have to be clear that small and safe are not the same thing.

However, I am sympathetic to the risk-based approach. I should probably have declared an interest as someone who has given money to Wikipedia on several occasions to keep it going. I ask the Minister for some clarity on the systems and processes of the Bill, and whether the risk profile of Wikipedia—which does not entice you in and then follow you for the next six months once you have looked at something—is far lower than something very small that gets hold of you and keeps on going. I say that particularly in relation to children, but I feel it for myself also.

18:00
Finally, to the noble Lords who are promoting this group of amendments I say that I would be very supportive if they could find some interventions that simplify the processes companies have to follow in the early stages to establish levels of risk, and then we can get heavy on the mitigation of harm. That is something upon which we all agree; if we could find a very low bar of entry, check whether there is harm and then escalate, I believe that would be something we could all work on together.
Lord McCrea of Magherafelt and Cookstown (DUP)

My Lords, I will speak to Amendment 4 in the name of the noble Baroness, Lady Fox of Buckley.

At Second Reading, my noble friend Lord Morrow raised the point that the Bill needs to cover all online pornography. A factsheet on the Bill, helpfully circulated to Peers last week by the Government, says:

“The Bill’s regulatory framework will cover all online sites with pornographic content, including commercial pornography sites, social media, video-sharing platforms and fora. It will also cover search engines, which play a significant role in enabling children to access pornography”.


This is a welcome commitment but I would like to explore it further.

The Government say “all”, but the definition of which services are in scope of the Bill, as set out in Clause 3(5) and Clause 71(4), requires that there are either

“a significant number of United Kingdom users, or … United Kingdom users form one of the target markets for the service (or the only target market)”.

At Second Reading, my noble friend Lord Morrow asked the Minister what will be considered as “significant”. Is it significant in terms of the total UK adult users who could use a service, or significant in terms of potential global users?

The noble Baroness, Lady Fox of Buckley, is exploring the same issue in her Amendment 4. She is proposing that the Bill’s current definition be replaced with something much easier to understand: that a site must have at least 1 million users per month in the UK to be within the scope of the Bill. That definition is certainly clear. However, I am looking forward to hearing whether it reflects the Government’s intention. For my part, I am concerned about what it might mean for clarifying which pornographic websites would fall into Part 3.

In December, the Government published an analysis carried out in January 2021 by the British Board of Film Classification on the top 200 pornographic websites. It reported that these 200 sites received 76% of the total UK visits to adult sites, based on data during August 2020. Ofcom published a similar list of the top 10 sites visited in September 2020—the site at number 10 had 3.8 million visitors. We do not know how many visitors there were to websites 100 or 200, but it is not unreasonable to speculate that it could be less than a million and would therefore fall outside the definition proposed by the noble Baroness, and nor is it clear whether those websites would fall within the Government’s original definition.

It is important for the Minister to tell the Committee quite clearly whether he expects the top 200 pornographic websites to be within the scope of Parts 3 and 5 of the Bill. If he does, I ask him to explain how that will be possible within the current definition in the Bill, not because I am trying to trip him up but as a genuine inquiry as to whether the Bill does what we expect it to do. If he does not expect the top 200 pornographic websites to be in scope, how many does he estimate would fall within Parts 3 and 5? Either way, it seems to me that there could be pornographic websites accessed in the United Kingdom that are not required to have age verification to protect those aged under 18 from accessing this content.

As I said, I doubt that this is what parents expect from this flagship Bill, especially as the Government set out in their factsheet that their own commissioned evidence says,

“exposure to pornography may impact children's perceptions of sex and relationships, may lead to replication of practices found in pornography, increased likelihood of engaging in sexual activities and harmful or aggressive behaviour, and reduced concern for consent from partners”.

It seems to me that “significant” should focus on the significant harm a website or content provider would cause if it were accessed in the UK. The number of visitors or popularity of the site should be irrelevant when considering whether or not children should be allowed to access it. My view is quite simple: if a website, social media or content provider wishes to host pornographic material, that is of potential significant harm to children and should be age-verified. I am therefore interested, given what the Government have said previously, to know whether the Minister agrees that all pornographic content must be age-verified if it is to be accessed in the UK. That is certainly what I believe most parents expect, and I will listen carefully to the Minister’s response.

Lord Strathcarron (Con)

I will speak in support of my noble friend Lord Moylan and Amendment 9. I declare an interest as an author and publisher.

Last week, we had the London Book Fair, and proposed new paragraph 10A could read almost like an executive summary of the main talking point, which was how AI will influence all aspects of the media but particularly publishing. For the sake of future-proofing, paragraph 10A would be a particularly useful step to adopt. Proposed new paragraph 10B would be in the interest of fairness because publishing, and a lot of media, is made up of micro-businesses, often one-man or one-woman companies. This is certain to happen with AI as well, as the intermediary roles are taken up by such micro-businesses. In the interest of future-proofing and fairness, I recommend this amendment.

Lord Vaizey of Didcot (Con)

My Lords, as my name is on Amendment 9, I speak to support these amendments and say that they are worthy of debate. As your Lordships know, I am extremely supportive of the Bill and hope that it will be passed in short order. It is much needed and overdue that we have the opportunity for legislation to provide us with a regulator that is able to hold platforms to account, protect users where it can and enhance child safety online. I can think of no better regulator for that role than Ofcom.

I have listened to the debate with great interest. Although I support the intentions of my noble friend Lord Moylan’s amendment, I am not sure I agree with him that there are two cultures in this House, as far as the Bill is concerned; I think everybody is concerned about child safety. However, these amendments are right to draw attention to the huge regulatory burden that this legislation can potentially bring, and to the inadvertent bad consequences it will bring for many of the sites that we all depend upon and use.

I have not signed many amendments that have been tabled in this Committee because I have grown increasingly concerned, as has been said by many others, that the Bill has become a bit like the proverbial Christmas tree where everyone hangs their own specific concern on to the legislation, turning it into something increasingly unwieldy and difficult to navigate. I thought the noble Baroness, Lady Fox, put it extremely well when she effectively brought to life what it would be like to run a small website and have to comply with this legislation. That is not to say that certain elements of micro-tweaking are not welcome—for example, the amendment by the noble Baroness, Lady Kidron, on giving coroners access to data—but we should be concerned about the scope of the Bill and the burden that it may well put on individual websites.

This is in effect the Wikipedia amendment, put forward and written in a sort of wiki way by this House—a probing amendment in Committee to explore how we can find the right balance between giving Ofcom the powers it needs to hold platforms to account and not unduly burdening websites that all of us agree present a very low risk and whose provenance, if you like, does not fit easily within the scope of the Bill.

I keep saying that I disagree with my noble friend Lord Moylan. I do not—I think he is one of the finest Members of this House—but, while it is our job to provide legislation to set the framework for how Ofcom regulates, we in this House should also recognise that in the real world, as I have also said before, this legislation is simply going to be the end of the beginning. Ofcom will have to find its way forward in how it exercises the powers that Parliament gives it, and I suspect it will have its own list of priorities in how it approaches these issues, who it decides to hold to account and who it decides to enforce against. A lot of its powers will rest not simply on the legislation that we give it but on the relationship that it builds with the platforms it is seeking to regulate.

For example, I have hosted a number of lunches for Google in this House with interested Peers, and it has been interesting to get that company’s insight into its working relationship with Ofcom. By the way, I am by no means suggesting that that is a cosy relationship, but it is at least a relationship where the two sides are talking to each other, and that is how the effectiveness of these powers will be explored.

I urge noble Lords to take these amendments seriously and take what the spirit of the amendments is seeking to put forward, which is to be mindful of the regulatory burden that the Bill imposes; to be aware that the Bill will not, simply by being passed, solve the kinds of issues that we are seeking to tackle in terms of the most egregious content that we find on the internet; and that, effectively, Ofcom’s task once this legislation is passed will be the language of priorities.

Baroness Harding of Winscombe (Con)

My Lords, this is not the first time in this Committee, and I suspect it will not be the last, when I rise to stand somewhere between my noble friend Lord Vaizey and the noble Baroness, Lady Kidron. I am very taken by her focus on risk assessments and by the passionate defences of Wikipedia that we have heard, which really are grounded in a sort of commoner’s risk assessment that we can all understand.

Although I have sympathy with the concerns of the noble Baroness, Lady Fox, about small and medium-sized businesses being overburdened by regulation, I am less taken with the amendments on that subject precisely because small tech businesses become big tech businesses extremely quickly. It is worth pointing out that TikTok did not even exist when Parliament began debating this Bill. I wonder what our social media landscape would have been like if the Bill had existed in law before social media started. We as a country should want global tech companies to be born in the UK, but we want their founders—who, sadly, even today, are predominantly young white men who do not yet have children—to think carefully about the risks inherent in the services they are creating, and we know we need to do that at the beginning of those tech companies’ journeys, not once they have reached 1 million users a month.

While I have sympathy with the desire of the noble Baroness, Lady Fox, not to overburden, just as my noble friend Lord Vaizey has said, we should take our lead from the intervention of the noble Baroness, Lady Kidron: we need a risk assessment even for small and medium-sized businesses. It just needs to be a risk assessment that is fit for their size.

18:15
Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow the noble Baroness, Lady Harding. If one is permitted to say this in the digital age, I am on exactly the same page as she is.

There are two elements to the debate on this group. It is partly about compliance, and I absolutely understand the point about the costs of that, but I also take comfort from some of the things that the noble Lord, Lord Vaizey, said about the way that Ofcom is going to deliver the regulation and the very fact that this is going to be largely not a question of interpretation of the Act, when it comes down to it, but is going to be about working with the codes of practice. That will be a lot more user-friendly than simply having to go to expensive expert lawyers, as the noble Baroness, Lady Fox, said—not that I have anything against expensive expert lawyers.

I am absolutely in agreement with the noble Baroness, Lady Kidron, that small is not safe. As the noble Baroness, Lady Harding, described, small can become big. We looked at this in our Joint Committee and recommended to the Government that they should take a more nuanced approach to regulation, based not just on size and high-level functionality but on factors such as risk, reach, user base, safety performance and business model. All those are extremely relevant but risk is the key, right at the beginning. The noble Baroness, Lady Fox, also said that Reddit should potentially be outside, but Reddit has had its own problems, as we know. On that front, I am on absolutely the same page as those who have spoken about keeping us where we are.

The noble Lord, Lord Moylan, has been very cunning in the way that he has drawn up his Amendment 9. I am delighted to be on the same page as my noble friend—we are making progress—but I agree only with the first half of the amendment because, like the noble Baroness, Lady Kidron, I am a financial contributor to Wikipedia. A lot of us depend on Wikipedia; we look up the ages of various Members of this House when we see them in full flight and think, “Good heavens!” Biographies are an important part of this area. We have all had Jimmy Wales saying, as soon as we get on to Wikipedia, “You’ve already looked at Wikipedia 50 times this month. Make a contribution”, and that is irresistible. There is quite a strong case there. It is risk-based so it is not inconsistent with the line taken by a number of noble Lords in all this. I very much hope that we can get something out of the Minister—maybe some sort of sympathetic noises for a change—at this stage so that we can work up something.

I must admit that the briefing from Wikimedia, which many of us have had, was quite alarming. If the Bill means that we do not have users in high-risk places then we will find that adults get their information from other sources that are not as accurate as Wikipedia—maybe from ChatGPT or GPT-4, which the noble Lord, Lord Knight, is clearly very much an expert in—and that marginalised websites are shut down.

Lord Allan of Hallam (LD)

For me, one of the features of the schedule’s list of exempted sites is foreign state entities. Therefore, we could end up in the absurd situation where you could not read about the Ukraine war on Wikipedia, but you would be able to read about the Ukraine war on the Russian Government website.

Lord Clement-Jones (LD)

My Lords, if we needed an example of something that gave us cause for concern, that would be it; but a very good case has been made, certainly for the first half of the amendment in the name of the noble Lord, Lord Moylan, and we on these Benches support it.

Baroness Merron (Lab)

My Lords, it has certainly been an interesting debate, and I am grateful to noble Lords on all sides of the Committee for their contributions and considerations. I particularly thank the noble Lords who tabled the amendments which have shaped the debate today.

In general, on these Benches, we believe that the Bill offers a proportionate approach to tackling online harms. We feel that granting some of the exemptions proposed in this group would be unintentionally counterproductive and would raise some unforeseen difficulties. The key here—and it has been raised by a number of noble Lords, including the noble Baronesses, Lady Harding and Lady Kidron, and, just now, the noble Lord, Lord Clement-Jones, who talked about the wider considerations of the Joint Committee and factors that should be taken into account—is that we endorse a risk-based approach. In this debate, it is very important that we take ourselves back to that, because that is the key.

My view is that using other factors, such as funding sources or volunteer engagement in moderation, cuts right across this risk-based approach. To refer to Amendment 4, it is absolutely the case that platforms with fewer than 1 million UK monthly users have scope to create considerable harm. Indeed, noble Lords will have seen that later amendments call for certain small platforms to be categorised on the basis of the risk—and that is the important word—that they engender, rather than the size of the platform, which, unfortunately, is something of a crude measure. The point that I want to make to the noble Baroness, Lady Fox, is that it is not about the size of the businesses and how they are categorised but what they actually do. The noble Baroness, Lady Kidron, rightly said that small is not safe, for all the reasons that were explained, including by the noble Baroness, Lady Harding.

Amendment 9 would exempt small and medium-sized enterprises and certain other organisations from most of the Bill’s provisions. I am in no doubt about the well-meaning nature of this amendment, tabled by the noble Lord, Lord Moylan, and supported by the noble Lord, Lord Vaizey. Indeed, there may well be an issue about how start-ups and entrepreneur unicorns cope with the regulatory framework. We should attend to that, and I am sure that the Minister will have something to say about it. But I also expect that the Minister will outline why this would actually be unhelpful in combating many of the issues that this Bill is fundamentally designed to deal with if we were to go down the road of these exclusions.

In particular, granting exemptions simply on the basis of a service’s size could lead to a situation where user numbers are capped or perhaps even where platforms are deliberately broken up to avoid regulation. This would have an effect that none of us in this Chamber would want to see because it would embed harmful content and behaviour rather than helping to reduce them.

Referring back to the comments of the noble Lord, Lord Moylan, I agree with the noble Lord, Lord Vaizey, in his reflection. I, too, have not experienced the two sides of the Chamber that the noble Lord, Lord Moylan, described. I feel that the Chamber has always been united on the matter of child safety and in understanding the ramifications for business. It is the case that good legislation must always seek a balance, but, to go back to the point about excluding small and medium-sized enterprises, to call them a major part of the British economy is a bit of an understatement when they account for 99.9% of the business population. In respect of the exclusion of community-based services, including Wikipedia—and we will return to this in the next group—there is nothing for platforms to fear if they have appropriate systems in place. Indeed, there are many gains to be had for community-based services such as Wikipedia from being inside the system. I look forward to the further debate that we will have on that.

I turn to Amendment 9A in the name of my noble friend Lord Knight of Weymouth, who is unable to participate in this section of the debate. It probes how the Bill’s measures would apply to specialised search services. Metasearch engines such as Skyscanner have expressed concern that the legislation might impose unnecessary burdens on services that pose little risk of hosting the illegal content targeted by the Bill. Perhaps the Minister, in his response, could confirm whether or not such search engines are in scope. That would perhaps be helpful to our deliberations today.

While we on these Benches are not generally supportive of exemptions, the reality is that there are a number of online search services that return content that would not ordinarily be considered harmful. Sites such as Skyscanner and Expedia, as we all know, allow people to search for and book flights and other travel services such as car hire. Obviously, as long as appropriate due diligence is carried out on partners and travel agents, the scope for users to encounter illegal or harmful material appears to be minimal and returns us to the point of having a risk-based approach. We are not necessarily advocating for a carve-out from the Bill, but it would perhaps be helpful to our deliberations if the Minister could outline how such platforms will be expected to interact with the Ofcom-run online safety regime.

Lord Parkinson of Whitley Bay (Con)

My Lords, I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, but I cannot accept the amendments tabled by the noble Baroness, Lady Fox, and others. Doing so would greatly reduce the strong protections that the Bill offers to internet users, particularly to children. I agree with the noble Baroness, Lady Merron, that that has long been the shared focus across your Lordships’ House as we seek to strike the right balance through the Bill. I hope to reassure noble Lords about the justification for the existing balance and scope, and the safeguards built in to prevent undue burdens to business.

I will start with the amendments tabled by the noble Baroness, Lady Fox of Buckley—Amendments 4, 6 to 8, 12, 288 and 305—which would significantly narrow the definition of services in scope of regulation. The current scope of the Bill reflects evidence of where harm is manifested online. There is clear evidence that smaller services can pose a significant risk of harm from illegal content, as well as to children, as the noble Baroness, Lady Kidron, rightly echoed. Moreover, harmful content and activity often range across a number of services. While illegal content or activity may originate on larger platforms, offenders often seek to move to smaller platforms with less effective systems for tackling criminal activity in order to circumvent those protections. Exempting smaller services from regulation would likely accelerate that process, resulting in illegal content being displaced on to smaller services, putting users at risk.

These amendments would create significant new loopholes in regulation. Rather than relying on platforms and search services to identify and manage risk proactively, they would require Ofcom to monitor smaller harmful services, which would further annoy my noble friend Lord Moylan. Let me reassure the noble Baroness, however, that the Bill has been designed to avoid disproportionate or unnecessary burdens on smaller services. All duties on services are proportionate to the risk of harm and the capacity of companies. This means that small, low-risk services will have minimal duties imposed on them. Ofcom’s guidance and codes of practice will set out how they can comply with their duties, in a way that I hope is even clearer than the Explanatory Notes to the Bill, but certainly allowing for companies to have a conversation and ask for areas of clarification, if that is still needed. They will ensure that low-risk services do not have to undertake unnecessary measures if they do not pose a risk of harm to their users.

18:30
In addition, the Bill includes explicit exemptions for many small and medium-sized enterprises, through the low-risk functionality exemptions in Schedule 1. This includes an exemption for any service that offers users the ability only to post comments or reviews on digital content published by it, which will exempt many online retailers, news sites and web logs. The Bill also provides the Secretary of State with a power to exempt further types of user-to-user or search services from the Bill if the risk of harm presented by a particular service is low, ensuring that other low-risk services are not subject to unnecessary regulation. There was quite a lot of talk about Wikipedia—
Lord Moylan (Con)

My Lords, while my noble friend is talking about the possibility of excessive and disproportionate burden on businesses, can I just ask him about the possibility of excessive and disproportionate burden on the regulator? He seems to be saying that Ofcom is going to have to maintain, and keep up to date regularly, 25,000 risk assessments—this is on the Government’s own assessment, produced 15 months ago, of the state of the market then—even if those assessments carried out by Ofcom result in very little consequence for the regulated entity.

We know from regulation in this country that regulators already cannot cope with the burdens placed on them. They become inefficient, sclerotic and unresponsive; they have difficulty in recruiting staff of the same level and skills as the entities that they regulate. We have a Financial Services and Markets Bill going through at the moment, and the FCA is a very good example of that. Do we really think that this is a sensible burden to place on a regulator that is actually able to discharge it?

Lord Parkinson of Whitley Bay (Con)

The Bill creates a substantial new role for Ofcom, but it has already substantially recruited and prepared for the effective carrying out of that new duty. I do not know whether my noble friend was in some of the briefings with officials from Ofcom, but it is very happy to set out directly the ways in which it is already discharging, or preparing to discharge, those duties. The Government have provided it with further resource to enable it to do so. It may be helpful for my noble friend to have some of those discussions directly with the regulator, but we are confident that it is ready to discharge its duties, as set out in the Bill.

I was about to say that we have already had a bit of discussion on Wikipedia. I am conscious that we are going to touch on it again in the debate on the next group of amendments so, at the risk of being marked down for repetition, which is a black mark on that platform, I shall not pre-empt what I will say shortly. But I emphasise that the Bill does not impose prescriptive, one-size-fits-all duties on services. The codes of practice from Ofcom will set out a range of measures that are appropriate for different types of services in scope. Companies can follow their own routes to compliance, so long as they are confident that they are effectively managing risks associated with legal content and, where relevant, harm to children. That will ensure that services that already use community moderation effectively can continue to do so—such as Wikipedia, which successfully uses that to moderate content. As I say, we will touch on that more in the debate on the next group.

Amendment 9, in the name of my noble friend Lord Moylan, is designed to exempt small and medium-sized enterprises working to benefit the public from the scope of the Bill. Again, I am sympathetic to the objective of ensuring that the Bill does not impose undue burdens on small businesses, and particularly that it should not inhibit services from providing valuable content of public benefit, but I do not think it would be feasible to exempt service providers deemed to be

“working to benefit the public”.

I appreciate that this is a probing amendment, but the wording that my noble friend has alighted on highlights the difficulties of finding something suitably precise and not contestable. It would be challenging to identify which services should qualify for such an exemption.

Taking small services out of scope would significantly undermine the framework established by the Bill, as we know that many smaller services host illegal content and pose a threat to children. Again, let me reassure noble Lords that the Bill has been designed to avoid disproportionate or unnecessary regulatory burdens on small and low-risk services. It will not impose a disproportionate burden on services or impede users’ access to valuable content on smaller services.

Amendment 9A in the name of the noble Lord, Lord Knight of Weymouth, is designed to exempt “sector specific search services” from the scope of the Bill, as the noble Baroness, Lady Merron, explained. Again, I am sympathetic to the intention here of ensuring that the Bill does not impose a disproportionate burden on services, but this is another amendment that is not needed as it would exempt search services that may pose a significant risk of harm to children, or because of illegal content on them. The amendment aims to exempt specialised search services—that is, those that allow users to

“search for … products or services … in a particular sector”.

It would exempt specialised search services that could cause harm to children or host illegal content—for example, pornographic search services or commercial search services that could facilitate online fraud. I know the noble Lord would not want to see that.

The regulatory duties apply only where there is a significant risk of harm and the scope has been designed to exclude low-risk search services. The duties therefore do not apply to search engines that search a single database or website, for example those of many retailers or other commercial websites. Even where a search service is in scope, the duties on services are proportionate to the risk of harm that they pose to users, as well as to a company’s size and capacity. Low-risk services, for example, will have minimal duties. Ofcom will ensure that these services can quickly and easily comply by publishing risk profiles for low-risk services, enabling them easily to understand their risk levels and, if necessary, take steps to mitigate them.

The noble Lord, Lord McCrea, asked some questions about the 200 most popular pornographic websites. If I may, I will respond to the questions he posed, along with others that I am sure will come in the debate on the fifth group, when we debate the amendments in the names of the noble Lord, Lord Morrow, and the noble Baroness, Lady Ritchie of Downpatrick, because that will take us on to the same territory.

I hope that provides some assurance to my noble friend Lord Moylan, the noble Baroness, Lady Fox, and others, and that they will be willing not to press their amendments in this group.

Baroness Fox of Buckley (Non-Afl)

My Lords, I thank people for such a wide-ranging and interesting set of contributions. I take comfort from the fact that so many people understood what the amendments were trying to do, even if they did not fully succeed in that. I thought it was quite interesting that in the first debate the noble Lord, Lord Allan of Hallam, said that he might be a bit isolated on the apps, but I actually agreed with him—which might not do his reputation any good. However, when he said that, I thought, “Welcome to my world”, so I am quite pleased that this has not all been shot down in flames before we started. My amendment really was a serious attempt to tackle something that is a real problem.

The Minister says that the Bill is designed to avoid disproportionate burdens on services. All I can say is, “Sack the designer”. It is absolutely going to have a disproportionate burden on a wide range of small services, which will not be able to cope, and that is why so many of them are worried about it. Some 80% of the companies that will be caught up in this red tape are small and micro-businesses. I will come to the small business point in a moment.

The noble Baroness, Lady Harding, warned us that small tech businesses become big tech businesses. As far as I am concerned, that is a success story—it is what I want; is it not what we all want? Personally, I think economic development and growth is a positive thing—I do not want them to fail. However, I do not think it will ever happen; I do not think that small tech businesses will ever grow into big tech businesses if they face a disproportionate burden in the regulatory sense, as I have tried to describe. That is what I am worried about, and it is not a positive thing to be celebrated.

I stress that it is not small tech and big tech. There are also community sites, based on collective moderation. Wikipedia has had a lot of discussion here. For a Bill that stresses that it wants to empower users, we should think about what it means when these user-moderated community sites are telling us that they will not be able to carry on and get through. That is what they are saying. It was interesting that the noble Lord, Lord Clement-Jones, said that he relies on Wikipedia—many of us do, although please do not believe what it says about me. There are all of these things, but then there was a feeling that, well, Reddit is a bit dodgy. The Bill is not meant to be deciding which ones to trust in quite that way, or people’s tastes.

I was struck that the noble Baroness, Lady Kidron, said that small is not safe, and used the incel example. I am not emphasising that small is safe; I am saying that the small entities will not survive this process. That is my fear. I do not mean that the big ones are nasty and dangerous and the small ones are cosy, lovely and Wikipedia-like. I am suggesting that smaller entities will not be able to survive the regulatory onslaught. That is the main reason I raised this.

The noble Baroness, Lady Merron, said that these entities can cause great harm. I am worried about a culture of fear, in which we demonise tens of thousands of innocent tech businesses and communities and end up destroying them when we do not intend to. I tried to put in the amendment an ability for Ofcom, if there are problematic sites that are risky, to deal with them. As the Minister kept saying, low-risk search engines have been exempted. I am suggesting that low-risk small and micro-businesses are exempted, which is the majority of them. That is what I am suggesting, rather than that we assume they are all guilty and then they have to get exempted.

Interestingly, the noble Lord, Lord McCrea, asked how many pornography sites are in scope and which pornographic websites have a million or fewer users. I am glad I do not know the answer to that, otherwise people might wonder why I did. The point is that there are always going to be sites that are threatening or a risk to children, as we are discussing. But we must always bear in mind—this was the important point that the noble Lord, Lord Moylan, made—that in our absolute determination to protect children via this Bill we do not unintentionally damage society as a whole. Adult access to free speech, for example, is one of my concerns, as are businesses and so on. We should not have that as an outcome.

18:45
I am sure that my amendments could be majorly improved. The approach of the noble Lord, Lord Moylan, might be better. I am happy to look at the metric and whether or not it is 1 million monthly users. However, I am insistent that the bipartisan approach to risk from the Minister and the Opposition will not help us achieve what we want from this Bill and will cause unnecessary problems. We have to avoid a recipe for risk aversion that will hold back the progressive and wonderful aspects of the online world, or at least the educational and in some instances business aspects.
I am obviously not going to push the amendments now, but I will come back to this. If it is not me, I hope somebody does, because the fact that some people said that half the points the noble Lord, Lord Moylan, made were correct was a step forward. I have no interest in noble Lords supporting my amendments, as long as we take seriously the content of my concerns and those expressed by the noble Lords, Lord Vaizey and Lord Moylan, particularly. I beg leave to withdraw my amendment.
Amendment 4 withdrawn.
Amendments 5 to 8 not moved.
Clause 3 agreed.
Schedule 1: Exempt user-to-user and search services
Amendments 9 and 9A not moved.
Schedule 1 agreed.
Schedule 2 agreed.
Clause 4: Disapplication of Act to certain parts of services
Amendment 10
Moved by
10: Clause 4, page 4, line 8, at end insert—
“(2A) This Act does not apply in relation to moderation actions taken, or not taken, by users of a Part 3 service.”
Member’s explanatory statement
The drafting of some Bill provisions, such as Clauses 17(4)(c) or 65(1), leaves room for debate as to whether community moderation gives rise to liability and obligations for the provider. This amendment, along with the other amendment to Clause 4 in the name of Lord Moylan, clarifies that moderation carried out by the public, for example on Wikipedia, is not fettered by this Bill.
Lord Moylan (Con)

My Lords, I have to start with a slightly unprofessional confession. I accepted the Bill team’s suggestion on how my amendments might be grouped after I had grouped them rather differently. The result is that I am not entirely clear why some of these groupings are quite as they are. As my noble friend the Minister said, my original idea of having Amendments 9, 10 and 11 together would perhaps have been better, as it would have allowed him to give a single response on Wikipedia. Amendments 10 and 11 in this group relate to Wikipedia and services like it.

I am, I hope, going to cause the Committee some relief as I do not intend to repeat remarks made in the previous group. The extent to which my noble friend wishes to amplify his comments in response to the previous group is entirely a matter for him, since he said he was reserving matter that he would like to bring forward but did not when commenting on the previous group. If I do not speak further on Amendments 10 and 11, it is not because I am not interested in what my noble friend the Minister might have to say on the topic of Wikipedia.

To keep this fairly brief, I turn to Amendment 26 on age verification. I think we have all agreed in the Chamber that we are united in wanting to see children kept safe. On page 10 of the Bill, in Clause 11(3), it states that there will be a duty to

“prevent children of any age from encountering”

this content—“prevent” them “encountering” is extremely strong. We do not prevent children encountering the possibility of buying cigarettes or encountering the possibility of being injured crossing the road, but we are to prevent children from these encounters. It is strongly urged in the clause—it is given as an example—that age verification will be required for that purpose.

Of course, age verification works only if it applies to everybody: one does not ask just the children to prove their age; one has to ask everybody online. Unlike when I go to the bar in a pub, my grey hair cannot be seen online. So this provision will almost certainly have to extend to the entire population. In Clause 11(3)(b), we have an obligation to protect. Clearly, the Government intend a difference between “prevent” and “protect”, or they would not have used two different verbs, so can my noble friend the Minister explain what is meant by the distinction between “prevent” and “protect”?

My amendment would remove Clause 11(3) completely. But it is, in essence, a probing amendment and what I want to hear from the Government, apart from how they interpret the difference between “prevent” and “protect”, is how they expect this duty to be carried out without having astonishingly annoying and deterring features built into every user-to-user platform and website, so that every time one goes on Wikipedia—in addition to dealing with the GDPR, accepting cookies and all the other nonsense we have to go through quite pointlessly—we then have to provide age verification of some sort.

What mechanism that might be, I do not know. I am sure that there are many mechanisms available for age verification. I do not wish to get into a technical discussion about what particular techniques might be used—I accept that there will be a range and that they will respond and adapt in the light of demand and technological advance—but I would like to know what my noble friend the Minister expects and how wide he thinks the obligation will be. Will it be on the entire population, as I suspect? Focusing on that amendment—and leaving the others to my noble friend the Minister to respond to as he sees fit—and raising those questions, I think that the Committee would like to know how the Government imagine that this provision will work. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I will speak to the amendments in the name of the noble Lord, Lord Moylan, on moderation, which I think are more important than he has given himself credit for—they go more broadly than just Wikipedia.

There is a lot of emphasis on platform moderation, but the reality is that most moderation of online content is done by users, either individually or in groups, acting as groups in the space where they operate. The typical example, which many Members of this House have experienced, is when you post something and somebody asks, “Did you mean to post that?”, and you say, “Oh gosh, no”, and then delete it. A Member in the other place has recently experienced a rather high-profile example of that through the medium of the newspaper. On a much smaller scale, it is absolutely typical that people take down content every day, either because they regret it or, quite often, because their friends, families or communities tell them that it was unwise. That is the most effective form of moderation, because it is the way that people learn to change their behaviour online, as opposed to the experience of a platform removing content, which is often experienced as the big bad hand of the platform. The person does not learn to change their behaviour, so, in some cases, it can reinforce bad behaviour.

Community moderation, not just on Wikipedia but across the internet, is an enormous public good, and the last thing that we want to do in this legislation is to discourage people from doing it. In online spaces, that is often a volunteer activity: people give up their time to try to keep a space safe and within the guidelines they have set for that space. The noble Lord, Lord Moylan, has touched on a really important area: in the Bill, we must be absolutely clear to those volunteers that we will not create all kinds of new legal operations and liabilities on them. These are responsible people, so, if they are advised that they will incur all kinds of legal risk when trying to comply with the Online Safety Bill, they will stop doing the moderation—and then we will all suffer.

On age-gating, we will move to a series of amendments where we will discuss age assurance, but I will say at the outset, as a teaser to those longer debates, that I have sympathy with the points made by the noble Lord, Lord Moylan. He mentioned pubs—we often talk about real-world analogies. In most of the public spaces we enter in the real world, nobody does any ID checking or age checking; we take it on trust, unless and until you carry out an action, such as buying alcohol, which requires an age check.

It is legitimate to raise this question, because where we fall in this debate will depend on how we see public spaces. I see a general-purpose social network as equivalent to walking into a pub or a town square, so I do not expect to have my age and ID checked at the point at which I enter that public space. I might accept that my ID is checked at a certain point where I carry out various actions. Others will disagree and will say that the space should be checked as soon as you go into it—that is the boundary of the debate we will have across a few groups. As a liberal, I am certainly on the side that says that it is incumbent on the person wanting to impose the extra checks to justify them. We should not just assume that extra checks are cost-free and beneficial; they have a cost for us all, and it should be imposed only where there is a reasonable justification.

Baroness Kidron (CB)

Far be it for me to suggest that all the amendments tabled by the noble Lord, Lord Moylan, are in the wrong place, but I think that Amendment 26 might have been better debated with the other amendments on age assurance.

On community moderation, I underscore the point that Ofcom must have a risk profile as part of its operations. When we get to that subject, let us understand what Ofcom intends to do with it—maybe we should instruct Ofcom a little about what we would like it to do with it for community moderation. I have a lot of sympathy—but do not think it is a get-out clause—with seeing some spaces as less risky, or, at least, for determining what risky looks like in online spaces, which is a different question. This issue belongs in the risk profile: it is not about taking things out; we have to build it into the Bill we have.

On age assurance and AV, I do not think that today is the day to discuss it in full. I disagree with the point that, because we are checking kids, we have to check ourselves—that is not where the technology is. Without descending into technical arguments, as the noble Lord, Lord Moylan, asked us not to, we will bring some of those issues forward.

The noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford have a package of amendments which are very widely supported across the Committee. They have put forward a schedule of age assurance that says what the rules of the road are. We must stop pretending that age assurance is something that is being invented now in this Bill. If you log into a website with your Facebook login, it shares your age—and that is used by 42% of people online. However, if you use an Apple login, it does not share your age, so I recommend using Apple—but, interestingly, it is harder to find that option on websites, because websites want to know your age.

So, first, we must not treat age assurance as if it has just been invented. Secondly, we need to start to have rules of the road, and ask what is acceptable, what is proportionate, and when we will have zero tolerance. Watching faces around the Committee, I say that I will accept zero tolerance for pornography and some other major subjects, but, for the most part, age assurance is something that we need to have regulated. Currently, it is being done to us rather than in any way that is transparent or agreed, and that is very problematic.

Baroness Finlay of Llandaff (CB)

My Lords, I hesitated to speak to the previous group of amendments, but I want to speak in support of the issue of risk that my noble friend Lady Kidron raised again in this group of amendments. I do not believe that noble Lords in the Committee want to cut down the amount of information and the ability to obtain information online. Rather, we came to the Bill wanting to avoid some of the really terrible harms promoted by some websites which hook into people’s vulnerability to becoming addicted to extremely harmful behaviours, which are harmful not only to themselves but to other people and, in particular, to children, who have no voice at all. I also have a concern about vulnerable people over the age of 18, and that may be something we will come to later in our discussions on the Bill.

19:00
It seems really important that we stick with the principle that if it is profoundly illegal in the offline world then we cannot allow it to be perpetrated in the online world. That compass needle has been behind some of the thinking of a lot of us in trying to grapple with this issue, which is very complex for those of us who are outside the world of tech and internet and coming new to it, but who have seen the results of some of those harms perpetrated. That is where the problem arises.
Lord Bethell (Con)

My Lords, I violently agree with my noble friend Lord Moylan that the grouping of this amendment is unfortunate. For that reason I am not going to plunge into the issue in huge detail, but there are a couple of things I would like to reassure my noble friend on, and I have a question for the Minister.

The noble Baroness, Lady Kidron, said there is a package of amendments around age verification and that we will have a lot of time to dive into this, and I think that is probably the right format for doing it. However, I reassure my noble friend Lord Moylan that he is absolutely right. The idea is not in any way to shut off the town square from everyone simply because there might be something scary there.

Clause 11(3) refers to priority content, which the noble Lord will know is to do with child abuse and fraudulent and severely violent content. This is not just any old stuff; this is hardcore porn and the rest. As in the real world, that content should be behind an age-verification barrier. At the moment we have a situation on the internet where, because it has not been well-managed for a generation, this content has found itself everywhere: on Twitter and Reddit, and all sorts of places where really it should not be because there are children there. We envisage a degree of tidying up of social media and the internet to make sure that the dangerous content is put behind age verification. What we are not seeking to do, and what would not be a benign or positive action, is to put the entire internet behind some kind of age-verification boundary. From that point of view, I completely agree with my noble friend.

Baroness Benjamin (LD)

My Lords, as might be expected, I will speak against Amendment 26 and will explain why.

The children’s charity Barnardo’s—here I declare an interest as vice-president—has said, as has been said several times before, that children are coming across pornographic content from as young as seven. Often they stumble across the content accidentally, unwittingly searching for terms such as “sex” or “porn”, without knowing what they mean. The impact that this is having on children is huge. It is harming their mental health and distorting their perception of healthy sexual relationships and consent. That will go with them into adulthood.

Age verification for pornography and age assurance to protect children from other harms are crucial to protect children from this content. In the offline world, children are rightly not allowed to buy pornographic DVDs in sex shops but online they can access this content at the click of a button. This is why I will be supporting the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, and am fully supportive of their age assurance and age verification schedule.

Baroness Fox of Buckley (Non-Afl)

My Lords, to go back not just to the age question, the noble Lord, Lord Allan of Hallam, reminded us that community-led moderation is not just Wikipedia. What I tried to hint at earlier is that that is one of the most interesting, democratic aspects of the online world, which we should protect.

We often boast that we are a self-regulating House and that that makes us somehow somewhat superior to the place up the road—we are all so mature because we self-regulate; people do behave badly but we decide. It is a lesson in democracy that you have a self-regulating House, and there are parts of the online world that self-regulate. Unless we think that the citizens of the UK are less civilised than Members of the House of Lords, which I would refute, we should say that it is positive that there are self-moderating, self-regulating online sites. If you can say something and people can object and have a discussion about it, and things can be taken down, to me that is the way we should deal with speech that is inappropriate or wrong. The bulk of these amendments—I cannot remember how many there are now—are right.

I was glad that the noble Lord, Lord Moylan, said he could not understand why this grouping had happened, which is what I said earlier. I had gone through a number of groupings thinking: “What is that doing there? Am I missing something? Why is that in that place?” I think we will come back to the age verification debate and discussion.

One thing to note is that one of the reasons organisations such as Wikipedia would be concerned about age verification—and they are—is anonymity. It is something we have to consider. What is going to happen to anonymity? It is so important for journalists, civil liberty activists and whistleblowers. Many Wikipedia editors are anonymised, maybe because they are politically editing sites on controversial issues. Imagine being a Wikipedia editor from Russia at the moment—you would not want to have to say who you are. We will come back to it but it is important to understand that Amendment 26, and those who are saying that we should look at the question of age verification, are not doing so because they do not care about children and are not interested in protecting them. However, the dilemmas of any age-gating or age verification for adult civil liberties have to be considered. We have to worry that, because of an emphasis on checking age, some websites will decide to sanitise what they allow to be published to make it suitable for children, just in case they come across it. Again, that will have a detrimental impact on adult access to all knowledge.

These will be controversial issues, and we will come back to them, but it is good to have started the discussion.

Lord Clement-Jones (LD)

My Lords, this has been a very strange debate. It has been the tail end of the last session and a trailer for a much bigger debate coming down the track. It was very odd.

We do not want to see everything behind an age-gating barrier, so I agree with my noble friend. However, as the noble Baroness, Lady Kidron, reminded us, it is all about the risk profile, and that then leads to the kind of risk assessment that a platform is going to be required to carry out. There is a logic to the way that the Bill is going to operate.

When you look at Clause 11(3), you see that it is not disproportionate. It deals with “primary priority content”. This is not specified in the Bill but it is self-harm and pornography—major content that needs age-gating. Of course we need to have the principles for age assurance inserted into the Bill as well, and of course it will be subject to debate as we go forward.

There is technology to carry out age verification which is far more sophisticated than it ever was, so I very much look forward to that debate. We started that process in Part 3 of the Digital Economy Act. I was described as an internet villain for believing in age verification. I have not changed my view, but the debate will be very interesting. As regards the tail-end of the previous debate, of course we are sympathetic on these Benches to the Wikipedia case. As we said on the last group, I very much hope that we will find a way, whether it is in Schedule 1 or in another way, of making sure that Wikipedia is not affected overly by this—maybe the risk profile that is drawn up by Ofcom will make sure that Wikipedia is not unduly impacted.

Lord Stevenson of Balmacara (Lab)

Like others, I had prepared quite extensive notes to respond to what I thought the noble Lord was going to say about his amendments in this group, and I have not been able to find anything left that I can use, so I am going to have to extemporise slightly. I think it is very helpful to have a little non-focused discussion about what we are about to talk about in terms of age, because there is a snare and a delusion in quite a lot of it. I was put in mind of that in the discussions on the Digital Economy Act, which of course precedes the Minister but is certainly still alive in our thinking: in fact, we were talking about it earlier today.

The problem I see is that we have to find a way of squaring two quite different approaches. One is to prevent those who should not be able to see material, because it is illegal for them to see it. The other is to find a way of ensuring that we do not end up with an age-gated internet, which I am grateful to find that we are all, I think, agreed about: that is very good to know.

Age is very tricky, as we have heard, and it is not the only consideration we have to bear in mind in wondering whether people should be able to gain access to areas of the internet which we know will be bad and difficult for them. That leads us, of course, to the question about legal but harmful, now resolved—or is it? We are going to have this debate about age assurance and what it is. What is age verification? How do they differ? How does it matter? Is 18 a fixed and final point at which we are going to say that childhood ends and adulthood begins, and therefore one is open for everything? It is exactly the point made earlier about how to care for those who should not be exposed to material which, although legal for them by a number called age, is not appropriate for them in any of the circumstances which, clinically, we might want to bring to bear.

I do not think we are going to resolve these issues today—I hope not. We are going to talk about them for ever, but at this stage I think we still need a bit of thinking outside a box which says that age is the answer to a lot of the problems we have. I do not think it is, but whether the Bill is going to carry that forward I have my doubts. How we get that to the next stage, I do not know, but I am looking forward to hearing the Minister’s comments on it.

Lord Parkinson of Whitley Bay (Con)

My Lords, I agree that this has been a rather unfortunate grouping and has led to a slightly strange debate. I apologise if it is the result of advice given to my noble friend. I know there has been some degrouping as well, which has led to slightly odd combinations today. However, as promised, I shall say a bit more about Wikipedia in relation to my noble friend’s Amendments 10 and 11.

The effect of these amendments would be that moderation actions carried out by users—in other words, community moderation of user-to-user and search services—would not be in scope of the Bill. The Government support the use of effective user or community moderation by services where this is appropriate for the service in question. As I said on the previous group, as demonstrated by services such as Wikipedia, this can be a valuable and effective means of moderating content and sharing information. That is why the Bill does not impose a one-size-fits-all requirement on services, but instead allows services to adopt their own approaches to compliance, so long as these are effective. The noble Lord, Lord Allan of Hallam, dwelt on this. I should be clear that duties will not be imposed on individual community moderators; the duties are on platforms to tackle illegal content and protect children. Platforms can achieve this through, among other things, centralised or community moderation. Ultimately, however, it is they who are responsible for ensuring compliance and it is platforms, not community moderators, who will face enforcement action if they fail to do so.

19:15
The amendments in the name of my noble friend Lord Moylan appear to intend to take services which rely only on user moderation entirely out of the scope of the Bill, so that those services are not subject to the new regulatory framework in any way. That would create a gap in the protections created by the Bill and would create incentives for services to adopt nominal forms of user moderation to avoid being subject to the illegal content and child safety duties. This would significantly undermine the efficacy of the Bill and is therefore not something we could include in it. His Amendment 26 would remove the duties on providers in Clause 11(3) to prevent children encountering primary priority content, and to protect children in age groups at risk of harm from other content that is harmful to children. This is a key duty which must be retained.
Contrary to what some have said, there is currently no requirement in the Bill for users to verify their age before accessing search engines and user-to-user services. We expect that only services which pose the highest risk to children will use age-verification technologies, but this is indeed a debate to which we will return in earnest and in detail on later groups of amendments. Amendment 26 would remove a key child safety duty, significantly weakening the Bill’s protections for children. The Bill takes a proportionate approach to regulation, which recognises the diverse range of services that are in scope of it. My noble friend’s amendments run counter to that and would undermine the protections in the Bill. I hope he will feel able not to press them and allow us to return to the debates on age verification in full on another group.
Lord Moylan (Con)

My Lords, I am grateful to all noble Lords who have contributed to this slightly disjointed debate. I fully accept that there will be further opportunities to discuss age verification and related matters, so I shall say no more about that. I am grateful, in particular, to the noble Lord, Lord Allan of Hallam, for supplying the deficiency in my opening remarks about the importance of Amendments 10 and 11, and for explaining just how important that is too. I also thank the noble Lord, Lord Stevenson. It was good of him to say, in the open approach he took on the question of age, that there are issues still to be addressed. I do not think anybody feels that we have yet got this right and I think we are going to have to be very open in that discussion, when we get to it. That is also true about what the noble Lord, Lord Allan of Hallam, said: we have not yet got clarity as to where the age boundary is—I like his expression—for the public space. Where is the point at which, if checks are needed, those checks are to be applied? These are all matters to discuss and I hope noble Lords will forgive me if I do not address each individual contribution separately.

I would like to say something, I hope not unfairly or out of scope, about what was said by the noble Baronesses, Lady Finlay of Llandaff and Lady Kidron, when they used, for the first time this afternoon, the phrase “zero tolerance”, and, at the same time, talked about a risk-based approach. I have, from my own local government experience, a lot of experience of risk-based approaches taken in relation to things—very different, of course, from the internet—such as food safety, where local authorities grade restaurants and food shops and take enforcement action and supervisory action according to their assessment of the risk that those premises present. That is partly to do with their assessment of the management and partly to do with their experience of things that have gone wrong in the past. If you have been found with mouse droppings and you have had to clean up the shop, then you will be examined a great deal more frequently until the enforcement officers are happy; whereas if you are always very well run, you will get an inspection visit maybe only once a year. That is what a risk-based assessment consists of. The important thing to say is that it does not produce zero tolerance or zero outcomes.

Baroness Kidron (CB)

I just want to make the point that I was talking about zero tolerance at the end of a ladder of tolerance, just to be clear. Letting a seven-year-old child into an 18-plus dating app or pornographic website is where the zero tolerance is—everything else is a ladder up to that.

Lord Moylan (Con)

I beg the noble Baroness’s pardon; I took that for granted. There are certain things—access to pornography, material encouraging self-harm and things of that sort—where one has to have zero tolerance, but not everything. I am sorry I took that for granted, so I fully accept that I should have made that more explicit in my remarks. Not everything is to be zero-toleranced, so to speak, but certain things are. However, that does not mean that they will not happen. One has to accept that there will be leakage around all this, just as some of the best-run restaurants that have been managed superbly for years will turn out, on occasion, to be the source of food poisoning. One has to accept that this is never going to be as tight as some of the advocates wanted, but with that, I hope I will be given leave to withdraw—

Baroness Finlay of Llandaff (CB)

May I intervene, because I have also been named in the noble Lord’s response? My concern is about the most extreme, most violent, most harmful and destructive things. There are some terrible things posted online. You would not run an open meeting on how to mutilate a child, or how to stab somebody most effectively to do the most harm. It is at this extreme end that I cannot see anyone in society in the offline world promoting classes for any of these terrible activities. Therefore, there is a sense that exposure to these things is of no benefit but promotes intense harm. People who are particularly vulnerable at a formative age in their development should not be exposed to them, because they would not be exposed to them elsewhere. I am speaking personally, not for anybody else, but I stress that this is the level at which the tolerance should be set to zero because we set it to zero in the rest of our lives.

Lord Moylan (Con)

Everything the noble Baroness has said is absolutely right, and I completely agree with her. The point I simply want to make is that no form of risk-based assessment will achieve a zero-tolerance outcome, but—

Baroness Harding of Winscombe (Con)

I am so sorry, but may I offer just one final thought from the health sector? While the noble Lord is right that where there are human beings there will be error, there is a concept in health of the “never event”—that when that error occurs, we should not tolerate it, and we should expect the people involved in creating that error to do a deep inspection and review to understand how it occurred, because it is considered intolerable. I think the same exists in the digital world in a risk assessment framework, and it would be a mistake to ignore it.

Lord Moylan (Con)

My Lords, I am now going to attempt for the third time to beg the House’s leave to withdraw my amendment. I hope for the sake of us all, our dinner and the dinner break business, for which I see people assembling, that I will be granted that leave.

Amendment 10 withdrawn.
Amendment 11 not moved.
Clause 4 agreed.
Amendment 12 not moved.
Clause 5 agreed.
Clause 6: Providers of user-to-user services: duties of care
Amendment 12A
Moved by
12A: Clause 6, page 5, line 11, at end insert “(2) to (8)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 9 below (because the new duty to summarise illegal content risk assessments in the terms of service is only imposed on providers of Category 1 services).
Lord Parkinson of Whitley Bay (Con)

My Lords, this group of government amendments relates to risk assessments; it may be helpful if I speak to them now as the final group before the dinner break.

Risk management is at the heart of the Bill’s regulatory framework. Ofcom and services’ risk assessments will form the foundation for protecting users from illegal content and content which is harmful to children. They will ensure that providers thoroughly identify the risks on their own websites, enabling them to manage and mitigate the potential harms arising from them. Ofcom will set out the risks across the sector and issue guidance to companies on how to conduct their assessments effectively. All providers will be required to carry out risk assessments, keep them up to date and update them before making a significant change to the design or operation of their service which could put their users at risk. Providers will then need to put in place measures to manage and mitigate the risks they identify in their risk assessments, including any emerging risks.

Given how crucial the risk assessments are to this framework, it is essential that we enable them to be properly scrutinised by the public. The government amendments in this group will place new duties on providers of the largest services—that is, category 1 and 2A services—to publish summaries of their illegal and child safety risk assessments. Through these amendments, providers of these services will also have a new duty to send full records of their risk assessments to Ofcom. This will increase transparency about the risk of harm on the largest platforms, clearly showing how risk is affected by factors such as the design, user base or functionality of their services. These amendments will further ensure that the risk assessments can be properly assessed by internet users, including by children and their parents and guardians, by ensuring that summaries of the assessments are publicly available. This will empower users to make informed decisions when choosing whether and how to use these services.

It is also important that Ofcom is fully appraised of the risks identified by service providers. That is why these amendments introduce duties for both category 1 and 2A services to send their records of these risk assessments, in full, to Ofcom. This will make it easier for Ofcom to supervise compliance with the risk assessment duties, as well as other duties linked to the findings of the risk assessments, rather than having to request the assessments from companies under its information-gathering powers.

These amendments also clarify that companies must keep a record of all aspects of their risk assessments, which strengthens the existing record-keeping duties on services. I hope that noble Lords will welcome these amendments. I beg to move.

Lord Allan of Hallam (LD)

My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work trying to understand what risks are taking place on their platforms, which never sees the light of day except when it is leaked by a whistleblower and we then have a very imperfect debate around it.

The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use—according to how well they have done them—is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.

Baroness Kidron (CB)

I also welcome these amendments, but I have two very brief questions for the Minister. First, in Amendment 27A, it seems that the child risk assessment is limited only to category 1 services and will be published only in the terms of service. As he probably knows, 98% of people do not read terms of service, so I wondered where else we might find this, or whether there is a better way of dealing with it.

My second question is to do with Amendments 64A and 88A. It seems to me—forgive me if I am wrong—that the Bill previously stipulated that all regulated search and user services had to make and keep a written record of any measure taken in compliance with a relevant duty, but now it seems to have rowed back to only category 1 and 2A services. I may be wrong on that, but I would like to check it for the record.

Lord Clement-Jones (LD)

My Lords, the noble Baroness, Lady Kidron, put her finger exactly on the two questions that I wanted to ask: namely, why only category 1 and category 2A, and is there some rowing back involved here? Of course, none of this prejudices the fact that, when we come later in Committee to talk about widening the ambit of risk assessments to material other than that which is specified in the Bill, this kind of transparency would be extremely useful. But the rationale for why it is only category 1 and category 2A in particular would be very useful to hear.

19:30
Baroness Merron (Lab)

My Lords, I am grateful to the Minister for introducing this group, and we certainly welcome this tranche of government amendments. We know that there are more to come both in Committee and as we proceed to Report, and we look forward to seeing them.

The amendments in this group, as other noble Lords have said, amount to a very sensible series of changes to services’ risk-assessment duties. This perhaps begs the question of why they were not included in earlier drafts of the Bill, but we are glad to see them now.

There is, of course, the issue of precisely where some of the information will appear, as well as the wider status of terms of service. I am sure those issues will be discussed in later debates. It is certainly welcome that the department is introducing stronger requirements around the information that must be made available to users; it will all help to make this a stronger and more practical Bill.

We all know that users need to be able to make informed decisions, and it will not be possible if they are required to view multiple statements and various documents. It seems that the requirements for information to be provided to Ofcom go to the very heart of the Bill, and I suggest that the proposed system will work best if there is trust and transparency between the regulator and those who are regulated. I am sure that there will be further debate on the scope of risk assessments, particularly on issues that were dropped from previous iterations of the Bill, and certainly this is a reasonable starting point today.

I will try to be as swift as possible as I raise a few key issues. One is about avoiding warnings that are at such a high level of generality that they get put on to everything. Perhaps the Minister could indicate how Ofcom will ensure that the summaries are useful and accessible to the reader. The test, of course, should be that a summary is suitable and sufficient for a prospective user to form an assessment of the likely risk they would encounter when using the service, taking into account any special vulnerabilities that they might have. That needs to be the test; perhaps the Minister could confirm that.

Is the terms of service section the correct place to put a summary of the illegal content risk assessment? Research suggests, unsurprisingly, that only 3% of people read terms before signing up—although I recall that, in an earlier debate, the Minister confessed that he had read all the terms and conditions of his mobile phone contract, so he may be one of the 3%. It is without doubt that any individual should be supported in their ability to make choices, and the duty should perhaps instead be to display a summary of the risks with due prominence, to ensure that anyone who is considering signing up to a service is really able to read it.

I also ask the Minister to confirm that, despite the changes to Clause 19 in Amendment 16B, the duty to keep records of risk assessments will continue to apply to all companies, but with an enhanced responsibility for category 1 companies.

Lord Parkinson of Whitley Bay (Con)

I am grateful to noble Lords for their questions on this, and particularly grateful to the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, for their chorus of welcome. Where we are able to make changes, we will of course bring them forward, and I am glad to be able to bring forward this tranche now.

As the noble Lord, Lord Allan, said, ensuring the transparency of services’ risk assessments will further ensure that the framework of the Bill delivers its core objectives relating to effective risk management and increased accountability regarding regulated services. As we have discussed, it is imperative that these providers take a thorough approach to identifying risks, including emerging risks. The Government believe that it is of the utmost importance that the public are able effectively to scrutinise the risk assessments of the largest in-scope services, so that users can be empowered to make informed decisions about whether and how to use their services.

On the questions from the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, about why it is just category 1 and category 2A services, we estimate that there will be around 25,000 UK service providers in scope of the Bill’s illegal and child safety duties. Requiring all these companies to publish full risk assessments and proactively to send them to Ofcom could undermine the Bill’s risk-based and proportionate approach, as we have discussed in previous groups on the burdens to business. A large number of these companies are likely to be low risk and it is unlikely that many people will seek out their risk assessments, so requiring all companies to publish them would be an excessive regulatory burden.

There would also be an expectation that Ofcom would proactively monitor a whole range of services, even ones that posed a minimal risk to users. That in turn could distract Ofcom from taking a risk-based approach in its regulation by overwhelming it with paperwork from thousands of low-risk services. If Ofcom wants to see records of the risk assessments of providers that are not category 1 or category 2A services, it has extensive information-gathering powers that it can use to require a provider to send it such records.

The noble Baroness, Lady Merron, was right to say that I read the terms of my broadband supply—I plead guilty to the nerdiness of doing that—but I have not read all the terms and conditions of every application and social medium I have downloaded, and I agree that many people do skim through them. They say the most commonly told lie on the planet at the moment is “I agree to the terms and conditions”, and the noble Baroness is right to point to the need for these to be intelligible, easily accessible and transparent—which of course we want to see.

In answer to her other question, the record-keeping duty will apply to all companies, but the requirement to publish is only for category 1 and category 2A companies.

The noble Baroness, Lady Kidron, asked me about Amendment 27A. If she will permit me, I will write to her with the best and fullest answer to that question.

I am grateful to noble Lords for their questions on this group of amendments.

Amendment 12A agreed.
Amendment 12B
Moved by
12B: Clause 6, page 5, line 16, at end insert “(2) to (6)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 19 below (because the new duty to supply records of risk assessments to OFCOM is only imposed on providers of Category 1 services).
Amendment 12B agreed.
House resumed.
Committee (2nd Day) (Continued)
20:39
Amendment 12BA
Moved by
12BA: Clause 6, page 5, line 16, at end insert—
“(g) the duties on regulated provider pornographic content set out in section 72.”
Member’s explanatory statement
This amendment requires user-to-user services to comply with duties under Part 5.
Baroness Ritchie of Downpatrick (Lab)

My Lords, I wish to speak to the amendments in this group, which are in my name and are also supported by the noble Lord, Lord Morrow. There are three interconnected issues raised by these amendments. First, there should be a level playing field across the Bill for regulating pornographic content. The same duties should apply to all content, whether it is found in social media or a user-to-user pornography site, which fall under Part 3, or a commercial pornography site, with producer content that falls within Part 5 of the Bill. Secondly, these common duties in respect of pornography must come into effect at the same time. By requiring that the same duties under Clause 72 apply to both Part 3 and Part 5 services, these amendments mean that they will be regulated for pornographic content at the same time, ensuring uniformity across the Bill.

Thirdly, through Amendment 125A, I wish to probe how Part 5 will function more specifically. Will any website or platform actually be covered by Part 5 if these amendments are not made? I had the privilege of speaking earlier to the Minister on these issues, and one question I would pose at this stage is, how many sites are covered by Part 5? That is one of the questions to which your Lordships’ House requires an answer.

The issue of ensuring that pornography is named as a harm on the face of the Bill, and that all pornographic content is age-verified, is not new. Indeed, it has been raised from the outset of the Bill, including at Second Reading in your Lordships’ House. In pre-legislative scrutiny even, the Joint Committee on the draft Bill recommended that

“key, known risks of harm to children are set out on the face of the Bill. We would expect these to include (but not be limited to) access to or promotion of age-inappropriate material such as pornography”.

To partly address this issue, the Government added Part 5 to the Bill, which sought to ensure that any platform that was not in scope of Part 3 but which included pornographic content should be subject to age-verification measures. I know that other noble Lords have put forward amendments that would add to the list of online harms on the face of the Bill, which we will be debating later in group 10.

Other amendments add to the duties that platforms hosting pornographic content need to comply with. These include Amendment 184, in the name of the noble Baroness, Lady Kidron, which proposes that consent be obtained from performers, and Amendment 185, in the name of the noble Baroness, Lady Benjamin, which seeks to ensure parity between what is permissible offline and online. The amendments I propose in this group are, I believe, complementary to those amendments. My amendments seek to ensure that duties across Part 3 and Part 5 in respect of pornography are aligned. Therefore, those additional duties contained in other amendments would be aligned across the Bill as well. When we get to that stage in Committee, I will be supporting those amendments.

The harms of pornography are well known and I do not propose to go over those again. I do, however, want to highlight one issue raised in a recent report published by the Children’s Commissioner for England. Her report states:

“Pornography is not confined to dedicated adult sites. We found that Twitter was the online platform where young people were most likely to have seen pornography. Fellow mainstream social networking platforms Instagram and Snapchat rank closely after dedicated pornography sites.”


The report found that 41% of children encountered pornography on Twitter, 33% on Instagram and 32% on Snapchat, while only 37% of children encountered pornography on main commercial websites. This means that children are more likely to encounter pornographic content on social media. That is why we need to ensure that standards across all platforms are uniform. The same standards need to apply to social media as to commercial pornography websites.

20:45
While I appreciate that the Government state it is their intention that Part 3 services will have to implement age verification, and that all platforms will have similar duties to ensure that children are protected from accessing pornographic content, it would clearly be better to remove all doubt and have age verification for the protection of children in the Bill. This would ensure a level playing field for all pornographic content, which brings me to my second point.
Not only does there need to be a level playing field but there needs to be a concurrent timing of these requirements coming into effect. These amendments would ensure that age verification will apply to all platforms across the Bill at the same time. I am sure your Lordships will agree that this is what the public will expect. I am not sure that parents, or indeed children and young people, would understand why one website has age verification while other content does not.
As the Bill is drafted, pornography would need to be named as the primary priority content by the Secretary of State, alongside other online harms. I hope that the Minister, in his reply, could address that issue. Codes of practice for pornography and other harms will need to be drafted and implemented before Part 3 can come into effect. We know the harm that pornography causes: it is the only harm that is already named in the Bill. It has been given its own part within the Bill, and therefore we do not need any secondary legislation to name pornography as a harm and set down duties for pornographic websites and social media to protect children. By simply making user-to-user services subject to the duties of Part 5, children can be protected more quickly. Part 5 will be much more straightforward to implement, and extending the duties to Part 3 services with pornographic content will ensure parity across all services within the scope of the Bill.
This brings me to my third and final point. Amendment 125A, upon which the Minister and I had a discussion earlier this afternoon, probes the devil in the detail of what is defined as user-generated content. I ask the Committee to bear with me, as I am required to get into the detail of the definitions in Clause 49, which is important. This detail matters because it determines whether provider pornographic content, as defined in Clause 70, could be considered user-generated content.
Put simply, if a site is user generated it is regulated under Part 3 and if a site produces its own content it is covered by Part 5. The Government said, in their helpful factsheet circulated before the start of Committee, that Part 3 covers services
“such as social media platforms and dedicated pornography sites that enable user interaction”.
In the case of Part 5 services, the intention is that this will cover content provided only by the service. My Amendment 125A probes what happens next and what constitutes user interaction.
If users of a platform can interact, this seems to move the service into Part 3 of the Bill, as per its definition of user-generated content. The definition in Clause 49(2)(e) includes “comments and reviews”, which itself refers to Clause 49(6). However, Clause 49(6) does not bring much clarity about what
“Comments and reviews on provider content”
consist of. On a plain reading of Clause 49 it would appear that a pornography provider which currently falls under Part 5 would move to being a Part 3 service under the Bill if they allow users to comment on the content and allow user interaction. Therefore, the important question is: what is user-to-user functionality?
The British Board of Film Classification undertook an analysis of user-to-user functionality on adult sites in October 2020. It assumed that likes did not constitute user-to-user functionality, specifically saying:
“We have not included sites which offer users the chance to ‘rate’ content—for example, with a ‘thumbs up’ or ‘thumbs down’—as we were concerned that this would be too generous an interpretation of ‘user interaction’”.
Elsewhere, it said that such ratings
“would be a questionable interpretation of ‘user interaction’”.
This seems a reasonable interpretation to me. However, Clause 49 does not seem to be clear on this. It seems to allow ratings to constitute a review, thus giving room for interpretation.
My Amendment 125A would make it clear that likes or dislikes, or the use of an emoji, would not be considered a review and would therefore not be user-to-user content. That would keep a service which allowed this, but no textual comments, in Part 5. This may seem inconsequential but it is important, as it prevents services moving from one part of the Bill to another to utilise different regulatory requirements, or indeed to evade regulations.
I ask the Minister to set out the Government’s intentions and how the definitions in Clause 49 might move a service with provider pornographic content from Part 5 to Part 3. Furthermore, I would be grateful if the Minister could put on record how many of the top 200 pornographic websites visited in the UK he expects to be regulated by Part 3 and Part 5 respectively, and how many people in the UK he expects to visit services under each part.
My main concern is to ensure that, as soon as is practically possible after the Bill passes, children are protected. This issue was raised at Second Reading and earlier this evening in various contributions. It would not be acceptable for services such as social media and large pornography sites that fall under Part 3 to be left for three or even four years without a duty to protect children from pornographic content. It would be worse if sites were allowed to move from one part of the Bill to another by simply utilising user interaction and thereby avoiding regulation.
These amendments, in my name and that of the noble Lord, Lord Morrow, will ensure that all pornographic content is regulated in the same way, at the same time, and that Part 5 can be brought into force more quickly to ensure all content is treated in the same way. I believe that was certainly the will of your Lordships at Second Reading. I look forward to hearing the Minister’s views on how this will be achieved. I beg to move.
Lord Browne of Belmont (DUP)

My Lords, first, I tender an apology from my noble friend Lord Morrow, whose name is attached to the amendments. Unfortunately, he is unable to participate in tonight’s debate, as he had to return home at very short notice. I will speak to the amendments in this group. I thank the noble Baroness, Lady Ritchie, and my noble friend Lord Morrow for tabling the amendments, allowing for a debate on how the duties of Part 5 should apply to Part 3 services and to probe what sites Part 5 will cover once it is implemented.

The Government have devised a Bill which attempts carefully to navigate regulation of several different types of service. I am sure that it will eventually become an exemplar emulated around the world, so I understand why there may be a general resistance on the part of the Government to tamper with the Bill’s architecture. However, these amendments are designed to treat pornographic content as a clear exception wherever it is found online. This can be achieved, because we already know the harm caused by pornography and Part 5 already creates a duty to ensure that rigorous age verification is in place to stop children accessing it.

The Government recognised that the original drafting of the Bill would not address the unfinished business of Part 3 of the Digital Economy Act. In 2017, as many will recall, this House and the other place expressed the clear demand that online pornography should not be accessible to children. Part 5 of the Bill is the evolution of that 2017 debate, but, regrettably, it was bolted on belatedly after pre-legislative scrutiny. That bolt-on approach has had the unfortunate consequence of creating two separate regimes to deal with pornography. Part 5 applies only to “provider pornographic content”, which is content

“published or displayed on the service by the provider … or by a person acting on behalf of the provider”.

Clause 70 makes it clear:

“Pornographic content that is user-generated content … is not to be regarded as provider pornographic content”;


in other words, if pornography is on social media or the large tube sites, it falls under Part 3, not Part 5. That means that not all content will be regulated in the same way or at the same time.

Amendment 125A addresses an issue raised by this two-tier approach to regulation. Clause 49 defines “user-generated content” as content

“generated directly on the service by a user of the service, or … uploaded to or shared on the service by a user of the service, and … that may be encountered by another user”.

Encounter is defined broadly, meaning to

“read, view, hear or otherwise experience content”,

including adding “comments and reviews”. By including reviews, that seems to be a broad definition. Does it include a like, an up vote or an emoji? That is an important question that Amendment 125A probes. On this basis, it seems that almost all the most popular pornographic websites are user-to-user services, and therefore will fall into Part 3.

21:00
I echo the question asked by the noble Baroness, Lady Ritchie: can the Minister identify what sites will be regulated by Part 5, how much United Kingdom traffic is directed to those sites and how will any site covered by Part 5 be prevented from adding functionality to allow encounters on its platform to move that site from Part 5 to Part 3 to delay implementation?
These are important questions. Ofcom could accelerate implementation of Part 5 separately and indeed it would be disappointing if Ofcom delayed Part 5 implementation to avoid these very questions. These amendments are needed to allow Part 5 to be implemented quickly and for pornography to be regulated across the Bill swiftly. Put simply, Part 5 does not rely on the vast amount of secondary legislation which we must consider before Part 3 can be brought into operation. But to implement Part 5 separately without these amendments would be to abandon the sensible goal of creating a level playing field for any site which publishes pornographic content, and it would affect only a minority of the smaller sites, or indeed no websites at all.
The amendments before your Lordships today apply a far simpler logic. They place within scope of Part 5 any pornographic content wherever it is found and place a duty on Part 3 services to comply with the duties in Clause 72. I emphasise that this is not to apply age gates to an entire service, only the adult content. So, for example, on Twitter, research has found that 13% of tweets lead to pornographic images and videos. That platform would need users to prove they were 18 or over before they could see those particular tweets but not for the rest of the Twitter platform. Part 5 can very simply deal with all pornography online, so it can be introduced on a stand-alone basis allowing us to do so within, say, six months of Royal Assent. I understand that amendments seeking to achieve this are tabled for later in Committee.
We should keep in mind that in December 2018 Ministers announced that age verification would be required from the following Easter. We know the major porn sites had contracts negotiated with age-verification providers as they had accepted the inevitability of the policy and were prepared to comply. We saw in France, just over a year ago, that these sites were able to implement age checks with just 10 days’ notice.
By addressing all pornographic content under one part of the Bill, we would also remove the ambiguity Part 3 creates. User-to-user platforms are required only to act proportionately. For example, the social media site I referred to earlier may determine that only 13% of its content is pornographic. One may think that is more than enough to merit age verification. Ofcom may well agree. But this is a judgment, susceptible to judicial review. For the price of a small legal team, a site could delay enforcement for years as it argued through hearings and appeals that the demands on it were disproportionate. Part 3 suggests the use of age assurance and offers age verification as an example, not a requirement, but Part 5 leaves no room for doubt.
The unsuitability of pornography for children is not something we expect to change. We do not need to future-proof the Bill against the possibility that one day a Minister decides it is no longer something we wish to protect children from seeing. Indeed, if they do, I much prefer it if they have to return to Parliament to amend primary legislation before the law is relaxed.
I hope the Minister will see the logic of a level playing field to deliver a policy with widespread support across all ages and political parties. Indeed, without addressing pornography separately—and, in turn, quickly—we will pass a Bill with no discernible impact before the next general election. While they are walking to the polling station, parents will still fear what their children are looking at online. This is a quick win and a popular move, and I hope the Government will amend the Bill accordingly so that this House does not need to do so when we revisit this important issue on Report.
Lord Bethell (Con)

My Lords, it is a tremendous honour to follow the noble Lord, Lord Browne, who put the case extremely well; I agree with every word he just said. I thank the noble Baroness, Lady Ritchie, for bringing forward this issue, which she has done extremely well. I thank Christian Action Research and Education, which has been fundamental in thinking through some of these issues and has written an extremely good brief on the subject. There is indeed an urgent need for consistent regulation of pornographic content wherever it occurs online, whether it is in Part 3, Part 4, Part 5 or wherever. That is why, with the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Oxford and the noble Lord, Lord Stevenson, I have tabled amendments to address age verification on pornography and harms in the round.

Our amendments, which we will get to on Thursday and on later days in Committee, are different from those raised by the noble Baroness, Lady Ritchie, and others, but it is worth noting that many of the principles are the same. In particular, all pornographic content should be subject to the same duties, in the interests of consistency and transparency, wherever it is. Porn is porn, regardless of where it occurs online, and it carries the same risk of harm, particularly to children, whether it is accessed on social media or on a dedicated porn site.

We know from the Children’s Commissioner’s research that, for instance, Twitter was absolutely the online platform where young people were most likely to have seen pornography. Not Pornhub or one of the big tubes—on Twitter. We also know that children will consistently watch porn on dedicated porn sites. So why do we have inconsistent regulation of pornographic content in the Bill? This is the question I address to my noble friend the Minister. We can and we will get to the debate on how we will do this—indeed, I welcome further discussion with the Minister on how, and encourage him to have conversations across the House on this.

For today, we must look at why we have inconsistent regulation for pornographic content and what that means. As currently drafted, Part 3 services and Part 5 services are not subject to the same duties, as the noble Baroness rightly pointed out. Part 3 services, which include the biggest and most popular pornographic websites, such as Pornhub and Xvideos, as well as sites that host pornographic content, such as Twitter, will not be subject to regulation, including age verification, until secondary legislation is introduced, thereby delaying regulation of the biggest porn sites until at the very least 2025, if not 2026. This will create a massively unlevel playing field which, as others have said, will disincentivise compliance across the board, as well as leaving children with unfettered access to pornography on both social media sites and other user-to-user sites such as Pornhub.

Meanwhile, whichever commercially produced pornography websites are left in Part 5 will, as has already been suggested, simply change their functionality to become user-to-user and avoid regulation for another three years. I have a way in which this can be prevented and the noble Baroness, Lady Ritchie, has her way, but for today I stand with her in asking why the Government think this lack of consistency and fragmentation in the regulation of an industry that destroys childhoods and has implications that reverberate across society are to be accepted.

I look forward to hearing what the Minister has to say. It is clear to me that there is a consensus across the Committee and at every stage of the Bill that pornography should be regulated in a way that is consistent, clear and implemented as quickly as possible following Royal Assent—I have suggested within six months. Therefore, I would welcome discussions with the noble Baroness, Lady Ritchie, the Minister and others to ensure that this can be achieved.

Lord Allan of Hallam (LD)

My Lords, I want to inject into the debate some counterarguments, which I hope will be received in the constructive spirit in which they are intended. Primarily, I want to argue that a level playing field is not the right solution here and that there is a strong logic for a graduated response. It is often tempting to dial everything up to 11 when you have a problem, and we clearly do have an issue around child access to pornography. But from a practical point of view, the tools we are giving our regulator are better served by being able to treat different kinds of services differently.

I think there are three classes of service that we are thinking about here. The first is a service with the primary purpose and explicit intent to provide pornography and nothing else. A regime dedicated to those sites is quite appropriate. Such a service might have not just the strongest levels of age verification but a whole other set of requirements, which I know we will debate later, around content verification and all sorts of other things that kick into play. The second category is made up of services that are primarily designed for social interaction which prohibit pornography and make quite strenuous efforts to keep it off. Facebook is such a service. I worked there, and we worked hard to try to keep pornography off. We could not guarantee that it was never present, but that was our intent: we explicitly wanted to be a non-pornographic site. Then there are—as the noble Lord, Lord Bethell, pointed out—other services, such as Twitter, where the primary purpose is social but a significant proportion of adult content is allowed.

I suggest that one of the reasons for having a graduated response is that, from our point of view, we would like services to move towards porn reduction, and for those general-purpose services to prohibit porn as far as possible. That is our intent. If we have a regulatory system that says, “Look, we’re just going to treat you all the same anyway”, we may provide a perverse incentive for services not to move up the stack, as it were, towards a regime where by having less pornographic or sexualised content, they are able to see some benefit in terms of their relationship with the regulator. That is the primary concern I have around this: that by treating everybody the same, we do not create any incentive for people to deal with porn more effectively and thereby get some relief from the regulator.

From a practical point of view, the relationship that the regulator has is going to be critical to making all these things work. Look at what has been happening in continental Europe. There have been some real issues around enforcing laws that have been passed in places such as France and Germany because there has not been the kind of relationship that the regulator needs with the providers. I think we would all like to see Ofcom in a better position, and one of the ways it can do that is precisely by having different sets of rules. When it is talking to a pure pornography site, it is a different kind of conversation from the one it is going to have with a Twitter or a Facebook. Again, they need to have different rules and guidance that are applied separately.

The intent is right: we want to stop under-18s getting on to those pure porn sites, and we need one set of tools to do that. When under-18s get on to a social network that has porn on it, we want the under-18s, if they meet the age requirement, to have access—that is perfectly legitimate—but once they get there, we want them kept out of the section that is adult. For a general-purpose service that prohibits porn, I think we can be much more relaxed, at least in respect of pornography but not in respect of other forms of harmful content—but we want the regulator to be focused on that and not on imposing porn controls. That graduated response would be helpful to the regulator.

Some of the other amendments that the noble Lord, Lord Bethell, has proposed will help us to talk about those kinds of measures—what Twitter should do inside Twitter, and so on—but the amendments we have in front of us today are more about dialling it all up to 11 and not allowing for that graduation. That is the intent I heard from the amendments’ proposers. As I say, that is the bit that, respectfully, may end up being counterproductive.

21:15
Lord Bethell (Con)

Could the noble Lord advise us on how he would categorise a site such as Twitter, on which it is estimated that 13% of the page deliveries are to do with pornography? Does it qualify as a pornography site? To me, it is ambiguous. Such a large amount of its financial revenue comes from pages connected with pornography that it seems it has a very big foot in the pornography industry. How would he stop sites gaming definitions to benefit from one schedule or another? Does he think that puts great pressure on the regulator to be constantly moving the goalposts in order to capture who it thinks might be gaming the system, instead of focusing on content definition, which has a 50-year pedigree, is very well defined in law and is an altogether easier status to analyse and be sure about?

Lord Allan of Hallam (LD)

The Twitter scenario, and other scenarios of mixed sites, are some of the most challenging that we have to deal with. But I would say, straightforwardly, “Look, 13% is a big chunk, but the primary purpose of Twitter is not the delivery of pornography”. I use Twitter on a daily basis and I have never seen pornography on it. I understand that it is there and that people can go for it, and that is an issue, but I think people out there would say that for most people, most of the time, the primary purpose of Twitter is not pornography.

What we want to do—in answer to the noble Lord’s second point—is create an incentive for people to be recategorised in the right direction. There is an assumption here that it is all going to be about gaming the system. I actually think that there is an opportunity here for genuine changes. There will be a conversation with Twitter. It will be interesting, given Twitter’s current management—apparently it is run by a dog, so there will be a conversation with the dog that runs Twitter. In that conversation, the regulator, Ofcom, on our behalf, will be saying, “You could change your terms of service and get rid of pornography”. Twitter will say yes or no. If it says no, Ofcom will say, “Well, here are all the things we expect you to do in order to wall off that part of the site”.

That is a really healthy and helpful conversation to have with Twitter. I expect it is listening now and already thinking about how it will respond. But it would expect that kind of treatment and conversation to be different; and I think the public would expect that conversation to be a different and better conversation than just saying “Twitter, you’re Pornhub. We’re just going to treat you like Pornhub”.

That is the distinction. As I say, we have an opportunity to get people to be more robust about either limiting or removing pornography, and I fear that the amendments we have in front of us would actually undermine rather than enhance that effort.

Baroness Kidron (CB)

At the centre of this is the question of whether we are trying to block the entire service or block at the level of porn content. It is the purpose of a set of amendments in the names of the noble Lord, Lord Bethell, myself and a number of other noble Lords to do exactly the latter. But I have to say to the noble Baroness that I am very much in sympathy with, first, putting porn behind an age gate; secondly, having a commencement clause; and, thirdly and very importantly—this has not quite come up in the conversation—saying that harms must be on the face of the Bill and that porn is not the only harm. I say, as a major supporter of the Bereaved Families for Online Safety, that “Porn is the only harm children face” would be a horrendous message to come from this House. But there is nothing in the noble Baroness’s amendments, apart from where the action happens, that I disagree with.

I also felt that the noble Baroness made an incredibly important point when she went into detail on Amendment 125A. I will have to read her speech in order to follow it, because it was so detailed, but the main point she made is salient and relates to an earlier conversation: the reason we have Part 5 is that the Government have insisted on this ridiculous thing about user-to-user and search, instead of doing it where harm is. The idea that you have Part 5, which is to stop the loophole of sites that do not have user-to-user, only to find that they can add user-to-user functionality and be another type of site, is quite ludicrous. I say to the Committee and the Minister, who I am sure does not want me to say it, “If you accept Amendment 2, you’d be out of that problem”—because, if a site was likely to be accessed by children and it had harm and we could see the harm, it would be in scope. That is the very common-sense approach. We are where we are, but let us be sensible about making sure the system cannot be gamed, because that would be ludicrous and would undermine everybody’s efforts—those of the Government and of all the campaigners here.

I just want to say one more thing because I see that the noble Lord, Lord Moylan, is back in his place. I want to put on the record that age assurance and identity are two very separate things. I hope that, when we come to debate the package of harms—unfortunately, we are not debating them all together; we are debating harms first, then AV—we get to the bottom of that issue because I am very much in the corner of the noble Lord and the noble Baroness, Lady Fox, on this. Identity and age assurance must not be considered the same thing by the House, and definitely not by the legislation.

Baroness Benjamin (LD)

My Lords, I add my support for all the amendments in this group. I thank the noble Baroness, Lady Ritchie, for bringing the need for the consistent regulation of pornographic content to your Lordships’ attention. Last week, I spoke about my concerns about pornography; I will not repeat them here. I said then that the Bill does not go far enough on pornography, partly because of the inconsistent regulation regimes between Part 3 services and Part 5 ones.

In February, the All-Party Parliamentary Group on Commercial Sexual Exploitation made a series of recommendations on the regulation of pornography. Its first recommendation was this:

“Make the regulation of pornography consistent across different online platforms, and between the online and offline spheres”.


It went on to say:

“The reforms currently contained in the Online Safety Bill not only fail to remedy this, they introduce further inconsistencies in how different online platforms hosting pornography are regulated”.


This is our opportunity to get it right but we are falling short. The amendments in the name of the noble Baroness, Lady Ritchie, go to the heart of the issue by ensuring that the duties that currently apply to Part 5 services will also apply to Part 3 services.

Debates about how these duties should be amended or implemented will be dealt with later on in our deliberations; I look forward to coming back to them in detail then. Today, the question is whether we are willing to have inconsistent regulation of pornographic content across the services that come into the scope of the Bill. I am quite sure that, if we asked the public in an opinion poll whether this was the outcome they expected from the Bill, they would say no.

An academic paper published in 2021 reported on the online viewing of 16- and 17-year-olds. It said that pornography was much more frequently viewed on social media, showing that the regulation of such sites remains important. The impact of pornography is no different whether it is seen on a social media or pornography site with user-to-user facilities that fall within Part 3 or on a site that has only provider content that would fall within Part 5. There should not be an either/or approach to different services providing the same content, which is why I think that Amendment 125A is critical. If all pornographic content is covered by Part 5, what does and does not constitute user-generated material ceases to be our concern. Amendment 125A highlights this issue; I too look forward to hearing the Minister’s response.

There is no logic to having different regulatory approaches in the same Bill. They need to be the same and come into effect at the same time. That is the simple premise of these amendments; I fully support them.

Baroness Harding of Winscombe (Con)

My Lords, earlier today the noble Baroness, Lady Benjamin, referred to a group of us as kindred spirits. I suggest that all of us contributing to this debate are kindred spirits in our desire to see consistent outcomes. All of us would like to see a world where our children never see pornography on any digital platform, regardless of what type of service it is. At the risk of incurring the ire of my noble friend Lord Moylan, we should have zero tolerance for children seeing and accessing pornography.

I agree with the desire to be consistent, as the noble Baroness, Lady Ritchie, and the noble Lord, Lord Browne, said, but it is consistency in outcomes that we should focus on. I am very taken with the point made by the noble Lord, Lord Allan, that we must be very careful about the unintended consequences of a consistent regulatory approach that might end up with inconsistent outcomes.

When we get to it later—I am not sure when—I want to see a regulatory regime that is more like the one reflected in the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell. We need in the Bill a very clear definition of what age assurance and age verification are. We must be specific on the timing of introducing the regulatory constraints on pornography. We have all waited far too long for that to happen and that must be in the Bill.

I am nervous of these amendments that we are debating now because I fear other unintended consequences. Not only does this not incentivise general providers, as the noble Lord, Lord Allan, described them, to remove porn from their sites but I fear that it incentivises them to remove children from their sites. That is the real issue with Twitter. Twitter has very few child users; I do not want to live in a world where our children are removed from general internet services because we have not put hard age gates on the pornographic content within them but instead encouraged those services to put an age gate on the front door. Just as the noble Lord, Lord Allan, said earlier today, I fear that, with all the best intentions, the desire to have consistent outcomes and these current amendments would regulate the high street rather than the porn itself.

Lord Clement-Jones (LD)

My Lords, there is absolutely no doubt that across the Committee we all have the same intent; how we get there is the issue between us. It is probably about the construction of the Bill, rather than the duties that we are imposing.

It is a pleasure again to follow the noble Baroness, Lady Harding. If you take what my noble friend Lord Allan said about a graduated response and consistent outcomes, you then get effective regulation.

I thought that the noble Baroness, Lady Kidron, had it right. If we passed her amendments in the second group, and included the words “likely to be accessed”, Clause 11 would bite and we would find that there was consistency of outcomes for primary priority content and so on, and we would then find ourselves in much the same space. However, it depends on the primary purpose. The fear that we have is this. I would not want to see a Part 5 service that adds user-generated content then falling outside Part 5 and finding itself under Part 3, with a different set of duties.

I do not see a huge difference between Part 3 and Part 5, and it will be very interesting when we come to debate the later amendments tabled by the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron. Again, why do we not group these things together to have a sensible debate? We seem to be chunking things up in a different way and so will have to come back to this and repeat some of what we have said. However, I look forward to the debate on those amendments, which may be a much more effective way of dealing with this than trying to marry Part 5 and Part 3.

I understand entirely the motives of the noble Baroness, Lady Ritchie, and that we want to ensure that we capture this. However, it must be the appropriate way of regulating and the appropriate way of capturing it. I like the language about consistent outcomes without unintended consequences.

21:30
Baroness Merron (Lab)

My Lords, this has been a very helpful debate and I hope it sets the Committee up for when we return to these issues. As the noble Lord, Lord Clement-Jones, just said, it is about having appropriate regulation that does the job that we want. I feel from this debate, as I have felt before, that we are in agreement about what we want; the question, as ever, is how we get there.

The noble Lord, Lord Allan of Hallam, spoke well on the practicalities of the different ways that pornography is accessible and, because they are so different, the need to respond differently. An example is Twitter, which is primarily a social network but its content can be inappropriate when accessed by a group who should not be accessing it—children, in this case. It is important that the way this is approached does not take away the ability of Twitter, for example, to do the job that it is there to do but does protect those who need to be protected. The phrase that came to mind is that regulation needs to be fit for purpose, but the question is what the purpose is and how we make it fit for it.

I am grateful to all noble Lords who have spoken today. The noble Baroness, Lady Harding, spoke of consistency of outcome. That is a very good place from which to look backwards to see what is required. The noble Baroness, Lady Kidron, was right to say that we must not send out the message that pornography is, somehow, the only harm or that there is a hierarchy of harms. In my view, we are simply debating that at this stage. So pornography is not the only harm, nor is it of a higher order than other harms.

I would like to say how grateful I am to my noble friend Lady Ritchie of Downpatrick, who was supported in the Chamber by the noble Lord, Lord Browne, on behalf of his noble friend the noble Lord, Lord Morrow, who put his name to some of these amendments. I am grateful because the debate in this area facilitated an early debate on the issue of regulation and online pornography, and did it thoroughly. It raised a number of questions that we will need to address when debating later amendments.

There is no denying the damage that can be caused by young people readily having access to pornographic content online. They see material that it would be illegal for them to see offline. If we have already dealt with this offline, our challenge is to protect children and young people in the same way online. However, as we will discuss later and probably at some length, this side of the House does not accept that access to illegal pornography is the only issue affecting how children can and should use the internet. Exposure to pornographic content changes young people’s perceptions of sexual activity and, in the worst cases, can contribute to sexual assault. Even in cases where there is consent, evidence is available that shows that depictions of certain high-risk activities in pornographic material mean that many more people are engaging in, for example, choking and asphyxiation, with the predictable but tragic outcome of permanent injury or even death.

Having said that, later we will be debating measures that need to be put in place to protect children of 18 and under from accessing sites that they are likely to encounter. We need to ensure that age-appropriate design is the keystone to the protection of children online. We are relying heavily on effective terms of service to protect vulnerable adults from accessing material which would cause them harm, and that issue definitely needs more debate.

Pornography has an influence on adult sexual behaviour and, regardless of our own personal views, we have to remember that much adult content is in fact perfectly legal, and for whatever reason, it is also very popular. While some of the most widely used user-to-user platforms have opted not to carry adult material, there are others, as we have heard in the debate, such as Twitter and Reddit, that do allow users to share explicit but legal content. There has been an explosion in the number of so-called content creators who upload their own material to sites such as OnlyFans. There has also been an explosion in user-to-user services such as Twitter, which I would presume to be the very valid motivation behind Amendment 183A.

Steps taken to restrict child access to adult content on user-to-user platforms are often easy to bypass, so the question of whether such services should be within the scope of Part 5 is indeed a valid one. There are some platforms that do take their responsibilities seriously, with OnlyFans having engaged with the topic of online safety long before it will be compelled to do so; but others have not. So, on that basis, it is clear that we cannot continue with the status quo given the ever-increasing risk that illegal material does not get taken down by algorithms and automated moderation.

We recognise that the Government have had their own reasons for not implementing Part 3 of the Digital Economy Act. That decision was disappointing, and in fact, the disappointment was made even worse by repeated empty promises, dither and delay. However, the department clearly recognises the issue, which is a welcome first step, and it is not clear that simply rerunning the arguments from the DEA is going to bear fruit this time round. This Bill is largely apolitical, and colleagues on all sides of the House, including from these Benches, have the opportunity to come together; we have the opportunity to agree a way forward to protect children, to reduce exposure to extreme forms of pornography but ultimately to allow adults to consume pornography if they wish to do so. That is the challenge that we have.

These Benches support robust age verification for access to pornographic content, but it is vital that these systems are secure and take appropriate steps to preserve the user’s privacy. The questions raised in this group are extremely valid, and the proposals presented by other colleagues, including the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, deserve very serious consideration. We hope that the Minister can demonstrate in his response that progress is being made.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, first, I will address Amendments 12BA, 183A and 183B, tabled by the noble Baroness, Lady Ritchie of Downpatrick, with whom I was grateful to discuss them earlier today, and the noble Lord, Lord Morrow; I am grateful to his noble friend, the noble Lord, Lord Browne of Belmont, for speaking to them on his behalf.

These amendments seek to apply the duties in Part 5 of the Bill, which are focused on published pornographic content, to user-generated pornography. Amendments 183A and 183B are focused particularly on making sure that children are protected from user-to-user pornography in the same way as from published pornography, including through the use of age verification. I reassure the noble Baroness and the noble Lord that the Government share their concerns; there is clear evidence about the impact of pornography on young people and the need to protect children from it.

This is where I come to the questions posed earlier by the noble Lord, Lord McCrea of Magherafelt and Cookstown. The research we commissioned from the British Board of Film Classification assessed the functionality of and traffic to the UK’s top 200 most visited pornographic websites. The findings indicated that 128 of the top 200 most visited pornographic websites—that is just under two-thirds, or 64%—would have been captured by the proposed scope of the Bill at the time of the Government’s initial response to the online harms White Paper, and that represents 85% of the traffic to those 200 websites.

Since then, the Bill’s scope has been broadened to include search services and pornography publishers, meaning that children will be protected from pornography wherever it appears online. The Government expect companies to use age-verification technologies to prevent children accessing services which pose the highest risk to children, such as online pornography. Age-assurance technologies and other measures will be used to provide children with an age-appropriate experience on their service.

As noble Lords know, the Bill does not mandate that companies use specific approaches or technologies when keeping children safe online as it is important that the Bill is future-proofed: what is effective today might not be so effective in the future. Moreover, age verification may not always be the most appropriate or effective approach for user-to-user companies to comply with their duties under the Bill. For instance, if a user-to-user service, such as a social medium, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. That would allow content to be better detected and removed, instead of restricting children from a service that is designed to be appropriate for their use—as my noble friend Lady Harding of Winscombe puts it, avoiding the situation where children are removed from these services altogether.

While I am sympathetic to the aims of these amendments, I assure noble Lords that the Bill already has robust, comprehensive protections in place to keep children safe from all pornographic content, wherever or however it appears online. These amendments are therefore unnecessary because they duplicate the existing provisions for user-to-user pornography in the child safety duties in Part 3.

It is important to be clear that, wherever they are regulated in the Bill, companies will need to ensure that children cannot access pornographic content online. This is made clear, for user-to-user content, in Clause 11(3); for search services, in Clause 25(3); and for published pornographic content in Clause 72(2). Moving the regulation of pornography from Part 3 to Part 5 would not be a workable or desirable option because the framework is effective only if it is designed to reflect the characteristics of the services in scope.

Part 3 has been designed to address the particular issues arising from the rapid growth in platforms that allow the sharing of user-generated content but are not the ones choosing to upload that content. The scale and speed of dissemination of user-generated content online demands a risk-based and proportionate approach, as Part 3 sets out.

It is also important that these companies understand the risks to children in the round, rather than focusing on one particular type of content. Risks to children will often be a consequence of the design of these services—for instance, through algorithms, which need to be tackled holistically.

I know that the noble Baroness is concerned about whether pornography will indeed be designated as primary priority content for the purposes of the child safety duties in Clauses 11(3) and 25(3). The Government fully intend this to be the case, which means that user-to-user services will need to have appropriate systems to prevent children accessing pornography, as defined in Clause 70(2).

The approach taken in Part 3 is very different from services captured under Part 5, which are publishing content directly, know exactly where it is located on their site and already face legal liability for the content. In this situation the service has full control over its content, so a risk-based approach is not appropriate. It is reasonable to expect that service to prevent children accessing pornography. We do not therefore consider it necessary or effective to apply the Part 5 duties to user-to-user pornographic content.

I also assure the noble Baroness and the noble Lord that, in a case where a provider of user-to-user services is directly publishing pornographic content on its own service, it will already be subject to the Part 5 duties in relation to that particular content. Those duties in relation to that published pornographic content will be separate from and in addition to their Part 3 duties in relation to user-generated pornographic content.

This means that, no matter where published pornographic content appears, the obligation to ensure that children are not normally able to encounter it will apply to all in-scope internet service providers that publish pornographic content. This is made clear in Clause 71(2) and is regardless of whether they also offer user-to-user or search services.

21:45
As set out in a recent letter to your Lordships, Ofcom will prioritise protecting children from pornography and other harmful content. In the autumn it intends to publish draft guidance for Part 5 pornography duties and draft codes of practice for Part 3 illegal content duties, including for child sexual exploitation and abuse content. Draft codes of practice for children’s safety duties will follow in summer 2024. These elements of the regime are being prioritised ahead of others, such as category 1 duties, to reflect the critical importance of protecting children.

It is right that Ofcom consult on Part 5 guidance as quickly as possible, to protect children from accessing pornography on Part 5 services. Part 5 guidance is focused entirely on the provision of pornography, whereas codes of practice under Part 3 are significantly more complex, as they deal with other forms of harmful content and so require longer to develop. This may mean there will be a limited period of time during which Part 5 protections are in place ahead of those in Part 3. It would not be right to delay the Part 5 consultation on that basis.

As the Bill makes clear, we expect companies to use technology such as age verification to prevent children accessing pornography, whether it is user-generated or published. Any technology used to comply with the Bill will need to be effective in accurately identifying the age of users. Ofcom will be able to take enforcement action if a company uses inadequate technological solutions. But, as I mentioned earlier, the Bill will not mandate specific approaches or technologies.

In her Amendment 125A, the noble Baroness, Lady Ritchie of Downpatrick, raises concerns that a provider of pornographic content could move from being a Part 5 service to a Part 3 service if they allow comments or reviews on their content. I am grateful to her for raising and discussing the issue earlier. Amendment 125A in her name intends to narrow—

Lord Clement-Jones (LD)

I am sorry, but can the Minister just clarify that? Is he saying that it is not possible to be covered by both Part 3 and Part 5, so that where a Part 5 service has user-generated content it is also covered by Part 3? Can he clarify that you cannot just escape Part 5 by adding user-generated content?

Lord Parkinson of Whitley Bay (Con)

Yes, that is correct. I was trying to address the points raised by the noble Baroness, but the noble Lord is right. The point on whether people might try to be treated differently by allowing comments or reviews on their content is that they would be treated the same way. That is the motivation behind the noble Baroness’s amendment trying to narrow the definition. There is no risk that a publisher of pornographic content could evade their Part 5 duties by enabling comments or reviews on their content. That would be the case whether or not those reviews contained words, non-verbal indications that a user liked something, emojis or any other form of user-generated content.

That is because the Bill has been designed to confer duties on different types of content. Any service with provider pornographic content will need to comply with the Part 5 duties to ensure that children cannot normally encounter such content. If they add user-generated functionality—

Lord Bethell (Con)

I am sorry to come back to the same point, but let us take the Twitter example. As a publisher of pornography, does Twitter then inherit Part 5 responsibilities in as much as it is publishing pornography?

Lord Parkinson of Whitley Bay (Con)

It is covered in the Bill as Twitter. I am not quite sure what my noble friend is asking me. The harms that he is worried about are covered in different ways. Twitter or another social medium that hosts such content would be hosting it, not publishing it, so would be covered by Part 3 in that instance.

Lord Bethell (Con)

Maybe my noble friend the Minister could write to me to clarify that point, because it is quite a significant one.

Lord Parkinson of Whitley Bay (Con)

Perhaps I will speak to the noble Lord afterwards and make sure I have his question right before I do so.

I hope that answers the questions from the noble Baroness, Lady Ritchie, and that on that basis, she will be happy to withdraw her amendment.

Baroness Ritchie of Downpatrick (Lab)

My Lords, this has been a very wide-ranging debate, concentrating not only on the definition of pornography but on the views of noble Lords on how it should be regulated: whether it should all be regulated in the same way, as the noble Baroness, Lady Kidron, the noble Lords, Lord Bethell and Lord Browne, and I myself believe, or whether there should be a graduated response, which seems to be the view of the noble Lords, Lord Allan and Lord Clement-Jones.

I believe that all pornography should be treated the same. There is no graduated response. It is something that is pernicious and leads to unintended consequences for many young people, so it needs to be regulated in all its forms. I think that is the point that the noble Lord, Lord Bethell, was making. I believe that these amendments should have been debated along with those of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, because then we could have an even wider-ranging debate, and I look forward to that in the further groups in the days to come. The focus should be on the content, not on the platform, and the content is about pornography.

I agree with the noble Baroness, Lady Kidron, that porn is not the only harm, and I will be supporting her amendments. I believe that they should be in the Bill because if we are serious about dealing with these issues, they have to be in there.

I do not think my amendments are suggesting that children will be removed from social media. I agree that it is a choice to remove pornography or to age-gate. Twitter is moving to subscriber content anyway, so it can do it; the technology is already available to do that. I believe you just age-gate the porn content, not the whole site. I agree with the noble Lord, Lord Clement-Jones, as I said. These amendments should have been debated in conjunction with those of the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, as I believe that the amendments in this group are complementary to those, and I think I already said that in my original submission.

I found the Minister’s response interesting. Obviously, I would like time to read Hansard. I think certain undertakings were given, but I want to see clearly spelled out where they are and to discuss with colleagues across the House where we take these issues and what we come back with on Report.

I believe that these issues will be debated further in Committee when the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, are debated. I hope that in the intervening period the Minister will have time to reflect on the issues raised today about Parts 3 and 5 and the issue of pornography, and that he will be able to help us in further sessions in assuaging the concerns that we have raised about pornography. There is no doubt that these issues will come back. The only way that they can be dealt with, that pornography can be dealt with and that all our children throughout the UK can be protected is through proper regulation.

I think we all need further reflection. I will see, along with colleagues, whether it is possible to come back on Report. In the meantime, I beg leave to withdraw the amendment.

Amendment 12BA withdrawn.
Amendments 12C to 12E
Moved by
12C: Clause 6, page 5, line 23, at end insert “(2) to (10)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 11 below (because the new duty to summarise children’s risk assessments in the terms of service is only imposed on providers of Category 1 services).
12D: Clause 6, page 5, line 25, at end insert—
“(za) the duty about illegal content risk assessments set out in section 9(8A),
(zb) the duty about children’s risk assessments set out in section 11(10A),”
Member’s explanatory statement
This amendment ensures that the new duties set out in the amendments in the Minister’s name to clauses 9 and 11 below (duties to summarise risk assessments in the terms of service) are imposed on providers of Category 1 services only.
12E: Clause 6, page 5, line 32, at end insert “, and
(f) the duty about record-keeping set out in section 19(8A).”
Member’s explanatory statement
This amendment ensures that the new duty set out in the amendment in the Minister’s name to clause 19 below (duty to supply records of risk assessments to OFCOM) is imposed on providers of Category 1 services only.
Amendments 12C to 12E agreed.
House resumed.
House adjourned at 9.55 pm.

Online Safety Bill

Committee (3rd Day)
11:51
Relevant document: 28th Report from the Delegated Powers Committee
Clause 6: Providers of user-to-user services: duties of care
Amendment 13
Moved by
13: Clause 6, page 5, line 33, after “services” insert “that are not Category 2A services”
Member’s explanatory statement
This amendment is consequential on other amendments in the name of Lord Moylan to remove Clause 23(3) and the subsequent new Clause after 23, the effect of which is that the duties imposed on search services vary depending on whether or not they are Category 2A services: this needs to be reflected in the provision about combined services (regulated user-to-user services that include public search services) in Clause 6.
Lord Moylan (Con)

My Lords, in moving my Amendment 13 I will speak to all the amendments in the group, all of which are in my name with the exception of Amendment 157 in the name of my noble friend Lord Pickles. These are interlinked amendments; they work together. There is effectively only one amendment going on. A noble Lord challenged me a day or two ago as to whether I could summarise in a sentence what the amendment does, and the answer is that I think I can: Clause 23 imposes various duties on search engines, and this amendment would remove one of those duties from search engines that fall into category 2B.

There are two categories of search engines, 2A and 2B, and category 2B is the smaller search engines. We do not know the difference between them in greater detail than that because the schedule that relates to them reserves to the Secretary of State the power to set the thresholds that will define which category a search engine falls into, but I think it is clear that category 2B is the smaller ones.

These amendments pursue a theme that I brought up in Committee earlier in the week when I argued that the Bill would put excessively onerous and unnecessary obligations on smaller businesses. The particular duty that these amendments would take away from smaller search engines is referred to in Clause 23(2):

“A duty, in relation to a service, to take or use proportionate measures relating to the design or operation of the service to effectively mitigate and manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service”.


The purpose of that is to recognise that very large numbers of smaller businesses do not pose a risk, according to the Government’s own assessment of the market, and to allow them to get on with their business without taking these onerous and difficult measures. They are probing amendments to try to find out what the Government are willing to do in relation to smaller businesses that will make this a workable Bill.

I can already imagine that there are noble Lords in the Chamber who will say that small does not equal safe, and that small businesses need to be covered by the same rigorous regulations as larger businesses. But I am not saying that small equals safe. I am saying—as I attempted to say when the Committee met earlier—that absolute safety is not attainable. It is not attainable in the real world, nor can we expect it to be attainable in the online world. I imagine that objection will be made. I see it has some force, but I do not think it has sufficiently compelling force to justify the sort of burden on small businesses that this Bill would impose, and I would like to hear more about it.

I will say one other thing. Those who object to this approach need to be sure in their own minds that they are not contributing to creating a piece of legislation that, when it comes into operation, is so difficult to implement that it becomes discredited. There needs to be a recognition that this has to work in practice. If it does not—if it creates resentment and opposition—we will find the Government not bringing sections of it into force, needing to repeal them or going easy on them once the blowback starts, so to speak. With that, I beg to move.

Baroness Deech (CB)

My Lords, I will speak to Amendment 157 in the name of the noble Lord, Lord Pickles, and others, since the noble Lord is unavoidably absent. It is along the same lines as Amendment 13; it is relatively minor and straightforward, and asks the Government to recognise that search services such as Google are enormously important as an entry to the internet. They are different from social media companies such as Twitter. We ask that the Government be consistent in applying their stated terms when these are breached in respect of harm to users, whether that be through algorithms, through auto-prompts or otherwise.

As noble Lords will be aware, the Bill treats user-to-user services, such as Meta, and search services, such as Google, differently. The so-called third shield or toggle proposed for shielding users from legal but harmful content, should they wish to be shielded, does not apply when it comes to search services, important though they are. Indeed, at present, large, traditional search services, including Google and Microsoft Bing, and voice search assistants, including Alexa and Siri, will be exempted from several of the requirements for large user-to-user services—category 1 companies. Why the discrepancy? Though search services rightly highlight that the content returned by a search is not created or published by them, the algorithmic indexing, promotion and search prompts provided in search bars—the systems they design and employ—are their responsibility, and these have been proven to do harm.

Some of the examples of such harm have already been cited in the other place, but not before this Committee. I do not want to give them too much of an airing because they were in the past, and the search people have taken them down after complaints, but some of the dreadful things that emerge from searching on Google et cetera are a warning of what could occur. It has been pointed out that search engines would in the past have thrown up, for example, swastikas, SS bolts and other Nazi memorabilia when people searched for desk ornaments. If George Soros’s name came up, he would be included in a list of people responsible for world evils. The Bing service, which I dislike anyway, has been directing people—at least, it did in the past—to anti-Semitic and homophobic searches through its auto-complete, while Google’s image carousel highlighted pictures of portable barbecues to those searching for the term “Jewish baby stroller”.

12:00
These search engines, which are larger than some countries in terms of the funds they raise, should be treated in the same way as Meta, Twitter and others, knowing the harm that their systems can cause. The Joint Committee on the draft Bill, and Ministers in meetings with the APPG Against Antisemitism, have been clear that this is an issue and recognised that it needs addressing. I hope the Minister will agree that our amendment, or perhaps one similar to it that the Government might care to introduce in the next stages, would be a small, smart and meaningful technical fix to the Bill in addressing the unnecessary imbalance that allows major search companies to avoid protecting the public to the full extent that we, in the Bill, expect of other large companies. I hope that the Minister will agree to meet interested parties and to do the sensible and right thing about search engines.
Lord Weir of Ballyholme (DUP)

My Lords, I also support Amendment 157, which stands in the name of the noble Lord, Lord Pickles, and others, including my own. As the noble Baroness, Lady Deech, indicated, it is specific in the nature of what it concentrates on. The greatest concern that arises through the amendment is with reference to category 2A. It is not necessarily incompatible with what the noble Lord, Lord Moylan, proposes; I do not intend to make any direct further comment on his amendments. While the amendment is specific, it has a resonance with some of the other issues raised on the Bill.

I am sure that everyone within this Committee would want to have a Bill that is as fit for purpose as possible. The Bill was given widespread support at Second Reading, so there is a determination across the Chamber to have that. Where we can make improvements to the Bill, we should do that and, as much as possible, try to future-proof the Bill. The wider resonance is the concern that if the Bill is to be successful, we need as much consistency and clarity within it as possible, particularly for users. Where we have a level of false dichotomy of regulations, that runs contrary to the intended purposes of the Bill and creates inadvertent opportunities for loopholes. As such, and as has been indicated, the concern is that in the Bill at present, major search engines are effectively treated in some of the regulations on a different basis from user-to-user services. For example, some of the provisions around risk assessment, the third shield and the empowerment tools are different.

As also indicated, we are not talking about some of the minor search engines. We are talking about some of the largest companies in the world, be it Google, Microsoft through Bing, Amazon through its devices or Apple through its Siri voice tool, so it is reasonable that they are brought into line with what is there is for face-to-face users. The amendment is therefore appropriate and the rationale for it is that there is a real-world danger. Mention has been made—we do not want to dwell too long on some of the examples, but I will use just one—of the realms of anti-Semitism, where I have a particular interest. For example, on search tools, a while ago there was a prompt within one search engine that Jews are evil. It was found that when that prompt was there, searches of that nature increased by 10% and when it was removed, they were reduced. It is quite fixable and it goes into a wide range of areas.

One of the ways in which technology has changed, I think for us all, is the danger that it can be abused by people who seek to radicalise others and make them extreme, particularly young children. Gone are the days when some of these extremists or terrorists were lonely individuals in an attic, with no real contact with the outside world, or hanging around occasionally in the high street while handing out poorly produced A4 papers with their hateful ideology. There is a global interconnection here and, in particular, search engines and user-to-user services can be used to try to draw young people into their nefarious activities.

I mentioned the example of extremism and radicalisation when it comes to anti-Semitism. I have seen it from my own part of the world, where there is at times an attempt by those who still see violence as the way forward in Northern Ireland to draw new generations of young people into extremist ideology and terrorist acts. There is an attempt to lure in young people and, sadly, search engines have a role within that, which is why we need to see that level of protection. Now, the argument from search engines is that they should have some level of exemptions. How can they be held responsible for everything that appears through their searches, or indeed through the web? But in terms of content, the same argument could be used for user-to-user services. It is right, as the proposer of this amendment has indicated, that there are things such as algorithmic indexing and search prompts where they do have a level of control.

The use of algorithms has moved on considerably since my schooldays, as they surely have for everyone in this Committee, and I suspect that none of us felt that they would be used in such a fashion. We need a level of protection through an amendment such as this and, as its proposers, we are not doctrinaire on the precise form in which this should take place. We look, for example, at the provisions within Clause 11—we seek to hear what the Government have to say on that—which could potentially be used to regulate search engines. Ensuring that that power is given, and will be used by Ofcom, will go a long way to addressing many of the concerns.

I think all of us in this Committee are keen to work together to find the right solutions, but we feel that there is a need to make some level of change to the regulations that are required for search engines. None of us in this Committee believes that we will ultimately have a piece of legislation that reflects perfection, but there is a solemn duty on us all to produce legislation that is as fit for purpose and future-proofed as possible, while providing children in particular with the maximum protection in what is at times an ever-changing and sometimes very frightening world.

Baroness Kidron (CB)

My Lords, I agree in part with the noble Lord, Lord Moylan. I was the person who said that small was not safe, and I still feel that. I certainly do not think that anything in the Bill will make the world online 100% safe, and I think that very few noble Lords do, so it is important to say that. When we talk about creating a high bar or having zero tolerance, we are talking about ensuring that there is a ladder within the Bill so that the most extreme cases have the greatest force of law trying to attack them. I agree with the noble Lord on that.

I also absolutely agree with the noble Lord about implementation: if it is too complex and difficult, it will be unused and exploited in certain ways, and it will have a bad reputation. The only part of his amendment that I do not agree with is that we should look at size. Through the process of Committee, if we can look at risk rather than size, we will get somewhere. I share his impatience—or his inquiry—about what categories 2A and 2B mean. If category 2A means the most risky and category 2B means those that are less risky, I am with him all the way. We need to look into the definition of what they mean.

Finally, I mentioned several times on Tuesday that we need to look carefully at Ofcom’s risk profiles. Is this the answer to dealing with where risk gets determined, rather than size?

Baroness Harding of Winscombe (Con)

My Lords, I rise to speak along similar lines to the noble Baroness, Lady Kidron. I will address my noble friend Lord Moylan’s comments. I share his concern that we must not make the perfect the enemy of the good but, like the noble Baroness, I do not think that size is the key issue here, because of how tech businesses grow. Building a tech business is rather like building a skyscraper: if you get the foundations wrong, it is almost impossible to change how safe the building is as it goes up and up. As I said earlier this week, small tech businesses can become big very quickly, and, if you design your small tech business with the risks to children in mind at the very beginning, there is a much greater chance that your skyscraper will not wobble as it gets taller. On the other hand, if your small business begins by not taking children into account at all, it is almost impossible to address the problem once it is huge. I fear that this is the problem we face with today’s social media companies.

The noble Baroness, Lady Kidron, hit the nail on the head, as she so often does, in saying that we need to think about risk, rather than size, as the means of differentiating the proportionate response. In Clause 23, which my noble friend seeks to amend, the important phrase is “use proportionate measures” in subsection (2). Provided that we start with a risk assessment and companies are then under the obligation to make proportionate adjustments, that is how you build safe technology companies—it is just like how you build safe buildings.

Lord Bethell (Con)

My Lords, I will build on my noble friend’s comments. We have what I call the Andrew Tate problem. That famous pornographer and disreputable character started a business in a shed in Romania with a dozen employees. By most people’s assessment, it would have been considered a small business but, through his content of pornography and the physical assault of women, he extremely quickly built something that served an estimated 3 billion pages, and it has had a huge impact on the children of the English-speaking world. A small business became a big, nasty business very quickly. That anecdote reinforces the point that small does not mean safe, and, although I agree with many of my noble friend’s points, the lens of size is perhaps not the right one to look through.

Lord Allan of Hallam (LD)

My Lords, I did not want to interrupt the noble Lord, Lord Moylan, in full flow as he introduced the amendments, but I believe he made an error in terms of the categorisation. The error is entirely rational, because he took the logical position rather than the one in the Bill. It is a helpful error because it allows us to quiz the Minister on the rationale for the categorisation scheme.

As I read it, in Clause 86, the categories are: category 1, which is large user-to-user services; category 2A, which is search or combined services; and category 2B, which is small user-to-user services. To my boring and logical binary brain, I would expect it to be: “1A: large user-to-user”; “1B: small user-to-user”; “2A: large search”; and “2B: small search”. I am curious about why a scheme like that was not adopted and we have ended up with something quite complicated. It is not only that: we now have this Part 3/Part 5 thing. I feel that we will be confused for years to come: we will be deciding whether something is a Part 3 2B service or a Part 5 service, and we will end up with a soup of numbers and letters that do not conform to any normal, rational approach to the world.

12:15
I am sure that a rationale does underlie that—the people who wrote the legislation will of course have come up with the schema for a reason—but it is important to push on that, because we want our legislation to be intelligible to people out there. Again, it would be entirely logical to have a schema that says, “1A: large user-to-user; 1B: small user-to-user; 2A: large search; 2B: small search; and 3: pornographic”. If the noble Baroness, Lady Kidron, has her way and we add extra services, we could make them categories 4 and 5, and we could have categories 4A and 4B.

Again, I hope that the Minister can take this opportunity to respond on the substance of whether there should be different requirements and to explain why we have that categorisation, where category 2B is small user-to-user services, category 1 is big user-to-user services and category 2A is search and combined services. That would probably not be the first assumption of most people in the House, and it has been bugging me since I first read the Bill, so it would be nice to get an answer today.

Baroness Merron (Lab)

My Lords, I welcome this debate, which revisits some of the areas discussed in earlier debates about the scope of the Bill, as many noble Lords said. It allows your Lordships’ House to consider what has to be the primary driver for assessment. In my view and as others said, it ought to be about risk, which has to be the absolute driver in all this. As the noble Baroness, Lady Harding, said, businesses do not remain static: they start at a certain size and then change. Of course, we hope that many of the businesses we are talking about will grow, so this is about preparation for growth and the reality of doing business.

As we discussed, there certainly are cases where search providers may, by their very nature, be almost immune from presenting users with content that could be considered either harmful or illegal under this legislative framework. The new clause proposed by the noble Lord, Lord Moylan—I am grateful to him for allowing us to explore these matters—and its various consequential amendments, would limit the duty to prevent access to illegal content to core category 2A search providers, rather than all search providers, as is currently the case under Clause 23(3).

The argument that I believe the noble Lord, Lord Moylan, put forward is that the illegal content duty is unduly wide, placing a disproportionate and otherwise unacceptable burden on smaller and/or supposedly safer search providers. He clearly said he was not saying that small was safe—that is now completely understood—but he also said that absolute safety is not achievable. As the noble Baroness, Lady Kidron, said, that is indeed so. If this legislation is too complex and creates the wrong provisions, we will clearly be a long way away from our ambition, which here has to be to have in place the best legislative framework, one that everyone can work with and that provides the maximum opportunity for safety and what we all seek to achieve.

Of course, the flip side of the argument about an unacceptable burden on smaller, or on supposedly safer, search providers may be that they would in fact have very little work to do to comply with the illegal content duty, at least in the short term. But the duty would act as an important safeguard, should the provider’s usual systems prove ineffective with the passage of time. Again, that point was emphasised in this and the previous debate by the noble Baroness, Lady Harding.

We look forward to the Minister’s response to find out which view he and his department subscribe to or, indeed, whether they have another view they can bring to your Lordships’ House. But, on the face of it, the current arrangements do not appear unacceptably onerous.

Amendment 157 in the name of the noble Lord, Lord Pickles, and introduced by the noble Baroness, Lady Deech, deals with search through a different approach, by inserting requirements about search services’ publicly available statements into Clause 65. In the debate, the noble Baroness and the noble Lord, Lord Weir, raised very important, realistic examples of where search engines can take us, including to material that encourages racism directed at Jews and other groups and encourages hatred of various groups, including Jews. The amendment talks about issues such as the changing of algorithms or the hiding of content and the need to ensure that the terms of providers’ publicly available statements are applied consistently.

I look forward to hearing from the Minister in response to Amendment 157, as it certainly moves us beyond questions of scope and towards discussion of the conduct of platforms when harm is identified.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I must first apologise for my slightly dishevelled appearance as I managed to spill coffee down my shirt on my way to the Chamber. I apologise for that—as the fumes from the dried coffee suffuse the air around me. It will certainly keep me caffeinated for the day ahead.

Search services play a critical role in users’ online experience, allowing them easily to find and access a broad range of information online. Their gateway function, as we have discussed previously, means that they also play an important role in keeping users safe online because they have significant influence over the content people encounter. The Bill therefore imposes stringent requirements on search services to tackle the risks from illegal content and to protect children.

Amendments 13, 15, 66 to 69 and 73 tabled by my noble friend Lord Moylan seek to narrow the scope of the Bill so that its search safety duties apply only to the largest search services—categorised in the Bill as category 2A services—rather than to all search services. Narrowing the scope in this way would have an adverse impact on the safety of people using search services, including children. Search services, including combined services, below the category 2A threshold would no longer have a duty to minimise the risk of users encountering illegal content or children encountering harmful content in or via search results. This would increase the likelihood of users, including children, accessing illegal content and children accessing harmful content through these services.

The Bill already takes a targeted approach and the duties on search services will be proportionate to the risk of harm and the capacity of companies. This means that services which are smaller and lower-risk will have a lighter regulatory burden than those which are larger and higher-risk. All search services will be required to conduct regular illegal content risk assessments and, where relevant, children’s risk assessments, and then implement proportionate mitigations to protect users, including children. Ofcom will set out in its codes of practice specific steps search services can take to ensure compliance and must ensure that these are proportionate to the size and capacity of the service.

The noble Baroness, Lady Kidron, and my noble friend Lady Harding of Winscombe asked how search services should conduct their risk assessments. Regulated search services will have a duty to conduct regular illegal content risk assessments, and where a service is likely to be accessed by children it will have a duty to conduct regular children’s risk assessments, as I say. They will be required to assess the level and nature of the risk of individuals encountering illegal content on their service, to implement proportionate mitigations to protect people from illegal content, and to monitor them for effectiveness. Services likely to be accessed by children will also be required to assess the nature and level of risk of their service specifically for children to identify and implement proportionate mitigations to keep children safe, and to monitor them for effectiveness as well.

Companies will also need to assess how the design and operation of the service may increase or reduce the risks identified and Ofcom will have a duty to issue guidance to assist providers in carrying out their risk assessments. That will ensure that providers have, for instance, sufficient clarity about what an appropriate risk assessment looks like for their type of service.

The noble Lord, Lord Allan, and others asked about definitions and I congratulate noble Lords on avoiding the obvious

“To be, or not to be”


pun in the debate we have just had. The noble Lord, Lord Allan, is right in the definition he set out. On the rationale for it, it is simply that we have designated as category 1 the largest and riskiest services and as category 2 the smaller and less risky ones, splitting them between 2A, search services, and 2B, user-to-user services. We think that is a clear framework. The definitions are set out a bit more in the Explanatory Notes but that is the rationale.

Lord Allan of Hallam (LD)

I am grateful to the Minister for that clarification. I take it then that the Government’s working assumption is that all search services, including the biggest ones, are by definition less risky than the larger user-to-user services. It is just a clarification that that is their thinking that has informed this.

Lord Parkinson of Whitley Bay (Con)

As I said, the largest and riskiest sites may involve some which have search functions, so the test of large and most risky applies. Smaller and less risky search services are captured in category 2A.

Amendment 157 in the name of my noble friend Lord Pickles, and spoken to by the noble Baroness, Lady Deech, seeks to apply new duties on the largest search services. I agree with the objectives in my noble friend’s amendment of increasing transparency about the search services’ operations and enabling users to hold them to account. It is not, however, an amendment I can accept because it would duplicate existing duties while imposing new duties which we do not think are appropriate for search services.

As I say, the Bill will already require search services to set out how they are fulfilling their illegal content and child safety duties in publicly available statements. The largest search services—category 2A—will also be obliged to publish a summary of their risk assessments and to share this with Ofcom. That will ensure that users know what to expect on those search services. In addition, they will be subject to the Bill’s requirements relating to user reporting and redress. These will ensure that search services put in place effective and accessible mechanisms for users to report illegal content and content which is harmful to children.

My noble friend’s amendment would ensure that the requirements to comply with its publicly available statements applied to all actions taken by a search service to prevent harm, not just those relating to illegal content and child safety. This would be a significant expansion of the duties, resulting in Ofcom overseeing how search services treat legal content which is accessed by adults. That runs counter to the Government’s stated desire to avoid labelling legal content which is accessed by adults as harmful. It is for adult users themselves to determine what legal content they consider harmful. It is not for us to put in place measures which could limit their access to legal content, however distasteful. That is not to say, of course, that where material becomes illegal in its nature we do not share the determination of the noble Baroness, my noble friend and others to make sure that it is properly tackled. The Secretary of State and Ministers have had extensive meetings with groups making representations on this point and I am very happy to continue speaking to my noble friend, the noble Baroness and others if they would welcome it.

I hope that that provides enough reassurance for the amendment to be withdrawn at this stage.

12:30
Lord Moylan (Con)

My Lords, I am grateful to all noble Lords who have spoken in this debate. I hope that the noble Baroness, Lady Deech, and the noble Lord, Lord Weir of Ballyholme, will forgive me if I do not comment on the amendment they spoke to in the name of my noble friend Lord Pickles, except to say that of course they made their case very well.

I will briefly comment on the remarks of the noble Baroness, Lady Kidron. I am glad to see a degree of common ground among us in terms of definitions and so forth—a small piece of common ground that we could perhaps expand in the course of the many days we are going to be locked up together in your Lordships’ House.

I am grateful too to the noble Lord, Lord Allan of Hallam. I am less clear on “2B or not 2B”, if that is the correct way of referring to this conundrum, than I was before. The noble Baroness, Lady Kidron, said that size does not matter and that it is all about risk, but my noble friend the Minister cunningly conflated the two and said at various points “the largest” and “the riskiest”. I do not see why the largest are necessarily the riskiest. On the whole, if I go to Marks & Spencer as opposed to going to a corner shop, I might expect rather less risk. I do not see why the two run together.

I address the question of size in my amendment because that is what the Bill focuses on. I gather that the noble Baroness, Lady Kidron, may want to explore at some stage in Committee why that is the case and whether a risk threshold might be better than a size threshold. If she does that, I will be very interested in following and maybe even contributing to that debate. However, at the moment, I do not think that any of us is terribly satisfied with conflating the two—that is the least satisfactory way of explaining and justifying the structure of the Bill.

On the remarks of my noble friend Lady Harding of Winscombe, I do not want in the slightest to sound as if there is any significant disagreement between us—but there is. She suggested that I was opening the way to businesses building business models “not taking children into account at all”. My amendment is much more modest than that. There are two ways of dealing with harm in any aspect of life. One is to wait for it to arrive and then to address it as it arises; the other is constantly to look out for it in advance and to try to prevent it arising. The amendment would leave fully in place the obligation to remove harm, which is priority illegal content or other illegal content, that the provider knows about, having been alerted to it by another person or become aware of it in any other way. That duty would remain. The duty that is removed, especially from small businesses—and really this is quite important—is the obligation constantly to be looking out for harm, because it involves a very large, and I suggest possibly ruinous, commitment to constant monitoring of what appears on a search engine. That is potentially prohibitive, and it arises in other contexts in the Bill as well.

There should be occasions when we can say that knowing that harmful stuff will be removed as soon as it appears, or very quickly afterwards, is adequate for our purposes, without requiring firms to go through a constant monitoring or risk-assessment process. The risk assessment would have to be adjudicated by Ofcom, I gather. Even if no risk was found, of course, that would not be the end of the matter, because I am sure that Ofcom would, very sensibly, require that application to be renewed annually, or after a certain period, to make sure that things had not changed. So even to escape the burden is quite a large burden for small businesses, and then to implement the burden is so onerous that it could be ruinous, whereas taking stuff down when it appears is much easier to do.

Baroness Harding of Winscombe (Con)

Perhaps I might briefly come in. My noble friend Lord Moylan may have helped explain why we disagree: our definition of harm is very different. I am most concerned that we address the cumulative harms that online services, both user-to-user services and search, are capable of inflicting. That requires us to focus on the design of the service, which we need to do at the beginning, rather than the simpler harm that my noble friend is addressing, which is specific harmful content—not in the sense in which “content” is used in the Bill but “content” as in common parlance; that is, a piece of visual or audio content. My noble friend makes the valid point that that is the simplest way to focus on removing specific pieces of video or text; I am more concerned that we should not exclude small businesses from designing and developing their services such that they do not consider the broader set of harms that are possible and that add up to the cumulative harm that we see our children suffering from today.

So I think our reason for disagreement is that we are focusing on a different harm, rather than that we violently disagree. I agree with my noble friend that I do not want complex bureaucratic processes imposed on small businesses; they need to design their services when they are small, which makes it simpler and easier for them to monitor harm as they grow, rather than waiting until they have grown. That is because the backwards re-engineering of a technology stack is nigh-on impossible.

Lord Moylan (Con)

My noble friend makes a very interesting point, and there is much to ponder in it—too much to ponder for me to respond to it immediately. Since I am confident that the issue is going to arise again during our sitting in Committee, I shall allow myself the time to reflect on it and come back later.

While I understand my noble friend’s concern about children, the clause that I propose to remove is not specific to children; it relates to individuals, so it covers adults as well. I think I understand what my noble friend is trying to achieve—I shall reflect on it—but this Bill and the clauses we are discussing are a very blunt way of going at it and probably need more refinement even than the amendments we have seen tabled so far. But that is for her to consider.

I think this debate has been very valuable. I did not mention it, but I am grateful also for the contribution from the noble Baroness, Lady Merron. I beg leave to withdraw the amendment.

Amendment 13 withdrawn.
Baroness Deech (CB)

Having listened to the Minister, I think we need clarification on the issue of duplication and what is illegal as opposed to just harmful. If we can clarify that, I shall not move my Amendment 157.

The Deputy Chairman of Committees (Lord Beith) (LD)

When we come to Amendment 157, that will be noted.

Amendments 13A to 13C

Moved by
13A: Clause 6, page 5, line 35, after “service” insert “is not a Category 2A service and”
Member’s explanatory statement
This technical amendment ensures that the duties imposed on providers of combined services in relation to the search engine are correct following the changes to clause 20 arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only.
13B: Clause 6, page 5, line 37, after “service” insert “is not a Category 2A service and”
Member’s explanatory statement
This technical amendment ensures that the duties imposed on providers of combined services in relation to the search engine are correct following the changes to clause 20 arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only.
13C: Clause 6, page 5, line 38, at end insert—
“(c) if the service is a Category 2A service not likely to be accessed by children, the duties set out in Chapter 3 referred to in section 20(2) and (3A);
(d) if the service is a Category 2A service likely to be accessed by children, the duties set out in Chapter 3 referred to in section 20(2), (3) and (3A).”
Member’s explanatory statement
This amendment ensures that the new duties set out in the amendments in the Minister’s name to clauses 23, 25 and 29 below (duties to summarise risk assessments in a publicly available statement and to supply records of risk assessments to OFCOM) are imposed on providers of combined services that are Category 2A services in relation to the search engine.
Amendments 13A to 13C agreed.
Amendment 14
Moved by
14: Clause 6, page 5, line 38, at end insert—
“(6A) Providers of regulated user-to-user services are required to comply with duties under subsections (2) to (6) for each such service which they provide to the extent that is proportionate and technically feasible without making fundamental changes to the nature of the service (for example, by removing or weakening end-to-end encryption on an end-to-end encrypted service).”
Member’s explanatory statement
This amendment is part of a series of amendments by Lord Clement-Jones intended to ensure risk assessments are not used as a tool to undermine users’ privacy and security.
Lord Allan of Hallam (LD)

My Lords, I propose Amendment 14 on behalf of my noble friend Lord Clement-Jones and the noble Lord, Lord Hunt of Kings Heath, who are not able to be present today due to prior commitments. I notice that the amendment has been signed also by the noble Baroness, Lady Fox, who I am sure will speak to it herself. I shall speak to the group of amendments as a whole.

I shall need to speak at some length to this group, as it covers some quite complex issues, even for this Bill, but I hope that the Committee will agree that this is appropriate given the amendments’ importance. I also expect that this is one area where noble Lords are receiving the most lobbying from different directions, so we should do it justice in our Committee.

We should start with a short summary of the concern that lies behind the amendments: that the Bill, as drafted, particularly under Clause 110, grants Ofcom the power to issue technical notices to online services that could, either explicitly or implicitly, require them to remove privacy protections—and, in particular, that this could undermine a technology that is increasingly being deployed on private messaging services called end-to-end encryption. The amendments in this group use various mechanisms to reduce the likelihood of that being an outcome. Amendments 14 and 108 seek to make it clear in the Bill that end-to-end encryption would be out of scope—and, as I understand it, Amendment 205, tabled by the noble Lord, Lord Moylan, seeks to do something similar.

A second set of amendments would add extra controls over the issuing of technical notices. While not explicitly saying that these could not target E2EE—if noble Lords will excuse the double negative—they would make it less likely by ensuring that there is more scrutiny. They include a whole series of amendments—Amendments 202 and 206, tabled by the noble Lord, Lord Stevenson, and Amendment 207—that would ensure more scrutiny of, and more input into, the issuing of such a notice.

The third set of amendments aims to ensure that Ofcom gives weight to privacy more generally in all the actions it takes. In particular, Amendment 190 talks about a broader privacy duty, and Amendment 285—which I think the noble Lord, Lord Moylan, will be excited about—seeks to restrict general monitoring.

I will now dig into why this is important. Put simply, there is a risk that under the Bill a range of internet services will feel that they are unable to offer their products in the UK. This speaks to a larger question as we debate the measures in the Bill: it can sometimes feel as though we are comfortable ratcheting up the requirements under the assumption that services will have no choice but to meet them and carry on. While online services will not have a choice about complying if they wish to be lawfully present in the UK, they will be free to exit the market altogether if they believe that the requirements are excessively onerous or impossible to meet.

In the Bill, we are constructing, in effect, a licensing mechanism, where Ofcom will contact in-scope services—the category 2A, category 2B, Part 3 and Part 5 services we discussed in relation to the previous group of amendments—will order them to follow all the relevant regulation and guidance and will instruct them to pay a fee for that supervision. We have to consider that some services, on receipt of that notice, will take steps to restrict access by people in the UK rather than agree to such a licence. Where those are rogue services, this reaction is consistent with the aims of the Bill. We do not want services which are careless about online safety to be present in the UK market. But I do not believe that it is our aim to force mainstream services out of the UK market and, if there is a chance of that happening, it should give us pause for thought.

As a general rule, I am not given to apocalyptic warnings, but I believe there is a real risk that some of the concerns that noble Lords will be receiving in their inboxes are genuine, so I want to unpick why that may be the case. We should reflect for a moment on the assumptions we may have about the people involved in this debate and their motivations. We often see tech people characterised as oblivious to harms, and security services people as uncaring about human rights. In my experience, both caricatures are off the mark, as tech people hate to see their services abused and security service representatives understand that they need to be careful about how they exercise the great powers we have given them. We should note that, much of the time, those two communities work well together in spaces such as the Global Internet Forum to Counter Terrorism.

If this characterisation is accurate, why do I think we may have a breakdown over the specific technology of end-to-end encryption? To understand this subject, we need to spend a few moments looking at trends in technology and regulation over recent years. First, we can look at the growth of content-scanning tools, which I think may have been in the Government’s mind when they framed and drafted the new Clause 110 notices. As social media services developed, they had to consider the risks of hosting content on the services that users had uploaded. That content could be illegal in all sorts of ways, including serious forms, such as child sexual abuse material and terrorist threats, as well as things such as copyright infringement, defamatory remarks and so on. Platforms have strong incentives to keep that material off their servers for both moral and legal reasons, so they began to develop and deploy a range of tools to identify and remove it. As a minimum, most large platforms now deploy systems to capture child sexual abuse material and copyright-infringing material, using technologies such as PhotoDNA and Audible Magic.

12:45
I stress again that, in the context of our debate on these amendments, a key element in the rationale for deploying these tools voluntarily—not because they are required to do so by law—is the fact that social media services are acting as hosts for content on their servers, so they feel partially liable for it; in fact, in legal terms, they may well be strictly liable for it. By contrast, modern private messaging services tend to have quite a different architecture, where the provider does not host content on its servers but simply moves it from one device on the network to another. There are some exceptions to that with legacy services, such as Facebook Messenger and the caching of large files—we could go into that subject, if noble Lords are interested. But the key point is that there has been a trend towards more functionality at the edge—namely, on the device in your pocket—as we move from classic social media, which depended on servers, to messaging. That distinction is critical when we consider what is commonly referred to as client-side scanning. The scanning that takes place today generally takes place on platform servers on content they are hosting themselves. The introduction of scanning on to people’s own devices is a different beast in technical, legal and ethical terms; I am sure we will want to tease that out in the debate.
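To make the distinction concrete, the following is a minimal sketch of the kind of hash-list matching a hosting platform might run on its own servers. It is illustrative only: the hash value and file name are hypothetical placeholders, and real systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding rather than the exact cryptographic hash used here. Client-side scanning would mean running this same kind of check on the user's own device rather than on the platform's servers.

# Illustrative sketch only: matching uploaded content against a list of
# known-bad hashes, as a hosting platform might do on its own servers.
# Real deployments (e.g. PhotoDNA) use perceptual hashing that tolerates
# re-encoding; SHA-256 here is a deliberate simplification.
import hashlib
from pathlib import Path

# Hypothetical hash list, of the kind supplied by a child-protection
# clearing house. The value below is a placeholder, not a real entry.
KNOWN_BAD_HASHES = {
    "3f6bcac27a318fd1c9e2a1f1f2b9a6d0e5c4b3a2918273645546372819aabbcc",
}

def hash_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_upload(path: Path) -> bool:
    """True if the uploaded file matches an entry on the known-bad list."""
    return hash_file(path) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    upload = Path("example_upload.jpg")  # hypothetical uploaded file
    if upload.exists() and scan_upload(upload):
        print("Match found: block the upload and report it.")
    else:
        print("No match against the known-bad hash list.")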
The second trend we have seen is the concern over government surveillance. Back in the day, we may have been comfortable with the security services having a desk in the telephone exchange or asking their mate Bob, who does the filing at some company, to pass them information about a dodgy character—but the landscape has shifted. The Snowden revelations triggered a huge debate about the reach of Governments into our online lives—even those whom we think are on our side, such as the UK Government or the US Government—and we are increasingly concerned about foreign surveillance at home, to the extent that we are willing to spend a fortune pulling Huawei devices out of core UK telecom networks to mitigate the risk of Chinese government access. If you think that a foreign Government have gained access to the UK’s telecom networks, using an end-to-end encrypted service is one of the best ways to protect yourself, which, I am sure, is on the minds of the technical staff of UK political parties when they choose to put their teams on encrypted apps such as WhatsApp.
Thirdly, there is a general trend in privacy expectations and legislation, which are all heading in one direction: improving transparency over what is being done with data and giving people more power to withhold or grant consent. This reflects the fact that more of our lives are moving online, so being able to control it becomes more critical to us all. We see this trend playing out in multiple pieces of legislation, such as the general data protection regulation and the privacy regulation, as well as in actions taken by regulators to step up enforcement.
Far from being an irrational move by platforms careless as to its negative impacts, the adoption of end-to-end encryption is an entirely rational response to these three powerful regulatory and societal trends. It can help to mitigate the ever-increasing risks related to content liability—which the Bill, in fact, adds to—it makes hostile government surveillance much harder, and it is a key safeguard against privacy violations.
If this is where we have been with regulation incentivising the adoption of end-to-end encryption, how might this play out as we introduce a new element in the mix with the Online Safety Bill? I can see three scenarios that could play out as the Bill comes into force and Ofcom gains powers to issue directions to platforms. First, the Government could declare that their intent is to impose technical requirements that would mean that people in the UK will no longer be able to use truly secure end-to-end encrypted products. That could be either through explicit instructions to messaging service providers to remove the end-to-end encryption, or through requiring client-side scanning to be installed on user devices in the UK, which would, in effect, render them less secure. That is not my preferred option, but it would at least allow for an orderly transition, if services choose to withdraw products from the UK market rather than operate here on these terms. It might be that there are no significant withdrawals, and the UK Government could congratulate themselves on calling the companies’ bluff and getting what they want at little cost, but I doubt that this would be the case, given the strength of feeling out there—which, I am sure, we have all seen. We would at least want to know, one way or the other, which way that will go before adopting that course of action.
The second course is for the Government to continue with the posture of intentional ambiguity, as they have done to date. They are careful to say that they have no intention of banning end-to-end encryption, and I expect to hear that again in the Minister’s response today, but at the same time refuse to confirm that they could not do so under the new powers in the Bill. This creates a high-stakes game of chicken, where the Government think companies will give them more if they hold the threat of drastic technical orders over them. That “more” might include providing more metadata—who messaged whom, when and from where—or tools to identify patterns of suspicious behaviour without reading message content. These are all things we can expect the Government to be discussing with companies, as well as urging them to deploy forms of client-side scanning voluntarily.
As a veteran of a thousand psychic wars of this kind, I have to say that I do not think it is as productive a way to proceed as some in government may believe. It is all too common to have meetings with government representatives where you are working together on responding to terrorist content only to find a Minister going out the next day to say that your platform does not care about terrorism. I get it; this is politics. However, it is hard to explain to engineers who you are asking to go the extra mile to build new safety tools why they should do so when the Government who asked for the tools give them no credit for this. I understand the appeal from the government side of going into a negotiation with a big regulatory stick that you can show to the other side, but I think it is misguided.
The Government’s hope is that companies will blink first in the game of chicken and give them what they want, but it is at least as likely that the Government will blink first and have to abandon proposals, which risks discrediting their efforts as a whole. If nobody blinks, and we allow an unstoppable force to hit an immovable object, we could end up with the complete breakdown of key relationships and years of unproductive litigation. I believe that the interests of people in the UK lie in government being able to work with the services that millions of us use to find the best ways to combat harms—harms that everybody, on both sides, agree are a priority.
That brings me to my third and final scenario, and the one that these amendments are seeking to create. This is where the Government accept that end-to-end encrypted communication services are a legitimate part of the modern online environment that should not be undermined or pushed out of the UK market. The Government would explicitly rule out any intention to use orders under Clause 110 to weaken end-to-end encrypted services and instead focus their efforts on making it clear to people that end-to-end encryption does not mean impunity.
I was talking to my children as I came in about the fact that end-to-end encryption is not entirely secure and does not grant absolute privacy, and they said, “Of course—everyone should do the online safety classes we do at school”. These offer the simple message that it is foolish to send things over any internet service that you would not want to be shared widely, and the training tells you that any message can be screenshotted and passed around. Rather than talking up the fact that end-to-end encryption is protecting people sharing bad content, we should be talking up the ways in which you remain exposed.
Sadly, we have become used to reading stories about awful content being shared in groups on messaging services used by serving police officers—these were WhatsApp end-to-end encrypted messages. If there is a legitimate interest in investigating content, it will come to light, whether or not it is shared on an encrypted service. Unless people are communicating only with themselves, there are multiple ways that their content, if illegal, might come to the attention of the authorities. The most obvious is that someone who is privy to the content hands it over, either voluntarily or because they are themselves under investigation. But the police and security services also have a range of intrusive surveillance tools at their disposal which can compromise the devices of their targets under properly warranted authority, and all the content on any apps they use can properly be provided to the security services under the controls in the Regulation of Investigatory Powers Act. There are long-standing powers, sometimes used controversially, to require people to grant access to their own devices if there are grounds to think it is necessary to investigate some types of offence.
I hope the Government will give serious consideration to moving in this direction and to accepting the force of the amendments that have been put forward today. This is not about weakening the fight against appalling harms such as child sexual abuse material and terrorism, but rather about finding the most practical way to wage that fight in a world where end-to-end encryption exists and is being used to mitigate other material risks to our online lives. I beg to move.
Baroness Kidron (CB)

My Lords, I support Amendment 190 in the name of the noble Lord, Lord Clement-Jones, and Amendment 285 in the name of the noble Lord, Lord Stevenson. That is not to say that I do not have a great deal of sympathy for the incredibly detailed and expert speech we have just heard, but I want to say just a couple of things.

First, I think we need to have a new conversation about privacy in general. The privacy that is imagined by one community is between the state and the individual, and the privacy that we do not have is between individuals and the commercial companies. We live in a 3D world and the argument remains 2D. We cannot do that today, but I agree with the noble Lord that many in the enforcement community do have one hand on human rights, and many in the tech world do care about human rights. However, I do not believe that the tech sector has fully fessed up to its role and the contribution it could make around privacy. I hope that, as part of the debate on the Bill, and the debate that we will have subsequently on the data Bill No. 2, we come to untangle some of the things that they defend—in my view, unnecessarily and unfairly.

Lord Allan of Hallam (LD)

I point out that one of the benefits of end-to-end encryption is that it precisely stops companies doing things such as targeted advertising based on the content of people’s communications. Again, I think there is a very strong and correct trend to push companies in that direction.

Baroness Kidron (CB)

I thank the noble Lord for the intervention. For those noble Lords who are not following the numbers, Amendment 285, which I support, would prevent general monitoring. Apart from anything else, I am worried about equivalence and other issues in relation to general monitoring. Apart from a principled position against it, I think to be explicit is helpful.

Ofcom needs to be very careful, and that is what Amendment 190 sets out. It asks whether the alternatives have been thought about, whether the conditions have been thought about, and whether the potential impact has been thought about. That series of questions is essential. I am probably closer to the community that wants to see more powers and more interventions, but I would like that to be in a very monitored and regulated form.

I thank the noble Lord for his contribution. Some of these amendments must be supported because it is worrying for us as a country to have—what did the noble Lord call it?—ambiguity about whether something is possible. I do not think that is a useful ambiguity.

Baroness Bennett of Manor Castle (GP)

My Lords, my name is attached to Amendment 203 in this group, along with those of the noble Lords, Lord Clement-Jones, Lord Strathcarron and Lord Moylan. I shall speak in general terms about the nature of the group, because it is most usefully addressed through the fundamental issues that arise. I sincerely thank the noble Lord, Lord Allan, for his careful and comprehensive introduction to the group, which gave us a strong foundation. I have crossed out large amounts of what I had written down and will try not to repeat, but rather pick up some points and angles that I think need to be raised.

As was alluded to by the noble Baroness, Lady Kidron, this debate and the range of these amendments shows that the Bill is currently extremely deficient and unclear in this area. It falls to this Committee to get some clarity and cut-through to see where we could end up and change where we are now.

I start by referring to a briefing, which I am sure many noble Lords have received, from a wide range of organisations, including Liberty, Big Brother Watch, the Open Rights Group, Article 19, the Electronic Frontier Foundation, Reset and Fair Vote. It is quite a range of organisations but very much in the human rights space, particularly the digital human rights space. The introduction of the briefing includes a sentence that gets to the heart of why many of us have received so many emails about this element of the Bill:

“None of us want to feel as though someone is looking over our shoulder when we are communicating”.

13:00
I take the point made by the noble Baroness, Lady Kidron, that many of our communications are scanned and this has an impact, but as the noble Lord, Lord Allan, said, end-to-end encryption prevents this. There is increasing public awareness and understanding of that, and a desire for privacy from the big tech companies whose services the public uses and clearly wishes to continue to use. That is a general public view, and one of the points made in the briefing is that so many people have very good reason to desire to maintain their privacy and be able to express themselves freely. The briefing notes that LGBTQIA+ people, for example, may wish that individual communications remain private.
I want to focus mostly on the broader issue of people who are looking to use services for public good. Some 40 million people in the UK use private messaging services every day, but some of those are journalists and activists for democracy and human rights around the world who are potentially putting themselves, and those with whom they communicate, in danger from repressive regimes. We have the problem that, if we open up the encryption, that opening can then be used by all kinds of different actors. It is worth putting on the record that the National Union of Journalists—I declare my position as a former newspaper editor—has expressed grave concerns about duties that would, in effect, require the breaking of encryption. It notes that this places
“journalists, sources and whistle-blowers in danger, creating a chilling effect that prevents individuals providing information that could help inform public interest journalism, and hold the powerful to account”.
I regret that I cannot be in the Committee on the economic crime Bill running in parallel to this, where we are talking about some of these issues. On the last group in that Bill, we talked about the importance of the media and NGOs in exposing economic crime, and that is true of many other areas of our society.
The noble Lord, Lord Allan, stressed what we might see if organisations choose to withdraw their services from the UK, but I want to address the point about what could happen if the organisations remain here and allow the setting up of systems for client-side scanning, as this Bill appears to point towards. That would make those tools available, and we know from experience around the world that, once such technical approaches are out there, they spread, quite literally, in the manner of a virus—in both the biological and the technical sense. They then become available to a whole lot of actors whom we do not want to have them.
It is worth coming back to the overall view of this. Sometimes we say that the security services were always able to open letters and look at individual communications, where, we hope, they had the legal basis to do so. Here we are talking about everything, everybody, all the time, which is an entirely different situation from that individual, targeted approach with a legal basis.
Baroness Kidron (CB)

I want to take advantage of the noble Baroness having raised that point to say that perhaps I was not clear enough in my speech. While I absolutely agree about not everything, everybody, all the time, for my specific concerns around child sexual abuse, abuse of women and so on, we have to find new world order ways of creating targeted approaches so it does not have to be everything, everybody, all the time.

Baroness Bennett of Manor Castle (GP)

I am glad I gave the noble Baroness the opportunity for that intervention. I have a reasonable level of technical knowledge—I hand-coded my first website in 1999, so I go back some way—but given the structures we are dealing with, I question whether we have the capacity, and whether it is possible, to create such tools and say they will be used only in a certain way. If you break the door open, anyone can walk through it—that is the situation we are in.

As the noble Lord, Lord Allan, said, this is a crucial part of the Bill that was not properly examined and worked through in the other place. I will conclude by saying that it is vital we have a full and proper debate in this area. I hope the Minister can reassure us that he and the department will continue to be in dialogue with noble Lords as the Bill goes forward.

Lord Moylan (Con)

My Lords, I rise to speak to Amendment 205 in my name, but like other noble Lords I will speak about the group as a whole. After the contributions so far, not least from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Bennett of Manor Castle, there is not a great deal left for me to add. However, I will say that we have to understand that privacy is contextual. At one extreme, I know the remarks I make in your Lordships’ House are going to be carefully preserved and cherished; for several centuries, if not millennia, people will be able to see what I said today. If I am in my sitting room, having a private conversation, I expect that not to be heard by somebody, although at the same time I am dimly aware that there might be somebody on the other side of the wall who can hear what I am saying. Similarly, I am aware that if I use the telephone, it is possible that somebody is listening to the call. Somebody may have been duly authorised to do so by reference to a tribunal, having taken all the lawful steps necessary, because a competent authority has been persuaded that the police service, or whoever it may be, has a reason to listen to my telephone call, in order to avoid public harm or meet some other justified objective agreed on through legislation.

Here, we are going into a sphere of encryption where one assumes privacy and feels one is entitled to some privacy. However, it is possible that the regulator could at any moment step in and demand records from the past—records up to that point—without the intervention of a tribunal, as far as I can see, or without any reference to a warrant or having to explain to anybody their basis for doing so. They would be able to step in and do it. This is the point made by the noble Baroness, Lady Bennett of Manor Castle: unlike the telephone conversation, where it does not have to be everyone, everywhere, all the time—they are listening to just me and the person to whom I am talking—the provider has to have the capacity to go back, get all those records and be able to show Ofcom what it is that Ofcom is looking for. To do that requires them to change their encryption model fundamentally. It is not really possible to get away from everyone, everywhere, all the time, because the model has to be changed in order to do it.

That is why this is such an astonishing thing for the Government to insert in this Bill. I can understand why the security services and so forth want this power, and this is a vehicle to achieve something they have been trying to achieve for a long time. But there is very strong public resistance to it, and that resistance is entirely understandable, and to do it in this space is completely at odds with the way in which we felt it appropriate to authorise listening in on private conversations in the past—specific conversations, with the authority of a tribunal. To do it this way is a very radical change and one that needs to be considered almost apart from the Bill, not slipped in as a mere clause and administrative adjunct to it.

Baroness Fox of Buckley (Non-Afl)

My Lords, there have been some excellent speeches so far. The noble Lord, Lord Allan of Hallam, brilliantly laid out why these amendments matter, and the noble Lord, Lord Moylan, explained why this has gained popular interest outside of the House. Not everything that goes on in this House is of interest and people do not study all of the speeches made by the noble Lord, Lord Moylan, even though they are always in the public sphere, but this particular group of amendments has elicited a huge amount of discussion.

We should remember that encrypted chat has become an indispensable part of the way that we live in this country and around the world. According to the Open Rights Group it has replaced the old-fashioned wired telephone—a rather quaint phrase. The fact that the citizens of the United Kingdom think that chat services matter so much that they are used by 60% of the total population should make us think about what we are doing regarding these services.

End-to-end encryption—the most secure form of encryption available—means that your messages are stored on your phone; people feel that they are in control because they are not on some server somewhere. Even WhatsApp cannot read your WhatsApp messages; that is the point of encryption. That is why people use it: the messages are secured with a lock which only you and the recipient have the special key to unlock to read them.
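As a rough sketch of that lock-and-key description, the snippet below uses the open-source Python cryptography library to show the core idea: each party generates a key pair on their own device, derives a shared message key, and the relaying server only ever sees ciphertext. This is a toy illustration under simplified assumptions, not how WhatsApp or the Signal protocol is actually implemented; real messengers add authentication, forward secrecy and much more.

# Minimal sketch of the end-to-end idea: only the two endpoints can
# derive the message key, so a relaying server sees only ciphertext.
# Toy example; real messengers layer far more on top of this.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_key(own_private, peer_public) -> bytes:
    """Derive a shared 32-byte message key from an X25519 exchange."""
    shared = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"illustrative-e2ee").derive(shared)

# Each device generates its own key pair and keeps the private half locally.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# The sender encrypts on her own device with the shared key.
key_on_alice_phone = derive_key(alice_priv, bob_priv.public_key())
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key_on_alice_phone).encrypt(
    nonce, b"See you at 7", None)

# The server relays (nonce, ciphertext) but cannot read it: it never
# holds either private key, so it cannot derive the message key.

# The recipient derives the same key on his device and decrypts.
key_on_bob_phone = derive_key(bob_priv, alice_priv.public_key())
print(ChaCha20Poly1305(key_on_bob_phone).decrypt(nonce, ciphertext, None))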

Obviously, there are certain problems. Certain Government Ministers wanted to voluntarily share all of their WhatsApp messages with a journalist who would then share them with the rest of us. If your Lordships were in that group you might have thought that was a rude thing to do. People have their WhatsApp messages leaked all the time, and when it happens we all think, “Oh my God, I’m glad I wasn’t in that WhatsApp group”, because you assume a level of privacy, even though as a grown-up you need to remember that somebody might leak them. But the main point is that they are a secure form of conversation that is widely used.

Everyone has a right to private conversations. I was thinking about how, when society closed down during the lockdown period, we needed technology in order to communicate with each other. We understood that we needed to WhatsApp message or Zoom call our friends and family, and the idea that this would involve the state listening in would have appalled us—we considered that our private life.

We want to be able to chat in confidence and be confident that only ourselves and the recipients can see what we are sharing and hear what we are saying. That is true of everyday life, but there are very good specific cases to be made for its importance, ranging through everything from Iranian women resisting the regime and communicating with each other, to all the civil liberties organisations around the world that use WhatsApp. The security of knowing that you can speak without Putin listening in or that President Xi will not be sent your WhatsApp messages is important.

The Government keep assuring us that we do not need to worry, but the Bill gives Ofcom the power to require services to install tools that would enable the surveillance of encrypted communications for child exploitation and terrorism content, for example. Advocates and people on my side argue that this is not possible without undermining encryption because, just as you cannot be half pregnant, you cannot be half encrypted once you install tools for scanning for certain content. There is a danger that we say, “We’re only doing it for those things”, but actually it would be an attack on encryption itself.

Unlike the noble Baroness, Lady Bennett of Manor Castle, I know nothing about the technical aspects of this, as noble Lords can hear from the way I am speaking about it. But I can see from a common-sense point of view what encryption is: you cannot say, “We’re only going to use it a little bit”. That is my point.

I want to tackle the issue of child abuse, because I know that it lurks around here. It is what really motivates the people who say, “It’s OK as long as we can deal with that”. This is put forward as a solution to the problem of encrypted chat services being used to send messages of that nature and to the question of what we can do about it. Of course I stress that images of child abuse and exploitation are abhorrent—that is a very important background to this conversation—but I want to draw attention to the question of what we are prepared to do about child abuse, because I think it was referred to in an earlier group. I am nervous that we are promising a silver bullet: that it will all be solved through some of the measures in this Bill.

13:15
I noted that Professor Ross Anderson of the University of Cambridge said that we cannot expect artificial intelligence to replace police officers, teachers, and social workers in child protection. He said:
“The idea that complex social problems are amenable to cheap technical solutions is the siren song of the software salesman and has lured many a gullible government department on to the rocks”.
This is true. Most child abuse happens offline. Online child abuse needs to be dealt with, but I worry that we will say to people, “Don’t worry, all will be well because we’re dealing with it in the Online Safety Bill”.
Baroness Kidron (CB)

No one in the Committee or anyone standing behind us who speaks up for children thinks that this is going to be a silver bullet. It is unacceptable to suggest that we take that position. Much child abuse takes place offline and is then put online, but the exponential way in which it is consumed, created, and spread is entirely new because of the services we are talking about. Later in Committee I will explain some of the new ways in which it is creating child abuse—new forms, new technologies, new abuse.

I am sorry to interrupt the noble Baroness. I have made my feelings clear that I am not an end-to-end encryption “breaker”. There are amendments covering this; I believe some of them will come up later in the name of the noble Lord, Lord Russell, on safety by design and so on. I also agree with the noble Baroness that we need more resources in this area for the police, teachers, social workers and so on. However, I do not want child sexual abuse to be a football in this conversation.

Baroness Fox of Buckley (Non-Afl)

I agree with the noble Baroness, which is precisely why I am suggesting that we need to consider whether privacy should be sacrificed totally in relation to the argument around encryption. It is difficult, and I feel awkward saying it. When I mentioned a silver bullet I was not talking about the noble Baroness or any other noble Lords present, but I have heard people say that we need this Bill because it will deal with child abuse. In this group of amendments, I am raising the fact that when I have talked about encryption with people outside of the House they have said that we need to do something to tackle the fact that these messages are being sent around. It is not just child abuse; it is also terrorism. There is a range of difficult situations.

Things can go wrong with this, and that is what I was trying to raise. For example, we have a situation where some companies are considering using, or are being asked to use, machine learning to detect nudity. Just last year, a father lost his Google account and was reported to the police for sending a naked photo of his child to the doctor for medical reasons. I am raising these as examples of the problems that we have to consider.

Child abuse is so abhorrent that we will do anything to protect children, but let me say this to the Committee, as it is where the point on privacy lies: children are largely abused in their homes, but as far as I understand it we are not as yet arguing that the state should put CCTV cameras in every home for 24/7 surveillance to stop child abuse. That does not mean that we are glib or that we do not understand the seriousness of child abuse; it means that we understand the privacy of the home. There are specialist services that can intervene when they think there is a problem. I am worried about the possibility of putting a CCTV camera in everyone’s phone, which is the danger of going down this route.

My final point is that these services, such as WhatsApp, will potentially leave the UK. It is important to note that. I agree with the noble Lord, Lord Allan: this is not like threatening to storm off. It is not done in any kind of pique in that way. In putting enormous pressure on these platforms to scan communications, we must remember that they are global platforms. They have a system that works for billions of people all around the world. A relatively small market such as the UK is not something for which they would compromise their billions of users around the world. As I have explained, they would not put up with it if the Chinese state said, “We have to see people’s messages”. They would just say, “We are encrypted services”. They would walk out of China and we would all say, “Well done”. There is a real, strong possibility of these services leaving the UK so we must be very careful.

Lord Allan of Hallam (LD)

I just want to add to the exchange between the noble Baronesses, Lady Kidron and Lady Fox. The noble Baroness, Lady Fox, referred to WhatsApp’s position. Again, it is important for the public out there also to understand that if someone sends them illegal material—in particular child sexual abuse material; I agree with the noble Baroness, Lady Kidron, that this is a real problem—and they report it to WhatsApp, which has a reporting system, that material is no longer encrypted. It is sent in clear text and WhatsApp will give it to the police. One of the things I am suggesting is that, rather than driving WhatsApp out of the country, because it is at the more responsible end of the spectrum, we should work with it to improve these kinds of reporting systems and put the fear of God into people so that they know that this issue is not cost-free.

As a coda to that, if you ever receive something like that, you should report it to the police straightaway because, once it is on your phone, you are liable and you have a problem. The message from here should be: if you receive it, report it and, if it is reported, make sure that it gets to the police. We should be encouraging services to put those systems in place.

Baroness Fox of Buckley (Non-Afl)

The noble Lord has concluded with my conclusion, which was to say that those services will be driven out, but not because they are irresponsible around horrible, dangerous messages. They do not read our messages because they are private. However, if we ever receive anything that makes us feel uncomfortable, they should be put under pressure to act. Many of them already do and are actually very responsible, but that is different from demanding that they scan our messages and we breach that privacy.

Baroness Stowell of Beeston (Con)

My Lords, that last exchange was incredibly helpful. I am grateful to the noble Lord, Lord Allan, for what he just said and the way in which he introduced this group. I want to make only a few brief remarks.

I have put my name to two amendments in this group: Amendment 202 in the name of the noble Lord, Lord Stevenson, which seeks to ensure that Ofcom will be subject to the same kind of requirements and controls as exist under the Regulation of Investigatory Powers Act before issuing a technology notice

“to a regulated service which offers private messaging with end-to-end encryption”;

and Amendment 285, also in the name of the noble Lord, Lord Stevenson, and that of the noble Lord, Lord Clement-Jones. This amendment would make sure that no social media platforms or private end-to-end messaging services have an obligation generally to monitor what is going on across their platforms. When I looked at this group and the various amendments in it, those were the two issues that I thought were critical. These two amendments seemed to approach them in the most simple and straightforward manner.

Like other noble Lords, my main concern is that I do not want search and social media platforms to have an obligation to become what we might describe as thought police. I do not want private messaging firms to start collecting and storing the content of our messages so that they have what we say ready to hand over in case they are required to do so. What the noble Lord, Lord Allan, just said is an important point to emphasise. Some of us heard from senior representatives from WhatsApp a few weeks ago. I was quite surprised to learn how much they are doing in this area to co-operate with the authorities; I felt very reassured to learn about that. I in no way want to discourage that because they are doing an awful amount of good stuff.

Basically, this is such a sensitive matter, as has been said, that it is important for the Government to make their policy intentions clear in the Bill. If they do not intend to require general monitoring, that needs to be made explicit. It is also important that, if Ofcom is to be given new investigatory powers or powers to insist on things through these technology notices, it is clear that its powers do not go beyond those that are already set out in law. As we have heard from noble Lords, there is widespread concern about this matter not just from the social media platforms and search engines themselves but from news organisations, journalists and those lobby groups that often speak out on liberty-type matters. These topics go across a wide range of interest groups, so I very much hope that my noble friend the Minister will be able to respond constructively and open-mindedly on them.

Lord Knight of Weymouth (Lab)

My Lords, I was not intending to intervene on this group because my noble friend Lord Stevenson will address these amendments in their entirety, but listening in to this public conversation about this group of amendments has stimulated a question that I want both to put on the record and to give the Minister time to reflect on.

If we get the issues of privacy and encrypted messaging wrong, it will push more people into using VPN—virtual private network—services. I went into the app store on my phone to search for VPN software. There is nothing wrong with such software—our parliamentary devices have it to do general monitoring and make sure that we do not use services such as TikTok—but it is used to circumvent much of the regulatory regime that we are seeking to put together through this Bill. When I search for VPNs in the app store, the first one that comes up that is not a sponsored, promoted advertisement has an advisory age limit of four years old. Several of them are the same; some are 17-plus but most are four-plus. Clearly, the app promotes itself very much on the basis that it offers privacy and anonymity, which are the key features of a VPN. However, a review of it says, “I wouldn’t recommend people use this because it turns out that this company sends all its users’ data to China so that it can do general monitoring”.

I am not sure how VPNs are being addressed by the Bill, even though they seem really pertinent to the issues of privacy and encryption. I would be interested to hear whether, and if so how, we are bringing the use and misuse of VPNs into scope for regulation by Ofcom.

Baroness Fox of Buckley (Non-Afl)

My Lords, I would like to say something very quickly on VPN. I had a discussion with some teenagers recently, who were all prepared for this Bill—I was quite surprised that they knew a lot about it. They said, “Don’t worry, we’ve worked out how to get around it. Have you heard of VPN?” It reminded me of a visit to China, where I asked a group of students how they dealt with censorship and not being able to google. They said, “Don’t worry about it”, and showed me VPN. It is right that we draw attention to that. There is a danger of inadvertently forcing people on to the unregulated dark web and into areas that we might not imagine. That is why we have to be careful and proportionate in our response.

13:30
Lord McNally (LD)

My Lords, I have long been on record as being for radical reform of the House of Lords, but I do not think there are many Chambers in the world that could have had such an interesting debate on such a key subject—certainly not the House of Commons, sadly. Without falling into the old trap of saying what a wonderful lot we all are, it is important that, in such an important Bill, covering so many important areas of civil liberties and national security, there should be an opportunity, before we get to voting, to have this kind of debate and get some of the issues into the public domain.

I am on the same side as the noble Baroness, Lady Fox, on knowledge of the technology—looking back to 20 years ago, when I was on the committee that worked on the communications Bill which set up Ofcom, I see that we were genuinely innocents abroad. We deliberately decided not to try regulating the internet, because we did not know what it was going to do. I do not think that we can have that excuse today.

Perhaps an even more frightening background is that, for three and a half years, during the coalition Government, I was Minister for Digital Protection—a less equipped Minister to protect your digital I cannot imagine. However, I remember being taken to some place over the river to have a look at our capacities in this area. Having seen some of the things that were being done, I rather timidly asked the expert who was showing me round, “Aren’t there civil liberty issues in what you’re doing?” He said, “Oh no, sir. Tesco know far more about you than we do”.

There is this element about what is secret. The noble Baroness, Lady Fox, in her last contribution, said that children look with contempt at some of the safeguards and blockages that keep them away from things. I do not think anybody is deluding themselves that there is some silver bullet. As always, Parliament must do its best to address real national concerns and real problems in the best way that we see at this time. There is a degree of cross-party and Cross-Bench unity, in that there are real and present dangers in how these technologies are being used, and real and present abuses of a quite horrific kind. The noble Baroness, Lady Kidron, is right. This technology has given a quantum leap to the damage that the abuser and the pornographer can do to our society, in the same way that it has given a quantum leap to those who want to undermine the truth and fairness of our election system. There are real problems that must be addressed.

Although it has not been present in this debate, it is no help to polarise the argument as being between the state wanting to accrue more and more powers and brave defenders of civil liberties. As somebody who has practised some of these dark arts myself, I advise those who are organising letters to ensure that those sending them do not leave in the paragraph that says, “Here you may want to include some personal comments”. It waters down the credibility of this as some independent exercising of a democratic right.

I make a plea, as someone on the edges of the debate who at times had some direct responsibilities, to use what the Bill has thrown up to consider whether it is now in the right shape—I hope the Minister hears it. The Government should not be ashamed to take it away and think a bit. It may be that we can add some of the protections that we quite often use, such as allowing certain interventions only after a judge, a senior police officer or others have been involved. That may already be in other parts of the Bill. However, it would be wrong to allow the Bill to polarise this, given that everyone who spoke this morning is trying to deal with very real difficulties, problems and challenges, within the framework of a democratic society, in a way that protects our freedoms but also protects us from real and present dangers.

Lord Kamall (Con)

My Lords, this is the first time that I have spoken on the Bill in Committee. I know noble Lords are keen to move on and get through the groups as quickly as possible, but I hope they will forgive me if I say that I will speak only about twice on the Bill, and this is one of the groups that I want to speak to. I will try not to make your Lordships impatient.

I should tell the Committee a little about where I am coming from. I was very geeky as a kid. I learned to program and code. I did engineering at university and coded there. My master’s degree in the late 1980s was about technology and policy, so I have been interested in technology policy since then, having followed it through in my professional life. In 1996, I wrote a book on EU telecoms—it sold so well that no one has ever heard of it. One thing I said in that book, which though not an original thought is pertinent today, is that the regulation will always be behind the technology. We will always play catch-up, and we must be concerned about that.

Interestingly, when you look at studies of technology adoption—pioneers, early adopters and then the rest of the population—quite often you see that the adult industry is at the leading edge, such as with cable TV, satellite TV, video cassettes, online conferencing, et cetera. I assure your Lordships that I have not done too much primary research into this, but it is an issue that we ought to be aware of.

I will not speak often in this debate, because there are many issues that I do not want to disagree on. For example, I have already had a conversation with the noble Baroness, Lady Kidron, and we all agree that we need to protect children. We also know that we need to protect vulnerable adults; there is no disagreement on that. However, in these discussions there will be inevitable trade-offs between security and safety and freedom. It is right to have these conversations to ensure that we get the balance right, with the wisdom of noble Lords. Sacrifices will be made on either side of the debate, and we should be very careful as we navigate this.

I am worried about some of the consequences for freedom of expression. When I was head of a research think tank, one of the phenomena that I became interested in was that of unintended consequences. Well-meaning laws and measures have often led to unintended consequences. Some people call it a law of unintended consequences, and some call it a principle, and we should be careful about this. The other issue is subjectivity of harms. Given that we have taken “legal but harmful” out and there are amendments to the Bill to tackle harms, there will be a debate on the subjectivity of harms.

One reason I wanted to speak on this group is that some of the amendments tabled by noble Lords—too many to mention—deal with technology notices and ensuring that we are consistent between the offline and online worlds, particularly regarding the Regulation of Investigatory Powers Act. I welcome and support those amendments.

We also have to be aware that people will find a way around it, as the noble Baroness, Lady Fox, said. When I was looking at terrorism and technology, one of the issues that people raised with me was not to forget that one way around it was to create an email account and store stuff in a draft folder. You could then share the username and password with others who could then access that data, those pictures or those instructions in a draft folder. The noble Lord, Lord Allan, has gone some way to addressing that issue.

The other issue that we have to be clear about is how the tech sector can do more. It was interesting when my noble friend Lady Stowell organised a meeting with Meta, which was challenged particularly on coroners having access to information and pictures. It was very interesting when Meta told us what it could access: it does not know what is in the messages, but there are things that it can access, or advise people to access, on the user’s phone or at the other end. I am not sure whether the noble Baroness, Lady Kidron, has had the conversation with Meta, but it would be helpful and important to find some common ground there, and to probe and push Meta and others to make sure that they share that information more quickly, so we do not have to wait five years to get it via the coroner or whatever.

I want to talk in particular about unintended consequences, particularly around end-to-end encryption. Even if you do not believe the big businesses and think that they are crying wolf when they say that they will quit the UK—although I believe that there is a threat of that, particularly when we continually want the UK to be a global hub for technology and innovation and so cannot afford for companies such as Meta, Signal and others to leave—you should listen to the journalists who are working with people, quite often dissidents, in many countries, and rely on encrypted communications to communicate with them.

The other risk we should be aware of is that it is very difficult to keep technology to a few people. In my academic career, I also looked at technology transfer, both intentional and unintentional. We should look at the intelligence services and some of the innovations that happened: for example, when Concorde was designed, it was not very long afterwards that the Soviets got their hands on that equipment. Just as there used to be a chap called Bob in the exchange who could share information, there is always a weak spot in chains: the humans. Lots of humans have a price and can be bought, or they can be threatened, and things can be shared. The unintended consequence I am worried about is that this technology will get into the hands of totalitarian regimes. At the same time, it means that people over here who are trying desperately to help dissidents and others speak up for freedom in other countries will be unable to support them. We should be very careful and think about unintended consequences. For that reason, I support this group of amendments.

I really am looking forward to the responses from the Minister. I know that the noble Lord, Lord McNally, said that he was a Minister for three years on data protection; I was a Minister in this department for one month. I was so pleased that I had my dream job, as Minister for Civil Society and Heritage, and so proud of my party and this country because we had elected the first Asian Prime Minister; then, six days later, I got sacked. So, as they say, be careful what you wish for.

In this particular case, I am grateful to the noble Lords who have spoken up in this debate. I do not want to repeat any other points but just wanted to add that. I will not speak often, but I want to say that it is really critical that, when we look at this trade-off between security, safety and freedom, we get it right. One way of doing that is to make sure that, on technology notices and RIPA, we are consistent between the online and offline worlds.

Lord Stevenson of Balmacara (Lab)

My Lords, it has been a very good debate indeed. When I first saw this grouping, my heart sank: the idea that we should be able to encompass all that within the space of just over an hour seemed a bit beyond all of us, however skilled and experienced we were, and whatever background we were able to bring to the debate today. I agree with both noble Lords who observed that we have an expertise around here that is very unusual and extremely helpful in trying to drill down into some of these issues.

The good thing that has come out from this debate, which was summed up very well by the noble Lord, Lord Kamall, is that we are now beginning to address some of the underlying currents that the Bill as a boat is resting on—and the boat is a bit shaky. We have a very strong technological bias, and we are grateful for the masterclass from the noble Lord, Lord Allan of Hallam, on what is actually going on in the world that we are trying to legislate for. It leaves me absolutely terrified that we are in a situation where we appear to be trying to future-proof, possibly in the wrong direction. We should be very careful about that. We will want to reflect on the point he made on where the technology is driving this particular aspect of our social media and search engine operations.

13:45
The Bill is very wide ranging and, therefore, the amendments must necessarily follow it. But, in this group, we seem to be doing three things: we are trying to recognise whether there is a problem with encrypted messaging, and its relationship to security on the one hand and privacy and human rights on the other. I am very pleased that we are doing this, but I am not quite sure that we are in a position to make long-lasting conclusions. Like everybody else, I think that the burden falls on the Minister to convince us that he has reached the right place in the consideration of this and that his proposals will be right for the present day, let alone the future.
The noble Baroness, Lady Stowell, was right: we need to be very clear what the Government are trying to do here. I am afraid that I am not convinced that I know what it is. I put it to the Minister that he should make it very evident up front. This section of the Bill, and the way that we have been grouped into discussing it—because there are other things that we will need to come back to that relate to it—will need to be convincing. At the moment, I do not think that it is.
I say that because, if you go down where the Bill is trying to get to, it is very odd indeed that Ofcom has the powers to look at the messaging of private individuals and that the same body is also regulating. In other words, Ofcom is expected to be both gamekeeper and poacher. The points made around the Chamber on this issue are unanswerable. In the offline world, we have a structure that works through RIPA, which seems an exemplary model. I have heard the Minister say in private meetings that the procedures which will be in place in Ofcom will replicate that in every way and that there should be no concern about it, but the problem is the fact that it is the same body that is doing it. Enough has been said to make a very good case that at the very least, if we go ahead on the basis of what the Bill says, the decisions on whether or not the technologies can begin to peek into the encrypted world need to be authorised by an external body at a judicial level, and that it should follow the RIPA model, which has stood the test of time and seems to work very well and to everyone’s satisfaction. That is my first point.
My second point is that if we are to go down a technological route, we have to be certain that it is necessary; I worry that it is in advance of where we perhaps need to go, and that having a bit more time before it comes into place might be a way forward. I think we have heard enough from those who have written in and in the meetings we have had that this does not seem to be a hotspot for the police, who will have responsibility for doing quite a lot of the legwork on this. They seem to have powers which they could use to get to where they need to be in order to make sure that the crimes being commissioned or committed can be investigated and that those responsible are brought to justice. If that is the case, why are we putting in this extra step? Again, I do not have the confidence that the Bill is going in the right direction here.
We can add to that some of the technological issues, which are as important. If we have a technology capable of carrying out the inquisition of encrypted material in a way which will be satisfactory as defined by the legislation, is there not a risk that we are simply opening up the whole process to hackers and those who might be able to do more harm than good? One representation we had said that the requirement under Clause 110 to use accredited technology to identify CSEA and/or terrorism content, whether, in the words of the Bill,
“communicated publicly or privately by means of the service”,
means that checks and scans of users on a currently secure platform will be opened up. That proposal would require the decryption of something that is encrypted, which cannot be right. That would open up too much of a risk for those who are, as we have heard, in many ways and in many parts of the world dependent on encryption to carry on doing the things that we want them to do. The ability to hijack this type of technology is a worry which I have not seen reflected in any of the discussions we have had with the Government on this point.
Finally, I know this is unpopular as far as the Government are concerned, but is there not a concern that we are driving a coach and horses through some of our well thought-through and important issues relating to human rights? The EHRC’s paper says that the provisions in Clause 110 may be disproportionate and an infringement of millions of individuals’ rights to privacy where those individuals are not suspected of any wrongdoing. This is not a right or wrong issue; it is a proportionality issue. We need to balance that. I do not know whether I have heard the Minister set out exactly why the measures in the Bill meet that set of conditions, so I would be grateful if he could talk about that or, if not, write to us. If we are in danger of heading into issues which are raised by Article 8 of the ECHR—I know the noble Lord opposite may not be a huge supporter of it, but it is an important part of our current law, and senior Ministers have said how important it will be in the future—surely we must have safeguards which will protect it.
Lord Parkinson of Whitley Bay (Con)

My Lords, this has indeed been a very good debate on a large group of amendments. We have benefited from two former Ministers, the noble Lord, Lord McNally, and my noble friend Lord Kamall. I hope it is some solace to my noble friend that, such a hard act is he to follow, his role has been taken on by two of us on the Front Bench—myself at DCMS and my noble friend Lord Camrose at the new Department for Science, Innovation and Technology.

The amendments in this group are concerned with the protection of user privacy under the Bill and the maintenance of end-to-end encryption. As noble Lords have noted, there has been some recent coverage of this policy in the media. That reporting has not always been accurate, and I take this opportunity to set the record straight in a number of areas and seek to provide the clarity which the noble Lord, Lord Stevenson of Balmacara, asked for just now.

Encryption plays a crucial role in the digital realm, and the UK supports its responsible use. The Bill does not ban any service design, nor will it require services materially to weaken any design. The Bill contains strong safeguards for privacy. Broadly, its safety duties require platforms to use proportionate systems and processes to mitigate the risks to users resulting from illegal content and content that is harmful to children. In doing so, platforms must consider and implement safeguards for privacy, including ensuring that they are complying with their legal responsibilities under data protection law.

With regard to private messaging, Ofcom will set out how companies can comply with their duties in a way that recognises the importance of protecting users’ privacy. Importantly, the Bill is clear that Ofcom cannot require companies to use proactive technology, such as automated scanning, on private communications in order to comply with their safety duties.

In addition to these cross-cutting protections, there are further safeguards concerning Ofcom’s ability to require the use of proactive technology, such as content identification technology on public channels. That is in Clause 124(6) of the Bill. Ofcom must consider a number of matters, including the impact on privacy and whether less intrusive measures would have the equivalent effect, before it can require a proactive technology.

The implementation of end-to-end encryption in a way that intentionally blinds companies to criminal activity on their services, however, has a disastrous effect on child safety. The National Center for Missing & Exploited Children in the United States of America estimates that more than half its reports could be lost if end-to-end encryption were implemented without preserving the ability to tackle child sexual abuse—a conundrum with which noble Lords grappled today. That is why our new regulatory framework must encourage technology companies to ensure that their safety measures keep pace with this evolving and pernicious threat, including minimising the risk that criminals are able to use end-to-end encrypted services to facilitate child sexual abuse and exploitation.

Given the serious risk of harm to children, the regulator must have appropriate powers to compel companies to take the most effective action to tackle such illegal and reprehensible content and activity on their services, including in private communications, subject to stringent legal safeguards. Under Clause 110, Ofcom will have a stand-alone power to require a provider to use, or make best endeavours to develop, accredited technology to tackle child sexual exploitation and abuse, whether communicated publicly or privately, by issuing a notice. Ofcom will use this power as a last resort only when all other measures have proven insufficient adequately to address the risk. The only other type of harm for which Ofcom can use this power is terrorist content, and only on public communications.

The use of the power in Clause 110 is subject to additional robust safeguards to ensure appropriate protection of users’ rights online. Ofcom will be able to require the use of technology accredited as being highly accurate only in specifically detecting illegal child sexual exploitation and abuse content, ensuring a minimal risk that legal content is wrongly identified. In addition, under Clause 112, Ofcom must consider a number of matters, including privacy and whether less intrusive means would have the same effect, before deciding whether it is necessary and proportionate to issue a notice.

The Bill also includes vital procedural safeguards in relation to Ofcom’s use of the power. If Ofcom concludes that issuing a notice is necessary and proportionate, it will need to publish a warning notice to provide the company an opportunity to make representations as to why the notice should not be issued or why the detail contained in it should be amended. In addition, the final notice must set out details of the rights of appeal under Clause 149. Users will also be able to complain to and seek action from a provider if the use of a specific technology results in their content incorrectly being removed and if they consider that technology is being used in a way that is not envisaged in the terms of service. Some of the examples given by the noble Baroness, Lady Fox of Buckley, pertain in this instance.

The Bill also recognises that in some cases there will be no available technology compatible with the particular service design. As I set out, this power cannot be used by Ofcom to require a company to take any action that is not proportionate, including removing or materially weakening encryption. That is why the Bill now includes an additional provision for this scenario, to allow Ofcom to require technology companies to use their best endeavours to develop or find new solutions that work on their services while meeting the same high standards of accuracy and privacy protection. Given the ingenuity and resourcefulness of the sector, it is reasonable to ask it to do everything possible to protect children from abuse and exploitation. I echo the comments made by the noble Lord, Lord Allan, about the work being done across the sector to do that.

More broadly, the regulator must uphold the right to privacy under its Human Rights Act obligations when implementing the new regime. It must ensure that its actions interfere with privacy only where it is lawful, necessary and proportionate to do so. I hope that addresses the question posed by the noble Lord, Lord Stevenson. In addition, Ofcom will be required to consult the Information Commissioner’s Office when developing codes of practice and relevant pieces of guidance.

I turn now to Amendments 14—

Lord Knight of Weymouth (Lab)

Before the Minister does so, can he give a sense of what he means by “best endeavours” for those technology companies? If it is not going to be general monitoring of what is happening as the message moves from point to point—we have had some discussions about the impracticality and issues attached to monitoring at one end or the other—what, theoretically, could “best endeavours” possibly look like?

Lord Parkinson of Whitley Bay (Con)

I am hesitant to give too tight a definition, because we want to remain technology neutral and make sure that we are keeping an open mind to developing changes. I will think about that and write to the noble Lord. The best endeavours will inevitably change over time as new technological solutions present themselves. I point to the resourcefulness of the sector in identifying those, but I will see whether there is anything more I can add.

Baroness Kidron (CB)

While the Minister is reflecting, I note that the words “best endeavours” are always a bit of a worry. The noble Lord, Lord Allan, made the good point that once it is on your phone, you are in trouble and you must report it, but the frustration of many people outside this Chamber is what comes next: how to trace the journey of that piece of material, once it has been on a phone and you cannot deal with it, without breaking encryption. I speak to the tech companies very often—indeed, I used to speak to the noble Lord, Lord Allan, when he was in post at what was then Facebook—but that is the question that we would like answered in this Committee, because the response that “It is nothing to do with us” is where our sympathy stops.

14:00
Lord Parkinson of Whitley Bay (Con)

The noble Baroness’s intervention has given me an opportunity to note that I am about to say a little more on best endeavours, which will not fully answer the question from the noble Lord, Lord Knight, but I hope fleshes it out a little more.

I do that in turning to Amendments 14, 108 and 205, which seek to clarify that companies will not be required to undertake fundamental changes to the nature of their service, such as the removal or weakening of end-to-end encryption. As I previously set out, the Bill does not require companies to weaken or remove any design and there is no requirement for them to do so as part of their risk assessments or in response to a notice. Instead, companies will need to undertake risk assessments, including consideration of risks arising from the design of their services, before taking proportionate steps to mitigate and manage these risks. Where relevant, assessing the risks arising from end-to-end encryption will be an integral part of this process.

This risk management approach is well established in almost every other industry and it is right that we expect technology companies to take user safety into account when designing their products and services. We understand that technologies used to identify child sexual abuse and exploitation content, including on private communications, are in some cases nascent and complex. They continue to evolve, as I have said. That is why Ofcom has the power through the Bill to issue a notice requiring a company to make best endeavours to develop or source technology.

This notice will include clear, proportionate and enforceable steps that the company must take, based on the relevant information of the specific case. Before issuing a warning notice, Ofcom is expected to enter into informal consultation with the company and/or to exercise information-gathering powers to determine whether a notice is necessary and proportionate. This consultation period will assist in establishing what a notice to develop a technology may require and appropriate steps for the company to take to achieve best endeavours. That dialogue with Ofcom is part of the process.

Lord Kamall (Con)

There are a lot of phrases here—best endeavour, proportionate, appropriate steps—that are rather subjective. The concern of a number of noble Lords is that we want to address this issue but it is a matter of how it is applied. That is one of the reasons why noble Lords were asking for some input from the legal profession, a judge or otherwise, to make those judgments.

Lord Parkinson of Whitley Bay (Con)

All the phrases used in the Bill are subject to the usual scrutiny through the judicial process—that is why we debate them now and think about their implications—but of course they can, and I am sure will, be tested in the usual legal ways. Once a company has developed a new technology that meets minimum standards of accuracy, Ofcom may require its use but not before considering matters including the impact on user privacy, as I have set out. The Bill does not specify which tools are likely to be required, as we cannot pre-empt Ofcom’s evidence-based and case-by-case assessment.

Amendment 285 intends to clarify that social media platforms will not be required to undertake general monitoring of the activity of their users. I agree that the protection of privacy is of utmost importance. I want to reassure noble Lords, in particular my noble friend Lady Stowell of Beeston, who asked about it, that the Bill does not require general monitoring of all content. The clear and strong safeguards for privacy will ensure that users’ rights are protected.

Setting out clear and specific safeguards will be more effective in protecting users’ privacy than adopting the approach set out in Amendment 285. Ofcom must consider a number of matters, including privacy, before it can require the use of proactive technology. The government amendments in this group, Amendments 290A to 290G, further clarify that technology which identifies words, phrases or images that indicate harm is subject to all of these restrictions. General monitoring is not a clearly defined concept—a point made just now by my noble friend Lord Kamall. It is used in EU law but is not defined clearly in that, and it is not a concept in UK law. This lack of clarity could create uncertainty that some technology companies might attempt to exploit in order to avoid taking necessary and proportionate steps to protect their users. That is why we resist Amendment 285.

Lord Stevenson of Balmacara (Lab)

I understand the point the Minister is making, but, whatever phrase is used, the sensibility is quite clear: the Government are saying on record, at the Dispatch Box, that the Bill can in no way be read as requiring anybody to provide a view into private messaging or encrypted messaging unless there is good legal cause to suspect criminality. That is a point that the noble Baroness, Lady Stowell, made very clearly. One may not like the phrasing used in other legislatures, but could we find a form of words that will make it clear that those who are operating in this legal territory are absolutely certain about where they stand on that?

Lord Parkinson of Whitley Bay (Con)

My Lords, I want to give clear reassurance that the Bill does not require general monitoring of all content. We have clear and strong safeguards for privacy in the Bill to ensure that users’ rights are protected. I set out the concerns about use of the phrase “general monitoring”. I hope that provides clarity, but I may have missed the noble Lord’s point. The brief answer to the question I think he was asking is yes.

Lord Stevenson of Balmacara (Lab)

Let the record stand clear: yes. It was the slight equivocation around how the Minister approached and left that point that I was worried about, and I was concerned that people might seek to use that later. Words from the Dispatch Box are never absolute and they are never meant to be, but the fact that they have been said is important. I am sure that everybody understands that point, and the Minister did say “yes” to my question.

Lord Parkinson of Whitley Bay (Con)

I did, and I am happy to say it again: yes.

Baroness Fox of Buckley (Non-Afl)

Perhaps I might go back to an earlier point. When the Minister said the Government want to make sure, I think he was implying that certain companies would try to avoid obligations to keep their users safe by threatening to leave or whatever. I want it to be clear that the obligations to the users of the service are, in the instance of encrypted services, to protect their privacy, and they see that as keeping them safe. It would be wrong to make that a polar opposite. I think that companies that run unencrypted services believe that to be what their duties are—so that in a way is a clash.

Secondly, I am delighted by the clarity in the Minister’s “yes” answer, but I think that maybe there needs to be clearer communication with people outside this Chamber. People are worried about whether duties placed on Ofcom to enact certain things would lead to some breach of encryption. No one thinks that the Government intend to do this or want to spy on anyone; the worry is that the unintended consequences of the duty on Ofcom might have that effect. If that is not going to be the case, and that can be guaranteed by the Government and they make it clear, it would reassure not just the companies but the users of messaging services, which would be helpful.

Lord Parkinson of Whitley Bay (Con)

The points the noble Baroness has just made bring me neatly to what I was about to say in relation to the question raised earlier by the noble Lord, Lord Knight of Weymouth. But first, I would say that Ofcom as a public body is subject to public law principles already, so those apply in this case.

The noble Lord, Lord Knight, asked about virtual private networks and the risk of displacing people on to VPNs or other similar alternatives. That is a point worth noting, not just in this group but as we consider all these amendments, particularly when we talk later on about age verification, pornography and so on. Services will need to think about how safety measures could be circumvented and take steps to prevent that, because they need to mitigate risk effectively. There may also be a role for enforcement action; Ofcom will be able to apply to the courts, where appropriate, to require these services to apply business disruption measures. We should certainly be mindful of the incentives for people to do that, and the example the noble Lord, Lord Knight, gave earlier is a useful lesson in the old adage “Caveat emptor” when looking at some of these providers.

I want to say a little bit about Amendments 205A and 290H in my name. Given the scale of child sexual abuse and exploitation that takes place online, and the reprehensible nature of these crimes, it is important that Ofcom has effective powers to require companies to tackle it. This brings me to these government amendments, which make small changes to the powers in Clause 110 to ensure that they are effective. I will focus particularly, in the first instance, on Amendment 290H, which ensures that Ofcom considers whether a service has features that allow content to be shared widely via another service when deciding whether content has been communicated publicly or privately, including for the purposes of issuing a notice. This addresses an issue highlighted by the Independent Reviewer of Terrorism Legislation, Jonathan Hall, and Professor Stuart Macdonald in a recent paper. The separate, technical amendment, Amendment 205A, clarifies that Clause 110(7) refers only to a notice on a user-to-user service.

Amendment 190 in the name of the noble Lord, Lord Clement-Jones, seeks to introduce a new privacy duty on Ofcom when considering whether to use any of its powers. The extensive privacy safeguards that I have already set out, along with Ofcom’s human rights obligations, would make this amendment unnecessary. Ofcom must also explicitly consult persons whom it considers to have expertise in the enforcement of the criminal law and the protection of national security, which is relevant to online safety matters in the course of preparing its draft codes. This may include the integrity and security of internet services where relevant.

Amendments 202 and 206, in the name of the noble Lord, Lord Stevenson of Balmacara, and Amendments 207, 208, 244, 246, 247, 248, 249 and 250 in the name of the noble Lord, Lord Clement-Jones, all seek to deliver privacy safeguards to notices issued under Clause 110 through additional review and appeals processes. There are already strong safeguards concerning this power. As part of the warning notice process, companies will be able to make representations to Ofcom which it is bound to consider before issuing a notice. Ofcom must also review any notice before the end of the period for which it has effect.

Amendment 202 proposes mirroring the safeguards of the Investigatory Powers Act when issuing notices to encrypted messaging services under this power. First, this would be inappropriate, because the powers in the Investigatory Powers Act serve different purposes from those in this Bill. The different legal safeguards in the Investigatory Powers Act reflect the potential intrusion by the state into an individual’s private communications; that is not the case with this Bill, which does not grant investigatory powers to state bodies, such as the ability to intercept private communications. Secondly, making a reference to encryption would be—

Lord Stevenson of Balmacara (Lab)

Is that right? I do not need a yes or no answer. It was rhetorical; I am just trying to frame the right question. The Minister is making a very strong point about the difference between RIPA requirements and those that might be brought in under this Bill. But it does not really get to the bottom of the questions we were asking. In this situation, whatever the exact analogy between the two systems is, it is clear that Ofcom is marking its own homework—which is fair enough, as there are representations, but it is not getting external advice or seeking judicial approval.

The Minister’s point was that that was okay because it was private companies involved. But we are saying here that these would be criminal offences taking place, and therefore there is bound to be interest from the police and other agencies, including anti-terrorism agencies. It is clearly similar to the RIPA arrangements, so could he just revisit that?

14:15
Lord Parkinson of Whitley Bay (Con)

Yes, I think it is right. The Investigatory Powers Act is a tool for law enforcement and intelligence agencies, whereas the Bill is designed to regulate technology companies—an important high-level distinction. As such, the Bill does not grant investigatory powers to state bodies. It does not allow the Government or the regulator to access private messages. Instead, it requires companies to implement proportionate systems and processes to tackle illegal content on their platforms. I will come on to say a little about legal redress and the role of the courts in looking at Ofcom’s decisions so, if I may, I will respond to that in a moment.

Lord Allan of Hallam (LD)

The Investigatory Powers Act includes a different form of technical notice, which is to put in place surveillance equipment. The noble Lord, Lord Stevenson, has a good point: we need to ensure that we do not have two regimes, both requiring companies to put in place technical equipment but with quite different standards applying.

Lord Parkinson of Whitley Bay (Con)

I will certainly take that point away and I understand, of course, that different Acts require different duties of the same platforms. I will take that away and discuss it with colleagues in other departments who lead on investigatory powers.

Baroness Stowell of Beeston (Con)

Before my noble friend moves on, when he is reviewing that back in the office, could he also satisfy himself that the concerns coming from the journalism and news organisations in the context of RIPA are also understood and have been addressed? That is another angle which, from what my noble friend has said so far, I am not sure has really been acknowledged. That is not a criticism but it is worth him satisfying himself on it.

Lord Parkinson of Whitley Bay (Con)

I am about to talk about the safeguards for journalists in the context of the Bill and the questions posed by the noble Baroness, Lady Bennett. However, I take my noble friend’s point about the implications of other Acts that are already on the statute book in that context as well.

Just to finish the train of thought of what I was saying on Amendment 202, making a reference to encryption, as it suggests, would be out of step with the wider approach of the Bill, which is to remain technology-neutral.

I come to the safeguards for journalistic protections, as touched on by the noble Baroness, Lady Bennett. The Government are fully committed to protecting the integrity of journalistic sources, and there is no intention or expectation that the tools required to be used under this power would result in a compromising of those sources. Any tools required on private communications must be accredited by Ofcom as highly accurate only in detecting child sexual abuse and exploitation content. These minimum standards of accuracy will be approved and published by the Secretary of State, following advice from Ofcom. We therefore expect it to be very unlikely that journalistic content will be falsely detected by the tools being required.

Under Clause 59, companies are obliged to report child sexual abuse material which is detected on their service to the National Crime Agency; this echoes a point made by the noble Lord, Lord Allan, in an earlier contribution. That would include child sexual abuse and exploitation material identified through tools required by a notice and, even in this event, the appropriate protections in relation to journalistic sources would be applied by the National Crime Agency if it were necessary to identify individuals involved in sharing illegal material.

Lord Allan of Hallam (LD)

I want to flag that in the context of terrorist content, this is quite high risk for journalists. It is quite common for them, for example, to be circulating a horrific ISIS video not because they support ISIS but because it is part of a news article they are putting together. We should flag that terrorist content in particular is commonly distributed by journalists and it could be picked up by any system that is not sufficiently sophisticated.

Lord Parkinson of Whitley Bay (Con)

I see that my noble friend Lord Murray of Blidworth has joined the Front Bench in anticipation of the lunch-break business for the Home Office. That gives me the opportunity to say that I will discuss some of these points with him, my noble friend Lord Sharpe of Epsom and others at the Home Office.

Amendment 246 aims to ensure that there is no requirement for a provider to comply with a notice until the High Court has determined the appeal. The Government have ensured that, in addition to judicial review through the High Court, there is an accessible and relatively affordable alternative means of appealing Ofcom’s decisions via the Upper Tribunal. We cannot accept amendments such as this, which could unacceptably delay Ofcom’s ability to issue a notice, because that would leave children vulnerable.

To ensure that Ofcom’s use of its powers under Clause 110, and the technology that underpins it, are transparent, Ofcom will produce an annual report about the exercise of its functions using these powers. This must be submitted to the Secretary of State and laid before Parliament. The report must also provide the details of technology that has been assessed as meeting minimum standards of accuracy, and Ofcom may also consider other factors, including the impact of technologies on privacy. That will be separate to Ofcom’s annual report to allow for full scrutiny of this power.

The legislation also places a statutory requirement on Ofcom to publish guidance before its functions with regard to Clause 110 come into force. This will be after Royal Assent, given that the legislation is subject to change until that point. Before producing the guidance, Ofcom must consult the Information Commissioner. As I said, there are already strong safeguards regarding Ofcom’s use of these powers, so we think that this additional oversight is unnecessary.

Amendments 203 and 204, tabled by the noble Lord, Lord Clement-Jones, seek to probe the privacy implications of Ofcom’s powers to require technology under Clause 110. I reiterate that the Bill will not ban or weaken any design, including end-to-end encryption. But, given the scale of child sexual abuse and exploitation taking place on private communications, it is important that Ofcom has effective powers to require companies to tackle this abhorrent activity. Data from the Office for National Statistics show that in nearly three-quarters of cases where children are contacted online by someone they do not know, this takes place by private message. This highlights the scale of the threat and the importance of technology providers taking steps to safeguard children in private spaces online.

As already set out, there are already strong safeguards regarding the use of this power, and these will prevent Ofcom from requiring the use of any technology that would undermine a platform’s security and put users’ privacy at risk. These safeguards will also ensure that platforms will not be required to conduct mass scanning of private communications by default.

Until the regime comes into force, it is of course not possible to say with certainty which tools would be accredited. However, some illustrative examples of the kinds of current tools we might expect to be used—providing that they are highly accurate and compatible with a service’s design—are machine learning or artificial intelligence, which assess content to determine whether it is illegal, and hashing technology, which works by assigning a unique number to an image that has been identified as illegal.

Given the particularly abhorrent nature of the crimes we are discussing, it is important that services giving rise to a risk of child sexual abuse and exploitation in the UK are covered, wherever they are based. The Bill, including Ofcom’s ability to issue notices in relation to this or to terrorism, will therefore have extraterritorial effect. The Bill will apply to any relevant service that is linked to the UK. A service is linked to the UK if it has a significant number of UK users, if UK users form a target market or if the service is capable of being used in the UK and there is a material risk of significant harm to individuals in the UK arising from the service. I hope that that reassures the noble Lord, on behalf of his noble friend, about why that amendment is not needed.

Amendments 209 to 214 seek to place additional requirements on Ofcom to consider the effect on user privacy when using its powers under Clause 110. I agree that tackling online harm needs to take place while protecting privacy and security online, which is why Ofcom already has to consider user privacy before issuing notices under Clause 110, among the other stringent safeguards I have set out. Amendment 202A would impose a duty on Ofcom to issue a notice under Clause 110, where it is satisfied that it is necessary and proportionate to do so—this will have involved ensuring that the safeguards have been met.

Ofcom will have access to a wide range of information and must have the discretion to decide the most appropriate course of action in any particular scenario, including where this action lies outside the powers and procedures conferred by Clause 110; for instance, an initial period of voluntary engagement. This is an in extremis power. It is essential that we balance users’ rights with the need to enable a strong response, so Ofcom must be able to assess whether any alternative, less intrusive measures would effectively reduce the level of child sexual exploitation and abuse or terrorist content occurring on a service before issuing a notice.

I hope that that provides reassurance to noble Lords on the amendments in this group, and I invite the noble Lord to withdraw Amendment 14.

Lord Allan of Hallam (LD)

My Lords, this has been a very useful debate and serves as a good appetite builder for lunch, which I understand we will be able to take shortly.

I am grateful to the Minister for his response and to all noble Lords who have taken part in the debate. As always, the noble Baroness, Lady Kidron, gave us a balanced view of digital rights—the right to privacy and to security—and the fact that we should be trying to advance these two things simultaneously. She was right again to remind us that this is a real problem and there is a lot we can do. I know she has worked on this through things such as metadata—understanding who is communicating with whom—which might strike that nice balance where we are not infringing on people’s privacy too grossly but are still able to identify those who wish harm on our society and in particular on our children.

The noble Baroness, Lady Bennett, was right to pick up this tension between everything, everywhere, all at once and targeted surveillance. Again, that is really interesting to tease out. I am personally quite comfortable with quite intrusive targeted surveillance. I do not know whether noble Lords have been reading the Pegasus spyware stories: I am not comfortable with some Governments placing such spyware on the phones of human rights defenders but I would be much more relaxed about the British authorities placing something similar on the phones of people who are going to plant bombs in Manchester. We need to be really honest about where we are drawing our red lines if we want to go in the direction of targeted surveillance.

The noble Lord, Lord Moylan, was right again to remind us about the importance of private conversations. I cited the example of police officers whose conversations have been exposed. Although it is hard, we should remember that if ordinary citizens want to exchange horrible racist jokes with each other and so on in private groups that is not a matter for the state, but it is when it is somebody in a position of public authority; we have a right to intervene there. Again, we have to remember that as long as it is not illegal people can say horrible things in private, and we should not encourage any situation where we suggest that the state would interfere unless there are legitimate grounds—for example, it is a police officer or somebody is doing something that crosses the line of legality.

The noble Baroness, Lady Fox, reminded us that it is either encrypted or it is not. That is really helpful, as things cannot be half encrypted. If a service provider makes a commitment, it is critical that it is truthful. That is what our privacy law tells us. If I say, “This service is encrypted between you and the person you send the message to”, and I know that there is somebody in between who could access it, I am lying. I cannot say it is a private service unless it is truly private. We have to bear that in mind. Historically, people might have been more comfortable with fudging it, but not in 2023, when we have this raft of privacy legislation.

The noble Baroness is also right to remind us that privacy can be safety. There is almost nothing more devastating than the leaking of intimate images. When services such as iCloud move to encrypted storage that dramatically reduces the risk that somebody will get access to your intimate images if you store them there, which you are legally entitled to do. Privacy can be a critical part of an individual maintaining their own security and we should not lose that.

The noble Baroness, Lady Stowell, was right again to talk about general monitoring. I am pleased that she found the WhatsApp briefing useful. I was unable to attend but I know from previous contact that there are people doing good work and it is sad that that often does not come out. We end up with this very polarised debate, which my noble friend Lord McNally was right to remind us is unhelpful. The people south of the river are often working very closely in the public interest with people in tech companies. Public rhetoric tends to focus on why more is not being done; there are very few thanks for what is being done. I would like to see the debate move a little more in that direction.

The noble Lord, Lord Knight, opened up a whole new world of pain with VPNs, which I am sure we will come back to. I say simply that if we get the regulatory frameworks right, most people in Britain will continue to use mainstream services as long as they are allowed to be offered. If those services are regulated by the European Union under its Digital Services Act and pertain to the UK and the US in a similar way, they will in effect have global standards, so it will not matter where you VPN from. The scenario the noble Lord painted, which I worry about, is where those mainstream services are not available and we drive people into small, new services that are not regulated by anyone. We would then end up inadvertently driving people back to the wild west that we complain about, when most of them would prefer to use mainstream services that are properly regulated by Ofcom, the European Commission and the US authorities.

14:30
I shall search out the book written by the noble Lord, Lord Kamall, but he was right to talk about unintended consequences. Critically, we are in a world of known unknowns here: we know that there will be an issue when the technical notices are issued, but we do not have the technical notices, so it is really hard for us to understand how far they will be a problem.
The noble Lord, Lord Stevenson, talked about the human rights aspect. Again, that is critical. How do we know whether the powers are proportionate if we do not know what Ofcom is going to tell companies to do? That is the problem. To his credit, the Minister tried to respond and gave some more clarity. There was some in there—and people out there will pore over this like a sacred text to try to understand what was said—but what I heard was, “If you’re already offering an end-to-end encrypted service, we’re not going to tell you to get rid of it, but if your service isn’t currently end-to-end encrypted, we may”. I heard the words “if you are deliberately blinding yourself to the bad content”. That sounds to me like, “Don’t start encrypting if you’re not already encrypted”. If that is the Government’s intention, it may be reasonable, but we will need to tease it out further. It is quite a big deal. Looking forward, we have to ask whether, if end-to-end encrypted services did not exist today and were coming on to the market, Ofcom would try to use the powers to stop them coming on to the market or whether they would be relaxed. We still have a lot of known unknowns in this space that I am sure we will come back to.
I am conscious of the time. I am sure that there will be people out there who are looking at this debate. I remind them that, at this stage, we never vote on anything. I am sure that we will come back to this issue at later stages, where we may vote on it. I beg leave to withdraw the amendment.
Amendment 14 withdrawn.
Amendment 15 not moved.
Clause 6, as amended, agreed.
Clause 7 agreed.
House resumed.

Online Safety Bill

Committee (3rd Day) (Continued)
15:03
Clause 8: Illegal content risk assessment duties
Amendment 16
Moved by
16: Clause 8, page 7, line 16, after “governance,” insert “terms of service,”
Member’s explanatory statement
This amendment makes clear that “design and operation of a service” includes its terms of service.
Lord Stevenson of Balmacara (Lab)

My Lords, this group of amendments concerns terms of service. All the amendments either have the phrase “terms of service” in them or imply that we wish to see more use of the phrase in the Bill, and they seek to tidy up some of the other bits around that which have crept into the Bill.

Why are we doing that? Rather late in the day, terms of service have suddenly become a key fulcrum: much of how people use social media and other services on the internet, and how they view the material coming to them, will be expressed through them. With the loss of the adult “legal but harmful” provisions, we also lost quite a considerable amount of what would have been primary legislation, which no doubt would have been backed up by codes of practice. The situation we are left with, and which we need to look at very closely, is the triple shield at the heart of the new obligations on companies and, in particular, on their terms of service. That is set out primarily in Clauses 64, 65, 66 and 67, and is a subject to which my amendments largely refer.

Users of the services would be more confident that the Government have got their focus on terms of service right if the terms actually said what should be said on the tin, as the expression goes. If the terms of service were written and implemented so that material which should be taken down was indeed taken down, they would become a reliable means of judging whether or not the service is the one people want to have, and the free market would be seen to be working to empower people to make their own decisions about what level of risk they can assume by using a service. That is a major change from the way the Bill was originally envisaged. Because this was done late, we have one or two of the matters to which I have referred already, which means that the amendments focus on changing what is currently in the Bill.

It is also true that the changes were not consulted upon; I do not recall there being any document from government about whether this was a good way forward. The changes were certainly not considered by the Joint Committee, of which several of those present were members—we did not discuss it in the Joint Committee and made no recommendation on it. The level of scrutiny we have enjoyed on the Bill has been absent in this area. The right reverend Prelate the Bishop of Oxford will speak shortly to amendments about terms of service, and we will be able to come back to it. I think it would have been appropriate had the earlier amendment in the name of the noble Lord, Lord Pickles, been in this group because the issue was the terms of service, even though it had many other elements that were important and that we did discuss.

The main focus of my speech is that the Government have not managed to link this new idea of terms of service and the responsibilities that will flow from that to the rest of the Bill. It does not seem to fit into the overall architecture. For example, it is not a design feature, and does not seem to work through in that way. This is a largely self-contained series of clauses. We are trying to ask some of the world’s largest companies, on behalf of the people who use them, to do things on an almost contractual basis. Terms of service are not a contract that you sign up to, but you certainly click something—or occasionally click it, if you remember to—by which you consent to the company operating in a particular set of ways. In a sense, that is a contract, but is it really a contract? At the heart of that contract between companies and users is whether the terms of service are well captured in the way the Bill is organised. I think there are gaps.

The Bill does have something that we welcome and want to hold on to, which is that the process under which the risks are assessed and decisions taken about how companies operate and how Ofcom relates to those decisions is about the design and operation of the service—both the design and the operation, something that the noble Baroness, Lady Kidron, is very keen to emphasise at all times. It all starts and ends with design, and the operation is a consequence of design choices. Other noble Baronesses have mentioned in the debate that small companies get it right and so, when they grow, can be confident that what they are doing is something that is worth doing. Design, and operating that design to make a service, is really important. Are terms of service part of that or are they different, and does it matter? It seems to me that they are downstream from the design: something can be designed and then have terms of service that were not really part of the original process. What is happening here?

My Amendments 16, 21, 66DA, 75 and 197 would ensure that the terms of service are included within the list of matters that constitute “design and operation” of the service at each point that it occurs. I have had to go right through the Bill to add it in certain areas—in a rather irritating way, I am sure, for the Bill team—because sometimes we find that what I think should be a term of service is actually described as something else, such as “a publicly available statement”, whatever that is. It would be an advantage if we went through it again and defined terms of service and made sure that that was what we were talking about.

Amendments 70 to 72, 79 to 81 and 174 seek to help the Government and their officials with tidying up the drafting, which probably has not been scrutinised enough to pick up these issues. It may not matter, at the end of the day, but what is in the Bill is going to be law and we may as well try to get it right as best we can. I am sure the Minister will say we really do not need to worry about this because it is all about risks and outcomes, and if a company does not protect children or has illegal content, or the user-empowerment duties—the toggling—do not work, Ofcom will find a way of driving the company to sort it out. What does that mean in practice? Does it mean that Ofcom has a role in defining what terms of service are? It is not in the Bill and may not reach the Bill, but it is something that will be a bit of problem if we do not resolve what we mean by it, even if it is not by changing the legislation.

If the Minister were to disagree with my approach, it would be quite nice to have it said at the Dispatch Box so that we can look at that. The key question is: are terms of service an integral part of the design and operation of a service and, if so, can we extend the term to make sure that all aspects of the services people consume are covered by adequate and effective terms of service? There is probably going to be division in the way we approach this because, clearly, whether they are terms of service or have another name, the actual enforcement of illegal and children’s duties will be effected by Ofcom, irrespective of the wording of the Bill—I do not want to question that. However, there is obviously an overlap with questions about adults and others who are affected by the terms of service. If you cannot identify what the terms of service say in relation to something you might not wish to receive, because the terms of service are imprecise, how on earth are you going to operate the services, the toggles and things, around it? If you look at that and accept that there will be pressure within the market to get these terms of service right, there will be a lot of dialogue with Ofcom. I accept that all that will happen, but it would be good if the position of the terms of service were clarified in the Bill before it becomes law and if Ofcom’s powers in relation to them were clarified—does Ofcom or does it not have the chance to review terms of service if they turn out to be ineffective in practice? If that is the case, how are we going to see this work out in practice, in terms of what people will be able to do about it, either through redress or by taking the issue to court? I beg to move.

Baroness Kidron (CB)

I support these amendments, which were set out wonderfully by the noble Lord, Lord Stevenson. I want to raise a point made on Tuesday when the noble Baroness, Lady Merron, said that only 3% of people read terms of service and I said that 98% of people do not read them, so one of us is wrong, but I think the direction of travel is clear. She also used a very interesting phrase about prominence, and I want to use this opportunity to ask the Minister whether there is some lever whereby Ofcom can insist on prominence for certain sorts of material—a hierarchy of information, if you like—because these are really important pieces of information, buried in the wrong place so that even 2% or 3% of people may not find them.

Lord Allan of Hallam (LD)

My Lords, I am very pleased that the noble Lord, Lord Stevenson, has given us the opportunity to talk about terms of service, and I will make three points again, in a shorter intervention than on the previous group.

First, terms of service are critical as the impact of terms of service will generally be much greater in terms of the amount of intervention that occurs on content than it will ever be under the law. Terms of service create, in effect, a body of private law for a community, and they are nearly always a superset of the public law—indeed, it is very common for the first items of a terms of service to say, “You must not do anything illegal”. This raises the interesting question of “illegal where?”—what it generally means is that you must not do anything illegal in the jurisdiction in which the service provider is established. The terms of service will say, “Do not do anything illegal”, and then they will give a whole list of other things, as well as illegality, that you cannot do on the platform, and I think this is right because they have different characteristics.

15:15
Secondly, to back up the point made by the noble Baroness, Lady Kidron, we need to be realistic that no one will ever read all the terms of service of the services that they use. There was a study that looked at how long it would take to read the terms of service on a typical mobile phone—I think it is around 10 days; given that they get updated most years, are any of us going to spend 10 days a year reading the terms of service?
We like our real-world analogues: the law applies to all of us, but no one out there reads all of the laws of the land unless and until they have a problem, at which point they do read them. Terms of service are very similar in that people are not going to read them and we should not expect people to read them unless and until they have a problem that requires them to do so. I do not mean that as a counsel of despair, but we have to be realistic about what we are expecting people to do.
Thirdly, the Bill is going to make terms of service longer, and we need to get over that. The challenge is always that you want your terms of service to be comprehensive and easy for users, and as we move in the Bill towards making terms of service more actionable—which we are doing because the Bill says that Ofcom will be able to say, “Did you apply your terms of service properly?”—the lawyers for the platforms are going to be saying “What have we missed out?” and “If there is anything we have missed out, we have to go and stick it in there because now we are going to have a regulator breathing down our neck, checking whether or not we have done what we say”.
We should be realistic that we are asking companies to be entirely comprehensive and transparent, and in general that will mean making their terms of service longer. Again, this is not a complete counsel of despair. We can follow Mark Twain’s advice:
“I didn’t have time to write you a short letter, so I wrote you a long one”
and invest the time. That is what we can do to try to make terms of service shorter, rather than just saying to lawyers, “We will pay you by the word, and the more words there are, the happier we are”. But, again, we should be realistic: if it is comprehensive, it is going to be long; there is no way to avoid that.
The noble Lord, Lord Stevenson, asked whether or not it is a contract—that is an interesting question, certainly for the US providers. In the US the regulation, such that there is, is done largely by the Federal Trade Commission, and the concept is whether or not services are engaged in unfair or deceptive practices. An unfair or deceptive practice is not doing what you said you would do in the terms of service or a critical document of that nature.
Interestingly, all the incentive in the US is to be as vague as possible because if you have not said that you will do things, you cannot be hauled in front of the FTC. The EU generally creates incentives to be as comprehensive as possible, and I was involved in a number of cases where the company I worked for was taken to court and forced to add in more text because the US text was seen as too skeletal—that is a familiar debate to us here, whether we like things to be skeletal or for everything to be filled in.
So we need to be cognisant of that as we build terms of service into the Bill. This is not an argument against the amendments, but rather to say that as we do this, we need to be clear that we may be pulling in opposite directions. They need to be comprehensive, yet easy to use. “We are going to hold you accountable in the US; therefore, you should be vague; but we are also going to hold you accountable in the UK if you are too vague”—where is the right point of specificity and vagueness?
Having said that, it is really important that we focus on this because from a user’s point of view you are far more likely to come across an issue with the terms than an issue with the law—this is great, because most people in this country are law-abiding and not seeking to break the law.
The final point is that sometimes there is a tendency to think that everyone should have uniform terms of service. I can see the argument for a baseline, but in a vibrant market there is a strong case to say that we should celebrate where they are different, and there are communities that are different. For example, if you have a service that targets young people you might want to prohibit swearing; whereas, for example, it would be completely inappropriate to prohibit swearing in a vibrant political community for adults only. There are lots of areas where people understand that the context is different. For example, there are places where nudity—not pornography—is okay, and places where it is not.
So having different terms of service for different types of service is healthy, but I also think that Ofcom making sure that people do what they say they do is a reasonably healthy development, as long as we recognise and accept the consequences of that.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I am grateful for this short and focused debate, which has been helpful, and for the points made by the noble Lords, Lord Stevenson and Lord Allan, and the noble Baroness, Lady Kidron. I think we all share the same objective: ensuring that terms of service promote accountability and transparency, and empower users.

One of the Bill’s key objectives is to ensure that the terms of service of user-to-user platforms are suitable and effective. Under the Bill, companies will be required both to set out clearly how they will tackle illegal content and protect children and to ensure that their terms of service are properly enforced. The additional transparency and accountability duties on category 1 services will further ensure that users know what to expect on the largest platforms. This will put an end to these services arbitrarily removing content or, conversely, failing to remove content that they profess to prohibit.

The Bill will also ensure that search services are clear to their users about how they are complying with their adult and child safety duties under this new law. Given the very different way in which search services operate, however, this will be achieved through a publicly available statement rather than through terms of service. The two are intentionally distinct.

Noble Lords are right to point to the question of intelligibility. It struck me that, if it takes 10 days to read terms of service, perhaps we should have a race during the 10 days allotted to this Committee stage to see which is quicker—but I take the point. The noble Lord, Lord Allan, is also right that the further requirements imposed through this Bill will only add to that.

The noble Baroness, Lady Kidron, asked a fair question about what “accessibility” means. The Bill requires all platforms’ terms of service for illegal content and child safety duties to be clear and accessible. Ofcom will provide guidance on what that means, including ensuring that they are suitably prominent. The same applies to terms of service for category 1 services relating to content moderation.

I will focus first on Amendments 16, 21, 66DA, 75 and 197, which seek to ensure that both Ofcom and platforms consider the risks associated with platforms’ terms of service with regard to the illegal content and child safety duties in the Bill. We do not think that these amendments are needed. User-to-user services will already be required to assess the risks regarding their terms of service for illegal content. Clause 8 requires companies to assess the “design and operation” of a service in relation to illegal content. As terms of service are integral to how a service operates, they would be covered by this provision. Similarly, Clause 10 sets out that companies likely to be accessed by children will be required to assess the “design and operation” of a service as part of their child risk assessments, which would include the extent to which their terms of service may reduce or increase the risk of harm to children.

In addition to those risk assessment duties, the safety duties will require companies to take proportionate measures effectively to manage and mitigate the risk of harm to people whom they have identified through risk assessments. This will include making changes to their terms of service, if appropriate. The Bill does not impose duties on search services relating to terms of service, as search services’ terms of service play a less important role in determining how users can engage on a platform. I will explain this point further when responding to specific amendments relating to search services but I can assure the noble Lord, Lord Stevenson, that search services will have comprehensive duties to understand and mitigate how the design and operation of their service affects risk.

Amendment 197 would require Ofcom to assess how platforms’ terms of service affect the risk of harm to people that the sector presents. While I agree that this is an important risk factor which Ofcom must consider, it is already provided for in Clause 89, which requires Ofcom to undertake an assessment of risk across regulated services. That requires Ofcom to consider which characteristics of regulated services give rise to harm. Given how integral terms of service are to how many technology companies function, Ofcom will necessarily consider the risk associated with terms of service when undertaking that risk assessment.

However, elevating terms of service above other systems and processes, as mentioned in Clause 89, would imply that Ofcom needs to take account of the risk of harm on the regulated service, more than it needs to do so for other safety-by-design systems and processes or for content moderation processes, for instance. That may not be suitable, particularly as the service delivery methods will inevitably change over time. Instead, Clause 89 has been written to give Ofcom scope to organise its risk assessment, risk register and risk profiles as it thinks suitable. That is appropriate, given that it is best placed to develop detailed knowledge of the matters in question as they evolve over time.

Amendments 70, 71, 72, 79, 80, 81, 174 and 302 seek to replace the Bill’s references to publicly available statements, in relation to search services, with terms of service. This would mean that search services would have to publish how they are complying with their illegal content and child protection duties in terms of service rather than in publicly available statements. I appreciate the spirit in which the noble Lord has tabled and introduced these amendments. However, they do not consider the very different ways in which search services operate.

User-to-user services’ terms of service fulfil a very specific purpose. They govern a user’s behaviour on the service and set rules on what a user is allowed to post and how they can interact with others. If a user breaks these terms, a service can block his or her access or remove his or her content. Under the status quo, users have very few mechanisms by which to hold user-to-user platforms accountable to these terms, meaning that users can arbitrarily see their content removed with few or no avenues for redress. Equally, a user may choose to use a service because its terms and conditions lead them to believe that certain types of content are prohibited while in practice the company does not enforce the relevant terms.

The Bill’s duties relating to user-to-user services’ terms of service seek to redress this imbalance. They will ensure that people know what to expect on a platform and enable them to hold platforms accountable. In contrast, users of search services do not create content or interact with other users. Users can search for anything without restriction from the search service provider, although a search term may not always return results. It is therefore not necessary to provide detailed information on what a user can and cannot do on a search service. The existing duties on such services will ensure that search engines are clear to users about how they are complying with their safety duties. The Bill will require search services to set out how they are fulfilling them, in publicly available statements. Their actions must meet the standards set by Ofcom. Using these statements will ensure that search services are as transparent as user-to-user services about how they are complying with their safety duties.

The noble Lord’s Amendment 174 also seeks to expand the transparency reporting requirements to cover the scope and application of the terms of service set out by search service providers. This too is unnecessary because, via Schedule 8, the Bill already ensures transparency about the scope and application of the provisions that search services must make publicly available. I hope that gives the noble Lord some reassurance that the concerns he has raised are already covered. With that, I invite him to withdraw Amendment 16.

Lord Stevenson of Balmacara (Lab)

My Lords, I am very grateful to the Minister for that very detailed response, which I will have to read very carefully because it was quite complicated. That is the answer to my question. Terms of service will not be very easy to identify because, to answer my questions, he has had to pray in aid issues that Ofcom will necessarily have to assess—terms of service—to get at whether the companies are performing the duties that the Bill requires of them.

I will not go further on that. We know that there will be enough there to answer the main questions I had about this. I take the point about search being distinctively different in this area, although a tidy mind like mine likes to see all these things in one place and understand all the words. Every time I see “publicly available statement”, I do not know why but I think about people being hanged in public rather than a term of service or a contract.

15:30
The noble Lord, Lord Allan, made the point that nobody ever reads these terms of service. We generally agree with that, but if you are married to a lawyer, as I am, you read an awful lot more of these things than you perhaps feel are good for your diet. I cannot even go on holiday until I have proven to her that I have read every word of my insurance policy on what I will be shipped home with. It is a frightening thought that some people do that because they like doing it, and she does.
I will not take this much further. The jibe that I had at the beginning—that this does not quite fit with the rest of the Bill—is still there, but we will not get much change out of what we are doing. The important thing is that, even though it is a rather complicated route, it looks as though Ofcom will have, possibly retrospectively and with more transparency than actual powers, the ability to look at terms of service when they are not working.
What I miss is the ability to set a standard for terms of service that is broadly acceptable to people, which was exactly the point that the noble Lord made: they cannot be so complex that you will not read them but they have to be sufficient to achieve what they do. I am still lost about what you can use the triple shield for if you do not know whether the services will deliver what you know you do not want. I beg leave to withdraw the amendment.
Amendment 16 withdrawn.
Amendment 16A
Moved by
16A: Clause 8, page 7, line 23, after “19(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 19 about supplying records of risk assessments to OFCOM.
Amendment 16A agreed.
Clause 8, as amended, agreed.
Clause 9: Safety duties about illegal content
Amendments 16B and 16C
Moved by
16B: Clause 9, page 7, line 27, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to summarise illegal content risk assessments in the terms of service (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 1 services.
16C: Clause 9, page 7, line 27, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to summarise illegal content risk assessments in the terms of service (see the amendment inserting new subsection (8A) below) is imposed only on providers of Category 1 services.
Amendments 16B and 16C agreed.
Amendment 17
Moved by
17: Clause 9, page 7, line 30, leave out “prevent individuals from” and insert “protect individuals from harms arising due to them”
Member’s explanatory statement
This amendment, along with the other amendment to Clause 9 in the name of Lord Moylan, adds a requirement to protect individuals from harm, rather than monitoring, prior restraint and/or denial of access. Further obligations to mitigate and manage harm, including to remove unlawful content that is signalled to the service provider, are unchanged by this amendment.
Lord Moylan (Con)

My Lords, this is a very large and wide-ranging group of amendments. Within it, I have a number of amendments that, on their own, span three separate subjects. I propose to address these one after the other in my opening remarks, but other subjects will be brought in as the debate continues and other noble Lords speak to their own amendments.

If I split the amendments that I am speaking to into three groups, the first is Amendments 17 and 18. These relate to Clause 9, on page 7, where safety duties about illegal content are set out. The first of those amendments addresses the obligation to prevent individuals encountering priority illegal content by means of the service.

Earlier this week in Committee, I asked the Minister whether the Government understood “prevent” and “protect”, both of which they use in the legislation, to have different weight. I did not expect my noble friend to give an answer at that point, but I know that he will have reflected on it. We need clarity about this at some point, because courts will be looking at, listening to and reading what the Government say at the Dispatch Box about the weight to be given to these words. To my mind, to prevent something happening requires active measures in advance that ensure as far as reasonably and humanly possible that it does not actually happen, but one could be talking about something more reactive to protect someone from something happening.

This distinction is of great importance to internet companies—I am not talking about the big platforms—which will be placed, as I say repeatedly, under very heavy burdens by the Bill. It is possible that they simply will not be able to discharge them and will have to go out of business.

Let us take Wikipedia, which was mentioned earlier in Committee. It operates in 300 languages but employs 700 moderators globally to check what is happening. If it is required by Clause 9 to

“prevent individuals from encountering priority illegal content by means of the service”,

it will have to scrutinise what is put up on this community-driven website as or before it appears. Quite clearly, something such as Welsh Wikipedia—there is Wikipedia in Welsh—simply would not get off the ground if it had to meet that standard, because the number of people who would have to be employed to do that would be far more than the service could sustain. However, if we had something closer to the wording I suggest in my amendment, where services have to take steps to “protect” people—so they could react to something and take it down when they become aware of it—it all becomes a great deal more tolerable.

Similarly, Amendment 18 addresses subsection (3) of the same clause, where there is a

“duty to operate a service using proportionate systems and processes … to … minimise the length of time”

for which content is present. How do you know whether you are minimising the length of time? How is that to be judged? What is the standard by which that is to be measured? Would it not be a great deal better and more achievable if the wording I propose, which is that you simply are under an obligation to take it down, were inserted? That is my first group of amendments. I put that to my noble friend and say that all these amendments are probing to some extent at this stage. I would like to hear how he thinks that this can actually be operated.

My second group is quite small, because it contains only Amendment 135. Here I am grateful to the charity JUSTICE for its help in drawing attention to this issue. This amendment deals with Schedule 7, on page 202, where the priority offences are set out. Paragraph 4 of the schedule says that a priority offence includes:

“An offence under any of the following provisions of the Public Order Act 1986”.


One of those is Section 5 of that Act, “Harassment, alarm or distress”. Here I make a very different point and return to territory I have been familiar with in the past. We debated this only yesterday in Grand Committee, although I personally was unable to be there: the whole territory of hate crimes, harmful and upsetting words, and how they are to be judged and dealt with. In this case, my amendment would remove Section 5 of the Public Order Act from the list of priority offences.

If society has enough problems tolerating the police going round and telling us when we have done or said harmful and hurtful things and upbraiding us for it, is it really possible to consider—without the widest form of censorship—that it is appropriate for internet platforms to judge us, shut us down and shut down our communications on the basis of their judgment of what we should be allowed to say? We already know that there is widespread suspicion that some internet platforms are too quick to close down, for example, gender critical speech. We seem to be giving them something close to a legislative mandate to be very trigger-happy when it comes to closing down speech by saying that it engages, or could engage, Section 5 of the Public Order Act. I will come to the question of how they judge it in my third group, in a moment—but the noble Lord might be able to help me.

Lord Allan of Hallam (LD)

Just to reinforce the point the noble Lord, Lord Moylan, made on that, I certainly had experience of where the police became the complainants. They would request, for example, that you take down an English Defence League event, claiming that it would be likely to cause a public order problem. I have no sympathy whatever with the English Defence League, but I am very concerned about the police saying “You must remove a political demonstration” to a platform and citing the legal grounds for doing that. The noble Lord is on to a very valid point to be concerned about that.

Lord Moylan (Con)

I am grateful to the noble Lord. I really wonder whether the Government realise what they are walking into here. On the one hand, yesterday the Grand Committee was debating the statutory instrument putting in place new statutory guidance for the police on how to enforce, much more sensitively than in the past, non-crime hate incidents. However, on the other hand, the next day in this Chamber we are putting an obligation on a set of mostly foreign private companies to act as a police force to go around bullying us and closing us down if we say something that engages Section 5 of the Public Order Act. I think this is something the Government are going to regret, and I would very much like to hear what my noble friend has to say about that.

Finally, I come to my third group of amendments: Amendments 274, 278, 279 and 283. They are all related and on one topic. These relate to the text of the Bill on page 145, in Clause 170. Here we are discussing what judgments providers have to make when they come to decide what material to take down. Inevitably, they will have to make judgments. That is one of the unfortunate things about this Bill. A great deal of what we do in our lives is going to have to be based on judgments made by private companies, many of which are based abroad but which we are trying to legislate for.

It makes a certain sense that the law should say what they should take account of in making those judgments. But the guidance—or rather, the mandate—given to those companies by Clause 170 is, again, very hair-trigger. Clause 170(5), which I am proposing we amend, states:

“In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is … of the kind in question”.


I am suggesting that “reasonable grounds to infer” should be replaced with “sufficient evidence to infer”, so that they have to be able to produce some evidence that they are justified in taking content down. The test should be higher than simply having “reasonable grounds”, which may rest on a suspicion and little evidence at all. So one of those amendments relates to strengthening that bar so that they must have real evidence before they can take censorship action.

I add only two words to subsection (6), which talks about reasonable grounds for the inference—it defines what the reasonable grounds are—that

“exist in relation to content and an offence if, following the approach in subsection (2)”

and so on. I am saying “if and only if”—in other words, I make it clear that this is the only basis on which material can be censored using the provisions in this section, so as to limit it from going more widely. The third amendment in my group is essentially consequential to that.

15:45
We are all worried in this Committee about the prospect of speech being censored in a way which infringes the freedom of speech rights that we fought so hard to establish and which are also embedded in Article 10 of the European Convention on Human Rights. We want to have a legal structure that does not empower providers to act as private sector censors ranging over what we do, except in circumstances where it is wholly justified and in the public interest. The language in the Bill is far too loose for this purpose. It does not give us the protection. It does not do what my noble friend said it would do when he spoke at Second Reading, which is to strike the right balance. These amendments in my third group—and indeed the one in my second group—are there to help strike the right balance. I beg to move.
The Lord Bishop of Guildford

My Lords, I will speak to Amendments 128, 130 and 132, as well as Amendments 143 to 153 in this grouping. They were tabled in the name of my right reverend colleague the Bishop of Derby, who is sorry that she cannot be here today.

The Church of England is the biggest provider of youth services in our communities and educates around 1 million of our nation’s children. My colleague’s commitment to the principles behind these amendments also springs from her experience as vice chair of the Children’s Society. The amendments in this grouping are intended to strengthen legislation on online grooming for the purpose of child criminal exploitation, addressing existing gaps and ensuring that children are properly protected. They are also intended to make it easier for evidence of children being groomed online for criminal exploitation to be reported by online platforms to the police and the National Crime Agency.

Research from 2017 shows that one in four young people reported seeing illicit drugs advertised for sale on social media—a percentage that is likely to be considerably higher six years on. According to the Youth Endowment Fund in 2022, 20% of young people reported having seen online content promoting gang membership in the preceding 12 months, with 24% reporting content involving the carrying, use or promotion of weapons.

In relation to drugs, that later research noted that these platforms provide opportunities for dealers to build trust with potential customers, with young people reporting that they are more likely to see a groomer advertising drugs as a friend than as a dealer. This leaves young people vulnerable to exploitation, thereby reducing the scruples or trepidation they might feel about buying drugs in the first place. Meanwhile, it is also clear that social media is changing the operation of the county lines model. There is no longer the need to transport children from cities into the countryside to sell drugs, given that children who live in less populated areas can be groomed online as easily as in person. A range of digital platforms is therefore being used to target potential recruits among children and young people, with digital technologies also being deployed—for example, to monitor their whereabouts on a drugs run.

More research is being carried out by the Children’s Society, whose practitioners reported a notable increase in the number of perpetrators grooming children through social media and gaming sites during the first and second waves of the pandemic. Young people were being contacted with promotional material about lifestyles they could lead and the advantages of working within a gang, and were then asked to do jobs in exchange for money or status within this new group. It is true that some such offences could be prosecuted under the Modern Slavery Act 2015, but there remains a huge disparity between the scale of exploitation and the number of those being charged under the Act. Without a definition of child exploitation for criminal purposes, large numbers of children are being groomed online and paying the price for crimes committed by some of their most dangerous and unscrupulous elders.

It is vital that we protect our children from online content which facilitates that criminal exploitation, in the same way that we are looking to protect them from sexual exploitation. Platforms must be required to monitor for illegal content related to child criminal exploitation on their sites and to have mechanisms in place for users to flag it with those platforms so it can be removed. This can be achieved by including modern slavery and trafficking, of which child criminal exploitation is a form, into the scope of illegal content within the Bill, which is what these amendments seek to do. It is also vital that the law sets out clear expectations on platforms to report evidence of child criminal exploitation to the National Crime Agency in the same way as they are expected to report content involving child sexual exploitation and abuse to enable child victims to be identified and to receive support. Such evidence may enable action against the perpetrators without the need of a disclosure from child victims. I therefore fully support and endorse the amendments standing in the name of the right reverend Prelate.

Lord Allan of Hallam (LD)

My Lords, this is again a very helpful set of amendments. I want to share some experience that shows that legality tests are really hard. Often from the outside there is an assumption that it is easy to understand what is legal and illegal in terms of speech, but in practice that is very rarely the case. There is almost never a bright line, except in a small class of child sexual abuse material where it is always illegal and, as soon as you see the material, you know it is illegal and you can act on it. In pretty much every other case, you have to look at what is in front of you.

I will take a very specific example. Something we had to deal with was images of Abdullah Öcalan, the leader of the PKK in Turkey. If somebody shared a picture of Abdullah Öcalan, were they committing a very serious offence, which is the promotion of terrorism? Were they indicating support for the peace process that was taking place in Turkey? Were they showing that they support his socialist and feminist ideals? Were they supporting the YPG, a group in Syria to which we were sending arms, that venerates him? This is one example of many I could give where the content in front of you does not tell you very clearly whether or not the speech is illegal or speech that should be permitted. Indeed, we would take speech like that down and I would get complaints, including from Members of Parliament, saying, “Why have you removed that speech? I’m entitled to talk about Abdullah Öcalan”, and we would enter into an argument with them.

We would often ask lawyers in different countries whether they could tell us whether a speech was legal or illegal. The answer would come back as probably illegal, likely illegal, maybe illegal and, occasionally, definitely not illegal, but it was nearly always on the spectrum. The amendments we are proposing today are to try to understand where the Government intend people to draw that line when they get that advice. Let us assume the company wants to do the right thing and follow the instructions of the Bill and remove illegal content. At what level do they say it has met the test sufficiently, given that in the vast majority of cases, apart from the small class of illegal content, they are going to be given only a likelihood or a probability? As the noble Lord, Lord Moylan, pointed out, we have to try to insert this notion of sufficient evidence with Amendments 273, 275, 277, 280 and 281 in the names of my noble friend Lord Clement-Jones and the noble Viscount, Lord Colville, who is unable to be in his place today. I think the noble Baroness, Lady Kidron, may also have signed them. We are trying to flesh out the point at which that illegality standard should kick in.

Just to understand again how this often works when the law gets involved, I say that there is a law in Germany; the short version is NetzDG. If there are any German speakers who can pronounce the compound noun that is its full title, there will be a prize. It is a long compound word that means “network enforcement Act”. It has been in place for a few years and it tells companies to do something similar—to remove content that is illegal in Germany. There would be cases where we would get a report from somebody saying, “This is illegal”, and we would take action; then it went into the German system and three months later we would finally get told whether it was actually illegal in a 12-page judgment that a German court had figured out. In the meantime, all we could do was work on our best guess while that process was going on. I think we need to be very clear that illegality is hard.

Cross-jurisdictional issues present us with another set of challenges. If both the speaker and the audience are in the United Kingdom, it is fairly clear. But in many cases, when we are talking about online platforms, one or other, or even both, of the speaker and the audience may be outside the United Kingdom. Again, when does the speech become illegal? It may be entirely legal speech between two people in the United States. I think—and I would appreciate clarification from the Minister—that the working assumption is that if the speech was reported by someone not in the United States but in the UK, the platform would be required to restrict access to it from the UK, even though the speech is entirely legal in the jurisdiction in which it took place. Because the person in the UK encountered it, there would be a duty to restrict it. Again, it has been clarified that there is certainly not a duty to take the speech down, because it is entirely legal speech outside the UK. These cross-jurisdictional issues are interesting; I hope the Minister can clarify that.

The amendments also try to think about how this would work in practice. Amendment 287 talks about how guidance should be drawn up in consultation with UK lawyers. That is to avoid a situation where platforms are guessing too much at what UK lawyers want; they should at least have sought UK legal advice. That advice will then be fed into the guidance given to their human reviewers and their algorithms. That is the way, in practice, in which people will carry out the review. There is a really interesting practical question—which, again, comes up under NetzDG—about the extent to which platforms should be investing in legal review of content that is clearly against their terms of service.

There will be two kinds of platform. There will be some platforms that see themselves as champions of freedom of expression and say they will only remove stuff that is illegal in the UK, and everything else can stay up. I think that is a minority of platforms—they tend to be on the fringes. As soon as a platform gets a mainstream audience, it has to go further. Most platforms will have terms of service that go way beyond UK law. In that case, they will be removing the hate speech, and they will be confident that they will remove UK-illegal hate speech within that. They will remove the terrorist content. They will be confident and will not need to do a second test of the legality in order to be able to remove that content. There is a practical question about the extent to which platforms should be required to do a second test if something is already illegal under their terms.

There will be, broadly speaking again, four buckets of content. There will be content that is clearly against a platform’s terms, which it will want to get rid of immediately. It will not want to test it again for legality; it will just get rid of it.

There will be a second bucket of content that is not apparently against a platform’s terms but clearly illegal in the UK. That is a very small subset of content: in Germany, that is Holocaust denial content; in the United Kingdom, this Parliament has looked at Holocaust denial and chosen not to criminalise it, so that will not be there, but an equivalent for us would be migration advice. Migration advice will not be against the terms of service of most platforms, but the Government’s intention in the Illegal Migration Bill is to make it illegal, and the consequent effect will be that it will have to be removed under the terms of this Bill. So there will be that small set of content that is illegal in the UK but not against terms of service.

There will be a third bucket of content that is not apparently against the terms or the law, and that actually accounts for most of the complaints that a platform gets. I will choose my language delicately: complaint systems are easy to use, and people complain to make a point. They use complaint systems as a kind of dislike button. The reality is that one of the most common sets of complaints you get is when there is a football match and the two opposing teams report the content on each other’s pages as illegal. They will do that every time, and you get used to it, and that is why you learn to discount mass-volume complaints. But again, we should be clear that there are a great many complaints that are merely vexatious.

The final bucket is of content that is unclear and legal review will be needed. Our amendment is intended to deal with those. A platform will go out and get advice. It is trying to understand at what point something like migration advice tips over into the illegal as opposed to being advice about going on holiday, and it is trying to understand that based on what it can immediately see. Once it has sought that advice, it will feed that back into the guidance to reviewers and the algorithms to try and remove content more effectively and be compliant with the Bill as a whole and not get into trouble with Ofcom.

Some areas are harder than others. The noble Lord, Lord Moylan, already highlighted one: public order offences, which are extremely hard. If somebody says something offensive or holds an offensive political view—I suspect the noble Baroness, Lady Fox, may have something to say on this—people may well make contact and claim that it is in breach of public order law. On the face of it, they may have a reasonably arguable case but again, as a platform, you are left to make a decision.

16:00
There is a really interesting potential role for Ofcom here. One thing that is frustrating if you work at a platform is that you will often get stuck and when you go out and look for advice, you find it is hard to get it. When I ran a working group with some French lawyers, including quite senior judges, they came into the working group saying, “This is all straightforward—you’re just not removing the illegal stuff”. So we gave them real cases and it was interesting to see how half of the lawyers in the room would be on one side, saying “It must come down—it’s against French law” while the other half was saying, “How could you possibly take this down in France?”, because it was protected speech. It is really difficult to get that judgment but, interestingly, an unintended consequence of the Bill may be that Ofcom will ultimately get stuck in that position.
The Bill is not about Ofcom making rulings on individual items of content but if—as in the example I shared with the noble Lord, Lord Moylan, earlier—the police have said to a platform, “You must remove this demonstration. It is illegal”, and the platform said, “No, we judge it not to be illegal”, where are the police going to go? They will go to Ofcom and say, “Look, this platform is breaching the law”, so Ofcom is going to get pulled into that kind of decision-making. I do not envy it that but, again, we need to plan for that scenario because people who complain about illegality will go wherever they think they can get a hearing, and Ofcom will be one of those entities.
A huge amount on this illegal content area still needs to be teased out. I ask the Minister to respond specifically to the points I have raised around whose jurisdiction it is. If the speaker is speaking legally, because they are in a country outside the United Kingdom, what is the Government’s expectation on platforms in those circumstances? Will he look at the issue of the tests and where on this spectrum, from probably illegal through to likely to be illegal and may be illegal, the Government expect platforms to draw the line? If platforms have removed the bad content, will he consider carefully to what extent the Government think that the platforms should have to go through the process of investing time and energy to work out whether they removed it for illegality or for a terms of service breach? That is interesting but if our focus is on safety, frankly, it is wasted effort. We need to question how far we expect the platforms to do that.
Baroness Buscombe (Con)

My Lords, before speaking to my Amendment 137, I want to put a marker down to say that I strongly support Amendment 135 in the name of my noble friend Lord Moylan. I will not repeat anything that he said but I agree with absolutely every word.

Amendment 137 is in my name and that of my noble and learned friend Lord Garnier and the noble Lord, Lord Moore of Etchingham. This amendment is one of five which I have tabled with the purpose of meeting a core purpose of the Bill. In the words of my noble friend the Minister in response to Amendment 1, it is

“to protect users of all ages from being exposed to illegal content”—[Official Report, 19/4/23; col. 724.]

—in short, to ensure that what is illegal offline is illegal online.

If accepted, this small group of amendments would, I strongly believe, make a really important difference to millions of people’s lives—people who are not necessarily listed in Clause 12. I therefore ask the Committee to allow me to briefly demonstrate the need for these amendments through the prism of millions of people and their families working and living in rural areas. They are often quite isolated and working alone in remote communities, and are increasingly at risk of or are already suffering awful online abuse and harassment. This abuse often goes way beyond suffering; it destroys businesses and a way of life.

I find it extraordinary that the Bill seems to be absent of anything to do with livelihoods. It is all about focusing on feelings, which of course are important—and the most important focus is children—but people’s businesses and livelihoods are being destroyed through abuse online.

Research carried out by the Countryside Alliance has revealed a deeply disturbing trend online that appears to be disproportionately affecting people who live in rural areas and who are involved in rural pursuits. Beyond direct abuse, a far more insidious tactic that activists have adopted involves targeting businesses involved in activities of which they disapprove, such as livestock farming or hosting shoots. They post fake reviews on platforms including Tripadvisor and Google Maps, and their aim is to damage the victim, their business and their reputation by, to put it colloquially, trashing their business and thereby putting off potential customers. This is what some call trolling.

Let me be clear that I absolutely defend, to my core, the right to freedom of expression and speech, and indeed the right to offend. Just upsetting someone is way below the bar for the Bill, or any legislation. I am deeply concerned about the hate crime—or non-crime—issue we debated yesterday; in fact, I put off reading the debate because I so disagree with this nonsense from the College of Policing.

Writing a negative review directly based on a negative experience is entirely acceptable in my book, albeit unpleasant for the business targeted. My amendments seek to address something far more heinous and wrong, which, to date, can only be addressed as libel and, therefore, through the civil courts. Colleagues in both your Lordships’ House and in another place shared with me tremendously upsetting examples from their constituents and in their neighbourhoods of how anonymous activists are ruining the lives of hard-working people who love this country and are going the extra mile to defend our culture, historic ways of life and freedoms.

Fortunately, through the Bill, the Government are taking an important step by introducing a criminal offence of false communications. With the leave of the Committee, I will briefly cite and explain the other amendments in order to make sense of Amendment 137. One of the challenges of the offence of false communications is the need to recognise that so much of the harm that underpins the whole reason why the Bill is necessary is the consequence of allowing anonymity. It is so easy to destroy and debilitate others by remaining anonymous and using false communications. Why be anonymous if you have any spine at all to stand up for what you believe? It is not possible offline—when writing a letter to a newspaper, for example—so why is it acceptable online? The usual tech business excuse of protecting individuals in rogue states is no longer acceptable, given the level of harm that anonymity causes here at home.

Therefore, my Amendment 106 seeks to address the appalling effect of harm, of whatever nature, arising from false or threatening communications committed by unverified or anonymous users—this is what we refer to as trolling. Amendments 266 and 267, in my name and those of my noble and learned friend Lord Garnier and my noble friend Lord Leicester, would widen the scope of this new and welcome offence of false communications to include financial harm, and harm to the subject of the false message arising from its communication to third parties.

The Bill will have failed unless we act beyond feelings and harm to the person and include loss of livelihood. As I said, I am amazed that it is not front and centre of the Bill after safety for our children. Amendment 268, also supported by my noble and learned friend, would bring within the scope of the communications offences the instigation of such offences by others—for example, Twitter storms, which can involve inciting others to make threats without doing so directly. Currently, we are unsure whether encouraging others to spread false information—for example, by posting fake reviews of businesses for ideologically motivated reasons—would become an offence under the Bill. We believe that it should, and my Amendment 268 would address this issue.

I turn briefly to the specifics of my Amendment 137. Schedule 7 lists a set of “priority offences” that social media platforms must act to prevent, and they must remove messages giving rise to certain offences. However, the list does not include the new communications offences created elsewhere in Part 10. We believe that this is a glaring anomaly. If there is a reason why the new communications offences are not listed, it is important that we understand why. I hope that my noble friend the Minister can explain.

The practical effect of Amendment 137 would be to include the communications offences introduced in the Bill and communications giving rise to them within the definition of “relevant offence” and “priority illegal content” for the purposes of Clause 53(4) and (7) and otherwise.

Baroness Kidron (CB)

I ask the Committee to have a level of imagination here because I have been asked to read the speech of the noble Viscount, Lord Colville—

Baroness Stowell of Beeston (Con)

I do not know who advised the noble Baroness—and forgive me for getting up and getting all former Leader on her—but this is a practice that we seem to have adopted in the last couple of years and that I find very odd. It is perfectly proper for the noble Baroness to deploy the noble Viscount’s arguments, but to read his speech is completely in contravention of our guidance.

Baroness Kidron (CB)

I beg the pardon of the Committee. I asked about it and was misinformed; I will do as the noble Baroness says.

The noble Viscount, Lord Colville, is unable to be with us. He put his name to Amendments 273, 275, 277 and 280. His concern is that the Bill sets the threshold for illegality too low and that in spite of the direction provided by Clause 170, the standards for determining illegality are too vague.

I will make a couple of points on that thought. Clause 170(6) directs that a provider must have

“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”,

but that does not mean that the platform has to be certain that the content is illegal before it takes it down. This is concerning when you take it in combination with what or who will make judgments on illegality.

If a human moderator makes the decision, it will depend on the resources and time available to them as to how much information they gather in order to make that judgment. Unlike in a court case, when a wide range of information and context can be gathered, when it comes to decisions about content online, these resources are very rarely available to human moderators, who have a vast amount of content to get through.

If an automated system makes the judgment, it is very well established that algorithms are not good at context—the Communications and Digital Committee took evidence on this repeatedly when I was on it. AI simply uses the information available in the content itself to make a decision, which can lead to significant missteps. Clause 170(3) provides the requirement for the decision-makers to judge whether there is a defence for the content. In the context of algorithms, it is very unclear how they will come to such a judgment from the content itself.

I understand that these are probing amendments, but I think the concern is that the vagueness of the definition will lead to too much content being taken down. This concern was supported by Parliament’s Joint Committee on Human Rights, which wrote to the former Culture Secretary, Nadine Dorries, on that matter. I apologise again.

Baroness Fox of Buckley (Non-Afl)

My Lords, I support the amendments in this group that probe how removing illegal material is understood and will be used under the Bill. The noble Lord, Lord Moylan, explained a lot of my concerns, as indeed did the noble Viscount, Lord Colville, via his avatar. We have heard a range of very interesting contributions that need to be taken seriously by the Government. I have put my name to a number of amendments.

The identification of illegal material might be clear and obvious in some cases—even many cases. It sounds so black and white: “Don’t publish illegal material”. But defining communications of this nature can be highly complex, so much so that it is traditionally reserved for law enforcement bodies and the judicial system. We have already heard from the noble Lord, Lord Moylan, that, despite Home Secretaries, this House, regulations and all sorts of laws having indicated that non-crime hate incidents, for example, should not be pursued by the police, they continue to pursue them as though they are criminal acts. That is exactly the kind of issue we have.

16:15
I noted earlier that the noble Lord, Lord Bethell, made a passionate intervention about, of all things, Andrew Tate and his illegality in relation to this Bill. That prompted me to think about a number of things. Andrew Tate is an influencer whom I despise, as I do the kind of things he says. But, as far as I know, the criminal allegations he faces are not yet resolved, so he has to be seen as innocent until proven guilty. Most of what he has online that is egregious might well be in bad taste, as people say—I would say that it is usually misogynist—but it is not against the law. If we get to a situation where that is described as illegality, that is the kind of thing that I worry about. As we have heard from other noble Lords, removing so-called illegal content for the purpose of complying with this regulatory system will mean facing such dilemmas.
Lord Allan of Hallam (LD)

In talking about individuals and investigations, the noble Baroness reminded me of one class of content where we do have clarity, and that is contempt of court. That is a frequent request. We know that it is illegal in that case because a judge writes to the company and says, “You must not allow this to be said because it is in contempt of court”, but that really is the exception. In most other cases, someone is saying, “I think it is illegal”. In live proceedings, in most cases it is absolutely clear because a judge has told you.

Baroness Fox of Buckley (Non-Afl)

That is very helpful.

I am concerned that removing so-called illegal content for the purpose of complying with the regulatory system covers not only that which reaches conviction in a criminal court but possibly anything that a platform determines could be illegal, and therefore it undermines our own legal system. As I have said, that marks a significant departure from the rule of law. It seems that the state is asking or mandating private companies to make determinations about what constitutes illegality.

The obligations on a platform to determine what constitutes illegality could obviously become a real problem, particularly in relation to limitations on free expression. As we have already heard, the Public Order Act 1986 criminalises, for example, those who stir up hatred through the use of words, behaviour or written material. That is contentious in the law offline. By “contentious”, I mean that it is a matter of difficulty that requires the full rigour of the criminal justice system, understanding the whole history of established case law. That is all necessary to make a conviction under that law for offences of this nature.

Now we appear to be saying that, without any of that, social media companies should make the decision, which is a nerve-racking situation to be in. We have already heard the slippery phrase “reasonable grounds to infer”. If that was the basis on which you were sent to prison—if they did not have to prove that you were guilty but they had reasonable grounds to infer that you might be, without any evidence—I would be worried, yet reasonable grounds to infer that the content could be illegal is the basis on which we are asking for those decisions to be made. That is significantly below the ordinary burden of proof required to determine that an illegal act has been committed. Under this definition, I fear that platforms will be forced to overremove and censor what ultimately will be entirely lawful speech.

Can the Minister consider what competency social media companies have to determine what is lawful? We have heard some of the dilemmas from somebody who was in that position—let alone the international complications, as was indicated. Will all these big tech companies have to employ lots of ex-policemen and criminal lawyers? How will it work? It seems to me that there is a real lack of qualifications in that sphere— that is not a criticism, because those people decided to work in big tech, not in criminal law, and yet we are asking them to pursue this. That is a concern.

I will also make reference to what I think are the controversies around government Amendments 136A and 136B to indicate the difficulties of these provisions. They concern illegal activity—such as “assisting unlawful immigration”, illegal entry, human trafficking and similar offences—but I am unsure as to how this would operate. While it is the case that certain forms of entry to the UK are illegal, I can envisage a situation in which a perfectly legitimate political debate—for example, about the small boats controversy—would be taken down, and in which people advocating a position against the Government’s new Illegal Migration Bill could be accused of supporting illegality. What exactly will be made illegal in those amendments to the Online Safety Bill?

The noble Baroness, Lady Buscombe, made a fascinating speech about an interesting group of amendments. Because of the way the amendments are grouped, I feel that we have moved to a completely different debate, so I will not go into any detail on this subject. Anonymous trolling, Twitter storms and spreading false information are incredibly unpleasant. I am often the recipient of them—at least once a week—so I know personally that you feel frustrated that people tell lies and your reputation is sullied. However, I do not think that these amendments offer the basis on which that activity should be censored, and I will definitely argue against removing anonymity clauses—but that will be in another group. It is a real problem, but I do not think that the solution is contained in these amendments.

Baroness Stowell of Beeston (Con)

My Lords, my contribution will be less officious than my intervention earlier in this group. In the last couple of years since I returned to the House—as I describe it—having spent time at the Charity Commission, I have noticed a new practice emerging of noble Lords reading out other people’s speeches. Every time I had seen it happen before, I had not said anything, but today I thought, “I can’t sit here and not say anything again”. I apologise for my intervention.

I am grateful to my noble friend Lord Moylan for bringing forward his amendments and for introducing them in the incredibly clear way he did; they cover some very complex and diverse issues. I know that there are other amendments in the group which might be described as similar to his.

There are a couple of things I want to highlight. One interesting thing about the debate on this group is the absence of some of our legal friends—I apologise to my noble friend Lady Buscombe, who is of course a very distinguished lawyer. The point I am making is that we are so often enriched by a lot of legal advice and contributions on some of the more challenging legal issues that we grapple with, but we do not have that today, and this is a very difficult legal issue.

It is worth highlighting again, as has been touched on a little in some of the contributions, the concern, as I understand it, with how the Bill is drafted in relation to illegal content and the potential chilling effect of these clauses on social media platforms. As has already been said, there is a concern that it might lead them to take a safety-first approach in order to avoid breaking the law and incurring the sanctions and fines that come with the Bill, which Ofcom will have the power to apply. That is the point we are concerned with here. It is the way in which this is laid out, and people who are much better equipped than I am have already explained the difference between evidence and reasonable grounds to infer.

What the noble Lord, Lord Allan, hit on in his contribution is also worth taking into account, and that is the role of Ofcom in this situation. One of the things I fear, as we move into an implementation phase and the consequences of the Bill start to impact on the social media firms, is the potential for the regulator to be weaponised in a battle on the cultural issues that people are becoming increasingly exercised about. I do not have an answer to this, but I think it is important to understand the danger of where we might end up in terms of expectations of the regulator if we create a situation in which the way social media platforms act leads people to look for recourse, or for a place to take further an argument and a battle that will not be helpful at all.

I am not entirely sure, given my lack of legal expertise—this is why I would have been very grateful for some legal expertise on this group—whether what my noble friend is proposing in his amendments is the solution, but I think we need to be very clear that this is a genuine problem. I am not sure, as things stand in the Bill, that we should be comfortable that it is not going to create problems. We need to find a way to be satisfied that this has been dealt with properly.

Lord Bethell (Con)

It is a great honour to follow my noble friend. I completely agree with her that this is a powerful discussion and there are big problems in this area. I am grateful also to my noble friend Lord Moylan for raising this in the first place. It has been a very productive discussion.

I approach the matter from a slightly different angle. I will not talk about the fringe cases—the ones where there is ambiguity, difficulty of interpretation, or responsibility or regulatory override, all of which are very important issues. The bit I am concerned about is where primary priority content that clearly demonstrates some kind of priority offence is not followed up by the authorities at all.

The noble Lord, Lord Allan, referred to this point, although he did slightly glide over it, as though implying, if I understood him correctly, that this was not an area of concern because, if a crime had clearly been committed, it would be followed up on. My fear and anxiety is that the history of the internet over the last 25 years shows that crimes—overt and clear crimes that are there for us to see—are very often not followed up by the authorities. This is another egregious example of where the digital world is somehow exceptionalised and does not have real-world rules applied to it.

16:30
The noble Baroness, Lady Fox, quite reasonably asked me about Andrew Tate. That matter is sub judice; the noble Lord, Lord Allan, referred to it and I do not want to drag the conversation into dangerous legal territory. However, she makes the good point that we sometimes see, particularly in the online abuse of women, offences that are quite clearly crimes; they are crimes of rape, violent abuse and child abuse. It would not take any of us long to find videos that showed clear examples of crime, but very often they are not followed up with the energy and determination that they could or should be, because things on the internet somehow do not seem to touch the authorities in the way they should do.
His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services—
Baroness Fox of Buckley (Non-Afl)

I want to clarify one point. I have had a slightly different experience, which is that for many people—women, at least—whom I have talked to recently, there is an over-enthusiasm and an over-zealous attitude to policing the speech of particular women and, as we have already heard, gender-critical women. It is often under the auspices of hate speech and there is all sorts of discussion about whether the police are spending too long trawling through social media. By contrast, if you want to get a policeman or policewoman involved in a physical crime in your area, you cannot get them to come out. So I am not entirely convinced. I think policing online speech at least is taking up far too much of the authorities’ time, not too little time, and distracting them from solving real social and criminal activity.

Lord Bethell (Con)

I defer to the noble Baroness, Lady Fox, on speech crime. That is not the area of my expertise, and it is not the purpose of my points. My points were to do with the kinds of crime that affect children in particular. His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services is very specific about that point. It says that “unacceptable delays are commonplace” and it gives a very large number of case studies. I will not go through them now because it is Thursday afternoon, but I think noble Lords can probably imagine the kinds of things we are talking about. They include years of delay, cases not taken seriously or overlooked, evidence lost, and so forth. The report found that too often children were put at risk because of this, and offenders were allowed to escape justice, and it gave 17 recommendations for how the police force should adapt in order to meet this challenge.

So my questions to the Minister are these. When we talk about things such as age verification for hardcore porn, we are quite often told that we do not need to worry about some of this because it is covered by illegal content provisions, and we should just leave it to the police to sort out. His Majesty’s Inspectorate gives clear evidence—this is a recent report from last month—that this is simply not happening in the way it should be. I therefore wondered what, if anything, is in the Bill to try to close down this particular gap. That would be very helpful indeed.

If it is really not for the purposes of this Bill at all—if this is actually to do with other laws and procedures, other departments and the way in which the resources for the police are allocated, as the noble Baroness, Lady Fox, alluded to—what can the Government do outside the boundaries of this legislation to mobilise the police and the prosecution services to address what I might term “digital crimes”: that is, crimes that would be followed up with energy if they occurred in the real world but, because they are in the digital world, are sometimes overlooked or forgotten?

Lord Allan of Hallam (LD)

My Lords, I would like to raise one issue that I forgot to mention earlier, and I think it would be more efficient to pose the question to the Minister now rather than interject when he is speaking.

On the Government’s Amendments 136A, 136B and 136C on the immigration offences, the point I want to make is that online services can be literal life-savers for people who are engaged in very dangerous journeys, including journeys across the Channel. I hope the Minister will be clear that the intention here is to require platforms to deal only with content, for example, from criminals who are offering trafficking services, and that there is no intention to require platforms somehow to withdraw services from the victims of those traffickers when they are using those services in the interest of saving their own lives or seeking advice that is essential to preserving their own safety.

That would create—as I know he can imagine—real ethical and moral dilemmas, and we should not be giving any signal that we intend to require platforms to withdraw services from people who are in desperate need of help, whatever the circumstances.

Lord Stevenson of Balmacara (Lab)

My Lords, we seem to have done it again—a very long list of amendments in a rather ill-conceived group has generated a very interesting discussion. We are getting quite good at this, exchanging views across the table, across the Committee, even within the Benches—Members who perhaps have not often talked together are sharing ideas and thoughts, and that is a wonderful feeling.

I want to start with an apology. I think I may be the person who got the noble Baroness, Lady Kidron, shopped by the former leader—once a leader, always a leader. What I thought I was being asked was whether the Committee would be interested in hearing the views of the noble Viscount who could not be present, and I was very keen, because when he does speak it is from a point of view that we do not often hear. I did not know that it was a transgression of the rules—but of course it is not, really, because we got round it. Nevertheless, I apologise for anything that might have upset the noble Baroness’s blood pressure—it did not stop her making a very good contribution later.

We have covered so much ground that I do not want to try and summarise it in one piece, because you cannot do that. The problem with the group as it stands is that the right reverend Prelate the Bishop of Derby and myself must have some secret connection, because we managed to put down almost the same amendments. They were on issues that then got overtaken by the Minister, who finally got round to—I mean, who put down a nice series of amendments which exactly covered the points we made, so we can lose all those. But this did not stop the right reverend Prelate the Bishop of Guildford making some very good additional points which I think we all benefited from.

I welcome back the noble Baroness, Lady Buscombe, after her illness; she gave us a glimpse of what is to come from her and her colleagues, but I will leave the particular issue that she raised for the Minister to respond to. It raises an issue that I am not competent on, but it is a very important one—we need to get the right balance between what is causing the alarm and difficulty outside in relation to what is happening on the internet, and I think we all agree with her that we should not put any barrier in the way of dealing with that.

Indeed, that was the theme of a number of the points that have been raised on the question of what is or can constitute illegal content, and how we judge it. It is useful to hear again from the master about how you do it in practice. I cannot imagine being in a room of French lawyers and experts and retaining my sanity, let alone making decisions that affect the ability of people to carry on, but the noble Lord did it; he is still here and lives to tell the tale—bearded or otherwise.

The later amendments, particularly from the noble Lord, Lord Clement-Jones, are taking us round in a circle towards the process by which Ofcom will exercise the powers that it is going to get in this area. These are probably worth another debate on their own, and maybe it will come up in a different form, because—I think the noble Baroness, Lady Stowell, made this point as well—there is a problem in having an independent regulator that is also the go-to function for getting advice on how others have to make decisions that are theirs to rule on at the end if they go wrong. That is a complicated way of saying that we may be overloading Ofcom if we also expect it to provide a reservoir of advice on how you deal with the issues that the Bill puts firmly on the companies—I agree that this is a problem that we do not really have an answer to.

My amendments were largely overtaken by the Government’s amendments, but the main one I want to talk about was Amendment 272. I am sorry that the noble Baroness, Lady Morgan, is not here, because her expertise is in an area that I want to talk about, which is fraud—cyber fraud in particular—and how that is going to be brought into the Bill. The issue, which I think was raised by Which? and which a number of other people have also written to us about, is that the Bill, in Clauses 170 and 171, tries to establish how a platform should identify illegal content in relation to fraud—but it is quite prescriptive. In particular, it goes into some detail, which I will leave for the Minister to respond to, but it uniquely sets out a specific way of gathering information to determine whether content is illegal in this area, although it may have applicability in other areas.

One of the points that have to be taken into account is whether the platform is using human moderators, automated systems or a combination of the two. I am not quite sure why that is there in the Bill; that is really the basis for the tabling of our amendments. Clearly, one would hope that the end result is whether or not illegality has taken place, not how that information has been gathered. If one must make concessions to the process of law because a judgment is made that, because it is automated, it is in some way not as valid as if it had been done by a human moderator, there seems to be a whole world there that we should not be going into. I certainly hope that that is not going to be the case if we are talking about illegality concerning children or other vulnerable people, but that is how the Bill reads at present; I wonder whether the Minister can comment on that.

There is a risk of consumers being harmed here. The figures on fraud in the United Kingdom are extraordinary; the fact that it is not the top priority for everybody, let alone the Government, is extraordinary. It is something like the equivalent of consumers being scammed at the rate of around £7.5 billion per year. A number of awful types of scamming have emerged only because of the internet and social media. They create huge problems of anxiety and emotional distress, with lots of medical care and other things tied in if you want to work out the total bill. So we have a real problem here that we need to settle. It is great that it is in the Bill, but it would be a pity if the movement towards trying to resolve it is in any way infringed on by there being imperfect instructions in the Bill. I wonder whether the Minister would be prepared to respond to that; I would be happy to discuss it with him later, if that is possible.

As a whole, this is an interesting question as we move away from what a crime is towards how people judge how to deal with what they think is a crime but may not be. The noble Lord, Lord Allan, commented on how to do it in practice but one hopes that any initial problems will be overcome as we move forward and people become more experienced with this.

When the Joint Committee considered this issue, we spent a long time talking about why we were concerned about having certainty on the legal prescription in the Bill; that is why we were very much against the idea of “legal but harmful”, because it seemed too subjective and too subject to difficulties. Out of that came another thought, which answers the point made by the noble Baroness, Lady Stowell: so much of this is about fine judgments on things that are set in stone and that you can work to, but which you then have to interpret.

There is a role for Parliament here, I think; we will come on to this in later amendments but, if there is a debate to be had on this, let us not forget the points that have been made here today. If we are going to think again about Ofcom’s activity in practice, that is the sort of thing that either a Joint Committee or Select Committees of the two Houses could easily take on board as an issue that needs to be reflected on, with advice given to Parliament about how it might be taken forward. This might be the answer in the medium term.

In the short term, let us work to the Bill and make sure that it works. Let us learn from the experience but let us then take time out to reflect on it; that would be my recommendation but, obviously, that will be subject to the situation after we finish the Bill. I look forward to hearing the Minister’s response.

Lord Parkinson of Whitley Bay (Con)

My Lords, as well as throwing up some interesting questions of law, this debate has provoked some interesting tongue-twisters. The noble Lord, Lord Allan of Hallam, offered a prize to the first person to pronounce the Netzwerkdurchsetzungsgesetz; I shall claim my prize in our debate on a later group when inviting him to withdraw his amendment.

Lord Parkinson of Whitley Bay (Con)

Yes, that would be welcome.

Lord Allan of Hallam (LD)

Can I suggest one of mine?

Lord Parkinson of Whitley Bay (Con)

I thank the noble Lord.

I was pleased to hear about Wicipedia Cymraeg—there being no “k” in Welsh. As the noble Lord, Lord Stevenson, said, there has been a very good conversational discussion in this debate, as befits Committee and a self-regulating House. My noble friend Lady Stowell is right to point out matters of procedure, although we were grateful to know why the noble Viscount, Lord Colville, supports the amendments in question.

16:45
My noble friend Lord Moylan’s first group within a group—Amendments 17 and 18—alters the duties in Clause 9 of the Bill. These amendments would weaken the illegal content duties by removing any obligation on services to take upstream measures to remove illegal content, including child sexual abuse material. They would therefore seriously undermine the Bill’s focus on proactive risk management. Similarly, Amendments 272 to 283 seek to alter how services should judge what is illegal. I understand that noble Lords are concerned, rightly, about the over-removal of content.
The amendments tabled by the noble Lord, Lord Clement-Jones, would require providers to have sufficient evidence that content is illegal before taking action against it, replacing the current test of “reasonable grounds to infer”. Sufficient evidence is a subjective measure. We have discussed the difficulties for those who must make these decisions and we think that this formulation would set an unclear threshold for providers to determine how they should judge illegality, which could result in the under-removal of illegal content, putting users at risk, or the over-removal of it, with adverse consequences for freedom of expression.
The amendments tabled by my noble friend Lord Moylan would narrow the test to require the removal only of content which, based on all reasonably available contextual evidence, is manifestly illegal, and we think that that threshold is too high. Context and analysis can give a provider good reasons to infer that content is illegal even though the illegality is not immediately obvious. This is the case with, for example, some terrorist content which is illegal only if shared with terrorist purposes in mind, and intimate image abuse, where additional information or context is needed to know whether content has been posted against the subject’s wishes.
Amendment 276 would remove the detail in Clause 170 that specifies the point at which providers must treat content as illegal or fraudulent. That would enable providers to interpret their safety duties in broader ways. Rather than having greater discretion, Ofcom would be given less certainty about whether it could successfully take enforcement action. I take the point raised by noble Lords about the challenges of how platforms will identify illegal content, and I agree with my noble friend Lady Stowell that the contributions of noble and learned Lords would be helpful in these debates as well. However, Clause 170 sets out how companies should determine whether or not content is illegal or an advertisement is fraudulent. I will say a little more about the context behind that, as the noble Lord, Lord Allan, may have a question.
The Bill recognises that it will often be difficult for providers to make judgments about content without considering the context. Clause 170 therefore clarifies that providers must ascertain whether, on the basis of all reasonably available information, there are reasonable grounds to infer that all the relevant elements of the offence—including the mental elements—are present and that no defence is available. The amount of information that would be reasonably available to a particular service provider will depend on the size and capacity of the provider, among other factors.
Companies will need to ensure that they have effective systems to enable them to check the broader context relating to content when deciding whether or not to remove it. This will provide greater certainty about the standard to be applied by providers when assessing content, including judgments about whether or not content is illegal. We think that protects against over-removal by making it clear that platforms are not required to remove content merely on the suspicion of it being illegal. Beyond that, the framework also contains provisions about how companies’ systems and processes should approach questions of mental states and defences when considering whether or not content is an offence in the scope of the Bill.
Lord Allan of Hallam (LD)

I am struggling a little to understand why the Minister thinks that sufficient evidence is subjective, and therefore, I assume, reasonable grounds to infer is objective. Certainly, in my lexicon, evidence is more objective than inference, which is more subjective. I was reacting to that word. I am not sure that he has fully made the case as to why his wording is better.

Lord Moylan (Con)

Or indeed any evidence.

Lord Parkinson of Whitley Bay (Con)

I take the noble Lord’s point and my noble friend’s further contribution. I will see whether I can give a clearer and more succinct description in writing to flesh that out, but that it is the reason that we have alighted on the words that we have.

The noble Lord, Lord Allan, also asked about jurisdiction. If an offence has been committed in the UK and viewed by a UK user, it can be treated as illegal content. That is set out in Clause 53(11), which says:

“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom”.


I hope that that bit, at least, is clearly set out to the noble Lord’s satisfaction. It looks like it may not be.

Lord Allan of Hallam (LD)

Again, I think that that is clear. I understood from the Bill that, if an American says something that would be illegal were they to be in the United Kingdom, we would still want to exclude that content. But that still leaves it open, and I just ask the question again, for confirmation. If all of the activities are outside the United Kingdom—Americans talking to each other, as it were—and a British person objects, at what point would the platform be required to restrict the content of the Americans talking to each other? Is it pre-emptively or only as and when somebody in the United Kingdom objects to it? We should flesh out that kind of practical detail before this becomes law.

Lord Parkinson of Whitley Bay (Con)

If it has been committed in the UK and is viewed by a UK user, it can be treated as illegal. I will follow up on the noble Lord’s further points ahead of the next stage.

Amendment 272 explicitly provides that relevant information that is reasonably available to a provider includes information submitted by users in complaints. Providers will already need to do this when making judgments about content, as it will be both relevant and reasonably available.

My noble friend Lord Moylan returned to the question that arose on day 2 in Committee, querying the distinction between “protect” and “prevent”, and suggesting that a duty to prevent would or could lead to the excessive removal of content. To be clear, the duty requires platforms to put in place proportionate systems and processes designed to prevent users encountering content. I draw my noble friend’s attention to the focus on systems and processes in that. This requires platforms to design their services to achieve the outcome of preventing users encountering such content. That could include upstream design measures, as well as content identification measures, once content appears on a service. By contrast, a duty to protect is a less stringent duty and would undermine the proactive nature of the illegal content duties for priority offences.

Lord Moylan (Con)

Before he moves on, is my noble friend going to give any advice to, for example, Welsh Wikipedia, as to how it will be able to continue, or are the concerns about smaller sites simply being brushed aside, as my noble friend explicates what the Bill already says?

Lord Parkinson of Whitley Bay (Con)

I will deal with all the points in the speech. If I have not done so by the end, and if my noble friend wants to intervene again, I would be more than happy to hear further questions, either to answer now or write to him about.

Amendments 128 to 133 and 143 to 153, in the names of the right reverend Prelate the Bishop of Derby and the noble Lord, Lord Stevenson of Balmacara, seek to ensure that priority offences relating to modern slavery and human trafficking, where they victimise children, are included in Schedule 6. These amendments also seek to require technology companies to report content which relates to modern slavery and the trafficking of children—including the criminal exploitation of children—irrespective of whether it is sexual exploitation or not. As noble Lords know, the strongest provisions in the Bill relate to children’s safety, and particularly to child sexual exploitation and abuse content. These offences are captured in Schedule 6. The Bill includes a power for Ofcom to issue notices to companies requiring them to use accredited technology or to develop new technology to identify, remove and prevent users encountering such illegal content, whether communicated publicly or privately.

These amendments would give Ofcom the ability to issue such notices for modern slavery content which affects children, even when there is no child sexual exploitation or abuse involved. That would not be appropriate for a number of reasons. The power to tackle illegal content on private communications has been restricted to the identification of content relating to child sexual exploitation and abuse because of the particular risk to children posed by content which is communicated privately. Private spaces online are commonly used by networks of criminals to share illegal images—as we have heard—videos, and tips on the commitment of these abhorrent offences. This is highly unlikely to be reported by other offenders, so it will go undetected if companies do not put in place measures to identify it. Earlier in Committee, the noble Lord, Lord Allan, suggested that those who receive it should report it, but of course, in a criminal context, a criminal recipient would not do that.

Extending this power to cover the identification of modern slavery in content which is communicated privately would be challenging to justify and could represent a disproportionate intrusion into someone’s privacy. Furthermore, modern slavery is usually identified through patterns of behaviour or by individual reporting, rather than through content alone. This reduces the impact that any proactive technology required under this power would have in tackling such content. Schedule 6 already sets out a comprehensive list of offences relating to child sexual exploitation and abuse which companies must tackle. If these offences are linked to modern slavery—for example, if a child victim of these offences has been trafficked—companies must take action. This includes reporting content which amounts to an offence under Schedule 6 to the National Crime Agency or another reporting body outside of the UK.

My noble friend Lord Moylan’s Amendment 135 seeks to remove the offence in Section 5 of the Public Order Act 1986 from the list of priority offences. His amendment would mean that platforms were not required to take proactive measures to reduce the risk of content which is threatening or abusive, and intended to cause a user harassment, alarm or distress, from appearing on their service. Instead, they would be obliged to respond only once they are made aware of the content, which would significantly reduce the impact of the Bill’s framework for tackling such threatening and abusive content. Given the severity of the harm which can be caused by that sort of content, it is right that companies tackle it. Ofcom will have to include the Public Order Act in its guidance about illegal content, as provided for in Clause 171.

Government Amendments 136A to 136C seek to strengthen the illegal content duties by adding further priority offences to Schedule 7. Amendments 136A and 136B will add human trafficking and illegal entry offences to the list of priority offences in the Bill. Crucially, this will mean that platforms will need to take proactive action against content which encourages or assists others to make dangerous, illegal crossings of the English Channel, as well as those who use social media to arrange or facilitate the travel of another person with a view to their exploitation.

The noble Lord, Lord Allan, asked whether these amendments would affect the victims of trafficking themselves. This is not about going after the victims. Amendment 136B addresses only content which seeks to help or encourage the commission of an existing immigration offence; it will have no impact on humanitarian communications. Indeed, to flesh out a bit more detail, Section 2 of the Modern Slavery Act makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation. Facilitating a victim’s travel includes recruiting them. This offence largely appears online in the form of advertisements to recruit people into being exploited. Some of the steps that platforms could put in place include setting up trusted flagger programmes, signposting users to support and advice, and blocking known bad actors. Again, I point to some of the work which is already being done by social media companies to help tackle both illegal channel crossings and human trafficking.

17:00
Government Amendment 136C will add the offence of foreign interference to the list of priority offences in the Bill. As your Lordships will know, the Government previously made an amendment via the National Security Bill to include this offence in this Bill. Because of the relative pace at which these two Bills are now passing through Parliament, we are now doing it directly in the Online Safety Bill.
My noble friend Lady Buscombe’s Amendment 137 seeks to list the false and threatening communication offences in Schedule 7. Listing the communication offences as priority offences would require platforms to identify and determine the illegality of such content proactively. I appreciate the reasons she set out for raising this issue, but as these offences rely heavily on a user’s mental state, it would be challenging for services to identify this content without significant additional context. Let me reassure her, however, that platforms will still need to have systems and processes in place to remove this content quickly when it is reported to them, as with all other illegal content which is not in Schedule 7.
My noble friend Lord Bethell anticipated later debates on age verification and pornography. If he permits, I will come back on his points then. I have noted his question for that discussion as well as the question from the noble Lord, Lord Stevenson, on financial scams and fraud, which we will have the chance to discuss in full. I am not sure if my noble friend Lord Moylan wants to ask a further question at this juncture or to accept a reassurance that I will consult the Official Report and write on any further points he raised which I have not dealt with.
Lord Moylan (Con)

My Lords, it is genuinely difficult to summarise such a wide-ranging debate, which was of a very high standard. Only one genuinely bright idea has emerged from the whole thing: as we go through Committee, each group of amendments should be introduced by the noble Lord, Lord Allan of Hallam, because it is only after I have heard his contribution on each occasion that I have begun to understand the full complexity of what I have been saying. I suspect I am not alone in that and that we could all benefit from hearing the noble Lord before getting to our feet. That is not meant to sound the slightest bit arch; it is absolutely genuine.

The debate expressed a very wide range of concerns. Concerns about gang grooming and recruiting were expressed on behalf of the right reverend Prelate the Bishop of Derby, and my noble friend Lady Buscombe raised concerns about the trolling of country businesses. However, I think it is fair to say that most speakers focused on the following issues. The first was the definition of legality, which was so well explicated by the noble Lord, Lord Allan of Hallam. The second was the judgment bar that providers have to pass to establish whether something should be taken down. The third was the legislative mandating of private foreign companies to censor free speech rights that are so hard-won here in this country. These are the things that mainly concern us.

I was delighted that I found myself agreeing so much with what the noble Baroness, Lady Kidron, said, even though she was speaking in another voice or on behalf of another person. If her own sentiments coincide with the sentiments of the noble Viscount—

Baroness Kidron (CB)

I am sorry to intrude, but I must say now on the record that I was speaking on my own behalf. The complication of measuring and those particular things are terribly important to establish, so I am once again happy to agree with the noble Lord.

Lord Moylan (Con)

I am delighted to hear the noble Baroness say that, and it shows that that pool of common ground we share is widening every time we get to our feet. However, the pool is not particularly widening, I am afraid to say—at least in respect of myself; other noble Lords may have been greatly reassured—as regards my noble friend the Minister who, I am afraid, has not in any sense addressed the issues about free speech that I and many other noble Lords raised. On some issues we in the Committee are finding a consensus that is drifting away from the Minister. We probably need to put our heads together more closely on some of these issues with the passage of time in Committee.

My noble friend also did not say anything that satisfied me in respect of the practical operation of these obligations for smaller sites. He speaks smoothly and persuasively of risk-based proactive approaches without saying that, for a large number of sites, this legislation will mean a complete re-engineering of their business model. For example, where Wikipedia operates in a minority language, such as in Welsh Wikipedia, which is the largest Welsh language website in the world, if its model is to involve monitoring what is put out by the community and correcting it as it goes along, rather than having a model in advance that is designed to prevent things being put there in the first place, then it is very likely to close down. If that is one of the consequences of this Bill the Government will soon hear about it.

Finally, although I remain concerned about public order offences, I have to say to the Minister that if he is so concerned about the dissemination of alarm among the population under the provisions of the Public Order Act, what does he think that His Majesty’s Government were doing on Sunday at 3 pm? I beg leave to withdraw the amendment.

Amendment 17 withdrawn.
Amendment 18 not moved.
Amendment 18A
Moved by
18A: Clause 9, page 8, line 23, at end insert—
“(8A) A duty to summarise in the terms of service the findings of the most recent illegal content risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to individuals).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to summarise (in their terms of service) the findings of their latest risk assessment regarding illegal content and activity. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.
Amendment 18A agreed.
Clause 9, as amended, agreed.
Clause 10: Children’s risk assessment duties
Amendment 19 not moved.
The Deputy Chairman of Committees (Lord Beith) (LD)

If Amendment 20 is agreed, I cannot call Amendment 21 by reason of pre-emption.

Amendment 20

Moved by
20: Clause 10, page 9, line 11, leave out paragraphs (a) to (h) and insert—
“(a) the level of risk that children who are users of the service encounter the harms as outlined in Schedule (Online harms to children) by means of the service;
(b) any of the level of risks to children encountered singularly or in combination, having regard to—
(i) the design of functionalities, algorithms and other features that present or increase risk of harm, such as low-privacy profile settings by default;
(ii) the business model, revenue model, governance, terms of service and other systems and processes or mitigation measures that may reduce or increase the risk of harm;
(iii) risks which can build up over time;
(iv) the ways in which level of risks can change when experienced in combination with others;
(v) the level of risk of harm to children in different age groups;
(vi) the level of risk of harm to children with certain characteristics or who are members of certain groups; and
(vii) the different ways in which the service is used including but not limited to via virtual and augmented reality technologies, and the impact of such use on the level of risk of harm that might be suffered by children;
(c) whether the service has shown regard to the rights of children as set out in the United Nations Convention on the Rights of the Child (see general comment 25 on children’s rights in relation to the digital environment).”
Member’s explanatory statement
This amendment would require providers to look at and assess risks on their platform in the round and in line with the 4 Cs of online risks to children (content, contact, conduct and contractual/commercial risks). Although these risks will not be presented on every service, this amendment requires providers to reflect on these risks, so they are not forgotten and can be built into future development of the service.
Baroness Kidron (CB)

My Lords, this amendment and Amendments 74, 93 and 123 are part of a larger group that have been submitted as a package loosely referred to as the AV and harms package. They have been the subject of much private debate with the Government, for which we are grateful, and among parliamentarians, and have featured prominently in the media. The amendments are in my name and those of the noble Lord, Lord Bethell, the right reverend Prelate the Bishop of Oxford and the noble Lord, Lord Stevenson, but enjoy the support of a vast array of Members of both Houses. I thank all those who have voiced their support.

The full package of amendments defines and sets out the rules of the road for age assurance, including the timing of its introduction, and the definition of terms such as age verification and age assurance. They introduce the concept of measuring the efficacy of systems with one eye on the future so that we as parliamentarians can indicate where and when we feel that proportionality is appropriate and where it is simply not—for example, in relation to pornography. In parallel, we have developed a schedule of harms, which garners rather fewer column inches but is equally important in establishing Parliament’s intention. It is that schedule of harms that is up for debate today.

Before I lay out the amendment, I thank the 26 children’s charities which have so firmly got behind this package and acknowledge, in particular, Barnardo’s, CEASE and 5Rights, of which I am chair, which have worked tirelessly to ensure that the full expertise of children’s charities has been embedded in these amendments. I also pay tribute to the noble Baroness, Lady Benjamin, who in this area of policy has shown us all the way.

The key amendment in this group is Amendment 93, which would place a schedule of harms to children in the Bill. There are several reasons for doing so, the primary one being that by putting them in the Bill we are stating the intention of Parliament, which gives clarity to companies and underlines the authority of Ofcom to act on these matters. Amendments 20, 74 and 123 ensure that the schedule is mirrored in risk assessments and task Ofcom with updating its guidance every six months to capture new and emerging harms, and as such are self-evident.

The proposed harms schedule is centred around the four Cs, a widely used and understood taxonomy of harm used in legislation and regulation around the globe. Importantly, rather than articulate individual harms that may change over time, it sets its sight on categories of harm: content, contact, conduct and contract, which is sometimes referred to as commercial harm. It also accounts for cumulative harms, where two or more risk factors create a harm that is greater than any single harm or is uniquely created by the combination. The Government’s argument against the four Cs is that they are not future-proof, which I find curious since the very structure of the four Cs is to introduce broad categories of harm to which harms can be added, particularly emerging harms. By contrast, the Government are adding an ever-growing list of individual harms.

I wish to make three points in favour of our package of amendments relating first to language, secondly to the nature of the digital world, and finally to clarity of purpose. It is a great weakness of the Bill that it consistently introduces new concepts and language—for example, the terms “primary priority content”, “priority content” and “non-designated content”. These are not terms used in other similar Bills across the globe, they are not evident in current UK law and they do not correlate with established regimes, such as equalities legislation or children’s rights under the convention, more of which in group 7.

The question of language is non-trivial. It is the central concern of those who fight CSAE around the world, who frequently find that enforcement against perpetrators or takedown is blocked by legal systems that define child sexual abuse material differently—not differently in some theoretical sense but because the same image can be categorised differently in two countries and then be a barrier to enforcement across jurisdictions. Leadership from WeProtect, the enforcement community and representatives that I recently met from Africa, South America and Asia have all made this point. It undermines the concept of UK leadership in child protection that we are wilfully and deliberately rejecting accepted language which is embedded in treaties, international agreements and multilateral organisations to start again with our own, very likely with the same confused outcome.

Secondly, I am concerned that while both the Bill and the digital world are predicated on system design, the harms are all articulated as content with insufficient emphasis on systems harms, such as careless recommendations, spreading engagement and the sector-wide focus on maximising engagement, which are the very things that create the toxic and dangerous environment for children. I know, because we have discussed it, that the Minister will say that this is all in the risk assessment, but the risk assessment asks regulated companies to assess how a number of features contribute to harm, mostly expressed as content harm.

What goes through my mind is the spectre of Meta’s legal team, which I watched for several days during Molly Russell’s inquest; they stood in a court of law and insisted that hundreds, in fact thousands, of images of cut bodies and depressive messages did not constitute harm. Rather, they regarded them as cries for help or below the bar of harm as they interpreted it. Similarly, there was material that featured videos of people jumping off buildings—some of them sped-up versions of movie clips edited to suggest that jumping was freedom—and I can imagine a similar argument that says that kind of material cannot be considered harmful, because in another context it is completely legitimate. Yet this material was sent to Molly at scale.

17:15
It is not good enough to characterise harms simply by establishing what is or is not harmful content. The previous debate really underlined that it takes a long time and it is very complicated to see what is harmful. But we must make utterly clear that the drip feed of nudges, enticements and recommendations and the creation of a toxic environment, overwhelming a child of 14 with more than 1,400 messages, whether they meet that bar of harmful content or not, is in itself a harm. A jukebox of content harms is not future-proof, and it fails to name the risks of the system. It is to misunderstand where the power of digital design actually lies.
Finally, there is the question of simplicity and clarity. As we discussed on the first day of Committee, business wants clarity, campaigners want clarity, parents want clarity, and Ofcom could do with some clarity. If not the four Cs, my challenge to the Government is to deliver a schedule that has the clarity and simplicity of the amendments in front of us, in which harm is defined by category not by individual content measurements, so that it is flexible now and into the future, and foregrounds the specific role of the system design not only as an accomplice to the named harm but as a harm itself. I beg to move.
Baroness Ritchie of Downpatrick (Lab)

My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron. I have listened intently today, and there is no doubt that this Bill not only presents many challenges but throws up the complexity of the whole situation. I think it was the noble Lord, Lord Kamall, in an earlier group who raised the issues of security, safety and freedom. I would add the issue of rights, because we are trying to balance all these issues and characterise them in statute, vis-à-vis the Bill.

On Tuesday, we spoke about one specific harm—pornography—on the group of amendments that I had brought forward. But I made clear at that time that I believe this is not the only harm, and I fully support the principles of the amendments from the noble Baroness, Lady Kidron. I would obviously like to get some clarity from her on the amendments, particularly as to how they relate to other clauses in the Bill.

The noble Baroness has been the pioneer in this field, and her expertise is well recognised across the House. I believe that these amendments really take us to the heart of the Bill and what we are trying to achieve—namely, to identify online harms to children, counteract them and provide a level of safety to young people.

As the noble Lord, Lord Clement-Jones, said on Tuesday,

“there is absolutely no doubt that across the Committee we all have the same intent; how we get there is the issue between us”.—[Official Report, 25/4/23; col. 1196.]

There is actually not that much between us. I fully agree with the principle of putting some of the known harms to children in the Bill. If we know the harms, there is little point in waiting for them to be defined in secondary legislation by Clause 54.

It is clear to me that there are harms to children that we know about, and those harms will not change. It would be best to name those harms clearly in the Bill when it leaves this House. That would allow content providers, search engines and websites in scope of the Bill to prepare to make any changes they need to keep children safe. Perhaps the Minister could comment on that aspect. We also know that parents will expect some harms to be in the Bill. The noble Baroness, Lady Kidron, laid out what they are, and I agree with her analysis. These issues are known and we should not wait for them to be named.

While known harms should be placed into the Bill, I know, understand and appreciate that the Government are concerned about future-proofing. However, I am of the view that a short list of key topics will not undermine that principle. Indeed, the Joint Committee’s report on the draft Bill stated,

“we recommend that key, known risks of harm to children are set out on the face of the Bill”.

In its report on the Bill, the DCMS Select Committee in the other place agreed, saying

“that age-inappropriate or otherwise inherently harmful content and activity (like pornography, violent material, gambling and content that promotes or is instructive in eating disorders, self-harm and suicide) should appear on the face of the Bill”.

Has there been any further progress in discussions on those issues?

At the beginning of the year, the Children’s Commissioner urged Parliamentarians

“to define pornography as a harm to children on the face of the … Bill, such that the regulator, Ofcom, may implement regulation of platforms hosting adult content as soon as possible following the passage of the Bill”.

I fully agree with the Children’s Commissioner. While the ways in which pornographic content is delivered will change over time, the fact that pornography is harmful to children will not change. Undoubtedly, with the speed of technology—something that the noble Lord, Lord Allan of Hallam, knows a lot more about than the rest of us, having worked in this field—it will change, and we will be presented with new types of challenges.

I therefore urge the Government to support the principle that the key risks are in the Bill, and I thank the noble Baroness, Lady Kidron, for raising this important principle. However, I hope she will indulge me as I seek to probe some of the detail of her amendments and their interactions with the architecture of other parts of the Bill. As I said when speaking to Clause 49 on Tuesday, the devil is obviously in the detail.

First, Clause 54 defines what constitutes

“Content that is harmful to children”,

and Clause 205 defines harm; Amendment 93 proposes an additional list of harms. As I have already said, I fully support the principle of harms being in the Bill, but I raise a question for the noble Baroness. How does she see these three definitions working together? That might refer back to a preliminary discussion that we had in the tearoom earlier.

These definitions of harms are in addition to the content to be defined as primary priority content and priority content. Duties in Clauses 11 and 25 continue to refer to these two types of content for Part 3 services, but Amendments 20 and 74 would remove the need for risk assessments in Clauses 10 and 24 to address these two types of content. It seems that the amendments could create a tension in the Bill, and I am interested to ascertain how the noble Baroness, Lady Kidron, foresees that tension operating. Maybe she could give us some detail in her wind-up about that issue. An explanation of that point may bring some clarity to understanding how the new schedule that the noble Baroness proposes will work alongside the primary priority content and the priority content lists. Will the schedule complement primary priority content, or will it be an alternative?

Secondly, as I said, some harms are known but there are harms that are as yet unknown. Will the noble Baroness, Lady Kidron, consider a function to add to the list of content in her Amendment 93, in advance of us coming back on Report? There is no doubt that the online space is rapidly changing, as this debate has highlighted. I can foresee a time when other examples of harm should be added to the Bill. I accept that the drafting is clear that the list is not exclusive, but it is intended to be a significant guide to what matters to the public and Parliament. I also accept that Ofcom can provide guidance on other content under Amendment 123, but, without a regulatory power added to Amendment 93, it feels that we are perhaps missing a belt-and-braces approach to online harms to children. After all, our principal purpose here is to protect children from online harm.

I commend the noble Baroness, Lady Kidron, on putting these important amendments before the Committee, and I fully support the principle of what she seeks to achieve. But I hope that, on further reflection, she will look at the points I have suggested. Perhaps she might suggest other ideas in her wind-up, and we could have further discussions in advance of Report. I also look forward to the Minister’s comments on these issues.

The Lord Bishop of Oxford

My Lords, I support Amendments 20, 93 and 123, in my name and those of the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Stevenson. I also support Amendment 74 in the name of the noble Baroness, Lady Kidron. I pay tribute to the courage of all noble Lords and their teams, and of the Minister and the Bill team, for their work on this part of the Bill. This work involves the courage to dare to look at some very difficult material that, sadly, shapes the everyday life of too many young people. This group of amendments is part of a package of measures to strengthen the protections for children in the Bill by introducing a new schedule of harms to children and plugging a chronological gap between Part 3 and Part 5 services, on when protection from pornography comes into effect.

Every so often in these debates, we have been reminded of the connection with real lives and people. Yesterday evening, I spent some time speaking on the telephone with Amanda and Stuart Stephens, the mum and dad of Olly Stephens, who lived in Reading, which is part of the diocese of Oxford. Noble Lords will remember that Olly was tragically murdered, aged 13, in a park near his home, by teenagers of a similar age. Social media played a significant part in the investigation and in the lives of Olly and his friends—specifically, social media posts normalising knife crime and violence, with such a deeply tragic outcome.

17:30
Last year in June, “Panorama” dared to look into this world. The programme revealed the depth and extent of the normalisation of knives and knife crime in posts offered to young people. I was struck by the comments of Frances Haugen, filmed when she met Stuart and Amanda. She said that each of us sees social media through a pinhole: a tiny snapshot of the total content. We have no idea how much darkness and evil are shaping children and young people, destroying their sense of proportion and deeply affecting offline behaviour. The only group that has the whole picture, of course, is the companies themselves.
The noble Baroness, Lady Kidron, and others have outlined the remarkable degree of support for this raft of amendments from charities working to protect children. We should listen. These amendments will ensure a much wider definition of “harm” and will again future-proof the Bill in terms of technology which is even now coming over the horizon.
The Center for Countering Digital Hate speaks about an arms race to devise ever more effective ways of keeping users’ attention, even if it means putting them at risk. Its researchers set up new accounts in the United States, United Kingdom, Canada and Australia at the minimum age TikTok allows: 13 years old. Those accounts paused briefly on videos about body image and mental health and liked them. What the researchers found was deeply disturbing. Within 2.6 minutes, TikTok recommended suicide content. Within 8 minutes, TikTok served content relating to eating disorders. Every 39 seconds, TikTok recommended videos about body image and mental health to teens. CCDH researchers found a community for eating disorder content on the platform amassing 13.2 billion views across 56 hashtags, often designed to evade moderation.
As the noble Baroness, Lady Kidron, said, this fourfold classification of harms to children is being adapted elsewhere in the world, including the European Union. The schedule in the amendment gives clear but non-exhaustive examples to guide service providers on the meaning of each of the four Cs. It is vital to have more comprehensive agreed definitions of harm in the Bill.
I will reflect for a moment on what each of the four Cs means. Content harms are the most familiar. At the moment, children who go online are likely to encounter age-inappropriate content, including violent, gory and graphic communication, hate speech, terrorism, online prostitution, drugs, eating disorders and self-harm. Research also shows that exposure to different types of harmful content is interrelated: so, if a child reports seeing one type of disturbing content, it is likely that they have seen others as well.
Secondly, contact harms encourage harmful actions in the non-virtual world. A 10 year-old girl was left with burns after spraying an aerosol deodorant with the nozzle right up against her skin to create a freezing sensation. Jane Platt’s daughter Sarah, aged 15, was rushed to hospital in February 2020 after doing the “skull-breaker challenge”, which involves two people kicking the legs from under a third, making them fall over. These suggestions could never be offered in young people’s magazines or broadcast media.
Thirdly, there are conduct harms. In a global survey, 54% of young people—57% of girls and 48% of boys—reported having experienced online sexual harms before they were 18 years old, including within interaction with adults and being asked something sexually explicit or being sent sexually explicit content.
Finally, there are commercial harms. Over half of the games on Google Play now include loot boxes and more than 93% of games that feature loot boxes are marked suitable for children aged 12 years-plus.
As the noble Baroness, Lady Kidron, and others have argued, these harms are often cumulative and interrelated. The social media companies are the only ones not looking through a keyhole but monitoring social media in the round and able to assess what is happening, but evidence suggests that they will not do so until compelled by legislation. These amendments are a vital step forward in fulfilling the Bill’s purpose of providing additional protection from harm for children. I urge the Government to adopt them.
Baroness Fox of Buckley (Non-Afl)

My Lords, I really appreciated the contribution from the noble Baroness, Lady Ritchie of Downpatrick, because she asked a lot of questions about this group of amendments. Although I might be motivated by different reasons, I found it difficult to fully understand the impact of the amendments, so I too want to ask a set of questions.

Harm is defined in the Bill as “physical or psychological harm”, and there is no further explanation. I can understand the frustration with that and the attempts therefore to use what are described as the

“widely understood and used 4 Cs of online risk to children”.

They are not widely understood by me, and I have ploughed my way through it. I might well have misunderstood lots in it, but I want to look at and perhaps challenge some of the contents.

I was glad that Amendment 20 recognises the level of risk of harm to different age groups. That concerns me all the time when we talk about children and young people, and then end up treating four year-olds, 14 year-olds and 18 year-olds the same. I am glad that that is there, and I hope that we will look at it again in future.

I want to concentrate on Amendment 93 and reflect and comment more generally on the problem of a definition, or a lack of definition, of harm in the Bill. For the last several years that we have been considering bringing this Bill to this House and to Parliament, I have been worried about the definition of psychological harm. That is largely because this category has become ever more expansive and quite subjective in our therapeutic age. It is a matter of some discussion and quite detailed work by psychologists and professionals, who worry that there is an expanding concept of what is considered harmful and of what psychological harm really means.

As an illustration, I was invited recently to speak to a group of sixth-formers and was discussing things such as trigger warnings and so on. They said, “Well, you know, you’ve got to understand what it’s like”—they were 16 year-olds. “When we encounter certain material, it makes us have PTSD”. I was thinking, “No, it doesn’t really, does it?” Post-traumatic stress disorder is something that you might well gain if you have been in the middle of a war zone. The whole concept of triggering came from psychological and medical insights from the First World War, which you can understand. If you hear a car backfiring, you think it is somebody shooting at you. But the idea here is that we should have trigger warnings on great works of literature and that if we do not it will lead to PTSD.

I am not being glib, because an expanded, elastic and pathologised view of harm is being used quite cavalierly and casually in relation to young people and protecting them, often by the young people themselves. It is routinely used to close down speech as part of the cancel culture wars, which, as noble Lords know, I am interested in. Is there not a danger that this concept of harm is not as obvious as we think, and that the psychological harm issue makes it even more complicated?

The other thing is that Amendment 93 says:

“The harms in this Schedule are a non-exhaustive list of categories and other categories may be relevant”.


As with the discussion on whose judgment decides the threshold for removing illegal material, I think that judging what is harmful is even more tricky for the young in relation to psychological harm. I was reminded of that when the noble Baroness, Lady Kidron, complained that what she considered to be obviously and self-evidently harmful, Meta did not. I wondered whether that is just the case with Meta, or whether views will differ when it comes to—

Baroness Kidron (CB)

The report found—I will not give a direct quotation—that social media contributed to the death of Molly Russell, so it was the court’s judgment, not mine, that Meta’s position was indefensible.

Baroness Fox of Buckley (Non-Afl)

I completely understand that; I was making the point that there will be disagreements in judgments. In that instance, it was resolved by a court, but we are talking about a situation where I am not sure how the judgment is made.

In these amendments, there are lists of particular harms—a variety are named, including self-harm—and I wanted to provide some counterexamples of what I consider to be harms. I have been inundated by algorithmic adverts for “Naked Education” on Channel 4, maybe because of the algorithms I am on. I think that the programme is irresponsible; I say that having watched it, rather than just having read a headline. Channel 4 is posing this programme with naked adults and children as educational by saying that it is introducing children to the naked body. I think it is harmful for children and that it should not be on the television, but it is advertised on social media—I have seen quite a lot of it.

The greatest example of self-harm we encounter at present is when gender dysphoric teenagers—as well as some younger than teenagers; they are predominantly young women—are affirmed by adults, as a kind of social contagion, into taking body-changing and body-damaging hormones and performing self-mutilation, whether by breast binding or double mastectomies, which is advertised and praised by adults. That is incredibly harmful for young people, and it is reflected online a lot, because much of this is discussed, advertised or promoted online.

This is related to the earlier contributions, because I am asking: should those be added to the list of obvious harms? Although not many noble Lords are in the House now, if there were many more here, they would object to what I am saying by stating, “That is not harmful at all. What is harmful is what you’re saying, Baroness Fox, because you’re causing psychological harm to all those young people by being transphobic”. I am raising these matters because we think we all agree that there is a consensus on what is harmful material online for young people, but it is not that straightforward.

The amendment states that the Bill should target any platform that posts

“links to, or … encourages child users to seek”

out “dangerous or illegal activity”. I understand “illegal activity”, but on “dangerous” activities, I assume that we do not mean extreme sports, mountain climbing and so on, which are dangerous—that comes to mind probably because I have spent too much time with young people who spend their whole time looking at those things. I worry about the unintended consequences of things being banned or misinterpreted in that way.

Lord Russell of Liverpool (CB)

To respond briefly to the noble Baroness, I shall give a specific example of how Amendment 93 would help. Let us go back to the coroner’s courtroom where the parents of Molly Russell were trying to get the coroner to understand what had happened to their daughter. The legal team from Meta was there, with combined salaries probably in seven figures, and the argument was about the detail of the content. At one point, I recall Ian Russell saying that one of the Meta lawyers said, “We are topic agnostic”. I put it to the noble Baroness that, had the provisions in Amendment 93 been in place, first, under “Content harms” in proposed new paragraph 3(c) and (d), Meta would have been at fault; under “Contact harms” in proposed new paragraph 4(b), Meta would have been at fault; under “Conduct harms” in proposed new paragraph 5(b), Meta would have been at fault; and under “Commercial harms” in proposed new paragraph 6(a) and (b), Meta would have been at fault. That would have made things a great deal simpler.

17:45
Baroness Fox of Buckley (Non-Afl)

I appreciate that this is the case we all have in the back of our minds. I am asking whether, when Meta says it is content agnostic, the Bill is the appropriate place for us to list the topics that we consider harmful. If we are to do that, I was giving examples of contentious, harmful topics. I might have got this wrong—

Baroness Kidron (CB)

I will answer the noble Baroness more completely when I wind up, but I just want to say that she is missing the point of the schedule a little. Like her, I am concerned about the way we concentrate on content harms, but she is bringing it back to content harms. If she looks at it carefully, a lot of the provisions are about contact and conduct: it is about how the system is pushing children to do certain things and pushing them to certain places. It is about how things come together, and I think she is missing the point by keeping going back to individual pieces of content. I do not want to take the place of the Minister, but this is a systems and processes Bill; it is not going to deal with individual pieces of content in that way. It asks, “Are you creating these toxic environments for children? Are you delivering this at scale?” and that is the way we must look at this amendment.

Baroness Fox of Buckley (Non-Afl)

I will finish here, because we have to get on, but I did not introduce content; it is in the four Cs. One of the four Cs is “content” and I am reacting to amendments tabled by the noble Baroness. I do not think I am harping on about content; I was responding to amendments in which content was one of the key elements.

Baroness Kidron (CB)

Let us leave it there.

Baroness Benjamin (LD)

My Lords, I speak in support of these amendments with hope in my heart. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, for leading the charge with such vigour, passion and determination: I am with them all the way.

The Government have said that the purpose of the Bill is to protect children, and it rests on our shoulders to make sure it delivers on this mission. Last week, on the first day in Committee, the Minister said:

“Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does”.—[Official Report, 19/4/23; cols. 274-75.]


This is excellent and I thank the Government for saying it. But the full range of harms and risks to children will not be mitigated by services if they do not know what they are expected to risk-assess for and if they must wait for secondary legislation for this guidance.

The comprehensive range of harms children face every day is not reflected in the Bill. This includes sexual content that does not meet the threshold of pornography. This was highlighted recently in an investigation into TikTok by the Telegraph, which found that a 13 year-old boy was recommended a video about the top 10 porn-making countries, and that a 13 year-old girl was shown a livestream of a pornography actor in her underwear answering questions from viewers. This content is being marketed to children without a user even seeking out pornographic content, but this would still be allowed under the Bill.

Furthermore, high-risk challenges, such as the Benadryl and blackout challenges, which encourage dangerous behaviour on TikTok, are not dealt with in the Bill. Some features, such as the ability of children to share their location, are not dealt with either. I declare an interest as vice-president of Barnardo’s, which has highlighted how these features can be exploited by organised criminal gangs that sexually exploit children to keep tabs on them and trap them in a cycle of exploitation.

It cannot be right that the user-empowerment duties in the Bill include a list of harmful content that services must enable adults to toggle off, yet the Government refuse to produce this list for children. Instead, we have to wait for secondary legislation to outline harms to children, causing further delay to the enforcement of services’ safety duties. Perhaps the Minister can explain why this is.

The four Cs framework of harm, as set out in these amendments, is a robust framework that will ensure service risk assessments consider the full range of harms children face. I will repeat it once again: childhood lasts a lifetime, so we cannot fail children any longer. Protections are needed now, not in years to come. We have waited far too long for this. Protections need to be fast-tracked and must be included in the Bill. That is why I fully support these amendments.

Lord Knight of Weymouth (Lab)

My Lords, in keeping with the Stevenson-Knight double act, I am leaving it to my noble friend to wind up the debate. I will come in at this point with a couple of questions and allow the Minister to have a bit of time to reflect on them. In doing so, I reinforce my support for Amendment 295 in the name of the noble Lord, Lord Russell, which refers to volume and frequency also being risk factors.

When I compare Amendment 20 with Clause 10(6), which refers to children’s risk assessments and what factors should be taken into account in terms of the risk profile, I see some commonality and then some further things which Amendment 20, tabled by the noble Baroness, Lady Kidron, adds. In my opinion, it adds value. I am interested in how the Minister sees the Bill, as it stands currently, covering some issues that I will briefly set out. I think it would be helpful if the Committee could understand that there may be ways that the Bill already deals with some of the issues so wonderfully raised by the noble Baroness; it would be helpful if we can flush those out.

I do not see proposed new subsection (b)(iii),

“risks which can build up over time”,

mentioned in the Bill, nor explicit mention of proposed new subsection (b)(iv),

“the ways in which level of risks can change when experienced in combination with others”,

which I think is critical in terms of the way the systems work. Furthermore, proposed new subsection (b)(vii),

“the different ways in which the service is used including but not limited to via virtual and augmented reality technologies”,

starts to anticipate some other potential harms that may be coming very rapidly towards us and our children. Again, I do not quite see it included. I see “the design of functionalities”, “the business model” and “the revenue model”. There is a lot about content in the original wording of the Bill, which is less so here, and, clearly, I do not see anything in respect of the UN Convention on the Rights of the Child, which has been debated in separate amendments anyway. I wanted to give the Minister some opportunity on that.

Lord Bethell (Con)

My Lords, I restate my commitment to Amendments 20, 93 and 123, which are in my name and those of the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Oxford, and the noble Lord, Lord Stevenson, and the noble Baroness’s Amendment 74. It is a great honour to follow the noble Lord, Lord Knight. He put extremely well some key points about where there are gaps in the existing Bill. I will build on why we have brought forward these amendments in order to plug these gaps.

In doing so, I wish to say that it has been a privilege to work with the right reverend Prelate, the noble Baroness and the noble Lord, Lord Stevenson. We are not from the same political geographies, but that collaboration demonstrates the breadth of the political concern, and the strength of feeling across the Committee, about these important gaps when it comes to harms—gaps that, if not addressed, will put children at great risk. In this matter we are very strongly united. We have been through a lot together, and I believe this unlikely coalition demonstrates how powerful the feelings are.

It has been said before that children are spending an increasing amount of their lives online. However, the degree of that inflection point in the last few years has been understated, as has how much further it has got to go. The penetration of mobile phones is already around 75% of 10 year-olds—it is getting younger, and it is getting broader.

In fact, the digital world is totally inescapable in the life of a child, whether that is for a young child who is four to six years old or an older child who is 16 or 17. It is increasingly where they receive their education—I do not think that is necessarily a good thing, but that is arguable—it is where they establish and maintain their personal relationships and it is a key forum for their self-expression.

For anyone who suspects otherwise, I wish to make it clear that I firmly believe in innovation and progress, and I regard the benefits of the digital world as really positive. I would never wish to prevent children accessing the benefits of the internet, the space it creates for learning and building community, and the opportunities it opens for them. However, environments matter. The digital world is not some noble wilderness free from original sin or a perfect, frictionless marketplace where the best, nicest, and most beautiful ideas triumph. It is a highly curated experience defined by the algorithms and service agreements of the internet companies. That is why we need rules to ensure that it is a safe space for children.

I started working on my first internet business in 1995, nearly 30 years ago. I was running the Ministry of Sound, and we immediately realised that the internet was an amazing way of getting through to young people. Our target audiences were either clubbers aged over 18 or the younger brothers and sisters of clubbers who bought our merchandise. The internet gave us an opportunity to get past all the normal barriers—past parents and regulation to reach a wonderful new market. I built a good business and it worked out well for me, but those were the days before GDPR and what we understand from the internet. I know from my experience that we need to ensure that children are protected and shielded from the harms that bombard them, because there are strong incentives—mainly financial but also other, malign incentives—for bad actors to use the internet to get through to children.

Unfortunately, as the noble Baroness, Lady Kidron, pointed out, the Bill as it stands does not achieve that aim. Take, for example, contact harms, such as grooming and child sexual abuse. In February 2020, Bark, a US-based organisation that helps families manage and protect their children’s digital lives, launched an 11 year-old persona online who it called Bailey. Bailey’s online persona clearly shows that she is an ordinary 11 year-old, posting content that is ordinary for an 11 year-old. Within 30 seconds of her persona being launched online she received a like from a man whose profile picture was a penis. Within two minutes, multiple messages were received from men, and within five minutes a video call. Shortly afterwards, she received requests from men to meet up. I remind your Lordships that Bailey was 11 years old. These are not trivial content harms; these are attempts to contact a minor using the internet as a medium.

18:00
The Bill does try to address contact harms. I am supportive of the Bill and its principles, and am a big fan of the team that is trying to drive it through Parliament. For example, the Bill specifies that features that enable adults to search for and contact children must be risk-assessed. However, that is where the Bill currently stops. There is no comprehensive list of harms or features to inform that risk assessment. For instance, a feature such as live-streaming, which can enable adults to access children directly, is not specifically referenced, meaning that there is no explicit obligation for services to risk-assess such features—a big gap. Services cannot be expected to read between the lines of what the Government’s intentions may be. We must be explicit and clear in the Bill if we are serious about delivering on children’s safety.
That is why our amendments would do four things. First, they would introduce into the Bill a new schedule of harms to children, framed around the four categories of risk: content, contact, conduct and contract or commercial. This will ensure that harms to children in the Bill reflect the full range of harms that children encounter online, including from design features that facilitate pathways to harm, to pornography, self-harm, pro-suicide content and grooming. Secondly, we seek to ensure that services must risk-assess for the harms listed in this proposed new schedule in order to ensure that these risk assessments are comprehensive. Thirdly, we seek to task Ofcom with producing guidance, to be updated every 12 months, on new and emerging harms. Fourthly, we seek to ensure that Ofcom consults with children’s advocates and charities when producing this guidance.
The timing of this is very important. The real-life Bailey cannot wait for harms to be outlined in secondary legislation. These problems have been around for a long time but, as I said, the inflection curve on technology is shooting right up the hockey stick at the moment. If the primary purpose of the Bill is to protect children, it must include the harms that children face every day in primary legislation—not in secondary legislation, not after Royal Assent, not in a year or two, but now. Our Amendment 93 would introduce in the Bill a schedule of harms to children. This non-exhaustive list would produce a robust and future-proof framework, and have the clarity and mandate to produce comprehensive risk assessments to ensure that the Bill delivers on its chief purpose: protecting children.
Baroness Harding of Winscombe (Con)

My Lords, I support this group of amendments, so ably introduced by my noble friend and other noble Lords this afternoon.

I am not a lawyer and I would not say that I am particularly experienced in this business of legislating. I found this issue incredibly confusing. I hugely appreciate the briefings and discussions—I feel very privileged to have been included in them—with my noble friend the Minister, officials and the Secretary of State herself in their attempt to explain to a group of us why these amendments are not necessary. I was so determined to try to understand this properly that, yesterday, when I was due to travel to Surrey, I took all my papers with me. I got on the train at Waterloo and started to work my way through the main challenges that officials had presented.

The first challenge was that, fundamentally, these amendments cut across the Bill’s definitions of “primary priority content” and “priority content”. I tried to find them in the Bill. Unfortunately, in Clause 54, there is a definition of primary priority content. It says that, basically, primary priority content is what the Secretary of State says it is, and that content that is harmful to children is primary priority content. So I was none the wiser on Clause 54.

One of the further challenges that officials have given us is that apparently we, as a group of noble Lords, were confusing the difference between harm and risk. I then turned to Clause 205, which comes out with the priceless statement that a risk of harm should be read as a reference to harm—so maybe they are the same thing. I am still none the wiser.

Yesterday morning, I found myself playing what I can only describe as a parliamentary game of Mornington Crescent, as I went round and round in circles. Unfortunately, it was such a confusing game of Mornington Crescent that I forgot that I needed to change trains, ended up in Richmond instead of Redhill, and missed my meeting entirely. I am telling the Committee this story because, as the debate has shown, it is so important that we put in the Bill a definition of the harms that we are intending to legislate for.

I want to address the points made by the noble Baroness, Lady Fox. She said that we might not all agree on what harms are genuinely harmful for children. That is precisely why Parliament needs to decide this, rather than abdicate it to a regulator who, as other noble Lords said earlier today, is then put into a political space. It is the job of Parliament to decide what is dangerous for our children and what is not. That is the approach that we take in the physical world, and it should be the approach that we take in the online world. We should do that in broad categories, which is why the four Cs is such a powerful framework. I know that we are all attempting to predict the known unknowns, which is impossible, but this framework, which gives categories of harm, is clear that it can be updated, developed and, as my noble friend Lord Bethell, said, properly consulted on. We as parliamentarians should decide; that is the purpose of voting in Parliament.

I have a couple of questions for my noble friend the Minister. Does he agree that Parliament needs to decide what the categories of online harms are that the Bill is attempting to protect our children from? If he does, why is it not the four Cs? If he really thinks it is not the four Cs, will he bring back an alternative schedule of harms?

Lord Allan of Hallam (LD)

My Lords, I will echo the sentiments of the noble Baroness, Lady Harding, in my contribution to another very useful debate, which has brought to mind the good debate that we had on the first day in Committee, in response to the amendment tabled by the noble Lord, Lord Stevenson, in which we were seeking to get into the Bill what we are actually trying to do.

I thought that the noble Baroness, Lady Fox, was also welcoming additional clarity, specifically in the area of psychological harm, which I agree with. Certainly in its earlier incarnations, the Bill was scattered throughout with references, some of which have been removed, but they are very much open to interpretation. I hope that we will come back to that.

I was struck by the point made by the noble Lord, Lord Russell, around what took place in that coroner’s hearing. You had two different platforms with different interpretations of what they thought that their duty of care would be. That is very much the point. In my experience, platforms will follow what they are told to follow. The challenge is when each of them comes to their own individual view around what are often complex areas. There we saw platforms presenting different views about their risk assessments. If we clarify that for them through amendments such as these, we are doing everyone a favour.

I again compliment my noble friend Lady Benjamin for her work in this area. Her speech was also a model of clarity. If we can bring some of that clarity to the legislation and to explaining what we want, that will be an enormous service.

The noble Lord, Lord Knight, made some interesting points around how this would add value to the Bill, teasing out some of the specific gaps that we have there. I look forward to hearing the response on that.

I was interested in the comments from the noble Lord, Lord Bethell, on mobile phone penetration. We should all hold in common that we are not going back to a time BC—before connection. Our children will be connected, which creates the imperative for us to get this right. There has perhaps been a tendency for us to bury our heads in the sand, and occasionally you hear that still—it is almost as if we would wish this world away. However, the noble Baroness, Lady Kidron, is at the other end of the spectrum; she has come alive on this subject, precisely because she recognises that that will not happen. We are in a world where our children will be connected, so it is on us to figure out how we want those connections to work and to instruct the people who provide those connective services on what they should do. It is certainly not for us to imagine that somehow they will all go away. We will come to that in later groups when we talk about minimum ages; if younger children are online, there is a real issue around how we are going to deal with that.

The right reverend Prelate the Bishop of Oxford highlighted some really important challenges based on real experiences that families today are suffering—let us use the word as it should be—and made the case for clarity. I do not know how much we are allowed to talk in praise of EU legislation, but I am looking at the Digital Services Act—I have looked at a lot of EU legislation—and this Bill, and there is a certain clarity to EU regulation, particularly the process of adding recitals, which are attached to the law and explain what it is meant to do. That is sometimes missing here. I know that there are different legal traditions, but you can sometimes look at an EU regulation and the UK law and the former appears to be much clearer in its intent.

That brings me to the substance of my comments in response to this group, so ably introduced by the noble Baroness, Lady Kidron. I hope that the Government heed and recognise that, at present, no ordinary person can know what is happening in the Bill—other than, perhaps, the wife of the noble Lord, Lord Stevenson, who will read it for fun—and what we intend to do.

I was thinking back to the “2B or not 2B” debate we had earlier about the lack of clarity around something even as simple as the classification of services. I was also thinking that, if you ask what the Online Safety Bill does to restrict self-harm content, the answer would be this: if it is a small social media platform, it will probably be categorised as a 2B service, then we can look at Schedule 7, where it is prohibited from assisting suicide, but we might want to come back to some of the earlier clauses with the specific duties—and it will go on and on. As the noble Baroness, Lady Harding, described, you are leaping backwards and forwards in the Bill to try to understand what we are trying to do with the legislation. I think that is a genuine problem.

In effect, the Bill is Parliament setting out the terms of service for how we want Ofcom to regulate online services. We debated terms of service earlier. What is sauce for the goose is sauce for the gander. We are currently failing our own tests of simplicity and clarity on the terms of service that we will give to Ofcom.

As well as platforms, if ordinary people want to find out what is happening, then, just like those platforms with the terms of service, we are going to make them read hundreds of pages before they find out what this legislation is intended to do. We can and should make this simpler for children and parents. I was able to meet Ian Russell briefly at the end of our Second Reading debate. He has been an incredibly powerful and pragmatic voice on this. He is asking for reasonable things. I would love to be able to give a Bill to Ian Russell, and the other families that the right reverend Prelate the Bishop of Oxford referred to, that they can read and that tells them very clearly how Parliament has responded to their concerns. I think we are a long way short of that simple clarity today.

It would be extraordinarily important for service providers, as I already mentioned in response to the noble Lord, Lord Russell. They need that clarity, and we want to make sure that they have no reason to say, “I did not understand what I was being asked to do”. That should be from the biggest to the smallest, as the noble Lord, Lord Moylan, keeps rightly raising with us. Any small service provider should be able to very clearly and simply understand what we are intending to do, and putting more text into the Bill that does that would actually improve it. This is not about adding a whole load of new complications and the bells and whistles we have described but about providing clarity on our intention. Small service providers would benefit from that clarity.

The noble Baroness, Lady Ritchie, rightly raised the issue of the speed of the development of technology. Again, we do not want the small service provider in particular to think it has to go back and do a whole new legal review every time the technology changes. If we have a clear set of principles, it is much quicker and simpler for it to say, “I have developed a new feature. How does it match up against this list?”, rather than having to go to Clause 12, Clause 86, Clause 94 and backwards and forwards within the Bill.

It will be extraordinarily helpful for enforcement bodies such as Ofcom to have a yardstick—again, this takes us back to our debate on the first day—for its prioritisation, because it will have to prioritise. It will not be able to do everything, everywhere, all at once. If we put that prioritisation into the legislation, it will, frankly, save potential arguments between Parliament, the Government and Ofcom later on, when they have decided to prioritise X and we wanted them to prioritise Y. Let us all get aligned on what we are asking them to do up front.

Dare I say—the noble Baroness, Lady Harding, reminded me of this—that it may also be extraordinarily helpful for us as politicians so that we can understand the state of the law. I mean not just the people who are existing specialists or are becoming specialists in this area and taking part in this debate but the other hundreds of Members of both Houses, because this is interesting to everyone. I have experience of being in the other place, and every Member of the other place will have constituents coming to them, often with very tragic circumstances, and asking what Parliament has done. Again, if they have the Online Safety Bill as currently drafted, I think it is hard for any Member of Parliament to be able to say clearly, “This is what we have done”. With those words and that encouraging wind, I hope the Government are able to explain, if not in this way, that they have a commitment to ensuring that we have that clarity for everybody involved in this process.

18:15
Lord Stevenson of Balmacara (Lab)

My Lords, over the last few hours I have praised us for having developed a style of discussion and debate that is certainly relatively new and not often seen in the House, where we have tried to reach out to each other and find common ground. That was not a problem in this last group of just over an hour; I think we are united around the themes that were so brilliantly introduced in a very concise and well-balanced speech by the noble Baroness, Lady Kidron, who has been a leading and inspirational force behind this activity for so long.

Although different voices have come in at different times and asked questions that still need to be answered, I sense that we have reached a point in our thinking, if not in our actual debates, where we need a plan. I too reached this point; that was exactly the motivation I had in tabling Amendment 1, which was discussed on the first day. Fine as the Bill is—it is a very impressive piece of work in every way—it lacks what we need as a Parliament to convince others that we have understood the issues and have the answers to their questions about what this Government, or this country as a whole, are going to do about this tsunami of difference, which has arrived in the wake of the social media companies and search engines, in the way we do our business and live our lives these days. There is consensus, but it is slightly different to the consensus we had in earlier debates, where we were reassuring ourselves about the issues we were talking about but were not reaching out to the Government to change anything so much as being happy that we were speaking the same language and that they were in the same place as we are gradually coming to as a group, in a way.

Just before we came back in after the lunch break, I happened to talk to the noble Lord, Lord Grade, who is the chair of Ofcom and is listening to most of our debates and discussions when his other duties allow. I asked him what he thought about it, and he said that it was fascinating for him to recognise the level of expertise and knowledge that was growing up in the House, and that it would be a useful resource for Ofcom in the future. He was very impressed by the way in which everyone was engaging and not getting stuck in the niceties of the legislation, which he admitted he was experiencing himself. I say that softly; I do not want to embarrass him in any way because he is an honourable man. However, the point he makes is really important.

I say to the Minister that I do not think we are very far apart on this. He knows that, because we have discussed it at some length over the last six to eight weeks. What I think he should take away from this debate is that this is a point where a decision has to be taken about whether the Government are going to go with the consensus view being expressed here and put deliberately into the Bill a repetitive statement, but one that is clear and unambiguous, about the intention behind the Government’s reason for bringing forward the Bill and for us, the Opposition and other Members of this House, supporting it, which is that we want a safe internet for our children. The way we are going to do that is by having in place, up front and clearly in one place, the things that matter when the regulatory structure sits in place and has to deal with the world as it is, of companies with business plans and business models that are at variance with what we think should be happening and that we know are destroying the lives of people we love and the future of our country—our children—in a way that is quite unacceptable when you analyse it down to its last detail.

It is not a question of saying back to us across the Dispatch Box—I know he wants to but I hope he will not—“Everything that you have said is in the Bill; we don’t need to go down this route, we don’t need another piece of writing that says it all”. I want him to forget that and say that actually it will be worth it, because we will have written something very special for the world to look at and admire. It is probably not in its perfect form yet, but that is what the Government can do: take a rough and ready potential diamond, polish it, chamfer it, and bring it back and set it in a diadem we would all be proud to wear—Coronations excepted—so that we can say, “Look, we have done the dirty work here. We’ve been right down to the bottom and thought about it. We’ve looked at stuff that we never thought in our lives we would ever want to see and survived”.

I shake at some of the material we were shown that Molly Russell was looking at. But I never want to be in a situation where I will have to say to my children and grandchildren, “We had the chance to get this right and we relied on a wonderful piece of work called the Online Safety Act 2023; you will find it in there, but it is going to take you several weeks and a lot of mental harm and difficulty to understand what it means”.

So, let us make it right. Let us not just say “It’ll be alright on the night”. Let us have it there. It is almost right but, as my noble friend Lord Knight said, it needs to be patched back into what is already in the Bill. Somebody needs to look at it and say, “What, out of that, will work as a statement to the world that we care about our kids in a way that will really make a difference?” I warn the Minister that, although I said at Second Reading that I wanted to see this Bill on the statute book as quickly as possible, I will not accept a situation where we do not have more on this issue.

Lord Parkinson of Whitley Bay (Con)

I am grateful to all noble Lords who have spoken on this group and for the clarity with which the noble Lord, Lord Stevenson, has concluded his remarks.

Amendments 20, 74, 93 and 123, tabled by the noble Baroness, Lady Kidron, would mean a significant revising of the Bill’s approach to content that is harmful to children. They would set a new schedule of harmful content and risk to children—the four Cs—on the face of the Bill and revise the criteria for user-to-user and search services carrying out child safety risk assessments.

I start by thanking the noble Baroness publicly—I have done so privately in our discussions—for her extensive engagement with the Government on these issues over recent weeks, along with my noble friends Lord Bethell and Lady Harding of Winscombe. I apologise that it has involved the noble Baroness, Lady Harding, missing her stop on the train. A previous discussion we had also very nearly delayed her mounting a horse, so I can tell your Lordships how she has devoted hours to this—as they all have over recent weeks. I would like to acknowledge their campaigning and the work of all organisations that the noble Baroness, Lady Kidron, listed at the start of her speech, as well as the families of people such as Olly Stephens and the many others that the right reverend Prelate the Bishop of Oxford mentioned.

I also reassure your Lordships that, in developing this legislation, the Government carried out extensive research and engagement with a wide range of interested parties. That included reviewing international best practice. We want this to be world-leading legislation, including the four Cs framework on the online risks of harm to children. The Government share the objectives that all noble Lords have echoed in making sure that children are protected from harm online. I was grateful to the noble Baroness, Lady Benjamin, for echoing the remarks I made earlier in Committee on this. I am glad we are on the same page, even if we are still looking at points of detail, as we should be.

As the noble Baroness, Lady Kidron, knows, it is the Government’s considered opinion that the Bill’s provisions already deliver these objectives. I know that she remains to be convinced, but I am grateful to her for our continuing discussions on that point, and for continuing to kick the tyres on this to make sure that this is indeed legislation of which we can be proud.

It is also clear that there is broad agreement across the House that the Bill should tackle harmful content to children, such as content that promotes eating disorders; illegal behaviour, such as grooming; and risk factors for harm, such as the method by which content is disseminated and the frequency of alerts. I am pleased to be able to put on record that the Bill as drafted already does this in the Government’s opinion, and reflects the principles of the four Cs framework, covering each of those: content, conduct, contact and commercial or contract risks to children.

First, it is important to understand how the Bill defines content, because that question of definition has been a confusing factor in some of the discussions hitherto. When we talk in general terms about content, we mean the substance of a message. This has been the source of some confusion. The Bill defines “content”, for the purposes of this legislation, in Clause 207 extremely broadly as

“anything communicated by means of an internet service”.

Under this definition, in essence, all user communication and activity, including recommendations by an algorithm, interactions in the metaverse, live streams, and so on, is facilitated by “content”. So, for example, unwanted and inappropriate contact from an adult to a child would be treated by the Bill as content harm. The distinctions that the four Cs make between content, conduct and contact risks are therefore not necessary. For the purposes of the Bill, they are all content risks.

Secondly, I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill.

Baroness Harding of Winscombe (Con)

Where are the commercial harms? I cannot totally get my head around my noble friend’s definition of content. I can sort of understand how it extends to conduct and contact, but it does not sound as though it could extend to the algorithm itself that is driving the addictive behaviour that most of us are most worried about.

Lord Knight of Weymouth (Lab)

In that vein, will the noble Lord clarify whether that definition of content does not include paid-for content?

Lord Parkinson of Whitley Bay (Con)

I was about to list the four Cs briefly in order, which will bring me on to commercial or contract risk. Perhaps I may do that and return to those points.

I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill. In terms of the four Cs category of content risks, there are specific duties for providers to protect children from illegal content, such as content that intentionally assists suicide, as well as content that is harmful to children, such as pornography. Regarding conduct risks, the child safety duties cover harmful conduct or activity such as online bullying or abuse and, under the illegal content safety duties, offences relating to harassment, stalking and inciting violence.

With regard to commercial or contract risks, providers specifically have to assess the risks to children from the design and operation of their service, including their business model and governance under the illegal content and child safety duties. In relation to contact risks, as part of the child safety risk assessment, providers will need specifically to assess contact risks of functionalities that enable adults to search for and contact other users, including children, in a way that was set out by my noble friend Lord Bethell. This will protect children from harms such as harassment and abuse, and, under the illegal content safety duties, all forms of child sexual exploitation and abuse, including grooming.

Baroness Kidron (CB)

I agree that content, although unfathomable to the outside world, is defined as the Minister says. However, does that mean that when we see that

“primary priority content harmful to children”

will be put in regulations by the Secretary of State under Clause 54(2)—ditto Clause 54(3) and (4)—we will see those contact risks, conduct risks and commercial risks listed as primary priority, priority and non-designated harms?

18:30
I do not want to make my speech twice, but in my final sentence I said that my challenge to the Government is to have a very simple way forward by other means, if those things were articulated, but my understanding is that they are to bring forward content harms that describe only content as we normally understand it.
Lord Parkinson of Whitley Bay (Con)

I have tried to outline the Bill’s definition of content, which I think will give some reassurance that other concerns that noble Lords have raised are covered. I will turn in a moment to address priority and primary priority content, if the noble Baroness will allow me to do that, and she can then perhaps intervene again if I have not done so to her satisfaction. I want to set that out and try to keep track of all the questions which have been posed as I do so.

For now, I know there have been concerns from some noble Lords that if functionalities are not labelled as harms in the legislation they would not be addressed by providers, and I reassure your Lordships’ House that this is not the case. There is an important distinction between content and other risk factors such as, for instance, an algorithm, which, without content, cannot risk causing harm to a child. That is why functionalities are not covered by the categories of primary priority and priority content which is harmful to children. The Bill sets out a comprehensive risk assessment process which will cover content or activity that poses a risk of harm to children and other factors, such as functionality, which may increase the risk of harm. As such, the existing children’s risk assessment criteria already cover many of the changes proposed in this amendment. For example, the duties already require service providers to assess the risk of harm to children from their business model and governance. They also require providers to consider how a comprehensive range of functionalities affect risk, how the service is used and how the use of algorithms could increase the risks to children.

Turning to the examples of harmful content set out in the proposed new schedule, I am happy to reassure the noble Baroness and other noble Lords that the Government’s proposed list of primary priority and priority content covers a significant amount of this content. In her opening speech she asked about cumulative harm—that is, content sent many times or content which is harmful due to the manner of its dissemination. We will look at that in detail on the next group as well, but I will respond to the points she made earlier now. The definition of harm in the Bill under Clause 205 makes it clear that physical or psychological harm may arise from the fact or manner of dissemination of the content, not just the nature of the content—content which is not harmful per se, but which, if sent to a child many times, for example by an algorithm, would meet the Bill’s threshold for content that is harmful to children. Companies will have to consider this as a fundamental part of their risk assessment, including, for example, how the dissemination of content via algorithmic recommendations may increase the risk of harm, and they will need to put in place proportionate and age-appropriate measures to manage and mitigate the risks they identify. I followed the exchanges between the noble Baronesses, Lady Kidron and Lady Fox, and I make it clear that the approach set out by the Bill will mean that companies cannot avoid tackling the kind of awful content which Molly Russell saw and the harmful algorithms which pushed that content relentlessly at her.

This point on cumulative harm was picked up by my noble friend Lord Bethell. The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service by way of payment or non-financial reward. This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children. The actions that companies will be required to take under their risk assessment duties in the Bill and the safety measures they will be required to put in place to manage the service’s risk will consider this bigger-picture risk profile.

The amendments of the noble Baroness, Lady Kidron, would remove references to primary priority and priority harmful content to children from the child risk assessment duties, which we fear would undermine the effectiveness of the child safety duties as currently drafted. That includes the duty for user-to-user providers to prevent children encountering primary priority harms, such as pornography and content that promotes self-harm or suicide, as well as the duty to put in place age-appropriate measures to protect children from other harmful content and activity. As a result, we fear these amendments could remove the requirement for an age-appropriate approach to protecting children online and make the requirement to prevent children accessing primary priority content less clear.

The noble Baroness, Lady Kidron, asked in her opening remarks about emerging harms, which she was right to do. As noble Lords know, the Bill has been designed to respond as rapidly as possible to new and emerging harms. First, the lists of primary priority and priority content can be updated by the Secretary of State. Secondly, it is important to remember the function of non-designated content that is harmful to children in the Bill—that is, content that meets the threshold of being harmful to children but is not on the lists designated by the Government. Companies are required to understand and identify this kind of content and, crucially, report it to Ofcom. Thirdly, this will inform the actions of Ofcom itself in its review and report duties under Clause 56, where it is required to review the incidence of harmful content and the severity of harm experienced by children as a result of it. This is not limited to content that the Government have listed as being harmful, as it is intended to capture new and emerging harms. Ofcom will be required to report back to the Government with recommendations on changes to the primary priority and priority content lists.

I turn to the points that the noble Lord, Lord Knight of Weymouth, helpfully raised earlier about things that are in the amendments but not explicitly mentioned in the Bill. As he knows, the Bill has been designed to be tech-neutral, so that it is future-proof. That is why there is no explicit reference to the metaverse or virtual or augmented reality. However, the Bill will apply to service providers that enable users to share content online or interact with each other, as well as search services. That includes a broad range of services such as websites, applications, social media sites, video games and virtual reality spaces such as the metaverse; those are all captured. Any service that allows users to interact, as the metaverse does, will need to conduct a children’s access assessment and comply with the child safety duties if it is likely to be accessed by children.

Amendment 123 from the noble Baroness, Lady Kidron, seeks to amend Clause 48 to require Ofcom to create guidance for Part 3 service providers on this new schedule. For the reasons I have just set out, we do not think it would be workable to require Ofcom to produce guidance on this proposed schedule. For example, the duty requires Ofcom to provide guidance on the content, whereas the proposed schedule includes examples of risky functionality, such as the frequency and volume of recommendations.

I stress again that we are sympathetic to the aim of all these amendments. As I have set out, though, our analysis leads us to believe that the four Cs framework is simply not compatible with the existing architecture of the Bill. Fundamental concepts such as risk, harm and content would need to be reconsidered in the light of it, and that would inevitably have a knock-on effect for a large number of clauses and timing. The Bill has benefited from considerable scrutiny—pre-legislative and in many discussions over many years. The noble Baroness, Lady Kidron, has been a key part of that and of improving the Bill. The task is simply unfeasible at this stage in the progress of the Bill through Parliament and risks delaying it, as well as significantly slowing down Ofcom’s implementation of the child safety duties. We do not think that this slowing down is a risk worth taking, because we believe the Bill already achieves what is sought by these amendments.

Even so, I say to the Committee that we have listened to the noble Baroness, Lady Kidron, and others and have worked to identify changes which would further address these concerns. My noble friend Lady Harding posed a clear question: if not this, what would the Government do instead? I am pleased to say that, as a result of the discussions we have had, the Government have decided to make a significant change to the Bill. We will now place the categories of primary priority and priority content which is harmful to children on the face of the Bill, rather than leaving them to be designated in secondary legislation, so Parliament will have its say on them.

We hope that this change will reassure your Lordships that protecting children from the most harmful content is indeed the priority for the Bill. That change will be made on Report. We will continue to work closely with the noble Baroness, Lady Kidron, my noble friends and others, but I am not able to accept the amendments in the group before us today. With that, I hope that she will be willing to withdraw.

Baroness Kidron (CB)

I thank all the speakers. There were some magnificent speeches and I do not really want to pick out any particular ones, but I cannot help but say that the right reverend Prelate described the world without the four Cs. For me, that is what everybody in the Box and on the Front Bench should go and listen to.

I am grateful and pleased that the Minister has said that the Government are moving in this direction. I am very grateful for that but there are a couple of things that I have to come back on. First, I have swiftly read the definition of harm in Clause 205 and I do not think it says that you do not have to reach a barrier of harm; dissemination is quite enough. There is always the problem of what the end result of the harm is. The thing that the Government are not listening to is the relationship between the risk assessment and the harm. It is about making sure that we are clear that it is the functionality that can cause harm. I think we will come back to this at another point, but that is what I beg them to listen to. Secondly, I am not entirely sure that it is correct to say that the four Cs mean that you cannot have primary priority, priority and so on. That could be within the schedule of content, so those two things are not actually mutually exclusive. I would be very happy to have a think about that.

What was not addressed in the Minister’s answer was the point made by the noble Lord, Lord Allan of Hallam, in supporting the proposal that we should have in the schedule: “This is what you’ve got to do; this is what you’ve got to look at; this is what we’re expecting of you; and this is what Parliament has delivered”. That is immensely important, and I was so grateful to the noble Lord, Lord Stevenson, for putting his marker down on this set of amendments. I am absolutely committed to working alongside him and to finding ways around this, but we need to find a way of stating it.

Ironically, that is my answer to both the noble Baronesses, Lady Ritchie and Lady Fox: we should have our arguments here and now, in this Chamber. I do not wish to leave it to the Secretary of State, whom I have great regard for, as it happens, but who knows: I have seen a lot of Secretaries of State. I do not even want to leave it to the Minister, because I have seen a lot of Ministers too—ditto Ofcom, and definitely not the tech sector. So here is the place, and we are the people, to work out the edges of this thing.

Not for the first time, my friend, the noble Baroness, Lady Harding, read out what would have been my answer to the noble Baroness, Lady Ritchie. I have gone round and round, and it is like the Marx brothers’ movie: in the end, harm is defined by subsection (4)(c), but that says that harm will be defined by the Secretary of State. It goes around like that through the Bill.

18:45
There are three members of the pre-legislative committee in the Chamber. We were very clear about design features, and several members who are not present were even clearer. So I hear where we are with the Bill, but I have been following it for five years and have been saying the same thing, so if we are a little late to the party I do not think that is because of me. I do not want to delay the Bill but I want to stamp the authority of Parliament on the question of how harm happens, as well as what it is.
My last sentence has to be: let us remember our conversation about trying to measure illegal harm and then think about it at scale for children. We have to have something softer than that; we cannot do it for each piece of content. The saving grace of the Bill is its systems and processes: it will make the tsunami a trickle—that is what we want to do. It is not to say that young people should not have access to the internet. Although I spent quite a lot of time disagreeing with the noble Baroness, Lady Fox, today, I absolutely agree with her about evolving capacities, and I hope that we revisit that question later. With that, I beg leave to withdraw my amendment.
Amendment 20 withdrawn.
Amendment 21 not moved.
Amendment 21A
Moved by
21A: Clause 10, page 10, line 1, after “19(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 19 about supplying records of risk assessments to OFCOM.
Amendment 21A agreed.
Clause 10, as amended, agreed.
Clause 11: Safety duties protecting children
Amendment 22 not moved.
Amendment 22A
Moved by
22A: Clause 11, page 10, line 6, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to summarise children’s risk assessments in the terms of service (see the amendment in the Minister’s name inserting new subsection (10A) below) is imposed only on providers of Category 1 services.
Amendment 22A agreed.
House resumed.
House adjourned at 6.48 pm.
Committee (4th Day)
16:18
Relevant document: 28th Report from the Delegated Powers Committee
Clause 11: Safety duties protecting children
Amendment 23
Moved by
23: Clause 11, page 10, line 9, at beginning insert “eliminate,”
Member’s explanatory statement
This amendment would require user to user services to eliminate identified risks to children from their platforms in addition to mitigating and managing them.
Lord Russell of Liverpool (CB)

My Lords, this large group of 33 amendments is concerned with preventing harm to children, by creating a legal requirement to design the sites and services that children will access in a way that will put their safety first and foremost. I thank my co-sponsors, the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lord, Lord Knight. First of all, I wish to do the most important thing I will do today: to wish the noble Baroness, Lady Kidron, a very happy birthday.

Noble Lords

Hear, hear!

Lord Russell of Liverpool (CB)

My co-sponsors will deal with some of the more detailed elements of the 30 amendments that we are dealing with. These will include safety duties, functionality and harm, and codes of practice. I am sure that the noble Lords, Lord Stevenson and Lord Knight, and the right reverend Prelate the Bishop of Oxford will speak to their own amendments.

I will provide a brief overview of why we are so convinced of the paramount need for a safety by design approach to protect children and remind digital companies and platforms, forcibly and legally, of their obligation to include the interests and safety of children as a paramount element within their business strategies and operating models. These sites and services are artificial environments. They were designed artificially and can be redesigned artificially.

In her testimony to the US Senate in July 2021, the Facebook whistleblower Frances Haugen put her finger on it rather uncomfortably when talking about her erstwhile employer:

“Facebook know that they are leading young users to anorexia content … Facebook’s internal research is aware that there are a variety of problems facing children on Instagram … they know that severe harm is happening to children”.


She was talking about, probably, three years ago.

On the first day of Committee, the noble Lord, Lord Allan, who is not with us today, used the analogy of the legally mandated and regulated safe design of aeroplanes and automobiles and the different regimes that cover their usage to illustrate some of our choices in dealing with regulation. We know why aeroplanes and cars have to be designed safely; we also know that either form of transportation could be used recklessly and dangerously, which is why we do not allow children to fly or drive them.

First, let us listen to the designers of these platforms and services through some research done by the 5Rights Foundation in July 2021. These are three direct quotes from the designers:

“Companies make their money from attention. Reducing attention will reduce revenue. If you are a designer working in an attention business, you will design for attention … Senior stakeholders like simple KPIs. Not complex arguments about user needs and human values … If a senior person gives a directive, say increase reach, then that’s what designers design for without necessarily thinking about the consequences”.


Companies know exactly what they need to do to grow and to drive profitability. However, they mostly choose not to consider, mitigate or prioritise avoiding some of the potentially harmful consequences. What they design and prioritise are strategies to maximise consumption, activity and profitability. They are very good at it.

Let us hear what the children say, remembering that some recent research indicates that 42% of five to 12 year-olds in this country use social media. The Pathways research project I referred to earlier worked closely with 21 children aged 12 to 18, who said: “We spend more time online than we feel we should, but it’s tough to stop or cut down”. “If we’re not on social media, we feel excluded”. “We like and value the affirmations and validations we receive”. “We create lots of visual content, much of it about ourselves, and we share it widely”. “Many of us are contacted by unknown adults”. “Many of us recognise that, through using social media, we have experienced body image and relationships problems”.

To test whether the children in this research project were accurately reporting their experiences, the project placed a series of child avatars—ghost children, in effect—on the internet, whose profiles very clearly stated that they were children.

They found—in many cases within a matter of hours of the profiles going online—proactive contacting by strangers and rapid recommendations to engage more and more. If searches were conducted for eating disorders or self-harm, the avatars were quickly able to access content irrespective of their stated ages and clearly evident status as children. At the same time they were being sent harmful or inappropriate content, they also received age-relevant advertising for school revision and for toys—the social media companies knew that these accounts were registered as children.

This research was done two years ago. Has anything improved since then? It just so happens that 5Rights has produced another piece of research which is about to be released, and which used the exact same technique—creating avatars to see what they would experience online. They used 10 avatars based on real children aged between 10 and 16, so what happened? For an 11 year-old avatar, Instagram was recommending images of knives with the caption “This is what I use to self-harm”; design features were leading children from innocent searches to harmful content very quickly.

I think any grandparents in the Chamber will be aware of an interesting substance known as “Slime”—a form of particularly tactile playdough which one’s grandchildren seem to enjoy. Typing in “Slime” on Reddit was one search, and one click, away from pornography; exactly the same thing happened on Reddit when the avatar typed in “Minecraft”, another very popular game with our children or grandchildren. A 15 year-old female avatar was private-messaged on Instagram by a user that she did not follow—an unknown adult who encouraged her to link on to pornographic content on Telegram, another instant messaging service. On the basis of this evidence, it appears that little or nothing has changed; it may have even got slightly worse.

By an uncomfortable coincidence, last week, Meta, the parent company of Facebook and Instagram, published better than expected results and saw its market value increase by more than $50 billion in after-hours trading. Mark Zuckerberg, the founder of Meta, proudly announced that Meta is pouring investment into artificial intelligence tools to make its platform more engaging and its advertising more effective. Of particular interest and concern given the evidence of the avatars was his announcement that since the introduction of Reels, a short-form video feed designed specifically to respond to competition from TikTok, its AI-driven recommendations had boosted the average time people spend on Instagram by 24%.

To return to the analogy of planes and cars used by the noble Lord, Lord Allan, we are dealing here with planes and cars in the shape of platforms and applications which we know are flawed in their design. They are not adequately designed for safety, and we know that they can put users, particularly children and young people, in the way of great harm, as many grieving families can testify.

In conclusion, our amendments propose that companies must design digital services that cater for the vulnerabilities, needs, and rights of children and young people by default; children’s safety cannot and must not be an afterthought or a casualty of their business models. We are asking for safety by design to protect children to become the mandatory standard. What we have today is unsafe design by default, driven by commercial strategies which can lead to children becoming collateral damage.

Given that it is the noble Baroness’s birthday, I am sure we can feel confident that the Minister will have a positive tone when he replies. I beg to move.

16:30
Baroness Kidron (CB)

It is a great pleasure to follow my noble friend Lord Russell and to thank him for his good wishes. I assure the Committee that there is nowhere I would rather spend my birthday, in spite of some competitive offers. I remind noble Lords of my interests in the register, particularly as the chair of 5Rights Foundation.

As my noble friend has set out, these amendments fall in three places: the risk assessments, the safety duties and the codes of practice. However, together they work on the overarching theme of safety by design. I will restrict my detailed remarks to a number of amendments in the first two categories. This is perhaps a good moment to recall the initial work of Carnegie, which provided the conceptual approach of the Bill several years ago in arguing for a duty of care. The Bill has gone many rounds since then, but I think the principle remains that a regulated service should consider its impact on users before it causes them harm. Safety by design, to which all the amendments in this group refer, is an embodiment of a duty of care. In thinking about these amendments as a group, I remind the Committee that both the proportionality provisions and the fact that this is a systems and processes Bill means that no company can, should or will be penalised for a single piece of content, a single piece of design or, indeed, low-level infringements.

Amendments 24, 31, 77 and 84 would delete “content” from the Government’s description of what is harmful to children, meaning that the duty is to consider harm in the round rather than just harmful content. The definition of “content” is drawn broadly in Clause 207 as

“anything communicated by means of an internet service”,

but the examples in the Bill, including

“written material … music and data of any description”,

once again fail to include design features that are so often the key drivers of harm to children.

On day three of Committee, the Minister said:

“The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service … This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children”.—[Official Report, 27/4/23; col. 1385.]


However, in looking at the child safety duties, Clause 11(5) says:

“The duties … in subsections (2) and (3) apply across all areas of a service, including the way it is designed, operated and used”,


but subsection (14) says:

“The duties set out in subsections (3) and (6)”—


which are the duties to operate proportionate systems and processes to prevent and protect children from encountering harmful content and to include them in terms of service—

“are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.

I hesitate to say whether that is contradictory. I am not actually sure, but it is confusing. I am concerned that while we are reassured that “content” means content and activity and that the risk assessment considers functionality, “harm” is then repeatedly expressed only in the form of content.

Over the weekend, I had an email exchange with the renowned psychoanalyst and author, Norman Doidge, whose work on the plasticity of the brain profoundly changed how we think about addiction and compulsion. In the exchange, he said that

“children’s exposures to super doses, of supernormal images and scenes, leaves an imprint that can hijack development”.

Then, he said that

“the direction seems to be that AI would be working out the irresistible image or scenario, and target people with these images, as they target advertising”.

His argument is that it is not just the image but the dissemination and tailoring of that image that maximises the impact. The volume and frequency of those images create habits in children that take a lifetime to change—if they change at all. Amendments 32 and 85 would remove this language to ensure that content that is harmful by virtue of its dissemination is accounted for.

I turn now to Amendments 28 and 82, which cut the reference to the

“size and capacity of the provider of the service”

in deeming what measures are proportionate. We have already discussed that small is not safe. Platforms such as Yubo, Clapper and Discord have all been found to harm children and, as both the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, told us, small can become big very quickly. It is far easier to build to a set of rules than it is to retrofit them after the event. Again, I point out that Ofcom already has duties of proportionality; adding size and capacity is unnecessary and may tip the scale to creating loopholes for smaller services.

Amendment 138 seeks to reverse the exemption in Clause 54 of financial harms. More than half of the 100 top-grossing mobile phone apps contain loot boxes, which are well established as unfair and unhealthy, priming young children to gamble and leading to immediate hardship for parents landed with extraordinary bills.

By rights, Amendments 291 and 292 could fit in the future-proof set of amendments. The way that the Bill in Clause 204 separates out functionalities in terms of search and user-to-user is in direct opposition to the direction of travel in the tech sector. TikTok does shopping, Instagram does video, Amazon does search; autocomplete is an issue across the full gamut of services, and so on and so forth. This amendment simply combines the list of functionalities that must be risk-assessed and makes them apply to any regulated service. I cannot see a single argument against this amendment: it cannot be the Government’s intention that a child can be protected, on search services such as Google, from predictive search or autocomplete, but not on TikTok.

Finally, Amendment 295 will embed the understanding that most harm is cumulative. If the Bereaved Parents for Online Safety were in the Chamber, or any child caught up in self-harm, depression sites, gambling, gaming, bullying, fear of exposure, or the inexorable feeling of losing their childhood to an endless scroll, they would say at the top of their voices that it is not any individual piece of content, or any one moment or incident, but the way in which they are nudged, pushed, enticed and goaded into a toxic, harmful or dangerous place. Adding the simple words

“the volume of the content and the frequency with which the content is accessed”

to the interpretation of what can constitute harm in Clause 205 is one of the most important things that we can do in this Chamber. This Bill comes too late for a whole generation of parents and children but, if these safety by design amendments can protect the next generation of children, I will certainly be very glad.

Baroness Harding of Winscombe (Con)

My Lords, it is an honour, once again, to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, in this Committee. I am going to speak in detail to the amendments that seek to change the way the codes of practice are implemented. Before I do, however, I will very briefly add my voice to the general comments that the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, have just taken us through. Every parent in the country knows that both the benefit and the harm that online platforms can bring to our children are not just about the content. It is about the functionality: the way these platforms work; the way they suck us in. They do give us joy but they also drive addiction. It is hugely important that this Bill reflects the functionality that online platforms bring, and not just content in the normal sense of the word “content”.

I will now speak in a bit more detail about the following amendments: Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A—I will finish soon, I promise—112, 122ZA, 122ZB and 122ZC.

Lord Vaizey of Didcot (Con)

My noble friend may have left one out.

Baroness Harding of Winscombe (Con)

I am afraid I may well have done.

That list shows your Lordships some of the challenges we all have with the Bill. All these amendments seek to ensure that the codes of practice relating to child safety are binding. Such codes should be principles-based and flexible to allow companies to take the most appropriate route of compliance, but implementing these codes should be mandatory, rather than, as the Bill currently sets out, platforms being allowed to use “alternative measures”. That is what all these amendments do—they do exactly the same thing. That was a clear and firm recommendation from the joint scrutiny committee. The government’s response to that joint scrutiny committee report was really quite weak. Rather than rehearse the joint scrutiny committee’s views, I will rehearse the Government’s response and why it is not good enough to keep the Bill as it stands.

The first argument the Government make in their response to the joint scrutiny report is that there is no precedent for mandatory codes of conduct. But actually there are. There is clear precedent in child protection. In the physical world, the SEND code for how we protect some of our most vulnerable children is mandatory. Likewise, in the digital world, the age-appropriate design code, which we have mentioned many a time, is also mandatory. So there is plenty of precedent.

The second concern—this is quite funny—was that stakeholders were concerned about having multiple codes of conduct because it could be quite burdensome on them. Well, forgive me for not crying too much for these enormous tech companies relative to protecting our children. The burden I am worried about is the one on Ofcom. This is an enormous Bill, which places huge amounts of work on a regulator that already has a very wide scope. If you make codes of conduct non-mandatory, you are in fact making the work of the regulator even harder. The Government themselves in their response say that Ofcom has to determine what the minimum standards should be in these non-binding codes of practice. Surely it is much simpler and more straightforward to make these codes mandatory and, yes, to add potentially a small additional burden to these enormous tech companies to ensure that we protect our children.

The third challenge is that non-statutory guidance already looks as if it is causing problems in this space. On the video-sharing platform regime, which is non-mandatory, Ofcom has already said that in its first year of operation it has

“seen a large variation in platforms’ readiness to engage with Ofcom”.

All that will simply make it harder and harder, so the burden will lie on this regulator—which I think all of us in this House are already worried is being asked to do an awful lot—if we do not make it very clear what is mandatory and what is not. The Secretary of State said of the Bill that she is

“determined to put these vital protections for … children … into law as quickly as possible”.

A law that puts in place a non-mandatory code of conduct is not what parents across the country would expect from that statement from the Secretary of State. People out there—parents and grandparents across the land—would expect Ofcom to be setting some rules and companies to be required to follow them. That is exactly what we do in the physical world, and I do not understand why we would not want to do it in the digital world.

Finally—I apologise for having gone on for quite a long time—I will very briefly talk specifically to Amendment 32A, in the name of the noble Lord, Lord Knight, which is also in this group. It is a probing amendment which looks at how the Bill will address and require Ofcom and participants to have due regard to VPNs: the ability for our savvy children—I am the mother of two teenage girls—to get round all this by using a VPN to access the content they want. This is an important amendment and I am keen to hear what my noble friend the Minister will say in response. Last week, I spoke about my attempts to find out how easy it would be for my 17 year-old daughter to access pornography on her iPhone. I spoke about how I searched in the App Store on her phone and found that immediately a whole series of 17-plus-rated apps came up that were pornography sites. What I did not mention then is that with that—in fact, at the top of the list—came a whole series of VPN apps. Just in case my daughter was naive enough to think that she could just click through and watch it, and Apple was right that 17 year-olds were allowed to watch pornography, which obviously they are not, the App Store was also offering her an easy route to access it through a VPN. That is not about content but functionality, and we need to properly understand why this bundle of amendments is so important.

16:45
Lord Moylan (Con)

My Lords, I was not going to speak on this group, but I was provoked into offering some reflections on the speech by the noble Lord, Lord Russell of Liverpool, especially his opening remarks about cars and planes, which he said were designed to be safe. He did not mention trains, about which I know something as well, and which are also designed to be safe. These are a few initial reflective points. They are designed in very different ways. An aeroplane is designed never to fail; a train is designed so that if it fails, it will come to a stop. They are two totally different approaches to safety. Simply saying that something must be designed to be safe does not answer questions; it opens questions about what we actually mean by that. The noble Lord went on to say that we do not allow children to drive cars and fly planes. That is absolutely true, but the thrust of his amendment is that we should design the internet so that it can be driven by children and used by children—so that it is designed for them, not for adults. That is my problem with the general thrust of many of these amendments.

A further reflection that came to mind as the noble Lord spoke was on a book of great interest that I recommend to noble Lords. It is a book by the name of Risk written in 1995 by Professor John Adams, then professor of geography at University College London. He is still an emeritus professor of geography there. It was a most interesting work on risk. First, it reflected on how little we actually know about many of the things whose risks we are trying to assess.

More importantly, he went on to say that people have an appetite for risk. That appetite for risk—that risk budget, so to speak—changes over the course of one’s life: one has much less appetite for risk when one gets to a certain age than perhaps one had when one was young. I have never bungee jumped in my life, and I think I can assure noble Lords that the time has come when I can say I never shall, but there might have been a time when I was younger when I might have flung myself off a cliff, attached to a rubber band and so forth—noble Lords may have done so. One has an appetite for risk.

The interesting thing that he went on to develop from that was the notion of risk compensation: that if you have an appetite for risk and your opportunities to take risks are taken away, all you do is compensate by taking risks elsewhere. So a country such as New Zealand, which has some of the strictest cycling safety laws, also has a very high incidence of bungee jumping among the young; as they cannot take risks on their bicycles, they will find ways to go and do it elsewhere.

Although these reflections are not directly germane to the amendments, they are important as we try to understand what we are seeking to achieve here, which is a sort of hermetically sealed absence of risk for children. I do not think it will work. I said at Second Reading that I thought the flavour of the debate was somewhat similar to a late medieval conclave of clerics trying to work out how to mitigate the harmful effects of the invention of movable type. That did not work either, and I think we are in a very similar position today as we discuss this.

There is also the question of harm and what it means. While the examples being given by noble Lords are very specific and no doubt genuinely harmful, and are the sorts of things that we should like to stop, the drafting of the amendments, using very vague words such as “harm”, is dangerous overreach in the Bill. To give just one example, for the sake of speed, when I was young, administering the cane periodically was thought good for a child in certain circumstances. The mantra was, “Spare the rod and spoil the child”, though I never heard it said. Nowadays, we would not think it morally or psychologically good to do physical harm to a child. We would regard it as an unmitigated harm and, although not necessarily banned or illegal, it is something that—

Lord Russell of Liverpool (CB)

My Lords, I respond to the noble Lord in two ways. First, I ask him to reflect on how the parents of the children who have died through what the parents would undoubtedly view as serious and unbearable harm would feel about his philosophical ruminations. Secondly, as somebody who has the privilege of being a Deputy Speaker in your Lordships’ House, I would say that it is incumbent on us all to focus on the amendments in question and stay on them, to save time and get through the business.

Lord Moylan (Con)

Well, I must regard myself as doubly rebuked, and unfairly, because my reflections are very relevant to the amendments, and I have developed them in that direction. In respect of the parents, they have suffered very cruelly and wrongly, but although it may sound harsh, as I have said in this House before on other matters, hard cases make bad law. We are in the business of trying to make good law that applies to the whole population, so I do not think that these are wholly—

Baroness Harding of Winscombe (Con)

If my noble friend could, would he roll back the health and safety regulations for selling toys, in the same way that he seems so happy to have no health and safety regulations for children’s access to digital toys?

Lord Moylan (Con)

My Lords, if the internet were a toy, aimed at children and used only by children, those remarks would of course be very relevant, but we are dealing with something of huge value and importance to adults as well. It is the lack of consideration of the role of adults, the access for adults and the effects on freedom of expression and freedom of speech, implicit in these amendments, that cause me so much concern.

I seem to have upset everybody. I will now take issue with and upset the noble Baroness, Lady Benjamin, with whom I have not engaged on this topic so far. At Second Reading and earlier in Committee, she used the phrase, “childhood lasts a lifetime”. There are many people for whom this is a very chilling phrase. We have an amendment in this group—a probing amendment, granted—tabled by the noble Lord, Lord Knight of Weymouth, which seeks to block access to VPNs as well. We are in danger of putting ourselves in the same position as China, with a hermetically sealed national internet, attempting to put borders around it so that nobody can breach it. I am assured that even in China this does not work and that clever and savvy people simply get around the barriers that the state has erected for them.

Before I sit down, I will redeem myself a little, if I can, by giving some encouragement to the noble Baroness, Lady Kidron, on Amendments 28 and 32—although I think the amendments are in the name of the noble Lord, Lord Russell of Liverpool. These amendments, if we are to assess the danger posed by the internet to children, seek to substitute an assessment of the riskiness of the provider for the Government’s emphasis on the size of the provider. As I said earlier in Committee, I do not regard size as being a source of danger. When it comes to many other services—I mentioned that I buy my sandwich from Marks & Spencer as opposed to a corner shop—it is very often the bigger provider I feel is going to be safer, because I feel I can rely on its processes more. So I would certainly like to hear how my noble friend the Minister responds on that point in relation to Amendments 28 and 32, and why the Government continue to put such emphasis on size.

More broadly, in these understandable attempts to protect children, we are in danger of using language that is far too loose and of having an effect on adult access to the internet which is not being considered in the debate—or at least has not been until I have, however unwelcomely, raised it.

Lord Vaizey of Didcot (Con)

My Lords, I assure your Lordships that I rise to speak very briefly. I begin by reassuring my noble friend Lord Moylan that he is loved in this Chamber and outside. I was going to say that he is the grit in the oyster that ensures that a consensus does not establish itself and that we think hard about these amendments, but I will revise that and say he is now the bungee jumper in our ravine. I think he often makes excellent and worthwhile points about the scope and reach of the Bill and the unintended consequences. Indeed, we debated those when we debated the amendments relating to Wikipedia, for example.

Obviously, I support these amendments in principle. The other reason I wanted to speak was to wish the noble Baroness, Lady Kidron—Beeban—a happy birthday, because I know that these speeches will be recorded on parchment bound in vellum and presented to her, but also to thank her for all the work that she has done for many years now on the protection of children’s rights on the internet. It occurred to me, as my noble friend Lady Harding was speaking, that there were a number of points I wanted to seek clarity on, either from the Minister or from the proponents of the amendments.

First, the noble Baroness, Lady Harding, mentioned the age-appropriate design code, which was a victory for the noble Baroness, Lady Kidron. It has, I think, already had an impact on the way that some sites that are frequented by children are designed. I know, for instance, that TikTok—the noble Baroness will correct me—prides itself on having made some changes as a result of the design code; for example, its algorithms are able, to a certain extent, to detect whether a child is under 13. I know anecdotally that children under 13 sometimes do have their accounts taken away; I think that is a direct result of the amendments made by the age-appropriate design code.

I would like to understand how these amendments, and the issue of children’s rights in this Bill, will interact with the age-appropriate design code, because none of us wants the confetti of regulations that either overlap or, worse, contradict themselves.

Secondly, I support the principle of functionality. I think it is a very important point that these amendments make: the Bill should not be focused solely on content but should take into account that functionality leads to dangerous content. That is an important principle on which platforms should be held to account.

Thirdly, going back to the point about the age-appropriate design code, the design of websites is extremely important and should be part of the regulatory system. Those are the points I wanted to make.

17:00
In relation to how my noble friend Lord Moylan is approaching the Bill, I would say this: having been a Minister when the British Government—and, indeed, other Governments—had no power at all, it was very telling when the then Prime Minister threatened Google with legislation on the issue of child abuse images, saying, “If you do not do something, I will legislate”.
At that time, I was on the tech side of the argument. Google went from saying, “It is impossible to do anything” to identifying 130,000 phrases that people might type into search engines when searching for child abuse images, which, in theory—I have not tried this myself, I hasten to add—would come up with no return and, indeed, a warning that the person in question was searching for those images.
Again, I say to my noble friend Lord Moylan—who I encourage to keep going with his scepticism about the Bill; it is important—that it is a bit of a dead end at any point in his argument to compare us with China. That is genuinely comparing apples with oranges. When people were resisting regulation in this sphere, they would always say, “That’s what the Chinese want”. We have broadcasting regulation and other forms of health and safety regulation. It is not the mark of an autocratic or totalitarian state to have regulation; platforms need to be held to account. I simply ask the proponents of the amendments to make it clear as they proceed how this fits in with existing regulations, such as the age-appropriate design code.
Baroness Fox of Buckley (Non-Afl)

My Lords, I want, apart from anything else, to speak in defence of philosophical ruminations. The only way we can scrutinise the amendments in Committee is to do a bit of philosophical rumination. We are trying to work out what the amendments might mean in terms of changing the Bill.

I read these amendments, noted their use of “eliminate”—we have to “eliminate” all risks—and wondered what that would mean. I do not want to feel that I cannot ask these kinds of difficult questions for fear that I will offend a particular group or that it would be insensitive to a particular group of parents. It is difficult but we are required as legislators to try to understand what each of us is trying to change, or how we are going to try to change the law.

I say to those who have put “eliminate” prominently in a number of these amendments that it is impossible to eliminate all risks to children—is it not?—if they are to have access to the online world, unless you ban them from the platforms completely. Is “eliminate” really helpful here?

Previously in Committee, I talked a lot about the potential dangers, psychologically and with respect to development, of overcoddling young people, of cotton wool kids, and so on. I noted an article over the weekend by the science journalist Tom Chivers, which included arguments from the Oxford Internet Institute and various psychologists that the evidence on whether social media is harmful, particularly for teenagers, is ambiguous.

I am very convinced by the examples brought forward by the noble Baroness, Lady Kidron—and I too wish her a happy birthday. We all know about the targeting of young people and so forth, but I am also aware of the positives. I always try to balance these things out and make sure that we do not deny young people access to the positives. In fact, I found myself cheering at the next group of amendments, which is unusual. First, they depend on whether you are four or 14—in other words, you have to be age-specific—and, secondly, they recognise that we do not want to pass anything in the Bill that actually denies children access to either their own privacy or the capacity to know more.

I also wanted to explore a little the idea of expanding the debate away from content to systems, because this is something that I think I am not quite understanding. My problem is that moving away from the discussion on whether content is removed or accessible, and focusing on systems, does not mean that content is not in scope. My worry is that the systems will have an impact on what content is available.

Let me give some examples of things that can become difficult if we think that we do not want young people to encounter violence and nudity—which makes it seem as though we know what we are talking about when we talk about “harmful”. We will all recall that, in 2018, Facebook removed content from the Anne Frank Centre posted by civil rights organisations because it included photographs of the Holocaust featuring undressed children among the victims. Facebook apologised afterwards. None the less, my worry is these kinds of things happening. Another example, in 2016, was the removal of the Pulitzer Prize-winning photograph “The Terror of War”, featuring fleeing Vietnamese napalm victims in the 1970s, because the system thought it was something dodgy, given that the photo was of a naked child fleeing.

I need to understand how system changes will not deprive young people of important educational information such as that. That is what I am trying to distinguish. The point made by the noble Lord, Lord Moylan, about “harmful” not being defined—I have endlessly gone on about this, and will talk more about it later—is difficult because we think that we know what we mean by “harmful” content.

Finally, on the amendments requiring compliance with Ofcom codes of practice, that would give an extraordinary amount of power to the regulator and the Secretary of State. Since I have been in this place, people have rightly drawn my attention to the dangers of delegating power to the Executive or away from any kind of oversight—there has been fantastic debate and discussion about that. It seems to me that these amendments advocate delegated powers being given to the Secretary of State and Ofcom, an unelected body—the Secretary of State could amend for reasons of public policy in order to protect children—and this is to be put through the negative procedure. In any other instance, I would have expected outcry from the usual suspects, but, because it involves children, we are not supposed to object. I worry that we need to have more scrutiny of such amendments and not less, because in the name of protecting children unintended consequences can occur.

Baroness Kidron (CB)

I want to answer the point by saying that amendments cannot be seen in isolation. Noble Lords will remember that we had a long and good debate about what constituted harms to children. There was a big argument and the Minister made some warm noises in relation to putting harms to children in the Bill. There is some alignment between many people in the Chamber whereby we and Parliament would like to determine what harm is, and I very much share the noble Baroness’s concern about pointing out what that is.

On the issue of the system versus the content, I am not sure that this is the exact moment but the idea of unintended consequences keeps getting thrown up when we talk about trying to point the finger at what creates harm. There are unintended consequences now, except that it is neither Ofcom, the Secretary of State nor Parliament but only the tech sector that has a say in what those unintended consequences are. As someone who has been bungee jumping, I am deeply grateful that there are very strict rules under which that is allowed to happen.

The Lord Bishop of Gloucester

My Lords, I support the amendments in this group that, with regard to safety by design, will address functionality and harms—whatever exactly we mean by that—as well as child safety duties and codes of practice. The noble Lord, Lord Russell, and the noble Baronesses, Lady Harding and Lady Kidron, have laid things out very clearly, and I wish the noble Baroness, Lady Kidron, a happy birthday.

I also support Amendment 261 in the name of my right reverend friend the Bishop of Oxford and supported by the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville. This amendment would allow the Secretary of State to consider safety by design, and not just content, when reviewing the regime.

As we have heard, a number of the amendments would amend the safety duties to children to consider all harms, not just harmful content, and we have begun to have a very interesting debate on that. We know that service features create and amplify harms to children. These harms are not limited to spreading harmful content; features in and of themselves may cause harm—for example, beautifying filters, which can create unrealistic body ideals and pressure on children to look a certain way. In all of this, I want us to listen much more to the voices of children and young people—they understand this issue.

Last week, as part of my ongoing campaign on body image, including how social media can promote body image anxiety, I met a group of young people from two Gloucestershire secondary schools. They were very good at saying what the positives are, but noble Lords will also be very familiar with many of the negative issues that were on their minds, which I will not repeat here. While they were very much alive to harmful content and the messages it gives them, they were keen to talk about the need to address algorithms and filters that they say feed them strong messages and skew the content they see, which might not look harmful but, because of design, accentuates their exposure to issues and themes about which they are already anxious. Suffice to say that underpinning most of what they said to me was a sense of powerlessness and anxiety when navigating the online world that is part of their daily lives.

The current definition of content does not include design features. Building in a safety by design principle from the outset would reduce harms in a systematic way, and the amendments in this group would address that need.

Baroness Fraser of Craigmaddie (Con)

My Lords, I support this group of amendments. Last week, I was lucky—that is not necessarily the right word—to participate in a briefing organised by the noble Lord, Lord Russell of Liverpool, with the 5Rights Foundation on its recent research, which the noble Lord referred to. As the mother of a 13 year-old boy, I came away wondering why on earth you would not want to ensure safety by design for children.

I am aware from my work with disabled children that we know, as Ofcom knows from its own research, that children—or indeed anyone with a long-term health impact or a disability—are far more likely to encounter and suffer harm online. As I say, I struggle to see why you would not want to have safety by design.

This issue must be seen in the round. In that briefing we were taken through how quickly you could get from searching for something such as “slime” to extremely graphic pornographic content. As your Lordships can imagine, I went straight back to my 13 year-old son and said, “Do you know about slime, and where have you seen it?” He said, “Yes, Mum, I’ve watched it on YouTube”. That echoes the point made by the noble Baroness, Lady Kidron—to whom I add my birthday wishes—that these issues have to be seen in the round because you do not just consume content; you can search on YouTube, shop on Google, search on Amazon and all the rest of it. I support this group of amendments.

17:15
Viscount Colville of Culross (CB)

I too wish my noble friend Lady Kidron a happy birthday.

I will speak to Amendment 261. Having sat through the Communications Committee’s inquiries on regulating the internet, it seemed to me that the real problem was the algorithms and the way they operated. We have heard that again and again throughout the course of the Bill. It is no good worrying just about the content, because we do not know what new services will be created by technology. This morning we heard on the radio from the Google AI expert, who said that we have no idea where AI will go or whether it will become cleverer than us; what we need to do is to keep an eye on it. In the Bill, we need to make sure that we are looking at the way technology is being developed and the possible harms it might create. I ask the Minister to include that in his future-proofing of the Bill, because, in the end, this is a very fast-moving world and ecosystem. We all know that what is present now in the digital world might well be completely changed within a few years, and we need to remain cognisant of that.

Lord Clement-Jones (LD)

My Lords, we have already had some very significant birthdays during the course of the Bill, and I suspect that, over many more Committee days, there will be many more happy birthdays to celebrate.

This has been a fascinating debate and the Committee has thrown up some important questions. On the second day, we had a very useful discussion of risk which, as the noble Lord, Lord Russell, mentioned, was prompted by my noble friend Lord Allan. In many ways, we have returned to that theme this afternoon. The noble Baroness, Lady Fox, who I do not always agree with, asked a fair question. As the noble Baroness, Lady Kidron, said, it is important to know what harms we are trying to prevent—that is how we are trying to define risk in the Bill—so that is an absolutely fair question.

The Minister has shown flexibility. Sadly, I was not able to be here for the previous debate, and it is probably because I was not that he conceded the point and agreed to put children’s harms in the Bill. That takes us a long way further, and I hope he will demonstrate that kind of flexibility as we carry on through the Bill.

The noble Lord, Lord Moylan, and I have totally different views about what risk it is appropriate for children to face. I am afraid that I absolutely cannot share his view that there is this level of risk. I do not believe it is about eliminating risk—I do not see how you can—but the Bill should be about preventing online risk to children; it is the absolute core of the Bill.

As the noble Lord, Lord Russell, said, the Joint Committee heard evidence from Frances Haugen about the business model of the social media platforms. We listened to Ian Russell, the father of Molly, talk about the impact of an unguarded internet on his daughter. It is within the power of the social media companies to do something about that; this is not unreasonable.

I was very interested in what the noble Viscount, Lord Colville, said. He is right that this is about algorithms, which, in essence, are what we are trying to get to in all the amendments in this really important group. It is quite possible to tackle algorithms if we have a requirement in the Bill to do so, and that is why I support Amendment 261, which tries to address that.

However, a lot of the rest of the amendments are trying to do exactly the same thing. There is a focus not just on moderating harmful content but on the harmful systems that make digital services systematically unsafe for children. I listened with great interest to what the noble Lord, Lord Russell, said about the 5Rights research which he unpacked. We tend to think that media platforms such as Reddit are relatively harmless but that is clearly not the case. It is very interesting that the use of avatars is becoming quite common in the advertising industry to track where advertisements are ending up—sometimes, on pornography sites. It is really heartening that an organisation such as 5Rights has been doing that and coming up with its conclusions. It is extremely useful for us as policymakers to see the kinds of risks that our children are running.

We were reminded about the origins—way back, it now seems—of the Carnegie duty of care. In a sense, we are trying to make sure that that duty of care covers the systems. We have talked about the functionality and harms in terms of risk assessment, about the child safety duties and about the codes of practice. All those need to be included within this discussion and this framework today to make sure that that duty of care really sticks.

I am not going to go through all the amendments. I support all of them: ensuring functionalities for both types of regulated service, and the duty to consider all harms and not just harmful content. It is absolutely not just about the content but making sure that regulated services have a duty to mitigate the impact of harm in general, not just harms stemming from content.

The noble Baroness, Lady Harding, made a terrific case, which I absolutely support, for making sure that the codes of practice are binding and principle based. At the end of the day, that could be the most important amendment in this group. I must admit that I was quite taken with her description of the Government’s response, which was internally contradictory. It was a very weak response to what I, as a member of the Joint Committee, thought was a very strong and clear recommendation about minimum standards.

This is a really important group of amendments and it would not be a difficult concession for the Government to make. They may wish to phrase things in a different way but we must get to the business case and the operation of the algorithms; otherwise, I do not believe this Bill is going to be effective.

I very much take on board what the noble Viscount said about looking to the future. We do not know very much about some of these new generative AI systems. We certainly do not know a great deal about how algorithms within social media companies operate. We will come, no doubt, to later amendments on the ability of researchers and others to find out more, but transparency was one of the things our Joint Committee was extremely keen on, and this is a start.

Lord Knight of Weymouth (Lab)

My Lords, I too agree that this has been a really useful and interesting debate. It has featured many birthday greetings to the noble Baroness, Lady Kidron, in which I obviously join. The noble Lord, Lord Moylan, bounced into the debate, tested the elasticity of the focus of the group, and bounced out again. Like the noble Lord, Lord Clement-Jones, I was particularly struck by the speech from the noble Baroness, Lady Harding, on the non-mandatory nature of the codes. Her points about reducing Ofcom’s workload, and mandatory codes having precedent, were really significant and I look forward to the Minister’s response.

If I have understood it correctly, the codes will be generated by Ofcom, and the Secretary of State will then table them as statutory instruments—so they will be statutory, non-mandatory codes, but with statutory penalties. Trying to unravel that in my mind was a bit of a thing as I was sitting there. Undoubtedly, we are all looking forward to the Minister’s definition of harm, which he promised us at the previous meeting of the Committee.

I applaud the noble Lord, Lord Russell, for the excellent way in which he set out the issues in this grouping and—along with the Public Bill Office—for managing to table these important amendments. Due to the Bill’s complexity, it is an achievement to get the relatively simple issue of safety by design for children into amendments to Clause 10 on children’s risk assessment duties for user-to-user services; Clause 11 on the safety duties protecting children; and the reference to risk assessments in Clause 19 on record-keeping. There is a similar set of amendments applying to search; to the duties in Clause 36 on codes of practice duties; to Schedule 4 on the content of codes of practice; and to Clause 39 on the Secretary of State’s powers of direction. You can see how complicated the Bill is for those of us attempting to amend it.

What the noble Lord and his amendments try to do is simple enough. I listened carefully to the noble Baroness, Lady Fox, as always. The starting point is, when designing, to seek to eliminate harm. That is not to say that they will eliminate all potential harms to children, but the point of design is to seek to eliminate harms if you possibly can. It is important to be clear about that. Of course, it is not just the content but the systems that we have been talking about, and ensuring that the codes of practice that we are going to such lengths to legislate for are stuck to—that is the point made by the noble Baroness, Lady Harding—relieving Ofcom of the duty to assess all the alternative methods. We certainly support the noble Lord, Lord Russell, in his amendments. They reinforce that it is not just about the content; the algorithmic dissemination, in terms of volume and context, is really important, especially as algorithms are dynamic—they are constantly changing in response to the business models that underpin the user-to-user services that we are debating.

The business models want to motivate people to be engaged, regardless of safety in many ways. We have had discussion of the analogy on cars and planes from the noble Lord, Lord Allan. As I recall, in essence he said that in this space there are some things that you want to regulate like planes, to ensure that there are no accidents, and some where you trade off freedom and safety, as we do with the regulation of cars. In this case, it is a bit more like regulating for self-driving cars; in that context, you will design a lot more around trying to anticipate all the things that humans when driving will know instinctively, because they are more ethical individuals than you could ever programme an AI to be when driving a car. I offer that slight adjustment, and I hope that it helps the noble Lord, Lord Moylan, when he is thinking about trains, planes and automobiles.

In respect of the problem of the business models and their engagement over safety, I had contact this weekend and last week from friends much younger than I am, who are users of Snap. I am told that there is an AI chatbot on Snap, which I am sure is about engaging people for longer and collecting more data so that you can engage them even longer and, potentially, collect data to drive advertising. But you can pay to get rid of that chatbot, which is the business model moving somewhere else as and when we make it harder for it to make money as it is. Snap previously had location sharing, which you had to turn off. It created various harms and risks for children, in that their location was being shared with other people without their necessarily authorising it. We can all see how that could create issues.

Lord Bethell (Con)

Does the noble Lord have any reflections, talking about Snap, as to how the internet has changed in our time? It was once really for adults, when it was on a PC and it was only adults who had access to it. There has, of course, been a huge explosion in child access to the internet because of the mobile phone—as we have heard, two-thirds of 10-year-olds now have a mobile phone—and an app such as Snap now has a completely different audience from the one it had five or 10 years ago. Does the noble Lord have any reflections on what the consequences of the explosion of children’s access to applications such as Snap have been for those thinking about the harms and protection of children?

17:30
Lord Knight of Weymouth (Lab)

I am grateful to the noble Lord. In many ways, I am reminded of the article I read in the New York Times this weekend and the interview with Geoffrey Hinton, the now former chief scientist at Google. He said that as companies improve their AI systems, they become increasingly dangerous. He said of AI technology:

“Look at how it was five years ago and how it is now. Take the difference and propagate it forwards. That’s scary”.


Yes, the huge success of the iPhone, of mobile phones and all of us, as parents, handing our more redundant iPhones on to our children, has meant that children have huge access. We have heard the stats in Committee around the numbers who are still in primary school and on social media, despite the terms and conditions of those platforms. That is precisely why we are here, trying to get things designed to be safe as far as is possible from the off, but recognising that it is dynamic and that we therefore need a regulator to keep an eye on the dynamic nature of these algorithms as they evolve, ensuring that they are safe by design as they are being engineered.

My noble friend Lord Stevenson has tabled Amendment 27, which looks at targeted advertising, especially that which requires data collection and profiling of children. In that, he has been grateful to Global Action Plan for its advice. While advertising is broadly out of scope of the Bill, apart from in respect of fraud, it is significant for the Minister to reflect on the user experience for children. Whether it is paid or organic content, it is pertinent in terms of their safety as children and something we should all be mindful of. I say to the noble Lord, Lord Vaizey, that as I understand it, the age-appropriate design code does a fair amount in respect of the data privacy of children, but this is much more about preventing children encountering the advertising in the first place, aside from the data protections that apply in the age-appropriate design code. But the authority is about to correct me.

Baroness Kidron (CB)

Just to add to what the noble Lord has said, it is worth noting that we had a debate, on Amendment 92, about aligning the age-appropriate design code likely to be accessed and the very important issue that the noble Lord, Lord Vaizey, raised about alignment of these two regimes. I think we can say that these are kissing cousins, in that they take a by-design approach. The noble Lord is completely right that the scope of the Bill is much broader than data protection only, but they take the same approach.

Lord Knight of Weymouth (Lab)

I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.

Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. I understand, and he can confirm whether my understanding is correct, that the duties on the platforms to be safe apply regardless of whether a VPN has been used to access the systems and the content. The platforms, the publishers of content that are user-to-user businesses, will have to detect whether a VPN is being used, one would suppose, in order to ensure that children are being protected and that that is genuinely a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to be able to detect whether someone is landing on their site via a VPN or otherwise? To my mind, the anecdote that the noble Baroness, Lady Harding, related, about what the App Store algorithm on Apple had done in pushing VPNs when looking for porn, reinforces the need for app stores to come into scope, so that we can get some of that age filtering at that distribution point, rather than just relying on the platforms.

Substantially, this group is about platforms anticipating harms, not reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make the business models work, rather than the other way around, we will have a much better place for children.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I join in the chorus of good wishes to the bungee-jumping birthday Baroness, Lady Kidron. I know she will not have thought twice about joining us today in Committee for scrutiny of the Bill, which is testament to her dedication to the cause of the Bill and, more broadly, to protecting children online. The noble Lord, Lord Clement-Jones, is right to note that we have already had a few birthdays along the way; I hope that we get only one birthday each before the Bill is finished.

Lord Clement-Jones (LD)

My birthday is in October, so I hope not.

Lord Parkinson of Whitley Bay (Con)

Very good—only one each, and hopefully fewer. I thank noble Lords for the points they raised in the debate on these amendments. I understand the concerns raised about how the design and operation of services can contribute to risk and harm online.

The noble Lord, Lord Russell, was right, when opening this debate, that companies are very successful indeed at devising and designing products and services that people want to use repeatedly, and I hope to reassure all noble Lords that the illegal and child safety duties in the Bill extend to how regulated services design and operate their services. Providers with services that are likely to be accessed by children will need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service. It also includes reviewing children’s use of higher-risk features, such as live streaming or private messaging. Service providers are also specifically required to consider the design of functionalities, algorithms and other features when delivering the child safety duties imposed by the Bill.

I turn first to Amendments 23 and 76 in the name of the noble Lord, Lord Russell. These would require providers to eliminate the risk of harm to children identified in the service’s most recent children’s risk assessment, in addition to mitigating and managing those risks. The Bill will deliver robust and effective protections for children, but requiring providers to eliminate the risk of harm to children would place an unworkable duty on providers. As the noble Baroness, Lady Fox, my noble friend Lord Moylan and others have noted, it is not possible to eliminate all risk of harm to children online, just as it is not possible entirely to eliminate risk from, say, car travel, bungee jumping or playing sports. Such a duty could lead to service providers taking disproportionate measures to comply; for instance, as noble Lords raised, restricting children’s access to content that is entirely appropriate for them to see.

Lord Knight of Weymouth (Lab)

Does the Minister accept that that is not exactly what we were saying? We were not saying that they would have to eliminate all risk: they would have to design to eliminate risks, but we accept that other risks will apply.

Lord Parkinson of Whitley Bay (Con)

It is part of the philosophical rumination that we have had, but the point here is that elimination is not possible through design or through any drafting of legislation. I will come on to talk a bit more about how we seek to minimise, mitigate and manage risk, which is the focus.

Amendments 24, 31, 32, 77, 84, 85 and 295, from the noble Lord, Lord Russell, seek to ensure that providers do not focus just on content when fulfilling their duties to mitigate the impact of harm to children. The Bill already delivers on those objectives. As the noble Baroness, Lady Kidron, noted, it defines “content” very broadly in Clause 207 as

“anything communicated by means of an internet service”.

Under this definition, in essence, all communication and activity is facilitated by content.

Lord Clement-Jones (LD)

I hope that the Minister has in his brief a response to the noble Baroness’s point about Clause 11(14), which, I must admit, comes across extraordinarily in this context. She quoted it, saying:

“The duties set out … are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.


Is not that exception absolutely at the core of what we are talking about today? It is surely therefore very difficult for the Minister to say that this applies in a very broad way, rather than purely to content.

Lord Parkinson of Whitley Bay (Con)

I will come on to talk a bit about dissemination as well. If the noble Lord will allow me, he can intervene later on if I have not done that to his satisfaction.

I was about to talk about the child safety duties in Clause 11(5), which also specifies that they apply to the way that a service is designed, how it operates and how it is used, as well as to the content facilitated by it. The definition of content makes it clear that providers are responsible for mitigating harm in relation to all communications and activity on their service. Removing the reference to content would make service providers responsible for all risk of harm to children arising from the general operation of their service. That could, for instance, bring into scope external advertising campaigns, carried out by the service to promote its website, which could cause harm. This and other elements of a service’s operations are already regulated by other legislation.

Baroness Kidron (CB)

I apologise for interrupting. Is that the case, and could that not be dealt with by defining harm in the way that it is intended, rather than as harm from any source whatever? It feels like a big leap that, if you take out “content”, instead of it meaning the scope of the service in its functionality and content and all the things that we have talked about for the last hour and a half, the suggestion is that it is unworkable because harm suddenly means everything. I am not sure that that is the case. Even if it is, one could find a definition of harm that would make it not the case.

Lord Parkinson of Whitley Bay (Con)

Taking it out in the way that the amendment suggests throws up that risk. I am sure that it is not the intention of the noble Lord or the noble Baroness in putting it, but that is a risk of the drafting, which requires some further thought.

Clause 11(2), which is the focus of Amendments 32, 85 and 295, already means that platforms have to take robust action against content which is harmful because of the manner of its dissemination. However, it would not be feasible for providers to fulfil their duties in relation to content which is harmful only by the manner of its dissemination. This covers content which may not meet the definition of content which is harmful to children in isolation but may be harmful when targeted at children in a particular way. One example could be content discussing a mental health condition such as depression, where recommendations are made repeatedly or in an amplified manner through the use of algorithms. The nature of that content per se may not be inherently harmful to every child who encounters it, but, when aggregated, it may become harmful to a child who is sent it many times over. That, of course, must be addressed, and is covered by the Bill.

17:45
The Bill requires providers to specifically consider as part of their risk assessments how algorithms could affect children’s exposure to illegal content and content which is harmful to children on their service. Service providers will need specifically to consider the harm from content that arises from the manner of dissemination —for example, content repeatedly sent to someone by a person or persons, which is covered in Clause 205(3)(c). Providers will also need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet their illegal content and child safety duties. Ofcom will have a range of powers at its disposal to help it assess whether providers are fulfilling their duties. That includes the power to require information from providers about the operation of their algorithms.
Lord Clement-Jones (LD)

Can the Minister assure us that he will take another look at this between Committee and Report? He has almost made the case for this wording to be taken out—he said that it is already covered by a whole number of different clauses in the Bill—but it is still here. There is still an exception which, if the Minister is correct, is highly misleading: it means that you have to go searching all over the Bill to find a way of attacking the algorithm, essentially, and the way that it amplifies, disseminates and so on. That is what we are trying to get to: how to address the very important issue not just of content but of the way that the algorithm operates in social media. This seems to be highly misleading, in the light of what the Minister said.

Lord Parkinson of Whitley Bay (Con)

I do not think so, but I will certainly look at it again, and I am very happy to speak to the noble Lord as I do. My point is that it would not be workable or proportionate for a provider to prevent or protect all children from encountering every single instance of the sort of content that I have just outlined, which would be the effect of these amendments. I will happily discuss that with the noble Lord and others between now and Report.

Amendment 27, by the noble Lord, Lord Stevenson, seeks to add a duty to prevent children encountering targeted paid-for advertising. As he knows, the Bill has been designed to tackle harm facilitated through user-generated content. Some advertising, including paid-for posts by influencers, will therefore fall under the scope of the Bill. Companies will need to ensure that systems for targeting such advertising content to children, such as the use of algorithms, protect them from harmful material. Fully addressing the challenges of paid-for advertising is a wider task than is possible through the Bill alone. The Bill is designed to reduce harm on services which host user-generated content, whereas online advertising poses a different set of problems, with different actors. The Government are taking forward work in this area through the online advertising programme, which will consider the full range of actors and sector-appropriate solutions to those problems.

Lord Stevenson of Balmacara (Lab)

I understand the Minister’s response, and I accept that there is a parallel stream of work that may well address this. However, we have been waiting for the report from the group that has been looking at that for some time. Rumours—which I never listen to—say that it has been ready for some time. Can the Minister give us a timescale?

Lord Parkinson of Whitley Bay (Con)

I cannot give a firm timescale today but I will seek what further information I can provide in writing. I have not seen it yet, but I know that the work continues.

Amendments 28 and 82, in the name of the noble Lord, Lord Russell, seek to remove the size and capacity of a service provider as a relevant factor when determining what is proportionate for services in meeting their child safety duties. This provision is important to ensure that the requirements in the child safety duties are appropriately tailored to the size of the provider. The Bill regulates a large number of service providers, which range from some of the biggest companies in the world to small voluntary organisations. This provision recognises that what it is proportionate to require of providers at either end of that scale will be different.

Removing this provision would risk setting a lowest common denominator. For instance, a large multinational company could argue that it is required only to take the same steps to comply as a smaller provider.

Amendment 32A from the noble Lord, Lord Knight of Weymouth, would require services to have regard to the potential use of virtual private networks and similar tools to circumvent age-restriction measures. He raised the use of VPNs earlier in this Committee when we considered privacy and encryption. As outlined then, service providers are already required to think about how safety measures could be circumvented and take steps to prevent that. This is set out clearly in the children’s risk assessment and safety duties. Under the duty at Clause 10(6)(f), all services must consider the different ways in which the service is used and the impact of such use on the level of risk. The use of VPNs is one factor that could affect risk levels. Service providers must ensure that they are effectively mitigating and managing risks that they identify, as set out in Clause 11(2). The noble Lord is correct in his interpretation of the Bill vis-à-vis VPNs.

Lord Knight of Weymouth (Lab)

Is this technically possible?

Lord Parkinson of Whitley Bay (Con)

Technical possibility is a matter for the sector—

Lord Knight of Weymouth (Lab)

I am grateful to the noble Lord for engaging in dialogue while I am in a sedentary position, but I had better stand up. It is relevant to this Committee whether it is technically possible for providers to fulfil the duties we are setting out for them in statute in respect of people’s ability to use workarounds and evade the regulatory system. At some point, could he give us the department’s view on whether there are currently systems that could be used—we would not expect them to be prescribed—by platforms to fulfil the duties if people are using their services via a VPN?

Lord Parkinson of Whitley Bay (Con)

This is the trouble with looking at legislation that is technologically neutral and future-proofed and has to envisage risks and solutions changing in years to come. We want to impose duties that can technically be met, of course, but this is primarily a point for companies in the sector. We are happy to engage and provide further information, but it is inherently part of the challenge of identifying evolving risks.

The provision in Clause 11(16) addresses the noble Lord’s concerns about the use of VPNs in circumventing age-assurance or age-verification measures. For it to apply, providers would need to ensure that the measures they put in place are effective and that children cannot normally access their services. They would need to consider things such as how the use of VPNs affects the efficacy of age-assurance and age-verification measures. If children were routinely using VPNs to access their service, they would not be able to conclude that Clause 11(16) applies. I hope that sets out how this is covered in the Bill.

Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A, 122, 122ZA, 122ZB and 122ZC from the noble Lord, Lord Russell of Liverpool, seek to make the measures Ofcom sets out in codes of practice mandatory for all services. I should make it clear at the outset that companies must comply with the duties in the Bill. They are not optional and it is not a non-statutory regime; the duties are robust and binding. It is important that the binding legal duties on companies are decided by Parliament and set out in legislation, rather than delegated to a regulator.

Codes of practice provide clarity on how to comply with statutory duties, but should not supersede or replace them. This is true of codes in other areas, including the age-appropriate design code, which is not directly enforceable. Following up on the point from my noble friend Lady Harding of Winscombe, neither the age-appropriate design code nor the SEND code is directly enforceable. The Information Commissioner’s Office or bodies listed in the Children and Families Act must take the respective codes into account when considering whether a service has complied with its obligations as set out in law.

As with these codes, what will be directly enforceable in this Bill are the statutory duties by which all sites in scope of the legislation will need to abide. We have made it clear in the Bill that compliance with the codes will be taken as compliance with the duties. This will help small companies in particular. We must also recognise the diversity and innovative nature of this sector. Requiring compliance with prescriptive steps rather than outcomes may mean that companies do not use the most effective or efficient methods to protect children.

I reassure noble Lords that, if companies decide to take a different route to compliance, they will be required to document what their own measures are and how they amount to compliance. This will ensure that Ofcom has oversight of how companies comply with their duties. If the alternative steps that providers have taken are insufficient, they could face enforcement action. We expect Ofcom to take a particularly robust approach to companies which fail to protect their child users.

My noble friend Lord Vaizey touched on the age-appropriate design code in his remarks—

Baroness Harding of Winscombe (Con)

My noble friend the Minister did not address the concern I set out that the Bill’s approach will overburden Ofcom. If Ofcom has to review the suitability of each set of alternative measures, we will create an even bigger monster than we first thought.

Lord Parkinson of Whitley Bay (Con)

I do not think that it will. We have provided further resource for Ofcom to take on the work that this Bill will give it; it has been very happy to engage with noble Lords to talk through how it intends to go about that work and, I am sure, would be happy to follow up on that point with my noble friend to offer her some reassurance.

Responding to the point from my noble friend Lord Vaizey, the Bill is part of the UK’s overall digital regulatory landscape, which will deliver protections for children alongside the data protection requirements for children set out in the Information Commissioner’s age-appropriate design code. Ofcom has strong existing relationships with other bodies in the regulatory sphere, including through the Digital Regulation Co-operation Forum. The Information Commissioner has been added to this Bill as a statutory consultee for Ofcom’s draft codes of practice and relevant pieces of guidance formally to provide for the ICO’s input into its areas of expertise, especially relating to privacy.

Amendment 138 from the noble Lord, Lord Russell of Liverpool, would amend the criteria for non-designated content which is harmful to children to bring into scope content whose risk of harm derives from its potential financial impact. The Bill already requires platforms to take measures to protect all users, including children, from financial crime online. All companies in scope of the Bill will need to design and operate their services to reduce the risk of users encountering content amounting to a fraud offence, as set out in the list of priority offences in Schedule 7. This amendment would expand the scope of the Bill to include broader commercial harms. These are dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This amendment therefore risks creating regulatory overlap, which would cause confusion for business while not providing additional protections to consumers and internet users.

Amendment 261 in the name of the right reverend Prelate the Bishop of Oxford seeks to modify the existing requirements for the Secretary of State’s review into the effectiveness of the regulatory framework. The purpose of the amendment is to ensure that all aspects of a regulated service are taken into account when considering the risk of harm to users and not just content.

As we have discussed already, the Bill defines “content” very broadly and companies must look at every aspect of how their service facilitates harm associated with the spread of content. Furthermore, the review clause makes explicit reference to the systems and processes which regulated services use, so the review can already cover harm associated with, for example, the design of services.

18:00
Amendments 291, 292, and 293 seek to ensure that companies’ child safety duties apply to a broader range of functionalities which can facilitate harm online. The current list of functionalities in the Bill is not exhaustive. Services will therefore need to assess the risk from any feature or functionality of their service which enables user interaction and could cause harm to users.
The points raised in these amendments are covered already in the Bill in the places I have set out. I will consult the official record of this debate to see whether there are any areas which I have not followed up, but I invite noble Lords not to press their amendments in this group.
Lord Russell of Liverpool (CB)

My Lords, I thank the Minister for his response. I think the entire Chamber will be thankful that I do not intend to respond in any great detail to almost one hour and three-quarters of debate on this series of amendments—I will just make a few points and suggestions.

The point that the noble Baroness made at the beginning about understanding the design and architecture of the systems and processes is fundamental, both for understanding why they are causing the sorts of harm that they are at the moment and for trying to ensure that they are designed better in future than they have been to date. Clearly, they are seriously remiss in the harms that they are inflicting on a generation of young people.

On the point made by the noble Baroness, Lady Harding, about trying to make Ofcom’s job easier—I can see the noble Lord, Lord Grade, in the corner—I would hope and anticipate that anything we could suggest that would lead the Government to make Ofcom’s job slightly easier and clearer would be very welcome. The noble Lord appears to be making an affirmatory gesture, so I will take that as a yes.

I say to the noble Lord, Lord Moylan, that I fully understand the importance of waving the flag of liberty and free speech, and I acknowledge its importance. I also acknowledge the always-incipient danger of unintentionally preventing things from happening that can and should happen when you are trying to make things safer and prevent harm. Trying to get the right balance is extraordinarily difficult, but I applaud the noble Lord for standing up and saying what he said. If one were to judge the balance of the contributions here as a very rough opinion poll, the noble Lord might find himself in the minority, but that does not necessarily mean that he is wrong, so I would encourage him to keep contributing.

I sympathise with the noble Baroness, Lady Fox, in trying to find the right balance; it is something that we are all struggling to do. One of the great privileges we have in this House is that we have the time to do it in a manner which is actively discouraged in the other place. Even if we go on a bit, we are talking about matters which are very important—in particular, the pre-legislative scrutiny committee was able to cover them in greater detail than the House of Commons was able to do.

The noble Lord, Lord Clement-Jones, was right. In the same way as they say, “Follow the money”, in this case it is “follow the algorithms”, because it is the algorithms which drive the business model.

On the points made by the noble Lord, Lord Knight, regarding the New York Times article about Geoffrey Hinton, one of the architects of AI in Google, I would recommend that all your Lordships read it to see somebody who has been at the forefront of developing artificial intelligence. Rather like a character in a Jules Verne novel suddenly being slightly aghast at what they have created—Frankenstein comes to mind—it makes one pause for thought. Even as we are talking about these things, AI is racing ahead like a greyhound in pursuit of a very fast rabbit, and there is no way that we will be able to catch up.

I thank the Minister for his reply, but, as when we debated some of the amendments last week and the noble Baroness, Lady Harding, described the train journey on which she tried to interrogate and interpret the different parts of the Bill, following the trail and trying to understand what was going on until she became so involved that she missed her station, there is a real point here: this Bill is very complex to follow and understand. Indeed, the way in which the Minister had to point to all the different points of the compass—so to speak—both within the Bill and without it in many of the answers that he gave to some of the amendments indicates to me that the Bill team is finding it challenging to respond to some of them. It is like filling in one of those diagrams where you join the dots, and you cannot quite see what it is until you have nearly finished. I find it slightly disturbing if the Bill team and some of the officials appear to be having a challenging time in trying to interpret, understand and explain some of the points we are raising; I would hope and expect that that could be done much more simply.

One of the pleas from all of us in a whole variety of these amendments is to get the balance right between legislating for what we want to legislate and making it simple enough to be understandable. At the moment, a criticism of this Bill is that it is extraordinarily difficult to understand in many parts. I will not go through all the points, but there are some germane areas where it would be extremely helpful to pursue with the Minister and the Bill team some of the points we are trying to make. Many of them are raised by a variety of outside bodies which know infinitely more about it than I do, and which have genuine concerns. We have the time between Committee and Report to put some of those to bed or at least to understand them better than we do at the moment. We will probably be happy and satisfied with some of the responses that we receive from the department once we feel that we understand them and, perhaps more importantly, once we feel that the department and the Bill team themselves fully understand them. It is fair to say that at the moment we are not completely comfortable that they do. I do not blame the Minister for that. If I were in his shoes, I would be on a very long holiday and I would not be returning any time soon. However, we will request meetings—one meeting would be too much, so we will try to put this into bite-sized units and then dig into the detail in a manageable way, without taking too much time, to make sure that we understand each other.

With that, I beg leave to withdraw the amendment.

Amendment 23 withdrawn.
Amendment 24 not moved.
Amendment 25
Moved by
25: Clause 11, page 10, line 13, at end insert—
“(c) uphold children’s rights per the United Kingdom’s obligations as a signatory of the United Nations Convention on the Rights of the Child (UNCRC), with reference to General Comment No. 25 (2021) from the Committee on the Rights of the Child on children’s rights in relation to the digital environment.”
Member’s explanatory statement
This amendment would mean regulated services would have to have regard for the UN Convention on the Rights of the Child to ensure children are treated according to their evolving capacities, in their best interests, in consideration of their wellbeing and are not locked out of spaces that they have a right to participate in and to access.
Lord Russell of Liverpool (CB)

My Lords, I am sorry that it is me again—a bit like a worn 78. In moving Amendment 25, I will speak also to Amendments 78, 187 and 196, all of which speak to the principle of children’s rights as set out in the UN Convention on the Rights of the Child and, more specifically, how those rights are applied to the digital world as covered in the United Nations’ general comment No. 25, which was produced in 2021 and ratified by the UK Government. What we are suggesting and asking for is that the principles in this general comment are reflected in the Bill. I thank the noble Baronesses, Lady Harding, Lady Kennedy and Lady Bennett, and the noble Lord, Lord Alton—who is not with us—for adding their names to these amendments and for their support.

The general comment No. 25 that I mentioned recognises that children’s rights are applicable in the digital world as well as the real world. These amendments try to establish in the Bill the rights of children. Believe it or not, in this rather lengthy Bill there is not a single reference—as far as we can discern—specifically to children’s rights. There are a lot of other words, but that specific phrase is not used, amazingly enough. These amendments are an attempt to get children’s rights specifically into the Bill. Amendments 30 and 105 in the names of the noble Lords, Lord Clement-Jones and Lord Knight, also seek to preserve the well-being of children. Our aims are very similar, but we will try to argue that the convention would achieve them in a particularly effective and concise way.

The online world is not optional for children, given what we know—not least from some of the detailed and harrowing experiences related by various of your Lordships in the course of the Bill. The fact that the online world is not optional for children may be worrying to some adults. We have all heard about parents, grandparents and others who have direct experience of their beloved coming to harm. By contrast, it is also fascinating to note how many senior executives, and indeed founders, of digital companies forbid their own children from possessing and using mobile phones, typically until they are 12 or 14. That is telling us something. If they themselves do not allow their children to have access to some of the online world we are talking about so much, that should give us pause for reflection.

Despite the many harms online, there is undoubted good that all children can benefit from, including in terms of their cognitive and skills development, social development and relationships. There are some brilliant things which come from being online. It is also beneficial because having age-appropriate experiences when they are online is part of their fundamental rights. That, essentially, is what these amendments are about.

Throughout the many years that the Bill has been in gestation, we have heard a lot about freedom of speech and how it must be preserved. Indeed, in contrast to children’s rights not being mentioned once in the Bill, “freedom of expression” appears no less than 49 times. I venture to suggest to your Lordships that there is a degree of imbalance there which should cause us to pause and reflect on whether we have that balance quite right.

I will not go into detail, but the UNCRC is the most widely ratified human rights treaty in history, and it is legally binding on the states which are party to it. The UK is a signatory to this convention, yet if we do not get this right in the Bill, we are in danger of falling behind some of our global counterparts. Although I recognise that saying the name of this organisation may bring some members of the governing party out in a rather painful rash, the EU is incorporating the UNCRC into its forthcoming AI Act. Sweden has already incorporated it into law at a different level, and Canada, New Zealand and South Africa are all doing the same. It is not anything to be worried about. Even Wales incorporated it into its domestic law in 2004, and Scotland did so in 2021. This appears to be something that the English have a particular problem with.

18:15
These amendments would ensure that, very importantly, those reading the Bill absolutely know that they must give due consideration to children’s rights. It would not be optional. Amendments 25 and 78 would require services to uphold children’s rights when implementing safety measures. Amendment 187 would reflect children’s rights in Ofcom’s duties, and Amendment 196 would ensure that Ofcom takes into consideration children’s rights when it is making its assessments of risks.
In particular, we have tabled these amendments because one of the possible unintended consequences of the well-meaning and serious attempts by all of us to protect children better is that some of these companies and platforms may decide that having children access some of their services is too much bother. They may decide that it would be simpler to find means to exclude them completely because it would be too much trouble, money or regulatory hassle to try to build a platform or service which they know children will access, as that will impose a serious obligation on them for which they can be held legally accountable. That would be an unintended consequence. We do not want children locked out of services which are essential to their development, education and self-expression. That said, I have probably said enough. I beg to move.
Lord Weir of Ballyholme (DUP)

My Lords, I rise on this group of amendments, particularly with reference to Amendments 25, 78, 187 and 196, to inject a slight note of caution—I hope in a constructive manner—and to suggest that it would be the wrong step to try to incorporate them into this legislation. I say at the outset that I think the intention behind these amendments is perfectly correct; I do not query the intention of the noble Lord, Lord Russell, and others. Indeed, one thing that has struck me as we have discussed the Bill is the commonality of approach across the Chamber. There is a strong common desire to provide a level of protection for children’s rights, but I question whether these amendments are the right vehicle by which to do that.

It is undoubtedly the case that the spirit of the UNCRC is very strongly reflected within the Bill, and I think it moves in a complementary fashion to the Bill. Therefore, again, I do not query the UNCRC in particular. It can act as a very strong guide to government as to the route it needs to take, and I think it has had a level of influence on the Bill. I speak not simply as someone observing the Bill but as someone who, in a previous existence, served as an Education Minister in Northern Ireland and had direct responsibility for children’s rights. The guidance we received from the UNCRC was, at times, very useful to Ministers, so I do not question any of that.

For three reasons, I express a level of concern about these amendments. I mentioned that the purpose of the UNCRC is to act as a guide—a yardstick—for government as to what should be there in terms of domestic protections. That is its intention. The UNCRC itself was never written as a piece of legislation, and I do not think it was the original intention to have it directly incorporated and implemented as part of law. The UNCRC is aspirational in nature, which is very worth while. However, it is not written in a legislative form. At times, it can be a little vague, particularly if we are looking at the roles that companies will play. At times, it sets out very important principles, but ones which, if left for interpretation by the companies themselves, could create a level of tension.

To give an example, there is within the UNCRC a right to information and a right to privacy. That can sometimes create a tension for companies. If we are to take the purpose of the UNCRC, it is to provide that level of guidance to government, to ensure that it gets it right rather than trying to graft UNCRC directly on to domestic law.

Secondly, the effect of these amendments would be to shift the interpretation and implementation of what is required of companies from government to the companies themselves. They would be left to try to determine this, whereas I think that the UNCRC is principally a device that tries to make government accountable for children’s rights. As such, it is appropriate that government has the level of responsibility to draft the regulations, in conjunction with key experts within the field, and to try to ensure that what we have in these regulations is fit for purpose and bespoke to the kind of regulations that we want to see.

To give a very good example, there are different commissioners across the United Kingdom. One of the key groups that the Government should clearly be consulting with to make sure they get it right is the Children’s Commissioners of the different jurisdictions in the United Kingdom. Through that process, but with that level of ownership still lying with government and Ofcom, we can create regulations that provide the level of protection for our children that we all desire to see; whereas, if the onus is effectively shifted on to companies simply to comply with what is a slightly vague, aspirational purpose in these regulations, that is going to lead to difficulties as regards interpretation and application.

Thirdly, there is a reference to having due regard to what is in the UNCRC. From my experience, both within government and even seeing the way in which government departments do that—and I appreciate that “due regard” has case law behind it—even different government departments have tended to interpret that differently and in different pieces of legislation. At one extreme, on some occasions that effectively means that lip service has been paid to that by government departments and, in effect, it has been largely ignored. Others have seen it as a very rigorous duty. If we see that level of disparity between government departments within the same Government, and if this is to be interpreted as a direct instruction to and requirement of companies of varying sizes—and perhaps with various attitudes and feelings of responsibility on this subject—that creates a level of difficulty in and of itself.

My final concern in relation to this has been mentioned in a number of debates on various groups of amendments. Where a lot of Peers would see either a weakness in the legislation or something else that needs to be improved, we need to have as much consistency and clarity as possible in both interpretation and implementation. As such, the more we move away from direct regulations, which could then be put in place, to relying on the companies themselves interpreting and implementing, perhaps in different fashions, with many being challenged by the courts at times, the more we create a level of uncertainty and confusion, both for the companies themselves and for users, particularly the children we are looking to protect.

While I have a lot of sympathy for the intention of the noble Lord, Lord Russell, and while we need to find a way to incorporate into the Bill in some form how we can drive children’s rights more centrally within this, the formulation of the direct grafting of the UNCRC on to this legislation, even through due regard, is the wrong vehicle for doing it. It is inappropriate. As such, it is important that we take time to try to find a better vehicle for the sort of intention that the noble Lord, Lord Russell, and others are putting forward. Therefore, I urge the noble Lord not to press his amendments. If he does, I believe that the Committee should oppose the amendments as drafted. Let us see if, collectively, we can find a better and more appropriate way to achieve what we all desire: to try to provide the maximum protection in a very changing world for our children as regards online safety.

Baroness Harding of Winscombe (Con)

My Lords, I support these amendments. We are in the process of having a very important debate, both in the previous group and in this one. I came to this really important subject of online safety 13 years ago, because I was the chief executive of a telecoms company. Just to remind noble Lords, 13 years ago neither Snap, TikTok nor Instagram—the three biggest platforms that children use today—existed, and telecoms companies were viewed as the bad guys in this space. I arrived, new to the telecoms sector, facing huge pressure—along with all of us running telecoms companies—from Governments to block content.

I often felt that the debate 13 years ago too quickly turned into what was bad about the internet. I was spending the vast majority of my working day trying to encourage families to buy broadband and to access this thing that you could see was creating huge value in people’s lives, both personal and professional. Sitting on these Benches, I fundamentally want to see a society with the minimum amount of regulation, so I was concerned that regulating internet safety would constrain innovation; I wanted to believe that self-regulation would work. In fact, I spent many hours in workshops with the noble Baroness, Lady Kidron, and many others in this Chamber, as we tried to persuade and encourage the tech giants—as everyone started to see that it was not the telecoms companies that were the issue; it was the emerging platforms—to self-regulate. It is absolutely clear that that has failed. I say that with quite a heavy heart; it has genuinely failed, and that is why the Bill is so important: to enshrine in law some hard regulatory requirements to protect children.

That does not change the underlying concern that I and many others—and everyone in this Chamber—have, that the internet is also potentially a force for good. All technology is morally neutral: it is the human beings who make it good or bad. We want our children to genuinely have access to the digital world, so in a Bill that is enshrining hard gates for children, it is really important that it is also really clear about the rights that children have to access that technology. When you are put under enormous pressure, it is too easy—I say this as someone who faced it 13 years ago, and I was not even facing legislation—to try to do what you think your Government want to do, and then end up causing harm to the individuals you are actually trying to protect. We need this counterbalance in this Bill. It is a shame that my noble friend Lord Moylan is not in his place, because, for the first time in this Committee, I find myself agreeing with him. It is hugely important that we remember that this is also about freedom and giving children the freedom to access this amazing technology.

Some parts of the Bill are genuinely ground-breaking, where we in this country are trying to work out how to put the legal scaffolding in place to regulate the internet. Documenting children’s rights is not something where we need to start from scratch. That is why I put my name to this amendment: I think we should take a leaf from the UN Convention on the Rights of the Child. I recognise that the noble Lord, Lord Weir of Ballyholme, made some very thought-provoking comments about how we have to be careful about the ambiguity that we might be creating for companies, but I am afraid that ambiguity is there whether we like it or not. These are not just decisions for government: the tension between offering services that will brighten the lives of children and the risks those services pose to them is exactly what lies behind the decisions that technology companies take every day. As the Bill enshrines obligations on them to protect children from harms, I firmly believe it should also enshrine obligations on them to offer the beauty and the wonder of the internet, and in doing so enshrine children’s right to this technology.

18:30
Baroness Bennett of Manor Castle (GP)

My Lords, I have attached my name to Amendment 25 in the name of the noble Lord, Lord Russell, and I rise to speak primarily to that. It is a great pleasure to follow the noble Baroness, Lady Harding, and agree with every word she has just said. I will draw on two elements of my personal history that she reminded me of. As a journalist in country Australia in the early 1990s—pre-internet days—I worked the night shift, and at least once a week we would get a frantic phone call from a parent calling on behalf of a child along the lines of, “Do you know anything about dolphins?” A school project had just been discovered that needed to be done by the next morning, and the source of information that the parent thought of was, “The local newspaper—they might be able to tell us something!” I am slightly ashamed to say that we had a newspaper to get out and we very quickly told them to go away, so we were not a good source of information in that case. Most people in your Lordships’ House will remember—but most young people will have no recollection of—a time when there was little access to information outside the hours when the library was open or you could go to a bookshop. There were literally no other sources available. We have to consider this amendment in the light of that.

I also want to slightly disagree with the comments of the noble Lord, Lord Bethell, on the previous group. He suggested that it was only with the arrival of phones that the internet became primarily or significantly a children’s thing. The best I can date it is that either in 1979 or 1980 I was playing “Lemonade Stand” on one of the early Apples. This might have been considered to be a harmful game from some political perspectives, given that it very much encouraged a capitalist mindset, profit taking and indeed the Americanisation of culture—but none the less that was back in 1980, if not 1979, and children were there. If we look back over the history of the internet, we see that some of the companies started out with young people, under the age of 18 in some cases, who have been at the forefront of innovation and development of what we now think of as our social media or internet world. This is the children’s world as much as it is the adults’ world, and that is the reality.

I will pick up the points made by the noble Lord, Lord Weir of Ballyholme, who suggested that the UN Convention on the Rights of the Child was only a guide to government and not law. It is a great pity that the noble Baroness, Lady Kennedy of The Shaws, is not in her place, because she is far better equipped to deal with this angle than I am. But I will give it a go. Children’s rights are human rights. The UN Convention on the Rights of the Child is the most backed and most ratified rights convention—

Lord Weir of Ballyholme (DUP)

I appreciate what the noble Baroness is saying, but I made a slightly different point. I am suggesting not that what is there was not meant to be law but that it was not written in a form which should be simply directly put in as legislation. It was not drafted in that format on that basis, which is why a direct graft on to a domestic piece of legislation is not quite the way to do it. It is about using that as guidance as to what should be in the law, rather than simply a direct incorporation.

Baroness Bennett of Manor Castle (GP)

I thank the noble Lord for his clarification, although, speaking not as a lawyer, my understanding is that a human right is a legal right; it is a law—a most fundamental right. In addition, every country in the world has ratified this except for the United States—which is another issue. I also point out that it is particularly important that we include reference to children’s rights in this Bill, given the fact that we as a country currently treat our children very badly. There is a huge range of issues, and we should have a demonstration in this and every Bill that the rights of children are respected across all aspects of British society.

I will not get diverted into a whole range of those, but I point noble Lords to a report to the United Nations from the Equality and Human Rights Commission in February this year that highlighted a number of ways in which children’s rights are not being lived up to in the UK. The most relevant part of this letter that the EHRC sent to the UN stresses that it is crucial to preserve children’s rights to accessible information and digital connectivity. That comes from our EHRC.

I think it was the noble Lord, Lord Russell, who referred to the fact that we live in a global environment, and of course social media and the internet are very much a global world. I urge everyone who has not done so to look at a big report published by UNICEF in 2019, Global Kids Online, which, crucially, involved a huge amount of survey work, consultation and consideration by young people. Later we will get to an amendment of mine which says that we should have the direct voice of young people overseeing the implementation of the Bill. I am talking not about the NGOs that represent them but specifically about children: we need to listen to the children and young people.

The UNICEF report said that it was quite easy to defend access to information and to reputable sources, but showed that accessing entertainment activities—some of the things that perhaps some grandparents in this Chamber might have trouble with—was associated with the positive development of digital skills. Furthermore, the report says:

“When parents restrict children’s internet use”—


of course, this could also apply to the Government restricting their internet use—

“this has a negative effect on children’s information-seeking and privacy skills”.

So, if you do not give children the chance to develop the skills to navigate the internet, and they suddenly go to it at age 18 to find a whole lot of stuff out there that they have no skills to deal with, you are setting yourself up for a real problem. UNICEF therefore stresses the real need for children to have access.

Interestingly, this report—which was a global report from UNICEF—said that

“fewer than one third of children had been exposed to”

something they had found uncomfortable or upsetting in the preceding year. That is on the global scale. Perhaps that is an important balance to some of the other debates we have had in your Lordships’ House on the Bill.

Other figures from this report that I think are worth noting—this is from 2019, so these figures will undoubtedly have gone up—include the finding that

“one in three children globally is … an internet user and … one in three internet users is a child”.

We have been talking about this as though the internet is “the grown-ups’ thing”, but that is not the global reality. It was co-created, established and in some cases invented by people under the age of 18. I am afraid to say that your Lordships’ House is not particularly well equipped to deal with this, but we need to understand this as best we possibly can. I note that the report also said, looking at the sustainable development goals on quality of education, good jobs and reducing inequality, that internet access for children was crucial.

I will make one final point. I apologise; I am aware that I have been speaking for a while, but I am passionate about these issues. Children and young people have agency and the ability to act and engage in politics. In several nations on these islands, 16 and 17 year-olds have the vote. I very much hope that that will soon also be the case in England, and indeed I hope that soon children even younger than that will have the vote. I was talking about that with a great audience of year nines at the Queen’s School in Bushey on Friday with Learn with the Lords. Those children would have a great opportunity—

Lord Harlech (Con)

My Lords, we have a very full order of business to get through, so I encourage the noble Baroness to remain on topic.

Baroness Bennett of Manor Castle (GP)

I think that is on topic. If 16 and 17 year-olds are voting, they have a right to access internet information about voting. I suggest that that is on topic.

My final point—for the pleasure of the noble Lord—is that historically we have seen examples where blocks and filters have denied children and young people who identify as LGBTQI+ access to crucial information for them. That is an example of the risk if we do not allow them right of access. On the most basic children’s right of all, we have also seen examples of blocks and filters that have stopped access to breastfeeding information on the internet. Access is a crucial issue, and what could be a more obvious way to allow it than by writing in the United Nations Convention on the Rights of the Child?

Baroness Fox of Buckley (Non-Afl)

My Lords, I welcome many of these amendments. I found reading them slightly more refreshing than the more dystopian images we have had previously. It is quite exciting, actually, because the noble Baroness, Lady Harding, sounded quite upbeat, which is in contrast to previous contributions on what the online world is like.

I want to defend the noble Baroness, Lady Bennett of Manor Castle, from the intervention that suggested that she was going off topic, because the truth is that these amendments are calling for children’s rights to be introduced into legislation via this Bill. I disagree with that, but we should at least talk about it if it is in the amendments.

Whereas I like the spirit of the amendments, it seems to me that children’s rights, which I consider to have huge constitutional implications, require a proper Bill to bring them in and not to be latched on to this one. My concern is that children’s rights can be used to undermine adult authority and are regularly cited as a way of undermining parents’ rights, and that children under 18 cannot enact political rights. Whether they have agency or capacity, they are not legally able to exercise their political rights, and therefore someone has to act on their behalf as an intermediary—as a third party—which is why it can become such a difficult, politicised area.

I say that because it would be a fascinating discussion to have. I do not think this is the Bill to have it on, but the spirit of the amendments raises issues that we should bear in mind for the rest of our discussion. During lockdown, we as a society stopped young people having any social interaction at all. They were isolated, and a lot of new reports suggest that young people’s mental health has suffered because they were on their own. They went online and, in many instances, it kept them sane. That is probably true not just of young people but of the rest of us, by the way, but I am making the point that it was not all bad.

Over recent years, as we have been concerned about children’s safety and protecting them, we have discouraged them from roaming far from home. They do not go out on their bikes or run around all the time; they are told, “Come back home, you’ll be safe”. Of course, they have gone into their room and gone online, and now we say, “That’s not safe either”.

I want to acknowledge that the online world has helped young people overcome the problems of isolation and lack of community that the adult world has sometimes denied them developing. That is important: it can be a source of support and solidarity. Children need spaces to talk, engage and interact with friends, mates, colleagues and so on where they can push boundaries, and all sorts of things, without grown-ups interfering. That is what we have always understood from child development. It is why you do not have spies wandering around all the time following them.

The main thing is that we know the difference between a four year-old and a 14 year-old. In the Bill, we call a child anyone under 18, but I was glad that the amendments acknowledge that that distinction, in terms of appropriateness, is important. When young people are online, or if they are involved in encrypted messages, such as WhatsApp, that does not mean they are all planning to join county lines or are being groomed—it is not all dodgy. Appropriateness in terms of a child’s age, and not always imagining that the worst is happening, are an important counter that these amendments bring to some of the pessimism that we have heard until now.

The noble Lord, Lord Russell, said that children’s rights are not mentioned in the Bill but freedom of expression has been mentioned 49 times. First, it is not a Bill about children’s rights, but when he says that freedom of expression has been mentioned 49 times, I assure him that quantity is not quality and the mention of it means nothing.

18:45
Baroness Harding of Winscombe (Con)

I want to challenge the noble Baroness’s assertion that the Bill is not about children’s rights. Anyone who has a teenage child knows that their right to access the internet is keenly held and fought out in every household in the country.

Baroness Fox of Buckley (Non-Afl)

The quip works, but political rights are not quips. Political rights have responsibilities, and so on. If we gave children rights, they would not be dependent on adults and adult society. Therefore, it is a debate; it is a row about what our rights are. Guess what. It is a philosophical row that has been going on all around the world. I am just suggesting that this is not the place—

Baroness Bennett of Manor Castle (GP)

I am sorry, but I must point out that 16 and 17 year-olds in Scotland and Wales have the vote. That is a political right.

Baroness Fox of Buckley (Non-Afl)

And it has been highly contentious whether the right to vote gives them independence. For example, you would still be accused of child exploitation if you did anything to a person under 18 in Scotland or Wales. In fact, if you were to tap someone and it was seen as slapping in Scotland and they were 17, you would be in trouble. Anyway, it should not be in this Bill. That is my point.

Baroness Finlay of Llandaff (CB)

My Lords, perhaps I may intervene briefly, because Scotland and Wales have already been mentioned. My perception of the Bill is that we are trying to build something fit for the future, and therefore we need some broad underlying principles. I remind the Committee that the Well-being of Future Generations (Wales) Act set a tone, and that tone has run through all aspects of society, even more extensively than people imagined, in protecting the next generation. As I read them, these amendments set a similar tone, and I find it difficult to understand why anyone would object to that, given that building in future-proofing that will protect children, among others, is, as I understood it, a core principle behind the Bill.

Baroness Healy of Primrose Hill (Lab)

My Lords, I support the amendments in the name of the noble Lord, Lord Russell, to require regulated services to have regard to the UN Convention on the Rights of the Child. As we continue to attempt to strengthen the Bill by ensuring that the UK will be the safest place for children to be online, there is a danger that platforms may take the easy way out in complying with the new legislation and just block children entirely from their sites. Services must not shut children out of digital spaces altogether to avoid compliance with the child safety duties, rather than designing services with their safety in mind. Children have rights and, as the UN convention makes clear, they must be treated according to their evolving capacities and in their best interests in consideration of their well-being.

Being online is now an essential right, not an option, to access education, entertainment and friendship, but we must try to ensure that it is a safe space. As the 5Rights Foundation points out, the Bill risks infringing children’s rights online, including their rights to information and participation in the digital world, by mandating that services prevent children from encountering harmful content, rather than ensuring services are made age appropriate for children and safe by design, as we discussed earlier. As risk assessments for adults have been stripped from the Bill, this has had the unintended consequence of making a child user even more costly to serve, relative to an adult user, as services will have substantial safety duties to comply with to protect children. 5Rights Foundation warns that this will lead services to determine that it is not worth designing services with children’s safety in mind but that it could be more cost-effective to lock them out entirely.

Ofcom must have a duty to have regard for the UNCRC in its risk assessments. Amendment 196 would ensure that children’s rights are reflected in Ofcom’s assessment of risks, so that Ofcom must have regard for children’s rights in balancing their rights to be safe against their rights to access age-appropriate digital spaces. This would ensure compliance with general comment No. 25, as the noble Lord, Lord Russell, mentioned, passed in 2021, to protect children’s rights to freedom of expression and privacy. I urge the Ministers to accept these amendments to ensure that the UK will be not only the safest place for children to be online but the best place too, by respecting and protecting their rights.

Baroness Kidron (CB)

My Lords, I support all the amendments in this group, and will make two very brief points. Before I do, I believe that those who are arguing for safety by design and to put harms in the Bill are not trying to restrict the freedom of children to access the internet but to give the tech sector slightly less freedom to access children and exploit them.

My first point is a point of principle, and here I must declare an interest. It was my very great privilege to chair the international group that drafted general comment No. 25 on children’s rights in relation to the digital environment. We did so on behalf of the Committee on the Rights of the Child and, as my noble friend Lord Russell said, it was adopted formally in 2021. To that end, a great deal of work has gone into balancing the sorts of issues that have been raised in this debate. I think it would interest noble Lords to know that the process took three years, with 150 submissions, many by nation states. Over 700 children in 28 countries were consulted in workshops of at least three hours. They had a good shout and, unlike many of the other general comments, this one is littered with their actual comments. I recommend it to the Committee as a very concise and forceful gesture of what it might be to exercise children’s rights in a balancing way across all the issues that we are discussing. I cannot remember who, but somebody said that the online world is not optional for children: it is where they grow up; it is where they spend their time; it is their education; it is their friendships; it is their entertainment; it is their information. Therefore, if it is not optional, then as a signatory to the UNCRC we have a duty to respect their rights in that environment.

My second point is rather more practical. During the passage of the age-appropriate design code, of which we have heard much, the argument was made that children were covered by the amendment itself, which said they must be kept in mind and so on. I anticipate that argument being made here—that we are aligning with children’s rights, apart from the fact that they are indivisible and must be done in their entirety. In that case, the Government happily accepted that it should be explicit, and it was put in the Data Protection Act. It was one of the most important things that happened in relation to the age-appropriate design code. We might hope that, when this Bill is an Act, it will all be over—our job will be done and we can move on. However, after the Data Protection Act, the most enormous influx of lobbying happened, saying, “Please take the age down from 18 to 13”. The Government, and in that case the ICO, shrugged their shoulders and said, “We can’t; it’s on the face of the Bill”, because Article 1 of the UNCRC says that a child is anyone under the age of 18.

The evolving capacities of children are central to the UNCRC, so the concerns of the noble Baroness, Lady Fox, which I very much share, that a four year-old and a 14 year-old are not the same, are embodied in that document and in the general comment, and therefore it is useful.

These amendments are asking for that same commitment here—to children, to their rights, to their right to protection, which is at the heart of so much of what we are debating, and to their well-being. We need their participation; we need a digital world with children in it. Although I agreed very much with the noble Baroness, Lady Bennett, and her fierce defence of children’s rights, there are 1 billion children online. If two-thirds of them have not seen anything upsetting in the last year, that rather means that one-third of 1 billion children have—and that is too many.

Baroness Foster of Aghadrumsee (Non-Afl)

My Lords, I did not intend to speak in this debate but I have been inspired by it.

I was here for the encryption debate last week, which I did not speak in. One of the contributions was around unintended consequences of the legislation, and I am concerned about unintended consequences here.

I absolutely agree with the comments of the noble Baroness, Lady Bennett, around the need for children to engage on the internet. Due to a confidence and supply agreement with the then Government back in 2017, I ensured that children and adults alike in Northern Ireland have the best access to the internet in the United Kingdom, and I am very proud of that. Digital literacy is covered in a later amendment, Amendment 91, which I will be strongly supporting. It is something that everybody needs to be involved in, not least our young people—and here I declare an interest as the mother of a 16 year-old.

I have two concerns. The first was raised by my friend the noble Lord, Lord Weir, around private companies being legally accountable for upholding an international human rights treaty. I am much more comfortable with Amendments 187 and 196, which refer to Ofcom. I think that is where the duty should be. I have an issue not with the convention but with private companies being held responsible for it; Ofcom should be the body responsible.

Secondly, I listened very carefully to what the noble Baroness, Lady Kidron, said about general comment No. 25. If what I say is incorrect, I hope she will say so. Is general comment No. 25 a binding document on the Government? I understood that it was not.

Baroness Kidron (CB)

We need to see the UNCRC included in the Bill. The convention is never opened up again, and how it makes itself relevant to the modern world is through the general comments; that is how the Committee on the Rights of the Child would interpret it.

Baroness Foster of Aghadrumsee (Non-Afl)

So it is an interpretive document. The unintended consequences piece was around general comment No. 25 specifically having reference to children being able to seek out content. That is certainly something that I would be concerned about. I am sure that we will discuss it further in the next group of amendments, which are on pornography. If young people were able to seek out harmful content, that would concern me greatly.

I support Amendments 187 and 196, but I have some concerns about the unintended consequences of Amendment 25.

Lord Clement-Jones (LD)

My Lords, I think this may have been a brief interlude of positivity. I am not entirely convinced, in view of some of the points that have been made, but certainly I think that it was intended to be.

I will speak first to Amendments 30 and 105. I do not know what the proprieties are, but I needed very little prompting from the LEGO Group to put forward amendments that, in the online world, seek to raise the expectation that regulated services must go beyond purely the avoidance of risk of harm and consider the positive benefits that technology has for children’s development and their rights and overall well-being. It has been extremely interesting to hear that aspect of today’s debate.

The LEGO Group recognises that, through children’s play experiences both offline and online, it has an impact on the lives of the millions of children it engages with around the world, and it recognises its responsibility to ensure that, wherever it engages with them, that impact is positive and that it protects and upholds the rights of children and fosters their well-being as part of its mission.

19:00
We have heard about UN general comment 25 on children’s rights in the digital environment. The Government’s response to the drafting process recognised the collective responsibility of all Governments and stakeholders to ensure
“that children can benefit from digital opportunities, and protecting them from online harms”.
In line with this, the Bill now offers the opportunity to require regulated services to not only mitigate and manage risk in their service design but to consider the benefits of the service to children’s rights and well-being. I am extending it rather further than some of the earlier discussions.
I agree that it is important to include reference to both rights and well-being in the Bill. An individual child may have low well-being even if all their rights are respected. For example, if a child does not feel socially connected or empowered in a positive online environment, they may experience low well-being even if their right to participate online is being respected. As drafted, the Bill instructs regulated services to have regard
“to the importance of protecting the rights of users and interested persons”
and give due consideration to benefits such as freedom of expression
“when deciding on, and implementing, safety measures and policies”
to comply with the regime.
I believe that, if the Bill is to fully deliver for children, it needs to ensure that there is consideration of the benefits of the service to children’s rights and well-being. Without this inclusion, there is a risk that the design of online services will disproportionately restrict children’s rights to participate in the online environment and the benefit it brings to their well-being. By instructing service providers to design for the benefits that technology can bring to children’s rights and well-being alongside the mitigation of risk, which we have heard so much about, we have a real opportunity in the Bill to create a blueprint for the online environment that can both protect and nurture children’s potential by supporting and empowering them, unleashing their creativity and helping them learn. We have heard many positive comments around the House on that. I hope the Minister will understand the clear intention here and take on board the positive intent of these amendments.
Briefly, many noble Lords have emphasised the importance of the UN Convention on the Rights of the Child. I am not going to add greatly to that debate, but children have a right to be safe and to privacy. They also have rights to information and participation, and to free speech, both online and offline. It was very interesting to hear, in particular from the noble Baroness, Lady Healy, and the noble Lord, Lord Russell, about their view that services may shut children out of digital spaces altogether to avoid compliance with the child safety duties, rather than designing services with their safety in mind. That is because the Bill focuses on content moderation rather than system design: we are back, in a sense, in that loop.
I believe that the reference to the UNCRC general comment 25 would be very useful. I understand the points made by the noble Lord, Lord Weir, and certainly the spirit in which he made them, but I cannot see why “having regard to” the UNCRC could not be in the Bill. I do not see that that is unduly prescriptive or difficult to interpret in those circumstances, or overly vague. So, on these Benches, we support those amendments.
Lord Knight of Weymouth (Lab)

My Lords, we too support the spirit of these amendments very much and pay tribute to the noble Lord, Lord Russell, for tabling them.

In many ways, I do not need to say very much. I think the noble Baroness, Lady Kidron, made a really powerful case, alongside the way the group was introduced in respect of the importance of these things. We do want the positivity that the noble Baroness, Lady Harding, talked about in respect of the potential and opportunity of technology for young people. We want them to have the right to freedom of expression, privacy and reliable information, and to be protected from exploitation by the media. Those happen to be direct quotes from the UN Convention on the Rights of the Child, as some of the rights they would enjoy. Amendments 30 and 105, which the noble Lord, Lord Clement-Jones, tabled—I attached my name to Amendment 30—are very much in that spirit of trying to promote well-being and trying to say that there is something positive that we want to see here.

In particular, I would like to see that in respect of Ofcom. Amendment 187 is, in some ways, the more significant amendment and the one I most want the Minister to reflect on. That is the one that applies to Ofcom: that it should have reference to the UN Convention on the Rights of the Child. I think even the noble Lord, Lord Weir, could possibly agree. I understand his thoughtful comments around whether or not it is right to encumber business with adherence to the UN convention, but Ofcom is a public body in how it carries out its duties as a regulator. There are choices for regulation. Regulation can just be about minimum standards, but it can also be about promoting something better. What we are seeking here in trying to have reference to the UN convention is for Ofcom to regulate for something more positive and better, as well as police minimum standards. On that basis, we support the amendments.

Lord Parkinson of Whitley Bay (Con)

My Lords, I will start in the optimistic spirit of the debate we have just had. There are many benefits to young people from the internet: social, educational and many other ways that noble Lords have mentioned today. That is why the Government’s top priority for this legislation has always been to protect children and to ensure that they can enjoy those benefits by going online safely.

Once again, I find myself sympathetic to these amendments, but in a position of seeking to reassure your Lordships that the Bill already delivers on their objectives. Amendments 25, 78, 187 and 196 seek to add references to the United Nations Convention on the Rights of the Child and general comment 25 on children’s rights in relation to the digital environment to the duties on providers and Ofcom in the Bill.

As I have said many times before, children’s rights are at the heart of this legislation, even if the phrase itself is not mentioned in terms. The Bill already reflects the principles of the UN convention and the general comment. Clause 207, for instance, is clear that a “child” means a person under the age of 18, which is in line with the convention. All providers in scope of the Bill need to take robust steps to protect users, including children, from illegal content or activity on their services and to protect children from content which is harmful to them. They will need to ensure that children have a safe, age-appropriate experience on services designed for them.

Both Ofcom and service providers will also have duties in relation to users’ rights to freedom of expression and privacy. The safety objectives will require Ofcom to ensure that services protect children to a higher standard than adults, while also making sure that these services account for the different needs of children at different ages, among other things. Ofcom must also consult bodies with expertise in equality and human rights, including those representing the interests of children, for instance the Children’s Commissioner. While the Government fully support the UN convention and its continued implementation in the UK, it would not be appropriate to place obligations on regulated services to uphold an international treaty between state parties. We agree with the reservations that were expressed by the noble Lord, Lord Weir of Ballyholme, in his speech, and his noble friend Lady Foster.

The convention’s implementation is a matter for the Government, not for private businesses or voluntary organisations. Similarly, the general comment acts as guidance for state parties and it would not be appropriate to refer to that in relation to private entities. The general comment is not binding and it is for individual states to determine how to implement the convention. I hope that the noble Lord, Lord Russell, will feel reassured that children’s rights are baked into the Bill in more ways than a first glance may suggest, and that he will be content to withdraw his amendment.

The noble Lord, Lord Clement-Jones, in his Amendments 30 and 105, seeks to require platforms and Ofcom to consider a service’s benefits to children’s rights and well-being when considering what is proportionate to fulfil the child safety duties of the Bill. They also add children’s rights and well-being to the online safety objectives for user-to-user services. The Bill as drafted is focused on reducing the risk of harm to children precisely so that they can better enjoy the many benefits of being online. It already requires companies to take a risk-based and proportionate approach to delivering the child safety duties. Providers will need to address only content that poses a risk of harm to children, not that which is beneficial or neutral. The Bill does not require providers to exclude children or restrict access to content or services that may be beneficial for them.

Children’s rights and well-being are already a central feature of the existing safety objectives for user-to-user services in Schedule 4 to the Bill. These require Ofcom to ensure that services protect children to a higher standard than adults, while making sure that these services account for the different needs of children at different ages, among other things. On this basis, while I am sympathetic to the aims of the amendments the noble Lord has brought forward, I respectfully say that I do not think they are needed.

More pertinently, Amendment 30 could have unintended consequences. By introducing a broad balancing exercise between the harms and benefits that children may experience online, it would make it more difficult for Ofcom to follow up instances of non-compliance. For example, service providers could take less effective safety measures to protect children, arguing that, as their service is broadly beneficial to children’s well-being or rights, the extent to which they need to protect children from harm is reduced. This could mean that children are more exposed to more harmful content, which would reduce the benefits of going online. I hope that this reassures the noble Lord, Lord Russell, of the work the Bill does in the areas he has highlighted, and that it explains why I cannot accept his amendments. I invite him to withdraw Amendment 25.

Lord Russell of Liverpool (CB)

My Lords, I thank all noble Lords for taking part in this discussion. I thank the noble Lord, Lord Weir, although I would say to him that his third point—that, in his experience, the UNCRC is open to different interpretations by different departments—is my experience of normal government. Name me something that has not been interpreted differently by different departments, as it suits them.

Lord Weir of Ballyholme (DUP)

I entirely take that point. I was making the slightly wider point—not specifically with regard to the UNCRC—that, whenever legislative provision has been made that a particular department has to have due regard to something, while there is case law, “due regard” has tended to be treated very differently by different departments. So, if even departments within the same Government treat that differently, how much more differently would private companies treat it?

Lord Russell of Liverpool (CB)

I would simply make the point that it would probably be more accurate to say that the departments treat it with “due disregard”.

This has been a wide-ranging debate and I am not going to go through all the different bits and pieces. I recommend that noble Lords read United Nations general comment 25, as it goes, in great detail, right to the heart of the issues we are talking about. For example—and this is very pertinent to the next group of amendments—it explicitly protects children from pornography, so I absolutely recommend that it be mentioned there.

As I expected, the Minister said, “We are very sympathetic but this is not really necessary”. He said that children’s rights are effectively baked into the Bill already. But what is baked into something that children—for whom this is particularly relevant—or even adults might decide to consume is not always immediately obvious. There are problems with an approach whereby one says, “It’s fine because, if you really understood this rather complicated legislation, it would become completely clear to you what it means”. That is a very accurate and compelling demonstration of exactly why some of us have concerns about this well-intentioned Bill. We fear that it will become a sort of feast, enabling company lawyers and regulators to engage in occasionally rather arcane discourse at great expense, demonstrating that what the Government claim is clearly baked in is not so clearly baked in.

19:15
A common theme in many of these amendments on children’s rights is that it is important that these rights are not implicitly covered in the Bill, as they are in myriad cases, but that it should be stated more clearly in key places in the Bill that it explicitly is about helping children and protecting their rights. It should be about protecting their right to be online, but also their right not to be abused or suffer harm online. That is at the heart of what we are trying to do. I suspect there is rich room for further discussion to see if we can make some of this slightly less “baked in” and find some form of legislative icing, with hundreds and thousands, which makes it completely clear which children’s rights are being protected and how they will be protected. With that, I beg leave to withdraw the amendment.
Amendment 25 withdrawn.
Amendments 26 and 27 not moved.
Amendment 27A
Moved by
27A: Clause 11, page 11, line 19, at end insert—
“(10A) A duty to summarise in the terms of service the findings of the most recent children’s risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to children).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to summarise (in their terms of service) the findings of their latest children’s risk assessment. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.
Amendment 27A agreed.
Amendment 28 not moved.
Amendment 29
Moved by
29: Clause 11, page 11, line 25, at end insert—
“, except for pornographic content where age verification must always be applied, notwithstanding section 3(3)(a) of the Communications Act 2003.”
Member’s explanatory statement
This amendment would require a user-to-user service to apply age verification for pornographic content regardless of their size or capacity.
Baroness Ritchie of Downpatrick (Lab)

My Lords, I am very happy to move Amendment 29 and to speak to Amendments 83 and 103, which are also in my name. We have just had a debate about the protection of children online, and this clearly follows on from that.

The intention of the Bill is to set general parameters through which different content types can be regulated. The problem with that approach, as the sheer number of amendments highlights, is this: not all content and users are the same, and therefore cannot be treated in the same way. Put simply, not all content online should be legislated for in the same way. That is why the amendments in this group are needed.

Pornography is a type of content that cannot be regulated in general terms; it needs specific provisions. I realise that some of these issues were raised in the debate last Tuesday on amendments in my name, and again on Thursday when we discussed harms to children. I recognise too that, during his response to Thursday’s debate, the Minister made a welcome announcement on primary priority content which I hope will be set out in the Bill, as we have been asking for during this debate. While we wait to see the detail of what that announcement means, I think it safe to assume that pornography will be one of the harms named on the Bill, which makes discussion of these amendments that bit more straightforward.

Given that context, in Clause 11(3), user-to-user services that fall under the scope of Part 3 of the Bill have a duty to prevent children from accessing primary priority content. This duty is repeated in Clause 25(3) for search services. That duty is, however, qualified by the words,

“using proportionate systems and processes”.

It is the word “proportionate” and how that would apply to the regulation of pornography that is at the heart of the issue.

Generally speaking, acting in a proportionate way is a sensible approach to legislation and regulation. For the most part, regulation and safeguards should ensure that a duty is not onerous or that it does not place a disproportionate cost on the service provider that may make their business unviable. While that is the general principle, proportionality is not an appropriate consideration for all policy decisions.

In the offline world, legislation and regulation is not always proportionate. This is even more stark when regulating for children. The noble Lord, Lord Bethell, raised the issue of the corner shop last Tuesday, and that example is apt to highlight my point today. We do not take a proportional approach to the sale of alcohol or cigarettes. We do not treat a corner shop differently from a supermarket. It would be absurd if I were to suggest that a small shop should apply different age checks for children when selling alcohol, compared to the age checks we expect a large supermarket to apply. Therefore, in the same way, we already do not apply proportionality to some online activities. For example, gambling is an activity that is age-verified for children. Indeed, gambling companies are not allowed to make their product attractive to children and must advertise in a regulated way to avoid harm to children and young people. The harm caused to children by gambling is significant, so the usual policy considerations of proportionality do not apply. Clearly, both online and offline, there are some goods and services to which a proportionality test is not applied; there is no subjectivity. A child cannot buy alcohol or gamble and should not be able to access pornography.

In the UK, there is a proliferation of online gambling sites. It would be absurd to argue that the size of a gambling company or the revenue that company makes should be a consideration in whether it should utilise age verification to prevent children placing a bet. In the same way, it would be absurd to argue that the size or revenue of a pornographic website could be used as an argument to override a duty to ensure that age verification is employed to ensure that children do not access that website.

This is not a grey area. It is beyond doubt that exposing children to pornography is damaging to their health and development. The Children’s Commissioner’s report from this year has been much quoted already in Committee but it is worth reminding your Lordships what she found: that pornography was “widespread and normalised”, to the extent that children cannot opt out. The average age at which children first see pornography is 13. By age nine, 10% had seen it, 27% had seen it by age 11 and half had seen it by age 13. The report found that frequent users of pornography are more likely to engage—unfortunately and sadly—in physically aggressive sex acts.

There is nothing proportionate about the damage of pornographic content. The size, number of visitors, financial budget or technical know-how must not be considerations as to whether or not to deploy age checks. If a platform is incapable for any reason of protecting children from harmful exposure to pornography, it must remove that content. The Bill should be clear: if there is pornography on a website, it must use age verification. We know that pornographic websites will do all they can to evade age verification. In France and Germany, which are ahead of us in passing legislation to protect minors from pornography, regulators are tangled up in court action as the pornographic sites they first targeted for enforcement action argue against the law.

We must also anticipate the response of websites that are not dedicated exclusively to pornography, especially social media—a point we touched on during Tuesday’s debate. Reuters reported last year that an internal Twitter presentation stated that 13% of tweets were pornographic. Indeed, the Children’s Commissioner has found that Twitter is the platform where young people are most likely to encounter pornographic content. I know that some of your Lordships are concerned about age-gating social media. No one is suggesting that social media should exclude children, a point that has been made already. What I am suggesting is that pornography on that platform should be subject to age verification. The capabilities already exist to do this. New accounts on Twitter have to opt in to view pornographic content. Why cannot the opt-in function be age-gated? Twitter is moving to subscription content. Why can it not make pornographic content subscription-based, with the subscription being age-verified? The solutions exist.

The Minister may seek to reassure the House that the Bill as drafted would not allow any website or search facility regulated under Part 3 that hosts pornographic content to evade its duties because of size, capacity or cost. But, as we have seen in France, these terms will be subject to court action. I therefore trust that the Government will bring forward an amendment to ensure that any platform that hosts pornographic content will employ age verification, regardless of any other factors. Perhaps the Minister in his wind-up can provide us with some detail or a hint of a future amendment at Report. I look forward to hearing and considering the Minister’s response. I beg to move.

Baroness Benjamin (LD)

My Lords, I wish to speak in support of Amendments 29, 83 and 103 in the name of the noble Baroness, Lady Ritchie. I am extremely pleased that the Minister said last Tuesday that pornography will be within primary priority content; he then committed on Thursday to naming primary priority content in the Bill. This is good news. We also know that pornography will come within the child safety duties in Clause 11. This makes me very happy.

In the document produced for the Government in January 2021, the BBFC said that there were millions of pornographic websites—I repeat, millions—and many of these will come within Part 3 of the Bill because they allow users to upload videos, make comments on content and chat with other users. Of course, some of these millions of websites will be very large, which means by definition that we expect them to come within the scope of the Bill. Under Clause 11(3) user-to-user services have a duty to prevent children accessing primary priority content. The duty is qualified by the phrase

“using proportionate systems and processes”.

The factors for deciding what is proportionate are set out in Clause 11(11): the potential harm of the content, based on the children’s risk assessment, and the size and capacity of the provider of the service. Amendments 29, 83 and 103 tackle the issue of size and capacity.

19:30
With millions of sites on the internet, it is not unreasonable to think that some sites will argue that, despite the potential harm to children, they are not of a size to have the capacity to invest in technology. The amendment would require all user-to-user sites with pornographic content to use age verification to determine that the person accessing the content was aged 18 years or older, regardless of size and capacity. This issue was touched upon on Tuesday in the amendments tabled by the noble Baroness, Lady Ritchie, which said there should be a level playing field for websites that contain pornographic content regardless of which part of the Bill they fall within. Websites that come within the scope of Part 5 do not have any exceptions and must have age verification to meet the duty in Clause 72, and that should also apply to Part 3 services.
The Government have said there is a significant risk of harm posed by children’s access to pornography online since exposure to pornography may impact children’s perception of sex and relationships, increase the likelihood of engaging in sexual activities and harmful or aggressive behaviour, and reduce concern about consent from partners. For those reasons alone, all sites with pornographic content should have age verification.
I know that we will have further debates on age verification in due course, but I hope the Government’s announcement that pornographic content will be in the Bill means that age verification for pornography on Part 3 and Part 5 services will come into force at the same time. I urge the Government to support these amendments.
House resumed.
Committee (4th Day) (Continued)
20:24
Clause 11: Safety duties protecting children
Debate on Amendment 29 resumed.
Baroness Kidron (CB)

My Lords, I support the noble Baroness, Lady Ritchie, in her search to make it clear that we do not need to take a proportionate approach to pornography. I would be delighted if the Minister could indicate in his reply that the Government will accept the age-assurance amendments in group 22 that are coming shortly, which make it clear that porn on any regulated service, under Part 3 or Part 5, should be behind an age gate.

In making the case for that, I want to say very briefly that, after the second day of Committee, I received a call from a working barrister who represented 90 young men accused of serious sexual assault. Each was a student and many were in their first year. A large proportion of the incidents had taken place during freshers’ week. She rang to make sure that we understood that, while what each and every one of them had done was indefensible, these men were also victims. As children brought up on porn, they believed that their sexual violence was normal—indeed, they told her that they thought that was what young women enjoyed and wanted. On this issue there is no proportionality.

Lord Bethell (Con)

My Lords, I also support Amendments 29, 83 and 103 from the noble Baroness, Lady Ritchie. As currently drafted, the Bill makes frequent reference to Ofcom taking into account

“the size and capacity of … a service”

when it determines the extent of the measures a site should apply to protect children. We have discussed size on previous days; I am conscious that the point has been made in part, but I hope the Committee will forgive me if I repeat it clearly. When it comes to pornography and other harms to children, size does matter. As I have said many times recently, porn is porn no matter the size of the website or publisher involved with it. It does not matter whether it is run by a huge company such as MindGeek or out of a shed in London or Romania by a small gang of people. The harm of the content to children is still exactly the same.

Our particular concern is that, if the regulations from Ofcom are applied to the bigger companies, that will create a lot of space for smaller organisations which are not bending to the regulations to try to gain a competitive advantage over the larger players and occupy that space. That is the concern of the bigger players. They are very open to age verification; what concerns them is that they will face an unequal, unlevel playing field. It is a classic concern of bigger players facing regulation in the market: that bad actors will gain competitive advantage. We should be very cognisant of that when thinking about how the regulations on age verification for porn will be applied. Therefore, the measures should be applied in proportion to the risk of harm to children posed by a porn site, not in proportion to the site’s financial capacity or the impact on its revenues of basic protections for children.

In this, we are applying basic, real-world principles to the internet. We are denying its commonly held exceptionalism, which I think we are all a bit tired of. We are applying the same principles that you might apply in the real world, for instance, to a kindergarten, play centre, village church hall, local pub, corner shop or any other kind of business that brings itself in front of children. In other words, if a company cannot afford to implement or does not seem capable of implementing measures that protect children, it should not be permitted by law to have a face in front of the general public. That is the principle that we apply in the real world, and that is the principle we should be applying on the internet.

Allowing a dimension of proportionality to apply to pornography cases creates an enormous loophole in the legislation, which at best will delay enforcement for particular sites when it is litigated and at worst will disable regulatory action completely. That is why I support the amendments in the name of the noble Baroness, Lady Ritchie.

20:30
Lord Clement-Jones (LD)

My Lords, the proposers of these amendments have made a very good case to answer. My only reservation is that I think there are rather more subtle and proportionate ways of dealing with this—I take on board entirely what the noble Lord, Lord Bethell, says.

I keep coming back to the deliberations that we had in the Joint Committee. We said:

“All statutory requirements on user-to-user services, for both adults and children, should also apply to Information Society Services likely to be accessed by children, as defined by the Age Appropriate Design Code”.


This goes back to the test that we described earlier, to

“ensure all pornographic websites would have to prevent children from accessing their content”,

and back to that definition,

“likely to be accessed by children”.

The Government keep resisting this aspect, but it is a really important way of making sure that we deal with this proportionately. We are going to have this discussion about minimum age-assurance standards. Rather than simply saying, “It has to be age verification”, if we had a set of principles for age assurance, which can encompass a number of different tools and approaches, that would also help with the proportionality of what we are talking about.

The Government responded to the point we made about age assurance. The noble Baroness, Lady Kidron, was pretty persuasive in saying that we should take this on board in our Joint Committee report, and she had a Private Member’s Bill at the ready to show us the wording, but the Government came back and said:

“The Committee’s recommendations stress the importance of the use of age assurance being proportionate to the risk that a service presents”.


They have accepted that this would be a proportionate way of dealing with it, so this is not black and white. My reservation is that there is a better way of dealing with this than purely driving through these three or four amendments, but there is definitely a case for the Government to answer on this.

Lord Knight of Weymouth (Lab)

My Lords, I think the whole Committee is grateful to my noble friend Lady Ritchie for introducing these amendments so well.

Clearly, there is a problem. The anecdote from the noble Baroness, Lady Kidron, about the call she had had with the barrister relating to those freshers’ week offences, and the sense that people were both offenders and victims, underscored that. In my Second Reading speech I alluded to the problem of the volume of young people accessing pornography on Twitter, and we see the same on Reddit, Discord and a number of other platforms. As the noble Baroness said, it is changing what so many young people perceive to be normal about sexual relationships, and that has to be addressed.

Ofcom very helpfully provided a technical briefing on age assurance and age verification for Members of your Lordships’ House—clearly it did not persuade everybody, otherwise we would not be having this debate. Like the noble Lord, Lord Clement-Jones, I am interested in this issue of whether it is proportionate to require age verification, rather than age assurance.

For example, on Amendment 83 in my noble friend’s name in respect of search, I was trying to work out in my own mind how that would work. If someone used search to look for pornographic content and put in an appropriate set of keywords but was not logged in—so the platform would not know who they are—and if age verification was required, would they be interrupted with a requirement to go through an age-verification service before the search results were served up? Would the search results be served up but without the thumbnails of images and with some of the content suppressed? I am just not quite sure what the user experience would be like with a strict age-verification regime being used, for example, in respect of search services.

Lord Bethell (Con)

My Lords, some light can be shone on that question by thinking a little about what the gambling industry has been through in the last few years as age verification has got tougher in that area. To answer the noble Lord’s question, if someone does not log into their search and looks for a gambling site, they can find it, but when they come to try to place a bet, that is when age verification is required.

Lord Knight of Weymouth (Lab)

That is right. What is interesting about that useful intervention from the noble Lord, Lord Bethell, is that that kind of gets search off the hook in respect of gambling. You are okay to follow the link from the search engine, but then you are age-gated at the point of the content. Clearly, with thumbnail images and so on in search, we need something better than that. The Bill requires something better than that already; should we go further? My question to the Minister is whether this could be similar to the discussion we had with the noble Baroness, Lady Harding, around non-mandatory codes and alternative methods. I thought that the Minister’s response in that case was quite helpful.

Could it be that if Part 3 and category 2A services chose to use age verification, they could be certain that they are compliant with their duties to protect children from pornographic and equivalent harmful content, but if they chose age-assurance techniques, it would then be on them to show Ofcom evidence of how that alternative method would still provide the equivalent protection? That would leave the flexibility of age assurance; it would not require age verification but would still set the same bar. I merely offer that in an attempt to be helpful to the Minister, in the spirit of where the Joint Committee and the noble Lord, Lord Clement-Jones, were coming from. I look forward to the Minister’s reply.

Baroness Harding of Winscombe (Con)

Before the noble Lord sits down, can I ask him whether his comments make it even more important that we have a clear and unambiguous definition of age assurance and age verification in the Bill?

Lord Knight of Weymouth (Lab)

I would not want to disagree with the noble Baroness for a moment.

Baroness Kidron (CB)

Does the noble Lord think it is also important to have some idea of measurement? Age assurance in certain circumstances is far more accurate than age verification.

Lord Knight of Weymouth (Lab)

Yes; the noble Baroness is right. She has pointed out in other discussions I have been party to that, for example, gaming technology that looks at the movement of the player can quite accurately work out from their musculoskeletal behaviour, I assume, the age of the gamer. So there are alternative methods. Our challenge is to ensure that if they are to be used, we will get the equivalent of age verification or better. I now hand over to the Minister.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I think those last two comments were what are known in court as leading questions.

As the noble Baroness, Lady Ritchie of Downpatrick, said herself, some of the ground covered in this short debate was covered in previous groups, and I am conscious that we have a later grouping where we will cover it again, including some of the points that were made just now. I therefore hope that noble Lords will understand if I restrict myself at this point to Amendments 29, 83 and 103, tabled by the noble Baroness, Lady Ritchie.

These amendments seek to mandate age verification for pornographic content on a user-to-user or search service, regardless of the size and capacity of a service provider. The amendments also seek to remove the requirement on Ofcom to have regard to proportionality and technical feasibility when setting out measures for providers on pornographic content in codes of practice. While keeping children safe online is the top priority for the Online Safety Bill, the principle of proportionate, risk-based regulation is also fundamental to the Bill’s framework. It is the Government’s considered opinion that the Bill as drafted already strikes the correct balance between these two.

The provisions in the Bill on proportionality are important to ensure that the requirements in the child-safety duties are tailored to the size and capacity of providers. It is also essential that measures in codes of practice are technically feasible. This will ensure that the regulatory framework as a whole is workable for service providers and enforceable by Ofcom. I reassure your Lordships that the smaller providers or providers with less capacity are still required to meet the child safety duties where their services pose a risk to children. They will need to put in place sufficiently stringent systems and processes that reflect the level of risk on their services, and will need to make sure that these systems and processes achieve the required outcomes of the child safety duty. Wherever in the Bill they are regulated, companies will need to take steps to ensure that they cannot offer pornographic content online to those who should not see it. Ofcom will set out in its code of practice the steps that companies in the scope of Part 3 can take to comply with their duties under the Bill, and will take a robust approach to sites that pose the greatest risk of harm to children, including sites hosting online pornography.

The passage of the Bill should be taken as a clear message to providers that they need to begin preparing for regulation now—indeed, many are. Responsible providers should already be factoring in regulatory compliance as part of their business costs. Ofcom will continue to work with providers to ensure that the transition to the new regulatory framework will be as smooth as possible.

The Government expect companies to use age-verification technologies to prevent children accessing services that pose the highest risk of harm to children, such as online pornography. The Bill will not mandate that companies use specific technologies to comply with new duties because, as noble Lords have heard me say before, what is most effective in preventing children accessing pornography today might not be equally effective in future. In addition, age verification might not always be the most appropriate or effective approach for user-to-user companies to comply with their duties. For instance, if a user-to-user service, such as a particular social medium, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. This would allow content to be better detected and taken down, instead of restricting children from seeing content which is not allowed on the service in the first place. Companies may also use another approach if it is proportionate to the findings of the child safety risk assessment and a provider’s size and capacity. This is an important element to ensure that the regulatory framework remains risk-based and proportionate.

In addition, the amendments in the name of the noble Baroness, Lady Ritchie, risk inadvertently shutting children out of large swathes of the internet that are entirely appropriate for them to access. This is because it is impossible totally to eliminate the risk that a single piece of pornography or pornographic material might momentarily appear on a site, even if that site prohibits it and has effective systems in place to prevent it appearing. Her amendments would have the effect of essentially requiring every service to block children through the use of age verification.

Those are the reasons why the amendments before us are not ones that we can accept. Mindful of the fact that we will return to these issues in a future group, I invite the noble Baroness to withdraw her amendment.

Baroness Ritchie of Downpatrick (Lab)

My Lords, I thank all noble Lords who have participated in this wide-ranging debate, in which various issues have been raised.

The noble Baroness, Lady Benjamin, made the good point that there needs to be a level playing field between Parts 3 and 5, which I originally raised and which other noble Lords raised on Tuesday of last week. We keep coming back to this point, so I hope that the Minister will take note of it on further reflection before we reach Report. Pornography needs to be regulated on a consistent basis across the Bill.

20:45
The noble Baroness, Lady Kidron—I offer my congratulations on her birthday; it was remiss of me not to do so earlier—emphasised the need for clarity and consistency yet again, as well as the effects of pornography, which follow people through their lives, give an unrealistic view of relationships and can lead to increased violence against women. We must always remember that one exposure to pornography can plague you for the rest of your life, because it may play on your mind and have indirect or unintended consequences for the course of your life thereafter.
The noble Lord, Lord Bethell, talked about equality across the Bill, as well as across websites. He raised yet another great real-world example: if organisations such as schools and nurseries cannot keep people safe, we do not allow them to look after children; if businesses cannot keep children safe, they need to be regulated to do so.
The noble Lord, Lord Clement-Jones, stated that it seems that the view of the Committee is clear: we need principles in the Bill that are universal to keep children safe. That is the clear message throughout the Committee debate so far. There may be a better way, and I hope that we can work with the noble Lord, Lord Clement-Jones, and my noble friend Lord Knight and his colleagues, along with the Government Benches, to achieve that.
My noble friend Lord Knight in his summing up raised an excellent point. Again, I come back to this issue: if we do not have clarity or consistency, none of this work will achieve what it is intended to achieve. If different duties apply and if different levels of proportionality exist, that will only create uncertainty.
The Minister made the point that, with pornography now named as a harm to children, as announced on Thursday of last week, he hoped to consider how consistency is brought across the Bill to ensure that children will be kept safe from pornography on all providers in Parts 3 and 5. It seems clear from deliberations in Committee so far that noble Lords do not think that the Bill brings that clarity and consistency. That clearly needs to be addressed and corrected.
This is not about shoving kids out; everyone understands that, despite best efforts, pornography may slip through. It is about consistency. I ask the Minister during the interregnum period between now and the end of Committee and the beginning of Report to further reflect on the issues to do with the need for clarity and consistency in dealing with pornography across the Bill. I beg leave to withdraw the amendment.
Amendment 29 withdrawn.
Amendments 30 to 32A not moved.
Clause 11, as amended, agreed.
Amendment 33
Moved by
33: After Clause 11, insert the following new Clause—
“Offence of failing to comply with a relevant duty
(1) The provider of a service to whom a relevant duty applies commits an offence if the provider fails to comply with the duty.
(2) In the application of sections 178(2) and 179(5) to an offence under this section (where the offence has been committed with the consent or connivance of an officer of the entity or is attributable to any neglect on the part of an officer of the entity) the references in those provisions to an officer of an entity include references to any person who, at the time of the commission of the offence—
(a) was (within the meaning of section 93) a senior manager of the entity in relation to the activities of the entity in the course of which the offence was committed; or
(b) was a person purporting to act in such a capacity.
(3) A person who commits an offence under this section is liable on conviction on indictment to—
(a) imprisonment for a term not exceeding two years,
(b) a fine, or
(c) both.
(4) The Secretary of State may by regulations amend the sanctions in subsection (3), and such regulations may—
(a) specify the maximum fine under subsection (3)(b), and
(b) implement a scale to apply in cases where there have been repeated breaches of a relevant duty.
(5) In this section, “relevant duty” means a duty provided for by section 11 of this Act.
(6) Regulations under subsection (4) are subject to the affirmative procedure.”
Member’s explanatory statement
This new Clause would make it an offence for the provider of a user-to-user service not to comply with the safety duties protecting children set out in Clause 11. Where the offence was committed with the consent or connivance of a provider’s senior manager or other officer, or was attributable to their neglect, that person, as well as the entity, would be guilty of the offence.
Lord Knight of Weymouth (Lab)

My noble friend Lord Stevenson apologises that he can no longer be with the Committee, and he apologised to me that I suddenly find myself introducing this amendment. It heads up an important group because it tackles the issue of enforcement and, in essence, how we ensure that Ofcom has all the tools it needs to persuade some of the richest, largest and most litigious companies in the world to comply with the regime we are setting out in the Bill. Amendment 33, which my noble friend tabled and I am moving, sets out an offence of failing to comply with a relevant duty in respect of the child safety duties and provides that, where that failure is negligent, it would be an imprisonable offence for a senior manager or other officer. I recall that those of us who sat on the Joint Committee discussed the data protection regime and whether there could be a designated officer, similar to the data controller in companies, responsible for the safety duties with which the company would have to comply.

Clearly, this amendment has now been superseded by the government amendments that were promised, and which I am sure my noble friend was looking to flush out with this amendment. Flushed they are, so I will not go into any great detail about Amendment 33, because it is better to give time to the Minister to clarify the Government’s intentions. I shall listen carefully to him, as I will to the noble Lord, Lord Curry, who has great expertise in better regulation and who, I am sure, through talking to his amendments, will give us the benefit of his wisdom on how we can make this stick.

That leaves my Amendment 219, which in essence is about the supply chain that regulated companies use. I am grateful to the noble Lords, Lord Mann and Lord Austin, and the noble Baroness, Lady Deech, for putting their names to the amendment. Their enthusiasm did not run to missing the Arsenal game and coming to support in the Chamber, but that implies great trust in my ability to speak to the amendment, for which I accept the responsibility and compliment.

The amendment was inspired by a meeting that some Members of your Lordships’ House and the other place had in an all-party group that was looking, in particular, at the problems of the incel culture online. We heard from various organisations about how incel culture relates to anti-Semitism and misogyny, and how such content proliferates and circulates around the web. It became clear that it is fairly commonplace to use things such as cloud services to store the content and that the links are then shared on platforms. On the mainstream platforms, under the regime we are discussing in the Bill, now that we have got rid of the controversial “legal but harmful” category, there might be spaces where this content is seen as relatively benign, certainly in the category of freedom of expression, but it starts to capture the interest of its target demographic. Users are then taken off by links into smaller, less regulated sites and then, in turn, by links into cloud services where the really harmful content is hosted.

Therefore, by way of what reads as an exceptionally complicated and difficult amendment in respect of entities A, B and C, we are trying to understand whether it is possible to bring in those elements of the supply chain, of the technical infrastructure, that are used to disseminate hateful content. Such content too often leads to young men taking their own lives and to the sort of harm that we saw in Plymouth, where that young man went on the rampage and killed a number of people. His MP was one of the Members of Parliament at that meeting. That is what I want to explore with Amendment 219, which opens the possibility for this regime to ensure that well-resourced platforms cannot hide behind other elements of the infrastructure to evade their responsibilities.

Lord Bethell (Con)

My Lords, I beg the forbearance of the Committee because, despite the best efforts of the Whips, this group includes two major issues that I must tackle.

Starting with senior management liability, I thank the Minister and the entire ministerial team for their engagement on this big and important subject. I am enormously proud of the technology sector and the great benefits that it has brought to the economy and to society. I remain a massive champion of innovation and technology in the round. However, senior executives in the technology sphere have had a long-standing blind spot. Their manifesto is that the internet is somehow different from the rest of the real world and that nothing must stand in its way. My noble friend Lord Moylan gave that pony quite a generous trot round the arena, so I will not go through it again, but when it comes to children, they have consistently failed to take seriously their safeguarding responsibilities.

I spoke in Committee last week of my experience at the Ministry of Sound. When I saw the internet in the late 1990s, I immediately saw a wonderful opportunity to target children, to sell to them, to get past their parents and normal regulation, and to get into their homes and their wallets. Lots of other people had the same thought, and for a long time we have let them do what they like. This dereliction of their duty of care has led to significant consequences, and the noble Lord, Lord Russell, spoke very movingly about that. Those consequences are increasing all the time because of the take-up of mobile phones and computers by ever younger children. That has got to stop, and it is why we are here. That is why we have this Bill—to stop those consequences.

To change this, we cannot rely just on rhetoric, fines and self-regulation. We tried that, the experiment has failed, and we must try a different approach. We found that exhortations and a playing-it-nicely approach failed in the financial sector before the financial crisis. We remember the massive economic and societal costs of that failure. Likewise, in the tech sector, senior managers of firms big and small must be properly incentivised and held accountable for identifying and mitigating risks to children in a systematic way. That is why introducing senior management liability for child safety transgressions is critical. Senior management must be accountable for ensuring that child safety permeates the company and be held responsible when risks of serious harm arise or gross failures take place. Just think how the banks have changed their attitude since the financial crisis because of senior liability.

I am pleased that the Government have laid their own amendment, Amendment 200A. I commend the Minister for bringing it forward and am extremely grateful to him and to the whole team for their engagement around this issue. The government amendment creates a new offence, holding senior managers accountable for failure to comply with confirmation decisions from Ofcom relating to protecting children from harmful content. I hope that my noble friend will agree that it makes Ofcom’s job easier by providing clear consequences for non-compliance with such decisions.

It is a very good amendment, but there are some gaps, and I would like to address those. It is worrying that the government amendment does not cover duties related to tackling child sexual exploitation and abuse. As it stands, this amendment is a half-measure which fails to hold senior managers liable for the most severe abuse online. Child sexual abuse and exploitation offences are at a record high, as we heard earlier. NSPCC research shows that there has been an 84% rise in online grooming since 2017-18. Tech companies must be held accountable for playing their role in tackling this.

That is why the amendment in my name does the following: first, it increases the scope of the Government’s amendment to make individuals also responsible for confirmation decisions on illegal safety duties related to child sexual abuse and exploitation. Secondly, it brings search services into scope, including both categories of service providers, which is critical for ensuring that a culture of compliance is adopted throughout the sector.

21:00
I ask my noble friend the Minister: first, what is the Government’s rationale for not holding senior managers accountable for acting on confirmation decisions related to child sexual abuse offences in their amendment? Secondly, will he commit to discussing this further to ensure that the amendment covers these offences?
I would also like to speak to probing Amendments 220A to 220C in my name. Without effective enforcement, the many words and hours we spend in this House and in the other place talking about the need for robust online safety will come to nothing. Unless we get the enforcement provisions of this Bill right, the aims of the Bill will fail. We know that content providers will not implement the Bill unless they know that there will be significant penalties for non-compliance. Often companies need to know that the penalty for non-compliance will outweigh the benefit of doing nothing. For instance, research on the gambling industry has found that, where enforcement is ineffective and companies do not fear the consequences, they simply will not invest in robust technologies.
The amendments in the name of the noble Lord, Lord Curry, in this group are clearly aimed at this very issue, and I express enormous thanks to the noble Lord. Those amendments seek to remove discretion from the regulator and ensure that enforcement action takes place.
As to the amendments to Clause 138, as your Lordships are aware, Ofcom is required to produce guidance on how it intends to enforce the duties and requirements of the Bill. Ofcom has already set out its road map for enforcement, which gives a start to its framework, but these amendments seek to put some flesh on the bones. Amendment 220A states that guidance must cover four important topics. The first is how ancillary services such as payment providers will be used in the enforcement process where a service is free to use or takes payment in cryptocurrency or another virtual currency. This is absolutely critical for users and providers. It simply cannot be the case that sites which are free or use alternative payment methods could find themselves able to avoid enforcement.
Secondly, guidance should be produced which shows how internet service providers will be used in access restriction orders. The Government have suggested that ISPs are less willing to be involved in policing than was previously thought, but without the ability to block sites that are in contravention of the measures in the Bill, it seems there is a significant gap in the enforcement toolbox. Blocking content is something ISPs already do; they already block sites to protect intellectual property, such as football and other sporting rights. If you try to play a Taylor Swift song, you will find out how effective they are at that. It simply cannot be the case that ISPs would deem TV rights more important than child safety.
The third and fourth topics for guidance in the amendment set out what action Ofcom will take if an ancillary service provider, or a person who provides an access facility, fails to act on a relevant court order. We need to know what will happen if such a court order is ignored.
I hope my noble friend the Minister will be able to provide information on how he envisages enforcement will be implemented under this Bill, and I would be glad to meet him to discuss the matter further.
Lord Curry of Kirkharle (CB)

My Lords, in view of the hour, I will be brief, and I have no interests to declare other than that I have grandchildren. I rise to speak to a number of amendments tabled in my name in this group: Amendments 216A to 216C, 218ZZA to 218ZD and 218BA to 218BC. I do not think I have ever achieved such a comprehensive view of the alphabet in a number of amendments.

These amendments carry a simple message: Ofcom must act decisively and quickly. I have tabled them out of a deep concern that the Bill does not specify timescales or obligations within which Ofcom is required to act. It leaves Ofcom, as the regulator, with huge flexibility and discretion as to when it must take action; some action, indeed, could go on for years.

Phrases such as

“OFCOM may vary a confirmation decision”

or it

“may apply to the court for an order”

are not strong enough, in my view. If unsuitable or harmful material is populating social media sites, the regulator must take action. There is no sense of urgency within the drafting of the Bill. If contravention is taking place, action needs to be taken very quickly. If Ofcom delays taking an action, the harmful influence will continue. If the providers of services know that the regulator will clamp down quickly and severely on those who contravene, they are more likely to comply in the first place.

I was very taken by the earlier comments of the noble Baroness, Lady Harding, about putting additional burdens on Ofcom. These amendments are not designed to put additional burdens on Ofcom; indeed, the noble Lord, Lord Knight, referred to the fact that, for six years, I chaired the Better Regulation Executive. It was my experience that regulators that had a reputation for acting quickly and decisively, and being tough, had a much more compliant base as a consequence.

Noble Lords will be pleased to hear that I do not intend to go through each individual amendment. They all have a single purpose: to require the regulator—in this case, Ofcom—to act when necessary, as quickly as possible within specified timescales; and to toughen up the Bill to reduce the risk of continuous harmful content being promoted on social media.

I hope that the Minister will take these comments in the spirit in which they are intended. They are designed to help Ofcom and to help reduce the continuous adverse influence that many of these companies will propagate if they do not think they will be regulated severely.

Baroness Fox of Buckley (Non-Afl)

My Lords, I understand that, for legislation to have any meaning, it has to have some teeth and you have to be able to enforce it; otherwise, it is a waste of time, especially with something as important as the legislation that we are discussing here.

I am a bit troubled by a number of the themes in these amendments and I therefore want to ask some questions. I saw that the Government had tabled these amendments on senior manager liability, then I read amendments from both the noble Lord, Lord Bethell, and the Labour Party, the Opposition. It seemed to me that even more people would be held liable and responsible as a result. I suppose I have a dread that—even with the supply chain amendment—this means that lots of people are going to be sacked. It seems to me that this might spiral dangerously out of control and everybody could get caught up in a kind of blame game.

I appreciate that I might not have understood, so this is a genuine attempt to do so. I am concerned that these new amendments will force senior managers and, indeed, officers and staff to take an extremely risk-averse approach to content moderation. They now have not only to cover their own backs but to avoid jail. One of my concerns has always been that this will lead to the over-removal of legal speech, and more censorship, so that is a question I would like to ask.

I also want to know how noble Lords think this sits in relation to the UK being a science and technology superpower. Understandably, some people have argued that these amendments make the UK a hostile environment for digital investment, and there is something to be balanced up there. Is there a risk that this will lead to the withdrawal of services from the UK? Will it make working for these companies unattractive to British staff? We have already heard that Jimmy Wales has vowed that the Wikimedia Foundation will not scrutinise posts in the way demanded by the Bill. Is he going to be thrown in prison, or will Wikipedia pull out? How do we get the balance right?

What is the criminal offence that carries a threat of a prison sentence? I might have misunderstood, but a technology company manager could fail to prevent a child or young person encountering legal but none the less allegedly harmful speech, be considered in breach of these amendments and get sent to prison. We have to be very careful that we understand what this harmful speech is, as we discussed previously. The threshold for harm, which encompasses physical and psychological harm, is vast and could mean people going to prison without the precise criminal offence being clear. We talked previously about VPNs. If a tech-savvy 17-year-old uses a VPN and accesses some of this harmful material, will someone potentially be criminally liable for that young person getting around the law, find themselves accused of dereliction of duty and become a criminal?

My final question is on penalties. When I was looking at this Bill originally and heard about the eye-watering fines that some Silicon Valley companies might face, I thought, “That will destroy them”. Of course, to them it is the mere blink of an eye, and I do get that. This indicates to me, given the endless conversations we have had on whether size matters, that in this instance size does matter. The same kind of liabilities will be imposed not just on the big Silicon Valley monsters that can bear these fines, but on Mumsnet—or am I missing something? Mumsnet might not be the correct example, but could not smaller platforms face similar liabilities if a young person inadvertently encounters harmful material? It is not all malign people trying to do this; my unintended consequence argument is that I do not want to create criminals when a crime is not really being committed. It is a moral dilemma, and I do understand the issue of enforcement.

Baroness Harding of Winscombe (Con)

I rise very much to support the comments of my noble friend Lord Bethell and, like him, to thank the Minister for bringing forward the government amendments. I will try to address some of the comments the noble Baroness, Lady Fox, has just made.

One must view this as an exercise in working out how one drives culture change in some of the biggest and most powerful organisations in the world. Culture change is really hard. It is hard enough in a company of 10 people, let alone in a company with hundreds of thousands of employees across the world that has more money than a single country. That is what this Bill requires these enormous companies to do: to change the way they operate when they are looking at an inevitably congested, contested technology pipeline, by which I mean—to translate that out of tech speak—they have more work to do than even they can cope with. Every technology company, big or small, always has this problem: more good ideas than their technologists can cope with. They have to prioritise what to fix and what to implement. For the last 15 years, digital companies have prioritised things that drive income, but not the safety of our children. That requires a culture change from the top of the company.

21:15
I draw heavily on my experience over the last eight years as a non-exec on the Court of the Bank of England, where I have seen first-hand the implementation of the senior managers regime. I have seen it first-hand because of the extraordinary privilege, as a member of the court, of sitting as an observer in the Prudential Regulation Authority’s meetings, but I have also seen it first-hand as the chair of the Bank’s remuneration committee, where I had to sign off as a senior manager. I promise your Lordships that it completely changes your approach to compliance if your own personal name is being used. That makes a huge difference. It does not matter how huge or historic the company, which is why I used the example of the Bank of England; once it is in your name, you behave differently.
We need the very senior managers of these enormous companies to change the way they behave. Sad though this is, I do not believe they will change if it is just about money, as we see time and again. They will change if they have to think about whether, in their own name, they are breaking the law. My understanding of the Government’s amendment—this is where I get to my questions for the Minister—is that they cannot stumble into that by mistake; they have to wilfully ignore the direction of the regulator. I hope the Minister can confirm and explain that.
My other question is: are we confident that the amendment as drafted really tackles the very senior managers? I share some of the concerns of the noble Baroness, Lady Fox: we do not want middle managers, deep in the leviathan of an enormous company, being sacrificial lambs while the company does not really address the issue. We want change from the top to reshape the way these companies think about the trade-offs they have to face. I hope the Minister can clarify that.
Baroness Kidron (CB)

My Lords, I support something between the amendments of the noble Lords, Lord Stevenson and Lord Bethell, and the Government. I welcome all three and put on record my thanks to the Government for making a move on this issue.

There are three members of the pre-legislative committee still in the Chamber at this late hour, and I am sure I am not the only one of those three who remembers the excruciating detail in which Suzanne Webb MP, during evidence given with Meta’s head of child safety, established that there was nowhere to report harm, but nowhere—not up a bit, not sideways, not to the C-suite. It was stunning. I have used that clip from the committee’s proceedings several times in schools to show what we do in the House of Lords, because it was fascinating. That fact was also made abundantly clear by Frances Haugen. When we asked her why she took the risk of copying things and walking them out, she said, “There was nowhere to go and no one to talk to”.

Turning to the amendments, like the noble Baroness, Lady Harding, I am concerned about whether we have properly dealt with C-suite reporting and accountability, but I am a hugely enthusiastic supporter of that accountability being in the system. I will be interested to hear the Minister speak to the Government’s amendment, but also to some of the other issues raised by the noble Lord, Lord Knight.

I will comment very briefly on the supply chain and Amendment 219. Doing so, I go back again to Amendment 2, debated last week, which sought to add services not covered by the current scope but which clearly promoted and enabled access to harm and which were also likely to be accessed by children. I have a long quote from the Minister but, because of the hour, I will not read it out. In effect, and to paraphrase, he said, “Don’t worry, they will be caught by the other guys—the search and user-to-user platforms”. If the structure of the Bill means that it is mandatory that the user-to-user and search platforms catch the people in the supply chain, surely it would be a great idea to put that in the Bill absolutely explicitly.

Finally, while I share some of the concerns raised by the noble Baroness, Lady Fox, I repeat my constant reprise of “risk not size”. The size of the fine is related to the turnover of the company, so it is actually proportionate.

Lord Clement-Jones (LD)

My Lords, this has been a really interesting debate. I started out thinking that we were developing quite a lot of clarity. The Government have moved quite a long way since we first started debating senior manager liability, but there is still a bit of fog that needs dispelling—the noble Baronesses, Lady Kidron and Lady Harding, have demonstrated that we are not there yet.

I started off by saying yes to this group, before I got to grips with the government amendments. I broadly thought that Amendment 33, tabled by the noble Lord, Lord Stevenson, and Amendment 182, tabled by the noble Lord, Lord Bethell, were heading in the right direction. However, I was stopped short by Trustpilot’s briefing, which talked about a stepped approach regarding breaches and so on—that is a very strong point. It says that it is important to recognise that not all breaches should carry the same weight. In fact, it is even more than that: certain things should not even be an offence, unless you have been persistent or negligent. We have to be quite mindful as to how you formulate criminal offences.

I very much liked what the noble Lord, Lord Bethell, had to say about the tech view of its own liability. We have all seen articles about tech exceptionalism, and, for some reason, that seems to have taken quite a hold—so we have to dispel that as well. That is why I very much liked what the noble Lord, Lord Curry, said. It seemed to me that that was very much part of a stepped approach, while also being transparent to the object of the exercise and the company involved. That fits very well with the architecture of the Bill.

The noble Baroness, Lady Harding, put her finger on it: the Bill is not absolutely clear. In the Government’s response to the Joint Committee’s report, we were promised that, within three to six months, we would get that senior manager liability. On reading the Bill, I am certainly still a bit foggy about it, and it is quite reassuring that the noble Baroness, Lady Harding, is foggy about it too. Is that senior manager liability definitely there? Will it be there?

The Joint Committee made two other recommendations which I thought made a lot of sense: the obligation to report on risk assessment to the main board of a company, and the appointment of a safety controller, which the noble Lord, Lord Knight, mentioned. Such a controller would make it very clear—as with the GDPR, you would have a senior manager on whom you can fix the duty.

Like the noble Baroness, Lady Harding, I would very much like to hear from the Minister on the question of personal liability, as well as about Ofcom. It is important that any criminal prosecution is mediated by Ofcom; that is cardinal. You cannot just create criminal offences where you can have a prosecution without the intervention of Ofcom. That is extraordinarily important.

I have just a couple of final points. The noble Baroness, Lady Fox, comes back quite often to this point about regulation being the enemy of innovation. It very much depends what kind of innovation we are talking about. Technology is not necessarily neutral. It depends how the humans who deploy it operate it. In circumstances such as this, where we are talking about children and about smaller platforms that can do harm, I have no qualms about having regulation or indeed criminal liability. That is a really important factor. We are talking about a really important area.

I very strongly support Amendment 219. It deals with a really important aspect which is completely missing from the Bill. I have a splendid briefing here, which I am not going to read out, but it is all about Mastodon being one example of a new style of federated platform in which the app or hub for a network may be category 1 owing to the size of its user base but individual subdomains or networks sitting below it could fall under category 2 status. I am very happy to give a copy of the briefing to the Minister; it is a really well-written brief, and demonstrates entirely some of the issues we are talking about here.

I reassure the noble Lord, Lord Knight, that I think the amendment is very well drafted. It is really quite cunning in the way that it is done.

Baroness Stowell of Beeston (Con)

My Lords, I wonder whether I can make a brief intervention—I am sorry to do so after the noble Lord, Lord Clement-Jones, but I want to intervene before my noble friend the Minister stands up, unless the Labour Benches are about to speak.

I have been pondering this debate and have had a couple of thoughts. Listening to the noble Lord, Lord Clement-Jones, I am reminded of something which was always very much a guiding light for me when I chaired the Charity Commission, and therefore working in a regulatory space: regulation is never an end in itself; you regulate for a reason.

I was struck by the first debate we had on day one of Committee about the purpose of the Bill. If noble Lords recall, I said in that debate that, for me, the Bill at its heart was about enhancing the accountability of the platforms and the social media businesses. I felt that the contribution from my noble friend Lady Harding was incredibly important. What we are trying to do here is to use enforcement to drive culture change and to force the organisations to move from focusing solely on profit-making—not that they should never think about profit—to focusing on child safety in the way in which they go about their work. That is really important when we start to consider the whole issue of enforcement.

It struck me at the start of this discussion that we have to be clear what our general approach and mindset is about this part of our economy that we are seeking to regulate. We have to be clear about the crimes we think are being committed or the offences that need to be dealt with. We need to make sure that Ofcom has the powers to tackle those offences and that it can do so in a way that meets Parliament’s and the public’s expectations of us having legislated to make things better.

I am really asking my noble friend the Minister, when he comes to respond on this, to give us a sense of clarity on the whole question of enforcement. At the moment, it is insufficiently clear. Even if we do not get that level of clarity today, when we come back later on and look at enforcement, it is really important that we know what we are trying to tackle here.

Lord Parkinson of Whitley Bay (Con)

My Lords, I will endeavour to give that clarity, but it may be clearer still if I flesh some points out in writing in addition to what I say now.

21:30
The amendments in this group address the Bill’s enforcement powers. I begin by assuring noble Lords that there is a strong package of enforcement powers in the Bill, which will promote compliance with the regulatory regime that it ushers in and ensure that providers are held to account. Ofcom will be given robust powers to use against companies that do not comply with their duties under the Bill; it will be able to impose a penalty and/or direct companies to take specific steps to come into compliance. When companies do not comply with such a direction, Ofcom will be able to issue penalties up to £18 million or 10% of qualifying global revenue, which can be considerably more. Ofcom will also be able to apply to the courts for business disruption measures, which we will touch on in a later group. These are court orders that require third parties to withdraw their services from, or block access to, the non-compliant regulated service.
Amendment 33 in the name of the noble Lord, Lord Stevenson of Balmacara, and moved by the noble Lord, Lord Knight, Amendments 182 and 218B in the name of my noble friend Lord Bethell, and government Amendments 218A, 284D, 284E and 284F all seek to widen senior management liability. It makes sense if I begin with the government amendments.
Senior managers can already be held criminally liable when they fail to ensure that their company provides Ofcom with the information that it needs to regulate. These amendments create a new offence of failure to comply with a requirement imposed by Ofcom in a confirmation decision, in relation to specific child safety duties. In such cases, the senior manager responsible will be liable and can face up to two years in prison, a fine or both.
My noble friend Lady Harding asked me to comment on whether that has to be conscious or deliberate. The new offence is linked to individuals or senior managers through the existing liability provisions in Clause 178. It does not have to be conscious or deliberate. This will ensure that a relevant senior manager could be held criminally liable for the offence of failing to comply with the steps in a confirmation decision relating to any linked duty, if such an offence was committed with the consent or connivance of the senior manager or was attributable to the neglect of the senior manager.
This approach is modelled on provisions in the Irish Online Safety and Media Regulation Act 2022. It ensures that services know when an action or omission risks criminal liability, while providing sufficient legal certainty to ensure that the offence can be prosecuted. The duties to which this offence will be linked are the child safety duties under Clause 11(3) and duties for pornographic content under Clause 72. This focuses the new offence on harms that are central to child safety, including self-harm content, eating disorder content and pornography. This offence fulfils the Government’s commitment in another place to bring forward an amendment in your Lordships’ House strengthening the Bill’s protections for children. I am grateful for the comments welcoming them.
Amendments 33 and 182 propose creating new offences for non-compliance with duties under the Bill. Attaching criminal liability directly to the duties would create uncertainty about the criminal action. Creating criminal offences that do not prescribe the required act or omission would give rise to real concerns about the quality of the criminal law. I am pleased to say that the Government’s amendments will achieve the core aims of Amendments 33 and 182 while providing sufficient legal certainty to ensure that managers can be prosecuted. I appreciate that my noble friend Lord Bethell has recognised the benefits of this approach in the drafting of his Amendment 218B.
I note that that amendment and my noble friend’s Amendment 182 link criminal liability with a wider range of duties, but it is important that this offence is a targeted one. As such, we have linked the offence with the specific duties which will most effectively focus efforts on child safety, and have intentionally targeted user-to-user sites, which have much greater control than search services over content and will therefore be best placed to prevent children accessing it. My noble friend asked about not linking senior management liability with child sexual exploitation and abuse content. The Bill already contains very strong powers to tackle child sexual exploitation and abuse content, including the power to require companies to use accredited technology to identify, take down and prevent users encountering such content.
Separately, the Bill imposes a requirement to report child sexual exploitation and abuse content to the National Crime Agency. Persons who falsify information in the course of their child sexual exploitation and abuse content reporting duties can be punished with up to two years in prison. This will tackle such exploitation and abuse at each stage, with strong preventive powers to ensure that such content is prevented from being encountered, that it is identified and removed, and that there are criminal sentences for falsifying information in the required reports to the National Crime Agency. At the same time, we are determined to ensure that this offence is as effective as possible in protecting children, while ensuring that it remains workable. We are willing to engage further with concerned parties to ensure that the provisions achieve these aims. I am very happy to discuss this further with my noble friend and other noble Lords if they wish to do so.
We are taking further steps to strengthen the Bill’s enforcement powers by conferring on Ofcom additional powers of seizure from premises, as per Section 50 of the Criminal Justice and Police Act 2001. Ofcom will be able to apply for a warrant to enter and inspect premises. Powers exercisable by warrant include the seizure of documentation and equipment. This amendment will, in certain circumstances, allow a person exercising this power to remove material from the premises, where it is not reasonably practicable to determine whether it is seizeable, in order to determine later whether they are entitled to seize it. Further, it allows a person to seize material where it is not reasonably practicable to separate it from seizeable material.
The amendments tabled by my fellow Northumbrian, the noble Lord, Lord Curry of Kirkharle, do three things. They require Ofcom to issue provisional notices of contravention if there are reasonable grounds for believing that a service or person is not complying with their duties; they provide that Ofcom can decide not to give an enforcement confirmation decision only if it is satisfied that systems and processes are in place to ensure that the service is in compliance; and they remove Ofcom’s discretion to determine how long specific enforcement steps should take. While I certainly accept the helpful spirit in which the noble Lord has tabled these amendments, I worry that they would undermine the discretion of the regulator to manage the enforcement process as it sees fit in each case. This would, in turn, undermine Ofcom’s ability to regulate in a proportionate way and could make Ofcom’s enforcement processes unnecessarily punitive and inflexible.
Instead, the Bill sees Ofcom acting proportionately in performing its regulatory functions, targeting action where it is needed and adjusting timeframes as necessary. Ofcom will have a statutory obligation to produce guidance on its approach to enforcing the new regime the Bill brings in, just as it does with other sectors that it regulates. Ofcom strives to take a consistent approach across these sectors and often combines guidance on its general principles of enforcement. In addition, as the Bill sets out, Ofcom may draw on guidelines it has produced under Section 392 of the Communications Act which relate to the amount of penalties. These examples of existing enforcement guidance illustrate Ofcom’s experience as a regulator in providing such enforcement guidance. Ofcom is well placed to produce clear and effective guidance to help businesses understand enforcement.
Amendment 219 in the name of the noble Lord, Lord Knight of Weymouth, seeks to impose liability on a provider where a company providing regulated services on its behalf does not comply with the duties in the Bill. The Bill sets out which services will need to comply with duties and makes it clear in Clause 198 that duties fall on the entity with control over the regulated service. Such entities are best placed to keep users safe online, as they can accurately assess risk and put in place systems and processes to minimise harm. At the same time, Ofcom can hold a parent and subsidiary company jointly responsible for the actions of a company if the parent company has sufficient control over the subsidiary. Under Amendment 219, the provider would be liable regardless of whether it has control over the service in question. That would impose an unreasonable burden on businesses and cause confusion regarding which companies are required to comply with the duties in the Bill.
The second group of amendments, in the name of my noble friend Lord Bethell, comprises Amendments 220A to 220C, which address the timing, nature and content of guidance that Ofcom must produce on its approach to enforcement. This guidance is important to ensure that companies are clear about Ofcom’s processes. The amendments would prescribe the details that Ofcom should include in the guidance. To ensure the guidance is effective, Ofcom must retain the discretion to include the information which it considers relevant, drawing on its long experience as a regulator. As I say, we will come to debate later the business disruption measures for which Ofcom will be given the power to apply to the courts.
Finally, government Amendment 284B is a technical amendment providing extraterritorial application for the enforcement of civil proceedings in relation to a requirement on providers to publish details of enforcement actions. Together, the Bill’s suite of targeted, proportionate enforcement powers, further strengthened by the government amendments to which I have just spoken, will ensure that companies are held accountable. I hope that that brings a bit of clarity to noble Lords. I commend the amendment standing in my name and invite noble Lords not to press theirs.
Lord Knight of Weymouth (Lab)

My Lords, this discussion has been very useful. The noble Baroness, Lady Fox, as ever, made an interesting and thoughtful philosophical rumination. I hope that what she has just heard from the Minister around it applying to quite specific child safety duties gave her some comfort that this was not some kind of sweep-all measure that would result in lots of people being banged up.

The government amendments are tighter than those in the name of the noble Lord, Lord Bethell. In the end, that is the judgment that we all have to make between now and when we finish our consideration of the Bill. I agree with the noble Baroness, Lady Fox, that there are dangers attached to this: that platforms will choose just to exclude children altogether and that that may infringe on some of their rights. That is why we have to get this balance right. It ultimately has to be proportionate.

We have to develop trust in Ofcom to use its powers flexibly and proportionately. I have previously said some of the things that I think are needed in order to build our trust in Ofcom, in respect of transparency and parliamentary scrutiny and so on. I think that the noble Lord, Lord Curry, is right, from his experience, that the noble Lord, Lord Grade, and his colleagues will need to be quick, decisive and tough in using those powers proportionately in order to make these platforms, particularly the large, well-resourced and powerful ones, respond. Listening to the noble Baroness, Lady Harding, I reflected on when I was a senior executive of a largeish corporation a few years ago. I was in post when the anti-bribery and corruption Act, the Data Protection Act and the gender pay gap regulations all came in, and they made the senior executives—of the company I was in, anyway—sit up, take notice and change some behaviours. These things allow corporations to act according to the public interest and to adjust behaviour, but without it being disproportionate.

I say to the Minister that the fact that, for example, under the Bribery Act you could be imprisoned on the basis of decisions made in your supply chain was significant. We had to be mindful of our whole supply chain to ensure that there was no corruption going on throughout, which is very different to the judgment the Minister is making on the supply chain in this system. I was grateful to the noble Lord, Lord Clement-Jones, for reminding us of the masterful Mastodon briefing; the way in which that technology is showing different ways in which things can be done to avoid aspects of regulation is another reason to think further about the spirit of Amendment 219 as we move to Report.

21:45
When we come to Part 10 and the enforcement section—or perhaps before then privately—it would be really useful, for my sake if for no one else’s, to clarify who the senior manager is. Does the senior manager have to be UK-based for these powers to be used? What happens with all the US companies and those based in parts of eastern Europe that do not have assets or people here, yet the harm extends to users here? How does senior manager liability work in that context? With that, I am happy to withdraw Amendment 33 and look forward to where we go next.
Amendment 33 withdrawn.
Amendment 33A not moved.
Amendment 33B
Moved by
33B: After Clause 11, insert the following new Clause—
“Adult risk assessment duties
(1) This section sets out the duties about adult risk assessments which apply in relation to all Category 1 services.
(2) A duty to carry out a suitable and sufficient assessment of the risk of an adult user encountering by means of the service content which is harmful to adults taking into account any relevant risk profile and to keep that assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question, or before making any significant change to any aspect of a service’s design or operation including changes to any user empowerment tools.”
Member’s explanatory statement
This amendment requires Category 1 services to assess the risk of harm to adults arising from the operation of their services.
Lord McNally (LD)

My Lords, as a former Deputy Leader of this House, if I were sitting on the Front Bench, I would have more gumption than to try to start a debate only 10 minutes before closing time. But I realise that the wheels grind on—perhaps things are no longer as flexible as they were in my day—so noble Lords will get my speech. The noble Lord, Lord Grade, who is at his post—it is very encouraging to see the chair of Ofcom listening to this debate—and I share a love of music hall. He will remember Eric Morecambe saying that one slot was like the last slot at the Glasgow Empire on a Friday night. That is how I feel now.

A number of references have been made to those who served on the Joint Committee and what an important factor it has been in their thinking. I have said on many occasions that one of the most fulfilling times of my parliamentary life was serving on the Joint Committee for the Communications Act 2003. The interesting thing was that we had no real idea of what was coming down the track as far as the internet was concerned, but we did set up Ofcom. At that time, a lot of the pundits and observers were saying, “Murdoch’s lawyers will have these government regulators for breakfast”. Well, they did not. Ofcom has turned into a regulator such that—and at some stages this has slightly worried me—for almost any problem facing the Government, they say, “We’ll give it to Ofcom”. It has certainly proved that it can regulate across a vast area and with great skill. I have every confidence that the noble Lord, Lord Grade, will take that forward.

Perhaps it is to do with the generation I come from, but I do not have this fear of regulation or government intervention. In some ways, the story of my life is that of government intervention. If I am anybody’s child, I am Attlee’s child—not just because of the reforms of the Labour Party, but the reforms of the coalition Government, the Butler Education Act and the bringing in of the welfare state. So I am not afraid of government and Parliament taking responsibility in addressing real dangers.

In bringing forward this amendment, along with my colleague the noble Lord, Lord Lipsey, who cannot be here today, I am referring to legislation that is 20 years old. That is a warning to newcomers; it could be another 20 years before parliamentary time is found for a Bill of this complexity, so we want to be sure that we get its scope right.

The Minister said recently that the Bill is primarily a child safety Bill, but it did not start off that way. Five years ago, the online harms White Paper was seen as a pathfinder and trailblazer for broader legislation. Before we accept the argument that the Bill is now narrowed down to more specific terms, we should think about whether there are other areas that still need to be covered.

These amendments are in the same spirit as those in the names of the noble Baronesses, Lady Stowell, Lady Bull, and Lady Featherstone. We seek to reinstate an adult risk assessment duty because we fear that the change in title signals a reduction in scope and a retreat from the protections which earlier versions of the Bill intended to provide.

It was in this spirit, and to enable us to get ahead of the game, that in 2016 I proposed a Private Member’s Bill on this subject: the Online Harms Reduction Regulator (Report) Bill, which asked Ofcom to publish, in advance of the anticipated legislation, assessments of what action was needed to reduce harm to users and wider society from social networks. I think we can all agree that, if that work had been done in advance of the main legislation, such evidence would be very useful now.

I am well aware that there are those who, in the cause of some absolute concepts of freedom, believe that to seek to broaden the scope of the Bill takes us into the realms of the nanny state. But part of the social contract which enables us to survive in this increasingly complex world is that the ordinary citizen, who is busy struggling with the day-to-day challenges of normal life, does trust his Government and Parliament to keep an anticipatory weather eye on what is coming down the track and what dangers lie therein for the ordinary citizen.

When there have been game-changing advances in technology in the past, it has often taken a long time for societies to adapt and adjust. The noble Lord, Lord Moylan, referred to the invention of the printing press. That caused the Reformation, the Industrial Revolution and around 300 years of war, so we have to be careful how we handle these technological changes. Instagram was founded in 2010, and the iPhone 4 was released then too. One eminent social psychologist wrote:

“The arrival of smartphones rewired social life.”


It is not surprising that liberal democracies, with their essentially 18th-century construct of democracy, struggle to keep up.

The record of big tech in the last 20 years has, yes, been an amazing leap in access to information. However, that quantum leap has come with a social cost in almost every aspect of our lives. Nevertheless, I refuse to accept the premise that these technologies are too global and too powerful in their operation for them not to come within the reach of any single jurisdiction or the rule of law. I am more impressed by efforts by big tech companies to identify and deal with real harms than I am by threats to quit this or that jurisdiction if they do not get the light-touch regulation they want so as to be able to profit maximise.

We know by their actions that some companies and individuals simply do not care about their social responsibilities or the impact of what they sell and how they sell it on individuals and society as a whole. That is why the social contract in our liberal democracies means a central role for Parliament and government in bringing order and accountability into what would otherwise become a jungle. That is why, over the last 200 years, Parliament has protected its citizens from the bad behaviour of employers, banks, loan sharks, dodgy salesmen, insanitary food, danger at work and so on. In this new age, we know that companies large and small, British and foreign, can, through negligence, indifference or malice, drive innocent people into harmful situations. The risks that people face are complex and interlocking; they cannot be reduced to a simple list, as the Government seek to do in Clause 12.

When I sat on the pre-legislative committee in 2003, we could be forgiven for not fully anticipating the tsunami of change that the internet, the world wide web and the iPhone were about to bring to our societies. That legislation did, as I said, establish Ofcom with a responsibility to promote media literacy, which it has only belatedly begun to take seriously. We now have no excuse for inaction or for drawing up legislation so narrowly that it fails to deal with the wide risks that might befall adults in the synthetic world of social media.

We have tabled our amendments not because they will solve every problem or avert every danger but because they would be a step in the right direction and so make this a better Bill.

Baroness Stowell of Beeston (Con)

I am very grateful to the noble Lord, Lord McNally, for namechecking me and the amendments I have tabled with the support of the noble Baronesses, Lady Featherstone and Lady Bull, although I regret to inform him that they are not in this group. I understand where the confusion has come from. They were originally in this group, but as it developed I felt that my amendments were no longer in the right place. They are now in the freedom of expression group, which we will get to next week. What he has just said has helped, because the amendments I am bringing forward are not similar to the ones he has tabled. They have a very different purpose. I will not pre-empt the debate we will have when we get to freedom of expression, but I think it is only proper that I make that clear. I am very grateful to the noble Lord for the trail.

Debate on Amendment 33B adjourned.
House resumed.
House adjourned at 9.59 pm.

Online Safety Bill

Committee (5th Day)
Relevant document: 28th Report from the Delegated Powers Committee
15:37
Debate on Amendment 33B resumed.
Lord Moylan (Con)

My Lords, I will speak to Amendment 155 in my name, and I am grateful for the support of the noble Baroness, Lady Fox of Buckley, and my noble friend Lord Strathcarron. Some of my remarks in Committee last week did not go down terribly well with Members and, in retrospect, I realise that that was because I was the only Member of the Committee that day who did not take the opportunity to congratulate the noble Baroness, Lady Kidron, on her birthday. So at this very late stage—a week later—I make good that deficiency and hope that, in doing so, I will get a more jocular and welcoming hearing than I did last week. I will speak in a similar vein, though on a different topic and part of the Bill.

This amendment relates to Clause 65, which has 12 subsections. I regard the first subsection as relatively uncontroversial; it imposes a duty on all service providers. The effect of this amendment would be to remove all the remaining subsections, which fall particularly on category 1 providers. What Clause 65 does, in brief, is to make it a statutory obligation for category 1 providers to live up to their terms of service. Although it does not seek to specify what the terms of service must be, it does, in some ways, specify how they should be operated once they have been written—I regard that as very odd, and will come back to the reason why.

I say at the outset that I understand the motivation behind this section of the Bill. It addresses the understandable feeling that if a service provider of any sort says that they have terms of service which mean that, should there be complaints, they will be dealt with in a certain way and to a certain timetable and that you will get a response by a certain time, or if they say that they will remove certain material, that they should do what they say they will do in the terms of service. I understand what the clause is trying to do—to oblige service providers to live up to their terms of service—but this is a very dangerous approach.

First of all, while terms of service are a civil contract between the provider and the user, they are not an equal contract, as we all know. They are written for the commercial benefit and advantage of the companies that write them—not just in the internet world; this is generally true—and they are written on a take it or leave it basis. Of course, they cannot be egregiously disadvantageous to the customer or else the customer would not sign up to them; none the less, they are drafted with the commercial and legal advantage of the companies in question. Terms of service can be extreme. Noble Lords may be aware that, if you have a bank account, the terms of service that your bank has, in effect, imposed on you almost certainly include a right for the bank to close your account at any time it wishes and to give no reason for doing so. I regard that as an extreme terms of service provision, but it is common. They are not written as equal contracts between consumers and service providers.

Why, therefore, would we want to set terms of service in statute? That is what this clause does: it makes them enforceable by a regulator under statute. Moreover, why would we want to do it when the providers we are discussing will have, in practice, almost certainly drafted their terms of service under the provisions of a foreign legal system, which we are then asking our regulator to ensure are enforced? My objection is not to finding a way of requiring providers to live up to the terms of service they publish—indeed, the normal route for doing so would be a civil claim—but to the method of doing so set out in this section of the Bill.

We do not use this method with other terms of service features. For example, we do not have a regulator who enforces terms of service on data protection; we have a law that says what companies must do to protect data, and then we expect them to draft terms of service, and to conduct themselves in other ways, that are compatible with that law. We do not make the terms of services themselves enforceable through statute and regulation, yet that is what this Bill does.

When we look at the terms of service of the big providers on the internet—the sorts of people we have in mind for the scope of the Bill—we find that they give themselves, in their terms of service, vast powers to remove a wide range of material. Much of that would fall—I say this without wanting to be controversial—into the category of “legal but harmful”, which in some ways this clause is reviving through the back door.

Of course, what could be “harmful” is extremely wide, because it will have no statutory bounds: it will be whatever Twitter or Google say they will remove in their terms of service. We have no control over what they say in their terms of service; we do not purport to seek such control in the Bill or in this clause. Twitter policy, for example, is to take down material that offends protected characteristics such as “gender” and “gender identity”. Now, those are not protected characteristics in the UK; the relevant protected characteristics in the Equality Act are “sex” and “gender reassignment”. So this is not enforcing our law; our regulator will be enforcing a foreign law, even though it is not the law we have chosen to adopt here.

15:45
YouTube policy during the pandemic prohibited material that contradicted the views of health authorities. Even my right honourable friend David Davis was removed for opposing Covid passes, but that was a legitimate political position to take and contribution to make. There is no obligation on the platforms to protect free speech or to have regard to Article 10 of the European Convention on Human Rights. They are not in any sense bound by the European convention; most of them are not in any sense European. I think very strongly that this whole section is very dangerous.
I posit an extreme case that requires a slight exercise of the imagination. Imagine if a Russian platform were to gain a significant presence in the UK. It is not impossible: nobody would have predicted TikTok emerging from China so quickly not very long ago. Imagine the terms of service said, quite in compliance with Russian law, that it would remove any material that included the words “war” and “Ukraine” together; “special military operation” would be all right, but “war” and “Ukraine” would not. Imagine that it was relatively inefficient at doing this and left such material up. Are we not in a position, as a result of this section of the Bill, of obliging Ofcom to seek to enforce that term of its service contract on a Russian platform? How absurd that would be in an extreme case, but the parallel exists with the American and other platforms.
I very much hope that my noble friend will say what I want to say, which is that, yes, there is an issue and we would like to do something. We understand the motivation here, but this is very much the wrong way of going about it. It is inimical to free speech and it leads to absurd conclusions.
Viscount Colville of Culross (CB)

I support Amendment 44. I am pleased that, as part of the new triple shield, the Government have introduced Clause 12 on “User empowerment duties”, which allow users to protect themselves, not just from abusive posts from other users but from whole areas of content. In the Communications and Digital Committee’s inquiry, we had plenty of evidence from organisations representing minorities and people with special characteristics who are unable adequately to protect themselves from the hate they receive online. I am glad that subsections (10) to (12) recognise specific content and users with special characteristics who are targets of abuse and need to be able to protect themselves, but subsection (3) requires that these features should be

“designed to effectively … reduce the likelihood of the user encountering content”

they want to avoid. I am concerned that “effectively” will be interpreted subjectively by platforms in scope and that each will interpret it differently.

At the moment, it will not be possible for Ofcom to assess how thoroughly the platforms have been providing these empowerment tools of protection for users. If the features are to work, there must be an overview of how effective they are being and how well they are working. When the former Secretary of State, Michelle Donelan, was asked about this, she said that there was nothing in this clause to pin an assessment on. It seems to me that the lists in Clause 12 create plenty of criteria on which to hang an assessment.

The new duties in Clause 12 provide for control tools for users against very specific content that is abusive or incites hatred on the basis of race, ethnicity, religion, disability, sex, gender reassignment or sexual orientation. However, this list is not exhaustive. There will inevitably be areas of content for which users have not been given blocking tools, including pornography, violent material and other material that is subject to control in the offline world.

Not only will the present list for such tools need to be assessed for its thoroughness in allowing users to protect themselves from specific harms, but surely the types of harm from which they need to protect themselves will change over time. Ofcom will need regularly to assess where these harms are and make sure that service providers regularly update their content-blocking tools. Without such an assessment, it will be hard for Ofcom and civil society to understand what the upcoming concerns are with the tools.

The amendment would provide a transparency obligation, which would demand that service providers inform users of the risks present on the platform. Surely this is crucial when users are deciding what to protect themselves from.

The assessment should also look for unintended restrictions on freedom of expression created by the new tools. If the tools are overprotective, they could surely create a bubble and limit users’ access to information that they might find useful. For example, the user might want to block material about eating disorders, but the algorithm might interpret that to mean limiting the user’s access to content on healthy lifestyles or nutrition. We are also told that the algorithms do not understand irony and humour. When the filters are used to stop content that is abusive or incites hatred on the basis of users’ particular characteristics, they might also remove artistic, humorous or satirical content.

Repeatedly, we are told that the internet creates echo chambers, where users read only like-minded opinions. These bubbles can create an atmosphere where freedom of expression is severely limited and democracy suffers. A freedom of expression element to the assessment would also, in these circumstances, be critical. We are told that the tech platforms often do not know what their algorithms do and, not surprisingly, that those algorithms often evolve beyond their original intentions. The tools demanded by Clause 12 need to be carefully assessed, both to ensure that they keep up to date with the trends of abuse on the internet and to identify any unintended consequences they might create in curbing freedom of expression.

Throughout the Bill, there is a balancing act between freedom of expression and protection from abuse. The user empowerment tools are potentially very powerful, and neither the service providers, the regulators nor the Government know what their effects will be. It is incumbent on the Government to introduce an assessment to check regularly how the user empowerment duties are working; otherwise, how can they be updated, and how can Ofcom discover what content is being unintentionally controlled? I urge the Minister, in the name of common sense, to ensure that these powerful tools unleashed by the Bill will not be misused or become outdated in a fast-changing digital world.

Baroness Kidron (CB)

My Lords, I thank the noble Lord, Lord Moylan, for his words—I thought I was experiencing time travel there—and am sympathetic to many of the issues that he has raised, although I think that some of the other amendments in the group tackle those issues in a slightly different way.

I support Amendments 44 and 158 in the name of the right reverend Prelate the Bishop of Oxford. Requiring a post-rollout assessment to ensure that the triple shield acts as we are told it will seems to be a classic part of any regulatory regime that is fit for purpose: it needs to assess whether the system is indeed working. The triple shield is an entirely new concept, and none of the burgeoning regulatory systems around the world is taking this approach, so I hope that both the Government and Ofcom welcome this very targeted and important addition to the Bill.

I will also say a few words about Amendments 154 and 218. It seems to me that, in moving away from legal but harmful—which, as a member of the pre-legislative committee, I supported, albeit under certain conditions that have not been met—not enough time and thought have been given to the implications of that. I do not understand, and would be grateful to the Minister if he could help me understand, how Ofcom is to determine whether a company has met its own terms and conditions—by any means, not only by means of a risk assessment.

I want to make a point that the noble Baroness, Lady Healy, made the other day—but I want to make it again. Taking legal but harmful out and having no assessment of whether a company has met its general safety duties leaves the child safety duties as an island. They used to be something that was added on to a general system of safety; now they are the first and only port of call. Again, because of the way that legal but harmful fell out of the Bill, I am not sure whether we have totally understood how the child risk assessments sit without a generally cleaned up or risk-assessed digital environment.

Finally, I will speak in support of Amendment 160, which would have Ofcom say what “adequate and appropriate” terms are. To a large degree, that is my approach to the problem that the noble Lord, Lord Moylan, spoke about: let Parliament and the regulator determine what we want to see—as was said on the data protection system, that is how it is—and let us have minimum standards that we can rightly expect, based on UK law, as the noble Lord suggested.

I am not against the triple shield per se, but it radically replaced an entire regime of assessment, enforcement and review. I think that some of the provisions in this group really beg the Government’s attention, in order to make sure that there are no gaping holes in the regime.

Baroness Fraser of Craigmaddie (Con)

My Lords, I will speak to Amendments 44 and 158 in the name of the right reverend Prelate the Bishop of Oxford. I also note my support for the amendments in the name of the noble Lord, Lord Stevenson of Balmacara, to ensure the minimum standard for a platform’s terms of service. My noble friend Lord Moylan has just given an excellent speech on the reasons why these amendments should be considered.

I am aware that the next group of amendments relates to the so-called user empowerment tools, so it seems slightly bizarre to be speaking to Amendment 44, which seeks to ensure that these user empowerment tools actually work as the Government hope they will, and Amendment 158, which seeks to risk assess whether providers’ terms of service duties do what they say and report this to Ofcom. Now that the Government have watered down the clauses that deal with protection for adults, like other noble Lords, I am not necessarily against the Government’s replacement—the triple shield—but I believe that it needs a little tightening up to ensure that it works properly. These amendments seem a reasonable way of doing just that. They would ensure greater protection for adults without impinging on others’ freedom of expression.

The triple shield relies heavily on companies’ enforcement of their terms of service and on other vaguely worded duties—as the noble Viscount mentioned, the requirement that user empowerment tools be “easily accessible” and “effective”, whatever that means. Unlike with other duties in the Bill, such as those on illegal content and children’s duties, there is no mechanism to assess whether these new measures are working; whether the way companies are carrying out these duties is in accordance with the criteria set out; and whether they are indeed infringing freedom of expression. Risk assessments are vital to doing just that, because they are key to understanding the environment in which services operate. They can reduce bureaucracy by allowing companies to rule out risks which are not relevant to them, and they can increase user safety by revealing new risks, thereby enabling the future-proofing of a regime. Can the Minister give us an answer today as to why risk assessment duties on these two strands of the triple shield—terms of service and user empowerment tools—were removed? If freedom of speech played a part in this, perhaps he could elaborate on why he thinks undertaking a risk assessment is in any way a threat.

Without these amendments, the Bill cannot be said to be a complete risk management regime. Companies will, in effect, be marking their own homework when designing their terms of service and putting their finger in the air when it comes to user empowerment tools. There will be no requirement for them to explain either to Ofcom or indeed to service users the true nature of the harms that occur on their service, nor the rationale behind any decisions they might make in these two fundamental parts of their service.

Since the Government are relying so heavily on their triple shield to ensure protection for adults, to me, not reviewing two of the three strands that make up the triple shield seems like fashioning a three-legged stool with completely uneven legs: a stool that will not stand up to the slightest pressure when used. Therefore, I urge the Minister to look again and consider reinstating these protections in the Bill.

16:00
Baroness Fox of Buckley (Non-Afl)

My Lords, this group of amendments looks at the treatment of legal content accessed by adults. The very fact that Parliament feels that legislation has a place in policing access to legal material is itself worrying. This door was opened by the Government in the initial draft Bill, but, as we have already heard, after a widespread civil liberties backlash against the legal but harmful clauses, we are left with Clause 65. As has been mentioned, I am worried that this clause, and some of the amendments, might well bring back legal but harmful for adults by the back door. One of the weasel words here is “harmful”. As I have indicated before, it is difficult to work out from the groupings when to raise which bit, so I am keeping that for your Lordships until later and will just note that I am rather nervous about the weasel word “harmful”.

Like many of us, I cheered at the removal of the legal but harmful provisions, but I have serious reservations about their replacement with further duties via terms of service, which imposes a duty on category 1 services to have systems and processes in place to take down or restrict access to content, and to ban or suspend users in accordance with terms of service, as the noble Lord, Lord Moylan, explained. It is one of the reasons I support his amendment. It seems to me to be the state outsourcing the grubby job of censorship to private multinational companies with little regard for UK law.

I put my name to Amendment 155 in the name of the noble Lord, Lord Moylan, because I wanted to probe the Government’s attitude to companies’ terms of service. Platforms have no obligation to align their terms of service with freedom of expression under UK law. It is up to them. I am not trying to impose on them what they do with their service users. If a particular platform wishes to say, “We don’t want these types of views on our platform”, fine, that is its choice. But when major platforms’ terms of service, which are extensive, become the basis on which UK law enforces speech, I get nervous. State regulators are to be given the role of ensuring that all types of lawful speech are suppressed online, because the duty applies to all terms of service, whatever they are, regarding the platforms’ policies on speech suppression, censorship, user suspension, bans and so on. This duty is not restricted to so-called harmful content; it is whatever content the platform wishes to censor.

What is more, Clause 65 asks Ofcom to ensure that individuals who express lawful speech are suspended or banned from platforms if in breach of the platforms’ Ts & Cs, and that means limiting those individuals from expressing themselves more widely, beyond the specific speech in question. That is a huge green light to interfere in UK citizens’ freedom of expression, in my opinion.

I stress that I am not interested in interfering in the terms and conditions of private companies, although your Lordships will see later that I have an amendment demanding that they introduce free-speech clauses. That is because of the way we seem to be enacting the law via the terms of service of private companies. They should of course be free to dictate their own terms of service, and it is reasonable that members of the public should know what they are and expect them to be upheld. But that does not justify the transformation of these private agreements into statutory duties—that is my concern.

So, why are we allowing this Bill to ask companies to enforce censorship policies in the virtual public square that do not exist in UK law? When companies’ terms of service permit the suppression of speech, that is up to them, but when they suppress speech far beyond the limitations of speech in UK law and are forced to do so by a government regulator such as Ofcom, are we not in trouble? It means that corporate terms of service, which are designed to protect platforms’ business interests, are trumping case law on free speech that has evolved over many years.

Those terms of service are also frequently in flux, according to fashion or ownership; one only has to look at the endless arguments, which I have yet to understand, about Twitter’s changing terms of service after the Elon Musk takeover. Is Ofcom’s job to follow Elon Musk’s ever-changing terms of service and enforce them on the British public as if they are law?

The terms and conditions are therefore no longer simply a contract between a company and the user; their being brought under statute means that big tech will be exercising public law functions, with Ofcom as the enforcer, ensuring that lawful speech is suppressed constantly, in line with private companies’ terms of service. This is an utter mess and not in any way adequate to protect free speech. It is a fudge by the Government: they were unpopular on “lawful but harmful”, so they have outsourced it to someone else to do the dirty work.

Lord Clement-Jones (LD)

My Lords, it has been interesting to hear so many noble Lords singing from the same hymn sheet—especially after this weekend. My noble friend Lord McNally opened this group by giving us his wise perspective on the regulation of new technology. Back in 2003, as he mentioned, the internet was not even mentioned in the Communications Act. He explained how regulation struggles to keep up and how quantum leaps come with a potential social cost; all that describes the importance of risk assessment of these novel technologies.

As we have heard from many noble Lords today, on Report in the Commons the Government decided to remove the adult safety duties—the so-called “legal but harmful” aspect of the Bill. I agree with the many noble Lords who have said that this has significantly weakened the protection for adults under the Bill, and I share the scepticism many expressed about the triple shield.

Right across the board, this group of amendments, with one or two exceptions, rightly aims to strengthen the terms of service and user empowerment duties in the Bill in order to provide a greater baseline of protection for adults, without impinging on others’ freedom of speech, and to reintroduce some risk-assessment requirement on companies. The new duties will clearly make the largest and riskiest companies expend more effort on enforcing their terms of service for UK users. However, the Government have not yet presented any modelling on what effect this will have on companies’ terms of service. I have some sympathy with what the noble Lord, Lord Moylan, said: the new duties could mean that terms of service become much longer and lawyered. This might have an adverse effect on freedom of expression, leading to the use of excessive takedown measures rather than looking at other more systemic interventions to control content such as service design. We heard much the same argument from the noble Baroness, Lady Fox. They both made a very good case for some of the amendments I will be speaking to this afternoon.

On the other hand, companies that choose to do nothing will have an easier life under this regime. Faced with stringent application of the duties, companies might make their terms of service shorter, cutting out harms that are hard to deal with because of the risk of being hit with enforcement measures if they do not. Therefore, far from strengthening protections via this component of the triple shield, the Bill risks weakening them, with particular risks for vulnerable adults. As a result, I strongly support Amendments 33B and 43ZA, which my noble friend Lord McNally spoke to last week at the beginning of the debate on this group.

Like the noble Baroness, Lady Kidron, I strongly support Amendments 154, 218 and 160, tabled by the noble Lord, Lord Stevenson, which would require regulated services to maintain “adequate and appropriate” terms of service, including provisions covering the matters listed in Clause 12. Amendment 44, tabled by the right reverend Prelate the Bishop of Oxford and me, inserts a requirement that services to which the user empowerment duties apply

“must make a suitable and sufficient assessment of the extent to which they have carried out the duties in this section including in each assessment material changes from the previous assessment such as new or removed user empowerment features”.

The noble Viscount, Lord Colville, spoke very well to that amendment, as did the noble Baronesses, Lady Fraser and Lady Kidron.

Amendment 158, also tabled by me and the right reverend Prelate, inserts a requirement that services

“must carry out a suitable and sufficient assessment of the extent to which they have carried out the duties under sections 64 and 65 ensuring that assessment reflects any material changes to terms of service”.

That is a very good way of meeting some of the objections that we have heard to Clause 65 today.

These two amendments focus on risk assessment because the new duties do not have an assessment regime to work out whether they work, unlike the illegal content and children’s duties, as we have heard. Risk assessments are vital to understanding the environment in which the services are operating. A risk assessment can reduce bureaucracy by allowing companies to rule out risks which are not relevant to them, and it can increase user safety by revealing new risks and future-proofing a regime.

The Government have not yet provided, in the Commons or in meetings with Ministers, any proper explanation of why risk assessment duties have been removed along with the previous adult safety duties, and they have not explained in detail why undertaking a risk assessment is in any way a threat to free speech. They are currently expecting adults to manage their own risks, without giving them the information they need to do so. Depriving users of basic information about the nature of harms on a service prevents them taking informed decisions as to whether they want to be on it at all.

Without these amendments, the Bill cannot be said to be a complete risk management regime. There will be no requirement to explain to Ofcom or to users of a company’s service the true nature of the harms that occur on its service, nor the rationale behind the decisions made in these two fundamental parts of the service. This is a real weakness in the Bill, and I very much hope that the Minister will listen to the arguments being made this afternoon.

Baroness Merron (Lab)

My Lords, I thank noble Lords from all sides of the House for their contributions and for shining a light on the point the noble Lord, Lord Clement-Jones, made near the end of his remarks about the need to equip adults with the tools to protect themselves.

It is helpful to have these amendments, because they give the Minister the opportunity to accept—as I hope he will—a number of the points raised. It seems a long time since the noble Lord, Lord McNally, introduced this group, but clearly it has given us all much time to reflect. I am sure we will see the benefits of that in the response from the Minister. Much of the debate on the Bill has focused on child safety and general practicalities, but this group helpfully allows us to focus on adults and the operation of the Government’s replacement for the legal but harmful section of the Bill. As the noble Baroness, Lady Fraser, rightly said, perhaps some tightening up of the legislation before us would be helpful. These amendments give us that chance.

16:15
My noble friend Lord Lipsey has put forward a number of amendments, which helpfully focus on the whole area of adult risk assessments, which were required under the previous iteration of the Bill but have since been drastically watered down. I would be grateful if the Minister could give some explanation as to why we find ourselves in that situation, and perhaps take the opportunity to pick up a number of the points raised in the amendments.
Quite a lot of the debate has focused on the amendments put forward in the name of the right reverend Prelate the Bishop of Oxford. These amendments take a somewhat different approach, because they require service providers to assess the extent to which their user empowerment tools are meeting the obligations laid out in Clause 12. The noble Viscount, Lord Colville, in his helpful remarks, said that it was right to keep up to date with the trends in abuse. This is a point that has come up repeatedly in our discussion: the need to make sure that this legislation is entirely fit for purpose and is able to move with the kind of changes that he referred to.
My noble friend Lord Stevenson has four very helpful amendments in this group, which focus on the minimum standards in platforms’ terms of service. This is an area we began to probe during a debate last week, where the answer seemed to be that, because terms of service are already complicated, we should not add to them. The issue here is really how we get the terms of service in the right place. All these amendments, again, take us there.
I was interested in the comments by the noble Lord, Lord Moylan, about enforceability, but again, on the issue of terms of service, the problem for me is inconsistency. We should seek to bring consistency as well as usefulness and applicability into those terms of service.
We will come on to broader amendments about user empowerment tools in the next group, but there clearly is a gap between what the Government have promised adult users, and what they are likely to end up with when the new regime is fully operational. I hope we will hear from the Minister how that gap may be closed.
I listened with great interest to the noble Baroness, Lady Fox. It is important to say that the issue here is whether algorithms should power the amount and nature of materials that come the way of users. The amendments seek to assist users to have that control, not just to be at the mercy of algorithms. It is not about individual pieces of content but about what people can have control over. The amendments are useful in that respect. We know that there is much legal content which carries a risk of harm to adults, particularly vulnerable adults, who are not actually helped by the Bill. We need confidence that filters and other empowerment tools will make a genuine difference.
I hope that the Minister will accept that a number of these amendments are particularly helpful in strengthening the Bill, and that he will find a way to accept that form of strengthening.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

I am very grateful to the noble Lords who have spoken on the amendments in this group, both this afternoon and last Tuesday evening. As this is a continuation of that debate, I think my noble friend Lord Moylan is technically correct still to wish the noble Baroness, Lady Kidron, a happy birthday, at least in procedural terms.

We have had a very valuable debate over both days on the Bill’s approach to holding platforms accountable to their users. Amendments 33B, 41A, 43ZA, 138A and 194A in the names of the noble Lords, Lord Lipsey and Lord McNally, and Amendment 154 in the name of the noble Lord, Lord Stevenson of Balmacara, seek to bring back the concept of legal but harmful content and related adult risk assessments. They reintroduce obligations for companies to consider the risk of harm associated with legal content accessed by adults. As noble Lords have noted, the provisions in the Bill to this effect were removed in another place, after careful consideration, to protect freedom of expression online. In particular, the Government listened to concerns that the previous legal but harmful provisions could create incentives for companies to remove legal content from their services.

In place of adult risk assessments, we introduced new duties on category 1 services to enable users themselves to understand how these platforms treat different types of content, as set out in Clauses 64 and 65. In particular, this will allow Ofcom to hold them to account when they do not follow through on their promises regarding content they say that they prohibit or to which they say that they restrict access. Major platforms already prohibit much of the content listed in Clause 12, but these terms of service are often opaque and not consistently enforced. The Bill will address and change that.

I would also like to respond to concerns raised through Amendments 41A and 43ZA, which seek to ensure that the user empowerment categories cover the most harmful categories of content to adults. I reassure noble Lords that the user empowerment list reflects input from a wide range of interested parties about the areas of greatest concern to users. Platforms already have strong commercial incentives to tackle harmful content. The major technology companies already prohibit most types of harmful and abusive content. It is clear that most users do not want to see that sort of content and most advertisers do not want their products advertised alongside it. Clause 12 sets out that providers must offer user empowerment tools with a specified list of content to the extent that it is proportionate to do so. This will be based on the size or capacity of the service as well as the likelihood that adult users will encounter the listed content. Providers will therefore need internally to assess the likelihood that users will encounter the content. If Ofcom disagrees with the assessment that a provider has made, it will have the ability to request information from providers for the purpose of assessing compliance.

Amendments 44 and 158, tabled by the right reverend Prelate the Bishop of Oxford, seek to place new duties on providers of category 1 services to produce an assessment of their compliance with the transparency, accountability, freedom of expression and user empowerment duties as set out in Clauses 12, 64 and 65 and to share their assessments with Ofcom. I am sympathetic to the aim of ensuring that Ofcom can effectively assess companies’ compliance with these duties. But these amendments would enable providers to mark their own homework when it comes to their compliance with the duties in question. The Bill has been designed to ensure that Ofcom has responsibility for assessing compliance and that it can obtain sufficient information from all regulated services to make judgments about compliance with their duties. The noble Baroness, Lady Kidron, asked about this—and I think the noble Lord, Lord Clement-Jones, is about to.

Lord Clement-Jones (LD)

I hope the Minister will forgive me for interrupting, but would it not be much easier for Ofcom to assess compliance if a risk assessment had been carried out?

Lord Parkinson of Whitley Bay (Con)

I will come on to say a bit more about how Ofcom goes about that work.

The Bill will ensure that providers have the information they need to understand whether they are in compliance with their duties under the Bill. Ofcom will set out how providers can comply in codes of practice and guidance that it publishes. That information will help providers to comply, although they can take alternative action if they wish to do so.

The right reverend Prelate’s amendments also seek to provide greater transparency to Ofcom. The Bill’s existing duties already account for this. Indeed, the transparency reporting duties set out in Schedule 8 already enable Ofcom to require category 1, 2A and 2B services to publish annual transparency reports with relevant information, including about the effectiveness of the user empowerment tools, as well as detailed information about any content that platforms prohibit or restrict, and the application of their terms of service.

Amendments 159, 160 and 218, tabled by the noble Lord, Lord Stevenson, seek to require user-to-user services to create and abide by minimum terms of service recommended by Ofcom. The Bill already sets detailed and binding requirements on companies to achieve certain outcomes. Ofcom will set out more detail in codes of practice about the steps providers can take to comply with their safety duties. Platforms’ terms of service will need to provide information to users about how they are protecting users from illegal content, and children from harmful content.

These duties, and Ofcom’s codes of practice, ensure that providers take action to protect users from illegal content and content that is harmful to children. As such, an additional duty to have adequate and appropriate terms of service, as envisaged in the amendments, is not necessary and may undermine the illegal and child safety duties.

I have previously set out why we do not agree with requiring platforms to set terms of service for legal content. In addition, it would be inappropriate to delegate this much power to Ofcom, which would in effect be able to decide what legal content adult users can and cannot see.

Amendment 155, tabled by my noble friend Lord Moylan, seeks to clarify whether and how the Bill makes the terms of service of foreign-run platforms enforceable by Ofcom. Platforms’ duties under Clause 65 apply only to the design, operation and use of the service in the United Kingdom and to UK users, as set out in Clause 65(11). Parts or versions of the service which are used in foreign jurisdictions—

Baroness Fox of Buckley (Non-Afl)

On that, in an earlier reply the Minister explained that platforms already remove harmful content because it is harmful and because advertisers and users do not like it, but could he tell me what definition of “harmful” he thinks he is using? Different companies will presumably have a different interpretation of “harmful”. How will that work? It would mean that UK law will require the removal of legal speech based on a definition of harmful speech designed by who—will it be Silicon Valley executives? This is the problem: UK law is being used to implement the removal of content based on decisions that are not part of UK law but with implications for UK citizens who are doing nothing unlawful.

Lord Parkinson of Whitley Bay (Con)

The noble Baroness’s point gets to the heart of the debate that we have had. I talked earlier about the commercial incentive that there is for companies to take action against harmful content that is legal which users do not want to see or advertisers do not want their products to be advertised alongside, but there is also a commercial incentive to ensure that they are upholding free speech and that there are platforms on which people can interact in a less popular manner, where advertisers that want to advertise products legally alongside that are able to do so. As with anything that involves the market, the majority has a louder voice, but there is room for innovation for companies to provide products that cater to minority tastes within the law.

16:30
Lord Moylan (Con)

My Lords, my noble friend has explained clearly how terms of service would normally work, which is that, as I said myself, a business might write its own terms of service to its own advantage but it cannot do so too egregiously or it will lose customers, and businesses may aim themselves at different customers. All this is part of normal commercial life, and that is understood. What my noble friend has not really addressed is the question of why uniquely and specifically in this case, especially given the egregious history of censorship by Silicon Valley, he has chosen to put that into statute rather than leave it as a commercial arrangement, and to make it enforceable by Ofcom. For example, when my right honourable friend David Davis was removed from YouTube for his remarks about Covid passes, it would have been Ofcom’s obligation not to vindicate his right to free speech but to cheer on YouTube and say how well it had done for its terms of service.

Lord Parkinson of Whitley Bay (Con)

Our right honourable friend’s content was reuploaded. This makes the point that the problem at the moment is the opacity of these terms and conditions; what platforms say they do and what they do do not always align. The Bill makes sure that users can hold them to account for the terms of service that they publish, so that people can know what to expect on platforms and have some form of redress when their experience does not match their expectations.

I was coming on to say a bit more about that after making some points about foreign jurisdictions and my noble friend’s Amendment 155. As I say, parts or versions of the service that are used in foreign jurisdictions but not in the UK are not covered by the duties in Clause 65. As such, the Bill does not require a provider to have systems and processes designed to enforce any terms of service not applicable in the UK.

In addition, the duties do not give powers to Ofcom to enforce a provider’s terms of service directly. Ofcom’s role will be focused on ensuring that platforms have systems and processes in place to enforce their own terms of service consistently rather than assessing individual pieces of content.

Requiring providers to set terms of service for specific types of content suggests that the Government view that type of content as harmful or risky. That would encourage providers to prohibit such content, which of course would have a negative impact on freedom of expression, which I am sure is not what my noble friend wants to see. Freedom of expression is essential to a democratic society. Throughout the passage of the Bill, the Government have always committed to ensuring that people can speak freely online. We are not in the business of indirectly telling companies what legal content they can and cannot allow online. Instead, the approach that we have taken will ensure that platforms are transparent and accountable to their users about what they will and will not allow on their services.

Clause 65 recognises that companies, as private entities, have the right to remove content that is legal from their services if they choose to do so. To prevent them doing so, by requiring them to balance this against other priorities, would have perverse consequences for their freedom of action and expression. It is right that people should know what to expect on platforms and that they are able to hold platforms to account when that does not happen. On that basis, I invite the noble Lords who have amendments in this group not to press them.

Lord McNally (LD)

My Lords, in his opening remarks, the Minister referred to the fact that this debate began last Tuesday. Well, it did, in that I made a 10-minute opening speech and the noble Baroness, Lady Stowell, rather elegantly hopped out of this group of amendments; perhaps she saw what was coming.

How that made me feel is perhaps best summed up by what the noble Earl, Lord Howe, said earlier when he was justifying the business for tomorrow. He said that adjournments were never satisfactory. In that spirit, I wrote to the Leader of the House, expressing the grumbles I made in my opening remarks. He has written back in a very constructive and thoughtful way. I will not delay the Committee any longer, other than to say that I hope the Leader of the House would agree to make his reply available for other Members to read. It says some interesting things about how we manage business. It sounds like a small matter but if what happened on Tuesday had happened in other circumstances in the other place, business would probably have been delayed for at least an hour while the usual suspects picked holes in it. If the usual channels would look at this, we could avoid some car crashes in future.

I am pleased that this group of amendments has elicited such an interesting debate, with fire coming from all sides. In introducing the debate, I said that probably the only real advice I could give the Committee came from my experience of being on the pre-legislative scrutiny committee in 2003. That showed just how little we were prepared for the tsunami of new technology that was about to engulf us. My one pleasure was that we were part of forming Ofcom. I am pleased that the chairman of Ofcom, the noble Lord, Lord Grade, has assiduously sat through our debates. I suspect he is thinking that he had better hire some more lawyers.

We are trying to get this right. I have no doubt that all sides of the House want to get this legislation through in good shape and for it to play an important role. I am sure that the noble Lord, Lord Grade, never imagined that he would become a state regulator in the kind of ominous way in which the noble Baroness, Lady Fox, said it. Ofcom has done a good job and will do so in future.

There is a problem of getting definitions right. When I was at the Ministry of Justice, I once had to entertain a very distinguished American lawyer. As I usually did, I explained that I was not a lawyer. He looked at me and said, “Then I will speak very slowly”. There is a danger, particularly in this part of the Bill, of wandering into a kind of lawyer-fest. It is important that we are precise about what powers we are giving to whom. Just to chill the Minister’s soul, I remember being warned as well about Pepper v Hart. What he says at the Dispatch Box will be used to interpret what Parliament meant when it gave this or that power.

The debate we have had thus far has been fully justified in sending a few warning signals to the Minister that it is perhaps not quite right yet. It needs further work. There is a lot of good will on all sides of the House to get it right. For the moment, I beg leave to withdraw my amendment.

Amendment 33B withdrawn.
Clause 12: User empowerment duties
Amendment 34
Moved by
34: Clause 12, page 12, line 9, leave out “if they wish to increase their control over” and insert “to control”
Member’s explanatory statement
This amendment, and another in the name of Baroness Morgan, would require Category 1 providers to ensure that the default options are the safest for users in regard to suicide, self-harm, eating disorders and the abuse and hate content already determined to be harmful as part of the Government’s “triple shield” approach.
Baroness Morgan of Cotes (Con)

My Lords, it is a great pleasure to speak to this group of amendments. As it is the first time I have spoken at this stage of the Bill’s proceedings, I declare my interest as a trustee and founder of the mental health charity the Loughborough Wellbeing Centre, which is relevant to this group. If it is lawyers’ confession time, then I am also going to confess to being a non-practising solicitor. But I can assure those Members of the House who are not lawyers that they do not need to be lawyers or ex-lawyers to understand the very simple proposition at the heart of this group of amendments.

Amendments 34 and 35 are in my name, along with those of the noble Baroness, Lady Parminter, the right reverend Prelate the Bishop of Gloucester and the noble Lord, Lord Griffiths of Burry Port. I am very grateful to them for their support for these amendments, which are also supported by the Football Association, Kick It Out, Beat, YoungMinds, the Royal College of Psychiatrists, the British Psychological Society, Mind, the Mental Health Network, the NHS Confederation, Rethink Mental Illness and Mental Health UK. I thank particularly the Mental Health Foundation for its support with making the points that we will cover in this group.

As we have already heard, and rightly, it is difficult with a Bill of this complexity to debate just one topic in a particular group. Although I have not spoken, it has been a great privilege to listen to your Lordships on earlier groups. We have already talked this afternoon and previously about the Government’s triple-shield approach, which replaced the “legal but harmful” provisions that were taken out of the Bill. We have heard that the triple shield consists of the removal of illegal content, the takedown of material in breach of a platform’s own terms of service—we have just been talking about that—and the provision to adults of greater choice over the content that they see on these platforms. What we are talking about in this group of amendments is that third leg—I had put “limb” but have changed it because of what my noble friend Lady Fraser said—of the triple shield: that user empowerment tools should be on by default.

The change suggested by this proposal would require users on these platforms to flip a switch and choose whether to opt in to some of the most dangerous content available online, rather than receiving it by default. This adopts the Government’s existing approach of giving users choice over what they see but ensures that the default is that they will not be served this kind of material unless they actively choose to see it. The new offence on encouragement to serious self-harm, which the Government have committed to introducing, might form part of the solution here. But we cannot criminalise all the legal content that treads the line between glorification and outright encouragement, and no similar power is proposed to address eating disorder content. I know that others will talk about that, and I pay tribute to the work of Vicky Ford MP in relation to eating disorders; she has been brave enough to share her own experiences of those disorders.

During the Bill’s journey through Parliament, we have heard how vulnerable users often internalise the harmful and hateful content that they see online, which in turn can lead to users deliberately seeking out harmful content in an attempt to normalise self-destructive thoughts and behaviours. We have heard how Molly Russell, for example, viewed tweets which normalised her thoughts on self-harm and suicide; we have also heard how people with eating disorders often get what is called “thinspiration” on platforms such as Tumblr, Instagram and TikTok.

We know from various studies that viewing this content has a negative effect on people’s mental well-being. A study carried out by the University of Oxford found that viewing images of self-harm often encouraged individuals to start self-harming, and concluded:

“Young people who self-harm are likely to use the internet in ways that increases their risk”.


Research by the Samaritans provided similar results, with 77% of respondents answering that they sometimes or often self-harmed in the same or similar ways after viewing self-harm imagery.

16:45
The Mental Health Foundation polled over 3,300 people and found that 67% of the public agreed or strongly agreed that they do not wish to be exposed to harmful content unless they explicitly choose to see it. I think my noble friend the Minister, perhaps not referring to this research, also said this earlier.
As we have also heard from the noble Baroness, Lady Merron, who is not in her place, even if a user is not searching for harmful content, they can be led to it through the algorithms. This includes pro-suicide, pro-self-harm, pro-anorexia and pro-bulimia content. In other words, it is too easy for users to see harmful content on these platforms, and this needs to change.
The Government chose to change from the “legal but harmful” approach to the triple-shield approach. However, the user empowerment tools introduced are neither new nor ground-breaking, because a lot of social media platforms already claim to have filters in place, giving users the ability to hide certain content from their timelines. But many users do not know that they are there, or how to use them properly. As it stands, the Government’s solution will be largely ineffective unless these tools are on by default.
Another point I suspect others will make, which we heard in the briefings before this group, is that vulnerability does not stop at the age of 18, so why would there be a cliff edge where there is protection from known harmful content for those under 18 but not for those over 18? As somebody made clear in the Samaritans briefing, which a number of us attended, people can be sectioned for their own protection after the age of 18. Adults, and particularly the vulnerable, may not be in a position to self-protect, and the trouble with not having the tools on by default is that we are yet again putting the burden to self-protect on the vulnerable and potential victims without taking responsibility as a society for this.
There is of course a wider point here—perhaps not for this debate but I am sure it will come up again—which is that not seeing the content does not mean that it does not exist. We will return to this when we debate content that is violent against women and girls. The noble Baroness, Lady Fox, has already referred to the content set out in subsections (10), (11) and (12) of this clause. Does the fact that it is listed mean we are saying that such harmful content is still OK to circulate on the internet, just because people are not seeing it? I would say this raises broader questions, but it is perhaps not a debate for today.
These two amendments would ensure that platforms’ design involves the safest options being on by default. They are two straightforward, common-sense amendments that, as the noble Viscount, Lord Colville—who is not here now—said, balance the understandable concerns about freedom of speech with safety. They do not stop the publication of this objectionable material, but they offer others, particularly the most vulnerable, a real choice about whether they see it. I would argue that it is our minimum duty to make sure these safety protections are on by default. I beg to move.
Lord Griffiths of Burry Port (Lab)

My Lords, it is a pleasure to be collaborating with the noble Baroness, Lady Morgan. We seem to have been briefed by the same people, been to the same meetings and drawn the same conclusions. However, there are some things that are worth saying twice and, although I will try to avoid a carbon copy of what the noble Baroness said, I hope the central points will make themselves.

The internet simply must be made to work for its users above all else—that is the thrust of the two amendments that stand in our names. Through education and communication, the internet can be a powerful means of improving our lives, but it must always be a safe platform on which to enjoy a basic right. It cannot be said often enough that to protect users online is to protect them offline. To create a strict division between the virtual and the public realms is to risk ignoring how actions online can have life and death repercussions, and that is at the heart of what these amendments seek to bring to our attention.

I was first made aware of these amendments at a briefing from the Samaritans, where we got to know each other. There I heard the tragic accounts of those whose loved ones had taken their own lives due to exposure to harmful content online. I will not repeat their accounts—this is not the place to do that—but understanding only a modicum of their grief made it obvious to me that the principle of “safest option by default” must underline all our decision-making on this.

I applaud the work already done by Members of this House to ensure the safety of young people online. Yet it is vital, as the noble Baroness has said, that we do not create a drop-off point for future users—one in which turning 18 means sudden exposure to the most harmful content lurking online, as it is always there. Those most at risk of suicide due to exposure to harmful content are aged between their late teens and early 20s. In fact, a 2017 inquiry into the suicides of young people found harmful content accessed online in 26% of the deaths of under 20s and 13% of the deaths of 20 to 24 year-olds. It is vital for us to empower users from their earliest years.

In the Select Committee—I see fellow members sitting here today—we have been looking at digital exclusion and the need for education at all levels for those using the internet. Looking for good habits established in the earliest years is the right way to start, but it goes on after that, because the world that young people go on to inhabit in adulthood is one where they are already in control of the internet—if they had the education earlier. Adulthood comes with the freedom to choose how one expresses oneself online—of course it does—but this must not be at the cost of their continuing freedom from the most insidious content that puts their mental health at risk. Much mention has been made of the triple shield and I need not go there again. Its origins and perhaps deficiencies have been mentioned already.

The Center for Countering Digital Hate recently conducted an experiment, creating new social media accounts that showed interest in body image and mental health. This study found that TikTok served suicide-related content to new accounts within 2.6 minutes, with eating disorder content being recommended within 8 minutes. At the very least, these disturbing statistics tell us that users should have the option to opt in to such content, and not have to suffer this harm before later opting out. While the option to filter out certain categories of content is essential, it must be toggled on by default if safety is to be our primary concern.

The principle of safest by default creates not only a less harmful environment, but one in which users are in a position to define their own online experience. The space in which we carry out our public life is increasingly located on a small number of social media platforms—those category 1 platforms already mentioned several times—which everyone, from children to pensioners, uses to communicate and share their experiences.

We must then ensure that the protections we benefit from offline continue online: namely, protection from the harm and hate that pose a threat to our physical and mental well-being. When a child steps into school or a parent into their place of work, they must be confident that those with the power to do so have created the safest possible environment for them to carry out their interactions. This basic confidence must be maintained when we log in to Twitter, Instagram, TikTok or any other social media giant.

Baroness Fox of Buckley (Non-Afl)

My Lords, my Amendment 43 tackles Clause 12(1), which expressly says that the duties in Clause 12 are to “empower” users. My concern is to ensure that, first, users are empowered and, secondly, legitimate criticism around the characteristics listed in Clause 12(11) and (12), for example, is not automatically treated as abusive or inciting hatred, as I fear it could be. My Amendment 283ZA specifies that, in judging content that is to be filtered out after a user has chosen to switch on various filters, the providers act reasonably and pause to consider whether they have “reasonable grounds” to believe that the content is of the kind in question—namely, abusive or problematic.

Anything under the title “empower adult users” sounds appealing—how can I oppose that? After all, I am a fan of the “taking back control” form of politics, and here is surely a way for users to be in control. On paper, replacing the “legal but harmful” clause with giving adults the opportunity to engage with controversial content if they wish, through enhanced empowerment tools, sounds positive. In an earlier discussion of the Bill, the noble Baroness, Lady Featherstone, said that we should treat adults as adults, allowing them to confront ideas with the

“better ethics, reason and evidence”—[Official Report, 1/2/23; col. 735.]

that has been the most effective way to deal with ideas from Socrates onwards. I say, “Hear, hear” to that. However, I worry that, rather than users being in control, there is a danger that the filter system might infantilise adult users and disempower them by hard-wiring into the Bill a duty and tendency to hide content from users.

There is a general weakness in the Bill. I have noted that some platforms are based on users moderating their own sites, which I am quite keen on, but this will be detrimentally affected by the Bill. It would leave users in charge of their own moderation, with no powers to decide what is in, for example, Wikipedia or other Wikimedia projects, which are added to, organised and edited by a decentralised community of users. So I will certainly not take the phrase “user empowerment” at face value.

I am slightly concerned about linguistic double-speak, or at least confusion. The whole Bill is being brought forward in a climate in which language is weaponised in a toxic minefield—a climate of, “You can’t say that”. More nerve-rackingly, words and ideas are seen as dangerous and interchangeable with violent acts, in a way that needs to be unpicked before we pass this legislation. Speakers can be cancelled for words deemed to threaten listeners’ safety—but not physical safety; the opinions are said to be unsafe. Opinions are treated as though they cause damage or harm as viscerally as physical aggression. So lawmakers have to recognise the cultural context and realise that the law will be understood and applied in it, not in the abstract.

I am afraid that the language in Clause 12(1) and (2) shows no awareness of this wider backdrop—it is worryingly woolly and vague. The noble Baroness, Lady Morgan, talked about dangerous content, and all the time we have to ask, “Who will interpret what is dangerous? What do we mean by ‘dangerous’ or ‘harmful’?”. Surely a term such as “abusive”, which is used in the legislation, is open to wide interpretation. Dictionary definitions of “abusive” include words such as “rude”, “insulting” and “offensive”, and it is certainly subjective. We have to query what we mean by the terms when some commentators complain that they have been victims of online abuse, but when you check their timelines you notice that, actually, they have been subject just to angry, and sometimes justified, criticism.

I recently saw a whole thread arguing that the Labour Party’s recent attack ads against the Prime Minister were an example of abusive hate speech. I am not making a point about this; I am asking who gets to decide. If this is the threshold for filtering content, there is a danger of institutionalising safe space echo chambers. It can also be a confusing word for users, because if someone applies a user empowerment tool to protect themselves from abuse, the threshold at which the filter operates could be much lower than they intend or envisage but, by definition, the user would not know what had been filtered out in their name, and they have no control over the filtering because they never see the filtered content.

17:00
The same is true of the Bill’s use of the term “incites hatred”. The word “hatred” in 2023 is highly contentious in the public arena. Indeed, over the last decade Parliament has wrestled with criminal offences around the incitement of hatred, and safeguards were built into legislation in the past, including free speech clauses in controversial areas such as religion. However, it seems to me that in this Bill the word “hatred” is just free floating. A user who understands “incites hatred” to cover really malicious, nasty content might not realise how much other content could be filtered out by the filtering tool if it operates at a low threshold of understanding what inciting hatred is.
It is also the case that inciting hatred around protected characteristics is fraught as an issue offline, let alone online. There are huge rows about whether accusations of Islamophobia and inciting hatred of Muslims are sometimes used to avoid open debates on extreme Islamist views. For example, will images such as the cartoons in the Charlie Hebdo magazine be seen as inciting hatred by some, and will they get filtered out? Similarly, some say that accusations of anti-Semitism—inciting hatred of Jewish people—are used to quell legitimate criticism of Israeli policy. I could go on.
I am not making a comment on any of those issues, other than to note that those who think that using hatred as a basis for filtering online content is easy need to get out a bit more—and that is before we even get to the gender wars. Regularly, those who assert the immutability of biological sex are accused of whipping up hatred against trans people; Joanna Cherry MP has had a talk cancelled by the Stand Comedy Club for just that. Even though the label “transphobic hate speech” directed at Joanna Cherry MP is totally illegitimate, in my opinion, because she is a crusader for women’s rights and lesbian rights, it does not matter whether you and I agree or whether we should have an argument; that is what debate is. We have to ask who from a big tech company will filter out material or decide what is, or is not, hatred. These are the kinds of issues that, we have to note, are difficult.
It is worth asking the Minister: who do the Government envisage will do the filtering? Do online filterers, let alone algorithms or machine learning, have the qualifications to establish what constitutes abuse or hatred? In other professions, from the College of Policing to overzealous HR departments and senior management teams in universities, we have seen overcaution in censoring and banning material under the auspices of hatred, abuse and that weasel word “harm”. Rather than empowering users, will the Bill not empower a new staff team of filterers trained in their own company’s equality, diversity and inclusion norms to use filtering tools at the lowest common denominator, leading to over-removal policies that err on the side of caution in order to comply with regulations? All that Amendment 43 does is to borrow the language of “discussion or criticism” from the free speech clause in the stirring up hatred offences section of the Public Order Act 1986 to try to lift the threshold at which Clause 12(11) and (12) might kick in. It is not ideal, but there is a lot at stake.
I completely oppose those amendments that promote a default setting. They are clearly advocating a censorious approach to legal speech. I rather liked an analogy that I heard the IEA’s Matthew Lesh use recently when he said, “Imagine if, when you go to a bookshop, you have to ask the shop assistant to let you into the special room that contains harmful books”. Of course, material is still accessible, but creating a barrier to accessing certain speech that is perhaps uncomfortable in terms of religion, race or gender also forces people to identify themselves. If you have to say, “Please can I go into the harmful speech section?”, or go into the harmful section of the bookshop, immediately you label yourself as pro-dangerous or pro-harmful material.
If those advocating these provisions are so certain about the righteousness of knowing that this speech is problematic, it would be more honest to simply outlaw it. What is more, the director of Defend Digital Me, Jen Persson, has raised concern that, by considering all adults to be at risk of harm in that way, the Bill will infantilise us, because it assumes that adults are inherently vulnerable. It is a sort of paternalistic Big Brother that we want to avoid in the Bill.
Finally, it is damaging in a democracy to have a proliferation of things that are unsayable. As the Bill reflects, so much debate takes place online, so it seems our responsibility as legislators to encourage a diversity of views to circulate, rather than carelessly or inadvertently to narrow the range of what circulates. On previous groups we mentioned Germany’s infamous legislation, brought in in 2017, which is now facing major opposition at home. Danish free-speech think tank Justitia notes that though
“the German government’s adoption of the NetzDG was a good faith initiative to curb hate online, the law has provided a blueprint for Internet censorship that is being used to target dissent and pluralism.”
I fear that unless we are very careful this section will do the same.
Baroness Parminter (LD)

My Lords, it is a pleasure to follow the noble Baroness, Lady Fox. I am afraid that on this issue, as I am sure she would expect, we profoundly disagree. I am delighted to support the amendment of the noble Baroness, Lady Morgan, and those from my noble friend Lord Clement-Jones, which do the same sort of thing and address the critical issue of what is a proportionate response, respecting the fact that the position for adults is different from that for children. What is a proportionate response, recognising that there is a large cadre of vulnerable people who need help to manage the beneficial but also worrying tool which is social media?

I shall cover only the issues on which I have any degree of competence in this complex field, which is to speak about the importance of this amendment because of the particular nature of eating disorders. I declare an interest as the mother of a young adult who has eating disorders and had them when she was a child. The noble Baroness, Lady Fox, talked about the need to allow adults to use their reason. Let me tell the Committee about people with eating disorders: I would love it if I could get my daughter to be as reasonable as she is when I talk to her about the benefits of proportional representation, where she can beat me hands down, when I try but fail to get her to put food in her mouth.

Eating disorders have two issues of relevance to this debate, and they are why I support the case for the strongest protection for them, the default being that people should have to opt in to have access to harmful content. First, eating disorders are intensely controlling. They suck people in, and they are not just about not eating; they control how they exercise; they control who they see; they are a control mechanism over a person’s whole life. I reject the idea that you can get someone who is controlled, day and night, by an eating disorder to make the decision to opt out of accessing social media content, when we know that people with eating disorders gravitate towards it because it provides them with content that sustains their illness. It provides them with communities of other users— the pro-mia and pro-ana sites, which sound incredibly comforting but are actually communities of people that encourage people, sometimes literally, to starve themselves to death. That controlling nature means that, for me, people having to opt in is the best way forward: it is a controlling illness.

Secondly, eating disorders are a very competitive illness. If you have anorexia, you want to be the thinnest. In the old days, that meant that you would cook food that you would not eat, but you would get your sister to eat it and you would feel good because you were thinner. Of course, with social media, you can now access all these websites where you can see people with nasogastric tubes and see people who are doing much “better”. As the noble Baroness, Lady Morgan, said, in that dreadful phrase, they provide “thinspiration”: people look for thinness and compare themselves to other people. It is an insatiable desire, so the idea that they will voluntarily opt out of that is just away with the fairies.

As I say, we need a proportionate response. I appreciate that people with eating disorders may well choose to opt in, but I think that the state in the first place should require that people have to opt into that choice. We have heard about the various mental health organisations that have made that case, but in thinking about this and talking to Rose about it, I think there is another fundamental reason why it is right that the state should take this approach. As the noble Baroness, Lady Morgan, said, eating disorders can start at a young age, but they can also start after the age of 18. If someone in their mid-20s—or mid-30s or mid-40s—is starting to feel a bit uncomfortable about their body image and starting to get some rather odd views about food but does not yet have an eating disorder, that is the time when, if they get support and do not get encouragement, we might be able to stop them getting sucked into these appalling vortexes of eating disorders. If we have this provision that people have to opt in, they might not see that content which, as has been mentioned, is being pushed at them—the right reverend Prelate the Bishop of Oxford gave examples the other week of how these sites feed you stuff immediately as soon as you start going down this route. If people have to opt in, we might just have that chance of stopping them getting an eating disorder.

Yes, people have to be given access to some of this material in a free society, but it is the role of the state to protect the vulnerable, and the particular nature of eating disorders means that, for me, this amendment is vital.

The Lord Bishop of Oxford

My Lords, it is a privilege to follow the noble Baroness, Lady Parminter, in her very moving and personal speech. I am sorry that I was unable to speak to the previous group of amendments, some of which were in my name, because, due to unavoidable business in my diocese, I was not able to be present when that debate began late last Tuesday. However, it is very good to be able to support this group of amendments, and I hope tangentially to say something also in favour of risk assessment, although I am conscious that other noble Lords have ably made many of the points that I was going to make.

My right reverend friend the Bishop of Gloucester has added her name in support of amendments in this group, and I also associate myself with them—she is not able to be here today. As has been said, we are all aware that reaching the threshold of 18 does not somehow award you with exponentially different discernment capabilities, nor wrap those more vulnerable teenagers in some impermeable cotton wool to protect them from harm.

We are united, I think, in wanting to do all we can to make the online space feel safe and be safe for all. However, there is increasing evidence that people do not believe that it is. The DCMS’s own Public Attitudes to Digital Regulation survey is concerning. The most recent data shows that the number of UK adults who do not feel safe and secure online increased from 38% in November/December 2021 to 45% in June/July 2022. If that trend continues, more than half of UK adults will soon not feel safe and secure online.

It is vital that we protect society’s most vulnerable. When people are vulnerable through mental illness or other challenges, they are surely not able to protect themselves from being exposed to damaging online content by making safe choices, as we have just heard. In making this an opt-in system, we would save lives when people are at a point of crisis.

17:15
In listening to our debates, I sometimes feel that we have not grasped in our deliberations as a Committee the inequality of arms which exists in an individual faced with the entire internet. We have heard analogies this afternoon of a bookshop, and we might think of a supermarket. We might also think of a debate in the Athenian Agora many years ago, when people debated person to person, with an equality of arms and intellect. There is no such equality of arms when it comes to exposure to the internet and social media. I will categorise five things which break this equality down—they all begin with “A”, if your Lordships like alliteration.
The first is advertising. The whole expertise of the advertising industry, commercially driven through applications, places its weight on the individual. The accumulated skill of how to sell more to more people is focused and channelled through all the social media we are concerned with regulating.
The second is access. Through the mobile phone in the 19 year-old’s pocket, and in mine, social media and app producers have access 24/7, in the most private and intimate moments of our lives, to influence and shape our minds. There is no physical boundary of going to a bookshop; it is present wherever we are.
The third “A” is access to our data. The people who are pushing things at us know more about us than the closest members of our families, because they study every purchase. Every click is interpreted. Every inquiry that we search is channelled back into access to our data and used to pressure the individual and to shape their choices in the offline world as well as the online one.
Fourthly, all this information and skill is then channelled algorithmically and driven by the power of algorithms. It is multiplied, and multiplied again, in ways that no consumer fully understands or can measure.
Fifthly, we are now on the threshold of much of the content to which we and others are exposed being energised and powered by artificial intelligence, so that the problems we have seen to date are multiplying and will be multiplied hugely in the coming decade.
I believe that people will look back on the first two decades of the 21st century—the time that the noble Lord, Lord McNally, referred to, from 2003, when we did not envisage what was coming, to this Bill in 2023—as a time of complete madness. They will see it as a time when we created such harmful, toxic environments—not only for children and young people but for adults—that it affected the mental health of a generation profoundly. This Bill is an opportunity to draw a line in the sand and to remedy that. The user empowerment tools and adult risk assessments offer us very important tools. We must take this opportunity and fight back against this inequality of arms. I support these amendments.
Baroness Fraser of Craigmaddie (Con)

My Lords, I contribute to this debate on the basis of my interests as laid out in the register: as chief executive of Cerebral Palsy Scotland; my work with the Scottish Government on people with neurological conditions; and as a trustee of the Neurological Alliance of Scotland. It is an honour to follow the right reverend Prelate, whose point about the inequality people experience in the online world is well made. I want to be clear that when I talk about ensuring online protection for people with disabilities, I do not assume that all adults with disabilities are unable to protect themselves. As the right reverend Prelate and the noble Lord, Lord Griffiths of Burry Port, pointed out, survey after survey demonstrates how offline vulnerabilities translate into the online world, and Ofcom’s own evidence suggests that people with physical disabilities, learning disabilities, autism, mental health issues and others can be classed as being especially vulnerable online.

The Government recognise that vulnerable groups are at greater risk online, because in its previous incarnations, this Bill included greater protection for such groups. We spoke in a previous debate about the removal of the “legal but harmful” provisions and the imposition of the triple shield. The question remains from that debate: does the triple shield provide sufficient protection for these vulnerable groups?

As I have said previously this afternoon, user empowerment tools are the third leg of the triple shield, but they put all the onus on users and no responsibility on the platforms to prevent individuals’ exposure to harm. Amendments 36, 37 and 38A, in the name of the noble Lord, Lord Clement-Jones, seek simply to make “on” the default setting for the proposed user empowerment tools. I do not pretend to understand how, technically, this will happen, but it clearly can, because the Bill already requires platforms to make this the default position to protect children. The default position in those amendments protects all vulnerable people, and that is why I support them—unlike, I fear, Amendment 34 from my noble friend Lady Morgan, which lists specific categories of vulnerable adults. I would prefer that all vulnerable people be protected from being exposed to harm in the first place.

Nobody’s freedom of expression is affected in any way by this default setting, but the overall impact on vulnerable individuals in the online environment would, I assure your Lordships, be significant. Nobody’s ability to explore the internet or to go into those strange rooms at the back of bookshops that the noble Baroness, Lady Fox, was talking about would be curtailed. The Government have already stated that individuals will have the capacity to seek out these tools and turn them on and off, and that they must be easily accessible. So individuals with capacity will be able to find the settings and set them to explore whatever legal content they choose.

However, is it not our duty to remember those who do not have capacity? What about adults with learning difficulties and people at a point of crisis—the noble Baroness, Lady Parminter, movingly spoke about people with eating disorders—who might not be able to turn to those tools due to their affected mental state, or who may not realise that what they are seeing is intended to manipulate? Protecting those users from encountering such content in the first place surely tips the balance in favour of turning the tools on by default.

I am very sad that the noble Baroness, Lady Campbell of Surbiton, cannot be here, because her contribution to this debate would be powerful. But, from her enormous experience of work with disabled people, this is her top priority for the Bill.

In preparing to speak to these amendments, I looked back to the inquiry in the other place into online abuse and the experience of disabled people that was prompted by Katie Price’s petition after the shocking abuse directed at her disabled son Harvey. In April 2019 the Government responded to that inquiry by saying that they were

“aware of the disproportionate abuse experienced by disabled people online and the damage such abuse can have on people’s lives, career and health”—

and the Government pledged to act.

The internet is a really important place for disabled people, and I urge the Government to ensure that it remains a safe place for all of us and to accept these amendments that would ensure the default settings are set to on.

Baroness Kidron (CB)

My Lords, I rise to support the amendments in the name of the noble Baroness, Lady Morgan. I do so somewhat reluctantly, not because I disagree with anything that she said but because I would not necessarily start from here. I want to briefly say three very quick things about that and then move on to Amendments 42 and 45, which are also in this group.

We already have default settings, and we are pretending that this is a zero-sum game. The default settings at the moment are profiling us, filtering us and rewarding us; and, as the right reverend Prelate said in his immensely powerful speech, we are not starting at zero. So I do share the concerns of the noble Baroness, Lady Fox, about who gets to choose—some of us on this side of the debate are saying, “Can we define who gets to choose? Can Parliament choose? Can Ofcom choose? Can we not leave this in the hands of tech companies?” So on that I fully agree. But we do have default settings already, and this is a question of looking at some of the features as well as the content. It is a weakness of the Government’s argument that it keeps coming back to the content rather than the features, which are the main driver of what we see.

The second thing I want to say—this is where I am anxious about the triple shield—is: does not knowing you are being abused mean that you are not abused? I say that as someone who has experienced considerable personal abuse. I have my filter on and I am not on social media, but my children, my colleagues and some of the people I work with around the world do see what is said about me—it is a reputational thing, and for some of them it is a hurtful thing, and that is why I am reluctant in my support. However, I do agree with all the speakers who have said that our duty is to start with those people who are most vulnerable.

I want to mention the words of one of the 5Rights advisers—a 17 year-old girl—who, when invited to identify changes and redesign the internet, said, “Couldn’t we do all the kind things first and gradually get to the horrible ones?” I think that this could be a model for us in this Chamber. So, I do support the noble Baroness.

I want to move briefly to Amendment 42, which would see an arbitrary list of protected characteristics replaced by the Equality Act 2010. This has a lot to do with a previous discussion we had about human rights, and I want to say urgently to the Minister that the offer of the Online Safety Bill is not to downgrade human rights, children’s rights and UK law, but rather to bring forward a smart and comprehensive regime to hold companies accountable for human rights, children’s rights and UK law. We do not want to have a little list of some of our children’s rights or of some of our legislation; we would like our legislation and our rights embedded in the Bill.

I have to speak for Amendment 45. I express my gratitude to the noble Lord, Lord Stevenson, for tabling it. It would require Ofcom, six months after the event, to ask whether children need these user empowerment tools. It is hugely important. I remind the Committee that children have not only rights but an evolving capacity to be out there in the world. As I said earlier, the children’s safety duties have a cliff-edge feel to them. As children go out into the world on the cusp of adulthood, maybe they would like to have some of these user empowerment tools.

17:30
Baroness Buscombe (Con)

My Lords, the noble Baroness, Lady Kidron, said words to the effect that perhaps we should begin by having particular regard for certain vulnerabilities, but we are dealing with primary legislation and this really concerns me. Lists such as in Clause 12 are really dangerous. It is not a great way to write law. We could be with this law for a long time.

I took the Communications Act 2003 through for Her Majesty’s Opposition, and we were doing our absolute best to future-proof the legislation. There was no mention of the internet in that piece of legislation. With great respect to the noble Lord, Lord McNally, with whom I sparred in those days, it was not that Act that introduced Ofcom but a separate Act. The internet was not even mentioned until the late Earl of Northesk introduced an amendment with the word “internet” to talk about the investigative powers Act.

The reality is that we already had Facebook, and tremendous damage being done through it to people such as my daughter. Noble Lords will remember that in the early days it was Oxford, Cambridge, Yale and Harvard; that is how it all began. It was an amazing thing, and we could not foresee what would happen but there was a real attempt to future-proof. If you start having lists such as in Clause 12, you cannot just add on or change. Cultural mores change. This list, which looks great in 2023, might look really odd in about 2027. Different groups will have emerged and say, “Well, what about me, what about me?”.

I entirely agree with the noble Baroness, Lady Fox. Who will be the decider of what is right, what is rude or what is abusive? I have real concerns with this. The Government have had several years to get this right. I say that with great respect to my noble friend the Minister, but we will have to think about these issues a little further. The design of the technology around all this is what we should be imposing on the tech companies. I was on the Communications and Digital Committee in 2020 when that was a key plank of our report, following the inquiry that we carried out and prior to the Joint Committee, then looking at this issue of “legal but harmful”, et cetera. I am glad that was dropped because—I know that I should not say this—when I asked a civil servant what was meant by “harmful”, he said, “Well, it might upset people”.

It is a very subjective thing. This is difficult for the Government. We must do all we can to support the Government in trying to find the right solutions, but I am sorry to say that I am a lawyer—a barrister—and I worry. We are trying to make things right but, remember, once it is there in an Act, it is there. People will use that as a tool. In 2002, at New Scotland Yard, I was introduced to an incredible website about 65 ways to become a good paedophile. Where does that fit in Clause 12? I have not quite worked that out. Is it sex? What is it? We have to be really careful. I would prefer having no list and making it more general, relying on the system to allow us to opt in.

I support my noble friend Lady Morgan’s amendment on this, which would make it easier for people to say, “Well, that’s fine”, but would not exclude people. What happens if you do not fit within Clause 12? Do you then just have to suck it up? That is not a very House of Lords expression, but I am sure that noble Lords will relate to it.

We have to go with care. I will say a little more on the next group of amendments, on anonymity. It is really hard, but what the Government are proposing is not quite there yet.

Baroness Kidron (CB)

That seemed to be provoked by me saying that we must look after the vulnerable, but I am suggesting that we use UK law and the rights that are already established. Is that not better than having a small list of individual items?

Baroness Buscombe (Con)

I agree. The small list of individual items is the danger.

Baroness Bull (CB)

My Lords, I support the noble Baroness, Lady Buscombe, on the built-in obsolescence of any list. It would very soon be out of date.

I support the amendments tabled by the noble Lord, Lord Clement-Jones, and by the noble Baroness, Lady Morgan of Cotes. They effectively seek a similar aim. Like the noble Baroness, Lady Fraser, I tend towards those tabled by the noble Lord, Lord Clement-Jones, because they seem clearer and more inclusive, but I understand that they are trying for the same thing. I also register the support for this aim of my noble friend Lady Campbell of Surbiton, who cannot be here but whom I suspect is listening in. She was very keen that her support for this aim was recorded.

The issue of “on by default” inevitably came up at Second Reading. Then and in subsequent discussions, the Minister reiterated that a “default on” approach to user empowerment tools would negatively impact people’s use of these services. Speaking at your Lordships’ Communications and Digital Committee, on which I sat at the time, Minister Scully went further, saying that the strongest option, of having the settings off in the first instance,

“would be an automatic shield against people’s ability to explore what they want to explore on the internet”.

According to the Government’s own list, this was arguing for the ability to explore content that abuses, targets or incites hatred against people with protected characteristics, including race and disability. I struggle to understand why protecting this right takes precedence over ensuring that groups of people with protected characteristics are, well, protected. That is our responsibility. It is a question of precedence, because switching controls one way is not exactly the same as switching them the other way. It is easy to think so, but the noble Baroness, Lady Parminter, explained very clearly that it is not the same. It is undoubtedly easier for someone in good health and without mental or physical disabilities to switch controls off than it is for those with disabilities or vulnerabilities to switch them on. That is self-evident.

It cannot be right that those most at risk of being targeted online, including some disabled people—not all, as we have heard—and those with other protected characteristics, will have the onus on them to switch on the tools to prevent them seeing and experiencing harm. There is a real risk that those who are meant to benefit from user empowerment tools, those groups at higher risk of online harm, including people with a learning disability, will not be able to access the tools because the duties allow category 1 services to design their own user empowerment tools. This means that we are likely to see as many versions of user empowerment tools as there are category 1 services to which this duty applies.

Given what we know about the nature of addiction and self-harm, which has already been very eloquently explained, it surely cannot be the intention of the Bill that those people who are in crisis and vulnerable to eating disorders or self-harm, for example, will be required to seek and activate a set of tools to turn off the very material that feeds their addiction or encourages their appetite for self-harm.

The approach in the Bill does little to prevent people spiralling down this rabbit hole towards ever more harmful content. Indeed, instead it requires people to know that they are approaching a crisis point, and to have sufficient levels of resilience and rationality to locate the switch and turn on the tools that will protect them. That is not how the irrational or distressed mind works.

So, all the evidence that we have about the existence of harm which arises from mental states, which has been so eloquently set out in introducing the amendments— I refer again to my noble friend Lady Parminter, because that is such powerful evidence—tips the balance in favour, I believe, of setting the tools to be on by default. I very much hope the Minister will listen and heed the arguments we have heard set out by noble Lords across the Committee, and come back with some of his own amendments on Report.

Baroness Fox of Buckley (Non-Afl)

Before the noble Baroness sits down, I wanted to ask for clarification, because I am genuinely confused. When it comes to political rights for adults in terms of their agency, they are rights which we assume are able to be implemented by everyone. But we recognise that in the adult community —this is offline now; I mean in terms of how we understand political rights—there may well be people who lack capacity or are vulnerable, and we take that into account. But we do not generally organise political rights and access to, for example, voting or free speech around the most vulnerable in society. That is not because we are insensitive or inhumane, or do not understand. The moving testimonies we have heard about people with eating disorders and so on are absolutely spot-on accurate. But are we suggesting that the world online should be organised around vulnerable adults, rather than adults and their political rights?

Baroness Bull (CB)

I do not have all the answers, but I do think we heard a very powerful point from the right reverend Prelate. In doing the same for everybody, we do not ensure equality. We need to have varying approaches, in order that everybody has equality of access. As the Bill stands, it says nothing about vulnerable adults. It simply assumes that all adults have full capacity. What these amendments seek to do is recognise that thinking only about children, and then assuming that everybody aged 18 is absolutely able to take care of themselves and, if I may say, “suck it up”, does not reflect the world we live in. We can surely do better than that.

Baroness Healy of Primrose Hill (Lab)

My Lords, I rise briefly to support Amendments 34 and 35, from the noble Baroness, Lady Morgan, and others in this essential group. It is not enough to say the new triple shield will help prevent adults seeing harmful but legal material if they so wish. Having removed “harmful but legal” from the original Bill, there is now a need to ensure that the default options are the safest for users in regard to suicide, self-harm, eating disorders and abuse and hate content.

As the Bill stands, adults can still see the most dangerous content online. Young people over 18 may be especially vulnerable if faced with a torrent of images edited digitally to represent unattainable beauty standards; this can lead to poor body image detrimental to mental health, resulting in shame, anxiety and, in some cases, suicide. As other noble Lords have said, anorexia has the highest mortality rate of any mental health problem. We know pro-anorexia sites are rife online. Vulnerable adults should be protected.

These amendments would make a real difference to the Bill. Changing the user empowerment provisions to require category 1 providers to have the safest options as the default for users would be a straightforward way of increasing the protection of most internet users who do not want to have this material bombard them. It would not overburden the tech companies and could do some good. It would not curtail freedom of speech, as tech-savvy users could easily flip a switch if they wished to opt in to some of the most dangerous content, which will still be available online, rather than receiving it by default.

Even with the Government’s best intentions to prevent encouragement of serious self-harm, we know they cannot criminalise all the legal content that treads the line between glorification and outright encouragement, as the noble Baroness, Lady Morgan, said. As the Communications and Digital Select Committee, on which I now serve, said in its 2021 report,

“the Online Safety Bill should require category 1 platforms to give users a comprehensive toolkit of settings, overseen by Ofcom, allowing users to decide what types of content they see and from whom. Platforms should be required to make these tools easy to find and use. The safest settings should always be the default”.

I hope the Government accept these valuable and simple amendments. They are supported by the Mental Health Foundation, to whom I owe thanks for this briefing, together with many other experts in the field of mental health.

17:45
Baroness Burt of Solihull (LD)

My Lords, this is my first contribution to the Bill, and I feel I need to apologise in advance for my lack of knowledge and expertise in this whole field. In her initial remarks, the noble Baroness, Lady Morgan of Cotes, said, “Don’t worry, because you don’t need to be a lawyer”. Unfortunately, I do not have any expertise in the field of the internet and social media either, so I will be very brief in all of my remarks on the Bill. But I feel that I cannot allow the Bill to go past without at least making a few remarks, as equalities spokesperson for the Lib Dems. The issues are of passionate importance to me, and of course to victims of online abuse, and it is those victims for whom I speak today.

In this group, I will address my remarks to Amendments 34 and 35, in which we have discussed content deemed to be harmful—suicide, self-harm, eating disorders and abuse and hate content—under the triple shield approach, although this content discussion has strayed somewhat during the course of the debate.

Much harmful material, as we have heard, initially comes to the user uninvited. I do not pretend to understand how these algorithms work, but my understanding is that if you open one such item, they click into action, feeding more and more of this kind of content into your feed. The suicide of young Molly Russell is a tragic example of the devastating damage to which these algorithms can contribute. I am glad that the Bill will go further to protect children, but it still leaves adults—some young and vulnerable—without such protection and with the same automatic exposure to harmful content, which algorithms can amplify with engagement and which could have overwhelming impacts on their mental health, as my noble friend Lady Parminter so movingly and eloquently described.

So this amendment means a user would have to make an active, conscious choice to be exposed to such content: an opt-out rather than an opt-in. This has been discussed at length by noble Lords a great deal more versed in the subject than me. But surely the only persons or organisations who would not support this are those who do not have at heart the best interests of the vulnerable users we have been talking about this afternoon. I hope the Minister will confirm in his remarks that the Government do.

Baroness Harding of Winscombe (Con)

My Lords, I had not intended to speak in this debate because I now need to declare an unusual interest, in that Amendment 38A has been widely supported outside this Chamber by my husband, the Member of Parliament for Weston-super-Mare. I am not intending to speak on that amendment but, none the less, I mention it just in case.

I rise to speak because I have been so moved by the speeches, not least the right reverend Prelate’s speech. I would like just to briefly address the “default on” amendments and add my support. Like others, on balance I favour the amendments in the name of the noble Lord, Lord Clement-Jones, but would willingly throw my support behind my noble friend Lady Morgan were that the preferred choice in the Chamber.

I would like to simply add two additional reasons why I ask my noble friend the Minister to really reflect hard on this debate. The first is that children become teenagers, who become young adults, and it is a gradual transition—goodness, do I feel it as the mother of a 16-year-old and a 17-year-old. The idea that on one day all the protections just disappear completely and we require our 18-year-olds to immediately reconfigure their use of all digital tools just does not seem a sensible transition to adulthood to me, whereas the ability to switch off user empowerment tools as you mature as an adult seems a very sensible transition.

Secondly, I respect very much the free speech arguments that the noble Baroness, Lady Fox, made but I do not think this is a debate about the importance of free speech. It is actually about how effective the user empowerment tools are. If they are so hard for non-vulnerable adults to turn off, what hope have vulnerable adults to be able to turn them on? For the triple shield to work and the three-legged stool to be effective, the onus needs to be on the tech companies to make these user empowerment tools really easy to turn on and turn off. Then “default on” is not a restriction on freedom of speech at all; it is simply a means of protecting our most vulnerable.

Lord Clement-Jones (LD)

My Lords, this has been a very thoughtful and thought-provoking debate. I start very much from the point of view expressed by the noble Baroness, Lady Kidron, and this brings the noble Baroness, Lady Buscombe, into agreement—it is not about the content; this is about features. The noble Baroness, Lady Harding, made exactly the same point, as did the noble Baroness, Lady Healy—this is not about restriction on freedom of speech but about a design feature in the Bill which is of crucial importance.

When I was putting together the two amendments that I have tabled, I was very much taken by what Parent Zone said in a recent paper. It described user empowerment tools as “a false hope”, and rightly had a number of concerns about undue reliance on tools. It said:

“There is a real danger of users being overwhelmed and bewildered”.


It goes on to say that

“tools cannot do all the work, because so many other factors are in play—parental styles, media literacy and technological confidence, different levels of vulnerability and, crucially, trust”.

The real question—this is why I thought we should look at it from the other side of things in terms of default—is about how we mandate the use of these user empowerment tools in the Bill for both children and adults. In a sense, my concerns are exactly the opposite of those of the noble Baroness, Lady Fox—for some strange, unaccountable reason.

The noble Baroness, Lady Morgan, the noble Lord, Lord Griffiths, the right reverend Prelate and, notably, my noble friend Lady Parminter have made a brilliant case for their amendment, and it is notable that these amendments are supported by a massive range of organisations. They are all in this area of vulnerable adults: the Mental Health Foundation, Mind, the eating disorder charity Beat, the Royal College of Psychiatrists, the British Psychological Society, Rethink Mental Illness, Mental Health UK, and so on. It is not a coincidence that all these organisations are discussing this “feature”. This is a crucial aspect of the Bill.

Again, I was very much taken by some of the descriptions used by noble Lords during the debate. The right reverend Prelate the Bishop of Oxford said that young people do not suddenly become impervious to content when they reach 18, and he particularly described the pressures as the use of AI only increases. I thought the way the noble Baroness, Lady Harding, described the progression from teenagehood to adulthood was extremely important. There is not some sort of point where somebody suddenly reaches the age of 18 and has full adulthood which enables them to deal with all this content.

Under the Bill as it stands, adult users could still see and be served some of the most dangerous content online. As we have heard, this includes pro-suicide, pro-anorexia and pro-bulimia content. One has only to listen to what my noble friend Lady Parminter had to say to really be affected by the operation, if you like, of social media in those circumstances. This is all about the vulnerable. Of course, we know that anorexia has the highest mortality rate of any mental health problem; the NHS is struggling to provide specialist treatment to those who need it. Meanwhile, suicide and self-harm-related content remains common and is repeatedly implicated in deaths. All Members here who were members of the Joint Committee remember the evidence of Ian Russell about his daughter Molly. I think that affected us all hugely.

We believe now you can pay your money and take your choice of whichever amendment seems appropriate. Changing the user empowerment provisions to require category 1 providers to have either the safest options as default for users or the terms of my two amendments is surely a straightforward way of protecting the vast majority of internet users who do not want this material served to them.

You could argue that the new offence of encouragement to serious self-harm, which the Government have committed to introducing, might form part of the solution here, but you cannot criminalise all the legal content that treads the line between glorification and outright encouragement. Of course, we know the way the Bill has been changed. No similar power is proposed, for instance, to address eating disorder content.

The noble Baroness, Lady Healy, quoted our own Communications and Digital Committee and its recommendations about a comprehensive toolkit of settings overseen by Ofcom, allowing users to decide what types of content they see and from whom. I am very supportive of Amendment 38A from the noble Lord, Lord Knight, which gives a greater degree of granularity about the kind of user, in a sense, that can communicate to users.

Modesty means that of course I prefer my own amendments and I agree with the noble Baronesses, Lady Fraser, Lady Bull and Lady Harding, and I am very grateful for their support. But we are all heading in the same direction. We are all arguing for a broader “by default” approach. The onus should not be on these vulnerable adults in particular to switch them on, as the noble Baroness, Lady Bull, said. It is all about those vulnerable adults and we must, as my noble friend Lady Burt said, have their best interests at heart, and that is why we have tabled these amendments.

18:00
Lord Knight of Weymouth (Lab)

My Lords, this has been one of the most important debates we have had so far in Committee, covering most of the issues in Clause 12—effectively, the replacement of the legal but harmful provisions that were in the draft Bill with the user empowerment tools, introducing the new element of the triple shield, or the three-legged stool as we are now going to describe it thanks to the noble Baroness, Lady Fraser. It is about how we as adults are empowered to protect ourselves from harmful content and, most crucially, the amplification of the harm caused by the systems used on the platforms.

I welcome subsections (4) and (5) of Clause 12, on ease of use and ease of access to the tools. Many platforms already offer these sorts of tools. The noble Lord, Lord Clement-Jones, referred to the Parent Zone research that has been circulated, which talked about a Facebook tool to prevent autoplay of ads. It took Parent Zone’s tech-savvy researcher—not the noble Baroness, Lady Burt—three and a half hours to work out how to turn autoplay off. The research also found that 30% of tools had changed in the last year, so this is an ever-moving target for people to chase after.

The reality is that most of us do not have the time, even if we have the inclination, to deal with all these things. We already have user empowerment tools for unsubscribing from junk emails—and how many of us can be bothered to go through all that all the time? Sometimes I do but sometimes I just have to delete them and move on. We have to manage cookies; sometimes I do and sometimes I do not because I do not have time. That is why we need to look seriously at putting some of these tools on by default, with easily accessible settings to then turn them off if desired.
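To illustrate the design choice being argued for here, a minimal Python sketch follows: user empowerment filters that start on by default, with a single, easily discoverable switch for an adult who actively chooses to turn them off. The category names and the settings structure are hypothetical, invented purely for illustration, and are not taken from the Bill or from any platform's actual tools.

from dataclasses import dataclass, field

# Hypothetical content categories, loosely mirroring those discussed for Clause 12.
CATEGORIES = ("suicide", "self_harm", "eating_disorder", "abuse_and_hate")

@dataclass
class UserEmpowermentSettings:
    """Illustrative per-user settings: every filter starts on by default."""
    filters: dict = field(default_factory=lambda: {c: True for c in CATEGORIES})

    def opt_in_to(self, category: str) -> None:
        """An adult user makes an active, conscious choice to see this content."""
        if category not in self.filters:
            raise ValueError(f"unknown category: {category}")
        self.filters[category] = False

    def is_filtered(self, category: str) -> bool:
        return self.filters.get(category, False)

# A new adult account is protected until the user flips the switch.
settings = UserEmpowermentSettings()
assert settings.is_filtered("self_harm")   # on by default
settings.opt_in_to("self_harm")            # explicit opt-out of the protection
assert not settings.is_filtered("self_harm")

The only point of the sketch is that the default sits with safety while the choice remains with the adult user.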

I therefore support Amendments 34 and 35, tabled by the noble Baroness, Lady Morgan, although I support those from the noble Lord, Lord Clement-Jones, more, which is why I put my name to them before the debate started. What the noble Baroness said about self-harm, suicide and eating disorders is really important. Again, this is less about people never being able to see individual items of content relating to those things and much more about restraining the platforms from bombarding us with similar content, as happened to Molly Russell and others. Here, of course, as many noble Lords have said, we should be mindful of the vulnerability of many young adults and other adults to the same experience that was implicated in Molly’s death.

According to Refuge’s research, which has been circulated, just over one in three UK women have experienced online abuse or harassment on social media, and perpetrators of domestic abuse are increasingly turning to technology as a tool to further their abuse. A briefing sent by the Royal College of Psychiatrists says that, according to NHS England, only 57.5% of 17 to 24-year-olds feel safe using social media in this country. Why not improve their safety as adults by having them opt in to seeing potentially harmful content—this is particularly important to some vulnerable adults with limited capacity to make decisions about internet and social media use—without limiting the freedom of adults to see this content if they want to?

The noble Lord, Lord Clement-Jones, with Amendments 36 and 37, to which I added my name, is essentially going back to some of the debate about safety by design. As the right reverend Prelate set out so powerfully, the platforms are designed to maximise engagement, time spent on their site, data collection and the targeting of advertising. It is about their business model, not our safety. Artificial intelligence has no ethical constraint, and these user empowerment tools allow us to shift the algorithm in our favour, including to make us safer. To toggle them off is to side with the business model regardless of adult safety; to toggle them on is to side with adults having a more pleasant but slightly less engaging experience. Whose side is the Minister on? We look forward to hearing.

Baroness Fox of Buckley (Non-Afl)

Just to clarify, in a way we have reduced this debate to whether the default position should be on or off, although in fact that is only one aspect of this. My concern, and what I maybe spent too long talking about, is what happens if we turn the toggles to “on”. The assumption we keep making is that once they are on, we are safe. The difficulty is that the categories of what is filtered out after turning them on are not necessarily what the user thinks they are. I am simply asking how you get around that; otherwise, we think it is too easy—turn it on or off; press the button. Is it not problematic for us all if, in thinking you are going to stop seeing hate, hate turns out actually to be legitimate and interesting political ideas?

Lord Knight of Weymouth (Lab)

As ever, the noble Baroness is an important voice in bursting our bubble in the Chamber. I continue to respect her for that. It will not be perfect; there is no perfect answer to all this. I am siding with safety and caution rather than a bit of a free-for-all. Sometimes there might be overcaution and aspects of debate where the platforms, the regulator, the media, and discussion and debate in this Chamber would say, “The toggles have got it wrong”, but we just have to make a judgment about which side we are on. That is what I am looking forward to hearing from the Minister.

These amendments are supported on all sides and by a long list of organisations, as listed by the noble Baroness, Lady Morgan, and the noble Lord, Lord Clement-Jones. The Minister has not conceded very much at all so far to this Committee. We have heard compelling speeches, such as those from the noble Baroness, Lady Parminter, that have reinforced my sense that he needs to give in on this when we come to Report.

I will also speak to my Amendment 38A. I pay tribute to John Penrose MP, who was mentioned by the noble Baroness, Lady Harding, and his work in raising concerns about misinformation and in stimulating discussion outside the Chambers among parliamentarians and others. Following discussions with him and others in the other place, I propose that users of social media should have the option to filter out content the provenance of which cannot be authenticated.

As we know, social media platforms are often awash with content that is unverified, misleading or downright false. This can be particularly problematic when it comes to sensitive or controversial topics such as elections, health or public safety. In these instances, it can be difficult for users to know whether the information presented to them is accurate. Many noble Lords will be familiar with the deep-fake photograph of the Pope in a white puffa jacket that recently went viral, or the use of imagery for propaganda purposes following the Russian invasion of Ukraine.

The Content Authenticity Initiative has created an open industry standard for content authenticity and provenance. Right now, tools such as Adobe Photoshop allow users to turn on content credentials to securely attach provenance data to images and any edits then made to those images. That technology has now been adopted by camera manufacturers such as Leica and Nikon, so the technology is there to do some of this to help give us some reassurance.

Amendment 38A would allow users to filter out unverified content and is designed to flag posts or articles that do not come from a reliable source or have not been independently verified by a reputable third party. Users could then choose to ignore or filter out such content, ensuring that they are exposed only to information that has been vetted and verified. This would not only help users to make more informed decisions but help to combat the spread of false information on social media platforms. By giving users the power to filter out unverified content, we can help to ensure that social media platforms are not used to spread harmful disinformation or misinformation.
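To illustrate the kind of user-side filter Amendment 38A envisages, a short Python sketch follows. The Content Authenticity Initiative and the related C2PA standard are real, but the data structures and credential strings below are invented for illustration only; this is not the actual API of any provenance library or platform.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Post:
    author: str
    text: str
    # A verified provenance credential (for example, a C2PA-style content
    # credential) if one was attached; None for content of unknown provenance.
    credential: Optional[str] = None

def filter_unverified(feed: list, hide_unverified: bool) -> list:
    """Return the feed, optionally hiding posts with no verifiable provenance."""
    if not hide_unverified:
        return feed
    return [post for post in feed if post.credential is not None]

feed = [
    Post("news_org", "Photograph with attached content credentials", credential="c2pa:abc123"),
    Post("unknown_user", "Viral image, origin unknown"),
]
# The user has switched the hypothetical "hide unverified content" tool on.
visible = filter_unverified(feed, hide_unverified=True)
assert [post.author for post in visible] == ["news_org"]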

Amendments 42 and 45, in the name of my noble friend Lord Stevenson, are good and straightforward improvements to the Bill, suggesting that Clause 12(11) should include all the protected characteristics in the Equality Act 2010. I listened closely to the argument of the noble Baroness, Lady Buscombe, about listing the various characteristics, but I think it makes much more sense to do what my noble friend is suggesting and just reference the Equality Act. If we chose separately to amend the Equality Act to change protected characteristics, that change would then flow through into this Bill. That seems to be a much more sensible and pragmatic way to proceed. Similarly, in Amendment 45 my noble friend is suggesting that consideration should be given by the Secretary of State as to whether any of these tools should also be available to child users, and we have heard good arguments in favour of that when it comes to dealing with the cliff-edge effect, as described by the noble Baroness, Lady Kidron.

Lastly—and, perhaps the Committee will forgive me, slightly bizarrely—Amendment 283ZA, in the name of the noble Baroness, Lady Fox, to Clause 170(7) in Part 11 invites us to consider, among other things, whether a bot can act reasonably. Given that self-coding bots are now technically possible, I would be fascinated to hear the Minister’s view on whether an autonomous bot can have reason. I asked ChatGPT this question. For the help of the Committee, it replied:

“As an artificial intelligence language model, I do not have consciousness or emotions, and therefore, I do not have a capacity for ‘reason’ in the way that humans understand it. However, I am programmed to use advanced algorithms that allow me to process and analyze large amounts of data, recognize patterns, and provide reasoned responses based on the information available to me. This allows me to simulate a form of reasoning, but it is important to note that it is not the same as human reasoning, as I do not have subjective experiences or personal biases. Ultimately, my abilities are limited to the algorithms and data that have been programmed into my system, and I cannot generate my own subjective experiences or judgments.”


That is the view of the algorithm as to whether or not bots can have reason. I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay (Con)

My Lords, the Government recognise the objectives of the amendments in this group: to strengthen protections for adults online. I hope noble Lords will agree that the Bill will indeed significantly improve the safety of all adult users, particularly those who are more vulnerable.

The user empowerment content features will not be the only measures in the Bill that will protect adults. They will act as a final layer of protection, coming after the duties on illegal content and the requirement on category 1 providers to uphold their terms of service. However, as the Clause 12 duties apply to legal content, we need to tread carefully and not inadvertently restrict free expression.

Amendments 34 and 35 in the name of my noble friend Lady Morgan of Cotes and Amendments 36 and 37 in the name of the noble Lord, Lord Clement-Jones, seek to require category 1 services to have their user empowerment content features in operation by default for adult users. The Government share concerns about users who experience disproportionate levels of abuse online or those who are more susceptible to suicide, self-harm or eating disorder content, but these amendments encroach on users’ rights in two ways.

First, the amendments intend to make the decision on behalf of users about whether to have these features turned on. That is aimed especially at those who might not otherwise choose to use those features. The Government do not consider it appropriate to take that choice away from adults, who must be allowed to decide for themselves what legal content they see online. That debate was distilled in the exchange just now between the noble Lord, Lord Knight, and the noble Baroness, Lady Fox, when the noble Lord said he would err on the side of caution, even overcaution, while he characterised the other side as a free-for-all. I might say that it was erring on the side of freedom. That is the debate that we are having, and should have, when looking at these parts of the Bill.

Secondly, the amendments would amount to a government requirement to limit adults’ access to legal content. That presents real concerns about freedom of expression, which the Government cannot accept.

18:15
Baroness Kidron (CB)

Does the Minister therefore think that the Government condone the current system, where we are inundated algorithmically with material that we do not want? Are the Government condoning that behaviour, in the way that he is saying they would condone a safety measure?

Lord Parkinson of Whitley Bay (Con)

We will come to talk about algorithms and their risks later on. There is an important balance to strike here that we have debated, rightly, in this group. I remind noble Lords that there are a range of measures that providers can put in place—

Lord Stevenson of Balmacara (Lab)

Because of the importance of that point in relation to what the Minister is about to say, we should be clear about this point: is he ruling out the ability to prioritise the needs and requirements of those who are effectively unable to take the decisions themselves in favour of a broader consideration of freedom of expression? It would be helpful for the future of this debate to be clear on that point.

Lord Parkinson of Whitley Bay (Con)

We will come in a moment to the provisions that are in the Bill to make sure that decisions can be taken by adults, including vulnerable adults, easily and clearly. If the noble Lord will allow, I will cover that point.

I was in the middle of reminding noble Lords that there are a range of measures that providers can put in place under these duties, some of which might have an impact on a user’s experience if they were required to be switched on by default. That may include, for example, restricting a user’s news feed to content from connected users, adding to the echo chamber and silos of social media, which I know many noble Lords would join me in decrying. We think it is right that that decision is for individual users to make.

The Bill sets out that the user empowerment content tools must be offered to all adult users and must be easy to access—to go to the point raised just now as well as by my noble friend Lady Harding, and the noble Baroness, Lady Burt, and, as noble Lords were right to remind us, pushed by the noble Baroness, Lady Campbell of Surbiton, who I am pleased to say I have been able to have discussions with separately from this Committee.

Providers will also be required to have clear and accessible terms of service about what tools are offered on their service and how users might take advantage of them. Ofcom will be able to require category 1 services to report on user empowerment tools in use through transparency reports. Ofcom is also bound by the Communications Act 2003 and the public sector equality duty, so it will need to take into account the ways that people with certain characteristics, including people with disabilities, may be affected when performing its duties, such as writing the codes of practice for the user empowerment duties.

Lord Knight of Weymouth (Lab)

I think the Minister is trying to answer the point raised by my noble friend about vulnerable adults. I am interested in the extent to which he is relying on the Equality Act duty on Ofcom then to impact the behaviour of the platforms that it is regulating in respect of how they are protecting vulnerable adults. My understanding is that the Equality Act duty will apply not to the platforms but only to Ofcom in the way that it regulates them. I am unclear how that is going to provide the protection that we want.

Lord Parkinson of Whitley Bay (Con)

That is right. Platforms are not in the public sector, so the public sector equality duty does not apply to them. However, that duty applies to Ofcom, taking into account the ways in which people with certain characteristics can be affected through the codes of practice and the user empowerment duties that it is enforcing. So it suffuses the thinking there, but the duty is on Ofcom as a public sector body.

We talk later in Clause 12(11) of some of the characteristics that are similar in approach to the protected characteristics in the Equality Act 2010. I will come to that again shortly in response to points made by noble Lords.

I want to say a bit about the idea of there being a cliff edge at the age of 18. This was raised by a number of noble Lords, including the noble Lord, Lord Griffiths, my noble friends Lady Morgan and Lady Harding and the noble Baroness, Lady Kidron. The Bill’s protections recognise that, in law, people become adults when they turn 18—but it is not right to say that there are no protections for young adults. As noble Lords know, the Bill will provide a triple shield of protection, of which the user empowerment duties are the final element.

The Bill already protects young adults from illegal content and content that is prohibited in terms and conditions. As we discussed in the last group, platforms have strong commercial incentives to prohibit content that the majority of their users do not want to see. Our terms of service duties will make sure that they are transparent about and accountable for how they treat this type of content.

Lord Clement-Jones (LD)

My Lords, what distinguishes young adults from older adults in what the Minister is saying?

Lord Parkinson of Whitley Bay (Con)

In law, there is nothing. I am engaging with the point that there is no cliff edge. There are protections for people once they turn 18. People’s tastes and risk appetites may change over time, but there are protections in the Bill for people of all ages.

Lord Clement-Jones (LD)

Surely, this is precisely the point that the noble Baroness, Lady Kidron, was making. As soon as you reach 18, there is no graduation at all. There is no accounting for vulnerable adults.

Lord Parkinson of Whitley Bay (Con)

There is not this cliff edge which noble Lords have feared—that there are protections for children and then, at 18, a free-for-all. There are protections for adult users—young adults, older adults, adults of any age—through the means which I have just set out: namely, the triple shield and the illegal content provisions. I may have confused the noble Lord in my attempt to address the point. The protections are there.

Lord Clement-Jones (LD)

There is an element of circularity to what the Minister is saying. This is precisely why we are arguing for the default option. It allows this vulnerability to be taken account of.

Lord Knight of Weymouth (Lab)

Perhaps it would help if the Minister wanted to just set out the difference for us. Clearly, this Committee has spent some time debating the protection for children, which has a higher bar than protection for adults. It is not possible to argue that there will be no difference at the age of 18, however effective the first two elements of the triple shield are. Maybe the Minister needs to think about coming at it from the point of view of a child becoming an adult, and talk us through what the difference will be.

Lord Parkinson of Whitley Bay (Con)

Once somebody becomes an adult in law at the age of 18, they are protected through the triple shield in the Bill. The user empowerment duties are one element of this, along with the illegal content duties and the protection against content prohibited in terms and conditions and the redress through Ofcom.

The legislation delivers protection for adults in a way that preserves their choice. That is important. At the age of 18, you can choose to go into a bookshop and to encounter this content online if you want. It is not right for the Government to make decisions on behalf of adults about the legal content that they see. The Bill does not set a definition of a vulnerable adult because this would risk treating particular adults differently, or unfairly restricting their access to legal content or their ability to express themselves. There is no established basis on which to do that in relation to vulnerability.

Finally, we remain committed to introducing a new criminal offence to capture communications that intentionally encourage or assist serious self-harm, including eating disorders. This will provide another layer of protection on top of the regulatory framework for both adults and children.

Lord Knight of Weymouth (Lab)

I understand all of that—I think—but that is not the regime being applied to children. It is really clear that children have a safer, better experience. The difference between those experiences suddenly happening on an 18th birthday is what we are concerned about.

Lord Clement-Jones (LD)

Before the Minister stands up—a new phrase—can he confirm that it is perfectly valid to have a choice to lift the user empowerment tool, just as it is to impose it? Choice would still be there if our amendments were accepted.

Lord Parkinson of Whitley Bay (Con)

It would be, but we fear the chilling effect of having the choice imposed on people. As the noble Baroness, Lady Fox, rightly put it, one does not know what one has not encountered until one has engaged with the idea. At the age of 18, people are given the choice to decide what they encounter online. They are given the tools to ensure that they do not encounter it if they do not wish to do so. As the noble Lord has heard me say many times, the strongest protections in the Bill are for children. We have been very clear that the Bill has extra protections for people under the age of 18, and it preserves choice and freedom of expression online for adult users—young and old adults.

My noble friend Lady Buscombe asked about the list in Clause 12(11). We will keep it under constant review and may consider updating it should compelling evidence emerge. As the list covers content that is legal and designed for adults, it is right that it should be updated by primary legislation after a period of parliamentary scrutiny.

Amendments 42 and 38A, tabled by the noble Lords, Lord Stevenson of Balmacara and Lord Knight of Weymouth, respectively, seek to change the scope of user empowerment content features. Amendment 38A seeks to expand the user empowerment content features to include the restriction of content the provenance of which cannot be authenticated. Amendment 42 would apply features to content that is abusive on the basis of characteristics protected under the Equality Act 2010.

The user empowerment content list reflects areas where there is the greatest need for users to be offered choice about reducing their exposure to types of content. While I am sympathetic to the intention behind the amendments, I fear they risk unintended consequences for users’ rights online. The Government’s approach recognises the importance of having clear, enforceable and technically feasible duties that do not infringe users’ rights to free expression. These amendments risk undermining this. For instance, Amendment 38A would require the authentication of the provenance of every piece of content present on a service. This could have severe implications for freedom of expression, given its all-encompassing scope. Companies may choose not to have anything at all.

Lord Knight of Weymouth (Lab)

I will try to help the Minister. If the amendment has been poorly drafted, I apologise. It does not seek to require a platform to check the provenance of every piece of content, but content that is certified as having good provenance would have priority for me to be able to see it. In the Bill, I can see or not see verified users. In the same way, I could choose to see or not see verified content.

Lord Parkinson of Whitley Bay (Con)

Thank you. I may be reading the noble Lord’s Amendment 38A excessively critically. I will look at it again. To try to reassure the noble Lord, the Bill already ensures that all services take steps to remove illegal manufactured or manipulated content when they become aware of it. Harmful and illegal misinformation and disinformation is covered in that way.

Amendment 42 would require providers to try to establish on a large scale what is a genuinely held belief that is more than an opinion. In response, I fear that providers would excessively apply the user empowerment features to manage that burden.

A number of noble Lords referred to the discrepancy between the list—

18:30
Lord Stevenson of Balmacara (Lab)

Several times in the Bill—but this is a clear example—the drafters have chosen to impose a different sequence of words from that which exists in statute. The obvious one here is the Equality Act, which we have touched on before. The noble Baroness, Lady Buscombe, made a number of serious points about that. Why have the Government chosen to list, separately and distinctively, the characteristics which we have also heard, through a different route, the regulator will be required to uphold in respect of the statute, while the companies will be looking to the text of the Bill, when enacted? Is that not just going to cause chaos?

Lord Parkinson of Whitley Bay (Con)

The discrepancy comes from the point we touched on earlier. Ofcom, as a public body, is subject to the public sector equality duty and therefore the list set out in the Equality Act 2010. The list at Clause 12(11) relates to content which is abusive, and is therefore for providers to look at. While the Equality Act has established an understanding of characteristics which should be given special protection in law, it is not necessarily desirable to transpose those across. They too are susceptible to the point made by my noble friend Lady Buscombe about lists set out in statute. If I remember rightly, the Equality Act was part of a wash-up at the end of that Parliament, and whether Parliament debated that Bill as thoroughly as it is debating this one is a moot point.

Lord Stevenson of Balmacara (Lab)

The noble Lord made that point before, and I was going to pick him up on it. It really is not right to classify our legislation by whether it came through in a short or long period. We are spending an awfully long time on this but that is not going to make it any better. I was involved in the Equality Act, and I have the scars on my back to prove it. It is jolly good legislation and has stood the test of time. I do not think the point is answered properly by simply saying that this is a better way of doing it. The Minister said that Clause 12(11) was about abuse targets, but Clause 12(12) is about “hatred against people” and Clause 12(13) is a series of explanatory points. These provisions are all grist to the lawyers. They are not trying to clarify the way we operate this legislation, in my view, to the best benefit of those affected by it.

Lord Parkinson of Whitley Bay (Con)

The content which we have added to Clause 12 is a targeted approach. It reflects input from a wide range of interested parties, with whom we have discussed this, on the areas of content that users are most concerned about. The other protected characteristics that do not appear are, for instance, somebody’s marriage or civil partnership status or whether they are pregnant. We have focused on the areas where there is the greatest need for users to be offered the choice about reducing their exposure to types of content because of the abuse they may get from it. This recognises the importance of clear, enforceable and technically feasible duties. As I said a moment ago in relation to the point made by my noble friend Lady Buscombe, we will keep it under review but it is right that these provisions be debated at length—greater length than I think the Equality Bill was, but that was long before my time in your Lordships’ House, so I defer to the noble Lord’s experience and I am grateful that we are debating them thoroughly today.

I will move now, if I may, to discuss Amendments 43 and 283ZA, tabled by the noble Baroness, Lady Fox of Buckley. Amendment 43 aims to ensure that the user empowerment content features do not capture legitimate debate and discussion, specifically relating to the characteristics set out in subsections (11) and (12). Similarly, her Amendment 283ZA aims to ensure that category 1 services apply the features to content only when they have reasonable grounds to infer that it is user empowerment content.

With regard to both amendments, I can reassure the noble Baroness that upholding users’ rights to free expression is an integral principle of the Bill and it has been accounted for in drafting these duties. We have taken steps to ensure that legitimate online discussion or criticism will not be affected, and that companies make an appropriate judgment on the nature of the content in question. We have done this by setting high thresholds for inclusion in the content categories and through further clarification in the Bill’s Explanatory Notes, which I know she has consulted as well. However, the definition here deliberately sets a high threshold. By targeting only abuse and incitement to hatred, it will avoid capturing content which is merely challenging or robust discussion on controversial topics. Further clarity on definitions will be provided by Ofcom through regulatory guidance, on which it will be required to consult. That will sit alongside Ofcom’s code of practice, which will set out the steps companies can take to fulfil their duties.

Baroness Fox of Buckley (Non-Afl)

I appreciate the Minister’s comments but, as I have tried to indicate, incitement to hatred and abuse, despite people thinking they know what those words mean, is causing huge difficulty legally and in institutions throughout the land. Ofcom will have its work cut out, but it was entirely for that reason that I tabled this amendment. There needs to be an even higher threshold, and this needs to be carefully thought through.

Lord Parkinson of Whitley Bay (Con)

But as I think the noble Baroness understands from that reference, this is a definition already in statute, and with which Parliament and the courts are already engaged.

The Bill’s overarching freedom of expression duties also apply to Clause 12. Subsections (4) to (7) of Clause 18 stipulate that category 1 service providers are required to assess the impact on free expression from their safety policies, including the user empowerment features. This is in addition to the duties in Clause 18(2), which requires all user-to-user services to have particular regard to the importance of protecting freedom of expression when complying with their duties. The noble Baroness’s Amendment 283ZA would require category 1 providers to make judgments on user empowerment content to a similar standard required for illegal content. That would be disproportionate. Clause 170 already specifies how providers must make judgments about whether content is of a particular kind, and therefore in scope of the user empowerment duties. This includes making their judgment based on “all relevant information”. As such, the Bill already ensures that the user empowerment content features will be applied in a proportionate way that will not undermine free speech or hinder legitimate debate online.

Amendment 45, tabled by the noble Lord, Lord Stevenson of Balmacara, would require the Secretary of State to lay a Statement before Parliament outlining whether any of the user empowerment duties should be applied to children. I recognise the significant interest that noble Lords have in applying the Clause 12 duties to children. The Bill already places comprehensive requirements on Part 3 services which children are likely to access. This includes undertaking regular risk assessments of such services, protecting children from harmful content and activity, and putting in place age-appropriate protections. If there is a risk that children will encounter harm, such as self-harm content or through unknown or unverified users contacting them, service providers will need to put in place age-appropriate safety measures. Applying the user empowerment duties for child users runs counter to the Bill’s child safety objectives and may weaken the protections for children—for instance, by giving children an option to see content which is harmful to them or to engage with unknown, unverified users. While we recognise the concerns in this area, for the reasons I have set out, the Government do not agree with the need for this amendment.

I will resist the challenge of the noble Lord, Lord Knight, to talk about bots because I look forward to returning to that in discussing the amendments on future-proofing. With that, I invite noble Lords—

Baroness Kidron (CB)

I noted the points made about the way information is pushed and, in particular, the speech of the right reverend Prelate. Nothing in the Government’s response has really dealt with that concern. Can the Minister say a few words about not the content but the way in which users are enveloped? On the idea that companies always act because they have a commercial imperative not to expose users to harmful material, actually, they have a commercial imperative to spread material and engage users. It is well recorded that a lot of that is in fact harmful material. Can the Minister speak a little more about the features rather than the content?

Lord Parkinson of Whitley Bay (Con)

We will discuss this when it comes to the definition of content in the Bill, which covers features. I was struck by the speech by the right reverend Prelate about the difference between what people encounter online, and the analogy used by the noble Baroness, Lady Fox, about a bookshop. Social media is of a different scale and has different features which make that analogy not a clean or easy one. We will debate in other groups the accumulated threat of features such as algorithms, if the noble Baroness, Lady Kidron, will allow me to go into greater detail then, but I certainly take the points made by both the right reverend Prelate and the noble Baroness, Lady Fox, in their contributions.

Baroness Morgan of Cotes (Con)

My Lords, I thank my noble friend very much indeed, and thank all noble Lords who have taken part. As the noble Lord, Lord Knight, said, this has been an important debate—they are all important, of course—but I think this has really got to the heart of parts of the Bill, parts of why it has been proposed in the first place, and some choices the Government made in their drafting and the changes they have made to the Bill. The right reverend Prelate reminded us, as Bishops always do, of the bigger picture, and he was quite right to do so. There is no equality of arms, as he put it, between most of us as internet users and these enormous companies that are changing, and have changed, our society. My noble friend was right—and I was going to pick up on it too—that the bookshop example given by the noble Baroness, Lady Fox, is, I am afraid, totally misguided. I love bookshops; the point is that I can choose to walk into one or not. If I do not walk into a bookshop, I do not see the books promoting some of the content we have discussed today. If they spill out on to the street where I trip over them, I cannot ignore them. This would be even harder if I were a vulnerable person, as we are going to discuss.

Noble Lords said that this is not a debate about content or freedom of expression, but that it is about features; I think that is right. However, it is a debate about choice, as the noble Lord, Lord Clement-Jones, said. I am grateful to each of those noble Lords who supported my amendments; we have had a good debate on both sets of amendments, which are similar. But as the noble Lord, Lord Griffiths, said, some of the content we are discussing, particularly in subsection (10), relating to suicide, pro-self-harm and pro-anorexia content, has literal life or death repercussions. To those noble Lords, and those outside this House, who seem to think we should not worry and should allow a total free-for-all, I say that we are doing so, in that the Government, in choosing not to adopt such amendments, are making an active choice. I am afraid the Government are condoning the serving up of insidious, deliberately harmful and deliberately dangerous content to our society, to younger people and vulnerable adults. The Minister and the Government would be better off if they said, “That is the choice that we have made”. I find it a really troubling choice because, as many noble Lords will know, I was involved in this Bill a number of years ago—there has been a certain turnover of Culture Secretaries in the last couple of years, and I was one of them. I find the Government’s choice troubling, but it has been made. As the noble Lord, Lord Knight, said, we are treating children differently from how we are treating adults. As drafted, there is a cliff edge at the age of 18. As a society, we should say that there are vulnerabilities among adults, as we do in many walks of life; and exactly as the noble Baroness, Lady Parminter, so powerfully said, there are times when we as a House, as a Parliament, as a society and as a state, should say we want to protect people. There is an offer here in both sets of amendments—I am not precious about which ones we choose—to have that protection.

I will of course withdraw the amendment today, because that is the convention of the House, but I ask my noble friend to reflect on the strength of feeling expressed by the House on this today; I think the Whip on the Bench will report as well. I am certain we will return to this on Report, probably with a unified set of amendments. In the algorithmic debate we will return to, the Government will have to explain, in words of one syllable, to those outside this House who worry about the vulnerable they work with or look after, the choice that the Government have made in not offering protections when they could have done, in relation to these enormously powerful platforms and the insidious content they serve up repeatedly.

18:45
Amendment 34 withdrawn.
Amendments 35 to 37 not moved.
The Deputy Chairman of Committees (Baroness McIntosh of Hudnall) (Lab)

I advise the Committee that if Amendment 38 is agreed to, I shall not be able to call Amendment 38A by reason of pre-emption.

Amendment 38

Moved by
38: Clause 12, page 12, line 24, leave out subsection (6)
Member’s explanatory statement
This amendment, along with the other amendment to Clause 12 in the name of Lord Moylan, removes requirements on sites to display, on demand, only the parts of a conversation (or in the case of collaboratively-edited content, only the parts of a paragraph, sentence or article) that were written by “verified” users, and to prevent other users from amending (e.g. improving), or otherwise interacting with, such contributions.
Lord Moylan (Con)

My Lords, I am going to endeavour to be relatively brief. I rise to move Amendment 38 and to speak to Amendments 39, 139 and 140 in this group, which are in my name. All are supported by my noble friend Lord Vaizey of Didcot, to whom I am grateful.

Amendments 38 and 39 relate to Clause 12. They remove subsections (6) and (7) from the Bill; that is, the duty to filter out non-verified users. Noble Lords will understand that this is different from the debate we have just had, which was about content. This is about users and verification of the users, rather than the harm or otherwise of the content. I am sure I did not need to say that, but perhaps it helps to clarify my own thinking to do so. Amendments 139 and 140 are essentially consequential but make it clear that my amendments do not prohibit category 1 services from offering this facility. They make it a choice, not a duty.

I want to make one point only in relation to these amendments. It has been well said elsewhere that this is a Twitter-shaped Bill, but it is trying to apply itself to a much broader part of the internet than Twitter, or things like it. In particular, community-led services like Wikipedia, to which I have made reference before, operate on a totally different basis. The Bill seeks to create a facility whereby members of the public like you and me can, first, say that we want the provider to offer a facility for verifying those who might use their service and, secondly, say that we, as members of the public, want to see material only from those verified accounts. However, the contributors to Wikipedia are not verified, because Wikipedia has no system to verify them, and therefore it would be impossible for Wikipedia, as a category 1 service, to comply with this condition on its current model, which is a non-commercial, non-profit one, as noble Lords know from previous comments. It would not be able to operate this clause; it would have to say either that it is going to require every contributing editor to Wikipedia to be verified first, which would be extremely onerous, or that it would make verification optional, which would be difficult and would lead to the bizarre conclusion that you could open an article on Wikipedia and find that some of its words or sentences were blocked, and you could not read them because those amendments to the article had been made by someone who had not been verified. Of course, putting a system in place to allow that absurd outcome would itself be an impossible burden on Wikipedia.

My complaint—as always, in a sense—about the Bill is that it misfires. Every time you touch it, it misfires in some way because it has not been properly thought through. It is perhaps trying to do too much across too broad a front, when it is clear that the concern of the Committee is much narrower than trying to bowdlerize Wikipedia articles. That is not the objective of anybody here, but it is what the Bill is tending to do.

I will conclude by saying—I invite my noble friend to comment on this if he wishes; I think he will have to comment on it at some stage—that in reply to an earlier Committee debate, I heard him say somewhat tentatively that he did not think that Wikipedia would qualify as a category 1 service. I am not an advocate for Wikipedia; I am just a user. But we need to know what the Government’s view is on the question of Wikipedia and services like it. Wikipedia is the only community-led service, I think, of such a scale that it would potentially qualify as category 1 because of its size and reach.

If the Minister’s view is that Wikipedia would not qualify as a category 1 service—in which case, my amendments are irrelevant because it would not be caught by this clause—then he needs to say so. More than that, he needs to say on what basis it would not qualify as a category 1 service. Would it be on the face of the Bill? If not, would it be in the directions given by the Secretary of State to the regulator? Would it be a question of the regulator deciding whether it was a category 1 service? Obviously, if you are trying to run an operation such as Wikipedia with a future, you need to know which of those things it is. Do you have legal security against being determined as a category 1 provider or is it merely at the whim—that is not the right word; the decision—of the regulator in circumstances that may legitimately change? The regulator may have a good or bad reason for changing that determination later. You cannot run a business not knowing these things.

I put it to noble Lords that this clause needs very careful thinking through. If it is to apply to community-led services such as Wikipedia, it is an absurdity. If it is not to apply to them because what I think I heard my noble friend say pertains and they are not, in his view, a category 1 service, why are they not a category 1 service? What security do they have in knowing either way? I beg to move.

Baroness Buscombe (Con)

My Lords, I will speak to Amendment 106 in my name and the names of my noble and learned friend Lord Garnier and the noble Lord, Lord Moore of Etchingham. This is one of five amendments focused on the need to address the issue of activist-motivated online bullying and harassment and thereby better safeguard the mental health and general well-being of potential victims.

Schedule 4, which defines Ofcom’s objectives in setting out codes of practice for regulated user-to-user services, should be extended to require the regulator to consider the protection of individuals from communications offences committed by anonymous users. The Government clearly recognise that there is a threat of abuse from anonymous accounts and have taken steps in the Bill to address that, but we are concerned that their approach is insufficient and may be counterproductive.

I will explain. The Government’s approach is to require large social media platforms to make provision for users to have their identity verified, and to have the option of turning off the ability to see content shared by accounts whose owners have not done this. However, all this would mean is that people could not see abuse being levelled at them. It would not stop the abuse happening. Crucially, it would not stop other people seeing it, or the damage to his or her reputation or business that the victim may suffer as a result. If I am a victim of online bullying and harassment, I do not want to see it, but I do not want it to be happening at all. The only means I have of stopping it is to report it to the platform and then hope that it takes the right action. Worse still, if I have turned off the ability to see content posted by unverified—that is, anonymous—accounts, I will not be able to complain to the platform as I will not have seen it. It is only when my business goes bust or I am shunned in the street that I realise that something is wrong.

The approach of the Bill seems to be that, for the innocent victim—who may, for example, breed livestock for consumption—it is up to that breeder to be proactive to correct harm already done by someone who does not approve of eating meat. This makes a nonsense of the law. This is not how we make laws in this country—until now, it seems. Practically speaking, the worst that is likely to happen to the abuser is that the platform might ban their account. However, if their victims have had no opportunity to read the abuse or report it, even that fairly low-impact sanction could not be levelled against them. In short, the Bill’s current approach, I am sorry to say, would increase the sense of impunity, not lessen it.

One could argue that, if a potential abuser believes that their victim will not read their abuse, they will not bother issuing it. Unfortunately, this misunderstands the psyche of the online troll. Many of them are content to howl into the void, satisfied that other people who have not turned on the option to filter out content from unverified accounts will still be able to read it. The troll’s objective of harming the victim may be partially fulfilled as a result.

There is also the question of how much uptake there will be of the option to verify one’s identity, and numerous questions about the factors that this will depend on. Will it be attractive? Will there be a cost? How quick and efficient will the process be? Will platforms have the capacity to implement it at scale? Will it have to be done separately for every platform?

If uptake of verification is low, most people simply will not use the option to filter out content from unverified accounts, even if it means that they remain more susceptible to abuse, since they would be cutting themselves off from most other users. Clearly, that is not an option for anyone using social media for any promotional purpose. Even those who use it for purely social reasons will find that they have friends who do not want to be verified. Fundamentally, people use social media because other people use it. Carving oneself off from most of them defeats the purpose of the exercise.

It is not clear what specific measures the Bill could take to address the issue. Conceivably, it could simply ban online platforms from maintaining user accounts whose owners have not had their identities verified. However, this would be truly draconian and most likely lead to major platforms exiting the UK market, as the noble Baroness, Lady Fox, has rightly argued in respect of other possible measures. It would also be unenforceable, since users could simply turn on a VPN, pretend to be from some other country where the rules do not apply and register an account as though they were in that country.

There are numerous underlying issues that the Bill recognises as problems but does not attempt to prescribe solutions for. Its general approach is to delegate responsibility to Ofcom to frame its codes of practice for operators to follow in order to effectively tackle these problems. Specifically, it sets out a list of objectives that Ofcom, in drawing up its codes of practice, will be expected to meet. The protection of users from abuse, specifically by unverified or anonymous users, would seem to be an ideal candidate for inclusion in this list of objectives. If required to do so, Ofcom could study the issue closely and develop more effective solutions over time.

I was pleased to see, in last week’s Telegraph, an article that gave an all too common example: a chef running a pub in Cornwall has suffered what amounts to vicious abuse online from a vegan who obviously does not approve of the menu, and who is damaging the business’s reputation and putting the chef’s livelihood at risk. This is just one tiny example, if I can put it that way, of the many thousands that are happening all the time. Some 584 readers left comments, and just about everyone wrote in support of the need to do something to support that chef and tackle this vicious abuse.

I return to a point I made in a previous debate: livelihoods, which we are deeply concerned about, are at stake here. I am talking not about big business but about individuals and small and family businesses that are suffering—beyond abuse—loss of livelihood, financial harm and/or reputational damage to business, and the knock-on effects of that.

House resumed. Committee to begin again not before 7.41 pm.

Online Safety Bill

Committee (5th Day) (Continued)
19:42
Clause 12: User empowerment duties
Debate on Amendment 38 resumed.
Baroness Buscombe (Con)

My Lords, before we continue this debate, I want to understand why we have changed the system so that we break part way through a group of amendments. I am sorry, but I think this is very poor. It is definitely a retrograde step. Why are we doing it? I have never experienced this before. I have sat here and waited for the amendment I have just spoken to. We have now had a break; it has broken the momentum of that group. It was even worse last week, because we broke for several days half way through the debate on an amendment. This is unheard of in my memory of 25 years in this House. Can my noble friend the Minister explain who made this decision, and how this has changed?

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

I have not had as long in your Lordships’ House, but this is not unprecedented, in my experience. These decisions are taken by the usual channels; I will certainly feed that back through my noble friend. One of the difficulties, of course, is that because there are no speaking limits on legislation and we do not know how many people want to speak on each amendment, the length of each group can be variable, so I think this is for the easier arrangement of dinner-break business. Also, for the dietary planning of those of us who speak on every group, it is useful to have some certainty, but I do appreciate my noble friend’s point.

Baroness Buscombe (Con)

Okay; I thank my noble friend for his response. However, I would just say that we never would have broken like that, before 7.30 pm. I will leave it at that, but I will have a word with the usual channels.

Baroness Kidron (CB)

My Lords, I rise to speak to Amendments 141 and 303 in the name of the noble Lord, Lord Stevenson. Before I do, I mention in passing how delighted I was to see Amendment 40, which carries the names of the Minister and the noble Lord, Lord Stevenson—may there be many more like that.

I am concerned that without Amendments 141 and 303, the concept of “verified” is not really something that the law can take seriously. I want to ask the Minister two rather technical questions. First, how confident can the Government and Ofcom be that with the current wording, Ofcom could form an assessment of whether Twitter’s current “verified by blue” system satisfies the duty in terms of robustness? If it does not, does Ofcom have the power to send it back to the drawing board? I am sure noble Lords understand why I raise this: we have recently seen “verified by blue” ticks successfully bought by accounts impersonating Martin Lewis, US Senators and Putin propagandists. My concern is that in the absence of a definition of verification in the Bill such as the one proposed in Amendments 141 and 303, where in the current wording does Ofcom have the authority to say that “verified by blue” does not satisfy the user verification duty?

19:45
My second question is similar. We see now around the world—it is not available in the UK—that Meta has a verified subscription, for which you can pay around $15 per month. It is being piloted in the US as we speak. Again, I ask whether that satisfies the duty in terms of it being affordable to the average UK user. I am concerned that most UK social media users will not be able to afford £180 per social media account for verification. If that ends up being Meta’s UK offering, many users would not be given a proper, meaningful chance to be verified. What powers are there in the Bill for Ofcom to send Meta back and offer something else? So my questions really are about what “verified” means in terms of the Bill.
Baroness Bull (CB)

My Lords, I rise to speak to Amendment 141 in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. Once again, I register the support of my noble friend Lady Campbell of Surbiton, who feels very strongly about this issue.

Of course, there is value in transparency online, but anonymity can be vital for certain groups of people, such as those suffering domestic abuse, those seeking help or advice on matters they wish to remain confidential, or those who face significant levels of hatred or prejudice because of who they are, how they live or what they believe in. Striking the right balance is essential, but it is equally important that everyone who wishes to verify their identity and access the additional protections that this affords can do so easily and effectively, and that this opportunity is open to all.

Clause 57 requires providers of category 1 services to offer users the option to verify their identity, but it is up to providers to decide what form of verification to offer. Under subsection (2) it can be “of any kind”, and it need not require any documentation. Under subsection (3), the terms of service must include a “clear and accessible” explanation of how the process works and what form of verification is available. However, this phrase in itself is open to interpretation: clear and accessible for one group may be unclear and inaccessible to another. Charities including Mencap are concerned that groups, such as people with a learning disability, could be locked out of using these tools.

It is also relevant that people with a learning disability are less likely to own forms of photographic ID such as passports or driving licences. Should a platform require this type of ID, large numbers of people with a learning disability would be denied access. In addition, providing an email or phone number and verifying this through an authentication process could be extremely challenging for those people who do not have the support in place to help them navigate this process. This further disadvantages groups of people who already suffer some of the most extensive restrictions in living their everyday lives.

Clause 58 places a duty on Ofcom to provide guidance to help providers comply with their duty, but this guidance is optional. Amendment 141 aims to strengthen Clause 58 by requiring Ofcom to set baseline principles and standards for the guidance. It would ensure, for example, that the guidance considers accessibility for disabled as well as vulnerable adults and aligns with relevant guidance on related matters such as age verification; it would ensure that verification processes are effective; and it would ensure that the interests of disabled users are covered in Ofcom’s pre-guidance consultation.

Online can be a lifeline for disabled and vulnerable adults, providing access to support, advice and communities of interest, and this is particularly important as services in the real world are diminishing, so we need to ensure that user-verification processes do not act as a further barrier to inclusion for people with protected characteristics, especially those with learning disabilities.

Baroness Fox of Buckley (Non-Afl)

My Lords, the speech of the noble Baroness, Lady Buscombe, raised so many of the challenges that people face online, and I am sure that the masses who are watching parliamentlive as we speak, even if they are not in here, will recognise what she was talking about. Certainly, some of the animal rights activists can be a scourge, but I would not want to confine this to them, because I think trashing reputations online and false allegations have become the activists’ chosen weapon these days. One way that I describe cancel culture, as distinct from no-platforming, is that it takes the form of some terrible things being said about people online, a lot of trolling, things going viral and using the online world to lobby employers to get people sacked, and so on. It is a familiar story, and it can be incredibly unpleasant. The noble Baroness and those she described have my sympathy, but I disagree with her remedy.

An interesting thing is that a lot of those activities are not carried out by those who are anonymous. It is striking that a huge number of people with large accounts, well-known public figures with hundreds of thousands of followers—sometimes with more than a million—are prepared to do exactly what I described in plain sight, often to me. I have thought long and hard about this, because I really wanted to use this opportunity to read out a list and name and shame them, but I have decided that, when they go low, I will try to go at least a little higher. But subtweeting and twitchhunts are an issue, and one reason why we think we need an online harms Bill. As I said, I know that sometimes it can feel that if people are anonymous, they will say things that they would not say to your face or if you knew who they were, but I think it is more the distance of being online: even when you know who they are, they will say it to you or about you online, and then when you see them at the drinks reception, they scuttle away.

My main objection, however, to the amendment of the noble Baroness, Lady Buscombe, and the whole question of anonymity in general is that it treats anonymity as though it is inherently unsafe. There is a worry, more broadly on verification, about creating two tiers of users: those who are willing to be verified and those who are not, and those who are not somehow having a cloud of suspicion over them. There is a danger that undermining online anonymity in the UK could set a terrible precedent, likely to be emulated by authoritarian Governments in other jurisdictions, and that is something we must bear in mind.

On evidence, I was interested in Big Brother Watch’s report on some analysis by the New Statesman, which showed that there is little evidence to suggest that anonymity itself makes online discourse more febrile. It did an assessment involving tweets sent to parliamentarians since January 2021, and said there was

“little discernible difference in the nature or tone of the tweets that MPs received from anonymous or non-anonymous accounts. While 32 per cent of tweets from anonymous accounts were classed as angry according to the metric used by the New Statesman, so too were 30 per cent of tweets from accounts with full names attached. Similarly, 5.6 per cent of tweets from anonymous accounts included swear words, only slightly higher than the figure of 5.3 per cent for named accounts.”

It went through various metrics, but it said, “slightly higher, not much of a difference”. That is to be borne in mind: the evidence is not there.

In this whole debate, I have wanted to emphasise freedom as at least equal to, if not of greater value than, the safetyism of this Bill, but in this instance, I will say that, as the noble Baroness, Lady Bull, said, for some people anonymity is an important safety mechanism. It is a tool in the armoury of those who want to fight the powerful. It can be anyone: for young people experimenting with their sexuality and not out, it gives them the freedom to explore that. It can be, as was mentioned, survivors of sexual violence or domestic abuse. It is certainly crucial to the work of journalists, civil liberties activists and whistleblowers in the UK and around the world. Many of the Iranian women’s accounts are anonymous: they are not using their correct names. The same is true of Hong Kong activists; I could go on.

Anyway, in our concerns about the Bill, compulsory identity verification means being forced to share personal data, so there is a privacy issue for everyone, not just the heroic civil liberties people. In a way, it is your own business why you are anonymous—that is the point I am trying to make.

There are so many toxic issues at the moment that a lot of people cannot just come out. I know I often mention the gender-critical issue, but it is true that in many professions, you cannot give your real name or you will not just be socially ostracised but potentially jeopardise your career. I wrote an article during the 2016-17 days called Meet the Secret Brexiteers. It was true that many teachers and professors I knew who voted to leave had to be anonymous online or they would not have survived the cull.

Finally, I do not think that online anonymity or pseudonymity is a barrier to tracking down and prosecuting those who commit the kind of criminal activity on the internet described, creating some of the issues we are facing. Police reports show that, in 2017-18, 96% of attempts by public authorities to identify anonymous users of social media accounts, their email addresses and telephone numbers resulted in successful identification of the suspect in the investigation. In other words, the police already have a range of intrusive powers to track down individuals, should there be a criminal problem, and the Investigatory Powers Act 2016 allows the police to acquire communications data—for example, email addresses or the location of a device—from which alleged illegal anonymous activity is conducted and use it as evidence in court.

If it is not illegal but just unpleasant, I am afraid that is the world we live in. I would argue that what we require in febrile times such as these is not bans or setting the police on people but to set the example of civil discourse, have more speech and show that free speech is a way of conducting disagreement and argument without trashing reputations.

Lord Clement-Jones (LD)

My Lords, what an unusually reticent group we have here for this group of amendments. I had never thought of the noble Baroness, Lady Fox, as being like Don Quixote, but she certainly seems to be tilting at windmills tonight.

I go back to the Joint Committee report, because what we said there is relevant. We said:

“Anonymous abuse online is a serious area of concern that the Bill needs to do more to address. The core safety objectives apply to anonymous accounts as much as identifiable ones. At the same time, anonymity and pseudonymity are crucial to online safety for marginalised groups, for whistleblowers, and for victims of domestic abuse and other forms of offline violence. Anonymity and pseudonymity themselves are not the problem and ending them would not be a proportionate response”.


We were very clear; the Government’s response on this was pretty clear too.

We said:

“The problems are a lack of traceability by law enforcement, the frictionless creation and disposal of accounts at scale, a lack of user control over the types of accounts they engage with and a failure of online platforms to deal comprehensively with abuse on their platforms”.


We said there should be:

“A requirement for the largest and highest risk platforms to offer the choice of verified or unverified status and user options on how they interact with accounts in either category”.


Crucially for these amendments, we said:

“We recommend that the Code of Practice also sets out clear minimum standards to ensure identification processes used for verification protect people’s privacy—including from repressive regimes or those that outlaw homosexuality”.


We were very clear about the difference between stripping away anonymity and ensuring that verification was available where the user wanted to engage only with those who had verified themselves. Requiring platforms to allow users—

20:00
Baroness Buscombe (Con)

I am sorry to interrupt the noble Lord, but I would like to ask him whether, when the Joint Committee was having its deliberations, it ever considered, in addition to people’s feelings and hurt, their livelihoods.

Lord Clement-Jones (LD)

Of course. I think we looked at it in the round and thought that stripping away anonymity could in many circumstances be detrimental to those, for instance, working in hostile regimes or regimes where human rights were under risk. We considered a whole range of things, and the whole question about whether you should allow anonymity is subject to those kinds of human rights considerations.

I take the noble Baroness’s point about business, but you have to weigh up these issues, and we came around the other side.

Baroness Buscombe (Con)

Does the noble Lord not think that many people watching and listening to this will be thinking, “So people in far-off regimes are far more important than I am—I who live, work and strive in this country”? That is an issue that I think was lacking through the whole process and the several years that this Bill has been discussed. Beyond being hurt, people are losing their livelihoods.

Lord Clement-Jones (LD)

I entirely understand what the noble Baroness is saying, and I know that she feels particularly strongly about these issues given her experiences. The whole Bill is about trying to weigh up different aspects—we are on day 5 now, and this has been very much the tenor of what we are trying to talk about in terms of balance.

Baroness Kidron (CB)

I want to reassure the noble Baroness that we did discuss anonymity in relation to the issues that she has put forward. A company should not be able to use anonymity as an excuse not to deal with the situation, and that is slightly different from simply saying, “We throw our hands up on those issues”.

There was a difference between the fact that companies are using anonymity to say, “We don’t know who it is, and therefore we can’t deal with it”, and the idea that they should take action against people who are abusing the system and the terms of service. It is subtle, but it is very meaningful in relation to what the noble Baroness is suggesting.

Lord Clement-Jones (LD)

That is a very fair description. We have tried to emphasise throughout the discussion on the Bill that it is about not just content but how the system and algorithms work in terms of amplification. On page 35 of our report, we try to address some of those issues—it is not central to the point about anonymity, but we certainly talked about the way that messages are driven by the algorithm. Obviously, how that operates in practice and how the Bill as drafted operates is what we are kicking the tyres on at the moment, and the noble Baroness is absolutely right to do that.

The Government’s response was reasonably satisfactory, but this is exactly why this group explores the definition of verification and so on, and tries to set standards for verification, because we believe that there is a gap in all this. I understand that this is not central to the noble Baroness’s case, but—believe me—the discussion of anonymity was one of the most difficult issues that we discussed in the Joint Committee, and you have to fall somewhere in that discussion.

Requiring platforms to allow users to see other users’ verification status is a crucial further pillar of user empowerment, and it provides users with a key piece of information about other users. Being able to see whether an account is verified would empower victims of online abuse or threats—I think this partly answers the noble Baroness’s question—to make more informed judgments about the source of the problem, and therefore take more effective steps to protect themselves. Making verification status visible to all users puts more choice in their hands as to how they manage the higher risks associated with non-verified and anonymous accounts, and offers them a lighter-touch alternative to filtering out all non-verified users entirely.

We on these Benches support the amendments that have been put forward. Amendment 141 aims to ensure that a user verification duty delivers in the way that the public and Government hope it will—by giving Ofcom a clear remit to require that the verification systems that platforms are required to develop in response to the duty are sufficiently rigorous and accessible to all users.

I was taken by what the noble Baroness, Lady Bull, said, particularly the case for Ofcom’s duties as regards those with disabilities. We need Ofcom to be tasked with setting out the principles and minimum standards, because otherwise platforms will try to claim, as verification, systems that do not genuinely verify a user’s identity, are unaffordable to ordinary users or use their data inappropriately.

Likewise, we support Amendment 303, which would introduce a definition of “user identity verification” into the Bill to ensure that we are all on the same page. In Committee in the House of Commons, Ministers suggested that “user identity verification” is an everyday term and so does not need a definition. That was not a convincing answer, which is why this amendment, which no doubt the noble Baroness, Lady Merron, will speak to in more detail, is bang on point and particularly apt.

I heard what the noble Baroness, Lady Buscombe, had to say, but in many ways the amendment in the previous group in the name of the noble Lord, Lord Knight, met some of the noble Baroness’s concerns. As regards the amendment in the name of the noble Lord, Lord Moylan, we are all Wikipedia fans, so we all want to make sure that there is no barrier to Wikipedia operating successfully. I wonder whether perhaps the noble Lord is making quite a lot out of the Wikipedia experience, but I am sure the Minister will enlighten us all and will have a spot-on response for him.

Baroness Merron (Lab)

My Lords, I am pleased to speak on this group of amendments, and I will particularly address the amendments in the name of my noble friend Lord Stevenson. To start with the very positive, I am very grateful to the Minister for signing Amendment 40—as has already been commented, this is hopefully a sign of things to come. My observation is that it is something of a rarity, and I am containing my excitement as it was agreement over one word, “effectively”. Nevertheless, it is very welcome support.

These amendments aim to make it clearer to users whether those whom they interact with are verified or non-verified, with new duties backed up by a set of minimum standards, to be reflected in Ofcom’s future guidance on the user verification duty, with standards covering—among other things—privacy and data protection. The noble Lord, Lord Clement-Jones, helpfully referred your Lordships’ House to the report of the Joint Committee and spent some useful time on the challenges over anonymity. As is the case with so many issues on other Bills and particularly on this one, there is a balance to be struck. Given the proliferation of bots and fake profiles, we must contemplate how to give confidence to people that they are interacting with real users.

Amendment 141 tabled by my noble friend Lord Stevenson and supported by the noble Lord, Lord Clement-Jones, requires Ofcom to set a framework of principles and minimum standards for the user verification duty. The user verification duty is one of the most popular changes to be made to the Bill following the pre-legislative scrutiny process and reflects a recommendation of the Joint Committee. Why is it popular? Because the public understand that the current unregulated approach by social media platforms is a major enabler of harmful online behaviour. Anonymous accounts are more likely to engage in abuse or harassment and, for those at the receiving end, threats from anonymous accounts can feel even more frightening, while the chances are lower of any effective enforcement from the police or platforms.

As we know, bad actors use networks of fake accounts to peddle disinformation and divisive conspiracy theories. I am sure that we will come back to this in later groups. This amendment aims to ensure that the user verification duty delivers in the way that the public and the Government hope that it will. It requires that the systems which platforms develop in response to the duty are sufficiently rigorous and accessible to all users.

The noble Baroness, Lady Kidron, talked about affordability, something that I would like to amplify. There will potentially be platforms which try to claim that verification systems somehow genuinely verify a user’s identity when they do not, or they will be unaffordable to ordinary users, as the noble Baroness said, or data will be used inappropriately. This is not theoretical. She referred to the Meta Verified product, which looks like it might be more rigorous, but at a cost of $180 per year per account, which will not be within the grasp of many people. Twitter is now also selling blue ticks of verification for $8, including to those who are scamming and impersonating, and to propagandists for figures in our world such as Putin. This amendment future-proofs and allows flexibility. It will not tie the hands of either the regulator or the platforms. Therefore, I hope that it can find some favour with the Minister.

In Amendment 303, again tabled by my noble friend Lord Stevenson and supported by the noble Lord, Lord Clement-Jones, there is an addition of the definition of “user identity verification”. I agree with the noble Lord about how strange it was that, in Committee in the Commons, Ministers felt that user identity verification was somehow an everyday term which did not need definition. I dispute that. It is no better left to common sense than any other terms that we do have definitions for in Clause 207—for example, “age assurance”, “paid-for advertisement” and “terms of service”. All these get definitions. Surely it is very wise to define user identity verification.

20:15
Without definition, there is obviously scope for dispute about how verification is defined. As we heard earlier in Committee, a dispute over what something means only creates the conditions for uncertainty, delay and legal costs. Therefore, I hope that we can see a brief definition that provides clarity for regulators and platforms and reduces the potential for disputes and enforcement delays. If we could rely on platforms to operate in good faith, in the interests of all of us, we would not even need the Bill.
Amendment 41, again tabled by my noble friend Lord Stevenson and supported by the noble Lord, Lord Clement-Jones, would require category 1 services to make visible to users whether another user is verified or non-verified. There is already a duty to allow users to be verified and to allow all users to filter out interaction with unverified accounts, but these duties must be—to use that word again—effective.
In cases of fraud, we well know that online scammers rely heavily on deceptive fake accounts, often backed up by reviews from other fake accounts, and that they will think twice about going through any credible verification process because it will make them more traceable. So a simple and clear piece of advice, if we become able to use it, would be to check if the user you are interacting with is verified. That would be powerful advice for consumers to help them avoid fraud.
In the case of disinformation—again, something we will return to in a later group—bad actors, including foreign Governments, are setting up networks of fake accounts which make all sorts of false claims about their identity: maybe that they are a doctor, a British Army veteran or an expert in vaccines. We have seen and heard them all. We ask the public to check the source of the information they read, and that would be a lot easier if it was obvious who is verified and who is not. For those who are subject to online abuse or threats, being able to see if an account is verified would empower them to make more informed decisions about the source of the problem, and therefore to take more definitive steps to protect themselves.
It is absolutely right, as the noble Baronesses, Lady Bull and Lady Fox, outlined, that there are very legitimate reasons why some people do not want their identity shared when they are using a service. This issue was raised with me by a number of young people that I, like other noble Lords, had the opportunity to speak to at a meeting organised by the NSPCC. They explained how they experienced the online world and how they wanted to be able to use it, but there are times when they need to protect their identity in order to benefit from using it and to explore various aspects of themselves, and I believe we should enable that protection.
Amendments in this group from the noble Lord, Lord Moylan, bring us back to previous debates on crowdsourced sites such as Wikipedia, so I will not repeat the same points made in previous debates, but I feel sure that the Minister will provide the reassurance that the noble Lord seeks, and we all look forward to it.
I have a question for the Minister in concluding my comments on this group. Could he confirm whether, under the current provisions, somebody’s full name would have to be publicly displayed for the verification duty to have been met, or could they use a pseudonym or a generic username publicly, with verification having taken place in a private and secure manner? I look forward to hearing from the Minister.
Lord Parkinson of Whitley Bay (Con)

My Lords, the range of the amendments in this group indicates the importance of the Government’s approach to user verification and non-verified user duties. The way these duties have been designed seeks to strike a careful balance between empowering adults while safeguarding privacy and anonymity.

Amendments 38, 39, 139 and 140 have been tabled by my noble friend Lord Moylan. Amendments 38 and 39 seek to remove subsections (6) and (7) of the non-verified users’ duties. These place a duty on category 1 platforms to give adult users the option of preventing non-verified users interacting with their content, reducing the likelihood that a user sees content from non-verified users. I want to be clear that these duties do not require the removal of legal content from a service and do not impinge on free speech.

In addition, there are already existing duties in the Bill to safeguard legitimate online debate. For example, category 1 services will be required to assess the impact on free expression of their safety policies, including the impact of their user empowerment tools. Removing subsections (6) and (7) of Clause 12 would undermine the Bill’s protection for adult users of category 1 services, especially the most vulnerable. It would be entirely at the service provider’s discretion to offer users the ability to minimise their exposure to anonymous and abusive users, sometimes known as trolls. In addition, instead of mandating that users verify their identity, the Bill gives adults the choice. On that basis, I am confident that the Bill already achieves the effect of Amendment 139.

Amendment 140 seeks to reduce the amount of personal data transacted as part of the verification process. Under subsection (3) of Clause 57, however, providers will be required to explain in their terms of service how the verification process works, empowering users to make an informed choice about whether they wish to verify their identity. In addition, the Bill does not alter the UK’s existing data protection laws, which provide people with specific rights and protections in relation to the processing of their personal data. Ofcom’s guidance in this area will reflect existing laws, ensuring that users’ data is protected where personal data is processed. I hope my noble friend will therefore be reassured that these duties reaffirm the concept of choice and uphold the importance of protecting personal data.

While I am speaking to the questions raised by my noble friend, I turn to those he asked about Wikipedia. I have nothing further to add to the comments I made previously, not least that it is impossible to pre-empt the assessments that will be made of which services fall into which category. Of course, assessments will be made at the time, based on what the services do at the time of the assessment, so if he will forgive me, I will not be drawn on particular services.

To speak in more general terms, category 1 services are those with the largest reach and the greatest influence over public discourse. The Bill sets out a clear process for determining category 1 providers, based on thresholds set by the Secretary of State in secondary legislation following advice from Ofcom. That is to ensure that the process is objective and evidence based. To deliver this advice, Ofcom will undertake research into the relationship between how quickly, easily and widely user-generated content is disseminated by that service, the number of users and functionalities it has and other relevant characteristics and factors.

Lord Moylan (Con)

Will my noble friend at least confirm what he said previously: namely, that it is the Government’s view—or at least his view—that Wikipedia will not qualify as a category 1 service? Those were the words I heard him use at the Dispatch Box.

Lord Parkinson of Whitley Bay (Con)

That is my view, on the current state of play, but I cannot pre-empt an assessment made at a point in the future, particularly if services change. I stand by what I said previously, but I hope my noble friend will understand if I do not elaborate further on this, at the risk of undermining the reassurance I might have given him previously.

Amendments 40, 41, 141 and 303 have been tabled by the noble Lord, Lord Stevenson of Balmacara, and, as noble Lords have noted, I have added my name to Amendment 40. I am pleased to say that the Government are content to accept it. The noble Baroness, Lady Merron, should not minimise this, because it involves splitting an infinitive, which I am loath to do. If this is a statement of intent, I have let that one go, in the spirit of consensus. Amendment 40 amends Clause 12(7) to ensure that the tools which will allow adult users to filter out content from non-verified users are effective and I am pleased to add my name to it.

Amendment 41 seeks to make it so that users can see whether another user is verified or not. I am afraid we are not minded to accept it. While I appreciate the intent, forcing users to show whether they are verified or not may have unintended consequences for those who are unable to verify themselves for perfectly legitimate reasons. This risks creating a two-tier system online. Users will still be able to set a preference to reduce their interaction with non-verified users without making this change.

Amendment 141 seeks to prescribe a set of principles and standards in Ofcom’s guidance on user verification. It is, however, important that Ofcom has discretion to determine, in consultation with relevant persons, which principles will have the best outcomes for users, while ensuring compliance with the duties. Further areas of the Bill also address several issues raised in this amendment. For example, all companies in scope will have a specific legal duty to have effective user reporting and redress mechanisms.

Existing laws also ensure that Ofcom’s guidance will reflect high standards. For example, it is a general duty of Ofcom under Section 3 of the Communications Act 2003 to further the interests of consumers, including by promoting competition. This amendment would, in parts, duplicate existing duties and undermine Ofcom’s independence to set standards on areas it deems relevant after consultation with expert groups.

Amendment 303 would add a definition of user identity verification. The definition it proposes would result in users having to display their real name online if they decide to verify themselves. In answer to the noble Baroness’s question, the current requirements do not specify that users must display their real name. The amendment would have potential safety implications for vulnerable users, for example victims and survivors of domestic abuse, whistleblowers and others of whom noble Lords have given examples in their contributions. The proposed definition would also create reliance on official forms of identification. That would be contrary to the existing approach in Clause 57 which specifically sets out that verification need not require such forms of documentation.

The noble Baroness, Lady Kidron, talked about paid-for verification schemes. The user identity verification provisions were brought in to ensure that adult users of the largest services can verify their identity if they so wish. These provisions are different from the blue tick schemes and others currently in place, which focus on a user’s status rather than verifying their identity. Clause 57 specifically sets out that providers of category 1 services will be required to offer all adult users the option to verify their identity. Ofcom will provide guidance for user identity verification to assist providers in complying with these duties. In doing so, it will consult groups that represent the interests of vulnerable adult users. In setting out recommendations about user verification, Ofcom must have particular regard to ensuring that providers of category 1 services offer users a form of identity verification that is likely to be available to vulnerable adult users. Ofcom will also be subject to the public sector equality duty, so it will need to take into account the ways in which people with certain characteristics may be affected when it performs this and all its duties under the Bill.

A narrow definition of identity verification could limit the range of measures that service providers might offer their users in the future. Under the current approach, Ofcom will produce and publish guidance on identity verification after consulting those with technical expertise and groups which represent the interests of vulnerable adult users.

20:30
Baroness Kidron (CB)

I am sorry to interrupt the noble Lord. Is the answer to my question that the blue tick and the current Meta system will not be considered as verification under the terms of the Bill? Is that the implication of what he said?

Lord Parkinson of Whitley Bay (Con)

Yes. The blue tick is certainly not identity verification. I will write to confirm on Meta, but they are separate and, as the example of blue ticks and Twitter shows, a changing feast. That is why I am talking in general terms about the approach, so as not to rely too much on examples that are changing even in the course of this Committee.

Government Amendment 43A stands in my name. This clarifies that “non-verified user” refers to users whether they are based in the UK or elsewhere. This ensures that, if a UK user decides he or she no longer wishes to interact with non-verified users, this will apply regardless of where they are based.

Finally, Amendment 106 in the name of my noble friend Lady Buscombe would make an addition to the online safety objectives for regulated user-to-user services. It would amend them to make it clear that one of the Bill’s objectives is to protect people from communications offences committed by anonymous users.

The Bill already imposes duties on services to tackle illegal content. Those duties apply across all areas of a service, including the way it is designed and operated. Platforms will be required to take measures—for instance, changing the design of functionalities, algorithms, and other features such as anonymity—to tackle illegal content.

Ofcom is also required to ensure that user-to-user services are designed and operated to protect people from harm, including with regard to functionalities and other features relating to the operation of their service. This will likely include the use of anonymous accounts to commit offences in the scope of the Bill. My noble friend’s amendment is therefore not needed. I hope she will be satisfied not to press it, along with the other noble Lords who have amendments in this group.

Lord Moylan (Con)

My Lords, I would like to say that that was a rewarding and fulfilling debate in which everyone heard very much what they wanted to hear from my noble friend the Minister. I am afraid I cannot say that. I think it has been one of the most frustrating debates I have been involved in since I came into your Lordships’ House. However, it gave us an opportunity to admire the loftiness of manner that the noble Lord, Lord Clement-Jones, brought to dismissing my concerns about Wikipedia—that I was really just overreading the whole thing and that I should not be too bothered with words as they appear in the Bill because the noble Lord thinks that Wikipedia is rather a good thing and why is it not happy with that as a level of assurance?

I would like to think that the Minister had dealt with the matter in the way that I hoped he would, but I do think, if I may say so, that it is vaguely irresponsible to come to the Dispatch Box and say, “I don’t think Wikipedia will qualify as a category 1 service”, and then refuse to say whether it will or will not and take refuge in the process the Bill sets up, when at least one Member of the House of Lords, and possibly a second in the shape of the noble Lord, Lord Clement-Jones, would like to know the answer to the question. I see a Minister from the business department sitting on the Front Bench with my noble friend. This is a bit like throwing a hand grenade into a business headquarters, walking away and saying, “It was nothing to do with me”. You have to imagine what the position is like for the business.

We had a very important amendment from my noble friend Lady Buscombe. I think we all sympathise with the type of abuse that she is talking about—not only its personal effects but its deliberate business effects, the deliberate attempt to destroy businesses. I say only that my reading of her Amendment 106 is that it seeks to impose on Ofcom an objective to prevent harm, essentially, arising from offences under Clauses 160 and 162 of the Bill committed by unverified or anonymous users. Surely what she would want to say is that, irrespective of verification and anonymity, one would want action taken against this sort of deliberate attempt to undermine and destroy businesses. While I have every sympathy with her amendment, I am not entirely sure that it relates to the question of anonymity and verification.

Apart from that, there were in a sense two debates going on in parallel in our deliberations. One was to do with anonymity. On that question, I think the noble Lord, Lord Clement-Jones, put the matter very well: in the end, you have to come down on one side or the other. My personal view, with some reluctance, is that I have come down on the same side as the Government, the noble Lord and others. I think we should not ban anonymity because there are costs and risks to doing so, however satisfying it would be to be able to expose and sue some of the people who say terrible and untrue things about one another on social media.

The more important debate was not about anonymity as such but about verification. We had the following questions, which I am afraid I do not think were satisfactorily answered. What is verification? What does it mean? Can we define what verification is? Is it too expensive? Implicitly, should it be available for free? Is there an obligation for it to be free or do the paid-for services count, and what happens if they are so expensive that one cannot reasonably afford them? Is it real, in the sense that the verification processes devised by the various platforms genuinely provide verification? Various other questions like that came up but I do not think that any of them was answered.

I hate to say this as it sounds a little harsh about a Government whom I so ardently support, but the truth is that the triple shield, also referred to as a three-legged stool in our debate, was hastily cobbled together to make up for the absence of legal but harmful, but it is wonky; it is not working, it is full of holes and it is not fit for purpose. Whatever the Minister says today, there has to be a rethink before he comes back to discuss these matters at the next stage of the Bill. In the meantime, I beg leave to withdraw my amendment.

Amendment 38 withdrawn.
Amendments 38A and 39 not moved.
Amendment 40
Moved by
40: Clause 12, page 12, line 27, after “to” insert “effectively”
Member’s explanatory statement
This amendment would bring this subsection into line with subsection (3) by requiring that the systems or processes available to users for the purposes described in subsections (7)(a) and (7)(b) should be effective.
Amendment 40 agreed.
Amendments 41 to 43ZA not moved.
Amendment 43A
Moved by
43A: Clause 12, page 13, line 20, leave out from “who” to end of line 21 and insert “—
(a) is an individual, whether in the United Kingdom or outside it, and
(b) has not verified their identity to the provider of a service;”
Member’s explanatory statement
This amendment makes it clear that the term “non-verified user” in clause 12 (user empowerment duties) refers to individuals and includes users outside the United Kingdom.
Amendment 43A agreed.
Amendments 44 and 45 not moved.
Clause 12, as amended, agreed.
Amendment 46
Moved by
46: After Clause 12, insert the following new Clause—
“Adult risk assessment duties
(1) This section sets out the duties about risk assessments in respect of adult users which apply in relation to Category 1 services.
(2) A duty to carry out a suitable and sufficient adults’ risk assessment.
(3) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.
(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adults’ risk assessment relating to the impacts of that proposed change.
(5) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—
(a) the user base;
(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of content specified in section 12(10) to (12), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;
(c) the level of risk of functionalities of the service, including user empowerment tools, which facilitate the presence, identification, dissemination, and likelihood of users encountering or being alerted to, content specified in section 12(10) to (12);
(d) the extent to which user empowerment tools might result in interference with users’ right to freedom of expression within the law (see section 18);
(e) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.”
Member’s explanatory statement
This and other amendments in the name of Baroness Stowell relate to risk assessments for adults in relation to platforms’ new duties to provide user empowerment tools. They would require platforms to provide public risk assessments in their terms of service and be transparent about the effect of user empowerment tools on users’ freedom of expression.
Baroness Stowell of Beeston (Con)

My Lords, in introducing this group, I will speak directly to the three amendments in my name—Amendments 46, 47 and 64. I will also make some general remarks about the issue of freedom of speech and of expression, which is the theme of this group. I will come to these in a moment.

The noble Lord, Lord McNally, said earlier that I had taken my amendments out of a different group—I hope from my introductory remarks that it will be clear why—but, in doing so, I did not realise that I would end up opening on this group. I offer my apologies to the noble Lord, Lord Stevenson of Balmacara, for usurping his position in getting us started.

I am grateful to the noble Baronesses, Lady Bull and Lady Featherstone, for adding their names. The amendments represent the position of the Communications and Digital Select Committee of your Lordships’ House. In proposing them, I do so with that authority. My co-signatories are a recent and a current member. I should add sincere apologies from the noble Baroness, Lady Featherstone, for not being here this evening. If she is watching, I send her my very best wishes.

When my noble friend Lord Gilbert of Panteg was its chair, the committee carried out an inquiry into freedom of speech online. This has already been remarked on this evening. As part of that inquiry, the committee concluded that the Government’s proposals in the then draft Bill—which may have just been a White Paper at that time—for content described as legal but harmful were detrimental to freedom of speech. It called for changes. Since then, as we know, the Government have dropped legal but harmful and instead introduced new user empowerment tools for adults to filter out harmful content. As we heard in earlier groups this evening, these would allow people to turn off or on content about subjects such as eating disorders and self-harm.

Some members of our committee might favour enhanced protection for adults. Indeed, some of my colleagues have already spoken in support of amendments to this end in other groups. Earlier this year, when the committee looked at the Bill as it had been reintroduced to Parliament, we agreed that, as things stood, these new user empowerment tools were a threat to freedom of speech. Whatever one’s views, there is no way of judging their impact or effectiveness—whether good or bad.

As we have heard already this evening, the Government have dropped the requirement for platforms to provide a public risk assessment of how these tools would work and their impact on freedom of speech. To be clear, for these user empowerment tools to be effective, the platforms will have to identify the content that users can switch off. This gives the platforms great power over what is deemed harmful to adults. Amendments 46, 47 and 64 are about ensuring that tech platforms are transparent about how they balance the principles of privacy, safety and freedom of speech for adults. These amendments would require platforms to undertake a risk assessment and publish a summary in their terms of service. This would involve them being clear about the effect of user empowerment tools on the users’ freedom of expression. Without such assessments, there is a risk that platforms would do either too much or too little. It would be very difficult to find out how they are filtering content and on what basis, and how they are addressing the twin imperatives of ensuring online safety without unduly affecting free speech.

To be clear, these amendments, unlike amendments in earlier groups, are neither about seeking to provide greater protection to adults nor about trying to reopen or revisit the question of legal but harmful. They are about ensuring transparency to give all users confidence about how platforms are striking the right balance. While their purpose is to safeguard freedom of speech, they would also bring benefits to those adults who wanted to opt in to the user empowerment tool because they would be able to assess what it was they were choosing not to see.

It is because of their twin benefits—indeed, their benefit to everyone—that we decided formally, as a committee, to recommend these amendments to the Government and for debate by your Lordships’ House. That said, the debate earlier suggests support for a different approach to enhancing protection for adults, and we may discover through this debate a preference for other amendments in this group to protect freedom of speech—but that is why we have brought these amendments forward.

20:45
I am now going to take off my Select Committee hat to say a few other remarks about freedom of expression—but I will not say very much, because I have the privilege of responding at the end. Indeed, there are noble Lords in the Chamber this evening who are far more steeped in this important principle of freedom of speech than me. I am keen to listen to what they have to say in order to judge to which of their amendments, if any, I will lend my support.
I should add that, perhaps unlike some other noble Lords who will speak on this group, I am about freedom of speech less as an end in itself and more as a means to a thriving democracy and healthy society. I have said on various public platforms over the last few months that I would have preferred the Bill to be about only child safety, so that we could learn before deciding what, if any, further steps to take—but we are where we are. What concerns me about the online world we now inhabit is in whose hands the power exists to decide what we get to see and debate. Who has the power to influence what is an acceptable opinion to hold? Who has the power to shape society, to such an extent that they can influence and change what we believe is right or wrong?
There is a real dilemma for me between the big tech platforms’ resistance to the responsibility that comes with being a publisher and us giving them that power and responsibility via the Bill. We will come back to the question of power and how we ensure that it is spread properly between Parliament, the Executive, the regulator and media platforms in a later group but, as we have decided to legislate for online safety, I want us to be as sure as we can be that we are not giving away political powers to individuals or institutions who have no democratic mandate or are not subject to suitable oversight. Freedom of speech and the clauses to which the amendments relate is why this is an important group.
I will make one final point before I sit down. Freedom of speech is also a critical element of the Digital Markets, Competition and Consumers Bill. That is why I have been so concerned that it was introduced alongside online safety. I am glad that it has finally arrived in Parliament and that we will get to examine it before too long. But that is for another day—for now, I beg to move.
Lord Stevenson of Balmacara (Lab)

My Lords, I have slightly abused my position because, as the noble Baroness has just said, this is a rather oddly constructed group. My amendments, which carve great chunks out of the Bill—or would do if I get away with it—do not quite point in the same direction as the very good speech the noble Baroness made, representing of course the view of the committee that she chairs so brilliantly. She also picked out one or two points of her own, which we also want to debate. It therefore might be easier if I just explain what I was trying to do in my amendments; then I will sit down and let the debate go, and maybe come back to pick up one or two threads at the end.

In previous Bills—and I have seen a lot of them—people who stand up and move clause stand part debates usually have a deeper and more worrying purpose behind the proposition. Either they have not read the Bill and are just trying to wing it, or they have a plan that is so complex and deep that it would probably need another whole Bill to sort it out. This is neither of those approaches; it is done because I want to represent the views mainly of the Joint Committee. We had quite a lot of debate in that committee about this area, beginning with the question about why the Bill—or the White Paper or draft Bill, at that stage—used the term “democratic importance” when many people would have used the parallel term “public interest” to try to reflect the need to ensure that matters which are of public good take place as a result of publication, or discussion and debate, or on online platforms. I am very grateful that the noble Lord, Lord Black, is able to be with us today. I am sure he will recall those debates, and hopefully he will make a comment on some of the work—and other members of the committee are also present.

To be clear, the question of whether Clauses 13, 14, 15 and 18 should stand part of the Bill is meant to release space for a new clause in Amendment 48. It is basically designed to try to focus the actions that are going to be taken by the Bill, and subsequently by the regulator, to ensure that the social media companies that are affected by, or in scope of, the Bill use, as a focus, some of the issues mainly related to “not taking down” and providing an appeal mechanism for journalistic material, whether that is provided by recognised news publishers or some other form of words that we can use, or it is done by recognised journalists. “Contentious” is an overused word, but all these terms are difficult to square away and be happy with, and therefore we should have the debate and perhaps reflect on that later when we come back to it.

The committee spent quite a lot of time on this, and there are two things that exercised our minds when we were working on this area. First, if one uses “content of democratic importance”, although it is in many ways quite a clever use of words to reflect a sensibility that you want to have an open and well-founded debate about matters which affect the health of our democracy, it can be read as being quite limiting. It is very hard to express—I am arguing against myself here—in the words of a piece of legislation what it is we are trying to get down to, but, during the committee’s recommendations, we received evidence that the definition of content of democratic importance was wider, or more capable of being interpreted as wider, than the scope the Government seem to have indicated. So there is both a good side and a bad side to this. If we are talking about content which is, or appears to be, specifically intended to contribute to the democratic political debate of the United Kingdom, or a part or area of the United Kingdom, we have got to ask the Minister to put on the record that this is also inclusive of matters which perhaps initially do not appear necessarily to be part of it, but include public health, crime, justice, the environment, professional malpractice, the activities of large corporations and the hypocrisy of public figures when that occurs. I am not suggesting this is what we should be doing all the time, but these are things we often read about in our papers, and much the better off we are for it. However, if these things are not inclusive and not well rooted in the phrase “content of democratic importance”, it is up to the Government to come forward with a better way of expressing that, or perhaps in debate we can find it together.

I have some narrow questions. Are we agreed that what is currently in the Bill is intended specifically to contribute to democratic political debate, and is anything more needed to be said or done in order to make sure that happens? Secondly, the breadth of democratic political debate is obviously important; are there any issues here that are going to trip us up later when the Government come back and say, “Well, that wasn’t what we meant at all, and that doesn’t get covered, and therefore that stuff can be taken down, and that stuff there doesn’t have to be subject to appeal”? Are there contexts and subjects which we need to talk about? This is a long way into the question of content of democratic importance being similar or limited to matters that one recognises as relating to public interest. I think there is a case to be argued for the replacement of what is currently in the Bill with a way of trying to get closer to what we now recognise as being the standard form of debate and discussion when matters, which either the Government of the day or people individually do not like, get taken up and made the subject of legal discussion, because we do have discussions about whether or not it is in the public interest.

We probably do not know what that means. Therefore, a third part of my argument is that perhaps this is the point at which we try to define this, even though that might cause a lot of reaction from those currently in the press. In a sense, it is a question that needs to be resolved. Maybe this is or is not the right time to do that. Are the Government on the same page as the Joint Committee on this? Do they have an alternative and is this what they are trying to get across in the Bill?

Can we have a debate and discussion in relation to those things, making it clear that we want something in the Bill ensuring that vibrant political debate—the sort of things the noble Baroness was talking about on freedom of expression, but in a broader sense covering all the things that matter to the body politic, the people of this country—is not excluded by the Bill? That was the reason for putting down a raft of rather aggressive amendments. I hope it has been made clear that that was the case. I have other things that I would like to come back to, but I will probably do that towards the end of the debate. I hope that has been helpful.

Baroness Bull (CB)

My Lords, I will speak to the amendments in the name of the noble Baroness, Lady Stowell, to which I have added my name. As we heard, the amendments originally sat in a different group, on the treatment of legal content accessed by adults. Noble Lords will be aware from my previous comments that my primary focus for the Bill has been on the absence of adequate provisions for the protection of adults, particularly those who are most vulnerable. These concerns underpin the brief remarks I will make.

The fundamental challenge at the heart of the Bill is the need to balance protection with the right to freedom of expression. The challenge, of course, is how. The noble Baroness’s amendments seek to find that balance. They go beyond the requirements on transparency reporting in Clause 68 in several ways. Amendment 46 would provide a duty for category 1 services to maintain an up-to-date document for users of the service, ensuring that users understand the risks they face and how, for instance, user empowerment tools can be used to help mitigate these risks. It also provides a duty for category 1 services to update their risk assessments before making any “significant change” to the design or operation of their service. This would force category 1 services to consider the impact of changes on users’ safety and make users aware of changes before they happen, so that they can take any steps necessary to protect themselves and prepare for them. Amendment 47 provides additional transparency by providing a duty for category 1 services to release a public statement of the findings of the most recent risk assessment, which includes any impact on freedom of expression.

The grouping of these amendments is an indication, if any of us were in doubt, of the complexity of balancing the rights of one group against the rights of another. Regardless of the groupings, I hope that the Minister takes note of the breadth and depth of concerns, as well as the willingness across all sides of the Committee to work together on a solution to this important issue.

Viscount Colville of Culross (CB)

My Lords, I put my name to Amendment 51, which is also in the name of the noble Lords, Lord Stevenson and Lord McNally. I have done so because I think Clause 15 is too broad and too vague. I declare an interest, having been a journalist for my entire career. I am currently a series producer of a series of programmes on Ukraine.

This clause allows journalism on the internet to be defined simply as the dissemination of information, which surely covers all posts on the internet. Anyone can claim that they are a journalist if that is the definition. My concern is that it will make a nonsense of the Bill if all content is covered as journalism.

I support the aims behind the clause to protect journalism in line with Article 10. However, I am also aware of the second part of Article 10, which warns that freedom of speech must be balanced by duties and responsibilities in a democratic society. This amendment aims to hone the definition of journalism to that which is in the public interest. In doing so, I hope it will respond to the demands of the second part of Article 10.

It has never been more important to create this definition of journalism in the public interest. We are seeing legacy journalism of newspapers and linear television being supplanted by digital journalism. Both legacy and new journalism need to be protected. This can be a single citizen journalist, or an organisation like Bellingcat, which draws on millions of digital datapoints to create astonishing digital journalism to prove things such as that Russian separatist fighters shot down flight MH17 over Ukraine.

The Government’s view is that the definition of “in the public interest” is too vague to be useful to tech platforms when they are systematically filtering through possible journalistic content that needs to be protected. I do not agree. The term “public interest” is well known to the courts from the Defamation Act 2013. The law covers the motivation of a journalist, but does not go on to define the content of journalism to prove that it is in the public interest.

21:00
Surely what defines the public interest in journalism is proof that a process has been followed to ensure the accuracy and fairness of the information purveyed. A journalist using a public interest defence would show that they have checked the facts for accuracy by using authoritative or verifiable sources for their information. But, if the Government will not accept this definition and say that it is too hard to define “public interest”, the response should be to look at the laws that do that.
I ask the Committee to look at the public interest tests put forward by the Information Commissioner’s Office when deciding whether to grant a freedom of information request. They require the content to “promote public understanding” and safeguard the democratic process, uphold “standards of integrity”, ensure “justice and fair treatment” for all, and ensure the “best use” of public resources.
This is not an extensive list of the criteria that can be used to define “public interest”, so I also suggest that the Minister looks at the Public Interest Disclosure Act 1998, which aims to protect employees from unfair dismissal due to whistleblowing. It goes further in trying to define the disclosures that might be protected because they are in the public interest: a request should ensure that the information disclosed will reveal
“that a criminal offence has been committed, … that a person has failed … to comply with any … legal obligation to which he is subject, … that a miscarriage of justice has occurred, … that the health or safety of any individual has been … endangered”,
or
“that the environment has been … or is likely to be damaged”.
These definitions can be built on or worked through. Both Acts show that Parliament has successfully accepted the concept of the public interest defence and defined it, albeit in a limited way.
This amendment would ensure that category 1 services protect journalism in the public interest. This is not the same as the powerful exemption offered to content provided by news publishers in Clause 50, which are defined by a clear set of criteria. Under Amendment 51, the journalism covered in Clause 15 would not have to belong to a regulator to qualify as being in the public interest; the author just has to prove that they have acted responsibly to deliver accurate and verifiable journalism. This would not stop disinformation appearing on the internet—which should be allowed to continue so that it can be refuted—but it would ensure that it does not benefit from the protection offered by Clause 15.
The Bill changes for ever the controversy about whether the platforms are publishers. Companies come within the scope of the Bill as publishers, and, as such, should have the ability to distinguish content that is accurate and fair public interest journalism and, as Clause 15(2) says, create a service
“using proportionate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account”.
I am a great supporter of freedom of expression, and I am glad that the Bill contains protections for that. However, if category 1 companies will be asked to provide this protection, it has to be less vague and more defined. This amendment offers some way towards an answer.
Lord Kamall (Con)

My Lords, at the beginning of Committee, I promised that I would speak only twice, and this is the second time. I hope that noble Lords will forgive me if I stray from the group sometimes, but I will be as disciplined as I can. I will speak to Amendments 57 and 62, which the noble Baroness, Lady Featherstone, and I tabled. As others have said, the noble Baroness sends her apologies; sadly, she has fractured her spine, and I am sure we all wish her a speedy recovery. The noble Baroness, Lady Fox, has kindly added her name to these amendments.

As I have said, in a previous role, as a research director of a think tank—I refer noble Lords to my registered interests—I became interested in the phenomenon of unintended consequences. As an aside, it is sometimes known as the cobra effect, after an incident during the colonial rule of India, when a British administrator of Delhi devised a cunning plan to rid the city of dangerous snakes. It was simple: he would pay local residents a bounty for each cobra skin delivered. What could possibly go wrong? Never slow to exploit an opportunity, enterprising locals started to farm cobras as a way of earning extra cash. Eventually, the authorities grew wise to this, and the payments stopped. As a result, the locals realised that the snakes were now worthless and released them into the wild, leading to an increase, rather than a decrease, in the population of cobras.

As with the cobra effect, there have been many similar incidents of well-intentioned acts that have unintentionally made things worse. So, as we try to create a safer online space for our citizens, especially children and vulnerable adults, we should try to be as alert as we can to unintended consequences. An example is encrypted messages, which I discussed in a previous group. When we seek access to encrypted messages in the name of protecting children in this country, we should be aware that such technology could lead to dissidents living under totalitarian regimes in other countries being compromised or even murdered, with a devastating impact on their children.

We should also make sure that we do not unintentionally erode the fundamental rights and freedoms that underpin our democracy, and that so many people have struggled for over the centuries. I recognise that some noble Lords may say that that is applicable to other Bills, but I want to focus specifically on the implications for this Bill. In our haste to protect, we may create a digital environment and marketplace that stifles investment and freedom of expression, disproportionately impacting marginalised communities and cultivating an atmosphere of surveillance. The amendments the noble Baroness and I have tabled are designed to prevent such outcomes. They seek to strike a balance between regulating for a safer internet and preserving our democratic values. As many noble Lords have rightly said, all these issues will involve trade-offs; we may disagree, but I hope we will have had an informed debate, regardless of which side of the argument we are on.

We should explicitly outline the duties that service providers and regulators have with respect to these rights and freedoms. Amendment 57 focuses on safeguarding specific fundamental rights and freedoms for users of regulated user-to-user services, including the protection of our most basic human rights. We believe that, by explicitly stating these duties, rather than hoping that they are somehow implied, we will create a more comprehensive framework for service providers to follow, ensuring that their safety policies and procedures do not undermine the essential rights of users, with specific reference to

“users with protected characteristics under the Equality Act 2010”.

Amendment 62 focuses on the role of Ofcom in mitigating risks to freedom of expression. I recognise that there are other amendments in this group on that issue. It is our responsibility to ensure that the providers of regulated user-to-user services are held accountable for their content moderation and recommender systems, to ensure they do not violate our freedoms.

I want this Bill to be a workable Bill. As I have previously said, I support the intention behind it to protect children and vulnerable adults, but as I have said many times, we should also be open about the trade-off between security and protection on the one hand, and freedom of expression on the other. My fear is that, without these amendments, we risk leaving our citizens vulnerable to the unintended consequences of overzealous content moderation, biased algorithms and opaque decision-making processes. We should shine a light on and bring transparency to our new processes, and perhaps help guide them by being explicit about those elements of freedom of speech we wish to preserve.

It is our duty to ensure that the Online Safety Bill not only protects our citizens from harm but safeguards the principles that form the foundation of a free and open society. With these amendments, we hope to transcend partisan divides and to fortify the essence of our democracy. I hope that we can work together to create an online environment that is safe, inclusive and respectful of the rights and freedoms that the people of this country cherish. I hope that other noble Lords will support these amendments, and, ever the optimist, that my noble friend the Minister will consider adopting them.

Baroness Fox of Buckley (Non-Afl)

My Lords, it is a great pleasure to follow the noble Lord, Lord Kamall, who explained well why I put my name to the amendments. I extend my regards to the noble Baroness, Lady Featherstone; I was looking forward to hearing her remarks, and I hope that she is well.

I am interested in free speech; it is sort of my thing. I am interested in how we can achieve a balance and enhance the free speech rights of the citizens of this country through the Bill—it is what I have tried to do with the amendments I have supported—which I fear might be undermined by it.

I have a number of amendments in this group. Amendment 49 and the consequential Amendments 50 and 156 would require providers to include in their terms of service

“by what method content present on the service is to be identified as content of democratic importance”,

and bring Clause 13 in line with Clauses 14 and 15 by ensuring an enhanced focus on the democratic issue.

Amendment 53A would provide that notification is given

“to any user whose content has been removed or restricted”.

It is especially important that the nature of the restriction in place be made clear, evidenced and justified in the name of transparency and—a key point—that the user be informed of how to appeal such decisions.

Amendment 61 in my name calls for services to have

“proportionate systems, processes and policies designed to ensure that as great a weight is given to users’ right to freedom of expression ... as to safety when making decisions”

about whether to take down or restrict users’ access to the online world, and

“whether to take action against a user generating, uploading or sharing content”.

In other words, it is all about applying a more robust duty to category 1 service providers and emphasising the importance of protecting

“a wide diversity of political, social, religious and philosophical opinion”

online.

I give credit to the Government, in that Clause 18 constitutes an attempt by them in some way to balance the damage to individual rights to freedom of expression and privacy as a result of the Bill, but I worry that it is a weak duty. Unlike operational safety duties, which compel companies proactively to prevent or minimise so-called harm in the way we have discussed, there is no such attempt to insist that freedom of speech be given the same regard or importance. In fact, there are worries that the text of the Bill has downgraded speech and privacy rights, which the Open Rights Group says

“are considered little more than a contractual matter”.

There has certainly been a lot of mention of free speech in the debates we have had so far in Committee, yet I am not convinced that the Bill gives it enough credit, which is why I support the explicit reference to it by the noble Lord, Lord Kamall.

I have a lot of sympathy with the amendments of the noble Lord, Lord Stevenson, seeking to replace Clauses 13, 14, 15 and 18 with a single comprehensive duty, because in some ways we are scratching around. That made some sense to me and I would be very interested to hear more about how that might work. Clauses 13, 14, 15 and 18 state that service providers must have regard to the importance of protecting users’ rights to freedom of expression in relation to

“content of democratic importance ... publisher content ... journalistic content”.

The very existence of those clauses, and the fact that we even need those amendments, is an admission by the Government that elsewhere, free speech is a downgraded virtue. We need these carve-outs to protect these things, because the rest of the Bill threatens free speech, which has been my worry from the start.

My Amendment 49 is a response to the Bill’s focus on protecting “content of democratic importance”. I was delighted that this was included, and the noble Lord, Lord Stevenson of Balmacara, has raised a lot of the questions I was asking. I am concerned that it is rather vaguely drawn, and too narrow and technocratic—politics with a big “P”, rather than in the broader sense. There is a lot that I would consider democratically important that other people might see, especially given today’s discussion, as harmful or dangerous. Certainly, the definition should be as broad as possible, so my amendment seeks to write that down, saying that it should include

“political, social, religious and philosophical opinion”.

That is my attempt to broaden it out. It is not perfect, I am sure, but that is the intention.

I am also keen to understand why Clauses 14 and 15, which give special protection to news publisher and journalistic content, have enhanced provisions, including an expedited appeals process for the reinstatement of removed materials, but those duties are much weaker—they do not exist—in Clause 13, which deals with content of democratic importance. In my amendment, I have suggested that they are levelled up.

21:15
My Amendment 61 attempts to tackle the duties that will be used for companies in terms of safety, which is the focus of the Bill. It stresses that equal weight should be given to free speech and to safety. This relates to the content of democratic importance that I have just been talking about, because I argue that democracy is not safe if we do not proactively promote freedom. Both those amendments try to ensure that companies act to remove philosophical, religious, democratic and social material only in extremis—as an exception, not the rule—and that they always have free speech at the forefront.
On the issue of how we view content of democratic importance, one thing has not been stressed in our discussions so far. We should note that the right to freedom of expression is not just about defending the ability of individuals to speak or impart information; it is also the right of the public to receive information and the freedom to decide what they find useful or second-rate and what they want to watch or listen to. It is not just the right to post opinions but the right of others to have access to diverse opinions and postings; that kind of free flow of information is the very basis of our democracy. In my view, despite its talk of user controls and user empowerment, the Bill does not allow for that or take it into account enough.
It is very important, therefore, that users are told if their posts are restricted, how they are restricted and how they can appeal. That is the focus of Amendment 53A. The EHRC says that the Bill overall lacks a robust framework for individuals to appeal platforms’ decisions or to seek redress for unjustified censorship. I think that needs to be tackled. Clause 19 has a basic complaints procedure, but my amendment to Clause 17 tries to tackle what is a very low bar by stressing the need for “evidenced justification” and details on how to appeal. Users need to know exactly why there has been a decision to restrict or remove. That is absolutely crucial.
Ofcom is the enforcer in all this, with the Secretary of State of the day being given a plethora of new delegated powers, which I think we need to be concerned about. As the coalition group Legal to Say, Legal to Type notes, the Bill in its current form gives extensive powers to the Secretary of State and Ofcom:
“This would be the first time since the 1600s that written speech will be overseen by the state in the UK”.
The truth is that we probably need a new Milton, but in 2023 what we have instead is a Moylan. I have put my name to a range of the excellent series of amendments from the noble Lord, Lord Moylan, including Amendments 102, 191 and 220, all dealing with Ofcom and the Secretary of State. As he will explain, it is really crucial that we take that on.
I did not put my name to the noble Lord’s Amendment 294, although I rather wish I had. In some ways this is a key amendment, as it would leave out the word “psychological” from the definition of harm. As we have gone through all these discussions so far in Committee and at Second Reading and so on, the definition of harm is something that, it seems to me, is very slippery and difficult. People just say, “We have to remove harmful content” or, “It is okay to remove harmful content”, but it is not so simple.
I know that any philosophical rumination is frowned upon at this stage—I was told off for it the other day—but, as this is the 150th anniversary of JS Mill’s death, let me note that his important harm principle has been somewhat bastardised by an ever-elastic concept of harm.
Psychological harm, once added into the mix—I spoke about this before—is going to lead to the over-removal of lawful content, because what counts as harm is not settled online or offline. There is no objective way of ascertaining whether emotional or psychological harm has occurred. Therefore, it will be impossible to determine whether service providers have discharged their duties. Controversies of interpretation about what is harmful have already left the door open to activist capture, and this concept is regularly weaponised to close down legitimate debate.
The concept of harm, once expanded to include psychological harm, is subject to concept creep and subjectivity. The lack of definition was challenged by the Lords Communications and Digital Committee when it wrote to the Secretary of State asking whether psychological harm had any objective clinical basis. DCMS simply confirmed that it did not, yet psychological harm is going to be used as a basis for removing lawful speech from the online world. That can lead only to a censorious and, ironically, more toxic online environment, with users posting in good faith finding their access to services—access that is part of the democratic public square—being shut down temporarily or permanently, even reported to the law or what have you, just because they have been accused of causing psychological harm. The free speech elements of the Bill need to be strengthened enormously.
Lord Hope of Craighead (CB)

My Lords, my Amendment 63 is about the meaning of words. It was an interesting feature of the speech made by the noble Baroness, Lady Fox of Buckley, which we have just had the pleasure of listening to, that she slipped from time to time from the phrase “freedom of expression” to “freedom of speech”. That is not a criticism; it is very easy for one to treat these expressions as meaning the same thing. Others in this debate have done the same thing. I think that the noble Baroness, Lady Stowell, used “freedom of speech” sometimes, as well as “freedom of expression”. It is not a criticism; it is just a fact that we tend to treat the two the same.

However, the Government in Clause 18 have chosen to use the words

“freedom of expression within the law”.

My amendment draws attention to that feature. If we work our way through Clause 18, its purpose is to set out the duties about freedom of expression and privacy that are to apply in relation to the user-to-user services referred to in that clause. Clause 18(2) imposes on those providing user-to-user services

“a duty to have particular regard to the importance of protecting users’ right to freedom of expression within the law”

when deciding on and implementing safety measures and policies. Clause 18(8) provides a definition of the phrase “safety measures and policies”, which

“means measures and policies designed to secure compliance with any of the duties set out”

in previous clauses of the Bill. These extend to illegal content, to children’s online safety, to user empowerment, to content reporting relating to illegal content and content that is likely to be harmful to children, and to complaints procedures. So a balance has to be struck between giving effect to the right to freedom of expression within the law and performing the important duties referred to in the clause. As Clause 18(4) explains, when decisions are being taken about the safety measures and policies that are to be introduced or applied, there must be an assessment of the impact that they would have on the user’s right to freedom of expression within the law.

My amendment was prompted by a point made by the Constitution Committee, of which I am a member, in its report on the Bill. It suggested that the House might wish to consider whether, in the interests of legal certainty, the expression “freedom of expression” should also be defined for the purposes of this clause.

The committee referred to the fact that in its report on the Higher Education (Freedom of Speech) Bill, it recommended that that Bill should define the expression “freedom of speech”, which is what that Bill was talking about, by referring to Article 10 of the European Convention on Human Rights. I raised this issue by proposing an amendment to that effect in Committee on that Bill. On Report, a government amendment to achieve that was agreed to and, in due course, it was also agreed by the House of Commons. My Amendment 63 adopts the same wording as that used in the Higher Education (Freedom of Speech) Bill, and I suggest that it should be adopted here, too, in the interests of consistency and to provide the desirable element of legal certainty.

Although it appears in a different group, I think it is worth referring to Amendment 58 in the names of the noble Baroness, Lady Fraser of Craigmaddie, and the noble Lord, Lord Foulkes of Cumnock. It proposes the insertion of the words

“as defined under the Human Rights Act 1998 and its application to the United Kingdom”,

so it is making the same point and an additional one, which is this. We have to be very careful in this Bill to recognise that it extends to all parts of the United Kingdom, particularly in regard to the devolved Administrations in Scotland, Wales and Northern Ireland. Scotland is very active in promoting legislation dealing with matters of this kind, and it is rather important that we should define in the Bill what is meant by

“freedom of expression within the law”

in its application throughout the United Kingdom, lest there should be any doubt as to what it might mean in the other parts of this country—particularly, if I may say so, with regard to Scotland. The noble Baroness, Lady Fraser, may say more about this at this stage, although her amendment is in a different group, because it is very pertinent to the point I am trying to make about the need for a definition in Clause 18.

That is the reasoning behind the amendment, and I come back to the interesting feature that one tends to mix the expressions “freedom of speech” and “freedom of expression”, but it is important to anchor exactly why the Government chose to use the words

“freedom of expression within the law”

for the purposes of this clause.

Lord Moylan (Con)

My Lords, I hung back in the hope that the noble and learned Lord, Lord Hope of Craighead, would speak before me, because I suspected that his remarks would help elucidate my amendments, as I believe they have. I have a large number of amendments in this group, but all of them, with one exception, work together as, effectively, a single amendment. They are Amendments 101, 102, 109, 112, 116, 121, 191 and 220. The exception is Amendment 294, to which the noble Baroness, Lady Fox of Buckley, alluded and to which I shall return in a moment.

Taking that larger group of amendments first, I can describe their effect relatively briefly. In the Bill, there are requirements on services to consider how their practices affect freedom of expression, but there is no equivalent explicit duty on the regulator, Ofcom, to have regard to freedom of expression.

These amendments, taken together, would require Ofcom to

“have special regard to freedom of expression”

within the law when designing codes of practice, writing guidance and undertaking enforcement action. They would insert a new clause requiring Ofcom to have special regard to rights to freedom of expression within the law in preparing a code of practice; they would also require Ofcom, when submitting a draft code to the Secretary of State, to submit a statement setting out that it had complied with the duty imposed by that new requirement; and they would require the Secretary of State to submit that statement to Parliament when laying a draft code before Parliament. They would impose similar obligations on Ofcom and the Secretary of State when making amendments to codes that might be made later. Finally, they would have a similar effect relating to guidance issued by Ofcom.

It is so glaringly obvious that Ofcom should be under this duty that it must be a mere omission that the balancing, corresponding duty has not been placed on it that has been placed on the providers. I would hope, though experience so far in Committee does not lead me to expect, that my noble friend would accept this, and that it would pass relatively uncontroversially.

21:30
I will say no more about it, except to make one slightly more reflective comment—and here I am very conscious of speaking in the presence of the noble and learned Lord, Lord Hope of Craighead, who is perfectly entitled to correct me if I stray. There has been a great deal of comment from the Front Bench and from other parts of the Committee about how the Bill has to balance freedom of expression with safety, and inevitably such a balance is required. But in any such balance, the scales have to be tipped in favour of freedom of expression, because freedom of expression is a human right in the European Convention on Human Rights.
It is true of course that the second part of Article 10 allows it to be mitigated in some ways, but the starting point has to be the first clause of Article 10, which states that freedom of expression stands as a fundamental human right. Every abridgement of it has to be justified individually in relation to the second part; it is not enough to say that the two are somehow equal and that we have to find a balance that is purely prudential or that fits in with our notions of common sense or good judgment. There is a weighting in that balance, and that weighting is in favour of freedom of expression. So, I would strongly encourage noble Lords to bear that in mind, and I hope that this relatively simple proposal will find widespread acceptance.
I come now to Amendment 294, which is completely different but relates to this question of the definition of harm. As the noble Baroness, Lady Fox of Buckley, said, harm is defined very loosely and vaguely in the Bill—it is defined simply as “physical or psychological harm”, which is a self-referential definition and expands it somewhat.
I think we all understand what might be meant by “physical harm”, but, when it comes to “psychological harm”, I could understand a definition that had a basis in medical science. Perhaps the right word for such a definition would be “psychiatric harm”; I could understand that because medical science has some objective basis to it. But when one finds the words “psychological harm” being used, and when the department confirms that there is no objective basis for it, one is effectively opening the door to talking about “feelings”.
I know of course that there are genuine psychological harms which give great concern to Members of this Committee, including myself. Psychological harms that lead to eating disorders are a good example, and I understand that; I am not trying to trivialise psychological harms. This amendment is a probing amendment; it is trying to find out what the Government mean and what boundaries, if any, they set to their understanding of the term “psychological”. If there are no boundaries, it really does extend to “feelings”, because that is how the term is increasingly used, especially among the young—and that is a very loose definition.
So, in probing the Government on what they mean by “psychological harm”, I hope to have something hard and solid coming back from them that we know sets some limits to where this can take us.
Lord Black of Brentwood (Con)

My Lords, this is my first opportunity to speak in Committee on this important Bill, but I have followed it very closely, and the spirit in which constructive debate has been conducted has been genuinely exemplary. In many ways, it mirrors the manner in which the Joint Committee, on which I had the privilege to serve with other noble Lords, was conducted, and its report rightly has influenced our proceedings in so many ways. I declare an interest as deputy chairman of Telegraph Media Group, which is a member of the News Media Association, and a director of the Regulatory Funding Company, and note my other interests as set out in the register.

I will avoid the temptation to ruminate philosophically, as the noble Baroness, Lady Fox, entertained us by doing. I will speak to Amendment 48, in the name of the noble Lord, Lord Stevenson of Balmacara, and the other amendments which impact on the definition of “recognised news publisher”. As the noble Lord said, his amendments are pretty robust in what they seek to achieve, but I am very pleased that he has tabled them, because it is important that we have a debate about how the Bill impacts on freedom of expression—I use that phrase advisedly—and press and media freedom. The noble Lord’s aims are laudable but do not quite deliver what he intends.

I will explain why it is important that Clauses 13 and 14 stand part of the Bill, and without amendments of the sort proposed. The Joint Committee considered this issue in some detail and supported the inclusion of the news publisher content exemption. These clauses are crucial to the whole architecture of the Bill because they protect news publishers from being dragged into an onerous regime of statutory content control. The press—these clauses cover the broadcasters too—have not been subject to any form of statutory regulation since the end of the 17th century. That is what we understand by press freedom: that the state and its institutions do not have a role in controlling or censoring comment. Clauses 13 and 14 protect that position and ensure that the media, which is of course subject to rigorous independent standard codes as well as to criminal and civil law, does not become part of a system of state regulation by the back door because of its websites and digital products.

That is what is at the heart of these clauses. However, it is not a carte blanche exemption without caveats. As the Joint Committee looked at, and as we have heard, to qualify for it, publishers must meet stringent criteria, as set out in Clause 50, which include being subject to standards codes, having legal responsibility for material published, having effective policies to handle complaints, and so on. It is exactly the same tough definition as was set out in the National Security Bill, which noble Lords across the House supported when it was on Report here.

Without such clear definitions, alongside requirements not to take down or restrict access to trusted news sources without notification, opaque algorithms conjured up in Silicon Valley would end up restricting the access of UK citizens to news, with scant meaningful scope for reinstating it given the short shelf life of news. Ultimately, that would have a profound impact on the public’s right to access news, something which the noble Baroness rightly highlighted. That is why the Joint Committee recommended, at paragraph 304 of its report, that the Bill was

“strengthened to include a requirement that news publisher content should not be moderated, restricted or removed unless it is content the publication of which clearly constitutes a criminal offence, or which has been found to be unlawful by order of a court within the appropriate jurisdiction”.

The Government listened to that concern that the platforms would put themselves in the position of censor on issues of democratic importance, and quite rightly amended the draft Bill to deal with that point. Without it, instead of trusted, curated, regulated news comment, from the BBC to the Guardian to the Manchester Evening News, news would end up being filtered by Google and Facebook. That would be a crushing blow to free speech, to which all noble Lords are absolutely committed.

So, instead of these clauses acting as a bulwark against disinformation by protecting content of democratic importance, they would weaken the position of trusted news providers by introducing too much ambiguity into the system. As we all know, ambiguity brings with it legal challenge and constant controversy. This is especially so given that the exemptions that we are talking about already exist in statute elsewhere, which would cause endless confusion.

I understand the rationale behind many of the amendments, but I fear they would not work in practice. Free speech—and again I use the words advisedly—is a very delicate bloom, which can easily be swept away by badly drafted, uncertain or opaque laws. Its protection needs certainty, which is what the Bill, as it stands, provides. A general catch-all clause would be subject, I fear, to endless argument with the platforms, which are well known for such tactics and for endless legal wrangling.

I noted the remarks of the noble Lord, Lord Stevenson of Balmacara, in his superb speech on the opening day in Committee, when he said that one issue with the Bill is that it

“is very difficult to understand, in part because of its innate complexity and in part because it has been revised so often”. [Official Report, 19/4/23; col. 700.]


He added, in a welcome panegyric to clarity and concision, that given that it is a long and complex Bill, why would we add to it? I agree absolutely with him, but those are arguments for not changing the Bill in the way he proposes. I believe the existing provisions are clear and precise, practical and carefully calibrated. They do not leave room for doubt, and protect media freedom, investigative journalism and the citizen’s right to access authoritative news, which is why I support the Bill as it stands.

Baroness Kidron (CB)

My Lords, given the lateness of the hour, I will make just three very brief points. The first is that I find it really fascinating that the amendments in the name of the noble Baroness, Lady Stowell, come from a completely different perspective, but still demand transparency over what is going on. I fully support the formulation that she has found, and I think that in many ways they are better than the other ones which came from the other perspective. But what I urge the Minister to hear is that we all seek transparency over what is going on.

Secondly, in many of the amendments—I think I counted about 14 or 15 in the name of the noble Lord, Lord Moylan, and also of the noble Lord, Lord Kamall—there is absolutely nothing I disagree with. My problem with these amendments really goes back to the debate we had on the first day on Amendment 1, in the name of the noble Lord, Lord Stevenson. He set out the purposes of the Bill, and the Minister gave what was considered by most Members of your Lordships’ House to be the groundwork of a very excellent alternative, in the language of government. It appears, as we go on, that many dozens of amendments could be dropped in favour of this purposive clause, which itself could include reference to human rights, children’s rights, the Equality Act, the importance of freedom of expression under the law, and so on. I urge the Minister to consider the feeling of the House: that the things said at the Dispatch Box to be implicit, again and again, the House requires to be explicit. This is one way we could do it, in short form, as the noble Lord, Lord Black, just urged us.

Thirdly, I do have to speak against Amendment 294. I would be happy to take the noble Lord, Lord Moylan, through dozens of studies that show the psychological impact of online harms: systems that groom users to gamble, that reward them for being online at any cost to their health and well-being, that profile them to offer harmful material, and more of the same whether they ask for it or not, and so on. I am also very happy to put some expert voices at his disposal, but I will just say this: the biggest clue as to why this amendment is wrongheaded is the number of behavioural psychologists that are employed by the tech sector. They are there, trying to get at our behaviours and thoughts; they anticipate our move and actually try to predict and create the next move. That is why we have to have psychological harm in the Bill.

21:45
Baroness Fraser of Craigmaddie (Con)

I will not detain noble Lords very long either. Two things have motivated me to be involved in this Bill. One is protection for vulnerable adults and the second is looking at this legislation with my Scottish head on, because nobody else seems to be looking at it from the perspective of the devolved Administrations.

First, on protection for vulnerable adults, we have already debated the fact that in an earlier iteration of this Bill, there were protections. These have been watered down and we now have the triple shield. Whether they fit here, with the amendment from my noble friend Lady Stowell, or fit earlier, what we are all asking for is the reinstatement of risk assessments. I come at this from a protection of vulnerable groups perspective, but I recognise that others come at it from a freedom of expression perspective. I do not think the Minister has answered my earlier questions. Why have risk assessments been taken out and why are they any threat? It seems to be the will of the debate today that they do nothing but strengthen the transparency and safety aspects of the Bill, wherever they might be put.

I speak with trepidation to Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead. I flatter myself that his amendment and mine are trying to do a similar thing. I will speak to my amendment when we come to the group on devolved issues, but I think what both of us are trying to establish is, given that the Bill is relatively quiet on how freedom of expression is defined, how do platforms balance competing rights, particularly in the light of the differences between the devolved Administrations?

The Minister will know that the Hate Crime and Public Order (Scotland) Act 2021 made my brain hurt when trying to work out how this Bill affects it, or how it affects the Bill. What is definitely clear is that there are differences between the devolved Administrations in how freedom of expression is interpreted. I will study the noble and learned Lord’s remarks very carefully in Hansard; I need a little time to think about them. I will listen very carefully to the Minister’s response and I look forward to the later group.

Baroness Harding of Winscombe (Con)

My Lords, I too will be very brief. As a member of the Communications and Digital Committee, I just wanted to speak in support of my noble friend Lady Stowell of Beeston and her extremely powerful speech, which seems like it was quite a long time ago now, but it was not that long. I want to highlight two things. I do not understand how, as a number of noble Lords have said, having risk assessments is a threat to freedom of expression. I think the absolute opposite is the case. They would enhance all the things the noble Baroness, Lady Fox, is looking to see in the Bill, just as much as they would enhance the protections that my noble friend, who I always seem to follow in this debate, is looking for.

Like my noble friend, I ask the Minister: why not? When the Government announced the removal of “legal but harmful” and the creation of user empowerment tools, I remember thinking—in the midst of being quite busy with Covid—“What are user empowerment tools and what are they going to empower me to do?” Without a risk assessment, I do not know how we answer that question. The risk is that we are throwing that question straight to the tech companies to decide for themselves. A risk assessment provides the framework that would enable user empowerment tools to do what I think the Government intend.

Finally, I too will speak against my noble friend Lord Moylan’s Amendment 294 on psychological harm. It is well documented that tech platforms are designed to drive addiction. Addiction can be physiological and psychological. We ignore that at our peril.

Lord Clement-Jones (LD)

My Lords, it is a pleasure to have been part of this debate and to have heard how much we are on common ground. I very much hope that, in particular, the Minister will have listened to the voices on the Conservative Benches that have very powerfully put forward a number of amendments that I think have gained general acceptance across the Committee.

I fully understand the points that the noble Lord, Lord Black, made and why he defends Clause 14. I hope we can have a more granular discussion about the contents of that clause rather than wrap it up on this group of amendments. I do not know whether we will be able to have that on the next group.

I thank the noble Baroness, Lady Stowell, for putting forward her amendment. It is very interesting, as the noble Baronesses, Lady Bull and Lady Fraser, said, that we are trying to get to the same sort of mechanisms of risk assessment, perhaps out of different motives, but we are broadly along the same lines and want to see them for adult services. We want to know from the Minister why we cannot achieve that, basically. I am sure we could come to some agreement between us as to whether user empowerment tools or terms of service are the most appropriate way of doing it.

We need to thank the committee that the noble Baroness chairs for having followed up on the letter to the Secretary of State for DCMS, as was, on 30 January. It is good to see a Select Committee using its influence to go forward in this way.

The amendments tabled by the noble Lord, Lord Kamall, and supported by my noble friend Lady Featherstone—I am sorry she is unable to be here today, as he said—are important. They would broaden out consideration in exactly the right kind of way.

However, dare I say it, probably the most important amendment in this group is Amendment 48 in the name of the noble Lord, Lord Stevenson. Apart from the Clause 14 stand part notice, it is pretty much bang on where the Joint Committee got to. He was remarkably tactful in not going into any detail on the Government’s response to that committee. I will not read it out because of the lateness of the hour, but the noble Viscount, Lord Colville, got pretty close to puncturing the Government’s case that there is no proper definition of public interest. It is quite clear that there is a perfectly respectable definition in the Human Rights Act 1998 and, as the noble Viscount said, in the Defamation Act 2013, which would be quite fit for purpose. I do not quite know why the Government responded as they did at paragraph 251. I very much hope that the Minister will have another look at that.

The amendment from the noble and learned Lord, Lord Hope, which has the very respectable support of Justice, is also entirely apposite. I very much hope that the Government will take a good look at that.

Finally, and extraordinarily, I have quite a lot of sympathy with the amendments from the noble Lord, Lord Moylan. It was all going so well until we got to Amendment 294; up to that point I think he had support from across the House, because placing that kind of duty on Ofcom would be a positive way forward.

As I say, getting a clause of the kind that the noble Lord, Lord Stevenson, has put forward, with that public interest content point and with an umbrella duty on freedom of expression, allied to the definition from the noble and learned Lord, Lord Hope, would really get us somewhere.

Lord Stevenson of Balmacara (Lab)

Lawyers—don’t you love them? How on earth are we supposed to unscramble that at this time of night? It was good to have my kinsman, the noble and learned Lord, Lord Hope, back in our debates. We were remarking only a few days ago that we had not seen enough lawyers in the House in these debates. One appears, and light appears. It is a marvellous experience.

I thank the Committee for listening to my earlier introductory remarks; I hope they helped to untangle some of the issues. The noble Lord, Lord Black, made it clear that the press are happy with what is in the current draft. There could be some changes, and we have heard a number of examples of ways in which one might either top or tail what there is.

There was one question that perhaps he could have come back on, and maybe he will, as I have raised it separately with the department before. I agree with a lot of what he said, but it applies to a lot more than just news publishers. Quality journalism more generally enhances and restores our faith in public services in so many ways. Why is it only the news? Is there a way in which we could broaden that? If there is not this time round, perhaps that is something we need to pick up later.

As the noble Lord, Lord Clement-Jones, has said, the noble Viscount, Lord Colville, made a very strong and clear case for trying to think again about what journalism does in the public realm and making sure that the Bill at least carries that forward, even if it does not deal with some of the issues that he raised.

We have had a number of other good contributions about how to capture some of the good ideas that were flying around in this debate and keep them in the foreground so that the Bill is enhanced. But I think it is time that the Minister gave us his answers.

Lord Parkinson of Whitley Bay (Con)

I join noble Lords who have sent good wishes for a speedy recovery to the noble Baroness, Lady Featherstone.

Amendments 46, 47 and 64, in the name of my noble friend Lady Stowell of Beeston, seek to require platforms to assess the risk of, and set terms for, content currently set out in Clause 12. Additionally, the amendments seek to place duties on services to assess risks to freedom of expression resulting from user empowerment tools. Category 1 platforms are already required to assess the impact on free expression of their safety policies, including user empowerment tools; to keep that assessment up to date; to publish it; and to demonstrate the positive steps they have taken in response to the impact assessment in a publicly available statement.

Amendments 48 and 100, in the name of the noble Lord, Lord Stevenson, seek to introduce a stand-alone duty on category 1 services to protect freedom of expression, with an accompanying code of practice. Amendments 49, 50, 53A, 61 and 156, in the name of the noble Baroness, Lady Fox, seek to amend the Bill’s Clause 17 and Clause 18 duties and clarify duties on content of democratic importance.

All in-scope services must already consider and implement safeguards for freedom of expression when fulfilling their duties. Category 1 services will need to be clear what content is acceptable on their services and how they will treat it, including when removing or restricting access to it, and that they will enforce the rules consistently. In setting these terms of service, they must adopt clear policies designed to protect journalistic and democratic content. That will ensure that the most important types of content benefit from additional protections while guarding against the arbitrary removal of any content. Users will be able to access effective appeal mechanisms if content is unfairly removed. That marks a considerable improvement on the status quo.

Requiring all user-to-user services to justify why they are removing or restricting each individual piece of content, as Amendment 53A would do, would be disproportionately burdensome on companies, particularly small and medium-sized ones. It would also duplicate some of the provisions I have previously outlined. Separately, as private entities, service providers have their own freedom of expression rights. This means that platforms are free to decide what content should or should not be on their website, within the bounds of the law. The Bill should not mandate providers to carry or to remove certain types of speech or content. Accordingly, we do not think it would be appropriate to require providers to ensure that free speech is not infringed, as suggested in Amendment 48.

22:00
Similarly, it would not be appropriate to require providers to give the same weight to protecting freedom of expression as to safety, as required under Amendment 61. Both amendments would, in effect, require platforms to carry legal content—even if they did not wish to—for safety, commercial or other reasons. This would likely result in worse outcomes for many users.
We have designed the regulatory framework to balance protecting user safety and freedom of expression. Platforms and Ofcom have duties relating to freedom of expression for which they can be held to account. A “must balance” test suggests there is a clear line to be drawn as to where legal content should be removed. This is in conflict with our policy, which accepts that it would be inappropriate for the Government to require companies to remove legal content accessed by adults. It also recognises that, as private entities, companies have the right to remove legal content from their services if they wish to do so. Preventing them from doing so by requiring them to balance this against other priorities could have unintended consequences.
Government Amendments 50A and 50F in my name seek to clarify that the size and capacity of the provider are important in construing the reference to proportionate systems and processes with regard to the duties on category 1 services to protect journalistic content and content of democratic importance. These amendments increase legal certainty and make the structure of these clauses consistent with other references to proportionality in the Bill. Without these amendments, it would be less clear which factors are important when construing whether a provider’s systems and processes to protect journalistic content and content of democratic importance are proportionate.
Amendment 51 in the name of the noble Lord, Lord Stevenson of Balmacara, seeks to change the duty of category 1 services to protect journalistic content so it applies only to journalism which they have judged to be in the public interest. This would delegate an inappropriate amount of power to platforms. Category 1 platforms are not in a position to decide what information is in the interests of the British public. Requiring them to do so would undermine why we introduced the Clause 15 duties—
Viscount Colville of Culross (CB)

Why would it not be possible for us to try to define what the public interest might be, and not leave it to the platforms to do so?

Lord Parkinson of Whitley Bay (Con)

I ask the noble Viscount to bear with me. I will come on to this a bit later. I do not think it is for category 1 platforms to do so.

We have introduced Clause 15 to reduce the powers that the major technology companies have over what journalism is made available to UK users. Accordingly, Clause 15 requires category 1 providers to set clear terms of service which explain how they take the importance of journalistic content into account when making their moderation decisions. These duties will not stop platforms removing journalistic content. Platforms have the flexibility to set their own journalism policies, but they must enforce them consistently. They will not be able to remove journalistic content arbitrarily. This will ensure that platforms give all users of journalism due process when making content moderation decisions. Amendment 51 would mean that, where platforms subjectively reached a decision that journalism was not conducive to the public good, they would not have to give it due process. Platforms could continue to treat important journalistic content arbitrarily where they decided that this content was not in the public interest of the UK.

In his first remarks on this group the noble Lord, Lord Stevenson, engaged with the question of how companies will identify content of democratic importance, which is content that seeks to contribute to democratic political debate in the UK at a national and local level. It will be broad enough to cover all political debates, including grass-roots campaigns and smaller parties. While platforms will have some discretion about what their policies in this area are, the policies will need to ensure that platforms are balancing the importance of protecting democratic content with their safety duties. For example, platforms will need to consider whether the public interest in seeing some types of content outweighs the potential harm it could cause. This will require companies to set out in their terms of service how they will treat different types of content and the systems and processes they have in place to protect such content.

Amendments 57 and 62, in the name of my noble friend Lord Kamall, seek to impose new duties on companies to protect a broader range of users’ rights, as well as to pay particular attention to the freedom of expression of users with protected characteristics. As previously set out, services will have duties to safeguard the freedom of expression of all users, regardless of their characteristics. Moreover, UK providers have existing duties under the Equality Act 2010 not to discriminate against people with characteristics which are protected in that Act. Given the range of rights included in Amendment 57, it is not clear what this would require from service providers in practice, and their relevance to service providers would likely vary between different rights.

Amendment 60, in the name of the noble Lord, Lord Clement-Jones, and Amendment 88, in the name of the noble Lord, Lord Stevenson, probe whether references to privacy law in Clauses 18 and 28 include Article 8 of the European Convention on Human Rights. That convention applies to member states which are signatories. Article 8(1) requires signatories to ensure the right to respect for private and family life, home and correspondence, subject to limited derogations that must be in accordance with the law and necessary in a democratic society. The obligations flowing from Article 8 do not apply to individuals or to private companies and it would not make sense for these obligations to be applied in this way, given that states which are signatories will need to decide under Article 8(2) which restrictions on the Article 8(1) right they need to impose. It would not be appropriate or possible for private companies to make decisions on such restrictions.

Providers will, however, need to comply with all UK statutory and common-law provisions relating to privacy, and must therefore implement safeguards for user privacy when meeting their safety duties. More broadly, Ofcom is bound by the Human Rights Act 1998 and must therefore uphold Article 8 of the European Convention on Human Rights when implementing the Bill’s regime.

Lord Stevenson of Balmacara (Lab)

It is so complicated that the Minister is almost enticing me to stand up and ask about it. Let us just get that right: the reference to the Article 8 powers exists and applies to those bodies in the UK to which such equivalent legislation applies, so that ties us into Ofcom. Companies cannot be affected by it because it is a public duty, not a private duty, but am I then allowed to walk all the way around the circle? At the end, can Ofcom look back at the companies to establish whether, in Ofcom’s eyes, its requirements in relation to its obligations under Article 8 have or have not taken place? It is a sort of transparent, backward-reflecting view rather than a proactive proposition. That seems a complicated way of saying, “Why don’t you behave in accordance with Article 8?”

Lord Parkinson of Whitley Bay (Con)

Yes, Ofcom, which is bound by it through the Human Rights Act 1998, can ask those questions and make that assessment of the companies, but it would not be right for private companies to be bound by something to which it is not appropriate for companies to be signatories. Ofcom will be looking at these questions but the duty rests on it, as bound by the Human Rights Act.

Lord Stevenson of Balmacara (Lab)

It is late at night and this is slightly tedious, but in the worst of all possible circumstances, Ofcom would be looking at what happened over the last year in relation to its codes of practice and assertions about a particular company. Ofcom is then in trouble because it has not discharged its Article 8 obligations, so who gets to exercise a whip on whom? Sorry, whips are probably the wrong things to use, but you see where I am coming from. All that is left is for the Secretary of State, but probably it would effectively be Parliament, to say to Ofcom, “You’ve failed”. That does not seem a very satisfactory solution.

Lord Parkinson of Whitley Bay (Con)

Platforms will be guided by Ofcom in taking measures to comply with their duties which are recommended in Ofcom’s codes, and which contain safeguards for privacy, including ones based on the European Convention on Human Rights and the rights therein. Paragraph 10(2)(b) of Schedule 4 requires Ofcom to ensure that measures, which it describes in the code of practice, are designed in light of the importance of protecting the privacy of users. Clause 42(2) and (3) provides that platforms will be treated as complying with the privacy duties set out at Clause 18(2) and Clause 28(2), if they take the recommended measures that Ofcom sets out in the codes.

Lord Stevenson of Balmacara (Lab)

That is the point I was making.

Lord Parkinson of Whitley Bay (Con)

It worked. In seriousness, we will both consult the record and, if the noble Lord wants more, I am very happy to set it out in writing.

Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead, seeks to clarify that “freedom of expression” in Clause 18 refers to the

“freedom to impart ideas, opinions or information”,

as referred to in Article 10 of the European Convention on Human Rights. I think I too have been guilty of using the phrases “freedom of speech” and “freedom of expression” as though they were interchangeable. Freedom of expression, within the law, is intended to encompass all the freedom of expression rights arising from UK law, including under common law. The rights to freedom of expression under Article 10 of the European Convention on Human Rights include not only the right to impart ideas, opinions and information but also the right to receive such ideas, opinions and information. Any revised definition of freedom of expression to be included in the Bill should refer to both aspects of the Article 10 definition, given the importance for both children and adults of receiving information via the internet. We recognise the importance of clarity in relation to the duties set out in Clauses 18 and 28, and we are very grateful to the noble and learned Lord for proposing this amendment, and for the experience he brings to bear on behalf of the Constitution Committee of your Lordships’ House. The Higher Education (Freedom of Speech) Bill and the Online Safety Bill serve very different purposes, but I am happy to say that the Bill team and I will consider this amendment closely between now and Report.

Amendments 101, 102, 109, 112, 116, 121, 191 and 220, in the name of my noble friend Lord Moylan, seek to require Ofcom to have special regard to the importance of protecting freedom of expression when exercising its enforcement duties, and when drafting or amending codes of practice or guidance. Ofcom must already ensure that it protects freedom of expression when overseeing the Bill, because it is bound by the Human Rights Act, as I say. It also has specific duties to ensure that it is clear about how it is protecting freedom of expression when exercising its duties, including when developing codes of practice.

My noble friend’s Amendment 294 seeks to remove “psychological” from the definition of harm in the Bill. It is worth being clear that the definition of harm is used in the Bill as part of the illegal and child safety duties. There is no definition of harm, psychological or otherwise, with regard to adults, given that the definition of content which is harmful to adults was removed from the Bill in another place. With regard to children, I agree with the points made by the noble Baroness, Lady Kidron. It is important that psychological harm is captured in the Bill’s child safety duties, given the significant impact that such content can have on young minds.

I invite my noble friend and others not to press their amendments in this group.

22:15
Baroness Stowell of Beeston (Con)

My Lords, your Lordships will want me to be brief, bearing in mind the time. I am very grateful for the support I received from my noble friends Lady Harding and Lady Fraser and the noble Baronesses, Lady Kidron and Lady Bull, for the amendments I tabled. I am particularly grateful to the noble Baroness, Lady Bull, for the detail she added to my description of the amendments. I can always rely on the noble Baroness to colour in my rather broad-brush approach to these sorts of things.

I am pleased that the noble Lord, Lord Stevenson, made his remarks at the beginning of the debate. That was very helpful in setting the context that followed. We have heard a basic theme come through from your Lordships: a lack of certainty that the Government have struck the right balance between privacy protection and freedom of expression. I never stop learning in your Lordships’ House. I was very pleased to learn from the new Milton—my noble friend Lord Moylan—that freedom of expression is a fundamental right. Therefore, the balance between that and the other things in the Bill needs to be considered in a way I had not thought of before.

What is clear is that there is a lack of confidence from all noble Lords—irrespective of the direction they are coming from in their contributions to this and earlier debates—either that the balance has been properly struck or that some of the clauses seeking to address freedom of speech in the Bill are doing so in a way that will deliver the outcome and overall purpose of this legislation as brought forward by the Government.

I will make a couple of other points. My noble friend Lord Moylan’s amendments about the power of Ofcom in this context were particularly interesting. I have some sympathy for what he was arguing. As I said earlier, the question of power and the distribution of it between the various parties involved in this new regime will be one we will look at in broad terms certainly in later groups.

On the amendments of the noble Lord, Lord Stevenson, on Clauses 13, 14 and so on and the protections and provisions for news media, I tend towards the position of my noble friend Lord Black, against what the noble Lord, Lord Stevenson, argued. As I said at the beginning, I am concerned about the censorship of our news organisations by the tech firms. But I also see his argument, and that of the noble Viscount, Lord Colville, that it is not just our traditional legacy media that provides quality journalism now—that is an important issue for us to address.

I am grateful to my noble friend the Minister for his round-up and concluding remarks. Although it is heartening to hear that he and the Bill team will consider the amendment from the noble and learned Lord, Lord Hope, in this group, we are looking—in the various debates today, for sure—for a little more responsiveness and willingness to consider movement by the Government on various matters. I hope that he is able to give us more encouraging signs of this, as we proceed through Committee and before we get to further discussions with him—I hope—outside the Chamber before Report. With that, I of course withdraw my amendment.

Amendment 46 withdrawn.
Amendments 47 and 48 not moved.
Clause 13: Duties to protect content of democratic importance
Amendments 49 and 50 not moved.
Amendment 50A
Moved by
50A: Clause 13, page 14, line 8, at end insert—
“(5A) In determining what is proportionate for the purposes of subsection (2), the size and capacity of the provider of a service, in particular, is relevant.”
Member’s explanatory statement
This amendment indicates that the size and capacity of a provider is important in construing the reference to “proportionate systems and processes” in clause 13 (duties to protect content of democratic importance).
Amendment 50A agreed.
Clause 13, as amended, agreed.
House resumed.
House adjourned at 10.20 pm.
Committee (6th Day)
15:17
Relevant document: 28th Report from the Delegated Powers Committee
Clause 14: Duties to protect news publisher content
Amendment 50B
Moved by
50B: Clause 14, page 15, line 30, leave out “subsection (2)(a)” and insert “this section”
Member’s explanatory statement
This is a technical amendment to make it clear that clause 14(9), which sets out circumstances which do not count as a provider “taking action” in relation to news publisher content, applies for the purposes of the whole clause.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, His Majesty’s Government are committed to defending the invaluable role of our free media. We are clear that our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information. That is why this Bill includes strong protections for recognised news publishers. The Bill does not impose new duties on news publishers’ content, which is exempt from the Bill’s safety duties. In addition, the Bill includes strong safeguards for news publisher content, set out in Clause 14. In order to benefit from these protections, publishers will have to meet a set of stringent criteria, set out in Clause 50.

I am aware of concerns in your Lordships’ House and another place that the definition of news publishers is too broad and that these protections could therefore create a loophole to be exploited. That is why the Government are bringing forward amendments to the definition of “recognised news publisher” to ensure that sanctioned entities cannot benefit from these protections. I will shortly explain these protections in detail but I would like to be clear that narrowing the definition any further would pose a critical risk to our commitment to self-regulation of the press. We do not want to create requirements which would in effect put Ofcom in the position of a press regulator. We believe that the criteria set out in Clause 50 are already strong, and we have taken significant care to ensure that established news publishers are captured, while limiting the opportunity for bad actors to benefit. 

Government Amendments 126A and 127A propose changes to the criteria for recognised news publishers. These criteria already exclude any entity that is a proscribed organisation under the Terrorism Act 2000 or the purpose of which is to support a proscribed organisation under that Act. We are clear that sanctioned news outlets such as RT, formerly Russia Today, must not benefit from these protections either. The amendments we are tabling today will therefore tighten the recognised news publisher criteria further by excluding entities that have been designated for sanctions imposed by both His Majesty’s Government and the United Nations Security Council. I hope noble Lords will accept these amendments, in order to ensure that content from publishers which pose a security threat to this country cannot benefit from protections designed to defend a free press.

In addition, the Government have also tabled amendments 50B, 50C, 50D, 127B, 127C and 283A, which are aimed at ensuring that the protections for news publishers in Clause 14 are workable and do not have unforeseen consequences for the operation of category 1 services. Clause 14 gives category 1 platforms a duty to notify recognised news publishers and offer a right of appeal before taking action against any of their content or accounts.

Clause 14 sets out the circumstances in which companies must offer news publishers an appeal. As drafted, it states that platforms must offer this before they take down news publisher content, before they restrict users’ access to such content or where they propose to “take any other action” in relation to publisher content. Platforms must also offer an appeal if they propose to take action against a recognised news publisher’s account by giving them a warning, suspending or banning them from using a service or in any way restricting their ability to use a service.

These amendments provide greater clarity about what constitutes “taking action” in relation to news publisher content, and therefore when category 1 services must offer an appeal. They make it clear that a platform must offer this before it takes down such content, adds a warning label or takes any other action against content in line with any terms of service that allow or prohibit content. This will ensure that platforms are not required to offer publishers a right of appeal every time they propose to carry out routine content curation and similar routine actions. That would be unworkable for platforms and would be likely to inhibit the effectiveness of the appeal process.

As noble Lords know, the Bill has a strong focus on user empowerment and enabling users to take control of their online experience. The Government have therefore tabled amendments to Clause 52 to ensure that providers are required only to offer publishers a right of appeal in relation to their own moderation decisions, not where a user has voluntarily chosen not to view certain types of content. For example, if a user has epilepsy and has opted not to view photo-sensitive content, platforms will not be required to offer publishers a right of appeal before restricting that content for the user in question.

In addition, to ensure that the Bill maintains strong protections for children, the amendments make it clear that platforms are not required to offer news publishers an appeal before applying warning labels to content viewed by children. The amendments also make it clear that platforms would be in breach of the legislation if they applied warning labels to content encountered by adults without first offering news publishers an appeal, but in order to ensure that the Bill maintains strong protections for children, that does not apply to warning labels on content encountered by children. I beg to move.

Baroness Stowell of Beeston (Con)

My Lords, I welcome the amendments the Government have tabled, but I ask the Minister to clarify the effect of Amendment 50E. I declare an interest as chair of the Communications and Digital Select Committee, which has discussed Amendment 50E and the labelling of content for children with the news media organisations. This is a very technical issue, but from what my noble friend was just saying, it seems that content that would qualify for labelling for child protection purposes, and which therefore does not qualify for a right of appeal before the content is so labelled, is not content that would normally be encountered by adults but might happen to appeal to children. I would like to be clear that we are not giving the platforms scope for adding labels to content that they ought not to be adding labels to. That aside, as I say, I am grateful to my noble friend for these amendments.

Lord Clement-Jones (LD)

My Lords, like the noble Baroness, Lady Stowell, I have no major objection and support the Government’s amendments. In a sense the Minister got his retaliation in first, because we will have a much more substantial debate on the scope of Clause 14. At this point I welcome any restriction on Clause 14 in the way that the Minister has stated.

Yet to come we have the whole issue of whether a recognised news publisher that is effectively unregulated by the PRP’s arrangements should be entitled to complete freedom in terms of below-the-line content, where there is no moderation and it does not have what qualifies as independent regulation. Some debates are coming down the track and—just kicking the tyres on the Minister’s amendments—I think the noble Baroness, Lady Stowell, made a fair point, which I hope the Minister will answer.

Lord Stevenson of Balmacara (Lab)

My Lords, I thank the Minister for his very clear and precise introduction of these amendments. As the noble Lord, Lord Clement-Jones, said, we will return to some of the underlying issues in future debates. It may be that this is just an aperitif to give us a chance to get our minds around these things, as the noble Baroness, Lady Stowell, said.

It is sometimes a bit difficult to understand exactly what issue is being addressed by some of these amendments. Even trying to say them got us into a bit of trouble. I think I follow the logic of where we are in the amendments that deal with the difference between adult material and children’s material, but it would benefit us all if the Minister could repeat it, perhaps a little slower this time, and we will see if we can agree that that is the way forward.

Broadly speaking, we accept the arrangements. They clarify the circumstances in which the takedown and appeal mechanisms will work. They interface with the question of how the Bill deals with legal but harmful material, particularly for those who might wish not to see such material and who will not be warned about it under any process currently in the Bill but will instead have a toggle to turn to. It also safeguards children who would not otherwise be covered. That is a fair balance to be struck.

Having said that, we will be returning to this. The noble Lord, Lord Clement-Jones, made the good point that we have a rather ironic situation where a press regulation structure set up and agreed by Parliament is not in operation across the whole of the press, but we do not seem to make any accommodation for that. This is perhaps something we should return to at a later date.

Baroness Fox of Buckley (Non-Afl)

My Lords, I want very briefly to probe something. I may have got the wrong end of the stick, but I want to just ask about the recognised news publishers. The Minister’s explanation about what these amendments are trying to do was very clear, but I have some concerns.

I want to know how this will affect how we understand what a recognised news publisher is in a world in which we have many citizen journalists, blogs and online publications. One of the democratising effects of the internet has been in opening up spaces for marginalised voices, campaign journalism and so on. I am worried that we may inadvertently put them into a category of being not recognised; maybe the Minister can just explain that.

I am also concerned that, because this is an area of some contention, this could be a recipe for all sorts of litigious disputes with platforms about content removal, what constitutes those carve-outs and what is a recognised news, journalism or publishing outlet.

I know we will come on to this, but for now I am opposed to Amendment 127 in this group—or certainly concerned that it is an attempt to coerce publishers into a post-Leveson regulatory structure by denying them the protections that the Bill will give news publishers, unless they sign up in certain ways. I see that as blackmail and bullying, which I am concerned about. Much of the national press and many publishers have refused to join that kind of regulatory regime post Leveson, as is their right; I support them in the name of press freedom. Any comments or clarifications would be helpful.

15:30
Lord Parkinson of Whitley Bay (Con)

My Lords, I am sorry; in my enthusiasm to get this day of Committee off to a swift start, I perhaps rattled through that rather quickly. On Amendment 50E, which my noble friend Lady Stowell asked about, I make clear that platforms will be in breach of their duties if, without applying the protection, they add warning labels to news publishers’ content that they know will be seen by adult users, regardless of whether that content particularly appeals to children.

As the noble Lord, Lord Clement-Jones, and others noted, we will return to some of the underlying principles later on, but the Government have laid these amendments to clarify category 1 platforms’ duties to protect recognised news publishers’ content. They take some publishers out of scope of the protections and make it clearer that category 1 platforms will have only to offer news publishers an appeal before taking punitive actions against their content.

The noble Baroness, Lady Fox, asked about how we define “recognised news publisher”. I am conscious that we will debate this more in later groups, but Clause 50 sets out a range of criteria that an organisation must meet to qualify as a recognised news publisher. These include the organisation’s “principal purpose” being the publication of news, it being subject to a “standards code” and its content being “created by different persons”. The protections for organisations are focused on publishers whose primary purpose is reporting on news and current affairs, recognising the importance of that in a democratic society. I am grateful to noble Lords for their support.

Baroness Stowell of Beeston (Con)

What my noble friend said is absolutely fine with me, and I thank him very much for it. It might be worth letting the noble Baroness, Lady Fox, know that Amendment 127 has now been moved to the group that the noble Lord, Lord Clement-Jones, referred to. I thought it was worth offering that comfort to the noble Baroness.

Amendment 50B agreed.
Amendments 50C to 50E
Moved by
50C: Clause 14, page 15, line 44, leave out subsection (11)
Member’s explanatory statement
This amendment omits a provision about OFCOM’s guidance under clause 171, as that provision is now to be made in clause 171 itself.
50D: Clause 14, page 16, line 3, leave out paragraph (b)
Member’s explanatory statement
This amendment omits the definition of “taking action” in relation to content, as that is now dealt with by the amendment in the Minister’s name below.
50E: Clause 14, page 16, line 10, at end insert—
“(13A) In this section references to “taking action” in relation to content are to—
(a) taking down content,
(b) restricting users’ access to content, or
(c) adding warning labels to content, except warning labels normally encountered only by child users,
and also include references to taking any other action in relation to content on the grounds that it is content of a kind which is the subject of a relevant term of service (but not otherwise).
(13B) A “relevant term of service” means a term of service which indicates to users (in whatever words) that the presence of a particular kind of content, from the time it is generated, uploaded or shared on the service, is not tolerated on the service or is tolerated but liable to result in the provider treating it in a way that makes it less likely that other users will encounter it.”
Member’s explanatory statement
This amendment provides a revised definition of what it means to “take action” in relation to news publisher content, to ensure that the clause only applies to actions other than those set out in subsection (13A)(a), (b) or (c) in the circumstances set out in subsection (13B).
Amendments 50C to 50E agreed.
Clause 14, as amended, agreed.
Clause 15: Duties to protect journalistic content
Amendment 50F
Moved by
50F: Clause 15, page 17, line 14, at end insert—
“(8A) In determining what is proportionate for the purposes of subsection (2), the size and capacity of the provider of a service, in particular, is relevant.”
Member’s explanatory statement
This amendment indicates that the size and capacity of a provider is important in construing the reference to “proportionate systems and processes” in clause 15 (duties to protect journalistic content).
Amendment 50F agreed.
Amendment 51 not moved.
Clause 15, as amended, agreed.
Amendment 52
Moved by
52: After Clause 15, insert the following new Clause—
“Health disinformation and misinformation
(1) This section sets out the duties about harmful health disinformation and misinformation which apply in relation to Category 1 services.
The duties
(2) A duty to carry out and keep up to date a risk assessment of the risks presented by harmful health disinformation and misinformation that is present on the service.
(3) A duty to develop and maintain a policy setting out the service’s approach to the treatment of harmful health disinformation and misinformation on the service.
(4) A duty to explain in the policy how the service’s approach to the treatment of harmful disinformation and misinformation is designed to mitigate or manage any risks identified in the latest risk assessment.
(5) A duty to summarise the policy in the terms of service, and to include provisions in the terms of service about how that content is to be treated on the service.
(6) A duty to ensure that the policy, and any related terms of service, are—
(a) clear and accessible, and
(b) applied consistently.
(7) In this section, “harmful health disinformation and misinformation” means content which contains information which—
(a) is false or misleading in a material respect; and
(b) presents a material risk of significant harm to the health of an appreciable number of persons in the United Kingdom.”
Member’s explanatory statement
This new Clause would introduce a variety of duties on Category 1 platforms, in relation to their treatment of content which represents harmful health misinformation and disinformation.
Baroness Merron (Lab)

My Lords, I shall speak to this group which includes Amendments 52, 99 and 222 in my name. These are complemented by Amendments 223 and 224 in the name of my noble friend Lord Knight. I am grateful to the noble Lords, Lord Clement-Jones and Lord Bethell, and to the noble Baroness, Lady Bennett, for putting their names to the amendments in this group. I am also grateful to the noble Lord, Lord Moylan, for tabling Amendments 59, 107 and 264. I appreciate also the work done by the APPG on Digital Regulation and Responsibility and by Full Fact on this group, as well as on many others in our deliberations.

These amendments would ensure that platforms were required to undertake a health misinformation and disinformation risk assessment. They would also require that they have a clear policy in their terms of service on dealing with harmful, false and misleading health information, and that there are mechanisms to support and monitor this, including through the effective operation of an advisory committee which Ofcom would be required to consult. I appreciate that the Minister may wish to refer to the false communication offence in Clause 160 as a reason why these amendments are not required. In order to pre-empt this suggestion, I put it to him that the provision does not do the job, as it covers only a user sending a knowingly false communication with the intention of causing harm, which does not cover most of the online health misinformation and disinformation with which these amendments are concerned.

Why does all this matter? The stakes are high. False claims about miracle cures, unproven treatments and dangerous remedies can and do spread rapidly, leading people to make the poorest of health decisions, with dire consequences. We do not have to go far back in time to draw on the lessons of our experience. It is therefore disappointing that the Government have not demonstrated, through this Bill, that they have learned the lessons of the Covid-19 pandemic. This is of concern to many health practitioners and representatives, as well as to Members of your Lordships’ House. We all remember the absolute horror of seeing false theories being spread quickly online, threatening to undermine the life-saving vaccine rollout. In recent years, the rising anti-vaccine sentiment has certainly contributed to outbreaks of preventable diseases that had previously been eradicated. This is a step backwards.

In 2020, an estimated 5,800 people globally were admitted to hospital because of false information online relating to Covid-19, with at least 800 people believed to have died because they followed this misinformation or disinformation. In 2021, the Royal College of Obstetricians and Gynaecologists found that only 40% of women offered the vaccine against Covid-19 had accepted it, with many waiting for more evidence that it would be safe. It is shocking to recall that, in October 2021, one in five of the most critically ill Covid patients was an unvaccinated, pregnant woman.

If we look beyond Covid-19, we see misinformation and disinformation affecting many other aspects of health. I will give a few examples. There are false claims about cancer treatment—for example, lemons treat cancer better than chemotherapy; tumours are there to save your life; cannabis oil cures cancer; rubbing hydrogen peroxide on your skin will treat cancer. Just last year, the lack of publicly available information about Mpox fuelled misinformation online. There is an issue about the Government’s responsibility for ensuring that there is publicly available information about health risks. In this respect, the lack of it—the void—led to a varied interpretation and acceptance of the public health information that was available, limited though it was. UNAIDS also expressed concern that public messaging on Mpox used language and imagery that reinforced homophobic and racist stereotypes.

For children, harmful misinformation has linked the nasal flu vaccine to an increase in Strep A infections. In late 2022, nearly half of all parents believed these false claims, such that the uptake of the flu vaccine among two- and three-year-olds dropped by around 11%. It is not just that misinformation and disinformation may bombard us online and affect us; there are also opportunities for large language model AIs such as ChatGPT to spread misinformation.

The Government had originally promised to include protections from harmful false health content in their indicative list of harmful content that companies would have been required to address under the now removed adult safety duties, yet we find that the Bill maintains the status quo, whereby platforms are left to their own devices as to how they tackle health misinformation and disinformation, without the appropriate regulatory oversight. It is currently up to them, so they can remove it at scale or leave it completely unchecked, as we recently saw when Twitter stopped enforcing its Covid-19 misinformation policy. This threatens not just people’s health but their freedom of expression and ability to make proper informed decisions. With that in mind, I look forward to amendments relating to media literacy in the next group that the Committee will consider.

I turn to the specific amendments. The new clause proposed in Amendment 52 would place a duty on category 1 platforms to undertake a health misinformation risk assessment and set out a policy on their treatment of health misinformation content. It would also require that the policy and related terms of service are consistently applied and clear and accessible—something that we have previously debated in this Committee. It also defines what is meant by

“harmful health disinformation and misinformation”—

and, again, on that we have discussed the need for clarity and definition.

Amendment 99 would require Ofcom to consult an advisory committee on disinformation and misinformation when preparing draft codes of practice or amendments to such codes. Amendment 222 is a probing amendment and relates to the steps, if any, that Ofcom will be expected to take to avoid the advisory committee being dominated by representatives of regulated services. It is important to look at how the advisory committee is constructed, as that will be key not just to the confidence that it commands but to its effectiveness.

Amendment 223, in the name of my noble friend Lord Knight, addresses the matter of timeliness in respect of the establishment of the advisory committee, which should be within six months of the Bill being passed. Amendment 224, also in the name of my noble friend Lord Knight, would require the advisory committee to consider as part of its first report whether a dedicated Ofcom code of practice in this area would be effective in the public interest. This would check that we have the right building blocks in place. With that in mind, I beg to move.

Lord Bethell (Con)

My Lords, it is a great honour to rise after the noble Baroness, Lady Merron, who spoke so clearly about Amendment 52 and the group of amendments connected with health misinformation, some of which stand also in my name.

As the noble Baroness rightly pointed out, we have known for a long time about the negative impact of social media, with all its death scrolls, algorithms and rabbit holes, on vaccine uptake. In 2018, the University of Southampton did a study of pregnant women and found that those who reported using social media to research antenatal vaccinations were 58% less likely to accept the whooping cough vaccine. Since then, things have only got worse.

15:45
As a junior Health Minister during the pandemic, I saw how the successful vaccine rollout was at severe risk of being undermined by misinformation, amplified by foreign actors and monetised by cynical commercial interests. The challenge was enormous. The internet, as we know, is a highly curated environment that pushes content, functions and services that create an emotional response and retain our attention. Social media algorithms are absolutely the perfect tool for conspiracy theorists, and a pandemic necessarily raises everyone’s concerns. It was unsurprising that a lot of people went down various rabbit holes on health information.
The trust between our clinical professionals and their patients relies on a shared commitment to evidence-based science. That can quickly go out of the window if the algorithms are pushing rousing content that deliberately plays into people’s worst fears and anxieties, thereby displacing complex and nuanced analysis with simplistic attention-seeking hooks, based sometimes on complete nonsense. The noble Baroness, Lady Merron, mentioned lemons for cancer as a vivid example of that.
At the beginning of the vaccine programme, a thorough report by King’s College London, funded by the NIHR health protection research unit, found that 14% of British adults believed the real purpose of mass vaccination against coronavirus was to track and control the population. That rose to an astonishing 42% among those who got their information from WhatsApp, 39% for YouTubers, 29% from the Twitterati and 28% from Facebookers. I remember that, when those statistics came through, it put this important way out of the pandemic in jeopardy.
I remind the Committee that a great many people make money out of such fear. I highly recommend the Oxford University Journal of Communication article on digital profiteering for a fulsome and nuanced guide to the economics of the health misinformation industry. I also remind noble Lords that foreign actors and states are causing severe trouble in this area. “Foreign disinformation” social media campaigns are linked to falling vaccination rates, according to an international time-trend analysis published by BMJ Global Health.
As it happens, in the pandemic, the DHSC, the Cabinet Office and a wide group throughout government worked incredibly thoughtfully on a communications strategy that sought to answer people’s questions and apply the sunlight of transparency to the vaccine process. It balanced the rights to freedom of expression with protecting our central strategy for emerging from the pandemic through the vaccine rollout. I express considerable thanks to those officials, and the social media industry, who leant into the issue more out of a sense of good will than any legal obligation. I was aware of some of the legal ambiguities around those times.
Since then, things have gone backwards, not forwards. Hesitancy in the UK has risen, with a big impact on vaccine take-up rates. We are behind on 13 out of the 14 routine vaccine programmes, well behind the 95% target set by the World Health Organization. The results are clear: measles is rising because vaccine uptake is falling, and the same is true of many other common, avoidable diseases. As for the platforms, Twitter’s recent decision at the end of last year to suddenly stop enforcing its Covid-19 misinformation policy was a retrograde step, possibly the beginning of a worrying trend that we should all be conscious of, and one of the motivating reasons for this amendment.
Unfortunately, the Government’s decision to remove from the Bill the provisions on content harmful to adults, and with that the scope to include harmful health content, has had unintended consequences and left a big gap. We will have learned nothing from the pandemic if we do not act to plug that gap. The amendment and associated amendments in the group seek to address this by introducing three duties, as the noble Baroness, Lady Merron explained.
The first requirement is an assessment of the risks presented by harmful health disinformation and misinformation. Anyone who has been listening to these debates will recognise that this very much runs with the grain of the Bill’s approach and is consistent with many of the good things already in the Bill. Risk assessments are a very valuable tool in our approach to misinformation. I remind noble Lords that, for this Bill, “content” has a broad meaning that includes services and functions of a site, including the financial exploitation of that content. Secondly, the amendment would require large platforms to publish a policy setting out their approach to health misinformation. Each policy would have to explain how it is designed to mitigate or manage risks and should be kept up to date and maintained. That kind of transparency is at the heart of how we hold platforms to account. Lastly, platforms would be required to summarise their health misinformation policy in terms that consumers can properly understand.
This approach is consistent with the spirit of the Bill’s treatment of many harms: we are seeking transparency and we are creating accountability, but we are not mandating protocols. The consequences are clear. Users, health researchers and internet analysts would be able to see clearly how a platform proposes to deal with health misinformation that they may encounter on a particular service and make informed decisions as a result. The regulator would be able to see clearly what the nature of these risks is.
May I briefly tackle some natural concerns? On the question of protection of freedom of expression, my noble friend Lord Moylan rightly reminded us on Tuesday of Article 19 of the UN Universal Declaration of Human Rights: everyone has the right to freedom of opinion and expression. On this point, I make it clear that this amendment would not require platforms to remove health misinformation from their service or to prescribe particular responses. In fact, I would go further. I recognise that it is important to have a full debate about the efficacy, safety and financial wisdom of treatments, cures and vaccines. This amendment would do nothing to close down that debate. It is about clarity. The purpose of the amendment is to prevent providers from ducking the question about how they handle health misinformation. To that extent, it would help both those who are worried about health misinformation and those who are worried about being branded as sharing health misinformation to know where the platforms are coming from. It would ensure that providers establish what is happening on their service and what the associated risks to their users are, and then shine a light on how they intend to deal with it.
I also make it clear that this is not just about videos, articles and tweets. We should also be considering whether back-end payment mechanisms, including payment intermediaries, donation collection services and storefront support, are being used to monetise health misinformation and enable bad actors. During the pandemic, the platforms endorsed the principle that no company should be profiting from Covid-19 vaccine misinformation, for instance. It is vital that this is considered as part of the platforms’ response to health misinformation. We should have transparency about whether platforms such as PayPal and Google are accepting donations, membership or merchandise payments from known misinformation businesses. Is Amazon, for instance, removing products that are used to disseminate health misinformation? Are crowdfunding websites hosting health misinformation campaigns from bad actors?
To anticipate my noble friend the Minister, I say that he will likely remind us that there are measures already in place in the Bill if the content is criminal or likely to be viewed by children, and I welcome those provisions. However, as the Bill stands, the actual policies on misinformation and the financial exploitation of that content will be a matter of platform discretion, with no clarity for users or the regulator. It will be out of sight of clear regulatory oversight. This is a mistake, as Twitter has just shown, and that is why we need this change.
Senior clinicians including Sir Jeremy Farrar, Professor John Bell and the noble Lord, Lord Darzi, have written to the Secretary of State to raise their concerns. These are serious players voicing serious concerns. The approach in Amendment 52 is, in my view, the best and most proportionate way to protect those who are most vulnerable to false and misleading information.
Lord Moylan (Con)

My Lords, I shall speak to Amendments 59, 107 and 264 in this group, all of which are in my name. Like the noble Baroness, Lady Merron, I express gratitude to Full Fact for its advice and support in preparing them.

My noble friend Lord Bethell has just reminded us of the very large degree of discretion that is given to platforms by the legislation in how they respond to information that we might all agree, or might not agree, is harmful, misinformation or disinformation. We all agree that those categories exist. We might disagree about what falls into them, but we all agree that the categories exist, and the discretion given to the providers in how to handle it is large. My amendments do not deal specifically with health-related misinformation or disinformation but are broader.

The first two, Amendments 59 and 107—I am grateful to my noble friend Lord Strathcarron for his support of Amendment 59—try to probe what the Government think platforms should do when harmful material, misinformation and disinformation appear on their platforms. As things stand, the Government require that the platforms should decide what content is not allowed on their platforms; then they should display this in their terms of service; and they should apply a consistent approach in how they manage content that is in breach of their terms of service. The only requirement is for consistency. I have no objection to their being required to behave consistently, but that is the principal requirement.

What Amendments 59 and 107 do—they have similar effects in different parts of the Bill; one directly on the platforms; the other in relation to codes of practice—is require them also to act proportionately. Here, it might be worth articulating briefly the fact that there are two views about platforms and how they respond, both legitimate. One is that some noble Lords may fear that platforms will not respond at all: in other words, they will leave harmful material on their site and will not properly respond.

The other fear, which is what I want to emphasise, is that platforms will be overzealous in removing material, because they will have written their terms of service, as I said on a previous day in Committee, not only for their commercial advantage but also for their legal advantage. They will have wanted to give themselves a wide latitude to remove material, or to close accounts, because that will help cover their backs legally. Of course, once they have granted themselves those powers, the fear is that they will use them overzealously, even in cases where that would be an overreaction. These two amendments seek to oblige the platforms to respond proportionately and to consider alternative approaches to the cancellation and removal of accounts.

There are alternative approaches that they could consider. Some companies already set out to promote good information, if you like, and indeed we saw that in the Covid-19 pandemic. My noble friend Lord Bethell said that they did so, and they did so voluntarily. This amendment would not explicitly but implicitly encourage that sort of behaviour as a first resort, rather than cancellation, blocking and removal of material as a first resort. They would still have the powers to cancel, block and remove; it is a question of priority and proportionality.

There are also labels that providers can put on material that they think is dubious, saying, “Be careful before you read this”, or before you retweet it; “This is dubious material”. Those practices should also be encouraged. These amendments are intended to do that, but they are intended, first and foremost, to probe what the Government’s attitude is to this, whether they believe they have any role in giving guidance on this point and how they are going to do so, whether through legislation or in some other way, because many of us would like to know.

Amendment 264, supported by my noble friend Lord Strathcarron and the noble Lord, Lord Clement-Jones, deals with quite a different matter, although it falls under the general category of misinformation and disinformation: the role the Government take directly in seeking to correct misinformation and disinformation on the internet. We know that No. 10 has a unit with this explicit purpose and that during the Covid pandemic it deployed military resources to assist it in doing so. Nothing in this amendment would prevent that continuing; nothing in it is intended to create scare stories in people’s minds about an overweening Government manipulating us. It is intended to bring transparency to that process.

16:00
Amendment 264 would require the Government, within six months of the enactment of the Bill and annually thereafter, to produce a report setting out relevant representations they had made to providers during the previous year. It specifies the relevant representations: trying to persuade platforms to modify their terms of service, to restrict or remove a particular user’s access or to take down, reduce the visibility of or restrict access to content. The Secretary of State would be required to present a new report to Parliament once a year so that we understood what was happening. As I say, it would not inhibit the Government from doing it—there may well be good reasons for their doing so—but in this age people feel entitled to know.
Concerns might be expressed that, in doing so, national security might be compromised in some way because of the involvement of the Army or whatever. However, as drafted, this amendment gives the Secretary of State the power to withhold from publication anything he considers harmful to national security, so I think no national security argument can be made against this. Instead, he would be required to summarise it in a report to the Intelligence and Security Committee of Parliament. It would not enter the public domain. That is a grown-up thing to ask for. I am sustained in that view by the support for the amendment from at least one opposition spokesman.
Those are the two things I am trying to achieve, which in many ways speak for themselves. I hope my noble friend will feel able to support them.
Baroness Fox of Buckley (Non-Afl)

My Lords, I have given notice in this group that I believe Clause 139 should not stand part of the Bill. I want to remove the idea of Ofcom having any kind of advisory committee on misinformation and disinformation, at least as it has been understood. I welcome the fact that the Government have in general steered clear of putting disinformation and misinformation into the Bill, because the whole narrative around it has become politicised and even weaponised, often to delegitimise opinions that do not fit into a narrow set of official opinions or simply to shout abuse at opponents. We all want the truth—if only it was as simple as hiring fact-checkers or setting up a committee.

I am particularly opposed to Amendment 52 from the noble Baroness, Lady Merron, and the noble Lord, Lord Bethell. They have both spoken very eloquently of their concerns, focusing on harmful health misinformation and disinformation. I oppose it because it precisely illustrates my point about the danger of these terms being used as propaganda.

There was an interesting and important investigative report brought out in January this year by Big Brother Watch entitled Inside Whitehall’s Ministry of Truth—How Secretive “Anti-Misinformation” Teams Conducted Mass Political Monitoring. It was rather a dramatic title. We now know that the DCMS had a counter-disinformation unit that had a special relationship with social media companies, and it used to recommend that content be removed. Interestingly, in relation to other groups we have discussed, it used third-party contractors to trawl through Twitter looking for perceived terms of service violations as a reason for content to be removed. This information warfare tactic, as we might call it, was used to target politicians and high-profile journalists who raised doubts or asked awkward questions about the official pandemic response. Dissenting views were reported to No. 10 and then often denounced as misinformation, with Ministers pushing social media platforms to remove posts and promote Government-sponsored lines.

It has been revealed that a similar fake news unit was in the Cabinet Office. It got Whitehall departments to attack newspapers for publishing articles that analysed Covid-19 modelling, not because it was accurate—it was not accurate in many instances—but because it feared that any scepticism would affect compliance with the rules. David Davis MP appeared in an internal report on vaccine hesitancy; his crime was arguing against vaccine passports as discriminatory, which was a valid civil liberties objection but was characterised as health misinformation. A similar approach was taken to vaccine mandates, which led to tens of thousands of front-line care workers being sacked even though, by the time this happened, the facts were known: the vaccine was absolutely invaluable in protecting individual health, but it did not stop transmission, so there was no need for vaccine mandates to be implemented. The fact that this was not discussed is a real example of misinformation, but we did not have that discussion in the public sphere.

Professor Carl Heneghan’s Spectator article that questioned whether the rule of six was an arbitrary number was also flagged across Whitehall as misinformation, but we now know that the rule of six was arbitrary. Anyone who has read the former Health Secretary Matt Hancock’s WhatsApp messages, which were leaked to the Telegraph and which many of us read with interest, will know that many things posed as “the science” and factual were driven by politics more than anything else. Covid policies were not all based on fact, yet it was others who were accused of misinformation.

Beyond health, the Twitter files leaked by Elon Musk, when he became its new owner, show the dangers of using the terms misinformation and disinformation to pressure big tech platforms into becoming tools of political censorship. In the run-up to the 2020 election, Joe Biden’s presidential campaign team routinely flagged tweets and accounts it wanted to be censored, and we have all seen the screengrab of email exchanges between executives as evidence of that. Twitter suppressed the New York Post’s infamous Hunter Biden laptop exposé on the spurious grounds that it was “planted Russian misinformation”. The Post was even locked out of its own account. It took 18 months for the Washington Post and the New York Times to get hold of, and investigate, Hunter Biden’s emails, and both determined that the New York Post’s original report was indeed legitimate and factually accurate, but it was suppressed as misinformation when it might have made some political difference in an election.

We might say that all is fair in love and war and elections but, to make us think about what we mean by “misinformation” and why it is not so simple, was the Labour Party attack ad that claimed Rishi Sunak did not believe that paedophiles should go to jail fair comment or disinformation, and who decides? I know that Tobias Ellwood MP called for a cross-party inquiry on the issue, calling on social media platforms to do more to combat “malicious political campaigns”. I am not saying that I have a view one way or another on this, but my question is: in that instance, who gets to label information as “malicious” or “fake” or “misinformation”? Who gets the final say? Is it a black and white issue? How can we avoid it becoming partisan?

Yesterday, at the Second Reading of the Illegal Migration Bill, I listened very carefully to the many contributions. Huge numbers of noble Lords continually claimed that all those in the small boats crossing the channel were fleeing war and persecution—fleeing for their lives. Factually that was inaccurate, according to detailed statistics and evidence, yet no one called those contributors “peddlers of misinformation”, because those speaking are considered to be compassionate and on the side of the angels—at least in the case of the most reverend Primate the Archbishop of Canterbury—and, as defined by this House, they were seen to be telling the truth, regardless of the evidence. My point is that it was a political argument, yet here we are focusing on this notion that the public are being duped by misinformation.

What about those who tell children that there are 140 genders to choose from, or that biological sex is mutable? I would say that is dangerous misinformation or disinformation; others would say that my saying that is bigoted. There is at least an argument to be had, but it illustrates that the labelling process will always be contentious, and therefore I have to ask: who is qualified to decide?

A number of amendments in this group put forward a variety of “experts” who should be, for example, on the advisory committee—those who should decide and those who should not—and I want to look at this notion of expertise in truth. For example, when the Communications and Digital Committee examined an incident in which Facebook marked as “false” a post on Covid by a professor of evidence-based medicine at Oxford University, it asked Facebook about the qualifications of those who made that judgment—the fact-checkers. It was told that they were

“certified by the International Fact-Checking Network”.

Now, you know, who are they? The professor of evidence-based medicine at Oxford University might have a bit more expertise here, and I do not want a Gradgrind version of truth in relation to facts, and so on.

If it were easy to determine the truth, we would be able to wipe out centuries of philosophy, but if we are going to have a committee determining the truth, could we also have some experts in civil liberties—maybe the Free Speech Union, Big Brother Watch, and the Index on Censorship—on a committee to ensure that we do not take down accurate information under the auspices of “misinformation”? Are private tech companies, or professional fact-checkers, or specially selected experts, best placed to judge the reliability of all sorts of information and of the truth, which I would say requires judgement, analysis and competing perspectives?

Too promiscuous a use of the terms “misinformation” and “disinformation” can also cause problems, and often whole swathes of opinion are lumped together. Those who raised civil liberties objections to lockdown were denounced as “Covidiots”, conspiracy theorists peddling misinformation and Covid deniers, on a par with those who suggested that the virus was linked to everything from 5G masts to a conscious “plandemic”.

Those who now raise queries about suppressing any reference to vaccine harms, or who are concerned that people who have suffered proven vaccine-related harms are not being shown due support, are often lumped in with those who claim the vaccine was a crime against humanity. All are accused of misinformation, with no nuance and no attempt at distinguishing very different perspectives. Therefore, with such wide-ranging views labelled as “misinformation” as a means of censorship, those good intentions can backfire—and I do believe that there are good intentions behind many of these amendments.

16:15
To conclude, banning inaccurate ideas—if they are actually censored as misinformation or disinformation—can push them underground and allow them to fester unchallenged in echo chambers. It can also create martyrs. How often do we hear those who have embraced full-blown conspiracy theories, often peddling cranky and scaremongering theories, say, “They’re trying to silence me because they know that what I’m saying is true. What are they afraid of?” Historically, I think the best solution to bad speech is more speech and more argument; the fullest debate, discussion, scholarship, investigation and research—yes, googling, using Wikipedia or reading the odd book—and, of course, judgment and common sense to figure it out.
We should also remember from our history that what is labelled as false by a minority of people can be invaluable scepticism, challenging a consensus and eventually allowing truth to emerge. The fact—the truth—was once that the world was flat. Luckily, the fact-checkers were not around to ban the minority who challenged that view, and now we know a different truth.
Baroness Bennett of Manor Castle (GP)

My Lords, I have attached my name to Amendments 52 and 99 in the name of the noble Baroness, Lady Merron, respectively signed by the noble Lords, Lord Bethell and Lord Clement-Jones, and Amendment 222 in her name. I entirely agree with what both the noble Baroness, Lady Merron, and the noble Lord, Lord Bethell, said. The noble Lord in particular gave us a huge amount of very well-evidenced information on the damage done during the Covid pandemic—and continuing to be done—by disinformation and misinformation. I will not repeat what they said about the damage done by the spread of conspiracy theories and anti-vaccination falsehoods and the kind of malicious bots, often driven by state actors, that have caused such damage.

I want to come from a different angle. I think we were—until time prevented it, unfortunately—going to hear from the noble Baroness, Lady Finlay of Llandaff, which would have been a valuable contribution to this debate. Her expert medical perspective would have been very useful. I think that she and I were the only two Members in the Committee who took part in the passage of the Medicines and Medical Devices Act. I think it was before the time of the noble Lord, Lord Bethell—he is shaking his head; I apologise. He took part in that as well. I also want to make reference to discussions and debates I had with him over changes to regulations on medical testing.

The additional point I want to make about disinformation and misinformation—this applies in particular to Amendment 222 about the independence of the advisory committee on disinformation and misinformation—is that we are now seeing in our medical system a huge rise in the number of private actors. These are companies seeking to encourage consumers or patients to take tests outside the NHS system and to get involved in a whole set of private provision. We are seeing a huge amount of advertising of foreign medical provision, given the pressures that our NHS is under. In the UK we have traditionally had, and still have, rules that place severe restrictions on the direct advertising of medicines and medical devices to patients—unlike, for example, the United States, where it is very much open slather, with some disastrous and very visible impacts.

We need to think about the fact that the internet, for better or for worse, is now a part of our medical system. If people feel ill, the first place they go—before they call the NHS, visit their pharmacist or whatever—is very often the internet, through these providers. We need to think about this in the round and as part of the medical system. We need to think about how our entire medical ecology is working, and that is why I believe we need amendments like these.

Lord Bethell (Con)

The noble Baroness makes two incredibly important points. We are seeking to give people greater agency over their own health, and the internet has been an enormous bonus in doing that, but of course that environment needs to be curated extremely well. We are also seeking to make use of health tech—non-traditional clinical interventions, some of which do not pierce the skin and therefore fall outside the normal conversation with GPs—and giving people the power to make decisions about the use of these new technologies for themselves. That is why curation of the health information environment is so important. Does the noble Baroness have any reflections on that?

Baroness Bennett of Manor Castle (GP)

I thank the noble Lord for his intervention. He has made me think of the fact that a particular area where this may be of grave concern is cosmetic procedures, which I think we debated during the passage of the Health and Care Act. These things are all interrelated, and it is important that we see them in an interrelated way as part of what is now the health system.

Baroness Kidron (CB)

My Lords, I will speak to a number of amendments in this group. I want to make the point that misinformation and disinformation was probably the issue we struggled with the most in the pre-legislative committee. We recognised the extraordinary harm it did, but also—as the noble Baroness, Lady Fox, said—that there is no one great truth. However, algorithmic spread and the drip, drip, drip of material that is not based on any search criteria or expression of an opinion but simply gives you more of the same, particularly the most shocking, moves very marginal views into the mainstream.

I am concerned that our debates over the last five days have concentrated so much on content, and that the freedom we seek does not take enough account of the way in which companies currently exercise control over the information we see. Correlations such as “Men who like barbecues are also susceptible to conspiracy theories” are then exploited to spread toxic theories that end in real-world harm or political tricks that show, for example, the Democrats as a paedophile group. Only last week I saw a series of pictures, presented as “evidence”, of President Biden caught in a compromising situation, which lent apparent truth to that lie. As Maria Ressa, who won the Nobel Peace Prize for her efforts to safeguard freedom of expression, said in her acceptance speech:

“Tech sucked up our personal experiences and data, organized it with artificial intelligence, manipulated us with it, and created behavior at a scale that brought out the worst in humanity”.


That is the background to this set of amendments that we must take seriously.

As the noble Lord, Lord Bethell, said, Amendment 52 will ensure that platforms undertake a health misinformation risk assessment and provide a clear policy on dealing with harmful, false and misleading information. I put it to the Committee that, without this requirement, we will keep the status quo in which clicks are king, not health information.

It is a particular pleasure to support the noble Lord, Lord Moylan, on his Amendments 59 and 107. Like him, I am instinctively against taking material down. There are content-neutral ways of marking or questioning material, offering alternatives and signposting to diverse sources—not only true but diverse. These can break this toxic drip feed for long enough for people to think before they share, post and make personal decisions about the health information that they are receiving.

I am not incredibly thrilled by a committee for every occasion, but since the Bill is silent on the issue of misinformation and disinformation—which clearly will be supercharged by the rise of large language models—it would be good to give a formal role to this advisory committee, so that it can make a meaningful and formal contribution to Ofcom as it develops not only this code of conduct but all codes of conduct.

Likewise, I am very supportive of Amendment 222, which seeks independence for the chair of the advisory body. I have seen at first hand how a combination of regulatory capture and a very litigious sector with deep pockets slows down progress and transparency. While the independence of the chair should be a given, our collective lived experience would suggest otherwise. This amendment would make that requirement clear.

Finally, and in a way most importantly, Amendment 224 would allow Ofcom to consider after the fact whether the code of conduct is necessary. This strikes a balance between adding to its current workload, which we are trying not to do, and tying one hand behind its back in the future. I would be grateful to hear from the Minister why we would not give Ofcom this option as a reasonable piece of future-proofing, given that this issue will be ever more important as AI creates layers of misinformation and disinformation at scale.

Baroness Healy of Primrose Hill (Lab)

My Lords, I support Amendment 52, tabled by my noble friend Lady Merron. This is an important issue which must be addressed in the Bill if we are to make real progress in making the internet a safer space, not just for children but for vulnerable adults.

We have the opportunity to learn lessons from the pandemic, where misinformation had a devastating impact, spreading rapidly online like the virus and threatening to undermine the vaccine rollout. If the Government had kept their earlier promise to include protection from harmful false health content in their indicative list of harmful content that companies would have been required to address under the now removed adult safety duties, these amendments would not be necessary.

It is naive to think that platforms will behave responsibly. Currently, they are left to their own devices in how they tackle health misinformation, without appropriate regulatory oversight. They can remove it at scale or leave it completely unchecked, as illustrated by Twitter’s decision to stop enforcing its Covid-19 misinformation policies, as other noble Lords have pointed out.

It is not a question of maintaining free speech, as some might argue. It was the most vulnerable groups who suffered from the spread of misinformation online—pregnant women and the BAME community, who had higher illness rates. Studies have shown that, proportionately, more of them died, not just because they were front-line workers but because of rumours spread in the community which resulted in vaccine hesitancy, with devastating consequences. As other noble Lords have pointed out, in 2021 the Royal College of Obstetricians and Gynaecologists found that only 42% of women who had been offered the vaccine accepted it, and in October that year one in five of the most critically ill Covid patients were unvaccinated pregnant women. That is a heartbreaking statistic.

Unfortunately, it is not just vaccine fears that are spread on the internet. Other harmful theories can affect patients with cancer, mental health issues and sexual health issues, and, most worryingly, can affect children’s health. Rumours and misinformation play on the minds of the most vulnerable. The Government have a duty to protect people, and by accepting this amendment they would go some way to addressing this.

Platforms must undertake a health misinformation risk assessment and have a clear policy on dealing with harmful, false and misleading health information in their terms of service. They have the money and the expertise to do this, and Parliament must insist. As my noble friend Lady Merron said, I do not think that the Minister can say that the false communications offence in Clause 160 will address the problem, as it covers only a user sending a knowingly false communication with the intention of causing harm. The charity Full Fact has stated that this offence will exclude most health misinformation that it monitors online.

Lord Clement-Jones (LD)

My Lords, this has been a very interesting debate. I absolutely agree with what the noble Baroness, Lady Kidron, said right at the beginning of her speech. This was one of the most difficult areas that the Joint Committee had to look at. I am not saying that anything that we said was particularly original. We tried to say that this issue could be partly addressed by greater media literacy, which, no doubt, we will be talking about later today; we talked about transparency of system design, and about better enforcement of service terms and conditions. But things have moved on. Clearly, many of us think that the way that the current Bill is drafted is inadequate. However, the Government did move towards proposing a committee to review misinformation and disinformation. That is welcome, but I believe that these amendments are taking the thinking and actions a step forward.

16:30
I do not agree with the noble Baroness, Lady Fox. What she has been saying is really a counsel of despair on not being able to deal with misinformation and disinformation. I was really interested to hear what the noble Lord, Lord Bethell, had to say about his experience—this is pretty difficult stuff to tackle when you are in a position of that sort. I support the noble Baronesses, Lady Bennett and Lady Healy, in what they had to say about this particular aspect. As the noble Baroness, Lady Kidron, said, it is about the system and the amplification that takes place, which brings out the worst in humanity.
The Puttnam report, by the Democracy and Digital Technologies Committee, also raised this. If Lord Puttnam had not retired from this House, he would be here today, saying that we need to do a lot more about this than we are proposing even in the amendments. In the report, the committee talked about a pandemic of misinformation. Nowhere is that more apparent than in health. The report was prescient; it came out in June 2020, some three years ago, well before we heard and saw all kinds of disinformation about vaccines.
We are seeing increasing numbers of commentators talking about the impact of misinformation and disinformation. We have had Ciaran Martin, former head of the National Cyber Security Centre, talking about the dangers to democracy. We have heard Sir Jeremy Fleming, head of GCHQ, saying that the main threat from AI is disinformation. We have had some really powerful statements, quite apart from seeing the impact of disinformation and misinformation on social media platforms.
On these Benches, we believe that the Government have a responsibility to intervene on misinformation and to support legislation to stop the spread of fake news. I believe that the public have an expectation that the Government do that and that the large social media companies address this issue on their platforms, hence my support for the amendments in these groups.
It has to be balanced. That is why I support the amendments by the noble Lord, Lord Moylan, as well. We have a common interest in trying to make sure that, while preventing misinformation and disinformation, we do it in a proportionate way, as he described. That is of great importance.
The noble Lord, Lord Bethell, did not quote at length from the letter from Full Fact and all the health professionals, but, notably, it says:
“One key way that we can protect the future of our healthcare system is to ensure that internet companies have clear policies on how they identify the harmful health misinformation that appears on their platforms, as well as consistent approaches in dealing with it”.
It is powerful testimony from some very experienced and senior health professionals.
The focus of many of these amendments is on the way that the advisory committee will operate. Having an independent chair is of great importance, as is having a time limit within which there must be a report, along with other aspects.
The noble Lord, Lord Moylan, referred in one of the amendments to addressing the opacity of existing government methods for tackling disinformation. He mentioned one unit, but there are three units that I have been briefed about. There is the counter-disinformation unit in DCMS, which addresses mainly Covid issues that breach companies’ terms of service, and, recently, Russia/Ukraine issues. Then we have the Government Information Cell, which is based in the FCDO, and the rapid response unit, which I think he referred to, in the Cabinet Office. Ministers referred to these and said that the principal focus of the DCMS unit during the pandemic was Covid et cetera, but we do not know very much about what these units do or what their criteria are. Do they have any relationship with Ofcom? Will they have a relationship with Ofcom? It is important that we have something that reduces that level of opacity and opens up what those units do to a greater degree of scrutiny.
The only direct reference to misinformation in the Bill as it stands is to the advisory committee, so it is important that we know how it fits in with Ofcom’s wider regulatory functions, and that there is a duty to create a code of practice on information and misinformation. The advisory committee should be creative in the way it operates. One of the difficult issues we found is that there is not a great deal of knowledge out there about how to tackle misinformation and disinformation in a systemic way.
Finally, I was very interested in the briefing that noble Lords probably all received from Adobe, which talked about the Content Authenticity Initiative. That is exactly the kind of thing the advisory committee should be exploring. Apparently, it has more than 1,000 members, including media and tech companies, NGOs and so on. Its ambition is to promote the adoption of an open industry standard for content authenticity and provenance. That may sound like the holy grail, but it is something we should be trying to work towards.
These amendments are a means of at least groping towards a better way of tackling misinformation and disinformation, which, as we have heard, can have a huge impact, particularly in health.
Lord Parkinson of Whitley Bay (Con)

My Lords, this debate has demonstrated the diversity of opinion regarding misinformation and disinformation—as the noble Lord said, the Joint Committee gave a lot of thought to this issue—as well as the difficulty of finding the truth of very complex issues while not shutting down legitimate debate. It is therefore important that we legislate in a way that takes a balanced approach to tackling this, keeping people safe online while protecting freedom of expression.

The Government take misinformation and disinformation very seriously. From Covid-19 to Russia’s use of disinformation as a tool in its illegal invasion of Ukraine, it is a pervasive threat, and I pay tribute to the work of my noble friend Lord Bethell and his colleagues in the Department of Health and Social Care during the pandemic to counter the cynical and exploitative forces that sought to undermine the heroic effort to get people vaccinated and to escape from the clutches of Covid-19.

We recognise that misinformation and disinformation come in many forms, and the Bill reflects this. Its focus is rightly on tackling the most egregious, illegal forms of misinformation and disinformation, such as content which amounts to the foreign interference offence or which is harmful to children—for instance, that which intersects with named categories of primary priority or priority content.

That is not the only way in which the Bill seeks to tackle it, however. The new terms of service duties for category 1 services will hold companies to account over how they say they treat misinformation and disinformation on their services. However, the Government are not in the business of telling companies what legal content they can and cannot allow online, and the Bill should not and will not prevent adults accessing legal content. In addition, the Bill will establish an advisory committee on misinformation and disinformation to provide advice to Ofcom on how they should be tackled online. Ofcom will be given the tools to understand how effectively misinformation and disinformation are being addressed by platforms through transparency reports and information-gathering powers.

Amendment 52 from the noble Baroness, Lady Merron, seeks to introduce a new duty on platforms in relation to health misinformation and disinformation for adult users, while Amendments 59 and 107 from my noble friend Lord Moylan aim to introduce new proportionality duties for platforms tackling misinformation and disinformation. The Bill already addresses the most egregious types of misinformation and disinformation in a proportionate way that respects freedom of expression by focusing on misinformation and disinformation that are illegal or harmful to children.

Baroness Kidron (CB)

I am curious as to what the Bill says about misinformation and disinformation in relation to children. My understanding of primary priority and priority harms is that they concern issues such as self-harm and pornography, but do they say anything specific about misinformation of the kind we have been discussing and whether children will be protected from it?

Lord Parkinson of Whitley Bay (Con)

I am sorry—I am not sure I follow the noble Baroness’s question.

Baroness Kidron (CB)

Twice so far in his reply, the Minister has said that this measure will protect children from misinformation and disinformation. I was just curious because I have not seen any sight of that, either in discussions or in the Bill. I was making a distinction regarding harmful content that we know the shape of—for example, pornography and self-harm, which are not, in themselves, misinformation or disinformation of the kind we are discussing now. It is news to me that children are going to be protected from this, and I am delighted, but I was just checking.

Lord Parkinson of Whitley Bay (Con)

Yes, that is what the measure does—for instance, where it intersects with the named categories of primary priority or priority content in the Bill, although that is not the only way the Bill does it. This will be covered by non-designated content that is harmful to children. As we have said, we will bring forward amendments on Report—which is perhaps why the noble Baroness has not seen them in the material in front of us—regarding material harms to children, and they will provide further detail and clarity.

Returning to the advisory committee that the Bill sets up and the amendments from the noble Baroness, Lady Merron, and my noble friend Lord Moylan, all regulated service providers will be forced to take action against illegal misinformation and disinformation in scope of the Bill. That includes the new false communication offences in the Bill that will capture communications where the sender knows the information to be false but sends it intending to cause harm—for example, hoax cures for a virus such as Covid-19. The noble Baroness is right to say that that is a slightly different approach from the one taken in her amendment, but we think it an appropriate and proportionate response to tackling damaging and illegal misinformation and disinformation. If a platform is likely to be accessed by children, it will have to protect them from encountering misinformation and disinformation content that meets the Bill’s threshold for content that is harmful to children. Again, that is an appropriate and proportionate response.

Turning to the points made by my noble friend Lord Moylan and the noble Baroness, Lady Fox, services will also need to have particular regard to freedom of expression when complying with their safety duties. Ofcom will be required to set out steps that providers can take when complying with their safety duties in the codes of practice, including what is proportionate for different providers and how freedom of expression can be protected.

16:45
My noble friend Lord Bethell and the noble Baroness, Lady Merron, are concerned that health misinformation and disinformation will not be adequately covered by this. Their amendment seeks to tackle that but, in doing so, mimics provisions on content harmful to adults previously included in the Bill which the Government consciously removed last year following debates in another place. The Government take concerns about health-related misinformation and disinformation very seriously. Our approach will serve a purpose of transparency and accountability by ensuring that platforms are transparent and accountable to their users about what they will and will not allow on their services.
Under the new terms of service duties for category 1 services, if certain types of misinformation and disinformation are prohibited in platforms’ terms of service, they will have to remove that content. That will include anti-vaccination falsehoods and health-related misinformation and disinformation if it is prohibited in their terms of service. This is an appropriate response which prevents services from arbitrarily removing or restricting legal content, however controversial it may be, or suspending or banning users where it is not in accordance with their expressed terms of service.
The Bill will protect people from the most egregious types of health-related misinformation and disinformation while still protecting freedom of expression and allowing users to ask genuine questions about health-related matters. There are many examples from recent history—Primodos, Thalidomide and others—which point to the need for legitimate debate about health-related matters, sometimes against companies which have deep pockets to defend the status quo.
My noble friend Lord Bethell also raised concerns about the role that algorithms play in pushing content. I reassure him that all companies will face enforcement action if illegal content in scope of the Bill is being promoted to users via algorithms. Ofcom will have a range of powers to assess whether companies are fulfilling their regulatory requirements in relation to the operation of their algorithms.
In circumstances where there is a significant threat to public health, the Bill already provides additional powers for the Secretary of State to require Ofcom to prioritise specified objectives when carrying out its media literacy activity and to require that companies report on the action they are taking to address the threat. The advisory committee on misinformation and disinformation will also be given the flexibility and expertise to consider providing advice to Ofcom on this issue, should it choose to.
Amendments 99 and 222 from the noble Baroness, Lady Merron, and Amendments 223 and 224 from the noble Lord, Lord Knight of Weymouth, relate to the advisory committee. Disinformation is a pervasive and evolving threat. The Government believe that responding to the issue effectively requires a multifaceted, whole-of-society approach. That is what the advisory committee seeks to do by bringing together technology companies, civil society organisations and sector experts to advise Ofcom in building cross-sector understanding and technical knowledge of the challenges and how best to tackle them. The Government see this as an essential part of the Bill’s response to this issue.
I understand the desire of noble Lords to ensure that the committee is conducting its important work as quickly as possible, but it is imperative that Ofcom has the appropriate time and space to appoint the best possible committee and that its independence as a regulator is respected. Ofcom is well versed in setting up statutory committees and ensuring that committees established under statute meet their obligations while maintaining impartiality and integrity. To seek to prescribe timeframes or their composition risks impeding Ofcom’s ability to run a transparent process that finds the most suitable candidates. Considering the evolving nature of disinformation and the online realm, the advisory committee will also need the flexibility to adapt and respond. It would therefore not be appropriate for the Bill to be overly prescriptive about the role of the advisory committee or to mandate the things on which it must report.
The noble Baroness, Lady Fox of Buckley, asked whether the committee could include civil liberties representatives. It is for Ofcom to decide who is on the committee, but Ofcom must have regard to the desirability of including, among others, people representing the interests of UK users of regulated services, which could include civil liberties groups.
The noble Baroness, Lady Kidron, raised the challenges of artificial intelligence. Anything created by artificial intelligence and shared on an in-scope service by a user will qualify as user-generated content. It would therefore be covered by the Bill’s safety duties, including to protect children from harmful misinformation and disinformation, and to ensure that platforms properly enforce their terms of service for adults.
I turn to the points raised in my noble friend Lord Moylan’s Amendment 264. Alongside this strong legislative response, the Government will continue their operational response to tackling misinformation and disinformation. As part of this work, the Government meet social media companies on a regular basis to discuss a range of issues. These meetings are conducted in the same way that the Government would engage with any other external party, and in accordance with the well-established transparency processes and requirements.
The Government’s operational work also seeks to understand misinformation and disinformation narratives that are harmful to the UK, to build an assessment of their risk and threat. We uphold the same commitment to freedom of expression in our operational response as we do in our legislative response. As I said, we are not in the business of telling companies what legal content they can and cannot allow. Indeed, under the Bill, category 1 services must set clear terms of service that are easy for users to understand and are consistently enforced, ensuring new levels of transparency and accountability.
Our operational response will accompany our legislative response. The measures have been designed to provide a strong response to tackle misinformation and disinformation, ensuring users’ safety while promoting a thriving and lively democracy where freedom of expression is protected.
The noble Baroness, Lady Fox, and the noble Lord, Lord Clement-Jones, asked about the counter-disinformation unit run, or rather led, by the Department for Science, Innovation and Technology. It works to understand attempts to artificially manipulate the information environment, and to understand the scope, scale and reach of misinformation and disinformation. It responds to acute information incidents, such as Russian information operations during the war in Ukraine, those we saw during the pandemic and those around important events such as general elections. It does not monitor individuals; rather, its focus is on helping the Government understand online misinformation and disinformation narratives and threats.
When harmful narratives are identified, the unit works with departments across Whitehall to deploy the appropriate response, which could involve a direct rebuttal on social media or awareness-raising campaigns to promote the facts. Therefore, the primary purpose is not to monitor for harmful content to flag to social media companies—the noble Baroness raised this point—but the department may notify the relevant platform if, in the course of its work, it identifies content that potentially violates platforms’ terms of service, including co-ordinated, inauthentic or manipulative behaviour. It is then up to the platform to decide whether to take action against the content, based on its own assessment and terms of service.
Baroness Fox of Buckley (Non-Afl)

The Minister mentioned “acute” examples of misinformation and used the example of the pandemic. I tried to illustrate that perhaps, with hindsight, what were seen as acute examples of misinformation turned out to be rather more accurate than we were led to believe at the time. So my concern is that there is already an atmosphere of scepticism about official opinion, which is not the same as misinformation, as it is sometimes presented. I used the American example of the Hunter Biden laptop so we could take a step away.

Lord Moylan (Con)

This might be an appropriate moment for me to say—on the back of that—that, although my noble friend explained current government practice, he has not addressed my point on why there should not be an annual report to Parliament that describes what government has done on these various fronts. If the Government regularly meet newspaper publishers to discuss the quality of information in their newspapers, I for one would have entire confidence that the Government were doing so in the public interest, but I would still quite like—I think the Government would agree on this—a report on what was happening, making an exception for national security. That would still be a good thing to do. Will my noble friend explain why we cannot be told?

Lord Parkinson of Whitley Bay (Con)

While I am happy to elaborate on the work of the counter-disinformation unit in the way I just have, the Government cannot share operational details about its work, as that would give malign actors insight into the scope and scale of our capabilities. As my noble friend notes, this is not in the public interest. Moreover, reporting representations made to platforms by the unit would also be unnecessary as this would overlook both the existing processes that govern engagements with external parties and the new protections that are introduced through the Bill.

In the first intervention, the noble Baroness, Lady Fox, gave a number of examples, some of which are debatable, contestable facts. Companies may well choose to keep them on their platforms within their terms of service. We have also seen deliberate misinformation and disinformation during the pandemic, including from foreign actors promoting more harmful disinformation. It is right that we take action against this.

I hope that I have given noble Lords some reassurance on the points raised about the amendments in this group. I invite them not to press the amendments.

Baroness Merron (Lab)

My Lords, I am most grateful to noble Lords across the Committee for their consideration and for their contributions in this important area. As the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, both said, this was an area of struggle for the Joint Committee. The debate today shows exactly why that is so, but it is a struggle worth having.

The noble Lord, Lord Bethell, talked about there being a gap in the Bill as it stands. The amendments include the introduction of risk assessments and transparency and, fundamentally, explaining things in a way that people can actually understand. These are all tried and tested methods and can serve only to improve the Bill.

I am grateful to the Minister for his response and consideration of the amendments. I want to take us back to the words of the noble Baroness, Lady Kidron. She explained it beautifully—partly in response to the comments from the noble Baroness, Lady Fox. This is about tackling a system of amplification of misinformation and disinformation that moves the most marginal of views into the mainstream. It deals with restricting the damage that, as I said earlier, can produce the most dire circumstances. Amplification is the consideration that these amendments seek to tackle.

I am grateful to the noble Lord, Lord Moylan, for his comments, as well as for his amendments. I am sure the noble Lord has reflected that some of the previous amendments he brought before the House somewhat put the proverbial cat among the Committee pigeons. On this occasion, I think the noble Lord has nicely aligned the cats and the pigeons. He has managed to rally us all—with the exception of the Minister—behind these amendments.

Lord Bethell (Con)

The noble Baroness is entirely right to emphasise amplification. May I put into the mix the very important role of the commercialisation of health misinformation? The more you look at the issue of health misinformation, the more you realise that its adverse element is to do with making money out of people’s fears. I agree with the noble Baroness, Lady Fox, that there should be a really healthy discussion about the efficacy, safety and value for money of modern medicines. That debate is worth having. The Minister rightly pointed out some recent health scandals that should have been chased down much more. The commercialisation of people’s fears bears further scrutiny and is currently a gap in the Bill.

Baroness Merron (Lab)

I certainly agree with the noble Lord, Lord Bethell, on that point. It is absolutely right to talk about the danger of commercialisation and how it is such a driver of misinformation and disinformation; I thank him for drawing that to the Committee’s attention. I also thank my noble friend Lady Healy for her remarks, and her reflection that these amendments are not a question of restricting free speech and debate; they are actually about supporting free speech and debate but in a safe and managed way.

17:00
The Minister gave the Committee the assurance that the Bill in its current form tackles the most egregious forms of disinformation and misinformation. If only it were so, we would not have had cause to bring forward these amendments. I again refer to the point in the Minister’s response when, as I anticipated, he referred to the false communications offence in Clause 160. I repeat the point gently but firmly to the Minister that this just does not address the amplification point that we seek to focus on. One might argue that perhaps it is more liberal and proportionate to allow misinformation and disinformation but to focus on tackling their amplification. That is where our efforts should be.
With those comments, with thanks to the Minister and other noble Lords, and in the hope that the Minister will have the opportunity to reflect on the points raised in this debate, I beg leave to withdraw.
Amendment 52 withdrawn.
Amendment 52A
Moved by
52A: After Clause 15, insert the following new Clause—
“Duty to inform users about accuracy of content on a service
(1) This section sets out a duty to make available information to allow users to establish the reliability and accuracy of content which applies in relation to Category 1 services.
(2) A duty, where a service provides access to both journalistic and other forms of content, to make available to users such information that may be necessary to allow users to establish the reliability and accuracy of content encountered on the service.”
Member’s explanatory statement
This amendment is to probe what steps, if any, a carrier of journalistic content is expected to take to improve users’ media literacy skills.
Lord Knight of Weymouth (Lab)

I move this amendment in my name as part of a group of amendments on media literacy. I am grateful to Full Fact, among others, for some assistance around these issues, and to Lord Puttnam. He has retired from this House, of course, but it was my pleasure to serve on the committee that he chaired on democracy and digital technology. He remains in touch and is watching from his glorious retirement in the Republic of Ireland—and he is pressing that we should address issues around media literacy in particular.

The Committee has been discussing the triple shield. We are all aware of the magic of threes—the holy trinity. Three is certainly a magic number, but we also heard about the three-legged stool. There is more stability in four, and I put it to your Lordships that, having thought about “illegal” as the first leg, “terms of service” as the second and “user empowerment tools” as the third, we should now have, as a fourth leg underpinning a better and safer environment for the online world, “better media literacy”, so that users have confidence and competence online as a result.

To use the user empowerment tools effectively, we need to be able to understand the business models of the platforms, and how we are paying for their services with our data and our attention; how platforms use our data; our data rights as individuals; and the threat of scams, catfishing, phishing and fraud, which we will discuss shortly. Then there is the national cyber threat. I was really struck, when we were on that committee that Lord Puttnam chaired, by hearing how nations such as Finland and the Baltic states regard media literacy as a national mission to protect them particularly from the threat of cyberwarfare from Russia.

We have heard about misinformation and disinformation. There are emerging technologies that we all need to be more literate about. I remember, some six or seven years ago, my wife was in a supermarket queue with her then four-year-old daughter, who turned to her and asked what an algorithm was. Could any of us then have confidently replied and given a good answer? I know that some would be happy to do so, but we equally need to be able to answer what machine learning is, what large-language models are, or what neural networks are in order to understand the emerging world of artificial intelligence.

Ofcom already has a duty under the Communications Act 2003. Incidentally, Lord Puttnam chaired the Joint Committee on that Act. It is worth asking ourselves: how is it going for Ofcom in the exercise of that duty? We can recall, I am sure, the comments last Tuesday in this Committee of the noble Baroness, Lady Buscombe, who said:

“I took the Communications Act 2003 through for Her Majesty’s Opposition, and we were doing our absolute best to future-proof the legislation. There was no mention of the internet in that piece of legislation”.—[Official Report, 9/5/23; col. 1709.]


There is no doubt in my mind that, as a result of all the changes that have taken place in the last 20 years, the duty in that Act needs updating, and that is what we are seeking to do.

It is also possible to look at the outcomes. What is the state of media literacy in the nation at the moment? I was lucky enough this weekend to share a platform at a conference with a young woman, Monica. She lives in Greenwich, goes to Alleyn’s School, is articulate and is studying computer science at A-level. When asked about the content of the computer science curriculum, which is often prayed in aid in terms of the digital and media literacy of our young people, she reminded the audience that she still has to learn about floppy disks because the curriculum struggles to keep up to date. She is not learning about artificial intelligence in school because of that very problem. The only way in which she could do so, and she did, was through an extended project qualification last year.

We then see Ofcom’s own reporting on levels of media literacy in adults. Among 16 to 24 year-olds, which would cover Monica, for example, according to the most recent report, published earlier this year or at the end of last year, only two-thirds are confident and able to recognise scam ads, compared to 76% of the population in England. Young people are less confident in recognising search-engine advertising than the majority: only 42% of young people are confident in differentiating between organic and advertising content on search. Of course, young people are better at thinking about the truthfulness of “factual” information online. For adults generally, the report showed that only 45% of us are confident and able to recognise search-engine advertising, and a quarter of us struggle to identify scam emails and to judge factual truthfulness online. You are less media literate and therefore more vulnerable if you are from the poorer parts of the population. If you are older, you are yet more vulnerable to scam emails, although above average on questioning online truth and spotting ads in search engines. Finally, in 2022, Ofcom also found that 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to be able to do so. A lot of us are kidding ourselves about how safe we are and how much we know about the online world.

So, much more is to be done. Hence, Amendment 52A probes what the duty on platforms should be to improve media literacy and thereby establish the reliability and accuracy of journalistic content. Amendment 91 in my name requires social media and search services to put in place measures to improve media literacy and thereby explain things like the business model that currently is too often skated over by the media literacy content provided by platforms to schools and others. The noble Lord, Lord Holmes, has Amendment 91A, which is similar in intent, and I look forward to hearing his comments on that.

Amendment 98 in my name would require a code of practice from Ofcom in support of these duties and Amendment 186 would ensure that Ofcom has sufficient funds for its media literacy duties. Amendment 188 would update the Communications Act to reflect the online world that we are addressing in this Bill. I look forward to the comments from the noble Baroness, Lady Prashar, in respect of her Amendment 236, which, she may argue, does a more comprehensive job than my amendment.

Finally, my Amendment 189 in this group states that Ofsted would have to collaborate with Ofcom in pursuance of its duties, so that Ofcom could have further influence over the quality of provision in schools. Even this afternoon, I was exchanging messages with an educator in Cornwall called Giles Hill, who told me that it is truly dreadful for schools to have to mop up problems caused by this unregulated mess.

This may not be the perfect package in respect of media literacy and the need to get this right and prop up the three-legged stool, but there is no doubt from Second Reading and other comments through the Bill’s passage that this is an area where the Bill needs to be amended to raise the priority and the impact of media literacy among both service providers and the regulator. I beg to move.

Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in today’s proceedings. As it is my first contribution on this Bill, I declare my technology and financial services interests, as set out in the register. I also apologise for not being able to take part in the Second Reading deliberations.

It is a particular pleasure to follow my friend, the noble Lord, Lord Knight; I congratulate him on all the work that he has done in this area. Like other Members, I also say how delighted I was to be part of Lord Puttnam’s Democracy and Digital Technologies Committee. It is great to know that he is watching—hopefully on a wide movie screen from Skibbereen—because the contribution that he has made to this area over decades is beyond parallel. To that end, I ask my noble friend the Minister whether he has had a chance to remind himself of the recommendations in our 2020 report. Although it is coming up to three years old, so much of what is in that report is as pertinent today as it was on the date of publication.

I am in the happy position to support all the amendments in this group; they all have similar intent. I have been following the debate up to this point and have been in the Chamber for a number of previous sessions. Critically important issues have been raised in every group of amendments but, in so many ways, this group is perhaps particularly critical, because this is one of the groups that enables individuals, particularly young people, to have the tools that they—and we—need in their hands to enable them to grip this stuff, in all its positive and, indeed, all its less-positive elements.

My Amendment 91A covers much of the same ground as Amendment 91 from the noble Lord, Lord Knight. It is critical that, when we talk about media literacy, we go into some detail around the subsets of data literacy, data privacy, digital literacy and, as I will come on to in a moment, financial literacy. We need to ensure that every person has an understanding of how this online world works, how it is currently constructed and how there is no inevitability about that whatever. People need to understand how the algorithms are set up. As was mentioned on a previous group, it is not necessarily that much of a problem if somebody is spouting bile in the corner; it is not ideal, but it is not necessarily a huge problem. The problem in this world is the programmability, the focus, the targeting and the weaponising of algorithms to amplify such content for monetary return. Nothing is inevitable; it is all utterly determined by the models currently in play.

It is critical for young people, and all people, to understand how data is used and deployed. In that media literacy, perhaps the greatest understanding of all is that it is not “the data” but “our data”. It is for us, through media literacy, to determine how our data is deployed, for what purpose, to what intent and in what circumstances, rather than, all too often, it being sold on, and so on.

17:15
Does the Minister agree that it is critical that we include financial literacy in this broader media literacy group of amendments, because so much of what is currently online is designed as financial scams or inducements? It would not be overstating it to say that there is currently an epidemic of online scamming and fraud. Does he agree that the Bill needs to be very clear on this specific issue of literacy? Will he update the Committee on the work the Government have done on the Media Literacy Taskforce Fund and, indeed, the programme fund launched last October? What updates or plans are there to scale, to develop and to further partner on both those funds?
Finally, I quote the words of the Royal College of Psychiatrists, stating pretty clearly, in terms, why media literacy matters:
“media literacy … can equip young people with the tools they need to help protect themselves as new online harms develop”.
I agree but, matching like with like, I seek to amplify. More than tools, we need media literacy to be nothing short of the sword and the shield for young people in the online world—the sword and the shield for all people.
Baroness Fox of Buckley (Non-Afl)

My Lords, for once, I am not entirely hostile to all these amendments—hurrah. In fact, I would rather have media literacy and education than regulation; that seems to me the solution to so much of what we have been discussing. But guess what? I have a few anxieties and I shall just raise them so that those who have put forward the arguments can come back to me.

We usually associate media literacy with schools and young people in education. Noble Lords will be delighted to know that I once taught media literacy: that might explain where we are now. It was not a particularly enlightening course for anybody, but it was part of the communications A-level at the time. I am worried about mandating how schools teach media literacy. As the noble Lord, Lord Knight, will know, I worry about adding still more to schools’ already overcrowded curriculum, but I note that the amendments actually extend the notion of being taught literacy to adults, not just children. I suppose I just have some anxiety about Ofcom becoming the nation’s teacher, presenting users of digital services as though they are hapless and helpless. In other words, I am concerned about an overly paternalistic approach—that we should not be patronising.

The noble Baroness, Lady Kidron, keeps reminding us that content should not be our focus, and that it should be systems. In fact, in practically every discussion we have had, content has been the focus, because that is what will be removed, or not, by how we deal with the systems. That is one of the things that we are struggling with.

Literacy in the systems would certainly be very helpful for everybody. I have an idea—it is not an amendment—that we should send the noble Lord, Lord Allan of Hallam, on a UK tour so that he can explain it to us all; he is not here for this compliment, but every time he spoke in the first week of Committee, I think those of us who were struggling understood what he meant, as he explained complicated and technical matters in a way that was very clear. That is my constructive idea.

Amendment 52A from the noble Lord, Lord Knight of Weymouth, focuses on content, with its

“duty to make available information to allow users to establish the reliability and accuracy of content”.

That takes us back to the difficulties we were struggling with over how questions of misinformation and disinformation will be settled and whether that is even feasible. I do not know whether any noble Lords have been following the “mask wars” that are going on. There are bodies of scientists on both sides of the debate on the efficacy of mask wearing—wielding scientific papers at dawn, as it were. These are well-informed, proper scientists who completely disagree on whether it was effective during lockdown. I say that because establishing reliability and accuracy is not that straightforward.

I like the idea of making available

“to users such information that may be necessary to allow users to establish the reliability and accuracy of content encountered on the service”.

I keep thinking that we need adults and young people to say that there is not one truth, such as “the science”, and to be equipped and given the tools to search around and compare and contrast different versions. I am involved in Debating Matters for 16 to 18 year-olds, which has topic guides that say, “Here is an argument, with four really good articles for it and four really good articles against, and here’s a load of background”. Then 16 to 18 year-olds will at least think that there is not just one answer. I feel that is the way forward.

The noble Lord, Lord Clement-Jones, said that I was preaching a counsel of despair; I like to think of myself as a person who has faith in the capacity and potential of people to overcome problems. I had a slight concern when reading the literature associated with online and digital literacy—not so much with the amendments—that it always says that we must teach people about the harms of the online world. I worry that this will reinforce a disempowering idea of feeling vulnerable and everything being negative. One of the amendments talks about a duty to promote users’ “safe use” of the service. I encourage a more positive outlook, incorporating into this literacy an approach that makes people aware that they can overcome and transcend insults and be robust and savvy enough to deal with algorithms—that they are not always victims but can take control over the choices they make. I would give them lessons on resilience, and possibly just get them all to read John Locke on toleration.

Baroness Prashar (CB)

My Lords, I will speak to Amendments 236, 237 and 238 in my name. I thank the noble Lord, Lord Storey, and the noble Baroness, Lady Bennett of Manor Castle, for supporting me. Like others, I thank Full Fact for its excellent briefings. I also thank the noble Lord, Lord Knight, for introducing this group of amendments, as it saves me having to make the case for why media literacy is a very important aspect of this work. It is the other side of regulation; they very much go hand in hand. If we do not take steps to promote media literacy, we could fall into a downward spiral of further and further regulation, so it is extremely important.

It is a sad fact that levels of media literacy are very low. Research from Ofcom has found that one-third of internet users are unaware of the potential for inaccurate and biased information. Further, 40% of UK adult internet users do not have the skills to critically assess information they see online, and only 2% of children have the skills to tell fact from fiction online. It would not be paternalistic for a regulator to be proactively involved in developing media literacy programmes. Through the complaints it receives and from the work that it does, the regulator can identify and monitor where the gaps are in media literacy.

To date, the response to this problem has been for social media platforms to remove content deemed harmful. This is often done using technology that picks up on certain words and phrases. The result has been content being removed that should not have been. Examples of this include organisations such as Mumsnet having social media posts on sexual health issues taken down because the posts use certain words or phrases. At one stage, Facebook’s policy was to delete or censor posts expressing opinions that deviated from the norm, without defining what “norm” actually meant. The unintended consequences of the Bill could undermine free speech. Rather than censoring free speech through removing harmful content, we should give a lot more attention to media literacy.

During the Bill’s pre-legislative scrutiny, the Joint Committee recommended that the Government include provisions to ensure media literacy initiatives are of a high standard. The draft version of the Bill included Clause 103, which strengthened the media literacy provisions in the Communications Act 2003, as has already been mentioned. Regrettably, the Government later withdrew the enhanced media literacy clause, so the aim of my amendments is to reintroduce strong media literacy provisions. Doing so will both clarify and strengthen media literacy obligations on online media providers and Ofcom.

Amendment 236 would place a duty on Ofcom to take steps to improve the media literacy of the public in relation to regulated services. As part of this duty, Ofcom must try to reach audiences who are less engaged and harder to reach through traditional media literacy services. It must also address gaps in the current availability of media literacy provisions for vulnerable users. Many of the existing media literacy services are targeted at children but we need to include vulnerable adults too. The amendment would place a duty on Ofcom to promote availability and increase the effectiveness of media literacy initiatives in relation to regulated services. It seeks to ensure that providers of regulated services take appropriate measures to improve users’ media literacy through Ofcom’s online safety function. This proposed new clause makes provision for Ofcom to prepare guidance about media literacy matters, and such guidance must be published and kept under review.

Amendment 237 would place a duty on Ofcom to prepare a strategy on how it intends to undertake the duty to promote media literacy. This strategy should set out the steps Ofcom proposes to take to achieve its media literacy duties and identify organisations, or types of organisations, that Ofcom will work with to undertake these duties. It must also explain why Ofcom believes the proposed steps will be effective and how it will assess progress. This amendment would also place a duty on Ofcom to have regard to the need to allocate adequate resources for implementing this strategy. It would require Ofcom’s media literacy strategy to be published within six months of this provision coming into force, and to be revised within three years; in both cases this should be subject to consultation.

Amendment 238 would place a duty on Ofcom to report annually on the delivery of its media literacy strategy. This reporting must include steps taken in accordance with the strategy and assess the extent to which those steps have had an effect. This amendment goes further than the existing provisions in the Communications Act 2003, which do not include duties on Ofcom to produce a strategy or to measure progress; nor do they place a duty on Ofcom to reach hard-to-reach audiences who are the most vulnerable in our society to disinformation and misinformation.

17:30
The Government have previously responded by saying that there is no need to include media literacy provisions in the Bill, citing Ofcom’s Approach to Online Media Literacy, a document published in December 2021, and the Government’s own Online Media Literacy Strategy, published in July 2021. Both these documents make multiple references to the Online Safety Bill placing media literacy duties on Ofcom. The removal of media literacy provisions from the Bill risks this not being viewed as a priority area of the work of Ofcom or future Governments. Meta has said that it would prefer to have clear media literacy duties in the Bill, as this provides clarity. Without regulatory obligations, there is a risk that, in this important area of work, the regulator will not have the teeth it needs to monitor and regulate where there are gaps.
We need to equip society—children and adults—so that they can make knowledgeable and intelligent use of the internet. We have focused on the harm that the internet does, but the proper use of it can have a very positive impact. The previous debate that we had about misinformation and disinformation highlighted the importance of media literacy.
Baroness Bennett of Manor Castle (GP)

My Lords, it is a pleasure to follow the noble Baroness, Lady Prashar, and I join her in thanking the noble Lord, Lord Knight, for introducing this group very clearly.

In taking part in this debate, I declare a joint interest with the noble Baroness, Lady Fox, in that I was for a number of years a judge in the Debating Matters events to which she referred. Indeed, the noble Baroness was responsible for me ending up in Birmingham jail, when one such debate was conducted with the inmates there. We have a common interest there.

I want to pick up a couple of additional points. Before I joined your Lordships’ Committee today I was involved in the final stages of the Committee debate on the economic crime Bill, where the noble Lord, Lord Sharpe of Epsom, provided a powerful argument—probably unintentionally—for the amendments we are debating here now. We were talking, as we have at great length in the economic crime Bill, about the issue of fraud. As the noble Lord, Lord Holmes of Richmond, highlighted, in the context of online harms fraud is a huge aspect of people’s lives today and one that has been under-covered in this Committee, although it has very much been picked up in the economic crime Bill Committee. As we were talking about online fraud, the noble Lord, Lord Sharpe of Epsom, said that consumers have to be “appropriately savvy”. I think that is a description of the need for education and critical thinking online, equipping people with the tools to be, as he said, appropriately savvy when facing the risks of fraud and scams, and all the other risks that people face online.

I have attached my name to two amendments here: Amendment 91, which concerns the providers of category 1 and 2A services having a duty, and Amendment 236, which concerns an Ofcom duty. This joins together two aspects. The providers are making money out of the services they provide, which gives them a duty to make some contribution to combatting the potential harms that their services present to people. Ofcom as a regulator obviously has a role. I think it was the noble Lord, Lord Knight, who said that the education system also has a role, and there is some reference in here to Ofsted having a role.

What we need is a cross-society, cross-systems approach. This is where I also make the point that we need to think outside the scope of the Bill—it is part of the whole package—about how the education system works, because media literacy is not a stand-alone thing that you can separate out from the issues of critical thinking more broadly. We need to think about our education system, which, in schools in particular, far too often gets pupils to learn and regurgitate a set of facts and then rewards them for that. We need to think about how our education system prepares children for the modern online world.

There is a great deal we can learn from the example—often cited but worth referring to—of Finland, which by various tests has been ranked as the country most resistant to fake news. A very clearly built-in idea of questioning, scrutiny and challenge is being encouraged among pupils, starting from the age of seven. That is something we need to transform our education system to achieve. However, of course, many people using the internet now are not part of our education system, so this needs to be across our society. A focus on the responsibilities of Ofcom and the providers has to be in the Bill.

Baroness Kidron (CB)

My Lords, over the last decade, I have been in scores of schools, run dozens of workshops and spoken to literally thousands of children and young people. A lot of what I pass off as my own wisdom in this Chamber is, indeed, their wisdom. I have a couple of points, and I speak really from the perspective of children under 18 with regard to these amendments, which I fully support.

Media literacy—or digital literacy, as it is sometimes called—is not the same as e-safety. E-safety regimes concentrate on the behaviour of users. Very often, children say that what they learn in those lessons is focused on adult anxieties about predators and bullies, and when something goes wrong, they feel that they are to blame. It puts the responsibility on children. This response, which I have heard hundreds of times, normally comes up after a workshop in which we have discussed reward loops, privacy, algorithmic bias, profiling or—my own favourite—a game which reveals what is buried in terms and conditions; for example, that a company has a right to record the sound of a device or share their data with more than a thousand other companies. When young people understand the pressures that they are under and which are designed into the system, they feel much better about themselves and rather less enamoured of the services they are using. It is my experience that they then go on to make better choices for themselves.

Secondly, we have outsourced much of digital literacy to companies such as Google and Meta. They too concentrate on user behaviour, rather than looking at their own extractive policies focused on engagement and time spent. With many schools strapped for cash and expertise, this teaching is widespread. However, when I went to a Google-run assembly, children aged nine were being taught about features available only on services for those aged over 13—and nowhere was there a mention of age limits and why they are important. It cannot be right that the companies are grooming children towards their services without taking full responsibility for literacy, if that is the literacy that children are being given in school.

Thirdly, as the Government’s own 2021 media literacy strategy set out, good media literacy is one line of defence from harm. It could make a crucial difference in people making informed and safe decisions online and engaging in a more positive online debate, at the same time as understanding that online actions have consequences offline.

However, while digital literacy and, in particular, critical thinking are fundamental to a contemporary education and should be available throughout school and far beyond, they must not be used as a way of putting responsibility on the user for the company’s design decisions. I am specifically concerned that, in the risk-assessment process, digital literacy is one of the ways that a company can say it has mitigated a potential risk or harm. I should like to hear from the Minister that this is an additional responsibility, not a substitute for the company’s own responsibility.

Finally, over all these years I have always asked at the end of the session what the young people care about the most. The second most important thing is that the system should be less addictive—it should have less addiction built into it. Again, I point the Committee in the direction of the safety-by-design amendments in the name of my noble friend Lord Russell that try to get to the crux of that. They are not very exciting amendments in this debate but they get to the heart of it. However, the thing the young people most often say is, “Could you do something to get my parents to put down their phones?” I therefore ask the Minister whether he can slip something into the Bill, and indeed ask the noble Lord, Lord Grade, whether that could emerge somewhere in the guidance. That is what young people want.

Baroness Healy of Primrose Hill (Lab)

My Lords, I strongly support the amendments in the name of my noble friend Lord Knight and others in this group.

We cannot entirely contain harmful, misleading and dangerous content on the internet, no matter how much we strengthen the Bill. Therefore, it is imperative that we put a new duty on category 1 and category 2A services to require them to put in place measures to promote the media literacy of users so that they can use the service safely.

I know that Ofcom takes the issue of media literacy seriously, but it is regrettable that the Government have dropped their proposal for a new media literacy duty for Ofcom. So far, I see no evidence that the platforms take media literacy seriously, so they need to be made to understand that they have corporate social responsibilities towards their clients.

Good media literacy is the first line of defence from bad information and the kind of misinformation we have discussed in earlier groups. Schools are trying to prepare their pupils to understand that the internet can peddle falsehoods as well as useful facts, but they need support, as the noble Baroness, Lady Kidron, just said. We all need to increase our media literacy, especially with the increasing use of artificial intelligence, as it can make the difference between decisions based on sound evidence and decisions based on poorly informed opinions that can harm health and well-being, social cohesion and democracy.

In 2022, Ofcom found that a third of internet users are unaware of the potential for inaccurate or biased information online, and 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to do so, as my noble friend Lord Knight has pointed out.

Amendment 91 would mean that platforms have to instigate measures to give users an awareness and understanding of the nature and characteristics of the content that may be on the service, its potential impact and how platforms operate. That is a sensible and practical request that is not beyond the ability of companies to provide, and it will be to everyone’s benefit.

Lord Russell of Liverpool (CB)

My Lords, I indicate my support in principle for what these amendments are trying to achieve.

I speak with a background that goes back nearly 40 years, being involved in health education initiatives, particularly in primary schools. For 24 years—not very good corporate governance—I was the chair of what is now the largest supplier of health education into primary schools in the United Kingdom, reaching about 500,000 children every year.

The principle of preventive health is not a million miles away from what we are talking about today. I take the point that was well made by the noble Baroness, Lady Fox, that piling more and more duties on Ofcom in a well-intentioned way may not have the effect that we want. What we are really looking for and talking about is a joined-up strategy—a challenge for any Government—between the Department for Education, the Department for Digital, Culture, Media and Sport, the Department for Science, Innovation and Technology, and probably the Department of Health and Social Care, because health education, as it has developed over the last 40 or 50 years, has a lot to teach us about how we think about creating effective preventive education.

17:45
It is not just about children; it is about adults. In the readers’ problem page of any newspaper, whether from the left or the right of the political spectrum, the number of people, including those whom most of us would regard as intellectual peers or cleverer than us, who have been scammed in different ways, particularly through online intrusion, shows that it is very prevalent. These are clever, university-educated people who are being taken for a ride.
Yesterday I cleaned out the spam folder in one of my email accounts, which I do fairly quickly. As of about five minutes ago, I have three spam emails. In two of them, a major retailer seems to be telling me that I am the fortunate winner of a Ninja air fryer—not an offer that I propose to take up. The third purports to be from the Post Office, telling me that I have an exciting parcel to open. I am sure that if I clicked on it, something quite unpleasant would happen.
We need to do something about this. The point made by the noble Baroness, Lady Kidron, about children saying that they would love this to be less addictive, is a very moot point because the companies know exactly what they are doing. Clearly, we want to encourage children to understand how those tools operate and how one can try to control, mitigate or avoid them, or point them out to others who may not be as savvy. As for the one that was most desirable, parents putting down their telephones, I confess that occasionally, when sitting as a Deputy Speaker in your Lordships’ House, I wish the Government Whips would spend slightly less time looking at their telephones, although I am sure that whatever they are doing is very important government business.
I do not expect the Minister to stand up and say that we have a solution. The tech companies need to be involved. We need to look at good or best practice around the world, which probably has a lot to teach us, but we can do this only if we do it together in a joined-up way. If we try to do it in a fragmented way, we will put all the onus on Ofcom and it ain’t going to work.
Lord Davies of Brixton (Lab)

My Lords, I spoke at Second Reading about the relationship between online safety and protecting people’s mental health, a theme that runs throughout the Bill. I have not followed the progress in Committee as diligently as I wish, but this group of amendments has caught the eye of the Mental Health Foundation, which has expressed support. It identified Amendment 188, but I think it is the general principle that it supports. The Mental Health Foundation understands the importance of education, because it asked young people what they thought should be done. It sponsored a crucial inquiry through its organisation YoungMinds, which produced a report earlier this year, Putting a Stop to the Endless Scroll.

One of the three major recommendations that emerged from that report, from the feelings of young people themselves, was the need for better education. It found that young people were frustrated at being presented with outdated information about keeping their details safe. They felt that they needed something far more advanced, more relevant to the online world as it is happening at the moment, on how to avoid the risks from such things as image-editing apps. They needed information on more sophisticated risks that they face, essentially what they described as design risks, where the website is designed to drag you in and make you addicted to these algorithms.

The Bill as a whole is designed to protect children and young people from harm, but it must also, as previous speakers have made clear, provide young people themselves with tools so that they can exercise their own judgment to protect themselves and ensure that they do not fall foul, set on that well-worn path between being engaged on a website and ending up with problems with their mental health. Eating is the classic example: you click on a website about a recipe and, step by step, you get dragged into material designed to harm your health through its effect on your diet.

I very much welcome this group of amendments, what it is trying to achieve and the role that it will have by educating young people to protect themselves, recognising the nature of the internet as it is now, so that they do not run the risks of affecting their mental health.

Lord Clement-Jones (LD)

My Lords, this has probably been the most constructive and inspiring debate that we have had on the Bill. In particular, I thank the noble Lord, Lord Knight, for introducing this debate. His passion for this kind of media literacy education absolutely shines through. I thank him for kicking off in such an interesting and constructive way. I am sorry that my noble friend Lord Storey is not here to contribute as well, with his educational background. He likewise has a passion for media literacy education and would otherwise have wanted to contribute to the debate today.

I am delighted that I have found some common ground with the noble Baroness, Lady Fox. The idea of sending my noble friend Lord Allan on tour has great attractions. I am not sure that he would find it quite so attractive. I am looking forward to him coming back before sending him off around the country. I agree that he has made a very constructive contribution. I agree with much of what the noble Baroness said, and the noble Baroness, Lady Prashar, had the same instinct: this is a way of better preserving freedom of speech. If we can have those critical thinking skills so that people can protect themselves from misinformation, disinformation and some of the harms online, we can have greater confidence that people are able to protect themselves against these harms at whatever age they may be.

I was very pleased to hear the references to Lord Puttnam, because I think that the Democracy and Digital Technologies Committee report was ground-breaking in the way it described the need for digital media literacy. This is about equipping not just young people but everybody with the critical thinking skills needed to differentiate fact from fiction—particularly, as we have talked through in Committee, on the way that digital platforms operate through their systems, algorithms and data.

The noble Lord, Lord Holmes, talked about the breadth and depth needed for media and digital literacy education; he had it absolutely right about people being appropriately savvy, and the noble Baroness, Lady Bennett, echoed what he said in that respect.

I think we have some excellent amendments here. If we can distil them into a single amendment in time for Report or a discussion with the Minister, I think we will find ourselves going forward constructively. There are many aspects of this. For instance, the DCMS Select Committee recommended that digital literacy becomes the fourth pillar of education, which seems to me a pretty important aspect alongside reading, writing and maths. That is the kind of age that we are in. I have quoted Parent Zone before. It acknowledges the usefulness of user empowerment tools and so on, but again it stressed the need for media literacy. What kind of media literacy? The noble Baroness, Lady Kidron, was extremely interesting when she said that what is important is not just user behaviour but making the right choices—that sort of critical thinking. The noble Lord, Lord Russell, provided an analogy with preventive health that was very important.

Our Joint Committee used a rather different phrase. It talked about a “whole of government” approach. When we look at all the different aspects, we see that it is something not just for Ofcom—I entirely agree with that—but that should involve a much broader range of stakeholders in government. We know that, out there, there are organisations such as the Good Things Foundation and CILIP, the library association, and I am sorry that the noble Baroness, Lady Lane-Fox, is not in her place to remind us about Doteveryone, an organisation that many of us admire a great deal for the work it carries out.

I think the “appropriately savvy” expression very much applies to the fraud prevention aspect, and it will be interesting when we come to the next group to talk about that as well. The Government have pointed to the DCMS online media literacy strategy, but the noble Lord, Lord Holmes, is absolutely right to ask what its outcome has been, what its results have been, and what resources are being devoted towards it. We are often pointed to that by the Government, here in Committee and at Oral Questions whenever we ask how the media literacy strategy is going, so we need to kick the tyres on that as well as on the kind of priority and resources being devoted to media literacy.

As ever, I shall refer to the Government’s response to the Joint Committee, which I found rather extraordinary. The Government responded to the committee’s recommendation about minimum standards; there is an amendment today about minimum standards. They said:

“Ofcom has recently published a new approach to online media literacy … Clause 103 of the draft Bill”—


the noble Baroness, Lady Prashar, referred to the fact that in the draft Bill there was originally a new duty on Ofcom—

“did not grant Ofcom any additional powers. As such, it is … unnecessary regulation. It has therefore been removed”.

It did add to Ofcom’s duties. Will the Minister say whether he thinks all the amendments here today would constitute unnecessary regulation? As he can see, there is considerable appetite around the Committee for the kind of media literacy duty across the board that we have talked about today. He might make up for some of the disappointment that many of us feel about the Government’s having got rid of that clause by responding to that question.

18:00
The noble Lord, Lord Davies, made an important point about the mental health aspects of digital literacy. A survey run by the charity YoungMinds said that this was one of the main provisions it wanted included in the Bill. Again, on those grounds, we should see a minimum standard set by Ofcom under the terms of the Bill, as we are asking for in the amendment.
The All-Party Parliamentary Group on Media Literacy has done some really good work. Just saying, “This is cross-government”, “We need a holistic approach to this” and so on does not obviate the fact that our schools need to be much more vigorous in what they do in this area. Indeed, the group is advocating a media literacy education Bill, talking about upskilling teachers and talking, as does one of the amendments here, about Ofcom having a duty in this area. We need to take a much broader view of this and be much more vigorous in what we do on media literacy, as has been clear from all the contributions from around the House today.
Lord Parkinson of Whitley Bay (Con)

My Lords, this has been a good debate. I am glad that a number of noble Lords mentioned Lord Puttnam and the committee that he chaired for your Lordships’ House on democracy and digital technologies. I responded to the debate that we had on that; sadly, it was after he had already retired from your Lordships’ House, but he participated from the steps of the Throne. I am mindful of that report and the lessons learned in it in the context of the debate that we have had today.

We recognise the intent behind the amendments in this group to strengthen the UK’s approach to media literacy in so far as it relates to services that will be regulated by the Bill. Ofcom has a broad duty to promote media literacy under the Communications Act 2003. That is an important responsibility for Ofcom, and it is right that the regulator is able to adapt its approach to support people in meeting the evolving challenges of the digital age.

Amendments 52A and 91 from the noble Lord, Lord Knight, and Amendment 91A from the noble Lord, Lord Holmes of Richmond, seek to introduce duties on in-scope services, requiring them to put in place measures that promote users’ media literacy, while Amendment 98 tabled by the noble Lord, Lord Knight, would require Ofcom to issue a code of practice in relation to the new duty proposed in his Amendment 91. While we agree that the industry has a role to play in promoting media literacy, the Government believe that these amendments could lead to unintended, negative consequences.

I shall address the role of the industry and media literacy, which the noble Baroness, Lady Kidron, dwelt on in her remarks. We welcome the programmes that it runs in partnership with online safety experts such as Parent Zone and Internet Matters and hope they continue to thrive, with the added benefit of Ofcom’s recently published evaluation toolkit. However, we believe that platforms can go further to empower and educate their users. That is why media literacy has been included in the Bill’s risk assessment duties, meaning that regulated services will have to consider measures to promote media literacy to their users as part of the risk assessment process. Additionally, through work delivered under its existing media literacy duty, Ofcom is developing a set of best-practice design principles for platform-based media literacy measures. That work will build an evidence base of the most effective measures that platforms can take to build their users’ media literacy.

In response to the noble Baroness’s question, I say: no, platforms will not be able to avoid putting in place protections for children by using media literacy campaigns. Ofcom would be able to use its enforcement powers if a platform was not achieving appropriate safety outcomes. There are a range of ways in which platforms can mitigate risks, of which media literacy is but one, and Ofcom would expect platforms to consider them all in their risk assessments.

Let me say a bit about the unintended consequences we fear might arise from these amendments. First, the resource demands of creating a code of practice and then regulating firms’ compliance with this type of broad duty would place an undue burden on the regulator. It is also unclear how the proposed duties in Amendments 52A, 91 and 91A would interact with Ofcom’s existing media literacy duty. There is a risk, we fear, that these parallel duties could be discharged in conflicting ways. Amendment 91A is also open to broad interpretation by platforms and could enable them to fulfil the duty in a way that lacked real impact on users’ media literacy.

The amendment in the name of my noble friend Lord Holmes proposes a duty to promote awareness of financial deception and fraud. The Government are already taking significant action to protect people from online fraud, including through their new fraud strategy and other provisions in this Bill. I know that my noble friends Lord Camrose, Lord Sharpe of Epsom and Lady Penn met noble Lords to talk about that earlier this week. We believe that measures such as prompts for users before they complete financial transactions sit more logically with financial service providers than with services in scope of this Bill.

Amendment 52A proposes a duty on carriers of journalistic content to promote media literacy to their users. We do not want to risk requiring platforms to act as de facto press regulators, assessing the quality of news publishers’ content. That would not be compatible with our commitment to press freedom. Under its existing media literacy duty, Ofcom is delivering positive work to support people to discern high-quality information online. It is also collaborating with the biggest platforms to design best practice principles for platform-based media literacy measures. It intends to publish these principles this year and will encourage platforms to adopt them.

It is right that Ofcom is given time to understand the benefits of these approaches. The Secretary of State’s post-implementation review will allow the Government and Parliament to establish the effectiveness of Ofcom’s current approach and to reconsider the role of platforms in enhancing users’ media literacy, if appropriate. In the meantime, the Bill introduces new transparency-reporting and information-gathering powers to enhance Ofcom’s visibility of platforms’ delivery and evaluation of media literacy activities. We would not want to see amendments that would inadvertently dissuade platforms from delivering these activities in favour of less costly and less effective measures.

My noble friend Lord Holmes asked about the Online Media Literacy Strategy, published in July 2021, which set out the Government’s vision for improving media literacy in the country. Alongside the strategy, we have committed to publishing annual action plans each financial year until 2024-25, setting out how we meet the ambition of the strategy. In April 2022 we published the Year 2 Action Plan, which included extending the reach of media literacy education to those who are currently disengaged, in consultation with the media literacy task force—a body of 17 cross-sector experts—expanding our grant funding programme to provide nearly £2.5 million across two years for organisations delivering innovative media literacy activities, and commissioning research to improve our understanding of the challenges faced by the sector. We intend to publish the research later this year, for the benefit of civil society organisations, technology platforms and policymakers.

The noble Lord, Lord Knight, in his Amendment 186, would stipulate that Ofcom must levy fees on regulated firms sufficient to fund the work of third parties involved in supporting it to meet its existing media literacy duties. The Bill already allows Ofcom to levy fees sufficient to fund the annual costs of exercising its online safety functions. This includes its existing media literacy duty as far as it relates to services regulated by this Bill. As such, the Bill already ensures that these media literacy activities, including those that Ofcom chooses to deliver through third parties, can be funded through fees levied on industry.

I turn to Amendments 188, 235, 236, 237 and 238. The Government recognise the intent behind these amendments, which is to help improve the media literacy of the general public. Ofcom already has a statutory duty to promote media literacy with regard to the publication of anything by means of electronic media, including services in scope of the Bill. These amendments propose rather prescriptive objectives, either as part of a new duty for Ofcom or through updating its existing duty. They reflect current challenges in the sector but run the risk of becoming obsolete over time, preventing Ofcom from adapting its work in response to emerging issues.

Ofcom has demonstrated flexibility in its existing duty through its renewed Approach to Online Media Literacy, launched in 2021. This presented an expanded media literacy programme, enabling it to achieve almost all the objectives specified in this group. The Government note the progress that Ofcom has already achieved under its renewed approach in the annual plan it produced last month. The Online Safety Bill strengthens Ofcom’s functions relating to media literacy, which is covered by Ofcom’s new transparency-reporting and information-gathering powers; these will give it enhanced oversight of industry activity by enabling it to require regulated services to share or publish information about the work that they are doing on media literacy.

The noble Baroness, Lady Prashar, asked about the view expressed by the Joint Committee on minimum standards for media literacy training. We agree with the intention behind that, but, because of the broad and varied nature of media literacy, we do not believe that introducing minimum standards is the most effective way of achieving that outcome. Instead, we are focusing efforts on improving the evaluation practices of media literacy initiatives to identify which ones are most effective and to encourage their delivery. Ofcom has undertaken extensive work to produce a comprehensive toolkit to support practitioners to deliver robust evaluations of their programmes. This was published in February this year and has been met with praise from practitioners, including those who received grant funding from the Government’s non-legislative media literacy work programme. The post-implementation review of Ofcom’s online safety regime, which covers its existing media literacy duty in so far as it relates to regulated services, will provide a reasonable point at which to establish the effectiveness of Ofcom’s new work programme, after giving it time to take effect.

Noble Lords talked about the national curriculum and media literacy in schools. Media literacy is indeed a crucial skill for everyone in the digital age. Key media literacy skills are already taught through a number of compulsory subjects in the national curriculum. Digital literacy is included in the computing national curriculum in England, which equips pupils with the knowledge, understanding and skills to use information and communication technology creatively and purposefully. I can reassure noble Lords that people such as Monica are being taught not about historic things like floppy disks but about emerging and present challenges; the computing curriculum ensures that pupils are taught how to design and program systems and accomplish goals such as collecting, analysing, evaluating and presenting data.

Baroness Kidron (CB)

Does the Minister know how many children are on computing courses?

Lord Parkinson of Whitley Bay (Con)

I do not know, but I shall find out from the Department for Education and write. But those who are on them benefit from a curriculum that includes topics such as programming and algorithms, the responsible and safe use of technology, and other foundational knowledge that may support future study in fields such as artificial intelligence and data science.

This is not the only subject in which media literacy and critical thinking are taught. In citizenship education, pupils are taught about critical thinking and the proper functioning of a democracy. They learn to distinguish fact from opinion, as well as exploring freedom of speech and the role and responsibility of the media in informing and shaping public opinion. As Minister for Arts and Heritage, I will say a bit about history, English and other arts subjects, in which pupils learn to ask questions about information, think critically and weigh up arguments, all of which are important skills for media literacy, as well as more broadly.

18:15
In the debate on the report of the committee led by Lord Puttnam, I mentioned the work of Art UK and its programme, the Superpower of Looking. There are many other excellent examples, such as the National Gallery’s Take One Picture scheme, which works with schools to encourage pupils to look at just one work of art from that fabulous collection in order to encourage critical thinking and to look beyond what is immediately apparent. My department is working with the Department for Education on a cultural education plan to ensure that these sorts of initiatives are shared across all schools in the state sector. Additionally, the Department for Education published its updated Teaching Online Safety in Schools non-statutory guidance in January 2023, which provides schools with advice on how to teach children to stay safe online.
There are many ways outside the curriculum in which schoolchildren and young people benefit. I had the pleasure of being a judge for Debating Matters, as did the noble Baroness, Lady Bennett—though not in my case behind bars. Schemes such as this, along with debating clubs in schools, all add to the importance of critical thinking and debate.
Amendment 189 in the name of the noble Lord, Lord Knight, seeks to place a requirement on all public bodies to assist Ofcom in relation to its duties under the regime set out by the Bill. The regulator will need to co-operate with a variety of organisations. Ofcom has existing powers to enable this and, where appropriate and proportionate, we have used the Bill to strengthen them. The Bill’s information-gathering powers will allow Ofcom to request information from any person, including public bodies, who appears to have information required by it in order to exercise its online safety function. Placing this broad duty on all public bodies would not be proportionate or effective. It would create an undefined requirement on public bodies and give Ofcom a disproportionate amount of power.
The noble Lord’s amendment uses Ofsted as an example of a public body that would be required to co-operate with Ofcom under the proposed duty. Ofsted already has the power to advise and assist other public authorities, including Ofcom, under Section 149 of the Education and Inspections Act 2006.
I hope noble Lords have been reassured by the points I have set out and will understand why the Government are not able to accept these amendments. I will reflect on the wider remarks made in this debate. With that, I invite the noble Lord to withdraw his amendment.
Lord Knight of Weymouth (Lab)

My Lords, I am grateful to all Members of the Committee for their contributions to a good debate. I was particularly happy to hear the noble Lord, Lord Clement-Jones, describe it as “inspiring”. There were some great speeches.

I could go on at some length about the educational element to this, but I will constrain myself. In the last year, 1.4% of secondary school pupils in this country did computer science at GCSE. It is a constant source of frustration that computer science is prayed in aid by the Department for Education as a line for Ministers to take in the algorithm they are given to use. However, I understand that the Minister just has to deliver the message.

The noble Baroness was worried about adding to the curriculum. Like the noble Baroness, Lady Bennett, I favour a wider-scale reform of the education system to make it much more fit for purpose, but I will not go on.

I was the Minister responsible for the Education and Inspections Act 2006. I would be interested in further updates as to how it is going. For example, does Ofcom ever go with Ofsted into schools and look properly at media literacy delivery? That is what I am trying to tease out with the amendment.

The comments in the speech by the noble Baroness, Lady Prashar, were significant. She pointed out the weaknesses in the strategy and the difference between the duty as set out in the 2003 Act and the duties we now need, and the pressing case for these duties to be updated as we take this Bill through this House.

The noble Baroness, Lady Fox, had some misgivings about adding adults, which I think were perfectly answered by the noble Baroness, Lady Kidron, in respect of her plea on behalf of young people to help educate parents and give them better media literacy, particularly around the overuse of phones. We have a digital code of conduct in our own house to do with no phones being allowed at mealtimes or in bedrooms by any of us. All of that plays to the mental health issues referred to by my noble friend Lord Davies, and the preventive health aspect referred to by the noble Lord, Lord Russell.

As ever, I am grateful to the Minister for the thorough and comprehensive way in which he answered all the amendments. However, ultimately, the media literacy levels of adults and children in this country are simply not good enough. The existing duties that he refers to, and the way in which he referred to them in his speaking notes, suggest a certain amount of complacency about that. The duties are not working and need to be updated; we need clarity as to who owns the problem of that lack of media literacy, and we are not getting that. This is our opportunity to address that and to set out clearly what the responsibilities are of the companies and the regulator, and how the two work together so that we address the problem. I urge the Minister to work with those of us concerned about this and come forward with an amendment that he is happy with at Report, so that we can update this duty. On that basis, I am happy to withdraw the amendment for now.

Amendment 52A withdrawn.
Clause 16: Duty about content reporting
Amendment 53
Moved by
53: Clause 16, page 18, line 10, at end insert—
“(3A) Content that constitutes a fraudulent advertisement within the meaning of section 33.”
Member’s explanatory statement
This amendment, and others in the name of Baroness Morgan, would extend the current provisions on transparency reporting, user reporting and user complaints to fraudulent advertisements.
Baroness Morgan of Cotes (Con)

My Lords, I shall speak to Amendments 53 to 55, and Amendments 86, 87, 162 to 173, and 175 to 181 in my name and that of the noble Lord, Lord Clement-Jones. I declare my relevant interests in this group of amendments as a non-executive director of the Financial Services Compensation Scheme and Santander UK, and chair of the Association of British Insurers—although, as we have heard, fraud is prevalent across all sectors, so we are all interested in these issues.

This debate follows on well from that on the last group of amendments, as we were just hearing. Fraud is now being discussed so widely in this House and in Parliament that there are three Bills before your Lordships’ House at the moment in which fraud is a very real issue. I am sure that there are others, but there are three major Bills—this one, the Economic Crime and Corporate Transparency Bill, and the Financial Services and Markets Bill.

These amendments seek to fill a noticeable gap in the Bill concerning fraudulent advertisements—a gap that can be easily remedied. The Minister has done a very good job so far with all groups that we have debated, batting away amendments, but I hope that he might just say, “Yes, I see the point of the amendment that you are putting forward, and I shall go away and think about it”. I will see what attitude and response we get at the end of the debate.

I had the great privilege, as I said yesterday when asking a question, of chairing this House’s 2022 inquiry into the Fraud Act 2006 and digital fraud. As we have heard, fraud is currently the fastest growing crime and is being facilitated by online platforms. Coincidentally, just today, UK Finance, the trade body for the UK banking industry, has published its fraud figures for 2022. It has conducted analysis on more than 59,000 authorised push payment fraud cases to show the sources of fraud. Authorised push payment is where the customer—the victim, unfortunately—transfers money to the fraudster and authorises that transfer but has often, or usually, been socially engineered into doing so. UK Finance is now asking where those frauds originate from, and its analysis shows that 78% of APP fraud cases originated online and accounted for 36% of losses, and 18% of fraud cases originated via telecommunications and accounted for 44% of losses.

I will leave to one side the fact that the Bill does not touch on emails and telecoms, and I shall focus today on fraudulent advertisements and fraud. I should say that I welcome the fact that the Government changed the legislation from the draft Bill when the Bill was presented to the House of Commons, so that fraudulent advertisements and fraud were caught more fully by the Bill than had originally been anticipated.

As we have heard, victims of fraud suffer not just financially but emotionally and mentally, with bouts of anxiety and depression. They report feeling “embarrassed or depressed” about being scammed. Many lose a significant amount of money in a way that severely impacts their lives and, in the worst cases, people have been known to take their own lives. In the case of things such as romance scams or investment scams, people’s trust in any communication that they subsequently receive is severely undermined. I thank all of those victims of fraud who gave evidence to our inquiry and have done so to other inquiries in this House and in the House of Commons.

Fraud is a pretty broad term, as we set out in the report, and we should be clear that this Bill covers fraud facilitated by user-generated content or via search results and fraudulent advertisements on the largest social media and search services. My noble friend the Minister spoke about the meeting held earlier this week between Members of this House and Ministers, and officials produced a helpful briefing note that makes it clear that the Bill covers such fraud. However, as I said, emails, SMS and MMS messages, and internet service providers—web hosting services—are not covered by the Bill. There remains very much a gap that victims, sadly, can fall through.

The point of the amendments in the group, and the reason I hope that the Minister can at least say yes to some of them, is that they are pushing in the direction that the Government want to go too. At the moment, the Bill appears to exclude fraudulent advertisements from several key duties that apply to other priority illegal content, thereby leaving consumers with less protection. In particular, the duties—or lack of them—around transparency reporting, user reporting and complaints in relation to fraudulent advertisements are concerning. It does not make any sense. That is why I hope that the Minister can explain the drafting. It could be argued that fraudulent advertising is already included in transparency reporting as defined in the Bill, but that is limited to a description of platforms’ actions and does not include obligations to provide information on the incidence of fraudulent advertisements or other key details, as is required for other types of illegal content.

Transparency reporting, as I suspect we will hear from a number of noble Lords, is essential for the regulator to see how prevalent fraudulent advertisements are on a platform’s service and whether that platform is successfully mitigating the advertisements. It remains essential, too, that users can easily report fraudulent content when they come across it and for there to be a procedure that allows users to complain if platforms are failing in their duty to keep users safe.

I should point my noble friend to the Government’s fraud strategy published last week. Paragraph 86 states:

“We want to make it as simple as possible for users to report fraud they see online. This includes scam adverts, false celebrity endorsements and fake user profiles. In discussion with government, many of the largest tech companies have committed to making this process as seamless and consistent as possible. This means, regardless of what social media platform or internet site you are on, you should be able to find the ‘report’ button within a single click, and then able to select ‘report fraud or scams’.”


The Government are saying that they want user reporting to be as simple as possible. These amendments suggest ways in which we can make user reporting as simple as possible as regards fraudulent advertisers.

The amendments address the gap in the Bill’s current drafting by inserting fraudulent advertising alongside other illegal content duties for social media reporting in Clause 16, complaints in Clause 17 and the equivalent clauses for search engines in Clauses 26 and 27. The amendments add fraudulent advertising alongside other illegal content into the description of the transparency reporting requirements in Schedule 8. Without these amendments, the regulator will struggle to understand the extent of the problem of fraudulent advertisements and platforms will probably fail to prevent this harmful content being posted.

This will, I hope, be a short debate, and I look forward to hearing what my noble friend the Minister has to say on this point. I beg to move.

18:30
Lord Lucas (Con)

My Lords, I also have a pair of amendments in this group. I am patron of a charity called JobsAware, which specialises in dealing with fraudulent job advertisements. It is an excellent example of collaboration between government and industry in dealing with a problem such as this. Going forward, though, they will be much more effective if there is a decent flow of information and if this Bill provides the mechanism for that. I would be very grateful if my noble friend would agree to a meeting, between Committee and Report, to discuss how that might best be achieved within the construct of this Bill.

It is not just the authorities who are able to deter these sorts of things from happening. If there is knowledge spread through reputable networks about who is doing these things, it becomes much easier for other people to stop them happening. At the moment, the experience of using the internet must bear some similarity to walking down a Victorian street in London with your purse open. It really is all our responsibility to try to do something about this, since we now live so much of our life online. I very much look forward to my noble friend’s response.

Viscount Colville of Culross (CB)

My Lords, I had the great privilege of serving as a member of this House’s Fraud Act 2006 and Digital Fraud Committee under the excellent chairing of the noble Baroness, Lady Morgan. She has already told us of the ghastly effects that fraud has on individuals and indeed its adverse effects on businesses. We heard really dramatic statistics, such as when Action Fraud told us that 80% of fraud is cyber-enabled.

Many of us here will have been victims of fraud—I have been a victim—or know people who have been victims of fraud. I was therefore very pleased when the Government introduced the fraudulent advertising provisions into the Bill, which will go some way to reducing the prevalence of online fraud. It seems to me that it requires special attention, which is what these amendments should do.

We heard in our inquiry about the problems that category 1 companies had in taking down fraudulent advertisements quickly. Philip Milton, the public policy manager at Meta, told us that it takes between 24 and 48 hours to review possibly harmful content after it has been flagged to the company. He recognised that, due to the deceptive nature of fraudulent advertising, Meta’s systems do not always recognise that advertising is fraudulent and, therefore, take-down rates would be variable. That is one of the most sophisticated tech platforms—if it has difficulties, just imagine the difficulty that other companies have in both recognising and taking down fraudulent advertising.

Again and again, the Bill recognises the difficulties that platforms have in systematising the protections provided in the Bill. Fraud has an ever-changing nature and is massively increasing—particularly so for fraudulent advertising. It is absolutely essential that the highest possible levels of transparency are placed upon the tech companies to report their response to fraudulent advertising. Both Ofcom and users need to be assured that not only do the companies have the most effective reporting systems but, just as importantly, they have the most effective transparency to check how well they are performing.

To do this, the obligations on platforms must go beyond the transparency reporting requirements in the Bill. These amendments would ensure that they include obligations to provide information on the incidence of fraudulent advertising, in line with other types of priority illegal content. These increased obligations are part of checking the effectiveness of the Bill when it is implemented.

The noble Baroness, Lady Stowell, told us on the fifth day of Committee, when talking about the risk-assessment amendments she had tabled:

“They are about ensuring transparency to give all users confidence”.—[Official Report, 9/5/23; col. 1755.]


Across the Bill, noble Lords have repeatedly stated that there needs to be a range of ways to judge how effectively the protections provided are working. I suggest to noble Lords that these amendments are important attempts to help make the Bill more accountable and provide the data to future-proof the harms it is trying to deal with. As we said in the committee report:

“Without sufficient futureproofing, technology will most likely continue to create new opportunities for fraudsters to target victims”.


I ask the Minister to at least look at some of these amendments favourably.

Baroness Kidron (CB)

My Lords, I shall say very briefly in support of these amendments that in 2017, the 5Rights Foundation, of which I am the chair, published the Digital Childhood report, which in a way was the thing that put the organisation on the map. The report looked at the evolving capacity of children through childhood, what technology they were using, what happened to them and what the impact was. We are about to release the report again, in an updated version, and one of the things that is most striking is the introduction of fraud into children’s lives. At the point at which they are evolving into autonomous people, when they want to buy presents for their friends and parents on their own, they are experiencing what the noble Baroness, Lady Morgan, expressed as embarrassment, loss of trust and a sense of deserting confidence—I think that is probably the phrase. So I just want to put on the record that this is a problem for children also.

Lord Clement-Jones (LD)

My Lords, this has been an interesting short debate and the noble Baroness, Lady Morgan, made a very simple proposition. I am very grateful to her for introducing this so clearly and comprehensively. Of course, it is all about the way that platforms will identify illegal, fraudulent advertising and attempt to align it with other user-to-user content in terms of transparency, reporting, user reporting and user complaints. It is a very straightforward proposition.

First of all, however, we should thank the Government for acceding to what the Joint Committee suggested, which was that fraudulent advertising should be brought within the scope of the Bill. But, as ever, we want more. That is what it is all about and it is a very straightforward proposition which I very much hope the Minister will accede to.

We have heard from around the Committee about the growing problem and I will be very interested to read the report that the noble Baroness, Lady Kidron, was talking about, in terms of the introduction of fraud into children’s lives—that is really important. The noble Baroness, Lady Morgan, mentioned some of the statistics from Clean Up the Internet, Action Fraud and so on, as did the noble Viscount, Lord Colville. And, of course, it is now digital. Some 80% of fraud, as he said, is cyber-enabled, and 23% of all reported frauds are initiated on social media—so this is bang in the area of the Bill.

It has been very interesting to see how some of the trade organisations, the ABI and others, have talked about the impact of fraud, including digital fraud. The ABI said:

“Consumers’ confidence is being eroded by the ongoing proliferation of online financial scams, including those predicated on impersonation of financial service providers and facilitated through online advertising. Both the insurance and long-term savings sectors are impacted by financial scams perpetrated via online paid-for advertisements, which can deprive vulnerable consumers of their life savings and leave deep emotional scars”.


So, this is very much a cross-industry concern and very visible to the insurance industry and no doubt to other sectors as well.

I congratulate the noble Baroness, Lady Morgan, on her chairing of the fraud committee and on the way it came to its conclusions and scrutinised the Bill. Paragraphs 559, 560 and 561 all set out where the Bill needs to be aligned to the other content that it covers. As she described, there are two areas where the Bill can be improved. If they are not cured, they will substantially undermine its ability to tackle online fraud effectively.

This has the backing of Which? As the Minister will notice, it is very much a cross-industry and consumer body set of amendments, supporting transparency reporting and making sure that those platforms with more fraudulent advertising make proportionately larger changes to their systems. That is why there is transparency reporting for all illegal harms that platforms are obliged to prevent. There is no reason why advertising should be exempt. On user reporting and complaints, it is currently unclear whether this applies only to illegal user-generated content and unpaid search content or if it also applies to illegal fraudulent advertisements. At the very least, I hope the Minister will clarify that today.

Elsewhere, the Bill requires platforms to allow users to complain if the platform fails to comply with its duties to protect users from illegal content and with regard to the content-reporting process. I very much hope the Minister will accede to including that as well.

Some very simple requests are being made in this group. I very much hope that the Minister will take them on board.

Lord Stevenson of Balmacara (Lab)

It is the simple requests that always seem to evade the easy solutions. I will not go back over the very good introductory speech from the noble Baroness, which said it all; the figures are appalling and the range of fraud-inspired criminality is extraordinary. It plays back to a point we have been hammering today: if this Bill is about anything, it is the way the internet amplifies that which would be unpleasant anyway but will now reach epidemic proportions.

I wonder whether that is the clue to the problem the noble Baroness was commenting on—I think more in hope than in having any way to resolve it. It is great news that three Bills are doing all the stuff we want. We have talked a bit about three-legged stools; this is another one that might crash over. If we are not careful, it will slip through the cracks. I am mixing my metaphors again.

If the Minister would not mind a bit of advice, it seems to me that this Bill could do certain things and do them well. It should not hold back and wait for the others to catch up or do things differently. The noble Baroness made the point about the extraordinarily difficult to understand gap, in that what is happening to priority illegal content elsewhere in the Bill does not apply to this, even though it is clearly illegal activity. I understand that there is a logical line that it is not quite the same thing—that the Bill is primarily about certain restricted types of activity on social media and not the generality of fraud—but surely the scale of the problem and our difficulty in cracking down on it, by whatever routes and whatever size of stool we choose, suggest that we should do what we can in this Bill and do it hard, deeply and properly.

Secondly, we have amendments later in Committee on the role of the regulators and the possibility recommended by the Communications and Digital Committee that we should seek statutory backing for regulation in this area. Here is a classic example of more than two regulators working to achieve the same end that will probably bump into each other on the way. There is no doubt that the FCA has primary responsibility in this area, but the reality is that the damage is being done by the amplification effect within the social media companies.

18:45
It may or may not be correct, in terms of what we are doing, to restrict what the Bill does to those aspects of user-to-user content and other areas. If something is illegal, surely the Bill should be quite clear that it should not be happening and Ofcom should have the necessary powers, however we frame them, to make sure we follow this through to the logical conclusion. The most-needed powers are the ability for Ofcom to take the lead, if required, in relation to the other regulators who have an impact on this world—can we be sure that is in the Bill and can be exercised?—and to make sure that the transparency, the user reporting and the complaints issues that are so vital to cracking this in the medium term get sorted. I leave that with the Minister to take forward.
Lord Parkinson of Whitley Bay (Con)

I am grateful to my noble friends for their amendments in this group, and for the useful debate that we have had. I am grateful also to my noble friend Lady Morgan of Cotes and the members of her committee who have looked at fraud, and for the work of the Joint Committee which scrutinised the Bill, in earlier form, for its recommendations on strengthening the way it tackles fraud online. As the noble Lord, Lord Clement-Jones, said, following those recommendations, the Government have brought in new measures to strengthen the Bill’s provisions to tackle fraudulent activity on in-scope services. I am glad he was somewhat satisfied by that.

All in-scope services will be required to take proactive action to tackle fraud facilitated through user-generated content. In addition, the largest and most popular platforms have a stand-alone duty to prevent fraudulent paid-for advertising appearing on their services. This represents a major step forward in ensuring that internet users are protected from scams, which have serious financial and psychological impacts, as noble Lords noted in our debate. Fully addressing the challenges of paid-for advertising is a wider task than is possible through the Bill alone. Advertising involves a broad range of actors not covered by the current legislative framework, such as advertising intermediaries. I am sympathetic to these concerns and the Government are taking action in this area. Through the online advertising programme, we will deliver a holistic review of the regulatory framework in relation to online advertising. The Government consulted on this work last year and aim to publish a response erelong. As the noble Lord, Lord Stevenson, and others noted, there are a number of Bills which look at this work. Earlier this week, there was a meeting hosted by my noble friends Lord Camrose, Lord Sharpe of Epsom and Lady Penn to try to avoid the cracks opening up between the Bills. I am grateful to my noble friend Lady Morgan for attending; I hope it was a useful discussion.

I turn to the amendments tabled by my noble friend. The existing duties on user reporting and user complaints have been designed for user-generated content and search content and are not easily applicable to paid-for advertising. The duties on reporting and complaints mechanisms require platforms to take action in relation to individual complaints, but many in-scope services do not have control over the paid-for advertising on their services. These amendments would therefore be difficult for many in-scope services to operate and would create a substantial burden for small businesses. I assure her and other noble Lords that the larger services, which have strong levers over paid-for advertising, will have to ensure that they have processes in place to enable users to report fraudulent advertising.

In reference to transparency reporting, let me assure my noble friend and others that Ofcom can already require information about how companies comply with their fraudulent advertising duties through transparency reports. In addition, Ofcom will have the power to gather any information it requires for the purpose of exercising its online safety functions. These powers are extensive and will allow Ofcom to assess compliance with the fraudulent advertising duties.

The noble Viscount, Lord Colville of Culross, asked about the difficulty of identifying fraudulent advertising. Clauses 170 and 171 set out how providers are to make judgments about content, including fraudulent advertising, and place a duty on Ofcom to produce guidance on such judgments. There will also be a code of practice on fraudulent advertising to provide further guidance on mechanisms to deal with this important issue.

My noble friend Lord Lucas’s Amendments 94 and 95 aim to require services to report information relating to fraudulent advertising to UK authorities. I am confident that the Bill’s duties will reduce the prevalence of online fraud, reducing the need for post hoc reporting in this way. If fraud does appear online, there are adequate systems in place for internet users to report this to the police.

People can report a scam to Action Fraud, the national reporting service for fraud and cybercrime. Reports submitted to Action Fraud are considered by the National Fraud Intelligence Bureau and can assist a police investigation. Additionally, the Advertising Standards Authority has a reporting service for reporting online scam adverts, and those reports are automatically shared with the National Cyber Security Centre.

The online advertising programme, which I mentioned earlier, builds on the Bill’s fraudulent advertising duty and looks at the wider online advertising system. That programme is considering measures to increase accountability and transparency across the supply chain, including proposals for all parties to enhance record keeping and information sharing.

My noble friend Lord Lucas was keen to meet to speak further. I will pass that request to my noble friend Lord Sharpe of Epsom, who I think would be the better person to talk to in relation to this on behalf of the Home Office—but I am sure that one of us will be very happy to talk with him.

I look forward to discussing this issue in more detail with my noble friend Lady Morgan and others between now and Report, but I hope that this provides sufficient reassurance on the work that the Government are doing in this Bill and in other ways. I invite my noble friends not to press their amendments.

Lord Lucas (Con)

My Lords, I am grateful to my noble friend for replying to my amendments and for his offer of a meeting, which I will certainly accept when issued.

The Government are missing some opportunities here. I do not know whether he has tried reporting something to Action Fraud, but if you have not lost money you cannot do it; you need to have been gulled and lost money for any of the government systems to take you seriously. While you can report something to the other ones, they do not tell you what they have done. There is no feedback or mechanism for encouraging and rewarding you for reporting—it is a deficient system.

When it comes to job adverts, by and large they go through job boards. There is a collection of people out there who are not direct internet providers who have leverage, and a flow of data to them can make a huge difference; there may also be other areas. It is that flow of data that enables job scams to be piled down on, and that is what the Bill needs to improve. Although the industry as a whole is willing, there just is not the impetus at the moment to make prevention nearly as good as it should be.

Baroness Morgan of Cotes (Con)

My Lords, I thank my noble friend the Minister very much indeed for his response. Although this has been a short debate, it is a good example of us all just trying to get the Bill to work as well as possible—in this case to protect consumers, but there will be other examples as well.

My noble friend said that the larger services in particular are the ones that are going to have to deal with fraudulent advertisements, so I think the issue about the burdens of user reporting does not apply. I remind him of the paragraph I read out from the Fraud Strategy, where the Government themselves say that they want to make the reporting of fraud online as easy as possible. I will read the record of what he said very carefully, but it might be helpful after that to have a further conversation or perhaps for him to write to reassure those outside this Committee who are looking for confirmation about how transparency reporting, user reporting and complaints will actually apply in relation to fraudulent advertisements, so that this can work as well as possible.

On that basis, I will withdraw my amendment for today, but I think we would all be grateful for further discussion and clarification so that this part of the Bill works as well as possible to protect people from any kind of fraudulent advertisement.

Amendment 53 withdrawn.
Clause 16 agreed.
Clause 17: Duties about complaints procedures
Amendments 53A to 55 not moved.
Clause 17 agreed.
House resumed.
House adjourned at 6.56 pm.

Online Safety Bill

Committee (7th Day)
Relevant document: 28th Report from the Delegated Powers Committee
15:19
Amendment 56
Moved by
56: After Clause 17, insert the following new Clause—
“OFCOM reviews of complaints systems
(1) Within the period of one year beginning on the day on which this Act is passed, and annually thereafter, OFCOM must review the workings of the complaints systems set up by regulated companies under section 17 (duties about complaints procedures), as to—
(a) their effectiveness;
(b) their cost and efficiency; and
(c) such other matters as seem appropriate.
(2) In undertaking the reviews under subsection (1), OFCOM may take evidence from such bodies and individuals as it considers appropriate.
(3) If OFCOM determines from the nature of the complaints being addressed, and the volumes of such complaints, that systems established under section 17 are not functioning as intended, it may establish an online safety ombudsman with the features outlined in subsections (4) to (8), with the costs of this service being met from the levy on regulated companies.
(4) The purpose of the online safety ombudsman is to provide an impartial out-of-court procedure for the resolution of any dispute between—
(a) a user of a regulated user-to-user service, or a nominated representative for that user, and
(b) the regulated service provider,
in cases where complaints made under processes which are compliant with section 17 have not, in the view of the user (or their representative), been adequately addressed.
(5) The ombudsman must allow for a user (or their representative) who is a party to such a dispute to refer their case to the ombudsman if they are of the view that any feature or conduct of one or more provider of a regulated user-to-user service, which is relevant to that dispute, presents (or has presented) a material risk of—
(a) significant or potential harm;
(b) contravening a user’s rights, as set out in the Human Rights Act 1998, including freedom of expression; or
(c) failure to uphold terms of service.
(6) The ombudsman may make special provision for children, including (but not limited to) prioritisation of—
(a) relevant provisions under the United Nations Convention on the Rights of the Child; or
(b) a child’s physical, emotional or psychological state.
(7) The ombudsman must have regard to the desirability of any dispute resolution service provided by the ombudsman being—
(a) free;
(b) easy to use, including (where relevant) taking into account the needs of vulnerable users and children;
(c) effective and timely;
(d) fair and flexible, taking into account different forms of technology and the unique needs of different types of user; and
(e) transparent.
(8) The Secretary of State must ensure that use of any dispute resolution service provided by the ombudsman does not affect the ability of a user (or their representative) to bring a claim in civil proceedings.”
Member’s explanatory statement
This new Clause would require Ofcom to conduct regular reviews of the effectiveness of complaints procedures under Clause 17. If Ofcom were of the view that such procedures were not functioning effectively, they would be able to establish an online safety ombudsman with the features outlined in subsections (4) to (8) of the Clause.
Lord Stevenson of Balmacara (Lab)

My Lords, Amendment 56 proposes a pathway towards setting up an independent ombudsman for the social media space. It is in my name, and I am grateful to the noble Lord, Lord Clement-Jones, for his support. For reasons I will go into, my amendment is a rather transparent and blatant attempt to bridge a gap with the Government, who have a sceptical position on this issue, and I hope that the amendment in its present form will prove more attractive to them than our original proposal.

At the same time, the noble Baroness, Lady Newlove, has tabled an amendment on this issue, proposing an independent appeals mechanism

“to provide impartial out of court resolutions for individual users of regulated services”.

Given that this is almost exactly what I want to see in place—as was set out in my original amendment, which was subsequently rubbished by the Government—I have also signed the noble Baroness’s amendment, and I very much look forward to her speech. The Government have a choice.

The noble Baroness, Lady Fox, also has amendments in this group, although they are pointing in a slightly different direction. I will not speak to them at this point in the proceedings, although I make it absolutely clear that, while I look forward to hearing her arguments—she is always very persuasive—I support the Bill’s current proposals on super-complaints.

Returning to the question of why we think the Bill should make provision for an independent complaints system or ombudsman, I suppose that, logically, we ought first to hear the noble Baroness, Lady Newlove, then listen to the Government’s response, which presumably will be negative. My compromise amendment could then be considered and, I hope, win the day with support from all around the Committee—in my dreams.

We have heard the Government’s arguments already. As the Minister said in his introduction to the Second Reading debate all those months ago on 1 February 2023, he was unsympathetic. At that time, he said:

“Ombudsman services in other sectors are expensive, often underused and primarily relate to complaints which result in financial compensation. We find it difficult to envisage how an ombudsman service could function in this area, where user complaints are likely to be complex and, in many cases, do not have the impetus of financial compensation behind them”.—[Official Report, 1/2/23; col. 690.]


Talk about getting your retaliation in first.

My proposal is based on the Joint Committee’s unanimous recommendation:

“The role of the Online Safety Ombudsman should be created to consider complaints about actions by higher risk service providers where either moderation or failure to address risks leads to … demonstrable harm (including to freedom of expression) and recourse to other routes of redress have not resulted in a resolution”.


The report goes on to say that there could

“be an option in the Bill to extend the remit of the Ombudsman to lower risk providers. In addition … the Ombudsman would as part of its role i) identify issues in individual companies and make recommendations to improve their complaint handling and ii) identify systemic industry wide issues and make recommendations on regulatory action needed to remedy them. The Ombudsman should have a duty to gather data and information and report it to Ofcom. It should be an ‘eligible entity’ to make super-complaints”

possible. It is a very complicated proposal. Noble Lords will understand from the way the proposal is framed that it would provide a back-up to the primary purpose of complaints, which must be to the individual company and the service it is providing. But it would be based on a way of learning from experience, which it would build up as time went on.

I am sure that the noble Lord, Lord Clement-Jones, will flesh out the Joint Committee’s thinking on this issue when he comes to speak, but I make the point that other countries preparing legislation on online safety are in fact building in independent complaints systems; we are an outlier on this. Australia, Canada and others have already legislated. Another very good example nearer to hand is in Ireland. We are very lucky to have with us today the noble Baroness, Lady Kidron, a member of the expert panel whose advice to the Irish Government to set up such a system in her excellent report in May 2022 has now been implemented. I hope that she will share her thoughts about these amendments later in the debate.

Returning to the Government’s reservations about including an ombudsman service in the Bill, I make the following points based on my proposals in Amendment 56. There need not be any immediate action. The amendment as currently specified requires Ofcom to review complaints systems set up by the companies under Clause 17 as to their effectiveness and efficiency. It asks Ofcom to take other evidence into account and then, and only then, to take the decision of whether to set up an ombudsman system. If there were no evidence of a need for such a service, it would not happen.

As for the other reservations raised by the Minister when he spoke at Second Reading, he said:

“Ombudsman services in other sectors are expensive”.


We agree, but we assume that this would be on a cost recovery model, as other Ofcom services are funded in that way. The primary focus will always be resolving complaints about actions or inactions of particular companies in the companies’ own redress systems, and Ofcom can always keep that under review.

He said that they are “often underused”. Since we do not know at the start what the overall burden will be, we think that the right solution is to build up slowly and let Ofcom decide. There are other reasons why it makes sense to prepare for such a service, and I will come to these in a minute.

He said that other ombudsman services

“primarily relate to complaints which result in financial compensation”.

That is true, but the evidence from other reports, and that we received in the Joint Committee, was that most complainants want non-financial solutions: they want egregious material taken down or to ensure that certain materials are not seen. They are not after the money. Where a company is failing to deliver on those issues in their own complaints system, to deny genuine complainants an appeal to an independent body seems perverse and not in accordance with natural justice.

He said that

“user complaints are likely to be complex”.—[Official Report, 1/2/23; col. 690.]

Yes, they probably are, but that seems to be an argument for an independent appeals body, not against it.

To conclude, we agree that Ofcom should not be the ombudsman and that the right approach is for Ofcom to set up the system as and when it judges that it would be appropriate. We do not want Ofcom to be swamped with complaints from users of regulated services, who, for whatever reason, have not been satisfied by the response of the individual companies or to complex cases, or seek system-wide solutions. But Ofcom needs to know what is happening on the ground, across the sector, as well as in each of the regulated companies, and it needs to be kept aware of how the system as a whole is performing. The relationship between the FCA and the Financial Ombudsman Service is a good model here. Indeed, the fact that some of the responsibilities to be given to Ofcom in the Bill will give rise to complaints to the FOS suggests that there would be good sense in aligning these services right from the start.

We understand that the experience from Australia is that the existence of an independent complaints function can strengthen the regulatory functions. There is also evidence that the very existence of an independent complaints mechanism can provide reassurances to users that their online safety is being properly supported. I beg to move.

Baroness Newlove (Con)

My Lords, this is the first time that I have spoken in Committee. I know we have 10 days, but it seems that we will go even further because this is so important. I will speak to Amendments 250A and 250B.

I thank the noble Lords, Lord Russell of Liverpool and Lord Stevenson of Balmacara, and, of course— if I may be permitted to say so—the amazing noble Baroness, Lady Kidron, who is an absolute whizz on this, for placing their names on these amendments, as well as the 5Rights Foundation, the Internet Watch Foundation and the UK Safer Internet Centre for their excellent briefings. I have spoken to these charities, and the work they do is truly amazing. I do not think that the Bill will recognise just how much time and energy they give to support families and individuals. Put quite simply, we can agree that services’ internal complaint mechanisms are failing.

Let me tell your Lordships about Harry. Harry is an autistic teenager who was filmed by a member of the public in a local fast-food establishment when he was dysregulated and engaging in aggressive behaviour. This footage was shared out of context across social media, with much of the response online labelling Harry as a disruptive teenager who was engaging in unacceptable aggression and vandalising public property. This was shared thousands of times over the course of a few weeks. When Harry and his mum reported it to the social media platforms, they were informed that it did not violate community guidelines and that there was a public interest in the footage remaining online. The family, quite rightly, felt powerless. Harry became overwhelmed at the negative response to the footage and the comments made about his behaviour. He became withdrawn and stopped engaging. He then tried to take his own life.

15:30
It was at this point that Harry’s mum reached out to the voluntary-run service Report Harmful Content, as she had nowhere else to turn. Report Harmful Content is run by the charity South West Grid for Learning. It was able to mediate between the social media sites involved to further explain the context and demonstrate the real-world harm that this footage, by remaining online, was having on the family and on Harry’s mental health. Only then did the social media companies concerned remove the content.
Sadly, Harry’s story is not an exception. In 2022, where a platform initially refused to take down content, Report Harmful Content successfully arbitrated the removal of content in 87% of cases, thus demonstrating that even if the content did not violate community guidelines, it was clear that harm had been done. There are countless cases of members of the public reporting a failure to remove content that was bullying them. This culture of inaction has led to apathy and a disbelief among users that their appeals will ever be redressed. Research published by the Children’s Commissioner for England found that 40% of children did not report harmful content because they felt that there
“was no point in doing so”.
The complaints mechanism in the video-sharing platform regulation regime is being repealed without an alternative mechanism to fill the gap. The current video-sharing platform regulation requires platforms to
“provide for an impartial out-of-court procedure for the resolution of any dispute between a person using the service and the provider”
—that is, to operate impartial dispute resolution in the event of a dispute. In its review of the first year of this regulation, Ofcom highlighted that the requirements imposed on platforms in scope are currently not being met in full. However, instead of strengthening existing appeals processes, the VSP regime is set to be repealed and superseded by this Bill.
The Online Safety Bill does not have an individual appeals process, meaning that individuals will be left without an adequate pathway to redress. The Bill establishes only a “super-complaints” process for issues concerning multiple cases or cases highlighting a systemic risk. It will ultimately fall to the third sector to highlight cases to Ofcom on behalf of individuals.
The removal of an appeals process—given the repeal of the VSP regime—would be in stark contrast with the direction of travel in other nations. Independent appeals processes exist in Australia and New Zealand, and more countries are also looking at adopting independent appeals. The new Irish Online Safety and Media Regulation Act includes provision
“for the making of a complaint to the Commission”.
The Digital Services Act in Europe also puts a process in place. There is precedent for these systems. It cannot be right that the Republic of Ireland and the UK and its territories have over 52 ombudsmen in 32 sectors, yet none of them works in digital at a time when online harm—especially to children, as we hear time and again in your Lordships’ House—is at unprecedented levels.
The Government’s response so far has been insufficient. When the Online Safety Bill received its seventh sitting debate, much discussion related to independent appeals, referred to here as the need for an ombudsman. The Digital Minister recognised:
“In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast”.
Dame Maria Miller MP said that
“it is not a good argument to say that this is such an enormous problem that we cannot have a process in place to deal with it”.—[Official Report, Commons, Online Safety Bill Committee, 9/6/22; col. 295-96.]
In response to the Joint Committee’s recommendation for an ombudsman, the Government said:
“An independent resolution mechanism such as an Ombudsman is relatively untested in areas of non-financial harm. Therefore, it is difficult to know how an Ombudsman service could function where user complaints are likely to be complex and where financial compensation is not usually appropriate. An Ombudsman service may also disincentivise services from taking responsibility for their users’ safety. Introducing an independent resolution mechanism at the same time as the new regime may also pose a disproportionate regulatory burden for services and confuse users … The Secretary of State will be able to reconsider whether independent resolution mechanisms are appropriate at the statutory review. Users will also already have a right of action in court if their content is removed by a service provider in breach of the terms and conditions. We will be requiring services to specifically state this right of action clearly in their terms and conditions”.
Delaying the implementation of an individual appeals process will simply increase the backlog of cases and will allow for the ripple effect of harm to go unreported, unaddressed and unaccounted for.
There is precedent for individual complaints systems, as I have mentioned, both in the UK and abroad. Frankly, the idea that an individual complaints process will disincentivise companies from taking responsibility does not hold weight, given that these companies’ current appeal mechanisms are woefully inadequate. Users must not be left to the courts to have their appeals addressed. This process is cost-prohibitive for most and cannot be the only pathway to justice for victims, especially children.
To conclude, I have always personally vowed to speak up for those who endure horrific suffering and injustices from tormentors. I know how the pain and trauma feel when the systems that have been set up are no longer fit for purpose. I therefore say this to my noble friend the Minister: nothing is too difficult if you really want to find a solution. The public have asked for this measure and there is certainly wide precedent for it. By not allowing individuals an appeals process, the Government’s silence simply encourages the tormentors and leaves the tormented alone.
Baroness Kidron (CB)

My Lords, I will speak in support of Amendments 250A and 250B; I am not in favour of Amendment 56, which is the compromise amendment. I thank the noble Baroness, Lady Newlove, for setting out the reasons for her amendments in such a graphic form. I declare an interest as a member of the Expert Group on an Individual Complaints Mechanism for the Government of Ireland.

The day a child or parent in the UK has a problem with an online service and realises that they have nowhere to turn is the day that the online safety regime will be judged to have failed in the eyes of the public. Independent redress is a key plank of any regulatory system. Ombudsmen and independent complaint systems are available across all sectors, from finance and health to utilities and beyond. As the noble Lord, Lord Stevenson, set out, they are part of all the tech regulation that has been, or is in the process of being, introduced around the world.

I apologise in advance if the Minister is minded to agree to the amendment, but given that, so far, the Government have conceded to a single word in a full six days in Committee, I dare to anticipate that that is not the case and suggest three things that he may say against the amendment: first, that any complaints system will be overwhelmed; secondly, that it will offer a get-out clause for companies from putting their own robust systems in place; and, thirdly, that it will be too expensive.

The expert group of which I was a member looked very carefully at each of these questions and, after taking evidence from all around the globe, it concluded that the system need not be overwhelmed if it had the power to set clear priorities. In the case of Ireland, those priorities were complaints that might result in real-world violence and complaints from or on behalf of children. The expert group also determined that the individual complaints system should be

“afforded the discretion to handle and conclude complaints in the manner it deems most appropriate and is not unduly compelled toward or statutorily proscribed to certain courses of action in the Bill”.

For example, there was a lot of discussion on whether it could decide not to deal with copycat letters, treat multiple complaints on the same or similar issue as one, and so on.

Also, from evidence submitted during our deliberations, it became clear that many complainants have little idea of the law and that many complaints should be referred to other authorities, so among the accepted recommendations was that the individual complaints system should be

“provided with a robust legal basis for transferring or copying complaints to other bodies as part of the triage process”—

for example, to the data regulator, police, social services and other public bodies. The expert group concluded that this would actually result in better enforcement and compliance in the ecosystem overall.

On the point that the individual complaints mechanism may have the unintended consequence of making regulated services lazy, the expert group—which, incidentally, comprised a broad group of specialisms such as ombudsmen, regulators and legal counsel among others—concluded that it was important for the regulator to set a stringent report and redress code of practice for regulated companies so that it was not possible for any company to just sit back until people were so fed up that they went to the complaints body. The expert group specifically said in its report that it

“is acutely aware of the risk of … the Media Commission … drawing criticism for the failings of the regulated entities to adequately comply with systemic rules. In this regard, an individual complaints mechanism should not be viewed as a replacement for the online platforms’ complaint handling processes”.

Indeed, the group felt that an individual complaints system complemented the powers given to the regulator, which could and should take enforcement against those companies that persistently fail to introduce an adequate complaints system—not least because the flow of complaints would act as an early warning system of emerging harms, which is of course one of the regulator’s duties under the Bill.

When replying to a question from the noble Lord, Lord Knight of Weymouth, last week about funding digital literacy, the Minister made it clear that the online safety regime would be self-financing via the levy. In which case, it does not seem to be out of proportion to have a focused and lean system in which the urgent, the vulnerable and the poorly served have somewhere to turn.

The expert group’s recommendation was accepted in full by Ireland’s Minister for Media, Culture and Tourism, Catherine Martin, who said she would

“always take the side of the most vulnerable”

and the complaint system would deal with people who had

“exhausted the complaints handling procedures by any online services”.

I have had the pleasure of talking to its new leadership in recent weeks, and it is expected to be open for business in 2024.

I set that out at length just to prove that it is possible. It was one of the strong recommendations of the pre-legislative committee, and had considerable support in the other place, as we have heard. I think both Ofcom and DSIT should be aware that many media outlets have not yet clocked that this complicated Bill is so insular that the users of tech have no place to go and no voice.

While the Bill can be pushed through without a complaints system, this leaves it vulnerable. It takes only one incident or a sudden copycat rush of horrors, which have been ignored or trivialised by the sector with complainants finding themselves with nowhere to go but the press, to undermine confidence in the whole regulatory edifice.

15:45
So I have three questions for the Minister. The first two are on the VSP regime which, as was set out by the noble Baroness, Lady Newlove, is being cancelled by the Bill. First, could the Minister confirm to the Committee that the VSP complaints system has done nothing useful since it was put in place? Therefore, was the decision to repeal it based on its redundancy?
Secondly, if the system has indeed been deemed redundant, is that because of a failure of capacity or implementation by Ofcom—this is crucial for the Committee to understand, as Ofcom is about to take on the huge burden of this Bill—or is it because all the companies within the regime are now entirely compliant?
Thirdly, once a child or parent has exhausted a company’s complaints system, where, under the Bill in front of us, do the Government think they should go?
I have not yet heard from the noble Baroness, Lady Fox, on her amendments, so I reserve the right to violently agree with her later, but I simply do not understand her reasoning for scrapping super-complaints from the Bill. Over the last six days in Committee, the noble Baroness has repeatedly argued that your Lordships must be wary about putting too much power in the hands of the Government or the tech sector, yet here we have a mechanism that allows users a route to justice that does not depend on their individual wealth. Only those with deep pockets and the skin of a rhinoceros can turn to the law as individuals. A super-complaints system allows interested parties, whether from the kids sector or the Free Speech Union, to act on behalf of a group. As I hope I have made clear, this is additional to, not instead of, an individual complaints system, and I very much hoped to have the noble Baroness’s support for both.
Lord Russell of Liverpool (CB)

My Lords, I also put my name to Amendments 250A and 250B, but the noble Baronesses, Lady Newlove and Lady Kidron, have done such a good job that I shall be very brief.

To understand the position that I suspect the Government may put forward, I suggest one looks at Commons Hansard and the discussion of this in the seventh Committee sitting of 9 June last year. To read it is to descend into an Alice in Wonderland vortex of government contradictions. The then Digital Minister—a certain Chris Philp, who, having been so effective as Digital Minister, was promoted, poor fellow, to become a Minister in the Home Office; I do not know what he did to deserve that—in essence said that, on the one hand, this problem is absolutely vast, and we all recognise that. When responding to the various Members of the Committee who put forward the case for an independent appeals mechanism, he said that the reason we cannot have one is that the problem is too big. So we recognise that the problem is very big, but we cannot actually do anything about it, because it is too big.

I got really worried because he later did something that I would advise no Minister in the current Government ever to do in public. Basically, he said that this

“groundbreaking and world-leading legislation”—[Official Report, Commons, Online Safety Bill Committee, 9/6/22; col. 296.]

will fix this. If I ruled the world, if any Minister in the current Government said anything like that, they would immediately lose the Whip. The track record of people standing up and proudly boasting how wonderful everything is going to be, compared to the evidence of what actually happens, is not a track record of which to be particularly proud.

I witnessed, as I am sure others did, the experience of the noble Baroness, Lady Kidron, pulling together a group of bereaved parents: families who had lost their child through events brought about by the online world. A point that has stayed with me from that discussion was the noble Baroness, Lady Kidron, who was not complaining, saying at the end that there is something desperately wrong with a system in which she ends up as the point person trying to help these people resolve their enormous difficulties with these huge companies. I remind noble Lords that the family of Molly Russell, aided by a very effective lawyer, took no less than five years to get Meta actually to produce what she had been looking at online. So the most effective complaints process, or ombudsman, was the fact that they were able to have a very able lawyer and an exceptionally able advocate in the shape of the noble Baroness, Lady Kidron, helping in any way she could. That is completely inadequate.

I looked at one of the platforms that currently helps individual users—parents—trying to resolve some of the complaints they have with companies. It is incredibly complicated. So relying on the platforms themselves to bring forward, under the terms of the Bill, completely detailed systems and processes to ensure that these things do not happen, or that if there is a complaint it will be followed up dutifully and quickly, does not exactly fill me with confidence, based on their previous form.

For example, as a parent or an individual, here are some of the questions you might have to ask yourself. How do I report violence or domestic abuse online? How do I deal with eating disorder content on social media? How do I know what is graphic content that does not breach terms? How do I deal with abuse online? What do I do as a UK citizen if I live outside the UK? It is a hideously complex world out there. On the one hand, bringing in regulations to ensure that the platforms do what they are meant to, and on the other hand charging Ofcom to act as the policeman to make sure that they are actually doing it, is heaping yet more responsibility on Ofcom. The noble Lord, Lord Grade, is showing enormous stamina sitting up in the corner; he is sitting where the noble Lord, Lord Young, usually sits, which is a good way of giving the Committee a good impression.

What I would argue to the Minister is that to charge Ofcom with doing too much leads us into dangerous territory. The benefit of having a proper ombudsman who deals with these sorts of complaints week in, week out, is exactly the same argument as if one was going to have a hip or a knee replacement. Would you rather have it done by a surgical team that does it once a year or one that does it several hundred times a year? I do not know about noble Lords, but I would prefer the latter. If we had an effective ombudsman service that dealt with these platforms day in, day out, they would be the most effective individuals to identify whether or not those companies were actually doing what they are meant to do in the law, because they would be dealing with them day in, day out, and would see how they were responding. They could then liaise with Ofcom in real time to tell it if some platforms were not performing as they should. I feel that that would be more effective.

The only question I have for the Minister is whether he would please agree to meet with us between now and Report to really go into this in more detail, because this is an open goal which the Government really should be doing something to try to block. It is a bit of a no-brainer.

Baroness Fox of Buckley (Non-Afl)

My Lords, I see my amendments as being probing. I am very keen on having a robust complaints system, including for individuals, and am open to the argument about an ombudsman. I am listening very carefully to the way that that has been framed. I tabled these amendments because while I know we need a robust complaints system—and I think that Ofcom might have a role in that—I would want that complaints system to be simple and as straightforward as possible. We certainly need somewhere that you can complain.

Ofcom will arguably be the most powerful regulator in the UK, effectively in charge of policing a key element of democracy: the online public square. Of course, one question one might ask is: how do you complain about Ofcom in the middle of it all? Ironically, an ombudsman might be somewhere where you would have to field more than just complaints about the tech companies.

I have suggested completely removing Clauses 150 to 152 from the Bill because of my reservations, beyond this Bill and in general, about a super-complaints system within the regulatory framework, which could be very unhelpful. I might be wrong, and I am open to correction if I have misunderstood, but the eligible entities that the Bill will allow to make these complaints to Ofcom seem, at the moment, to be appointed on criteria set only by the Secretary of State. That is a serious problem. There is a danger that the Secretary of State could be accused of partiality or politicisation. We therefore have to think carefully about that.

I also object to the idea that certain organisations are anointed with extra legitimacy as super-complaints bodies. We have seen this more broadly. You will often hear Ministers say, in relation to consultations, “We’ve consulted stakeholders and civil society organisations”, when they are actually often referring to lobbying organisations with interests. There is a free-for-all for NGOs and interest groups. We think of a lot of charities as very positive but they are not necessarily neutral. I just wanted to query that.

There is also a danger that organisations will end up speaking on behalf of all women, all children or all Muslims. That is something we need to be careful about in a time of identity politics. We have seen it happen offline with self-appointed community leaders, but say, for example, there is a situation where there is a demand by a super-complainant to remove a particular piece of content that is considered to be harmful, such as an image of the Prophet Muhammad. These are areas where we have to admit that if people then say, “We speak on behalf of”, they will cause problems.

Although charities historically have had huge credibility, as I said, we know from some of the scandals that have affected charities recently that they are not always the saviours. They are certainly not immune from corruption, political bias, political disputes and so on.

I suppose my biggest concern is that the function in the Bill is not open to all members of the public. That seems to be a problem. Therefore, we are saying that certain groups and individuals will have a greater degree of influence over the permissibility of speech than others. There are some people who have understood these clauses to mean that it would act like a class action—that if enough people are complaining, it must be a problem. But, as noble Lords will know from their inboxes, sometimes one is inundated with emails and it does not necessarily show a righteous cause. I do not know about anyone else who has been involved in this Bill, but I have had exactly the same cut-and-paste email about violence against women and girls hundreds of times. That usually means a well-organised, sometimes well-funded, mass mobilisation. I have no objection, but just because you get lots of emails it does not mean that it is a good complaint. If you get only one important email complaint that is written by an individual, surely you should respect that minority view.

Is it not interesting that the assumption of speakers so far has been that the complaints will always be that harms have not been removed or taken notice of? I was grateful when the noble Baroness, Lady Kidron, mentioned the Free Speech Union and recognised, as I envisage, that many of the complaints will be about content having been removed—they will be free speech complaints. Often, in that instance, it will be an individual whose content has been removed. I cannot see how the phrasing of the Bill helps us in that. Although I am a great supporter of the Free Speech Union, I do not want it to represent or act on behalf of, say, Index on Censorship or even an individual who simply thinks that their content should not be removed—and who is no less valid than an official organisation, however much I admire it.

I certainly do not want individual voices to be marginalised, which I fear the Bill presently does in relation to complaints. I am not sure about an ombudsman; I am always wary of creating yet another more powerful body in the land because of the danger of over-bureaucratisation.

16:00
Lord Allan of Hallam (LD)

My Lords, I had to miss a few sessions of the Committee but I am now back until the end. I remind fellow Members of my interests: I worked for one of the largest platforms for a decade, but I have no current interests. It is all in the register if people care to look. I want to contribute to this debate on the basis of that experience of having worked inside the platforms.

I start by agreeing with the noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, and my noble friend Lord Clement-Jones. The thrust of their amendments—the idea that something will be needed here—is entirely correct. We have created in the Online Safety Bill a mechanism that we in this Committee know is intended primarily to focus on systems and how Ofcom regulates them, but what the public out there hear is that we are creating a mechanism that will meet their concerns—and their concerns will not end with systems. As the noble Baroness, Lady Newlove, eloquently described, their concerns in some instances will be about specific cases and the question will be: who will take those up?

If there is no other mechanism and no way to signpost people to a place where they can seek redress, they will come to Ofcom. That is something we do not want. We want Ofcom to be focused on the big-ticket items of dealing with systems, not bogged down in dealing with thousands of individual complaints. So we can anticipate a situation in which we will need someone to be able to deal with those individual complaints.

I want to focus on making that workable, because the volume challenge might not be as people expect. I have seen from having worked on the inside that there is a vast funnel of reports, where people report content to platforms. Most of those reports are spurious or vexatious; that is the reality. Platforms have made their reporting systems easy, as we want them to do —indeed, in the Bill we say, “Make sure you have really easy-to-use reporting systems”—but one feature of that is that people will use them simply to express a view. Over the last couple of weeks, all the platforms will have been inundated with literally millions of reports about Turkish politicians. These will come from the supporters of either side, reporting people on the other side—claiming that they are engaged in hate speech or pornography or whatever. They will use whatever tool they can. That is what we used to see day in, day out: football teams or political groups that report each other. The challenge is to separate out the signal—the genuinely serious reports of where something is going wrong—from the vast amount of noise, of people simply using the reporting system because they can. For the ombudsman, the challenge will be that signal question.

Breaking that down, from the vast funnel of complaints coming in, we have a smaller subset that are actionable. Some of those will be substantive, real complaints, where the individual simply disagrees with the decision. That could be primarily for two reasons. The first is that the platform has made a bad decision and failed to enforce its own policies. For example, you reported something as being pornographic, and it obviously was, but the operator was having a bad day—they were tired, it was late in the day and they pressed “Leave up” instead of “Take down”. That happens on a regular basis, and 1% of errors like that across a huge volume means a lot of mistakes being made. Those kinds of issues, where there is a simple operator error, should get picked up by the platforms’ own appeal mechanisms. That is what they are there for, and the Bill rightly points to that. A second reviewer should look at it. Hopefully they are a bit fresher, understand that a mistake was made and can simply reverse it. Those operator error reports can be dealt with internally.

The second type would be where the platform enforces policies correctly but, from the complainant’s point of view, the policies are wrong. It may be a more pro-free speech platform where the person says, “This is hate speech”, but the platform says, “Well, according to our rules, it is not. Under our terms of service, we permit robust speech of this kind. Another platform might not, but we do”. In that case, the complainant is still unhappy but the platform has done nothing wrong—unless the policies the platform is enforcing are out of step with the requirements under the Online Safety Bill, in which case the complaint should properly come to Ofcom. Based on the individual complaint, a complainant may have something material for Ofcom. They are saying that they believe the platform’s policies and systems are not in line with the guidance issued by Ofcom—whether on hate speech, pornography or anything else. That second category of complaint would come to Ofcom.

The third class concerns the kind of complaint that the noble Baroness, Lady Newlove, described. In some ways, this is the hardest. The platform has correctly enforced its policies but, in a particular case, the effect is deeply unfair, problematic and harmful for an individual. The platform simply says, “Look, we enforced the policies. They are there. This piece of content did not violate them”. Any outsider looking at it would say, “There is an injustice here. We can clearly see that an individual is being harmed. A similar piece of content might not be harmful to another individual, but to this individual it is”. In those circumstances, groups such as the South West Grid for Learning, with which I work frequently, perform an invaluable task. We should recognise that there is a network of non-governmental organisations in the United Kingdom that do this day in, day out. Groups such as the Internet Watch Foundation and many others have fantastic relations and connections with the platforms and regularly bring exceptional cases to them.

Baroness Kidron (CB)

We are glad to have the noble Lord back. I want also to put on the record that the South West Grid for Learning is very supportive of this amendment.

Lord Allan of Hallam (LD)

It has let me know as well. In a way, the amendment seeks to formalise what is already an informal mechanism. I was minded initially to support Amendment 56 in the name of my noble friend Lord Clement-Jones and the noble Lord, Lord Stevenson.

This landscape is quite varied. We have to create some kind of outlet, as the noble Baroness, Lady Kidron, rightly said. That parent or individual will want to go somewhere, so we have to send them somewhere. We want that somewhere to be effective, not to get bogged down in spurious and vexatious complaints. We want it to have a high signal-to-noise ratio—to pull out the important complaints and get them to the platforms. That will vary from platform to platform. In some ways, we want to empower Ofcom to look at what is and is not working and to be able to say, “Platform A has built up an incredible set of mechanisms. It’s doing a good job. We’re not seeing things falling through the cracks in the same way as we are seeing with platform B. We are going to have to be more directive with platform B”. That very much depends on the information coming in and on how well the platforms are doing their job already.

I hope that the Government are thinking about how these individual complaints will be dealt with and about the demand that will be created by the Bill. How can we have effective mechanisms for people in the United Kingdom who genuinely have hard cases and have tried, but where there is no intermediary for the platform they are worried about? In many cases, I suspect that these will be newer or smaller platforms that have arrived on the scene and do not have established relationships. Where are these people to go? Who will help them, particularly in cases where the platform may not systemically be doing anything wrong? Its policies are correct and it is enforcing them correctly, but any jury of peers would say that an injustice is being done. Either an exception needs to be made or there needs to be a second look at that specific case. We are not asking Ofcom to do this in the rest of the legislation.

Baroness Harding of Winscombe (Con)

My Lords, it is always somewhat intimidating to follow the noble Lord, Lord Allan, though it is wonderful to have him back from his travels. I too will speak in favour of Amendments 250A and 250B in the name of my noble friend, from not direct experience in the social media world but tangentially, from telecoms regulation.

I have lived, as the chief executive of a business, in a world where my customers could complain to me but also to an ombudsman and to Ofcom. I say this with some hesitation, as my dear old friends at TalkTalk will be horrified to hear me quoting this example, but 13 years ago, when I took over as chief executive, TalkTalk accounted for more complaints to Ofcom than pretty much all the other telcos put together. We were not trying to be bad—quite the opposite, actually. We were a business born out of very rapid growth, both organic and acquisitive, and we did not have control of our business at the time. We had an internal complaints process and were trying our hardest to listen to it and to individual customers who were telling us that we were letting them down, but we were not doing that very well.

While my noble friend has spoken so eloquently about the importance of complaints mechanisms for individual citizens, I am actually in favour of them for companies. I felt the consequences of having an independent complaints system that made my business listen. It was a genuine failsafe system. For someone to have got as far as complaining to the telecoms ombudsman and to Ofcom, they had really lost the will to live with my own business. That forced my company to change. It has forced telecoms companies to change so much that they now advertise where they stand in the rankings of complaints per thousand customers. Even in the course of the last week, Sky was proclaiming in its print advertising that it was the least complained-about to the independent complaints mechanism.

So this is not about thinking that companies are bad and are trying to let their customers down. As the noble Lord, Lord Allan, has described, managing these processes is really hard and you really need the third line of defence of an independent complaints mechanism to help you deliver on your best intentions. I think most companies with very large customer bases are trying to meet those customers’ needs.

For very practical reasons, I have experienced the power of these sorts of systems. There is one difference with the example I have given of telecoms: it was Ofcom itself that received most of those complaints about TalkTalk 13 years ago, and I have tremendous sympathy with the idea that we might unleash on poor Ofcom all the social media complaints that are not currently being resolved by the companies. That is exactly why, as Dame Maria Miller said, we need to set up an independent ombudsman to deal with this issue.

From a very different perspective from that of my noble friend, I struggle to understand why the Government do not want to do what they have just announced they want to do in other sectors such as gambling.

Lord Clement-Jones (LD)

My Lords, I had better start by declaring an interest. It is a great pleasure to follow the noble Baroness, Lady Harding, because my interest is directly related to the ombudsman she has just been praising. I am chairman of the board of the Trust Alliance Group, which runs the Energy Ombudsman and the telecoms ombudsman. The former was set up under the Consumers, Estate Agents and Redress Act 2007 and the latter under the Communications Act 2003.

Having got that off my chest, I do not have to boast about the efficacy of ombudsmen; they are an important institution, they take the load off the regulator to a considerable degree and they work closely with the participating companies in the schemes they run. On balance, I would prefer the Consumers, Estate Agents and Redress Act scheme because it involves a single ombudsman, but both those ombudsmen demonstrate the benefit in their sectors.

The noble Lord, Lord Stevenson, pretty much expressed the surprise that we felt when we read the Government’s response to what we thought was a pretty sensible suggestion in the Joint Committee’s report. He quoted it, and I am going to quote it again because it is such an extraordinary statement:

“An independent resolution mechanism such as an Ombudsman is relatively untested in areas of non-financial harm”.

If you look at the ones for which I happen to have some responsibility, and at the other ombudsmen—there is a whole list we could go through: the Legal Ombudsman, the Local Government and Social Care Ombudsman, the Parliamentary and Health Service Ombudsman—there are a number who are absolutely able to take a view on non-financial matters. It is a bit flabbergasting, if that is a parliamentary expression, to come across that kind of statement in a government response.

16:15
There has been distilled wisdom during the course of this debate. Although there may be differences of view about whether we have half a loaf or a full loaf, what is clear is that we are all trying to head in the same direction, which is to have an ombudsman for complaints in this sector. We need to keep reminding everybody that this is not a direct complaints system: it is a secondary complaints system, and you have to have exhausted the complaints within the social media platform. My noble friend described some of the complexity of that extremely well, and I thank the noble Baroness, Lady Kidron, for setting out some of the complexities and the views of the expert group.
I mentioned the Government’s response to the Joint Committee, but as the noble Baroness, Lady Newlove, said, we already have an independent appeals system in the video-sharing platform legislation. Why are we going backwards in this Bill? We should be more comprehensive as a result of this. This Bill is set to dismantle an essential obligation that supports victims of online harm on video-sharing platforms. We have to be more comprehensive, not less. The South West Grid for Learning’s independent appeals process, Report Harmful Content (RHC), which has been mentioned already today, highlights that a significant number of responses received by victims of harmful content from industry platforms were initially incorrect, and RHC was able to resolve them.
Some of these misunderstandings are not necessarily complaints that need adjudication; sometimes it is actually miscommunication. We have heard during the debate that other countries are already doing this; several noble Lords have mentioned Ireland, Australia and New Zealand. We need that ability also, and this is where I disagree with the noble Baroness, Lady Fox, although there was a nuance in her argument. It was a probing amendment—how about that? The fact that representative organisations can defend users’ rights for large-scale breaches of the law is very important, and I was rather surprised by her criticism of the fact that lobby groups can bring action. Orchestrating complaints is the nature of group or class actions in litigation, so the question is begging to be tested. Those complaints need to be tested, and it is perfectly legitimate for a group to bring those complaints.
The noble Baroness, Lady Newlove, mentioned the research published by the Children’s Commissioner which showed that 40% of children did not report harmful content because they felt there was no point in doing so. That is pretty damning of the current situation. The noble Lord, Lord Russell, and my noble friend made the very strong point that we do not want to bog down the regulator. We see what happens under the data protection legislation. The ICO has an enormous number of complaints to deal with directly, without the benefit of an ombudsman. This scheme could alleviate the burden on the regulator and be highly effective. I do not think we have heard an argument in Committee against this; it must be the way forward. I very much hope that the Minister will take this forward after today and install an ombudsman for the Bill.
The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Viscount Camrose) (Con)

My Lords, the amendments in this group are concerned with complaints mechanisms. I turn first to Amendment 56 from the noble Lord, Lord Stevenson of Balmacara, which proposes introducing a requirement on Ofcom to produce an annual review of the effectiveness and efficiency of platforms’ complaints procedures. Were this review to find that regulated services were not complying effectively with their complaints procedure duties, the proposed new clause would provide for Ofcom to establish an ombudsman to provide a dispute resolution service in relation to complaints.

While I am of course sympathetic to the aims of this amendment, the Government remain confident that service providers are best placed to respond to individual user complaints, as they will be able to take appropriate action promptly. This could include removing content, sanctioning offending users, reversing wrongful content removal or changing their systems and processes. Accordingly, the Bill imposes a duty on regulated user-to-user and search services to establish and operate an easy-to-use, accessible and transparent complaints procedure. The complaints procedure must provide for appropriate action to be taken by the provider in relation to the complaint.

It is worth reminding ourselves that this duty is an enforceable requirement. Where a provider is failing to comply with its complaints procedure duties, Ofcom will be able to take enforcement action against the regulated service. Ofcom has a range of enforcement powers, including the power to impose significant penalties and confirmation decisions that can require the provider to take such steps as are required for compliance. In addition, the Bill includes strong super-complaints provisions that will allow for concerns about systemic issues to be raised with the regulator, which will be required to publish its response to the complaint. This process will help to ensure that Ofcom is made aware of issues that users are facing.

Separately, individuals will also be able to submit complaints to Ofcom. Given the likelihood of an overwhelming volume of complaints, as we have heard, Ofcom will not be able to investigate or arbitrate on individual cases. However, those complaints will be an essential part of Ofcom’s horizon-scanning, research, supervision and enforcement activity. They will guide Ofcom in deciding where to focus its attention. Ofcom will also have a statutory duty to conduct consumer research about users’ experiences in relation to regulated services and the handling of complaints made by users to providers of those services. Further, Ofcom can require that category 1, 2A and 2B providers set out in their annual transparency reports the measures taken to comply with their duties in relation to complaints. This will further ensure that Ofcom is aware of any issues facing users in relation to complaints processes.

At the same time, I share the desire expressed to ensure that the complaints mechanisms will be reviewed and assessed. That is why the Bill contains provisions for the Secretary of State to undertake a review of the efficacy of the entire regulatory framework. This will take place between two and five years after the Part 3 provisions come into force, which is a more appropriate interval for the efficacy of the duties around complaints procedures to be reviewed, as it will allow time for the regime to bed in and provide a sufficient evidence base to assess whether changes are needed.

Finally, I note that Amendment 56 assumes that the preferred solution following a review will be an ombudsman. There is probably not enough evidence to suggest that an ombudsman service would be effective for the online safety regime. It is unclear how an ombudsman service would function in support of the new online safety regime, because individual user complaints are likely to be complex and time-sensitive—and indeed, in many cases financial compensation would not be appropriate. So I fear that the noble Lord’s proposed new clause pre-empts the findings of a review with a solution that is resource-intensive and may be unsuitable for this sector.

Amendments 250A and 250B, tabled by my noble friend Lady Newlove, require that an independent appeals system is established and that Ofcom produces guidance to support this system. As I have set out, the Government believe that decisions on user redress and complaints are best dealt with by services. Regulated services will be required to operate an easy-to-use, accessible and transparent complaints procedure that enables users to make complaints. If services do not comply with these duties, Ofcom will be able to utilise its extensive enforcement powers to bring them into compliance.

The Government are not opposed to revisiting the approach to complaints once the regime is up and running. Indeed, the Bill provides for the review of the regulatory framework. However, it is important that the new approach, which will radically change the regulatory landscape by proactively requiring services to have effective systems and processes for complaints, has time to bed in before it is reassessed.

Turning specifically to the points made by my noble friend and by the noble Baroness, Lady Kidron, about the impartial out-of-court dispute resolution procedure in the VSP regime, the VSP regime and the Online Safety Bill are not directly comparable. The underlying principles of both regimes are of course the same, with the focus on systems regulation and protections for users, especially children. The key differences concern the online safety framework’s increased scope. The Bill covers a wider range of harms and introduces online safety duties on a wider range of platforms. Under the online safety regime, Ofcom will also have a more extensive suite of enforcement powers than under the UK’s VSP regime.

On user redress, the Bill goes further than the VSP regime as it will require services to offer an extensive and effective complaints process and will enable Ofcom to take stronger enforcement action where they fail to meet this requirement. That is why the Government have put the onus of the complaints procedure on the provider and set out a more robust approach which requires all in-scope, regulated user-to-user and search services to offer an effective complaints process that provides for appropriate action to be taken in relation to the complaint. This will be an enforceable duty and will enable Ofcom to utilise its extensive online safety enforcement powers where services are not complying with their statutory duty to provide a usable, accessible and transparent complaints procedure.

At the same time, we want to ensure that the regime can develop and respond to new challenges. That is why we have included a power for the Secretary of State to review the regulatory framework once it is up and running. This will provide the correct mechanism to assess whether complaint handling mechanisms can be further strengthened once the new regulations have had time to bed in.

For these reasons, the Government are confident that the Online Safety Bill represents a significant step forward in keeping users safe online.

Lord Russell of Liverpool (CB)

My Lords, could I just ask a question? This Bill has been in gestation for about five to six years, during which time the scale of the problems we are talking about has increased exponentially. The Government appear to be suggesting that they will, in three to five years, evaluate whether or not their approach is working effectively.

There was a lot of discussion in this Chamber yesterday about the will of the people and whether the Government were ignoring it. I gently suggest that the very large number of people, who are having all sorts of problems or who are fearful of harm from the online world, will not find in the timescale that the Government are proposing the sort of remedy and speed of action I suspect they were hoping for. Certainly, the rhetoric the Government have used and continue to use at regular points in the Bill when they are slightly on the back foot seems to be designed to try to make the situation seem better than it is.

Will the Minister and the Bill team take on board that there are some very serious concerns that there will be a lot of lashing back at His Majesty’s Government if in three years’ time—which I fear may be the case—we still have a situation where a large body of complaints are not being dealt with? Ofcom is going to suffer from major ombudsman-like constipation trying to deal with this, and the harms will continue. I think I speak for the Committee when I say that the arguments the Minister and the government side are making really do not hold water.

16:30
I thought in particular of the direct experience of the noble Baroness, Lady Harding, demonstrating the effect on her company—so substitute platforms for that—of knowing that you are being held to account. Having a system that helps the regulator understand in real time whether or not these companies are doing what they should—it would be an early warning system and would know earlier than Ofcom would—just seems sensible. But perhaps being sensible is not what this Bill is about.
Viscount Camrose (Con)

I do not know about that last point. I was going to say that I am very happy to meet the noble Lord to discuss it. It seems to me to come down to a matter of timing and the timing of the first review. As I say, I am delighted to meet the noble Lord. By the way, the relevant shortest period is two years not three, as he said.

Baroness Newlove (Con)

Following on from my friend, the noble Lord, Lord Russell, can I just say to the Minister that I would really welcome all of us having a meeting? As I am listening to this, I am thinking that three to five years is just horrific for the families. This Bill has gone on for so long to get where we are today. We are losing sight of humanity here and the moral compass of protecting human lives. For whichever Government is in place in three to five years to decide that it does not work would be absolutely shameful. Nobody in the Government will be accountable, and yet, for that family, that single person may commit suicide. We have met the bereaved families, so I say to the Minister that we need to go round the table and look at this again. I do not think it is acceptable to say that there is this timeline, this review, for the Secretary of State when we are dealing with young lives. It is in the public interest to get this Bill correct as it navigates its way back to the House of Commons in a far better state than how it arrived.

Baroness Kidron (CB)

I would love the noble Viscount to answer my very specific question about who the Government think families should turn to when they have exhausted the complaints system in the next three to five years. I say that as someone who has witnessed successive Secretaries of State promising families that this Bill would sort this out. Yes?

Viscount Camrose (Con)

I stress again that the period in question is two years not three.

Viscount Camrose (Con)

It is between two and five years. It can be two; it can be five. I am very happy to meet my noble friend and to carry on doing so. The complaints procedure set up for families is first to approach the service provider, which is subject to enforceable duties, and, should the provider fail to meet those enforceable duties, then to revert to Ofcom before the courts.

Baroness Kidron (CB)

I am sorry but that is exactly the issue at stake. The understanding of the Committee currently is that there is then nowhere to go if they have exhausted that process. I believe that complainants are not entitled to go to Ofcom in the way that the noble Viscount just suggested.

Viscount Camrose (Con)

Considerably more rights are provided than they have today, with the service provider. Indeed, Ofcom would not necessarily deal with individual complaints—

Viscount Camrose (Con)

They would go to the service provider in the first instance and then—

Lord Clement-Jones (LD)

What recourse would they have, if Ofcom will not deal with individual complaints in those circumstances?

Viscount Camrose (Con)

I am happy to meet and discuss this. We are expanding what they are able to receive today under the existing arrangements. I am happy to meet any noble Lords who wish to take this forward to help them understand this—that is probably best.

Amendments 287 and 289 from the noble Baroness, Lady Fox of Buckley, seek to remove the provision for super-complaints from the Bill. The super-complaints mechanism is an important part of the Bill’s overall redress mechanisms. It will enable entities to raise concerns with Ofcom about systemic issues in relation to regulated services, which Ofcom will be required to respond to. This includes concerns about the features of services or the conduct of providers creating a risk of significant harm to users or the public, as well as concerns about significant adverse impacts on the right to freedom of expression.

On who can make super-complaints, any organisation that meets the eligibility criteria set out in secondary legislation will be able to submit a super-complaint to Ofcom. Organisations will be required to submit evidence to Ofcom, setting out how they meet these criteria. Using this evidence, Ofcom will assess organisations against the criteria to ensure that they meet them. The assessment of evidence will be fair and objective, and the criteria will be intentionally strict to ensure that super-complaints focus on systemic issues and that the regulator is not overwhelmed by the number it receives.

Baroness Fox of Buckley (Non-Afl)

To clarify and link up the two parts of this discussion, can the Minister perhaps reflect, when the meeting is being organised, on the fact that the organisations and the basis on which they can complain will be decided by secondary legislation? So we do not know which organisations or what the remit is, and we cannot assess how effective that will be. We know that the super-complainants will not want to overwhelm Ofcom, so things will be bundled into that. Individuals could be excluded from the super-complaints system in the way that I indicated, because super-complaints will not represent everyone, or even minority views; in other words, there is a gap here now. I want that bit gone, but that does not mean that we do not need a robust complaints system. Before Report at least—in the meetings in between—the Government need to advise on how you complain if something goes wrong. At the moment, the British public have no way to complain at all, unless someone sneaks it through in secondary legislation. This is not helpful.

Viscount Camrose (Con)

As I said, we are happy to consider individual complaints and super-complaints further.

Lord Allan of Hallam (LD)

Again, I am just pulling this together—I am curious to understand this. We have been given a specific case—South West Grid for Learning raising a case based on an individual but that had more generic concerns—so could the noble Viscount clarify, now or in writing, whether that is the kind of thing that he imagines would constitute a super-complaint? If South West Grid for Learning went to a platform with a complaint like that—one based on an individual but brought by an organisation—would Ofcom find that complaint admissible under its super-complaints procedure, as imagined in the Bill?

Viscount Camrose (Con)

Overall, the super-complaints mechanism is more for groupings of complaints and has a broader range than the individual complaints process, but I will consider that point going forward.

Many UK regulators have successful super-complaints mechanisms which allow them to identify and target emerging issues and effectively utilise resources. Alongside the Bill’s research functions, super-complaints will perform a vital role in ensuring that Ofcom is aware of the issues users are facing, helping them to target resources and to take action against systemic failings.

On the steps required after super-complaints, the regulator will be required to respond publicly to the super-complaint. Issues raised in the super-complaint may lead Ofcom to take steps to mitigate the issues raised in the complaint, where the issues raised can be addressed via the Bill’s duties and powers. In this way, they perform a vital role in Ofcom’s horizon-scanning powers, ensuring that it is aware of issues as they emerge. However, super-complaints are not linked to any specific enforcement process.

Lord Clement-Jones (LD)

My Lords, it has just occurred to me what the answer is to the question, “Where does an individual actually get redress?” The only way they can get redress is by collaborating with another 100 people and raising a super-complaint. Is that the answer under the Bill?

Viscount Camrose (Con)

No. The super-complaints mechanism is better thought of as part of a horizon-scanning mechanism. It is not—

Lord Clement-Jones (LD)

So it is not really a complaints system; it is a horizon-scanning system. That is interesting.

Viscount Camrose (Con)

The answer to the noble Lord’s question is that the super-complaint is not a mechanism for individuals to complain on an individual basis and seek redress.

Lord Stevenson of Balmacara (Lab)

This is getting worse and worse. I am tempted to suggest that we stop talking about this and try to, in a smaller group, bottom out what we are doing. I really think that the Committee deserves a better response on super-complaints than it has just heard.

As I understood it—I am sure that the noble Baroness, Lady Kidron, is about to make the same point—super-complaints are specifically designed to take away the pressure on vulnerable and younger persons to have responsibility only for themselves in bringing forward the complaint that needs to be resolved. They are a way of sharing that responsibility and taking away the pressure. Is the Minister now saying that that is a misunderstanding?

Viscount Camrose (Con)

I have offered a meeting; I am very happy to host the meeting to bottom out these complaints.

Baroness Kidron (CB)

I understand that the Minister has been given a sticky wicket of defending the indefensible. I welcome a meeting, as I think the whole Committee does, but it would be very helpful to hear the Government say that they have chosen to give individuals no recourse under the Bill—that this is the current situation, as it stands, and that there is no concession on the matter. I have been in meetings with people who have been promised such things, so it is really important, from now on in Committee, that we actually state at the Dispatch Box what the situation is. I spent quite a lot of the weekend reading circular arguments, and we now need to get to an understanding of what the situation is. We can then decide, as a Committee, what we do in relation to that.

Viscount Camrose (Con)

As I said, I am very happy to hold the meeting. We are giving users greater protection through the Bill, and, as agreed, we can discuss individual routes to recourse.

I hope that, on the basis of what I have said and the future meeting, noble Lords have some reassurance that the Bill’s complaint mechanisms will, eventually, be effective and proportionate, and feel able not to press their amendments.

Lord Stevenson of Balmacara (Lab)

I am very sorry that I did not realise that the Minister was responding to this group of amendments; I should have welcomed him to his first appearance in Committee. I hope he will come back—although he may have to spend a bit of time in hospital, having received a pass to speak on this issue from his noble friend.

This is a very complicated Bill. The Minister and I have actually talked about that over tea, and he is now learning the hard lessons of what he took as a light badinage before coming to the Chamber today. However, we are in a bit of a mess here. I was genuinely trying to get an amendment that would encourage the department to move forward on this issue, because it is quite clear from the mood around the Committee that something needs to be resolved here. The way the Government are approaching this is by heading towards a brick wall, and I do not think it is the right way forward.

16:45
The Minister cannot ignore the evidence, from the two very well-respected practitioners who have been involved in this sort of process and understand how it works, that this is not the way forward. He has heard somebody who works professionally in this area explain how the system works in practice. He is hearing from individuals who, as we have now discovered, otherwise have nowhere to go. We are being told what seems to be a very confusing story about what the super-complaints system is about and how it will be done. This must be sorted, otherwise he will find that the Great British public, for whom the Bill is designed, particularly younger people, will turn around and say, “This is what you promised us?” They will not believe it and they will not like it.
All I heard coming through from the debate is that Ofcom will pick up a lot of complaints and use them to inform itself about what should happen five years down the track, the next time that the regulatory review takes place. That is not what we are about here. This is about filling a gap in a system for which promises have been issued over the seven long years that we have been waiting for the Bill. People out there expect the Bill to make their lives much more reasonable and to be respectful of their rights and responsibilities.
We find that the VSP provisions are being deleted—a system we already have, which at least does first approximation work. We find that we are reinforcing the inequality of arms between individuals and companies. We find that DCMS—it is not the same department because the Minister is now in DSIT, but it is his former sister department—is creating an ombudsman for gambling problems, having identified that they have gone too far, too fast, and are now out of control and need to be responded to. This just does not add up. The Government are in a mess. Please sort it. I beg leave to withdraw the amendment.
Amendment 56 withdrawn.
Clause 18: Duties about freedom of expression and privacy
Amendment 57 not moved.
Amendment 58
Moved by
58: Clause 18, page 20, line 32, at end insert “as defined under the Human Rights Act 1998 and its application to the United Kingdom.”
Baroness Fraser of Craigmaddie (Con)

My Lords, I am delighted to propose this group of amendments on devolution issues. I am always delighted to see the Committee so full to talk about devolution issues. I will speak particularly to Amendments 58, 136, 225A and 228 in this group, all in my name. I am very grateful to the noble Lord, Lord Foulkes of Cumnock, for supporting them.

As I have said before in Committee, I have looked at the entire Bill from the perspective of a devolved nation, in particular at the discrepancies and overlaps of Scots law, UK law and ECHR jurisprudence that I was concerned had not been taken into account or addressed by the Bill as it stands. Many have said that they are not lawyers; I am also not. I am therefore very grateful to the Law Society of Scotland, members of Ofcom’s Advisory Committee for Scotland, and other organisations such as the Carnegie Trust and Legal to Say, Legal to Type, which have helped formulate my thinking. I also thank the Minister and the Bill team for their willingness to discuss these issues in advance with me.

When the first proposed Marshalled List for this Committee was sent round, my amendments were dotted all over the place. When I explained to the Whips that they were all connected to devolved issues and asked that they be grouped together, that must have prompted the Bill team to go and look again; the next thing I know, there is a whole raft of government amendments in this group referring to Wales, Northern Ireland, the Bailiwick of Guernsey and the Isle of Man—though not Scotland, I noted. These government amendments are very welcome; if nothing else, I am grateful to have achieved that second look from the devolved perspective.

In the previous group, we heard how long the Bill had been in gestation. I have the impression that, because online safety decision-making is a centralised and reserved matter, the regions are overlooked and engaged only at a late stage. The original internet safety Green Paper made no reference to Scotland at all; it included a section on education describing only the English education system and an annexe of legislation that did not include Scottish legislation. Thankfully, this oversight was recognised by the White Paper, two years later, which included a section on territorial scope. Following this, the draft Bill included a need for platforms to recognise the differences in legislation across the UK, but this was subsequently dropped.

I remain concerned that the particular unintended consequences of the Bill for the devolved Administrations have not been fully appreciated or explored. While online safety is a reserved issue, many of the matters that it deals with—such as justice, the police or education—are devolved, and, as many in this House appreciate, Scots law is different.

At the moment, the Bill is relatively quiet on how freedom of expression is defined; how it applies to the providers of user-to-user services and their duties to protect users’ rights to freedom of expression; and how platforms balance those competing rights when adjudicating on content removal. My Amendment 58 has similarities to Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead. It seeks to ensure that phrases such as “freedom of expression” are understood in the same way across the United Kingdom. As the noble and learned Lord pointed out when speaking to his Amendment 63 in a previous group, words matter, and I will therefore be careful to refer to “freedom of expression” rather than “freedom of speech” throughout my remarks.

Amendment 58 asks the Government to state explicitly which standards of speech platforms apply in each of the jurisdictions of the UK, because at this moment there is a difference. I accept that the Human Rights Act is a UK statute already, but, under Article 10—as we have heard—freedom of expression is not an absolute right and may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law.

The noble Lord, Lord Moylan, argued last week that the balance between freedom of expression and any condition or restriction was not an equal one but was weighted in favour of freedom of expression. I take this opportunity to take some issue with my noble friend, who is not in his place, on this. According to the Equality and Human Rights Commission, the British Institute of Human Rights and Supreme Court judgments, human rights are equal and indivisible, neither have automatic priority, and how they are balanced depends on the context and the particular facts.

In Scotland, the Scottish Government believe that they are protecting freedom of expression, but the Hate Crime and Public Order (Scotland) Act 2021 criminalises speech that is not illegal elsewhere in the UK. Examples from the Scottish Government’s own information note state that it is now an offence in Scotland

“if the urging of people to cease practising their religion is done in a threatening or abusive manner or, alternatively, … if a person were to urge people not to engage in same-sex sexual activity while making abusive comments about people who identify as lesbian, gay or bisexual”.

The Lord Advocate’s guidance to the police says that

“an incident must be investigated as a hate crime if it is perceived, by the victim or any other person, to be aggravated by prejudice”.

I stress that I make absolutely no comment about the merits, or otherwise, of the Hate Crime and Public Order (Scotland) Act. I accept that it is yet to be commenced. However, commencement is in the hands of the Scottish Parliament, not the Minister and his team, and I highlight it here as an illustration of the divergence of interpretation that is happening between the devolved nations now, and as an example of what could happen in the future.

So, I would have thought that we would want to take a belt-and-braces approach to ensuring that there cannot be any differences in interpretation of what we mean by freedom of expression, and I hope that the Minister will accept my amendment for the sake of clarity. Ofcom is looking for clarity wherever possible, and clarity will be essential for platforms. Amendment 58 would allow platforms to interpret freedom of expression as a legal principle, rather than having to adapt considerations for Scotland, and it would also help prevent Scottish users’ content being censored more than that of English users, as platforms could rely on a legally certain basis for decision-making.

The hate crime Act was also the motivation for my Amendment 136, which asks why the Government did not include it on the list of priority offences in Schedule 7. I understand that the Scottish Government did not ask for it to be included, but since when did His Majesty’s Government do what the Scottish Government ask of them?

I have assumed that the Scottish Government did not ask for it because the hate crime Act is yet to be commenced in Scotland and there are, I suspect, multiple issues to be worked out with Police Scotland and others before it can be. I stress again that it is not my intention that the Hate Crime and Public Order (Scotland) Act should dictate the threshold for illegal and priority illegal content in this Bill—Amendment 136 is a probing amendment—but the omission of the hate crime Act does raise the question of a devolution deficit because, while the definition of “illegal content” varies, people in areas of the UK with more sensitive thresholds would have to rely on the police to enforce some national laws online rather than benefiting from the additional protections of the Ofcom regime.

Clause 53(5)(c) of this Bill states that

“the offence is created by this Act or, before or after this Act is passed, by”—

this is in sub-paragraph (iv)—

“devolved subordinate legislation made by a devolved authority with the consent of the Secretary of State or other Minister of the Crown”.

How would this consent be granted? How would it involve this Parliament? What consultation should be required, and with whom—particularly since the devolved offence might change the thresholds for the offence across the whole of the UK? The phrase “consent of the Secretary of State” implies that a devolved authority would apply to seek consent. Should not this application process be set out in the Bill? What should the consultation process with devolved authorities and Ofcom be if the Secretary of State wishes to initiate the inclusion of devolved subordinate legislation? Do we not need a formal framework for parliamentary scrutiny—an equivalent of the Grimstone process, perhaps? I would be very happy to work with the Minister and his team on a Parkinson process between now and Report.

Amendments 225A and 228 seek to ensure that there is an analysis of users’ online experiences in the different nations of the UK. Amendment 225A would require Ofcom to ensure that its research into online experiences is analysed in a nation-specific way, while Amendment 228 would require Ofcom’s transparency reporting to be broken down by nation. The fact is that, at this moment in time, we do not know whether there is a difference in the online experience across the four nations. For example, are rural or remote communities at greater risk of online harm because they have a greater dependence on online services? How would online platforms respond to harmful sectarian content? What role do communication technologies play in relation to offline violence, such as knife crime?

We can compare other data by nation, for example on drug use or gambling addiction. Research and transparency reporting are key to understanding nation-specific harms online, but I fear that Ofcom will have limited powers in this area if they are not specified in the Bill. Ofcom has good working relationships from the centre with the regions, and part of this stems from the fact that legislation in other sectors, such as broadcasting, requires it to have advisory committees in each of the nations to ensure that English, Scottish, Northern Irish and Welsh matters are considered properly. Notably, those measures do not exist in this Bill.

The interplay between the high-level and reserved nature of internet services and online safety will require Ofcom to develop a range of new, wider partnerships in Scotland—for example with Police Scotland—and to collaborate closely at a working level with a wide range of interests within the Scottish Government, where such interests will be split across a range of ministerial portfolios. In other areas of its regulatory responsibility, Ofcom’s research publications provide a breakdown of data by nation. Given the legislative differences that already exist between the four nations, it is an omission that such a breakdown is not explicitly required in the Bill.

I have not touched—and I am not going to touch—on how this Bill might affect other devolved Administrations. The noble Baroness, Lady Foster of Aghadrumsee, apologises for being unable to be in the Chamber to lend her voice from a Northern Ireland perspective— I understand from her that the Justice (Sexual Offences and Trafficking Victims) Act (Northern Ireland) 2022 might be another example of this issue—but she has indicated her support here. As my noble friend Lady Morgan of Cotes said last Thursday:

“The Minister has done a very good job”

of

“batting away amendments”.—[Official Report, 11/5/23; col. 2043.]

However, I am in an optimistic mood this afternoon, because the Minister responded quite positively to the request from the noble and learned Lord, Lord Hope, that we should define “freedom of expression”. There is great benefit to be had from ensuring that this transparency of reporting and research can be broken down by nation. I am hopeful, therefore, that the Minister will take the points that I have raised through these amendments and that he will, as my noble friend Lady Morgan of Cotes hoped, respond by saying that he sees my points and will work with me to ensure that this legislation works as we all wish it to across the whole of the UK. I beg to move.

17:00
Lord Foulkes of Cumnock (Lab Co-op)

My Lords, I warmly support the amendment moved by the noble Baroness, Lady Fraser of Craigmaddie, to which I have added my name. I agree with every word she said in her introduction. I could not have said it better and I have nothing to add.

Lord Hope of Craighead (CB)

My Lords, I follow the noble Lord, Lord Foulkes, with just a few words. As we have been reminded, I tabled Amendment 63, which has already been debated. The Minister will remember that my point was about legal certainty; I was not concerned with devolution, although I mentioned Amendment 58 just to remind him that we are dealing with all parts of the United Kingdom in the Bill and it is important that the expression should have the same meaning throughout all parts.

We are faced with the interesting situation which arose in the strikes Bill: the subject matter of the Bill is reserved, but one must have regard to the fact that its effects spread into devolved areas, which have their own systems of justice, health and education. That is why there is great force in the point that the noble Baroness, Lady Fraser, has been making. I join the noble Lord, Lord Foulkes, in endorsing what she said without going back into the detail, but remind the Minister that devolution exists, even though we are dealing with reserved matters.

Lord Clement-Jones (LD)

My Lords, this is unfamiliar territory for me, but the comprehensive introduction of the noble Baroness, Lady Fraser, has clarified the issue. I am only disappointed that we had such a short speech from the noble Lord, Lord Foulkes—uncharacteristic, perhaps I could say—but it was good to hear from the noble and learned Lord, Lord Hope, on this subject as well. The noble Baroness’s phrase “devolution deficit” is very useful shorthand for some of these issues. She has raised a number of questions about the Secretary of State’s powers under Clause 53(5)(c): the process, the method of consultation and whether there is a role for Ofcom’s national advisory committees. Greater transparency in order to understand which offences overlap in all this would be very useful. She deliberately did not go for one solution or another, but issues clearly arise where the thresholds are different. It would be good to hear how the Government are going to resolve this issue.

Lord Stevenson of Balmacara (Lab)

My Lords, it is a pity that we have not had the benefit of hearing from the Minister, because a lot of his amendments in this group seem to bear on some of the more generic points made in the very good speech by the noble Baroness, Lady Fraser. I assume he will cover them, but I wonder whether he would at least be prepared to answer any questions people might come back with—not in any aggressive sense; we are not trying to scare the pants off him before he starts. For example, the points made by the noble Lord, Lord Clement-Jones, intrigue me.

I used to have responsibility for devolved issues when I worked at No. 10 for a short period. It was a bit of a joke, really. Whenever anything Welsh happened, I was immediately summoned down to Cardiff and hauled over the coals. You knew when you were in trouble when they all stopped speaking English and started speaking Welsh; then, you knew there really was an issue, whereas before I just had to listen, go back and report. In Scotland, nobody came to me anyway, because they knew that the then Prime Minister was a much more interesting person to talk to about these things. They just went to him instead, so I did not really learn very much.

I noticed some issues in the Marshalled List that I had not picked up on when I worked on this before. I do not know whether the Minister wishes to address this—I do not want to delay the Committee too much—but are we saying that to apply a provision in the Bill to the Bailiwick of Guernsey or the Isle of Man, an Order in Council is required to bypass Parliament? Is that a common way of proceeding in these places? I suspect that the noble and learned Lord, Lord Hope, knows much more about this than I do—he shakes his head—but this is a new one on me. Does it mean that this Parliament has no responsibility for how its laws are applied in those territories, or are there other procedures of which we are unaware?

My second point again picks up what the noble Lord, Lord Clement-Jones, was saying. Could the Minister go through in some detail the process by which a devolved authority would apply to the Secretary of State—presumably for DSIT—to seek consent for a devolved offence to be included in the Online Safety Bill regime? If this is correct, who grants to what? Does this come to the House as a statutory instrument? Is just the Secretary of State involved, or does it go to the Privy Council? Are there other ways that we are yet to know about? It would be interesting to know.

To echo the noble Lord, Lord Clement-Jones, we probably do need a letter from the Minister, if he ever gets this cleared, setting out exactly how the variation in powers would operate across the four territories. If there are variations, we would like to know about them.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I am very grateful to my noble friend Lady Fraser of Craigmaddie for her vigilance in this area and for the discussion she had with the Bill team, which they and I found useful. Given the tenor of this short but important debate, I think it may be helpful if we have a meeting for other noble Lords who also want to benefit from discussing some of these things in detail, and particularly to talk about some of the issues the noble Lord, Lord Stevenson of Balmacara, just raised. It would be useful for us to talk in detail about general questions on the operation of the law before we look at this again on Report.

In a moment, I will say a bit about the government amendments which stand in my name. I am sure that noble Lords will not be shy in taking the opportunity to interject if questions arise, as they have not been shy on previous groups.

I will start with the amendments tabled by my noble friend Lady Fraser. Her Amendment 58 seeks to add reference to the Human Rights Act 1998 to Clause 18. That Act places obligations on public authorities to act compatibly with the European Convention on Human Rights. It does not place obligations on private individuals and companies, so it would not make sense for such a duty on internet services to refer to the Human Rights Act.

Under that Act, Ofcom has obligations to act in accordance with the right to freedom of expression under Article 10 of the European Convention on Human Rights. As a result, the codes that Ofcom draws up will need to comply with the Article 10 right to freedom of expression. Schedule 4 to the Bill requires Ofcom to ensure that measures which it describes in a code of practice are designed in light of the importance of protecting the right of users’

“freedom of expression within the law”.

Clauses 44(2) and (3) provide that platforms will be treated as complying with their freedom of expression duty if they take the recommended measures that Ofcom sets out in the codes. Platforms will therefore be guided by Ofcom in taking measures to comply with its duties, including safeguards for freedom of expression through codes of practice.

My noble friend’s Amendment 136 seeks to add offences under the Hate Crime and Public Order (Scotland) Act 2021 to Schedule 7. Public order offences are already listed in Schedule 7 to the Bill, which will apply across the whole United Kingdom. This means that all services in scope will need proactively to tackle content that amounts to an offence under the Public Order Act 1986, regardless of where the content originates or where in the UK it can be accessed.

The priority offences list has been developed with the devolved Administrations, and Clause 194 outlines the parliamentary procedures for updating it. The requirements for consent will be set out in the specific subordinate legislation that may apply to the particular offence being made by the devolved authorities—that is to say, they will be laid down by the enabling statutes that Parliament will have approved.

Amendment 228 seeks to require the inclusion of separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s transparency reports. These transparency reports are based on the information requested from category 1, 2A and 2B service providers through transparency reporting. I assure my noble friend that Ofcom is already able to request country-specific information from providers in its transparency reports. The legislation sets out high-level categories of information that category 1, 2A and 2B services may be required to include in their transparency reports. The regulator will set out in a notice the information to be requested from the provider, the format of that information and the manner in which it should be published. If appropriate, Ofcom may request specific information in relation to each country in the UK, such as the number of users encountering illegal content and the incidence of such content.

Ofcom is also required to undertake consultation before producing guidance about transparency reporting. In order to ensure that the framework is proportionate and future-proofed, however, it is vital to allow the regulator sufficient flexibility to request the types of information that it sees as relevant, and for that information to be presented by providers in a manner that Ofcom has deemed to be appropriate.

Similarly, Amendment 225A would require separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s research about users’ experiences of regulated services. Clause 141 requires that Ofcom make arrangements to undertake consumer research to ascertain public opinion and the experiences of UK users of regulated services. Ofcom will already be able to undertake this research on a country-specific basis. Indeed, in undertaking its research and reporting duties, as my noble friend alluded to, Ofcom has previously adopted such an approach. For instance, it is required by the Communications Act 2003 to undertake consumer research. While the legislation does not mandate that Ofcom conduct and publish nation-specific research, Ofcom has done so, for instance through its publications Media Nations and Connected Nations. I hope that gives noble Lords some reassurance of its approach in this regard. Ensuring that Ofcom has flexibility in carrying out its research functions will enable us to future-proof the regulatory framework, and will mean that its research activity is efficient, relevant and appropriate.

I will now say a bit about the government amendments standing in my name. I should, in doing so, highlight that I have withdrawn Amendments 304C and 304D, previously in the Marshalled List, which will be replaced with new amendments to ensure that all the communications offences, including the new self-harm offence, have the appropriate territorial extent when they are brought forward. They will be brought forward as soon as possible once the self-harm offence has been tabled.

Amendments 267A, 267B, 267C, 268A, 268B to 268G, 271A to 271D, 304A, 304B and 304E are amendments to Clauses 160, 162, 164 to 166, 168 and 210 and Schedule 14, relating to the extension of the false and threatening communications offences and the associated liability of corporate officers in Clause 166 to Northern Ireland.

This group also includes some technical and consequential amendments to the false and threatening communications offences and technical changes to the Malicious Communications (Northern Ireland) Order 1988 and Section 127 of the Communications Act 2003. This will minimise overlap between these existing laws and the new false and threatening communications offences in this Bill. Importantly, they mirror the approach taken for England and Wales, providing consistency in the criminal law.

This group also contains technical amendments to update the extent of the epilepsy trolling offence to reflect that it applies to England, Wales and Northern Ireland.

17:15
Amendment 286B is a technical amendment to repeal a provision in the Digital Economy Act 2017 that will become redundant when Part 3 of that Act is repealed by this Bill.
Amendments 304F and 304G give the Bailiwick of Guernsey and the Isle of Man the power to extend the Online Safety Bill to their jurisdictions, should they wish. Amendments 304A and 304H to 304K have been tabled to reflect the Bailiwick of Jersey opting to forgo a permissive extent clause in this instance.
With the offer of a broader meeting to give other noble Lords the benefit of the discussions with the Bill team that my noble friend has had—I extend that invitation to her, of course, to continue the conversation with us—I hope that I have provided information about the government amendments in this group and some reassurance on the points that my noble friend has made. I hope that she will be willing to withdraw her amendment and that noble Lords will accept the government amendments.
Lord Stevenson of Balmacara (Lab)

I suggested that we might see a table, independent of the meetings, although I am sure they could coincide. Would it be possible to have a table of all the criminal offences that the Minister listed and how they apply in each of the territories? Without that, we are a bit at sea as to exactly how they apply.

Lord Parkinson of Whitley Bay (Con)

Yes, that would be a sensible way to view it. We will work on that and allow noble Lords to see it before they come to talk to us about it.

Baroness Kidron (CB)

I put on record that the withdrawal of Part 3 of the Digital Economy Act 2017 will be greeted with happiness only should the full schedule of AV and harms be put into the Bill. I must say that because the noble Baroness, Lady Benjamin, is not in her place. She worked very hard for that piece of legislation.

Baroness Fraser of Craigmaddie (Con)

My Lords, I thank the Minister for his response. I take it as a win that we have been offered a meeting and further discussion, and the noble Lord, Lord Foulkes, agreeing with every word I said. I hope we can continue in this happy vein in my time in this House.

The suggestion from the noble Lord, Lord Stevenson, of a table is a welcome one. Something that has interested me is that some of the offences the Minister mentioned were open goals: there were gaps, with conduct covered in Northern Ireland but not in England and Wales, or vice versa. For example, epilepsy trolling is already a criminal offence in Scotland, but I am not sure that was appreciated when we started this discussion.

I look forward to the meeting and I thank the Minister for his response. I am still unconvinced that we have the right consultation process for any devolved authority wanting to apply for a subordinate devolved Administration to be included under this regime.

It concerns me that the Minister talked about leaving it to Ofcom to request the data it deems appropriate. The feeling on the ground is that Ofcom, which is based in London, may not understand what is or is not appropriate in the devolved Administrations. The fact that other legislation—for example, on broadcasting—mandates that data is broken down nation by nation is really important. It is even more important because of the interplay between devolved and reserved matters. The fact that there is no equivalent Minister in the Scottish Government with whom to discuss digital and online safety means that a whole raft of people who have not hitherto had relationships with Ofcom will need to build them.

I thank the Minister. On that note, I withdraw my amendment.

Amendment 58 withdrawn.
Amendments 59 to 64 not moved.
Clause 18 agreed.
Clause 19: Record-keeping and review duties
Amendments 64A to 64D
Moved by
64A: Clause 19, page 21, line 36, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 1 services.
64B: Clause 19, page 21, line 36, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 1 services.
64C: Clause 19, page 21, line 38, after “of” insert “all aspects of”
Member’s explanatory statement
This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.
64D: Clause 19, page 21, line 38, at end insert “, including details about how the assessment was carried out and its findings.”
Member’s explanatory statement
This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.
Amendments 64A to 64D agreed.
Amendments 65 and 65ZA not moved.
Amendment 65A
Moved by
65A: Clause 19, page 22, line 26, at end insert—
“(8A) As soon as reasonably practicable after making a record of a risk assessment as required by subsection (2), or revising such a record, a duty to supply OFCOM with a copy of the record (in full).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to supply copies of their records of risk assessments to OFCOM. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.
Amendment 65A agreed.
Amendment 65AA not moved.
Clause 19, as amended, agreed.
Clause 20: Providers of search services: duties of care
Amendments 65B to 65E
Moved by
65B: Clause 20, page 23, line 5, leave out “and (3)” and insert “to (3A)”
Member’s explanatory statement
This technical amendment is consequential on the other changes to clause 20 (arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only - see the amendments in the Minister’s name to those clauses below).
65C: Clause 20, page 23, line 10, at end insert “(2) to (8)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 23 below (because the new duty to summarise illegal content risk assessments in a publicly available statement is only imposed on providers of Category 2A services).
65D: Clause 20, page 23, line 15, at end insert “(2) to (6)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 29 below (because the new duty to supply records of risk assessments to OFCOM is only imposed on providers of Category 2A services).
65E: Clause 20, page 23, line 15, at end insert—
“(2A) Additional duties must be complied with by providers of particular kinds of regulated search services, as follows.”
Member’s explanatory statement
This technical amendment is consequential on the other changes to clause 20 (arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only - see the amendments in the Minister’s name to those clauses below).
Amendments 65B to 65E agreed.
Amendment 66 not moved.
Amendments 66A to 66D
Moved by
66A: Clause 20, page 23, line 16, leave out “In addition,”
Member’s explanatory statement
This technical amendment is consequential on the other changes to clause 20 (arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only - see the amendments in the Minister’s name to those clauses below).
66B: Clause 20, page 23, line 20, at end insert “(2) to (8)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 25 below (because the new duty to summarise children’s risk assessments in a publicly available statement is only imposed on providers of Category 2A services).
66C: Clause 20, page 23, line 20, at end insert—
“(3A) All providers of regulated search services that are Category 2A services must comply with the following duties in relation to each such service which they provide—
(a) the duty about illegal content risk assessments set out in section 23(8A),
(b) the duty about children’s risk assessments set out in section 25(8A), and
(c) the duty about record-keeping set out in section 29(8A).”
Member’s explanatory statement
This amendment ensures that the new duties set out in the amendments in the Minister’s name to clauses 23, 25 and 29 below (duties to summarise risk assessments in a publicly available statement and to supply records of risk assessments to OFCOM) are imposed on providers of Category 2A services only.
66D: Clause 20, page 23, line 21, at end insert—
“(5) For the meaning of “Category 2A service”, see section 86 (register of categories of services).”Member’s explanatory statement
This amendment inserts a signpost to the meaning of “Category 2A service”.
Amendments 66A to 66D agreed.
Clause 20, as amended, agreed.
Clause 21 agreed.
Clause 22: Illegal content risk assessment duties
Amendment 66DA not moved.
Amendment 66E
Moved by
66E: Clause 22, page 24, line 38, after “29(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 29 about supplying records of risk assessments to OFCOM.
Amendment 66E agreed.
Clause 22, as amended, agreed.
Clause 23: Safety duties about illegal content
Amendments 66F and 66G
Moved by
66F: Clause 23, page 24, line 42, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to summarise illegal content risk assessments in a publicly available statement (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
66G: Clause 23, page 24, line 42, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to summarise illegal content risk assessments in a publicly available statement (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
Amendments 66F and 66G agreed.
Amendments 67 to 72 not moved.
Amendment 72A
Moved by
72A: Clause 23, page 25, line 31, at end insert—
“(8A) A duty to summarise in a publicly available statement the findings of the most recent illegal content risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to individuals).”
Member’s explanatory statement
This amendment requires providers of Category 2A services to summarise (in a publicly available statement) the findings of their latest risk assessment regarding illegal content. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.
Amendment 72A agreed.
Clause 23, as amended, agreed.
Amendment 73 not moved.
Clause 24: Children’s risk assessment duties
Amendments 74 and 75 not moved.
Amendment 75A
Moved by
75A: Clause 24, page 26, line 45, after “29(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 29 about supplying records of risk assessments to OFCOM.
Amendment 75A agreed.
Clause 24, as amended, agreed.
Clause 25: Safety duties protecting children
Amendment 75B
Moved by
75B: Clause 25, page 27, line 4, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to summarise children’s risk assessments in a publicly available statement (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
Amendment 75B agreed.
Amendments 76 to 81 not moved.
Amendment 81A
Moved by
81A: Clause 25, page 27, line 46, at end insert—
“(8A) A duty to summarise in a publicly available statement the findings of the most recent children’s risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to children).”
Member’s explanatory statement
This amendment requires providers of Category 2A services to summarise (in a publicly available statement) the findings of their latest children’s risk assessment. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.
Amendment 81A agreed.
Amendments 82 to 85 not moved.
Clause 25, as amended, agreed.
Clause 26: Duty about content reporting
Amendment 86 not moved.
Clause 26 agreed.
Clause 27: Duties about complaints procedures
Amendment 87 not moved.
Clause 27 agreed.
Clause 28: Duties about freedom of expression and privacy
Amendment 88 not moved.
Clause 28 agreed.
Clause 29: Record-keeping and review duties
Amendments 88A to 88D
Moved by
88A: Clause 29, page 31, line 4, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
88B: Clause 29, page 31, line 4, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
88C: Clause 29, page 31, line 6, after “of” insert “all aspects of”
Member’s explanatory statement
This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.
88D: Clause 29, page 31, line 6, at end insert “, including details about how the assessment was carried out and its findings.”
Member’s explanatory statement
This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.
Amendments 88A to 88D agreed.
Amendments 89 and 90 not moved.
Amendment 90A
Moved by
90A: Clause 29, page 31, line 37, at end insert—
“(8A) As soon as reasonably practicable after making a record of a risk assessment as required by subsection (2), or revising such a record, a duty to supply OFCOM with a copy of the record (in full).”
Member’s explanatory statement
This amendment requires providers of Category 2A services to supply copies of their records of risk assessments to OFCOM. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.
Amendment 90A agreed.
Amendment 90B not moved.
Clause 29, as amended, agreed.
17:30
Amendments 91 and 91A not moved.
Clause 30: Children’s access assessments
Amendment 92 not moved.
Clause 30 agreed.
Clause 31 agreed.
Schedule 3 agreed.
Amendment 93 not moved.
Clause 32 agreed.
Clause 33: Duties about fraudulent advertising: Category 1 services
Amendment 94 not moved.
Clause 33 agreed.
Clause 34: Duties about fraudulent advertising: Category 2A services
Amendment 95 not moved.
Clause 34 agreed.
Clause 35 agreed.
Amendment 96
Moved by
96: After Clause 35, insert the following new Clause—
“Suicide or self-harm content duties
(1) This section sets out the duties about harmful suicide or self-harm content which apply to all regulated user-to-user services and providers of search services.
(2) This section applies in respect of all service users.
(3) A duty to include provisions in the terms of service specifying the treatment to be applied in relation to harmful suicide or self-harm content.
(4) The possible kinds of treatment of content referred to in subsection (3) are—
(a) taking down the content;
(b) restricting users’ access to the content;
(c) limiting the recommendation or promotion of the content.
(5) A duty to explain in the terms of service the provider’s response to the risks relating to harmful suicide or self-harm content by reference to—
(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and
(b) any other provisions of the terms of service designed to mitigate or manage those risks.
(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—
(a) are clear and accessible, and
(b) are applied consistently in relation to content which meets the definition in section 207.”
Member’s explanatory statement
This creates a duty for providers of regulated user-to-user services and search services to manage harmful suicide or self-harm content, applicable to both children and adults.
Baroness Finlay of Llandaff (CB)

I am particularly grateful to the noble Lords who co-signed Amendments 96, 240 and 296 in this group. Amendment 225 is also important and warrants careful consideration, as it explicitly includes eating disorders. These amendments have strong support from Samaritans, which has helped me in drafting them, and from the Mental Health Foundation and the BMA. I declare that I am an elected member of the BMA ethics committee.

We have heard much in Committee about the need to protect children online even more effectively than the Bill currently provides. On Tuesday the noble Baroness, Lady Morgan of Cotes, made a powerful speech acknowledging that vulnerability does not stop at the age of 18 and that the Bill currently creates a cliff edge whereby there is protection from harmful content for those under 18 but not for those over 18. The empowerment tools will be futile for those seriously contemplating suicide and self-harm. No one should underestimate the power of suicide contagion and the addictive nature of the content that is currently pushed out to people, goading them into such actions and drawing them into repeated viewings.

Amendment 96 seeks to redress that. It incorporates a stand-alone provision, creating a duty for providers of user-to-user services to manage harmful content about suicide or self-harm. This provision would operate as a specific category, relevant to all regulated services and applicable to both children and adults. Amendment 296 defines harmful suicide or self-harm content. It is important that we define that to avoid organisations such as Samaritans, which provide suicide prevention support, being inadvertently caught up in clumsy, simplistic search engine categorisation.

Suicide and self-harm content affects people of all ages. Adults in distress search the internet, and children easily bypass age-verification measures and parental controls even when they have been switched on. The Samaritans Lived Experience Panel reported that 82% of people who died by suicide, having visited websites that encouraged suicide and/or methods of self-harm, were over the age of 25.

Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include, but are not limited to, information, depictions, instructions and advice on methods of self-harm and suicide; content that portrays self-harm and suicide as positive or desirable; and graphic descriptions or depictions of self-harm and suicide. As the Bill stands, platforms will not even need to consider the risk that such content could pose to adults. This will leave all that dangerous online content widely available and undermines the Bill’s intention from the outset.

Last month, other parliamentarians and I met Melanie, whose relative Jo died by suicide in 2020. He was just 23. He had accessed suicide-promoting content online, and his family are speaking out to ensure that the Bill works to avoid future tragedies. A University of Bristol study reported that those with severe suicidal thoughts actively use the internet to research effective methods and often find clear suggestions. Swansea University reported that three quarters of its research participants had harmed themselves more severely after viewing self-harm content online.

Amendment 240 complements the other amendments in this group, although it would not rely on them to be effective. It would establish a specific unit in Ofcom to monitor the prevalence of suicide, self-harm and harmful content online. I should declare that this is in line with the Private Member’s Bill I have introduced. In practice, that means that Ofcom would need to assess the efficacy of the legislation as it operates. It would require Ofcom to investigate the content and the algorithms that push such content out to individuals at an alarming rate.

Researchers at the Center for Countering Digital Hate set up new accounts in the USA, UK, Canada and Australia at the minimum age TikTok allows, which is 13. These accounts paused briefly on videos about body image and mental health, and “liked” them. Within 2.6 minutes, TikTok recommended suicide content, and it sent content on eating disorders within eight minutes.

Ofcom’s responsibility for ongoing review and data collection, reported to Parliament, would take a future-facing approach covering new technologies. New communications and internet technologies are being developed at pace in ways we cannot imagine. The term

“in a way equivalent … to”

in Amendment 240 is specifically designed to include the metaverse, where interactions are instantaneous, virtual and able to incite, encourage or provoke serious harm to others.

We increasingly live our lives online. Social media is expanding, while user-to-user sites are now shopping platforms for over 70% of UK consumers. However, the online world is also being used to sell suicide kits or lethal substances, as recently covered in the press. It is important that someone holds the responsibility for reporting on dangers in the online world. A systematic review found that harmful suicide content, including methods and encouragement, is concentrated on sites with low levels of moderation and easy search functions for images. Some 78% of people with lived experience of suicidality and self-harm surveyed by Samaritans agree that new laws are needed to make online spaces safer.

I urge noble Lords to support my amendments, which aim to ensure that self-harm, suicide and seriously harmful content is addressed across all platforms in all categories as well as search engines, regardless of their functionality or reach, and for all persons, regardless of age. Polling by Samaritans has shown high support for this: four out of five agree that harmful suicide and self-harm content can damage adults as well as children, while three-quarters agree that tech companies should by law prevent such content being shown to users of all ages.

If the Government are not minded to adopt these amendments, can the Minister tell us specifically how the Bill will take a comprehensive approach to placing duties on all platforms to reduce dangerous content promoting suicide and self-harm? Can the Government confirm that smaller sites, such as forums that encourage suicide, will need to remove priority illegal content, whatever the level of detail in their risk assessment? Lastly—I will give the Minister a moment to note my questions—do the Government recognise that we need an amendment on Report to create a new offence of assisting or encouraging suicide and serious self-harm? I beg to move.

Baroness Morgan of Cotes (Con)

My Lords, I particularly support Amendment 96, to which I have added my name; it is a privilege to do so. I also support Amendment 296 and I cannot quite work out why I have not added my name to it, because I wholeheartedly agree with it, but I declare my support now.

I want to talk again about an issue that the noble Baroness, Lady Finlay, set out so well and that we also touched on last week, about the regulation of suicide and self-harm content. We have all heard of the tragic case of Molly Russell, but a name that is often forgotten in this discussion is Frankie Thomas. Frankie was a vulnerable teenager with childhood trauma, functioning autism and impulsivity. After reading a story about self-harm on the app Wattpad, according to the coroner’s inquest, she went home and undertook

“a similar act, resulting in her death”.

I do not need to repeat the many tragic examples that have already been shared in this House, but I want to reiterate the point already made by the BMA in its very helpful briefing on these amendments: viewing self-harm and suicide content online can severely harm the user offline. As I said last week when we were debating the user empowerment tools, this type of content literally has life or death repercussions. It is therefore essential that the Bill takes this sort of content more seriously and creates specific duties for services to adhere to.

We will, at some point this evening—I hope—come on to debate the next group of amendments. The question for Ministers to answer on this group, the next one and others that we will be debating is, where we know that content is harmful to society—to individuals but also to broader society—why the Government do not want to take the step of setting out how that content should be properly regulated. I think it all comes from their desire to draw a distinction between content that is illegal and content that is not illegal but is undoubtedly, in the eyes of pretty well every citizen, deeply harmful. As we have already heard from the noble Baroness, and as we heard last week, adults do not become immune to suicide and self-harm content the minute they turn 18. In fact, I would argue that no adult is immune to the negative effects of viewing this type of content online.

This amendment, therefore, is very important, as it would create a duty for providers of regulated user-to-user services and search engines to manage harmful suicide or self-harm content applicable to both children and adults, recognising this cliff edge otherwise in the Bill, which we have already talked about. I strongly urge noble Lords, particularly the Minister, to agree that protecting users from this content is one of the most important things that the Bill can do. People outside this House are looking to us to do this, so I urge the Government to support this amendment today.

Lord Allan of Hallam (LD)

My Lords, I am pleased that we have an opportunity, in this group of amendments, to talk about suicide and self-harm content, given the importance of it. It is important to set out what we expect to happen with this legislation. I rise particularly to support Amendment 225, to which my noble friend Lady Parminter added her name. I am doing this more because the way in which this kind of content is shared is incredibly complex, rather than simply because of the question of whether it is legal or illegal.

17:45
Our goal in the regulations should actually be twofold. First, we do, of course, want to reduce the likelihood that somebody who is at lower risk of suicide and self-harm might move into the higher-risk group as a result of their online activity on user-to-user services. That is our baseline goal. We do not want anyone to go from low risk to high risk. Secondly, as well as this “create no new harm” goal, we have a harm reduction goal, which is that people who are already at a higher risk of suicide and self-harm might move into a lower-risk category through the use of online services to access advice and support. It is really important, in this area, that we do not lose sight of the fact there are two aspects to the use of online services. It is no simple task to try to achieve both these goals, as they can sometimes be in tension with each other in respect of particular content types.
There is a rationale for removing all suicide and self-harm content, as that is certainly a way to achieve that first goal. It makes it less likely that a low-risk person would encounter—in the terms of the Bill—or be exposed to potentially harmful content. Some countries certainly do take that approach. They say that anything that looks like it is encouraging suicide or self-harm should be removed, full stop. That is a perfectly rational and legitimate approach.
There is, however, a cost to this approach, which I wish to tease out. It would be helpful in this debate to understand that, and it might not be immediately apparent what that cost is. It is that there are different kinds of individuals posting this content, so if we look at the experience of what happens on online platforms, there certainly is a community of people who post content with the express aim of hurting others: people who we often call trolls, who are small in number but incredibly toxic. They are putting out suicide and self-harm content because they want other people to suffer. They might think it is funny, but whatever they think, they are doing it with an expressly negative intent.
There is also a community of individuals and organisations who believe that they are sharing content to help those who are at risk. This can vary: some can be formal organisations such as the Samaritans, and others can be enterprising individuals, sometimes people who themselves had experiences that they wish to share, who will create online fora and share content. It might be content that looks similar to content that appears harmful, but their expressed goal is seeking to help others online. Most of these fora are for that purpose. Then there are the individuals themselves, who are looking for advice and support relevant to what is happening in their own lives and to connect with others who share their experiences.
We might see the same piece of content very differently when posted by people in these groups. If an individual in that troll group is showing an image of self-harm, that is an aggressive, harmful act; there is no excuse for it, and we want to get rid of it. However, the same content might be part of an educational exchange when posted by an expert organisation. The noble Baroness, Lady Finlay, said that we needed to make sure that this new legislation did not inadvertently sweep up those who were in that educational space.
The hardest group is the group of individuals, where, in many cases, the posting of that content is a cry for help, and an aggressive response by the platform can, sadly, be counterproductive to that individual if they have gone online to seek help. The effect of that is that the content is removed and, because they violated the platform’s terms of service, that person who is feeling lonely and vulnerable might lose social media accounts that are important to them for seeking help. Therefore, by seeking to reduce their exposure to content, we might inadvertently end up creating a scenario in which they lose all that is valuable to them. That is the other inadvertent harm that we want to ensure we avoid in regulating and seeking to have Ofcom issue the most appropriate guidance.
We should be able to advance both goals: removing the content that is posted with harmful intent but enabling content that is there as a cry for help, or as a support and advice service. It is in that context that something like the proposal for an expert group for Ofcom is very helpful. Again, having worked at a platform, I can say that we often reached out to advisers and sought help. Sometimes, the advice was conflicting. Some people would say it was really important that if someone was sharing images of self-harm they should be got rid of; others would say that, in certain contexts, it was really important to allow that person to share the image of self-harm and have a discussion with others—and that maybe the response was to use that as a trigger, to point them towards a support service that they need.
Again, protocols were developed to deal with cases where somebody is at imminent risk of suicide and the solution is not something the platform itself can provide. If a platform has detected that somebody is at imminent risk of suicide, it needs to find a way to ensure that either a support body such as the Samaritans or, in many cases, the police are notified, so that they can go to that person’s house, knock on the door and prevent the suicide happening. Platforms in some countries have the relationships that they need with local bodies. Making that notification is very sensitive; you are disclosing highly sensitive personal data to an outside body, against the individual’s wishes. There will not be consent from them, in many cases, and that has to be worked through.
If we are thinking about protocols for dealing with self-harm content, we will reach some of the same issues. It may be that informing parents, a school or some other body to get help to that individual would be the right thing to do. That is very sensitive in terms of the data disclosure and privacy aspects.
The Bill is an opportunity to improve all of this. There are pieces of very good practice and, clearly, areas where not enough is being done and too much very harmful content—particularly content that is posted with the express intent of causing harm—is being allowed to circulate. I hope that, through the legislation and by getting these protocols right, we can get to the point where we are both preventing lower-risk people moving into a higher-risk category and enabling people already in a high-risk category to get the help, support and advice that they need. Nowadays, online is often the primary tool that could benefit them.
Baroness Fox of Buckley (Non-Afl)

My Lords, as usual, the noble Lord, Lord Allan of Hallam, has explained with some nuance the trickiness of this area, which at first sight appears obvious—black and white—but is not quite that. I want to explore some of these things.

Many decades ago, I ran a drop-in centre for the recovering mentally ill. I remember my shock the first time that I came across a group of young women who had completely cut themselves up. It was a world that I did not know but, at that time, a very small world—a very minor but serious problem in society. Decades later, going around doing lots of talks, particularly in girls’ schools where I am invited to speak, I suddenly discovered that whole swathes of young women were doing something that had been considered a mental health problem, often hidden away. Suddenly, people were talking about a social contagion of self-harm happening in the school. Similarly, there were discussions about eating disorders being not just an individual mental health problem but something that kind of grew within a group.

Now we have the situation with suicide sites, which are phenomenal at exploiting those vulnerabilities. This is undoubtedly a social problem of some magnitude. I do not in any way want to minimise it, but I am not sure exactly how legislation can resolve it or whether it will, even though I agree that it could do certain things.

Some of the problems that we have, which have already been alluded to, really came home to me when I read about Instagram bringing in some rules on self-harm material, which ended up with the removal of posts by survivors of self-harm discussing their illness. I read a story in the Coventry Evening Telegraph—I was always interested because I ran the drop-in centre for the recovering mentally ill in Coventry—where a young woman had some photographs taken down from Instagram because they contained images of her self-harm scars. The young woman who had suffered these problems, Marie from Tile Hill, wanted to share pictures of her scars with other young people because she felt that they would help others recover. She had got over it and was basically saying that the scars were healing. In other words, it was a kind of self-help group for users online, yet it was taken down.

It is my usual problem: this looks to be a clear-cut case, yet the complexities can lead to problems of censorship of a sort. I was really pleased that the noble Baroness, Lady Finlay, stressed the point about definitions. Search engines such as Google have certainly raised the concern that people looking for help—or even looking to write an essay on suicide, assisted suicide or whatever—will end up not being able to find appropriate material.

I also want to ask a slightly different question. Who decides which self-harms are in our definitions, and what counts as a contagion? When I visit schools now, there is a new social contagion in town, I am afraid to say: gender dysphoria. In the polling for Jo-Anne Nadler’s newly published report, Show, Tell and Leave Nothing to the Imagination, half of the young people interviewed said that they knew someone at their school who wanted to change gender or had already done so, while one in 10 said that they themselves wanted to change gender.

That is just an observation; your Lordships might ask what it has to do with the Bill. But these are problem areas that are being affirmed by educational organisations and charities—organisations that have often worked with government and been consulted as stakeholders. They have recommended to young women where online to purchase chest binders, which will stop them developing, or where and how to use puberty blockers. Eventually, they are affirming double mastectomies or castration. By the way, this is of course all over social media because, once you start to search on it, TikTok is rife with it. Algorithmically—we are reminded all the time to think about systems—once you have had a look at it, it is everywhere. My point is that this is affirmed socially.

Imagine a situation whereby, in society offline, some young woman who has an eating disorder and weighs 4 stone comes to you and says “Look, I’m so fat”. If you said, “Yes, of course you’re fat—I’ll help you slim”, we would think it was terrible. When that happens online, crudely and sometimes cruelly, we want to tackle it here. If some young woman came to you and said, “I want to self-harm; I feel so miserable that I want to cut myself”, and you started recommending blades, we would think it was atrocious behaviour. In some ways, that is what is happening online and that is where I have every sympathy with these amendments. Yet when it comes to gender dysphoria, which actually means encouraging self-harm, it does not count, because it is a cultural phenomenon that is popular.

In some ways, I could be arguing that we should future-proof this legislation by including those self-harms in the definition put forward by the amendments in this group. However, I raise it more to indicate that, as with all definitions, it is not quite as easy as one would think. I appreciate that a number of noble Lords, and others engaged in this discussion, might think that I am merely exhibiting prejudice rather than any genuine compassion or concern for those young people. I note that, if noble Lords want to see a rising group of people who are suicidal and feel that their lives are really not worth living, they should search out the work being done on detransitioners, who realise too late that that affirmation by adults has been a disaster for them.

On the amendments suggesting another advisory committee with experts to advise Ofcom on how we regulate such harms, I ask that we are at least cautious about which experts. To mention one of the expert bodies, Mermaids has become controversial and has actually been advocating some of those self-harms, in my opinion. It is now subject to a Charity Commission investigation but has been on bodies such as this advising about young people. I would not think that appropriate, so I just ask that some consideration is given to which experts would be on such bodies.

18:00
Baroness Kidron (CB)

My Lords, I also support the amendments in the name of my noble friend Lady Finlay. I want to address a couple of issues raised by the noble Lord, Lord Allan. He made a fantastic case for adequate redress systems, both at platform level and at independent complaint level, to really make sure that, at the edge of all the decisions we make, there is sufficient discussion about where that edge lies.

The real issue is not so much the individuals who are in recovery and seeking to show leadership but those who are sent down the vortex of self-harm and suicide material that comes in its scores—in its hundreds and thousands—and completely overwhelms them. We must not let the edge cases cause us to make a mistake and fail to deal with the issue at hand.

There is absolutely not enough signposting. I have seen at first hand—I will not go through it again; I have told the Committee already—companies justifying material that it was inconceivable to justify as being a cry for help. A child with cuts and blood running down their body is not a cry for help; that is self-harm material.

Lord Allan of Hallam (LD)

From experience, I think it is true that companies get defensive and seek to defend the indefensible on occasion. I agree with the noble Baroness on that, but I will balance it a little, as I also worked with people who were agonising over not wanting to make a bad situation worse. They were genuinely struggling and seeking to do the right thing. That is where the experts come in. If someone said to them, “Look, take this stuff down; that is always better”, it would make their lives easier. If they were told, “Please leave it up”, they could follow that advice; again, that would make their lives easier. On the excuses, I agree that sometimes companies are defending the indefensible, but there are also people agonising over the right thing to do, and we should help them.

Baroness Kidron (CB)

I absolutely agree. Of course, good law is a good system, not a good person.

I turn to the comments that I was going to make. Uncharacteristically, I am a little confused about this issue and I would love the Minister’s help. My understanding, on reading the Bill very closely, is that self-harm and suicide content that meets a legal definition will be subject to the priority illegal content duties. In the case of children, we can safely anticipate that content of this kind will be named primary priority content. Additionally, if such content is against the terms of service of a regulated company, the company can be held to those terms, and category 1 services will have to provide a user empowerment tool so that an adult user can toggle the content out if they wish. That is my understanding of where this content has already been dealt with in the Bill. To my mind, this leaves the following ways in which suicide and self-harm material, which is the subject of this group of amendments, is not covered by the Bill. That is what I would like the Minister to confirm, and I absolutely stand ready to be corrected.

In the case of adults, if self-harm and suicide material does not meet the bar of illegal content and the service is not category 1, there is no mechanism to toggle it out. Ofcom has no power to require a service to provide tools that toggle self-harm and suicide material out by default. This means that self-harm and suicide material can be as prevalent as the service likes—pushed, promoted and recommended, as I have just explained—so long as it is not contrary to the terms of service and does not reach the bar of illegal content.

Search services are not subject to these clauses—I am unsure about that. In the case of both children and adults, if self-harm and suicide material is on blogs or services with limited functionality, it is out of scope of the Bill and there is absolutely nothing Ofcom can do. For non-category 1 services—the majority of services which claim that an insignificant number of children access their site and thus that they do not have to comply with the child safety duties—there are no protections for a child against this content.

I put it like that because I believe that each of the statements I just made could have been fixed by amendments already discussed during the past six days in Committee. We are currently planning to leave many children without the protection of the safety duties, to leave vulnerable adults without even the cover of default protections against material that has absolutely no public interest and to leave companies to decide whether to promote or use this material to fuel user engagement—even if it costs well-being and lives.

I ask the Minister to let me know if I have misunderstood, but I think it is really quite useful to see what is left once the protections are in place, rather than always concentrating on the protections themselves.

Baroness Healy of Primrose Hill (Lab)

My Lords, I support the noble Baroness, Lady Finlay of Llandaff, in her Amendment 96 and others in this group. The internet is fuelling an epidemic of self-harm, often leading to suicide among young people. Thanks to the noble Baroness, Lady Kidron, I have listened to many grieving families explaining the impact that social media had on their beloved children. Content that provides detailed instructions for methods of suicide, or challenges or pacts that seek agreement to undertake mutual acts of suicide or deliberate self-injury, must be curtailed, or platforms must be made to warn and protect vulnerable adults.

I recognise that the Government acknowledge the problem and have attempted to tackle it in the Bill with the new offence of encouraging or assisting serious self-harm and suicide and by listing it as priority illegal content. But I agree with charities such as Samaritans, which says that the Government are taking a partial approach by not accepting this group of amendments. Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include information, depictions, instructions and advice on methods of self-harm or suicide; content that portrays self-harm and suicide as positive or desirable; and graphic descriptions or depictions of self-harm and suicide.

With the removal of regulation of legal but harmful content, much suicide and self-harm content can remain easily available, and platforms will not even need to consider the risk that such content could pose to adult users. These amendments aim to ensure that harmful self-harm and suicide content is addressed across all platforms and search services, regardless of their functionality or reach, and, importantly, for all persons regardless of age.

In 2017, an inquiry into suicides of young people found suicide-related internet use in 26% of deaths in under-20s and 13% of deaths in 20 to 24-year-olds. Three-quarters of people who took part in Samaritans’ research with Swansea University said that they had harmed themselves more severely after viewing self-harm content online, as the noble Baroness, Lady Finlay, pointed out. People of all ages can be susceptible to harm from this dangerous content. There is shocking evidence that between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 82% of those patients were over 25.

Suicide is complex and rarely caused by one thing. However, there is strong evidence of associations between financial difficulties, mental health and suicide. People on the lowest incomes have a higher suicide risk than those who are wealthier, and people on lower incomes are also the most affected by rising prices and other types of financial hardship. In January and February this year the Samaritans saw the highest percentage of first-time phone callers concerned about finance or unemployment—almost one in 10 calls for help in February. With the cost of living crisis and growing pressure on adults to cope with stress, it is imperative that the Government urgently bring in these amendments to help protect all ages from harmful suicide and self-harm content by putting a duty on providers of user-to-user services to properly manage such content.

A more comprehensive online safety regime for all ages will also increase protections for children, as research has shown that age verification and restrictions across social media and online platforms are easily bypassed by them. As the Bill currently stands, there is a two-tier approach to safety, which can still mean that children may circumvent safety controls and find this harmful suicide and self-harm content.

Finally, user empowerment duties that we debated earlier are no substitute for regulation of access to dangerous suicide and self-harm online content through the law that these amendments seek to achieve.

Lord Clement-Jones (LD)

My Lords, I thank the noble Baroness, Lady Finlay, for introducing the amendments in the way she did. What she has done—and what this whole debate has done—is ask the question that the noble Baroness, Lady Kidron, posed: we do not yet know quite where the gaps are until we see what the Government have in mind for the promised new offence. But it seems pretty clear that something along the lines of what has been proposed in this debate needs to be set out as well.

One of the most moving aspects of being part of the original Joint Committee on the draft Bill was the experience of listening to Ian Russell and the understanding, which I had not come across previously, of the sheer scale of the kind of material that has been the subject of this debate on suicide and self-harm encouragement. We need to find an effective way of dealing with it and I entirely take my noble friend’s point that this needs a combination of protectiveness and support. I think the combination of these amendments is designed to do precisely that and to learn from experience through having the advisory committee as well.

It is clear that, by itself, user empowerment is just not going to be enough in all of this. I think that is the bottom line for all of us. We need to go much further, and we owe a debt to the noble Baroness, Lady Finlay, for raising these issues and to the Samaritans for campaigning on this subject. I am just sorry that my noble friend Lady Tyler cannot be here because she is a signatory to a number of the amendments and feels very strongly about these issues as well.

I do not think I need to unpack a great deal of the points that have been made. We know that suicide is a leading cause of death in males under 50 and females under 35 in the UK. We know that so many of the deaths are internet-related and we need to find effective methods of dealing with this. These are meant to be practical steps.

I take the point of the noble Baroness, Lady Fox, not only that this is a social problem of some magnitude but that the question of definitions is important. I thought she strayed well beyond where the definition of “self-harm” actually lies, but one could discuss that. The point made by the noble Baroness, Lady Kidron—that we want good law, not reliance on good people—was also about definitions. We cannot just leave it to the discretion of an individual, however good they may be, moderating on a social media platform.

18:15
Along with the Samaritans, I very much regret that we no longer have the legal but harmful category, which would help guide us in this area. I share its view that we need to protect people of all ages from all extremely dangerous suicide and self-harm content on large and small platforms. I think this is a way of doing it. The type of content could perhaps be more tightly defined in terms of the kinds of information, depictions, instructions, portrayals and graphic descriptions that occur; one might be able to do more in that direction. I very much hope that we can move further today with some assurance from the Minister in this area.
The establishment of a specific unit within Ofcom, which was the subject of the Private Member’s Bill of the noble Baroness, Lady Finlay, is potentially a very useful addition to the Bill. I very much hope that the Minister takes that on board as well.
Lord Stevenson of Balmacara (Lab)

This has been a very good debate indeed. I have good days and bad days in Committee. Good days are when I feel that the Bill is going to make a difference, things are going to improve and the sun will shine. Bad days are a bit like today, where we have had a couple of groups, and this is one of them, in which I am a bit worried about where we are and whether we have enough—I was going to use that terrible word “ammunition”, but I do not mean that—of the powers that are necessary, in the right place and with the right focus, to get us through some of the very difficult questions that arise. I know that bad cases make bad law, but they can also illustrate why the law is not good enough. As the noble Baroness, Lady Kidron, was saying, this is possibly one of those areas.

The speeches in the debate have made the case well and I do not need to go back over it. We have got ourselves into a situation where we want to reduce the harm that we see around us but do not want to impact freedom of expression. Both of those are so important that we have to hold on to them, but we find ourselves struggling. What do we do about that? We have to think through what we will end up with once this Bill is on the statute book and the codes of practice have been made under it. It looks as though this is heading towards the question of whether the terms of service that will be in place will be sufficient to restrict the harms that we see affecting people who should not be affected by them. But I recognise that the freedom of expression arguments have won the day and we have to live with that.

The noble Baroness, Lady Kidron, mentioned the riskiness of the smaller sites—categories 2A and 2B and the ones that are not even going to be categorised as high as that. Why are we leaving those to cause the damage that they do? There is something not working here in the structure of the Bill, and I hope the Minister will be able to provide some information on that when he comes to speak.

Obviously, if we could find a way of expressing the issues that are raised by the measures in these amendments as being illegal in the real world, they would be illegal online as well. That would at least be a solution that we could rely on. Whether it could be policed and serviced is another matter, but it certainly would be there. But we are probably not going to get there, are we? I am not looking at the Minister in any hope but he has a slight downward turn to his lips. I am not sure about this.

How can we approach a legal but harmful issue with the sort of sensitivity that does not make us feel that we have reduced people’s ability to cope with these issues and to engage with them in an adult way? I do not have an answer to that.

Is this another amplification issue or is it deeper and worse than that? Is this just the internet because of its ability to focus on things to keep people engaged, to make people stay online when they should not, to make them reach out and receive material that they ought not to get in a properly regulated world? Is it something that we can deal with because we have a sense of what is moral and appropriate and want to act because society wants us to do it? I do not have a solution to that, and I am interested to hear what the Minister will say, but I think it is something we will need to come back to.

Lord Parkinson of Whitley Bay (Con)

My Lords, like everyone who spoke, I and the Government recognise the tragic consequences of suicide and self-harm, and how so many lives and families have been devastated by it. I am grateful to the noble Baroness and all noble Lords, as well as the bereaved families who campaigned so bravely and for so long to spare others that heartache and to create a safer online environment for everyone. I am grateful to the noble Baroness, Lady Finlay of Llandaff, who raised these issues in her Private Member’s Bill, on which we had exchanges. My noble friend Lady Morgan is right to raise the case of Frankie Thomas and her parents, and to call that to mind as we debate these issues.

Amendments 96 and 296, tabled by the noble Baroness, Lady Finlay, would, in effect, reintroduce the former adult safety duties whereby category 1 companies were required to assess the risk of harm associated with legal content accessed by adults, and to set and enforce terms of service in relation to it. As noble Lords will know, those duties were removed in another place after extensive consideration. Those provisions risked creating incentives for the excessive removal of legal content, which would unduly interfere with adults’ free expression.

However, the new transparency, accountability and freedom of expression duties in Part 4, combined with the illegal and child safety duties in Part 3, will provide a robust approach that will hold companies to account for the way they deal with this content. Under the Part 4 duties, category 1 services will need to have appropriate systems and processes in place to deal with content or activity that is banned or restricted by their terms of service.

Many platforms—such as Twitter, Facebook and TikTok, which the noble Baroness raised—say in their terms of service that they restrict suicide and self-harm content, but they do not always enforce these policies effectively. The Bill will require category 1 companies—the largest platforms—fully to enforce their terms of service for this content, which will be a significant improvement for users’ safety. Where companies allow this content, the user-empowerment duties will give adults tools to limit their exposure to it, if they wish to do so.

The noble Baroness is right to raise the issue of algorithms. As the noble Lord, Lord Stevenson, said, amplification lies at the heart of many cases. The Bill will require providers specifically to consider as part of their risk assessments how algorithms could affect children’s and adults’ exposure to illegal content, and content that is harmful to children, on their services. Providers will need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet the illegal content and child safety duties in the Bill.

Lord Allan of Hallam (LD)

Following our earlier discussion, we were going to have a response on super-complaints. I am curious to understand this: if there were a pattern of complaints—such as those the noble Baroness, Lady Kidron, and others received—about a platform saying, under its terms of service, that it would remove suicide and self-harm content but failing to do so, does the Minister think that is precisely the kind of thing that could be substantive material for an organisation to bring as a super-complaint to Ofcom?

Lord Parkinson of Whitley Bay (Con)

My initial response is, yes, I think so, but it is the role of Ofcom to look at whether those terms of service are enforced and to act on behalf of internet users. The noble Lord is right to point to the complexity of some marginal cases with which companies have to deal, but the whole framework of the Bill is to make sure that terms of service are being enforced. If they are not, people can turn to Ofcom.

Baroness Kidron (CB)

I am sorry to enter the fray again on complaints, but how will anyone know that they have failed in this way if there is no complaints system?

Lord Parkinson of Whitley Bay (Con)

I refer to the meeting my noble friend Lord Camrose offered; we will be able to go through and unpick the issues raised in that group of amendments, rather than looping back to that debate now.

Lord Clement-Jones (LD)

The Minister is going through the structure of the Bill and saying that what is in it is adequate to prevent the kinds of harms to vulnerable adults that we talked about during this debate. Essentially, it is a combination of adherence to terms of service and user-empowerment tools. Is he saying that those two aspects are adequate to prevent the kinds of harms we have talked about?

Lord Parkinson of Whitley Bay (Con)

Yes, they are—with the addition of what I am coming to. In addition to the duty for companies to consider the role of algorithms, which I talked about, Ofcom will have a range of powers at its disposal to help it assess whether providers are fulfilling their duties, including the power to require information from providers about the operation of their algorithms. The regulator will be able to hold senior executives criminally liable if they fail to ensure that their company is providing Ofcom with the information it requests.

However, we must not restrict users’ right to see legal content and speech. These amendments would prescribe specific approaches for companies’ treatment of legal content accessed by adults, which would give the Government undue influence in choosing, on adult users’ behalf, what content they see—

Lord Stevenson of Balmacara (Lab)

I wanted to give the Minister time to get on to this. Can we now drill down a little on the terms of service issue? If the noble Baroness, Lady Kidron, is right, are we talking about terms of service having the sort of power the Government suggest in cases where they are category 1 and category 2A but not search? There will be a limit, but an awful lot of other bodies about which we are concerned will not fall into that situation.

Also, I thought we had established, much to our regret, that the terms of service were what they were, and that Ofcom’s powers—I paraphrase to make the point—were those of exposure and transparency, not setting minimum standards. But even if we are talking only about the very large and far-reaching companies, should there not be a power somewhere to engage with that, with a view to getting that redress, if the terms of service do not specify it?

Lord Parkinson of Whitley Bay (Con)

The Bill will ensure that companies adhere to their terms of service. If they choose to allow content that is legal but harmful on their services and they tell people that beforehand—and adults are able and empowered to decide what they see online, with the protections of the triple shield—we think that that strikes the right balance. This is at the heart of the whole “legal but harmful” debate in another place, and it is clearly reflected throughout the approach in the Bill and in my responses to all of these groups of amendments. But there are duties to tackle illegal content and to make sure that people know the terms of service for the sites they choose to interact with. If they feel that they are not being adhered to—as they currently are not in relation to suicide and self-harm content on many of the services—users will have the recourse of the regulator to turn to.

18:30
I think that noble Lords are racing ahead a little bit in being pessimistic about the work of Ofcom, which will be proactive in its supervisory role. That is a big difference from the status quo, in terms of the protection for users. We want to strike the right balance to make sure that we are enforcing terms of service while protecting against the arbitrary removal of legal content, and the Bill provides companies with discretion about how to treat that sort of content, as accessed by their users. However, we agree that, by its nature, this type of content can be very damaging, particularly for vulnerable young people, which is why the Government remain committed to introducing a new criminal offence of content that encourages or promotes serious self-harm. The new offence will apply to all victims, children as well as adults, and will be debated once it is tabled; we will explore these details a bit more then. The new law will sit alongside companies’ requirements to tackle illegal suicide content, including material that encourages or assists suicide under the terms of the Suicide Act 1961.
The noble Baronesses, Lady Finlay and Lady Kidron, asked about smaller websites and fora. We are concerned about the widespread availability of content online which promotes or advertises methods of suicide and self-harm, and which can easily be accessed by people who are young or vulnerable. Where suicide and self-harm websites host user-generated content, they will be in scope of the Bill. Those sites will need proactively to prevent users being exposed to priority illegal content, including content that encourages or assists suicide, as set out in the 1961 Act.
The noble Baroness asked about the metaverse, which is in scope of the Bill as a user-to-user service. The approach of the Bill is to try to remain technology neutral.
Lord Allan of Hallam (LD)

I will plant a flag in reference to the new offences, which I know we will come back to again. It is always helpful to look at real-world examples. There is a lot of meme-based self-harm content. Two examples are the Tide Pods challenge—the eating of detergent capsules—and choking games, both of which have been very common and widespread. It would be helpful, ahead of our debate on the new offences, to understand whether they are below or above the threshold of serious self-harm and what the Government’s intention is. There are arguments both ways: obviously, criminalising children for being foolish carries certain consequences, but we also want to stop the spread of the content. So, when we come to that offence, it would be helpful if the Minister could use specific examples, such as the meme-based self-harm content, which is quite common.

Lord Parkinson of Whitley Bay (Con)

I thank the noble Lord for the advance notice to think about that; it is helpful. It is difficult to talk in general terms about this issue, so, if I can, I will give examples that do, and do not, meet the threshold.

The Bill goes even further for children than it does for adults. In addition to the protections from illegal material, the Government have indicated, as I said, that we plan to designate content promoting suicide, self-harm or eating disorders as categories of primary priority content. That means that providers will need to put in place systems designed to prevent children of any age encountering this type of content. Providers will also need specifically to assess the risk of children encountering it. Platforms will no longer be able to recommend such material to children through harmful algorithms. If they do, Ofcom will hold them accountable and will take enforcement action if they break their promises.

It is right that the Bill takes a different approach for children than for adults, but it does not mean that the Bill does not recognise that young adults are at risk or that it does not have protections for them. My noble friend Lady Morgan was right to raise the issue of young adults once they turn 18. The triple shield of protection in the Bill will significantly improve the status quo by protecting adults, including young adults, from illegal suicide content and legal suicide or self-harm content that is prohibited in major platforms’ terms and conditions. Platforms also have strong commercial incentives, as we discussed in previous groups, to address harmful content that the majority of their users do not want to see, such as legal suicide, eating disorder or self-harm content. That is why they currently claim to prohibit it in their terms and conditions, and why we want to make sure that those terms and conditions are transparently and accountably enforced. So, while I sympathise with the intention of the noble Baroness, Lady Finlay, her amendments raise some wider concerns about mandating how providers should deal with legal material, which would interfere with the careful balance the Bill seeks to strike in ensuring that users are safer online without compromising their right to free expression.

The noble Baroness’s Amendment 240, alongside Amendment 225 in the name of the noble Lord, Lord Stevenson, would place new duties on Ofcom in relation to suicide and self-harm content. The Bill already has provisions to provide Ofcom with broad and effective information-gathering powers to understand how this content affects users and how providers are dealing with it. For example, under Clause 147, Ofcom can already publish reports about suicide and self-harm content, and Clauses 68 and 69 empower Ofcom to require the largest providers to publish annual transparency reports.

Ofcom may require those reports to include information on the systems and processes that providers use to deal with illegal suicide or self-harm content, with content that is harmful to children, or with content which providers’ own terms of service prohibit. Those measures sit alongside Ofcom’s extensive information-gathering powers. It will have the ability to access the information it needs to understand how companies are fulfilling their duties, particularly in taking action against this type of content. Furthermore, the Bill is designed to provide Ofcom with the flexibility it needs to respond to harms—including in the areas of suicide, self-harm and eating disorders—as they develop over time, in the way that the noble Baroness envisaged in her remarks about the metaverse and new emerging threats. So we are confident that these provisions will enable Ofcom to assess this type of content and ensure that platforms deal with it appropriately. I hope that this has provided sufficient reassurance to the noble Baroness for her not to move her amendment.

Baroness Kidron (CB)

I asked a number of questions on specific scenarios. If the Minister cannot answer them straight away, perhaps he could write to me. They all rather called for “yes/no” answers.

Lord Parkinson of Whitley Bay (Con)

The noble Baroness threw me off with her subsequent question. She was broadly right, but I will write to her after I refresh my memory about what she said when I look at the Official Report.

Baroness Finlay of Llandaff (CB)

My Lords, I am extremely grateful to everyone who has contributed to this debate. It has been a very rich debate, full of information; my notes have become extensive during it.

There are a few things that I would like to know more about: for example, how self-harm, which has been mentioned by the Minister, is being defined, given the debate we have had about how to define self-harm. I thought of self-harm as something that does lasting and potentially life-threatening damage. There are an awful lot of things that people do to themselves that others might not like them doing but that do not fall into that category. However, the point about suicide and serious self-harm is that when you are dead, that is irreversible. You cannot talk about healing, because the person has now disposed of their life, one way or another.

I am really grateful to the noble Baroness, Lady Healy, for highlighting how complex suicide is. Of course, one of the dangers with all that is on the internet is that the impulsive person gets caught up rapidly, so what would have been a short thought becomes an overwhelming action leading to their death.

Having listened to the previous debate, I certainly do not understand how Ofcom can have the flexibility to really know what is happening and how the terms of service are being implemented without a complaints system. I echo the really important phrase from the noble Lord, Lord Stevenson of Balmacara: if it is illegal in the real world, why are we leaving it on the internet?

Many times during our debates, the noble Baroness, Lady Kidron, has pushed safety by design. In many other things, we have defaults. My amendments were not trying to provide censorship but simply trying to provide a default, a safety stop, to stop things escalating, because we know that they are escalating at the moment. The noble Lord, Lord Stevenson of Balmacara, asked whether it was an amplification or a reach issue. I add, “or is it both?”. From all the evidence we have before us, it appears to be.

I am very grateful to the noble Lord, Lord Clement-Jones, for pressing that we must learn from experience and that user empowerment to switch off simply does not go far enough: people who are searching for this and already have suicidal ideation will not switch it off because they have started searching. There is no way that could be viewed as a safety feature in the Bill, and it concerns me.

Although I will withdraw my amendment today, of course, I really feel that we will have to return to this on Report. I would very much appreciate the wisdom of other noble Lords who know far more about working on the internet and all the other aspects than I do. I am begging for assistance in trying to get the amendments right. If not, the catalogue of deaths will mount up. This is literally a once-in-a-lifetime opportunity. For the moment, I beg leave to withdraw.

Amendment 96 withdrawn.
Clause 36: Codes of practice about duties
Amendment 96A not moved.
Amendment 97
Moved by
97: Clause 36, page 36, line 42, at end insert “including a code of practice describing measures for the purpose of compliance with the relevant duties so far as relating to violence against women and girls.”
Member’s explanatory statement
This amendment would impose an express obligation on OFCOM to issue a code of practice on violence against women and girls rather than leaving it to OFCOM’s discretion. This would ensure that Part 3 providers recognise the many manifestations of online violence, including illegal content, that disproportionately affect women and girls.
Baroness Morgan of Cotes (Con)

My Lords, it is a great pleasure to move Amendment 97 and speak to Amendment 304, both standing in my name and supported by the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester and the noble Lord, Lord Knight of Weymouth. I am very grateful for their support. I look forward to hearing the arguments by the noble Lord, Lord Stevenson, for Amendment 104 as well, which run in a similar vein.

These amendments are also supported by the Domestic Abuse Commissioner, the Revenge Porn Helpline, BT, EE and more than 100,000 UK citizens who have signed End Violence Against Women’s petition urging the Government to better protect women and girls in the Bill.

I am also very grateful to the noble Baroness, Lady Foster of Aghadrumsee—I know I pronounced that incorrectly—the very distinguished former Northern Ireland politician. She cannot be here to speak today in favour of the amendment but asked me to put on record her support for it.

I also offer my gratitude to the End Violence Against Women Coalition, Glitch, Refuge, Carnegie UK, NSPCC, 5Rights, Professor Clare McGlynn and Professor Lorna Woods. Between them all, they created the draft violence against women and girls code of practice many months ago, proving that a VAWG code of practice is not only necessary but absolutely deliverable.

Much has already been said on this, both here and outside the Chamber. In the time available, I will focus my case for these amendments on two very specific points. The first is why VAWG, violence against women and girls, should have a specific code of practice legislated for it, rather than other content we might debate. The second is what having a code of practice means in relation to the management of that content.

Ofcom has already published masses of research showing that abuse online is gendered. The Government’s own fact sheet, sent to us before these debates, said that women and girls experience disproportionate levels of abuse online. They experience a vast array of abuse online because of their gender, including cyberflashing, harassment, rape threats and stalking. As we have already heard and will continue to hear in these debates, some of those offences and abuse reach a criminal threshold and some do not. That is at the heart of this debate.

18:45
The first death threat that I received—I have received a number, sadly, both to me and to my family—did not talk about death or dying. It said that I was going to be “Jo Coxed”. Of course, I reported that to Twitter and the AI content moderator. Because it did not have those words in it, it was not deemed to be a threat. It was not until I could speak to a human being—in this case, the UK public affairs manager of Twitter, to whom I am very grateful—that it even started to be taken seriously.
The fear of being harassed is impacting women’s freedom of speech. The Fawcett Society has found that 73% of female MPs, versus 51% of male MPs, say that they avoid speaking online in certain discussions because of fear of the consequences of doing so. Other women in the public eye, such as the presenter Karen Carney, have also been driven offline due to gendered abuse.
Here is the thing I cannot reconcile with the government response on this so far. This Government have absolutely rightly recognised that violence against women and girls is a national threat. They have made it a part of the strategic policing requirement. If tackling online abuse against women and girls is a priority, as the Government say, and if, as in the stated manifesto commitment of 2019, they want the UK to be
“the safest place in the world to be online”,
why are the words “women and girls” not used once in the 262 pages of the current draft of the Bill?
The Minister has said that changes have been made in the other House on the Bill—I understand that—and that it is now focused more on the protection of children in relation to certain content, whereas adults are deemed to be able to choose more what they see and how they see it. But there is a G in VAWG, for girls. The code of practice that we are talking about would benefit that very group of people—young girls, who are children—whom the Government have said that they really want to protect through the Bill.
Online harassment does not affect only women in the public eye but all women. I suspect that we all now know the statistic that women are 27 times more likely to be harassed online than men. In other words, to have an online presence as a woman is to expect harassment. That is not to say that men do not face abuse online, but a lot of the online abuse is deliberately gendered and is targeted at women. Do men receive rape threats on the same vast scale as women and young girls?
It should not be the public’s job to force a platform to act on the harmful content that it is hosting, just as it should not be a woman’s job to limit her online presence to prevent harassment. But the sad reality is that, in its current form, the Bill is doing very little to force platforms to act holistically in relation to violence against women and girls and to change their culture online.
The new VAWG-relevant criminal offences listed in the Bill—I know that my noble friend the Minister will rely on these in his response to the debate—including cyberflashing and coercive and controlling behaviour, are an important step, but even these new offences have their own issues, which I suspect we will come on to debate in the next day of Committee: for example, cyberflashing being motive-based instead of consent-based. Requiring only those platforms caught by the Bill to look at the criminal offences individually ignores the rest of the spectrum of gendered abuse.
Likewise, the gender-neutral approach in the Bill will harm children. NSPCC research found that in 2021-22, four in five victims of online grooming offences were girls. The Internet Watch Foundation, an organisation we are going to talk about in the next group, has found in recently published statistics that girls are more likely to be seriously abused online. I have already stated that this is not to say that boys and men do not experience abuse online, but the fact is that women and girls are several times more likely to be abused. This is not an argument against free speech; people online should be allowed to debate and disagree with each other, but discussions can and should be had without the threat of rape or harassment.
Again, the Government will argue that the triple-shield approach to combating legal but harmful content online will sufficiently protect women and girls, but this is not the case. Instead of removing this content, the Bill’s user empowerment tools—much debated already—expect women to shield themselves from seeing it. All this does is allow misogynistic and often violent conversations to continue without women knowing about them, the result of which can be extremely dangerous. A victim of domestic abuse could indeed block the user threatening to kill them, but that does not stop that user from continuing to post the threats he is making, or even posting photos of the victim’s front door. Instead of protecting the victim, these tools potentially leave them even more vulnerable to real-life harms. Likewise, the triple shield will rely too heavily on platforms setting their own terms and conditions. We have just heard my noble friend the Minister using this argument in the last group, but the issue is that the platforms can choose to water down their terms and conditions, and Ofcom is then left without recourse.
I turn to what a violence against women and girls code of practice would mean. It could greatly reduce all the dangers I have just laid out. It would ensure that services regularly review their algorithms to stop misogyny going viral, and that moderators are taught, for example, how to recognise different forms of online violence against women and girls, including forms of tech abuse. Ofcom has described codes of practice as
“key documents, which set out the steps services can take to comply with their duties”.
Services can choose to take an alternative approach to complying with their duties, provided that it is consistent with the duties in the Bill, but codes will provide a clear route to compliance, and Ofcom envisages that many services will therefore take advantage of them.
The value of having a code lies in its systemic approach. It does not focus on individual items of content—which is one of the worries that have been expressed, both in this House and outside—but it focuses the platforms’ minds on the whole environment in which the tech-enabled abuse occurs. The code of practice would make the UK the first country in the world to hold tech companies specifically to account on tackling violence against women and girls. It would also make the Online Safety Bill more future-proof, because it would provide a proactive and agile route for identifying and problem-solving new forms of online VAWG as they emerge, rather than delaying action until the creation of a new criminal offence when the next relevant piece of primary legislation comes along.
I finish by saying that throughout the Bill’s journey through Parliament, we have debated whether it sufficiently protects women and girls. The objective answer is, “No, it does not”, but there appears to be a real reluctance to accept this as fact. Instead of just agreeing to disagree on this topic, we instead have an opportunity here to protect millions of women and girls online with a violence against women and girls code of practice. So I ask noble Lords to support this critical amendment, not just for the sake of themselves, their daughters, their sisters or their wives but for the sake of the millions of women whose names we will never know but who will be grateful that we stood on their side on the issue of gendered online violence. I beg to move.
The Lord Bishop of Gloucester

My Lords, I have added my name to Amendments 97 and 304, and I wholeheartedly agree with all that the noble Baroness, Lady Morgan, said by means of her excellent introduction. I look forward to hearing what the noble Baroness, Lady Kidron, has to say as she continues to bring her wisdom to the Bill.

Let me say from the outset, if it has not been said strongly enough already, that violence against women and girls is an abomination. If we allow a culture of intimidation and misogyny to exist online, it will spill over to offline experiences. According to research by Refuge, almost one in five domestic abuse survivors who experienced abuse or harassment from their partner or former partner via social media said they felt afraid of being attacked or being subjected to physical violence as a result. Some 15% felt that their physical safety was more at risk, and 5% felt more at risk of so-called honour-based violence. Shockingly, according to Amnesty International, 41% of women who experienced online abuse or harassment said that these experiences made them feel that their physical safety was threatened.

Throughout all our debates, I hesitate to differentiate between the real and virtual worlds, because that is simply not how we live our lives. Interactions online are informed by face-to-face interactions, and vice versa. To think otherwise is to misunderstand the lived experience of the majority—particularly, dare I say, the younger generations. As Anglican Bishop for HM Prisons, I recognise the complexity of people’s lives and the need to tackle attitudes underpinning behaviours. Tackling the root causes of offending should always be a priority; there is potential for much harm later down the line if we ignore warning signs of hatred and misogyny. Research conducted by Refuge found that one in three women has experienced online abuse or harassment perpetrated on social media or another online platform at some point in their lives. That figure rises to almost two in three, or 62%, among young women. This must change.

We did some important work in your Lordships’ House during the passage of the Domestic Abuse Act to ensure that all people, including women and girls, are safe on our streets and in their homes. As has been said, introducing a code of practice as outlined will help the Government meet their aim of making the UK the safest place in the world to be online, and it will align with the Government’s wider priority to tackle violence against women and girls as a strategic policing requirement. Other strategic policing requirements, including terrorism and child sexual exploitation, have online codes of practice, so surely it follows that there should be one for VAWG to ensure that the Bill aligns with the Government’s position elsewhere and that there is not a gap left online.

I know the Government care deeply about tackling violence against women and girls, and I believe they have listened to some concerns raised by the sector. The inclusion of the domestic abuse and victims’ commissioners as statutory consultees is welcomed, as is the Government’s amendment to recognise controlling and coercive behaviour as a priority offence. However, without this code of conduct, the Bill will fail to address duties of care in relation to preventing domestic abuse and violence against women and girls in a holistic and encompassing way. The onus should not be on women and girls to remove themselves from online spaces; we have seen plenty of that in physical spaces over the years. Women and girls must be free to appropriately express themselves online and offline without fear of harassment. We must do all we can to prevent expressions of misogyny from transforming into violent actions.

Baroness Kidron (CB)

My Lords, I have added my name to Amendments 97 and 304, and I support the others in this group. It seems to be a singular failure of any version of an Online Safety Bill if it does not set itself the task of tackling known harms—harms that are experienced daily and for which we have a phenomenal amount of evidence. I will not repeat the statistics given in the excellent speeches made by the noble Baroness, Lady Morgan, and the right reverend Prelate, but will instead add two observations.

19:00
I am of the generation that saw women break gender barriers and glass ceilings. We were ourselves the beneficiaries of the previous generation, which both intellectually and practically pushed the cause of gender equality. Many of us are also employers, mothers, aunties or friends of a generation in which the majority of the young favour a more gender-equal world. Yet we have seen online the amplification of those who hold a profound resentment of what I would characterise as hard-won and much-needed progress. The vileness and violence of their fury is fuelled by the rapacious appetite of algorithms that profit from engagement, which has allowed gender-based detractors, haters and abusers to normalise misogyny to such a degree that rape threats and threats of violence are trotted out against women for the mildest of perceived infractions. In the case of an academic colleague of mine, her crime—for which she received rape threats—was offering a course in women’s studies.
If the price of having a voice online continues to be that you have to withstand a supercharged swarm of abuse, then for many women it is simply not worth it. As the noble Baroness, Lady Morgan, said, they are effectively silenced. This sadly extends to girls, who repeatedly say that, as the statistics persistently show, they are put off any kind of public role and even expressing a view because they fear both judgment on how they look and abuse for what they say. How heartbreaking it is that the organising technology of our time is so regressive for women and girls that the gains we have made in our lifetime are being denied them. This is why I believe that Parliament must be clear that an environment in which women and girls are routinely silenced or singled out for abuse is not okay.
My second observation is slightly counterintuitive, because I so wish for these amendments to find their way into the Bill. I have a sense of disquiet that there will be no similar consideration of other exposed or vulnerable groups that are less well represented in Parliament. I therefore want to take this opportunity to say once again that, in our debates on previous groups, we have discussed amendments that would commit the Bill to the Equality Act 2010, with the expectation that companies will adhere to UK law across all groups with protected characteristics, including those who may have more than one protected characteristic. We should take note that—this point has been made in a number of briefings—women with disabilities and women of mixed or global-majority heritage come in for double, sometimes triple, doses of abuse. In saying that, I wish to acknowledge the amendments in the name of the noble Baroness, Lady Fox, which make it clear that the discussion of protected characteristics does not in and of itself constitute harm. I very much agree with her on that.
Perhaps this is a good moment to remember that the Bill is proposed as a systems and processes regime—no single piece of content will be at stake but rather, if a company is amplifying and promoting at scale behaviours that hound women and girls out of the public space, Ofcom will have the tools to deal with it. At the risk of repeating myself, these are not open spaces; they are 100% engineered and largely privately owned. I fail to see another environment in which it is either normal or lawful to swarm women with abuse and threat.
On our first day in Committee, the Minister said in his response to the amendment in the name of the noble Lord, Lord Stevenson, that the Government are very clear on the Bill’s purposes. Among the list of purposes that he gave was
“to protect people who face disproportionate harm online including, for instance, because of their sex or their ethnicity or because they are disabled”.—[Official Report, 19/4/23; col. 724.]
I ask the Minister to make this Bill come true on that purpose.
Baroness Healy of Primrose Hill (Lab)

My Lords, I strongly support Amendment 97 in the name of the noble Baroness, Lady Morgan. We must strengthen the Bill by imposing an obligation on Ofcom to develop and issue a code of practice on violence against women and girls. This will empower Ofcom and guide services in meeting their duties in regard to women and girls, and encourage them to recognise the many manifestations of online violence that disproportionately affect women and girls.

Refuge, the domestic abuse charity, has seen a growing number of cases of technology-facilitated domestic abuse in recent years. As other noble Lords have said, this tech abuse can take many forms but social media is a particularly powerful weapon for perpetrators, with one in three women experiencing online abuse, rising to almost two in three among young women. Yet the tech companies have been too slow to respond. Many survivors are left waiting weeks or months for a response when they report abusive content, if indeed they receive one at all. It appears that too many services do not understand the risks and nature of VAWG. They do not take complaints seriously and they think that this abuse does not breach community standards. A new code would address this with recommended measures and best practice on the appropriate prevention of and response to violence against women and girls. It would also support the delivery of existing duties set out in the Bill, such as those on illegal content, user empowerment and child safety.

I hope the Minister can accept this amendment, as it would be in keeping with other government policies, such as in the strategic policing requirement, which requires police forces to treat violence against women and girls as a national threat. Adding this code would help to meet the Government’s national and international commitments to tackling online VAWG, such as the tackling VAWG strategy and the Global Partnership for Action on Gender-Based Online Harassment and Abuse.

The Online Safety Bill is a chance to act on tackling the completely unacceptable levels of abuse of women and girls by making it clear through Ofcom that companies need to take this matter seriously and make systemic changes to the design and operation of their services to address VAWG. It would allow Ofcom to add this as a priority, as mandated in the Bill, rather than leave it as an optional extra to be tackled at a later date. The work to produce this code has already been done thanks to Refuge and other charities and academics who have produced a model that is freely available and has been shared with Ofcom. So it is not an extra burden and does not need to delay the implementation of the Bill; in fact, it will greatly aid Ofcom.

The Government are to be congratulated on their amendment to include controlling or coercive behaviour in their list of priority offences. I would like to congratulate them further if they can accept this valuable Amendment 97.

Baroness Stowell of Beeston (Con)

My Lords, I start by commending my noble friend Lady Morgan on her clear introduction to this group of amendments. I also commend the noble Baroness, Lady Kidron, on her powerful speech.

From those who have spoken so far, we have a clear picture of the widespread nature of some of the abuse and offences that women experience when they go online. I note from what my noble friend Lady Morgan said that there is widespread support from a range of organisations outside the Committee for this group of amendments. She also made an important and powerful point about the potential chilling effect of this kind of activity on women, including women in public life, being able to exercise their right to freedom of expression.

I feel it is important for me to make it clear that—this is an obvious thing—I very much support tough legal and criminal sanctions against any perpetrator of violence or sexual abuse against women. I really do understand and support this, and hear the scale of the problem that is being outlined in this group of amendments.

Mine is a dissenting voice, in that I am not persuaded by the proposed solution to the problem that has been described. I will not take up a lot of the Committee’s time, but any noble Lords who were in the House when we were discussing a group of amendments on another piece of legislation earlier this year may remember that I spoke against making misogyny a hate crime. The reason why I did that then is similar, in that I feel somewhat nervous about introducing a code of conduct which is directly relevant to women. I do not like the idea of trying to address some of these serious problems by separating women from men. Although I know it is not the intention of a code such as this or any such measures, I feel that it perpetuates a sense of division between men and women. I just do not like the idea that we live in a society where we try to address problems by isolating or categorising ourselves into different groups of people, emphasising the sense of weakness and being victims of any kind of attack or offence from another group, and assuming that everybody who is in the other group will be a perpetrator of some kind of attack, criticism or violence against us.

My view is that, in a world where we see some of this serious activity happening, we should do more to support young men and boys to understand the proper expectations of them. When we get to the groups of amendments on pornography and what more we can do to prevent children’s access to it, I will be much more sympathetic. Forgive me if this sounds like motherhood and apple pie, but I want us to try to generate a society where basic standards of behaviour and social norms are shared between men and women, young and old. I lament how so much of this has broken down, and a lot of the problems we see in society are the fault of political and—dare I say it?—religious leaders not doing more to promote some of those social norms in the past. As I said, I do not want us to respond to the situation we are in by perpetuating more divisions.

I look forward to hearing what my noble friend the Minister has to say, but I am nervous about the solution proposed in the amendments.

Baroness Fox of Buckley (Non-Afl)

My Lords, it gives me great pleasure to follow the noble Baroness, Lady Stowell of Beeston, not least because she became a dissenting voice, and I was dreading that I might be the only one.

First, I think it important that we establish that those of us who have spent decades fighting violence against women and girls are not complacent about it. The question is whether the physical violence we describe in the Bill is the same as the abuse being described in the amendments. I worry about conflating online incivility, abuse and vile things said with physical violence, as is sometimes done.

I note that Refuge, an organisation I have a great deal of respect for, suggested that the user empowerment duties, which place the burden on women users to filter their own online experience, were the same as asking women to take control of their own safety and protect themselves offline from violence. I thought that was unfair, because user empowerment duties and deciding what you filter out can be women using their agency.

19:15
In that context, I wanted to probe Amendment 104, in the name of the noble Lord, Lord Stevenson of Balmacara, and whether affording, as it states, a higher standard of protection to women and girls could actually be disempowering. I am always concerned about discriminatory special treatment for women. I worry that we end up presenting or describing young women as particularly vulnerable due to their sex, overemphasising victimhood. That, in and of itself, can undermine women’s confidence rather than encouraging them to see themselves as strong, resilient and so on. I was especially worried about Amendment 171, in the name of the noble Baroness, Lady Featherstone, but she is not here to move it. It states that content that promotes or perpetuates violence against women and girls should be removed, and the users removed if they are identified as creating or even disseminating it.
What always worries me about this Bill is that, because we want to improve the world—I know that is the joint enterprise here—we could get carried away. Whereas in law we have a very narrow definition of what incitement to violence is, here we are not very specific about it. I worry about the low threshold whereby somebody who creates a horrible sexist meme will be punished, but then someone who just retweets it will be treated in the same way. I want to be able to have a conversation about why that is the wrong thing to do. I am worried, as I always am, about censorship and so on.
The statistics are a bit confusing on this. There are often-repeated statistics, but you need to dig down, look at academic papers and talk to people who work in this field. An academic paper from Oxford Internet Surveys from August 2021 notes that the exceptional prevalence of online hostility to women is largely based on anecdotal experience, and that a closer look across the British population, contrary to conventional wisdom, shows that women are not necessarily more likely than men to experience hateful speech online. It also notes, however, that there is empirical evidence to show that subgroups of women—it cites journalists and politicians—can be disproportionately targeted. I will not go through all the statistics, although I do have them here, but there were very small differences of 2% or 3%, and in some instances young men were more likely to suffer abuse.
Ofcom’s Online Nation report of June 2022 says that women are more negatively affected by trolling and so on, but again, I worry about the gender point. I am worried that what we are trying to tackle here is what we all know to be a toxic and nasty political atmosphere in society that is reflected online. We all know what we are talking about. People bandy around the most vile labels; we see that regularly on social media, and, if you are a woman, it does take on this nasty, sexist side. It is incredibly unpleasant.
We also have to recognise that that is a broad, moral, social and cultural problem, which I hope that we will try to counter. I am no men’s rights sympathiser by any stretch, but I also noticed that the Ofcom report said that young men are more likely than women to have experienced seeing potentially harmful behaviour or content online in the four weeks before the survey response—64% of men as against 60% of women. Threats of physical violence were more prevalent with boys than women—16% versus 11%—while sexual violence was the other way around. So I want us to have a sense of proportion and say that no one on the receiving end of this harmful, nasty trolling should be ignored, regardless of their sex.
It is also interesting—I will finish with this—that UK women are avid users of social media platforms, spending more than a quarter of their waking hours online and around half an hour more than men each day. We say that the online world is inhospitable to women, but there are a lot of them on it regardless, so we need a sense of perspective. Ofcom’s report makes the point that, often,
“the benefits of being online outweigh the risks”.
More women than men disagree with that; 63% agree with it compared with 71% of males. But we need a sense of perspective here because, actually, the majority of young men and women like being online. Sometimes it can give young women a sense of solidarity and sisterhood. All the surveys that I have read say that female participants feel less able to share their opinions and use their voice online, and we have to ask why.
The majority of young people who I work with are women. They say the reason they dare not speak online is not misogynist hate speech but cancel culture. They are walking on eggshells, as there are so many things that you are not allowed to say. Your Lordships will also be aware that, in gender-critical circles, for example, a lot of misogynistic hate speech is directed at women who are not toeing the line on a particular orthodoxy today. I do not want a remedy for that toxicity, with women not being sure if they can speak out because of cancel culture, if that remedy introduces more censorious trends.
Baroness Burt of Solihull (LD)

My Lords, it is an honour to follow some very knowledgeable speakers, whose knowledge is much greater than mine. Nevertheless, I feel the importance of this debate above and beyond any other that I can think of on this Bill. However, I do not agree with the noble Baroness, Lady Stowell of Beeston, who said that women should not be victims. They are not victims; they are being victimised. We need a code—the code that is being proposed—not for the victims but for the tech companies, because of the many diverse strands of abuse that women face online. This is an enabler for the tech companies to get their heads around what is coming and to understand it a lot better. It is a helpful tool, not a mollycoddling tool at all.

I strongly agree with everything else, apart from what was said by the noble Baroness, Lady Fox, which I will come on to in a second. I and, I am sure, other noble Lords in this Chamber have had many hundreds of emails from concerned people, ordinary people, who nevertheless understand the importance of what this code of practice will achieve today. I speak for them, as well as the others who have supported this particularly important amendment.

As their supporters have pointed out in this Chamber, Amendments 97 and 304 are the top priority for the Domestic Abuse Commissioner, who believes that, if they do not pass, the Bill will not go far enough to prevent and respond effectively to domestic abuse online. The noble Baroness, Lady Fox, spoke about the need to keep a sense of proportion, but online abuse is everywhere. According to the charity Refuge—I think this was mentioned earlier—over one-third of women and 62% of young women have experienced online abuse and harassment.

I am sure that the Minister is already aware that a sector coalition of experts on violence against women and girls put together the code of practice that we are discussing today. It is needed, as I have said, because of the many strands of abuse that are perpetuated online. However, compliance with the new terms of service to protect women and girls is not cheap. In cost-driven organisations, the temptation will be to relax standards as time goes by, which we have seen in the past in the cases of Facebook and Twitter. The operators’ feet must be held to the fire with this new, stricter and more comprehensive code. People’s lives depend on it.

In his remarks, can the Minister indicate whether the Government are at least willing to look at this code? Otherwise, can he explain how the Government will ensure that domestic abuse and its component offences are understood by providers in the round?

Baroness Gohir (CB)

My Lords, I rise to support the noble Baronesses, Lady Morgan and Lady Kidron, the right reverend Prelate the Bishop of Gloucester and the noble Lord, Lord Knight of Weymouth, on Amendment 97 to Clause 36 to mandate Ofcom to produce codes of practice, so that these influential online platforms have to respond adequately to tackle online violence against women and girls.

Why should we care about these codes of practice being in the Bill? Not doing so will have far-reaching consequences, of which we have already heard many examples. First, it will threaten progress on gender equality. As the world moves to an increasingly digital future, with more and more connections and conversations moving online, women must have the same opportunity as men to be a part of the online world and benefit from being in the online space.

Secondly, it will threaten the free speech of women. The voices of women are more likely to be suppressed. Because of abuse, women are more likely to reduce their social media activity or even leave social media platforms altogether.

Thirdly, we will be failing in our obligation to protect the human rights of women. Every woman has the right to be and feel safe online. I thank the noble Baroness, Lady Kidron, who highlighted online abuse due to intersecting identities. The noble Baroness, Lady Stowell, mentioned that this could cause divisions; there are divisions already, given the level of online abuse faced by women. Until we get an equal and just society, additional measures are needed. I know that the noble Baroness, Lady Fox, is worried about censorship, but women also have the right to feel safe online and offline. The noble Baroness is worried about whether this is a proportionate response, but I do feel that it is.

Relying on tech companies to self-regulate on VAWG is a bad idea. At present, the overwhelming majority of tech companies are led by men and their employees are most likely to be men, who will be taking decisions on content and on moderating that content. So we are relying on the judgment of a sector that itself needs to be more inclusive of women and is known for not sufficiently tackling the online abuse of women and girls.

I will give a personal example. Someone did not like what I said on Twitter and posted a message with a picture of a noose, which I found threatening. I reported that and got a response to say that it did not violate terms and conditions, so it remained online.

The culture at these tech companies was illustrated a few years ago when employees at Google walked out to protest against sexism. Also, research a couple of years ago by a campaign group called Global Witness found that Facebook used biased algorithms that promoted career and gender stereotypes, resulting in particular job roles being seen by men and others being seen by women. We know that other algorithms are even more harmful and sinister and promote hatred and misogyny. So relying on a sector that may not care much about women’s rights or their well-being to do the right thing is not going to work. Introducing the VAWG code in the Bill will help to make tech companies adequately investigate and respond to reports of abuse and take a proactive approach to minimise and prevent the risk of abuse taking place in the first instance.

19:30
I also add my support to the noble Baroness’s Amendment 304, to Clause 207, so that the definition of violence against women and girls is in line with the gold standard framework provided by the Istanbul convention, which was ratified by the UK Government.
The Bill is a unique opportunity to protect women and girls online, so I commend the Government on introducing it to Parliament, but they could do much more to combat violence against women and girls. If they do not, inaction now will result in us sleepwalking into a culture of normalising despicable behaviour towards women even more openly. If online perpetrators of abuse are not tackled robustly now, it will embolden them further to escalate and gaslight victims. The Government must send perpetrators a message that there is no place to hide—not even online. If the Government want the UK to be the safest place in the world to be online, they should agree to these amendments.
Lord Clement-Jones (LD)

My Lords, I will be very brief. My noble friend has very eloquently expressed the support on these Benches for these amendments, and I am very grateful to the noble Baroness, Lady Morgan, for setting out the case so extremely convincingly, along with many other noble Lords. It is, as the noble Baroness, Lady Kidron, said, about the prevention of the normalisation of misogyny. As my noble friend said, it is for the tech companies to prevent that.

The big problem is that the Government have got themselves into a position where—except in the case of children—the Bill now deals essentially only with illegal harms, so you have to pick off these harms one by one and create illegality. That is why we had the debate in the last group about other kinds of harm. This is another harm that we are debating, precisely because the Government amended the Bill in the Commons in the way that they did. But it does not make this any less important. It is quite clear; we have talked about terms of service, user empowerment tools, lack of enforcement, lack of compliance and all the issues relating to these harms. The use of the expression “chilling effect”—I think by the noble Baroness, Lady Kidron—and then the examples given by the noble Baroness, Lady Gohir, absolutely illustrated that. We are talking about the impact on freedom of expression.

I am afraid that, once again, I do not agree with the noble Baroness, Lady Fox. Why do I find myself disagreeing on such a frequent basis? I think the harms override the other aspects that the noble Baroness was talking about.

We have heard about the lack of a proper complaints system—we are back to complaints again. These themes keep coming through, and until the Government see that there are flaws in the Bill, I do not think we are going to make a great deal more progress. The figure given was that more than half of domestic abuse survivors did not receive a response from the platform to their report of domestic abuse-related content. That kind of example demonstrates that we absolutely need this code.

There is an absolutely convincing case for what one of our speakers, probably the right reverend Prelate, called a holistic way of dealing with these abuses. That is what we need, and that is why we need this code.

Baroness Merron (Lab)

My Lords, the amendments in this group, which I am pleased to speak to now, shine a very bright light on the fact that there is no equality when it comes to abuse. We are not starting at a level playing field. This is probably the only place that I do not want to level up; I want to level down. This is not about ensuring that men can be abused as much as women; it is about the very core of what the Bill is about, which is to make this country the safest online space in the world. That is something that unites us all, but we do not start in the same place.

I thank all noble Lords for their very considered contributions in unpicking all the issues and giving evidence about why we do not have that level playing field. Like other noble Lords, I am grateful to the noble Baroness, Lady Morgan, for her thorough, illustrative and realistic introduction to this group of amendments, which really framed it today. Of course, the noble Baroness is supported in signing the amendment by the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester and my noble friend Lord Knight.

The requirement in Amendment 97 that there should be an Ofcom code of practice is recognition that many aspects of online violence disproportionately affect women and girls. I think we always need to come back to that point, because nothing in this debate has taken me away from that very clear and fundamental point. Let us remind ourselves that the online face of violence against women and girls includes—this is not a full list—cyberflashing, abusive pile-ons, incel gangs and cyberstalking, to name but a few. Again, we are not starting from a very simple point; we are talking about an evolving online face of violence against women and girls, and the Bill needs to keep pace.

I associate myself with the words of the noble Baroness, Lady Morgan, and other noble Lords in thanking and appreciating the groups and individuals who have already done the work, and who have—if I might use the term—an oven-ready code of practice available to the Minister, should he wish to avail himself of it. I share the comments about the lack of logic. If violence against women and girls is part of the strategic policing requirement, and the Home Secretary says that dealing with violence against women and girls is a priority, why is this not part of a joined-up government approach? That is what we should now be seeing in the Bill. I am sure the Minister will want to address that question.

The right reverend Prelate the Bishop of Gloucester rightly said that abuse is abuse. Whether it is online or offline, it makes no difference. The positive emphasis should be that women and girls should be able to express themselves online as they should be able to offline. Again, that is a basic underlying point of these amendments.

I listened very closely to the words of the noble Baroness, Lady Stowell. I understand her nervousness, and she is absolutely right to bring before the Committee that perhaps a code of conduct of this nature could allow and encourage, to quote her, division. The challenge we have is that women and girls have a different level of experience. We all want to see higher standards of behaviour, as the noble Baroness referred to—I know that we will come back to that later. However, I cannot see how not having a code of conduct will assist those higher standards because the proposed code of conduct simply acknowledges the reality, which is that women and girls are 27 times more likely to be abused online than men are. I want to put on record that this is not about emphasising division, saying that it is all right to abuse men or, as the noble Baroness gives me the opportunity to say, saying that all men are somehow responsible—far from it. As ever, this is something that unites us all: the tackling of abuse wherever it takes place.

Amendment 104 in the name of my noble friend Lord Stevenson proposes an important change to Schedule 4: that

“women and girls, and vulnerable adults”

should have a higher standard of protection than other adult users. That amendment is there because the Bill is silent on these groups. There is no mention of them, so we seek to change this through that amendment.

To return to the issue of women and girls, two-thirds of women who report abuse to internet companies do not feel heard. Three-quarters of women change their behaviour after receiving online abuse. I absolutely agree with the noble Baroness, Lady Kidron, who made the point that the Bill currently assumes that there is no interconnection between different safety duties where somebody has more than one protected characteristic, because it misses reality. One has only to talk to Jewish women to know that, although anti-Semitism knows no bounds, if you are a Jewish woman then there is no doubt that you will be the subject of far greater abuse than your male counterpart. Similarly, women of colour are one-third more likely to be mentioned in abusive tweets than white women. Again, there is no level playing field.

As it stands, the Bill puts an onus on women and girls to protect themselves from online violence and abuse. The problem, as has been mentioned many times, is that user empowerment tools do not incentivise services to address the design of their service, which may be facilitating the spread of violence against women and girls. That point was very well made by my noble friend Lady Healy and the noble Baroness, Lady Gohir, in their contributions.

On the question of the current response to violence against women and girls from tech companies, an investigation by the Times identified that platforms such as TikTok and YouTube are profiting from a wave of misogynist content, with a range of self-styled “self-help gurus”, inspired by the likes of Andrew Tate, offering advice to their millions of followers, encouraging men and boys, in the way described by the noble Baroness, Lady Stowell, to engage with women and girls in such a way that amounts to pure abuse, instructing boys and men to ensure that women and girls in their lives are “compliant”, “insecure” and “well-behaved”. This is not the kind of online space that we seek.

I hope that the Minister, if he cannot accept the amendments, will give his assurance that he can understand what is behind them and the need for action, and will reflect and come back to your Lordships’ House in a way that can allow us to level down, rather than level up, the amount of abuse that is aimed at men but also, in this case in particular, at women and girls.

19:45
Lord Parkinson of Whitley Bay (Con)

My Lords, protecting women and girls is a priority for His Majesty’s Government, at home, on our streets and online. This Bill will provide vital protections for women and girls, ensuring that companies take action to improve their safety online and protect their freedom of expression so that they can continue to play their part online, as well as offline, in our society.

On Amendments 97 and 304, tabled by my noble friend Lady Morgan of Cotes, I want to be unequivocal: all service providers must understand the systemic risks facing women and girls through their illegal content and child safety risk assessments. They must then put in place measures that manage and mitigate these risks. Ofcom’s codes of practice will set out how companies can comply with their duties in the Bill.

I assure noble Lords that the codes will cover protections against violence against women and girls. In accordance with the safety duties, the codes will set out how companies should tackle illegal content and activity confronting women and girls online. This includes the several crimes that we have listed as priority offences, which we know are predominantly perpetrated against women and girls. The codes will also cover how companies should tackle harmful online behaviour and content towards girls.

Companies will be required to implement systems and processes designed to prevent people encountering priority illegal content and minimise the length of time for which any such content is present. In addition, Ofcom will be required to carry out broad consultation when drafting codes of practice to harness expert opinions on how companies can address the most serious online risks, including those facing women and girls. Many of the examples that noble Lords gave in their speeches are indeed reprehensible. The noble Baroness, Lady Kidron, talked about rape threats and threats of violence. These, of course, are examples of priority illegal content and companies will have to remove and prevent them.

My noble friend Lady Morgan suggested that the Bill misses out the specific course of conduct that offences in this area can have. Clause 9 contains provisions to ensure that services

“mitigate and manage the risk of the service being used for the commission or facilitation of”

an offence. This would capture patterns of behaviour. In addition, Schedule 7 contains several course of conduct offences, including controlling and coercive behaviour, and harassment. The codes will set out how companies must tackle these offences where this content contributes to a course of conduct that might lead to these offences.

To ensure that women’s and girls’ voices are heard in all this, the Bill will, as the right reverend Prelate noted, make it a statutory requirement for Ofcom to consult the Victims’ Commissioner and the domestic abuse commissioner about the formation of the codes of practice. As outlined, the existing illegal content, child safety and child sexual abuse and exploitation codes will already cover protections for women and girls. Creating a separate code dealing specifically with violence against women and girls would mean transposing or duplicating measures from these in a separate code.

In its recent communication to your Lordships, Ofcom stated that it will be consulting quickly on the draft illegal content and child sexual abuse and exploitation codes, and has been clear that it has already started the preparatory work for these. If Ofcom were required to create a separate code on violence against women and girls this preparatory work would need to be revised, with the inevitable consequence of slowing down the implementation of these vital protections.

An additional stand-alone code would also be duplicative and could cause problems with interpretation and uncertainty for Ofcom and providers. Linked to this, the simpler the approach to the codes, the higher the rates of compliance are likely to be. The more codes there are covering specific single duties, the more complicated it will be for providers, which will have to refer to multiple different codes, and the harder for businesses to put in place the right protections for users. Noble Lords have said repeatedly that this is a complex Bill, and this is an area where I suggest we should not make it more complex still.

As the Bill is currently drafted, Ofcom is able to draft codes in a way that addresses a range of interrelated risks affecting different groups of users, such as people affected in more than one way; a number of noble Lords dealt with that in their contributions. For example, combining the measures that companies can take to tackle illegal content targeting women and girls with the measures they can take to tackle racist abuse online could ensure a more comprehensive and effective approach that recognises the point, which a number of noble Lords made, that people with more than one protected characteristic under the Equality Act may be at compound risk of harm. If the Bill stipulated that Ofcom separate the offences that disproportionately affect women and girls from other offences in Schedule 7, this comprehensive approach to tackling violence against women and girls online could be lost.

Baroness Morgan of Cotes (Con)

Could my noble friend the Minister confirm something? I am getting rather confused by what he is saying. Is it the case that there will be just one mega code of practice to deal with every single problem, or will there be lots of different codes of practice to deal with the problems? I am sure the tech platforms will have sufficient people to be able to deal with them. My understanding is that Ofcom said that, while the Bill might not mandate a code of practice on violence against women and girls, it would in due course be happy to look at it. Is that right, or is my noble friend the Minister saying that Ofcom will never produce a code of practice on violence against women and girls?

Lord Parkinson of Whitley Bay (Con)

It is up to Ofcom to decide how to set the codes out. What I am saying is that the codes deal with specific categories of threat or problem—illegal content, child safety content, child sexual abuse and exploitation—rather than with specific audiences who are affected by these sorts of problems. There is a circularity in some of the criticism here: we are told both that we are not reflecting the fact that there are compound harms to people affected in more than one way, and that we should have a separate code dealing with one particular group of people because of one particular characteristic. We are trying to deal with categories of harm that we know disproportionately affect women and girls but which of course could affect others, as the noble Baroness rightly noted. Amendment 304—

Baroness Merron (Lab)

I thank the Minister for giving way. There is a bit of a problem that I would like to raise. I think the Minister is saying that there should not be a code of practice in respect of violence against women and girls. That sounds to me like there will be no code of practice in this one particular area, which seems rather harsh. It also does not tackle the issue on which I thought we were all agreed, even if we do not agree the way forward: namely, that women and girls are disproportionately affected. If it is indeed the case that the Minister feels that way, how does he suggest this is dealt with?

Lord Parkinson of Whitley Bay (Con)

There are no codes designed for Jewish people, Muslim people or people of colour, even though we know that they are disproportionately affected by some of these harms as well. The approach taken is to tackle the problems, which we know disproportionately affect all of those groups of people and many more, by focusing on the harms rather than the recipients of the harm.

Baroness Morgan of Cotes (Con)

Can I check something with my noble friend? This is where the illogicality is. The Government have mandated in the Strategic Policing Requirement that violence against women and girls is a national threat. I do not disagree with him that other groups of people will absolutely suffer abuse and online violence, but the Government themselves have said that violence against women and girls is a national threat. I understand that my noble friend has the speaking notes, the brief and everything else, so I am not sure how far we will get on this tonight, but, given the Home Office stance on it, I think that to say that this is not a specific threat would be a mistake.

Lord Parkinson of Whitley Bay (Con)

With respect, I do not think that that is a perfect comparison. The Strategic Policing Requirement is an operational policing document intended for chief constables and police and crime commissioners in the important work that they do, to make sure they have due regard for national threats as identified by the Home Secretary. It is not something designed for commercial technology companies. The approach we are taking in the Bill is to address harms that can affect all people and which we know disproportionately affect women and girls, and harms that we know disproportionately affect other groups of people as well.

We have made changes to the Bill: the consultation with the Victims’ Commissioner and the domestic abuse commissioner, the introduction of specific offences to deal with cyber-flashing and other sorts of particular harms, which we know disproportionately affect women and girls. We are taking an approach throughout the work of the Bill to reflect those harms and to deal with them. Because of that, respectfully, I do not think we need a specific code of practice for any particular group of people, however large and however disproportionately they are affected. I will say a bit more about our approach. I have said throughout, including at Second Reading, and my right honourable friend the Secretary of State has been very clear in another place as well, that the voices of women and girls have been heard very strongly and have influenced the approach that we have taken in the Bill. I am very happy to keep talking to noble Lords about it, but I do not think that the code my noble friend sets out is the right way to go about solving this issue.

Amendment 304 seeks to adopt the Istanbul convention definition of violence against women and girls. The Government are already compliant with the Convention on Preventing and Combating Violence Against Women and Domestic Violence, which was ratified last year. However, we are unable to include the convention’s definition of violence against women and girls in the Bill, as it extends to legal content and activity that is not in scope of the Bill as drafted. Using that definition would therefore cause legal uncertainty for companies. It would not be appropriate for the Government to require companies to remove legal content accessed by adults who choose to access it. Instead, as noble Lords know, the Government have brought in new duties to improve services’ transparency and accountability.

Amendment 104 in the name of the noble Lord, Lord Stevenson, seeks to require user-to-user services to provide a higher standard of protection for women, girls and vulnerable adults than for other adults. The Bill already places duties on service providers and Ofcom to prioritise responding to content and activity that presents the highest risk of harm to users. This includes users who are particularly affected by online abuse, such as women, girls and vulnerable adults. In overseeing the framework, Ofcom must ensure that there are adequate protections for those who are most vulnerable to harm online. In doing so, Ofcom will be guided by its existing duties under the Communications Act, which requires it to have regard when performing its duties to the

“vulnerability of children and of others whose circumstances appear to OFCOM to put them in need of special protection”.

The Bill also amends Ofcom’s general duties under the Communications Act to require that Ofcom, when carrying out its functions, considers the risks that all members of the public face online, and ensures that they are adequately protected from harm. This will form part of Ofcom’s principal duty and will apply to the way that Ofcom performs all its functions, including when producing codes of practice.

In addition, providers’ illegal content and child safety risk assessment duties, as well as Ofcom’s sectoral risk assessment duties, require them to understand the risk of harm to users on their services. In doing so, they must consider the user base. This will ensure that services identify any specific risks facing women, girls or other vulnerable groups of people.

As I have mentioned, the Bill will require companies to prioritise responding to online activity that poses the greatest risk of harm, including where this is linked to vulnerability. Vulnerability is very broad. The threshold at which somebody may arguably become vulnerable is subjective, context-dependent and may be temporary. The majority of UK adult users could be defined as vulnerable in particular circumstances. In practice, this would be very challenging for Ofcom to interpret if it were added to the safety objectives in this way. The existing approach allows greater flexibility so that companies and Ofcom can focus on the greatest threats to different groups of people at any given time. This allows the Bill to adapt to and keep pace with changing risk patterns that may affect different groups of people.

20:00
I am conscious that, for understandable reasons, the noble Baroness, Lady Featherstone, is not here to speak to her Amendment 171. I will touch on it briefly in her absence. It relates to platform transparency about providers’ approaches to content that promotes violence against women, girls and vulnerable groups. Her amendment raises an extremely important issue. It is essential that the Bill increase transparency about the abuse of women, girls and vulnerable people online. This is why the Bill already empowers Ofcom to request extensive information about illegal and harmful online abuse of women, girls and vulnerable groups in transparency reports. Accepting this amendment would therefore be duplicative. I hope that the noble Baroness would agree that it is not needed.
I hope that I have given some reassurance that the Bill covers the sort of violent content about which noble Lords are rightly concerned, no matter against whom it is directed. The Government recognise that many of these offences and much of the violence does disproportionately affect women and girls in the way that has been correctly pointed out. We have reflected this in the way in which the Bill and its regulatory framework are to operate. I am happy to keep discussing this matter with my noble friend. She is right that it is important, but I hope that, at this juncture, she will be content to withdraw her amendment.
Baroness Morgan of Cotes (Con)

My Lords, I thank my noble friend for his response, which I will come on to in a moment. This has been a fascinating debate. Yet again, it has gone to the heart of some of the issues with this Bill. I thank all noble Lords who have spoken, even though I did not quite agree with everything they said. It is good that this Committee shows just how seriously it takes the issue of violence against women and girls. I particularly thank all those who are watching from outside. This issue is important to so many.

There is no time to run through all the brilliant contributions that have been made. I thank the right reverend Prelate the Bishop of Gloucester for her support. She made the point that, these days, for most people, there is no online/offline distinction. To answer one of the points made, we sometimes see violence or abuse that starts online and then translates into the offline world. Teachers in particular are saying that this is the sort of misogyny they are seeing in classrooms.

As the noble Baroness, Lady Merron, said, the onus should not be on women and girls to remove themselves from online spaces. I also thank the noble Baronesses, Lady Kidron and Lady Gohir, for their support. The noble Baroness, Lady Kidron, talked about the toxic levels of online violence. Parliament needs to say that this is not okay—which means that we will carry on with this debate.

I thank the noble Baroness, Lady Healy, for her contribution. She illustrated so well why a code of practice is needed. We can obviously discuss this, but I do not think the Minister is quite right about the user reporting element. For example, we have heard various women speaking out who have had multiple rape threats. At the moment, the platforms require each one to be reported individually. They do not put them together and then work out the scale of threat against a particular user. I am afraid that this sort of threat would not breach the illegal content threshold and therefore would not be caught by the Bill, despite what the Minister has been saying.

I agree with my noble friend Lady Stowell. I would love to see basic standards—I think she called it “civility”—and a better society between men and women. One of the things that attracts me most to the code of practice is that it seeks cultural and societal changes—not just whack-a-mole with individual offences but changing the whole online culture to build a healthier and better society.

I will certainly take up the Minister’s offer of a meeting. His response was disappointing. There was no logic to it at all. He said that the voice of women and girls is heard throughout the Bill. How can this be the case when the very phrase “women and girls” is not mentioned in 262 pages? Some 100,000 people outside this Chamber disagree with his position on the need for there to be a code of practice. I say to both Ofcom and the tech platforms that a code has been drafted. Please do not say, “It was not drafted here, so we are not going to adopt it”. It is there, the work has been done and it can easily be taken on.

I would be delighted to discuss the definition in Amendment 304 with my noble friend. I will of course withdraw my amendment tonight, but we will certainly return to this on Report.

Amendment 97 withdrawn.
Amendment 98 not moved.
House resumed. Committee to begin again not before 8.45 pm.

Online Safety Bill

Committee (7th Day) (Continued)
20:45
Clause 36: Codes of practice about duties
Amendment 98A
Moved by
98A: Clause 36, page 37, line 29, at end insert—
“(ga) the Children’s Commissioner,
(gb) the Commissioner for Victims and Witnesses,
(gc) the Domestic Abuse Commissioner,”
Member’s explanatory statement
This amendment provides that in preparing a draft code of practice or amendments of a code of practice under clause 36, OFCOM must also consult the Children’s Commissioner, the Commissioner for Victims and Witnesses and the Domestic Abuse Commissioner.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the amendments in this group consider the role of collaboration and consultation in Ofcom’s approach. The proposals range in their intent, and include mandating additional roles for young people in the framework, adding new formal consultation requirements, and creating powers for Ofcom to work with other organisations.

I reassure noble Lords that the Government take these concerns extremely seriously. That is why the Bill already places the voices of experts, users and victims at the heart of the regime it establishes. In fact, the intent of many of the amendments in this group will already be delivered. That includes Ofcom working with others effectively to deliver the legislation, consulting on draft codes of practice, and having the ability to designate specific regulatory functions to other bodies where appropriate. Where we can strengthen the voices of users, victims or experts—without undermining existing processes, reducing the regulator’s independence or causing unacceptable delays—the Government are open to this. That is why I am moving the amendment today. However, as we have heard in previous debates, this is already a complex regulatory framework, and there is a widespread desire for it to be implemented quickly. Therefore, it is right that we guard against creating additional or redundant requirements which could complicate the regime or unduly delay implementation.

I turn to the amendment in my name. As noble Lords know, Ofcom will develop codes of practice setting out recommended measures for companies to fulfil their duties under the Bill. When developing those codes, Ofcom must consult various persons and organisations who have specific knowledge or expertise related to online harms. This process will ensure that the voices of users, experts and others are reflected in the codes, and, in turn, that the codes contain appropriate and effective measures.

One of the most important goals of the Bill, as noble Lords have heard me say many times, is the protection of children. It is also critical that the codes reflect the views of victims of online abuse, as well as the expertise of those who have experience in managing them. Therefore, the government amendment seeks to name the Commissioner for Victims and Witnesses, the domestic abuse commissioner and the Children’s Commissioner as statutory consultees under Clause 36(6). Ofcom will be required to consult those commissioners when preparing or amending a code of practice.

Listing these commissioners as statutory consultees will guarantee that the voices of victims and those who are disproportionately affected by online abuse are represented when developing codes of practice. This includes, in particular, women and girls—following on from our debate on the previous group—as well as children and vulnerable adults. This will ensure that Ofcom’s codes propose specific and targeted measures, such as on illegal content and content that is harmful to children, that platforms can take to address abuse effectively. I therefore hope that noble Lords will accept it.

I will say a little about some of the other amendments in this group before noble Lords speak to them. I look forward to hearing how they introduce them.

I appreciate the intent of Amendment 220E, tabled by the noble Lord, Lord Clement-Jones, and my noble friend Lady Morgan of Cotes, to address the seriousness of the issue of child sexual exploitation and abuse online. This amendment would allow Ofcom to designate an expert body to tackle such content. Where appropriate and effective, Section 1(7) of the Communications Act 2003 and Part II of the Deregulation and Contracting Out Act 1994 provide a route for Ofcom to enter into co-regulatory arrangements under the online safety framework.

There are a number of organisations that could play a role in the future regulatory framework, given their significant experience and expertise on the complex and important issue of tackling online child sexual exploitation and abuse. This includes the Internet Watch Foundation, which plays a pivotal role in the detection and removal of child sexual abuse material and provides vital tools to support its members to detect this abhorrent content.

A key difference from the proposed amendment is that the existing route, following consultation with Ofcom, requires an order to be made by a Minister, under the Deregulation and Contracting Out Act 1994, before Ofcom can authorise a co-regulator to carry out regulatory functions. Allowing Ofcom to do this without the need for secondary legislation would let it bypass existing parliamentary scrutiny when contracting out its regulatory functions under the Bill. By contrast, the existing route requires a draft order to be laid before, and approved by, each House of Parliament.

The noble Lord, Lord Knight of Weymouth, tabled Amendment 226, which proposes a child user advocacy body. The Government are committed to the interests of child users being represented and protected, but we believe that this is already achieved through the Bill’s existing provisions. There is a wealth of experienced and committed representative groups who are engaged with the regulatory framework. As the regulator, Ofcom will also continue to consult widely with a range of interested parties to ensure that it understands the experience of, and risks affecting, children online. Further placing children’s experiences at the centre of the framework, the Government’s Amendment 98A would name the Children’s Commissioner as a statutory consultee for the codes of practice. The child user advocacy body proposed in the noble Lord’s Amendment 226 may duplicate the Children’s Commissioner’s existing functions, which would create uncertainty, undermining the effectiveness of the Children’s Commissioner’s Office. The Government are confident that the Children’s Commissioner will effectively use her statutory duties and powers to understand children’s experiences of the digital realm.

For the reasons that I have set out, I am confident that children’s voices will be placed at the heart of the regime, with their interests defended and advocated for by the regulator, the Children’s Commissioner, and through ongoing engagement with civil society groups.

Similarly, Amendment 256, tabled by the noble Baroness, Lady Bennett of Manor Castle, seeks to require that any Ofcom advisory committees established by direction from the Secretary of State under Clause 155 include at least two young people. Ofcom has considerable experience in setting up committees of this kind. While there is nothing that would preclude committee membership from including at least two young people, predetermining the composition of any committee would not give Ofcom the necessary space and independence to run a transparent process. We feel that candidates should be appointed based on relevant understanding and technical knowledge of the issue in question. Where a committee is examining issues with specific relevance to the interests of children, we would expect its membership to reflect that appropriately.

I turn to the statement of strategic priorities. As I hope noble Lords will agree, future changes in technology will likely have an impact on the experience people have online, including the nature of online harms. As provided for by Clause 153, the statement of strategic priorities will allow the Secretary of State to set out a statement of the Government’s strategic priorities in relation to online safety. This ensures that the Government can respond to changes in the digital and regulatory landscape at a strategic level. A similar power exists for telecommunications, the management of the radio spectrum, and postal services.

Amendments 251 to 253 seek to place additional requirements on the preparation of a statement before it can be designated. I reassure noble Lords that the existing consultation and parliamentary approval requirements allow for an extensive process before a statement can be designated. These amendments would introduce unnecessary steps and would move beyond the existing precedent in the Communications Act when making such a statement for telecommunications, the management of the radio spectrum, and postal services.

Finally, Amendment 284, tabled by the noble Lord, Lord Stevenson of Balmacara, proposes changes to Clause 171 on Ofcom’s guidance on illegal content judgments. Ofcom is already required to consult persons it considers appropriate before producing or revising the guidance, which could include the groups named in the noble Lord’s amendment. This amendment would oblige Ofcom to run formal public consultations on the illegal content guidance at two different stages: first, at a formative stage in the drafting process, and then before publishing a final version. These consultations would have to be repeated before subsequently amending or updating the guidance in any way. This would impose duplicative, time-consuming requirements on the regulator to consult, which are excessive when looking at other comparable guidance. The proposed consultations under this amendment would ultimately delay the publication of this instrumental guidance.

I will listen to what noble Lords have to say when they speak to their amendments, but these are the reasons why, upon first reading, we are unpersuaded by them.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for opening the group. This is a slightly novel procedure: he has rebutted our arguments before we have even had a chance to put them—what is new? I hope he has another speech lined up for the end which accepts some of the arguments we put, to demonstrate that he has listened to all the arguments made in the debate.

I will speak mainly to Amendments 220E and 226, ahead of the noble Baroness, Lady Kidron; I understand that the noble Baroness, Lady Merron, will be speaking at the end of the group to Amendment 226. I am very grateful to the noble Baroness, Lady Morgan, for signing Amendment 220E; I know she feels very strongly about this issue as well.

As the Minister said, this amendment is designed to confirm the IWF’s role as the recognised body for dealing with notice and take-down procedures for child sexual abuse imagery in the UK and to ensure that its long experience and expertise continues to be put to best use. In our view, any delay in establishing the roles and responsibilities of expert organisations such as the IWF in working with Ofcom under the new regulatory regime risks leaving a vacuum in which the risks to children from this hateful form of abuse will only increase. I heard what the Minister said about the parliamentary procedure, but that is a much slower procedure than a designation by Ofcom, so I think that is going to be one of the bones of contention between us.

The Internet Watch Foundation is a co-regulatory body with over 25 years of experience working with the internet industry, law enforcement and government to prevent the uploading of, and to disable public access to, known child sexual abuse material, and to secure the removal of indecent images and videos of children from the internet. The organisation has had considerable success over the last 25 years, despite the problem appearing to be getting worse globally.

In 2022, it succeeded in removing a record 255,000 web pages containing child sexual abuse. It has also amassed a database of more than 1.6 million unique hashes of child sexual abuse material, which has been provided to the internet industry to keep its platforms free from such material. In 2020, the Independent Inquiry into Child Sexual Abuse concluded that, in the UK, the IWF

“sits at the heart of the national response to combating the proliferation of indecent images of children. It is an organisation that deserves to be acknowledged publicly as a vital part of how, and why, comparatively little child sexual abuse material is hosted in the UK”.

21:00
Our Joint Committee on the draft Online Safety Bill stated back in December 2021—that seems a long time ago now—that the IWF
“made a persuasive case that they should be co-designated by Ofcom to regulate CSEA content, an argument supported by the CPS and by TalkTalk”
in their evidence to the committee. We also concluded that
“it would have been beneficial to see more information … about how such co-designation might be achieved or even”
the processes for and
“a timeline on when such decisions will be taken”.
In fact, the then Minister, Chris Philp MP, backed up the importance of the IWF to the regulatory landscape during the Public Bill Committee in the Commons in June 2022, stating that
“agencies such as the Internet Watch Foundation and others should co-operate closely. There is already very good working between the Internet Watch Foundation, law enforcement and others—they seem to be well networked together and co-operating closely”.—[Official Report, Commons, Online Safety Bill Committee, 14/6/22; col. 421.]
But I think that is not utterly clear, and that is the reason for putting this forward.
It is clear that Ofcom does not see regulation as a solo effort; of course, that cannot be the case for something as big and important as this. There is already so much expertise in other organisations such as the IWF. I hope that this amendment will tease out some of the progress being made—or not—in terms of co-designation and the relationship between Ofcom and organisations such as the IWF. What consultation has been carried out with the IWF to date? Are arrangements going to be reached with it in terms of CSEA and other similar material?
Three years have elapsed since we saw the first draft of this Bill, so we have had plenty of time to have those discussions and consultations. When can we expect decisions to be made? What steps will be taken to help organisations such as the IWF to scale up if they are called on to assist Ofcom? What conclusions has Ofcom reached on the role for the IWF in the new regulatory landscape? I very much hope that we will get a more positive response to some of those questions at the end of this debate.
I want to come on to Amendment 226. My noble friend Lady Tyler is not well, unfortunately, and cannot be here today, but she is a strong supporter of this amendment, which deals with child protection and mental health. Mental health is the number one reason why children call Childline. We know that online abuse and harm can have a profound impact on children’s mental health. Online child abuse has a devastating impact on children. It profoundly affects their mental health and their experience of relationships, with many continuing to experience the impact long after the period of abuse. Exposure to legal but harmful content can have an incredibly damaging impact on children’s mental and emotional well-being. Some children told Childline that they were experiencing
“anxiety, intrusive thoughts, low self-esteem and trouble sleeping”
as a result of seeing that harmful material online. That is why we support this amendment to introduce an advocacy body for children.
I heard what the Minister had to say about that: that it was duplicative, and so on. I am sure there are some jolly good reasons that the Minister can adumbrate, but I do not necessarily believe that the Children’s Commissioner believes that that is their specific role in these circumstances. That is why Young Minds and the Molly Rose Foundation, established by Ian Russell after the death of his daughter Molly, have come together with the NSPCC to support the introduction of an advocacy body for children. So I hope that the Minister will be rather more positive than he started out being on Amendment 226.
Baroness Kidron (CB)

I shall speak briefly to Amendments 220E and 226. On Amendment 220E, I say simply that nothing should be left to chance on IWF. No warm words or good intentions replace the requirement for its work to be seamlessly and formally integrated into the OSB regime. I put on record the extraordinary debt that every one of us owes to those who work on the front line of child sexual abuse. I know from my own work how the images linger. We should all do all that we can to support those who spend every day chasing down predators and finding and supporting victims and survivors. I very much hope that, in his response, the Minister will agree to sit down with the IWF, colleagues from Ofcom and the noble Lords who tabled the amendment and commit to finding a language that will give the IWF the reassurance it craves.

More generally, I raise the issue of why the Government did not accept the pre-legislative committee’s recommendation that the Bill provide a framework for how bodies will work together, including when and how they will share powers, take joint action and conduct joint investigations. I have a lot of sympathy with the Digital Regulation Co-operation Forum in its desire to remain an informal body, but that is quite different from the formal power to share sensitive data and undertake joint action or investigation.

If history repeats itself, enforcing the law will take many years and very likely will cost a great deal of money and require expertise that it makes no sense for Ofcom to reproduce. It seems obvious that it should have the power to co-designate efficiently and effectively. I was listening to the Minister when he set out his amendment, and he went through the process that Ofcom has, but it did not seem to quite meet the “efficiently and effectively” model. I should be interested to know why there is not more emphasis on co-regulation in general and the sharing of powers in particular.

In the spirit of the evening, I turn to Amendment 226 and make some comments before the noble Baroness, Lady Merron, has outlined the amendment, so I beg her indulgence on that. I want to support and credit the NSPCC for its work in gathering the entire child rights community behind it. Selfishly, I have my own early warning system, in the form of the 5Rights youth advisory group, made up of the GYG—gifted young generation—from Gravesend. It tells us frequently exactly what it does not like and does like about the online world. More importantly, it reveals very early on in our interactions the features or language associated with emerging harms.

Because of the lateness of the hour, I will not give your Lordships all the quotes, but capturing and reflecting children’s insight and voices is a key part of future-proofing. It allows us to anticipate new harms and, where new features pop up that are having a positive or negative impact, it is quite normal to ask the user groups how they are experiencing those features and that language themselves. That is quite normal across all consumer groups so, if this is a children’s Bill, why are children not included in this way?

In the work that I do with companies, they often ask what emerging trends we are seeing. For example, they actually say that they will accept any additions to the list of search words that can lead to self-harm content, or “What do we know about the emoji language that is happening now that was not happening last week?” I am always surprised at their surprise when we say that a particular feature is causing anxiety for children. Rather than being hostile, their response is almost always, “I have never thought about it that way before”. That is the value of consulting your consumer—in this case, children.

I acknowledge what the Minister said and I welcome the statutory consultees—the Children’s Commissioner, the Victims’ Commissioner and so on. It is a very welcome addition, but this role is narrowly focused on the codes of practice at the very start of the regulatory cycle, rather than the regulatory system as a whole. It does not include the wider experience of those organisations that deal with children in real time, such as South West Grid for Learning or the NSPCC, or the research work done by 5Rights, academics across the university sector or research partners such as Revealing Reality—ongoing, real-time information and understanding of children’s perspectives on their experience.

Likewise, super-complaints and Ofcom’s enforcement powers are what happen after harms take place. I believe that we are all united in thinking that the real objective of the exercise is to prevent harm. That means including children’s voices not only because it is their right but because, so often in my experience, they know exactly what needs to happen, if only we would listen.

Baroness Morgan of Cotes (Con)

My Lords, I speak mainly to support Amendment 220E, to which I have added my name. I am also delighted to support government Amendment 98A and I entirely agree with the statutory consultees listed there. I will make a brief contribution to support the noble Lord, Lord Clement-Jones, who introduced Amendment 220E. I thank the chief executive at Ofcom for the discussions that we have had on the designation and the Minister for the reply he sent me on this issue.

I have a slight feeling that we are dancing on the head of a pin a little, as we know that we have an absolutely world-leading organisation in the form of the Internet Watch Foundation. It plays an internationally respected role in tackling child sexual abuse. We should be, and I think we are, very proud to have it in the United Kingdom, and the Government want to enhance and further build on the best practice that we have seen. As we have already heard and all know, this Bill has been a very long time in coming, and organisations such as the Internet Watch Foundation are pretty certain, because of their expertise and the good work they have done already, that they should be designated.

However, without knowing that and without having a strong steer of support from the Minister, it becomes harder for them to operate, as they are in a vacuum. Things such as funding and partnership working become harder and harder, as well, which is what I mean by dancing on the head of a pin—unless the Minister says something about another organisation.

The IWF was founded in 1996, when 18% of the world’s known child sexual abuse material was hosted in the UK. Today that figure is less than 1% and has been since 2003, thanks to the work of the IWF’s analysts and the partnership approach the IWF takes. We should say thank you to those who are at the front line of the grimmest material imaginable and who do this to keep our internet safe.

I mentioned, in the previous group, the IWF’s research on girls. It says that it has seen more girls appearing in this type of imagery. Girls now appear in 96% of the imagery it removes from the internet, up almost 30 percentage points from a decade ago. That is another good reason why we want the internet and online services to be a safe place for women and girls. As I say, any delay in establishing the role and responsibility of an expert organisation such as the IWF in working with Ofcom risks leaving a vacuum in which it is children who are put at risk. That is really the ultimate point; if there is a vacuum left and the IWF is not certain about its position, then the children who are harmed most by this awful material are the ones who are not being protected. I do not think that is what anybody wants to see, however much we might argue about whether an order should be passed by Parliament or by Ofcom.

21:15
The Minister has already mentioned the Internet Watch Foundation tonight. It would be very helpful if he could set out whether he does expect Ofcom to work with the IWF to deliver its obligations under the Bill and whether he will commit to discussing this with both Ofcom and the IWF, due to the impact this uncertainty is having on the IWF. I am sure, as the noble Baroness, Lady Kidron, suggested, that I and the noble Lord, Lord Clement-Jones, would be happy to attend any such meeting as well, if that would help, but I do not think that we need necessarily to hold hands with these two extremely grown-up organisations in meeting the Minister. Obviously, the IWF has to have some sort of commitment in relation to its funding, to help it, as we have said, to scale up to offering services to the 24,000 companies that could be attempting to access its services to comply with the Bill.
I think it is fair to say this is a probing amendment. It is an important amendment, and I think that without that confirmation or indication from the Minister, we do leave a vacuum in which this hateful material proliferates, and children become ever more vulnerable.
Baroness Healy of Primrose Hill (Lab)

My Lords, I have put my name to Amendment 220E, in order that the Internet Watch Foundation is duly recognised for its work and there is clarity about its role in the future regulatory landscape. So far, no role has been agreed with Ofcom. This could have a detrimental effect on the vital work of the IWF in combating the proliferation of child sexual abuse images and videos online.

As other noble Lords have said, the work of the IWF in taking down the vile web pages depicting the sexual abuse of children is vital to stemming this tide of abuse on the internet. Worryingly, self-generated images of children are on the rise, and now account for two-thirds of the content that is removed by the IWF. Seven to 10 year-olds are now the fastest-growing age group appearing in these images. As the noble Baroness, Lady Morgan, said, girls appear in 96% of the imagery the IWF removes from the internet—up almost 30 percentage points from a decade ago. The abuse of boys is also on the rise. In the past year the IWF has seen a 138% increase in images involving them, often linked to sexual extortion.

This amendment attempts to clarify the future role of the IWF, so we await the response from the Government with interest. Tackling this growing plague of child sexual abuse is going to take all the expert knowledge that can be found, and Ofcom would be strengthened in its work by formally co-operating with the IWF.

Briefly, I also support Amendment 226, in the name of my noble friend Lord Knight, to require Ofcom to establish an advocacy body for children. I raised this at Second Reading, as I believe that children must be represented not just by the Children's Commissioner, welcome though that is, but by a body that actively includes them, not just speaks for them. The role of the English Children’s Commissioner as a statutory consultee is not an alternative to advocacy. The commissioner’s role is narrowly focused on inputting into the codes of practice at the start of the regulatory cycle, not as an ongoing provider of children’s experiences online.

This body would need to be UK-wide, with dedicated staff to listen consistently to children through research projects and helplines. It would be able to monitor new harms and rapidly identify emerging risks through its direct, continual contact with children. It would assist Ofcom and strengthen its ability to keep up with new technology. It would be able to share insights with the regulator to ensure that decisions are based on a live understanding of children’s safety online, and it would act as an early warning system. Establishing such a body would increase trust in Ofcom’s ability to stay in touch with those it needs to serve, and the body would be recognised by the tech companies as a voice for children.

There must be a mechanism that ensures children’s interests and safety online are promoted and protected. Children have a right to participate fully in the digital world and have their voices heard, so that tech companies can design services that allow them to participate in an age-appropriate way to access education, friendships and entertainment in a safe environment, as the Bill intends. One in three internet users is a child; their rights cannot be ignored.

Baroness Benjamin (LD)

My Lords, I support Amendment 220E in the names of my noble friend Lord Clement-Jones and the noble Baroness, Lady Morgan of Cotes. I also support the amendments in the name of the noble Baroness, Lady Kidron, and Amendment 226, which deals with children’s mental health.

I have spoken on numerous occasions in this place about the devastating impact child sexual abuse has and how it robs children of their childhoods. I am sure everyone here will agree that every child has the right to a childhood free of sexual exploitation and abuse. That is why I am so passionate about protecting children from some of the most shocking and obscene harm you can imagine. In the case of this amendment and child sexual abuse, we are specifically talking about crimes against children.

The Internet Watch Foundation is an organisation I am proud to support as one of its parliamentary champions, because its staff are guardian angels who work tirelessly beyond the call of duty to protect children. In April 2019, I was honoured to host the IWF’s annual report here in Parliament. I was profoundly shocked and horrified by what I heard that day and in my continued interactions with the IWF.

That day, the IWF told the story of a little girl called Olivia. Olivia was just three years old when IWF analysts saw her. She was a little girl, with big green eyes and golden-brown hair. She was photographed and filmed in a domestic setting. This could have been any bedroom or bathroom anywhere in the country, anywhere in the world. Sadly, it was her home and she was with somebody she trusted. She was in the hands of someone who should have been there to look after her and nurture her. Instead, she was subjected to the most appalling sexual abuse over several years.

The team at the IWF have seen Olivia grow up in these images. They have seen her be repeatedly raped, and the torture she was subjected to. They tracked how often they saw Olivia’s images and videos over a three-month period. She appeared 347 times. On average that is five times every single day. In three in five of those images, she was being raped and tortured. Her imagery has also been identified as being distributed on commercial websites, where people are profiting from this appalling abuse.

I am happy to say that Olivia, thankfully, was rescued by law enforcement in 2013 at the age of eight, five years after her abuse began. Her physical abuse ended when the man who stole her childhood was imprisoned, but those images remain in circulation to this day. We know from speaking with adult survivors who have experienced revictimisation that it is the mental torture that blights lives and has an impact on their ability to leave their abuse in the past.

This Bill is supposed to help children like Olivia—and believe you me, she is just one of many, many children. The scale of these images in circulation is deeply worrying. In 2022, the IWF removed a record 255,000 web pages containing images of the sexual abuse and exploitation of children. Each one of these web pages can contain anything from one individual image of a child like Olivia to thousands.

The IWF’s work is vital in removing millions of images from the internet each and every year, day in, day out. These guardian angels work tirelessly to stop this. As its CEO Susie Hargreaves often tells me, the world would be a much better place if the IWF did not have to exist, because this would mean that children were not suffering from sexual abuse or having such content spread online. But sadly, there is a need for the IWF. In fact, it is absolutely vital to the online safety landscape in the UK. As yet, this Bill does not go anywhere near far enough in recognising the important contribution the IWF has to make in implementing this legislation.

Victims of sexual abuse rely upon the IWF to protect and fight for them, safe in the knowledge that the IWF is on their side, working tirelessly to prevent millions of people potentially stumbling across their images and videos. This amendment is so important because, as my noble friend said, any delay to establishing the roles and responsibilities of organisations like the IWF in working with Ofcom under the regulatory regime risks leaving a vacuum in which the risks to children like Olivia will only increase further.

I urge the Government to take action to ensure that Ofcom clarifies how it intends to work with the Internet Watch Foundation and acknowledges the important part it has to play. We are months away from the Bill finally receiving Royal Assent. For children like Olivia, it cannot come soon enough; but it will not work as well as it could without the involvement of the Internet Watch Foundation. Let us make sure that we get this right and safeguard our children by accepting this amendment.

Baroness Merron (Lab)

My Lords, as the noble Lord, Lord Clement-Jones, observed, we have approached this group in an interesting way, having already heard the Minister’s feelings about the amendment. As I always think, forewarned is forearmed—so at least we know our starting point, and I am sure the Minister has listened to the debate and is reflecting.

I start by welcoming government Amendment 98A. We certainly value the work of various commissioners, but this amendment does not provide for what I would call a comprehensive duty. It needs supplementing by other approaches, and these are provided for by the amendments in this group.

The noble Baronesses, Lady Morgan, Lady Benjamin and Lady Kidron, and my noble friend Lady Healy and others, have made a powerful case for the Internet Watch Foundation being the designated expert body. I too wish to pay tribute to those who tackle online child sexual exploitation and abuse. They do it on behalf of all of us, but most notably the children they seek to protect, and their work is nothing short of an act of service.

Amendment 220E is in the names of the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Morgan. Despite the recommendation of the Joint Committee that scrutinised the draft Bill in December 2021 that the Internet Watch Foundation’s role in the future regulatory landscape be clearly identified within a set timescale, such a role would need to be agreed with Ofcom, and that has not yet happened. Perhaps the Minister can give the Committee some sense of where he feels Ofcom is in respect of the inclusion of the Internet Watch Foundation.

21:30
This amendment is designed to confirm IWF’s role as the recognised body dealing with notice and takedown procedures of child sexual abuse imagery in the UK. It would ensure that its long experience and expertise continue to be put to best use, which I am sure the Committee wants to see.
We have already heard evidence that the severity of child sexual abuse is becoming ever worse. In the past two years, category A child sexual abuse—the most severe form, including the rape and sexual torture of babies and toddlers—has doubled. In 2022 there were 51,369 web pages containing category A material, compared with 25,050 web pages in 2020. As the noble Baroness, Lady Morgan, reminded us in linking to the earlier debate, girls are now appearing in 96% of the imagery removed from the internet—up almost 30 percentage points from a decade ago. The abuse of boys is also on the rise. In the last year, the IWF has seen a 138% increase in images involving boys, often linked to sexual extortion, and often self-referred by children through the world-leading Report Remove portal.
I turn to Amendment 226 in the name of my noble friend Lord Knight. The number one message here is that user advocacy is the missing piece in this regulatory regime. Ofcom will be navigating extremely complex child safeguarding issues. These will develop at a rapid pace, yet there are no formal mechanisms in the Bill for Ofcom to gather insight directly from children and from safeguarding experts. These amendments seek to put this right.
As many noble Lords have said, it is essential that we hear from children and young people. Of course, I understand that the role of statutory consultees, such as the Children’s Commissioner, is extremely important. As noble Lords have already indicated, this role is narrowly focused on the codes of practice, at the very start of the regulatory cycle, rather than providing ongoing, real-time information and an understanding of children’s experiences, as we are trying to do. These will show how effectively—or not—platforms are complying with their new duties.
I am grateful to leading children’s charities—5Rights, Barnardo’s and YoungMinds—as well as to the organisations that have been set up by bereaved parents campaigning for child safety online. These include the Molly Rose Foundation and the Breck Foundation. All these groups have joined with the NSPCC to call for the introduction of an advocacy body for children.
The noble Baroness, Lady Kidron, spoke from her own experience through 5Rights of the value of the contribution of children in tackling online harms. I hope that the Minister will draw on this extremely valuable experience when thinking about what Amendment 226 seeks to do.
My other point in this regard is that we know from other regulated settings that user advocacy is extremely effective. Those user advocacy organisations assist regulators rather than getting in the way, and they assist them by identifying areas of harm or user detriment. Importantly, they can provide something of a counterbalance to lobbying from regulated sectors. For example, Citizens Advice is a consumer advocate in the essential service market, while Transport Focus speaks up for transport users.
Given the scale of online harm and the vulnerability of child users, I put it to the Minister that similar advocacy arrangements should be introduced in the Bill, otherwise we are going to be in the invidious position of children at risk of sexual abuse online receiving fewer statutory user advocacy protections than, say, users of a Post Office or passengers on a bus. I am sure that is not the intention, but we must make sure that we are not in that place.
All the amendments in the name of my noble friend Lord Stevenson would ensure that relevant voices were heard. There are repeated debates in your Lordships’ House about the need to consult and to get the right people around the table. All these amendments seek to do that, so I hope the Minister will take them in the spirit in which they are intended, which is to strengthen the arm of those who seek to protect children.
Lord Parkinson of Whitley Bay (Con)

I am grateful to noble Lords who have spoken to their amendments. Regarding the lead amendment in the group, I take on board what was said about its inevitable pre-emption—something that I know all too well from when the boot is on the other foot in other groups. However, I have listened to the points that were made and will of course respond.

I join the tributes rightly paid by noble Lords to the Internet Watch Foundation. The Government value its work extremely highly and would support the use of its expertise and experience in helping to deliver the aims of the Bill. My noble friend Lady Morgan of Cotes is right to say that it is on the front line of this work and to remind us that it encounters some of the most horrific and abhorrent content in the darkest recesses of the internet—something that I know well from my time as an adviser at the Home Office, as well as in this capacity now. Both the Secretary of State for Science, Innovation and Technology and the Minister for Safeguarding at the Home Office recently provided a foreword to the foundation’s latest annual report.

Clearly, Ofcom will need a wide variety of relationships with a range of organisations. Ofcom has been in regular contact with the Internet Watch Foundation, recognising its significant role in supporting the objectives of online safety regulation, and is discussing a range of options to make the best use of its expertise. The noble Lord, Lord Clement-Jones, asked what consultation and discussion is being had. We support the continuation of that engagement and are in discussions with the Internet Watch Foundation ourselves to understand how it envisages its role in supporting the regulatory environment. No decisions have been made on the co-regulatory role that other organisations may play. The Government will work with Ofcom to understand where it may be effective and beneficial to delivering the regulatory framework. Careful assessment of the governance, independence and funding of any organisations would be needed if co-designation were to be considered, but officials from the Department for Science, Innovation and Technology and the Home Office are in discussion with the IWF in relation to a memorandum of understanding to support ongoing collaboration.

On the designation of regulatory functions, we are satisfied that the powers under the Communications Act and the Deregulation and Contracting Out Act are sufficient, should other bodies be required to deliver specific aspects of the regime, so we do not see a need to amend the Bill in the way the amendments in this group suggest. Those Acts require an order from the Minister in order to designate any functions. The Minister has to consult Ofcom before making the order, and that is the mechanism that was used to appoint the Advertising Standards Authority to regulate broadcast advertising. It remains appropriate for Parliament to scrutinise the delivery of these important regulatory functions; accordingly, such an order cannot be made unless a draft of the order has been laid before, and approved by a resolution of, each House of Parliament.

The noble Baroness, Lady Merron, dwelt on the decision not to include a child user advocacy body. As I said in my earlier remarks and in relation to other groups, the Bill ensures that children’s voices will be heard and that what they say will be acted on. Ofcom will have statutory duties requiring it to understand the opinions and experiences of users, including children, by consulting widely when developing its codes. Ofcom will also have the flexibility to establish other mechanisms for conducting research about users’ experience. Additionally, the super-complaints process, which we began discussing this afternoon, will make sure that entities, including those that represent the interests of children, will have their voices heard and will help Ofcom recognise and eliminate systemic failings.

We are also naming the Children’s Commissioner as a statutory consultee for Ofcom in developing its codes of practice. A further new child user advocacy body would encroach on the wider statutory functions of the Children’s Commissioner. Both bodies would have similar responsibilities and powers: to represent, protect and promote the interests of child users of regulated services, and to be a statutory consultee for the drafting and amendment of Ofcom’s codes of practice.

The noble Baroness, Lady Kidron, when discussing the input of the Children’s Commissioner into the regulatory framework, suggested that it was a here and now issue. She is right: the Children’s Commissioner will represent children’s views to Ofcom in preparing the codes of practice to ensure that they are fully informing the regime, but the commissioner will also have a continuing role, as they will be the statutory consultee on any later amendments to the codes of practice relating to children. That will ensure that they can engage in the ongoing development of the regime and can continue to feed in insights on emerging risks identified through the commissioner’s statutory duty to understand children’s experiences.

The Bill further ensures that new harms and risks to children are proactively identified by requiring that Ofcom make arrangements to undertake research about users’ experiences on regulated services. This will build on the significant amount of research that Ofcom already does, better to understand children’s experience online, particularly their experiences of online harms.

The super-complaints process will enable an eligible entity to make a complaint to Ofcom regarding a provider or providers that cause significant harm or significant adverse impact on users, including children. This will help Ofcom to recognise and eliminate systemic failings, including those relating to children, and will ensure that children’s views and voices continue to inform the regime as it is developed.

The Bill will also require that Ofcom undertake consumer consultation in relation to regulated services. This will, in effect, expand the scope of the Communications Consumer Panel to online safety matters, and will ensure that the needs of users, including children, are at the heart of Ofcom’s regulatory approach.

I draw noble Lords’ attention to the provisions of Clause 141(2), which states that Ofcom must make arrangements to ascertain

“the experiences of United Kingdom users of regulated services”.

That, of course, includes children. I hope, therefore, that noble Lords will be satisfied that the voices of children are indeed being listened to throughout the operation of the Bill. However, we have high regard for the work of the Internet Watch Foundation. I hope that noble Lords will be willing not to press their amendments—after the noble Lord, Lord Clement-Jones, asks his question.

Lord Clement-Jones (LD)

My Lords, I am in the slightly strange position of not having moved the amendment, but I want to quickly respond. I was slightly encouraged by what the Minister said about Ofcom having been in regular contact with the IWF. I am not sure that that is mutual; maybe Ofcom thinks it is in good contact with the IWF, but I am not sure the IWF thinks it is in good contact with Ofcom. However, I am encouraged that the Minister at least thinks that that has been the case and that he is encouraging consultation and the continuation of engagement.

21:45
What the Minister absolutely has not said is what role he envisages for the IWF in all this in terms of possible co-regulation. He went swiftly on to talking about other co-regulation arrangements which had not yet been determined. He did not actually say what kinds of co-regulation arrangements might be appropriate with the IWF. Then he went on—he keeps going swiftly on—to say that an MoU with the Home Office is in prospect. This sounds very interesting, but no doubt we will want to know more about precisely what is envisaged as part of the MoU.
Of course, the Minister’s conclusion is that there is no need to amend the Bill because we have parliamentary procedure and draft regulations, and because Ofcom will be consulted and so on. That is all fair enough. As the noble Baroness, Lady Morgan, said, this is a probing amendment. If we have done something to speed up the process, all well and good, but the essence of this is to get something cracking. I hope that the debate has at least had some impact, but this is still incredibly vague. We do not really know what role is envisaged for the IWF. The Minister has heard around the Committee the regard in which the IWF is held. He has heard our desire to see that it is an integral part of the protection process and the procedures under the Bill, and to see it work with Ofcom.
Baroness Stowell of Beeston (Con)

My Lords, I have held back from contributing to this group, because it is not really my group and I have not really engaged in the topic at all. I have been waiting to see whether somebody who is engaged in it would raise this point.

The one factual piece of information that has not been raised in the debate is the fact that the IWF, of which I too am a huge admirer—I have huge respect for the work that it does; it does some fantastic work—is a registered charity. That may lead to some very proper questions about what its role should be in any kind of formal relationship with a statutory regulator. I noticed that no one is proposing in any of these amendments that it be put on the face of the Bill, which, searching back into my previous roles and experience, I think I am right to say would not be proper anyway. But even in the context of whatever role it might have along with Ofcom, I genuinely urge the DCMS and/or Ofcom to ensure that they consult the Charity Commission, not just the IWF, on what is being proposed so that it is compatible with its other legal obligations as a charity.

Lord Stevenson of Balmacara (Lab)

If I might follow up that comment, I agree entirely with what the noble Baroness has just said. It is very tricky for an independent charity to have the sort of relationship addressed in some of the language in this debate. Before the Minister completes his comments and sits down again, I ask him: if Ofcom were to negotiate a contracted set of duties with the IWF—indeed, with many other charities or others who are interested in assisting with this important work—could that be done directly by Ofcom, with powers that it already has? I think I am right to say that it would not require parliamentary approval. It is only if we are talking about co-regulation, which again raises other issues, that we would go through a process that requires what sounded like the affirmative procedure—the one that was used, for example, with the Advertising Standards Authority. Is that right?

Lord Parkinson of Whitley Bay (Con)

Yes, I think it is. I am happy to confirm that in writing. I am grateful to my noble friend Lady Stowell, who of course is a former chairman of the Charity Commission, for making the point about the charitable status of the foundation. I should clarify that officials from the Department for Science, Innovation and Technology and the Home Office are in touch with the IWF about its role.

Speedily moving on, Ofcom is in discussion with the foundation about a memorandum of understanding. I hope that reassures the noble Lord, Lord Clement-Jones, that they are in reciprocal contact. Obviously, I cannot pre-empt where their discussions are taking them in relation to that MoU, but it is between Ofcom and the foundation. Careful consideration would have to be given to governance, funding and the issues of charitable status that my noble friend raised, if co-designation were being considered.

Amendment 98A agreed.
Amendments 99 to 101 not moved.
Clause 36, as amended, agreed.
Clause 37 agreed.
Amendment 102 not moved.
Schedule 4: Codes of practice under section 36: principles, objectives, content
Amendments 103 to 108 not moved.
Schedule 4 agreed.
Clause 38: Procedure for issuing codes of practice
Amendment 109 not moved.
House resumed.
House adjourned at 9.52 pm.

Online Safety Bill

Committee (8th Day)
16:36
Relevant documents: 28th Report from the Delegated Powers Committee
Clause 38: Procedure for issuing codes of practice
Amendment 110
Moved by
110: Clause 38, page 38, line 24, leave out subsections (2) to (8) and insert—
“(2) Upon receiving the draft code of practice from OFCOM, the Secretary of State must—
(a) make a statement confirming they have received the draft code of practice, and
(b) lay the draft code of practice before Parliament.
(3) Unless the Secretary of State intends to give a direction to OFCOM under section 39(1) in relation to the draft, regulations giving effect to the code of practice may not be laid before Parliament unless the Secretary of State has—
(a) consulted each devolved authority on the content of the draft code of practice;
(b) produced an impact assessment including, but not limited to, an assessment of the impact of the proposed regulations on—
(i) human rights and equalities,
(ii) freedom of expression, and
(iii) employment and labour; and
(c) produced an assessment of the impact of the proposed regulations on children and vulnerable adults.
(4) The Secretary of State may not make regulations under this section until any select committee charged by the relevant House of Parliament with scrutinising regulations made under this section has—
(a) completed its consideration of the draft code of practice and the impact assessments referred to in subsection (3)(b) and (c), and
(b) reported on its deliberation to the relevant House;
and the report of the committee has been debated in that House, or the period of six weeks beginning on the day on which the committee reported has elapsed.
(5) The Secretary of State may not lay regulations under this section until they are satisfied that—
(a) issues raised by a devolved authority have been resolved, or
(b) if they have not been resolved, the Secretary of State has informed Parliament of the steps they intend to take in response to the issues raised.”
Member’s explanatory statement
This amendment, which replaces most of the current Clause 38, would require the Secretary of State to publish draft codes of conduct from OFCOM for consideration by relevant committees of both Houses of Parliament.
Lord Stevenson of Balmacara (Lab)

My Lords, I rise to move Amendment 110 in my name and thank the noble Lord, Lord Clement-Jones, for his support. This is a complex group of amendments but they are about very significant powers that are supposed to be granted to the Secretary of State in this Bill. We believe that this part of the Bill must be significantly amended before it leaves this House, and while we await the Government’s response to the amendments in my name and that of the noble Baroness, Lady Stowell, I want to make it clear that if we do not see some significant movement from the Government we will return to these issues on Report. As it looks as though we will be having another long hiatus before Report, there is plenty of time for discussion and agreement.

Two House of Lords committees—the Communications and Digital Committee and the Delegated Powers and Regulatory Reform Committee—have called on the Government to remove or amend a number of the clauses engaged by these amendments, and a third, the Constitution Committee, has noted the concerns raised. I think it fair to say that these issues concern all parties and all groups in the House and urgently need addressing. The noble Baroness, Lady Stowell, in her capacity as chair of the Communications and Digital Committee, has a number of amendments very similar to mine to which I and others have signed up, and which I know she will go through in detail. I support the line she and the committee are taking, although I make some additional suggestions in some areas.

The amendments from the noble Lord, Lord Moylan—who I am sad to see is not in his place and who will not therefore be able to participate in this debate—broadly support the thrust of the amendments in this group. Perhaps they do not go quite as far as ours do, but it is certainly nice to have him on our side—for a change. I do not want to delay the Committee as I know many of us will want to discuss the points which will be raised in detail by the noble Baroness, Lady Stowell, so I think the best thing is for me to talk more generally about where we think the Government need to change approach, and I hope my remarks will open up the debate.

Before I do that, I thank the Carnegie Trust—I know a number of noble Lords have received documentation from it—for its detailed work in this area in particular, but it has covered the Bill comprehensively. It has been invaluable and we have also received support from the All-Party Digital Regulation Group, which has been pushing information around as well.

We have mentioned in the past the difficulty of amending the Bill because of the structures and the different way it treats the various types of company likely to be in scope. But, in essence, my amendments would ensure that Ofcom is able to operate as an independent regulator, delivering what is required of it under the Bill, and is not subject to instruction or direction by the Secretary of State except in exceptional circumstances. We are told that these will be restricted mainly to national security issues or public safety, though precisely what those issues are going to be needs spelling out in the Bill.

The Secretary of State should not be able to give Ofcom direction. In the broadcasting regime, there are no equivalent powers. Our press is not regulated in that way. We believe that the right approach is that the Secretary of State should, if he or she wishes, write to Ofcom with non-binding observations when it is thought necessary to do so. It would be for Ofcom to have regard to such letters, but there should be no requirement to act, provided that it operates within its powers as set out in the Bill. It follows that the powers taken by the Secretary of State in Clause 156 to issue directions to Ofcom in special circumstances, in Clause 157 to issue detailed tactical guidance to Ofcom in the exercise of its functions, and in Clause 153, which allows the Secretary of State to make a statement of strategic priorities relating to online safety, are significant threats to the independence of Ofcom, and we believe that they should be deleted. In addition, Clauses 38 and 39 need to be revised.

The independence of media regulators is important and must be preserved as it is at present. That is the norm in most developed democracies. The UK has signed many international statements in this vein, including, as recently as April 2022 at the Council of Europe, a statement saying that

“media and communication governance should be independent and impartial to avoid undue influence on policy making, discriminatory treatment and preferential treatment of powerful groups, including those with significant political or economic power”.

I hope that when he comes to respond to the debate, the Minister will confirm that he stands by that international agreement that his Government have signed up to.

My second point deals with the other powers given to the Secretary of State in the Online Safety Bill—for example, to specify in regulations the primary priority content harmful to children and priority content harmful to children in Clause 54; to amend the duties on fraudulent advertising in Clause 191; to change the exemption to the regime in Clause 192; and to amend the list of terrorism offences, CSEA offences and other priority offences in Clause 194. Appropriate procedures for the exercise of these powers—ensuring that they are in line with the approach of this group of amendments —need to be set out in the Bill, because the present drafting is, in our view, inadequate. The reliance on conventional secondary legislation approval mechanisms will not be sufficient given the scale and impact of what is in contemplation.

At Second Reading, the Minister said,

“we remain committed to ensuring that Ofcom maintains its regulatory independence, which is vital to the success of this framework … We intend to bring forward two changes to the existing power: first, replacing the ‘public policy’ wording with a defined list of reasons that a direction can be made; and secondly, making it clear that this element of the power can only be used in exceptional circumstances … the framework ensures that Parliament will always have the final say on codes of practice, and that strong safeguards are in place to ensure that the use of this power is transparent and proportionate”.—[Official Report, 1/2/23; cols. 691-2.]

Those are fine words but, unfortunately, we have not yet seen the draft amendments that would give credence to that statement. Can the Minister give us any hint on the timetable?

My third point is that we are also not convinced that the processes currently specified for the approval of the high volume of secondary legislation pursuant to the Bill, including the codes of practice, engage sufficiently with Parliament. As my noble friend Lady Merron said at Second Reading, in our view the Bill suffers from an imbalance around what role Parliament should have in scrutinising the new regime and how changes to the statutory functions will be accommodated in future years. We can all agree that there will certainly be many more such occasions and more legislation in this area in future years.

This is, of course, a skeleton Bill, requiring significant amounts of secondary legislation before it begins to bite. How should Parliament be involved, both in the necessary scrutiny of those codes of practice, which put the regime into practice and define the way in which the regulated companies are to operate, and in anticipating changes that will be required as technology develops? It is to answer this question that I have put down a number of amendments aimed at carving out a role for the Select Committees of the two Houses—or perhaps a new Joint Committee, if that were to be the decision of Parliament. Indeed, that was a recommendation of the pre-legislative scrutiny committee and the Communications and Digital Committee in previous reports.

My Amendment 290, after Clause 197, tries to gather together the instances of powers exercisable by the Secretary of State and provide an additional parliamentary stage each time those powers are exercised. This would require that:

“The Secretary of State may not exercise the powers”

granted under the Bill unless and until

“any select committee charged by the relevant House of Parliament with scrutinising such regulations has … completed its consideration of the draft regulations and … reported on their deliberation to the relevant House”.

I appreciate that this is a major step. Introducing parliamentary scrutiny of this type may mean it takes more time to achieve results in what is already a complex process. Maybe this should be introduced in stages so as not to delay further the measures in the Bill.

16:45
The idea of engaging the Select Committees of Parliament is not unprecedented. It was introduced in a similar form as the Grimstone rule, using an agreed statement from the Dispatch Box setting out a commitment by the Government to a procedure for Select Committee consideration of trade agreements in both Houses under the international trade Bill. Similar issues have been raised recently in other Bills this Session. Do we like the sound of a parallel Parkinson rule? The noble Lord smiled—he must be pleased.
At heart, I recognise that this is in principle no more than ensuring that the expertise and knowledge of those who have served in an appropriate parliamentary Select Committee are grafted on to the normal affirmative or negative approval mechanisms for secondary legislation, but I also think it opens up a substantial new way of doing what has, on many occasions, been merely a rubber-stamping of what can be rather significant policy changes. It also gives a good opportunity to bring Parliament and parliamentarians into the policy delivery mechanism in what seems to me to be a satisfying way. It makes sense to do this for a complex new regime in a fast-changing technological environment such as the one that the Bill is ushering in, but it might have other applications, particularly consideration of other legislation that is currently in the pipeline. I beg to move.
Baroness Stowell of Beeston (Con)

My Lords, it is a great pleasure to follow the noble Lord, Lord Stevenson. I am grateful to him, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, for their support for my amendments, which I will come to in a moment. Before I do, I know that my noble friend Lord Moylan will be very disappointed not to be here for the start of this debate. From the conversation I had with him last week when we were deliberating the Bill, I know that he is detained on committee business away from the House. That is what is keeping him today; I hope he may join us a bit later.

Before I get into the detail of my amendments, I want to take a step back and look at the bigger picture. I remind noble Lords that on the first day in Committee, when we discussed the purpose of the Bill, one of the points I made was that, in my view, the Bill is about increasing big tech’s accountability to the public. For too long, and I am not saying anything that is new or novel here, it has enjoyed power beyond anything that other media organisations have enjoyed—including the broadcasters, which, as we know, have been subject to regulation for a long time now. I say that because, in my mind, the fundamental problem this legislation seeks to address is the lack of accountability of social media and tech platforms to citizens and users for the power and influence they have over our lives and society, as well as their economic impact. The latter will be addressed via the Digital Markets, Competition and Consumers Bill.

I emphasise “if that is the problem” because, when we talk about this part of the Bill and the amendments we have tabled, we have started—and I am as guilty of this as anyone else—to frame it very much as though the problem were the powers for the Secretary of State. In my view, we need to think about why those powers, as they are currently proposed, are not the right solution to the problem that I have outlined.

I do not think what we should be doing, as some of what is proposed in the Bill tends to do, is shift the democratic deficit from big tech to the regulator, although, of course, like all regulators, Ofcom must serve the public interest as a whole, which means taking everyone’s expectations seriously in the way in which it goes about its work.

That kind of analysis of the problem is probably behind some of what the Government are proposing by way of greater powers for the Secretary of State for oversight and direction of the regulator in what is, as we have heard, a novel regulatory space. I think that the problem with some, although not all, of the new powers proposed for the Secretary of State is that they would undermine the independence of Ofcom and therefore dilute the regulator’s authority over the social media and tech platforms, and that is in addition to what the noble Lord, Lord Stevenson, has already said, which is that there is a fundamental principle about the independence of media regulators in the western world that we also need to uphold and to which the Government have already subscribed.

If that is the bigger picture, my amendments would redress the balance between the regulator and the Executive, but there remains the vital role of Parliament, which I will come back to in a moment and which the noble Lord, Lord Stevenson, has already touched on, because that is where we need to beef up oversight of regulators.

Before I get into the detail, I should also add that my amendments have the full authority of your Lordships’ Communications and Digital Select Committee, which I have the great honour of chairing. In January, we took evidence from my noble friend the Minister and his colleague, Paul Scully, and our amendments are the result of their evidence. I have to say that my noble friend on the Front Bench is someone for whom I have huge respect and admiration, but on that day when the Ministers were before us, we found as a committee that the Government’s evidence in respect of the powers that they were proposing for the Secretary of State was not that convincing.

I shall outline the amendments, starting with Amendments 113, 114, and 115. I am grateful to other noble Lords who have signed them, which demonstrates support from around the House. The Bill allows the Secretary of State to direct Ofcom to change its codes of practice on regulating social media firms for reasons of public policy. While it is legitimate for the Government to set strategic direction, this goes further and amounts to direct and unnecessary interference. The Government have suggested clarifying this clause, as we have heard, with a list of issues such as security, foreign policy, economic policy and burden to business, but it is our view as a committee that the list of items is so vague and expansive that almost anything could be included in it. Nor does it recognise the fact that the Government should respect the separation of powers between Executive and regulator in the first place, as I have already described. These amendments would therefore remove the Secretary of State’s power to direct Ofcom for reasons of public policy. Instead, the Secretary of State may write to Ofcom with non-binding observations on issues of security and child safety to which it must have regard. It is worth noting that under Clause 156 the Secretary of State still has powers to direct Ofcom in special circumstances to address threats to public health, safety and security, so the Government will not be left toothless, although I note that the noble Lord, Lord Stevenson, is proposing to remove Clause 156. Just to be clear, the committee is not proposing removing Clause 156; that is a place where the noble Lord and I propose different remedies.

Amendments 117 and 118 are about limiting the risk of infinite ping-pong. As part of its implementation work, Ofcom will have to develop codes of practice, but the Government can reject those proposals infinitely if they disagree with them. At the moment that would all happen behind closed doors. In theory, this process could go on for ever, with no parliamentary oversight. The Select Committee and I struggle to understand why the Government see this power as necessary, so our amendments would remove the Secretary of State’s power to issue unlimited directions to Ofcom on a draft code of practice, replacing it with a maximum of two exchanges of letters.

Amendment 120, also supported by the noble Lords I referred to earlier, is closely related to previous amendments. It is designed to improve parliamentary oversight of Ofcom’s draft codes of practice. Given the novel nature of the proposals to regulate the online world, we need to ensure that the Government and Ofcom have the space and flexibility to develop and adapt their proposals accordingly, but there needs to be a role for Parliament in scrutinising that work and being able to hold the Executive and regulator to account where needed. The amendment would ensure that the affirmative procedure, and not the negative procedure currently proposed in the Bill, was used to approve Ofcom’s codes of practice if they had been subject to attempts by the Secretary of State to introduce changes. This amendment is also supported by the Delegated Powers and Regulatory Reform Committee in its report.

Finally, Amendment 257 would remove paragraph (a) from Clause 157(1). This is closely related to previous amendments regarding the Secretary of State’s powers. The clause currently gives the Secretary of State power to issue wide-ranging guidance to Ofcom about how it carries out its work. This is expansive and poorly defined, and the committee again struggled to see the necessity for it. The Secretary of State already has extensive powers to set strategic priorities for Ofcom, establish expert advisory committees, direct action in special circumstances, direct Ofcom about its codes or just write to it if my amendments are accepted, give guidance to Ofcom about its media literacy work, change definitions, and require Ofcom to review its codes and undertake a comprehensive review of the entire online safety regime. Including yet another power to give unlimited guidance to Ofcom about how it should carry out its work seems unnecessary and intrusive, so this amendment would remove it by removing paragraph (a) of Clause 157(1).

I hope noble Lords can see that, even after taking account of the amendments that the committee is proposing, the Secretary of State would be left with substantial and suitable powers to discharge their responsibilities properly.

Perhaps I may comment on some of the amendments to which I have not added my name. Amendment 110 from the noble Lords, Lord Stevenson and Lord Clement-Jones, and Amendment 290 from the noble Lord, Lord Stevenson, are about parliamentary oversight by Select Committees. I do not support the detail of these amendments nor the procedures proposed, because I believe they are potentially too cumbersome and could cause too much delay to various processes. As I have already said, and as the noble Lord, Lord Stevenson, said in opening, the Select Committee and I are concerned to ensure that there is adequate parliamentary oversight of Ofcom as it implements this legislation over the next few years. My committee clearly has a role in this, alongside the new DSIT Select Committee in the House of Commons and perhaps others, but we need to guard against duplication and fragmentation.

17:00
There are bigger questions about the parliamentary oversight of regulators in the wider digital field, as none of this is sector-specific and it also affects the CMA—we will see that magnified when we get to the digital markets Bill—so I do not think parliamentary oversight is something that we can just ignore. It is an issue of growing importance when it comes to regulators, particularly those that are regulating in areas that are new and different and require a different kind of approach by those regulators. As I said at the start, we need to get the distribution of power right between big tech, the regulators, the Government and Parliament if we are to achieve what I think is our ultimate aim and purpose: greater accountability to the public at large for the technology that has so much power at so many levels of our individual and national life.
In reply to my letter to the Secretary of State in January—written with my committee hat on and available on the committee’s website—the Minister here and Paul Scully, the Minister in the other place, indicated a willingness to discuss my amendments after Committee. I hope my noble friend and his colleagues will honour that commitment and the Government will accept my amendments. While there are other amendments in this group that would provide interesting solutions—as the noble Lord, Lord Stevenson, has said, some of them go a bit further than those that I am proposing—what the committee is proposing represents a measured and appropriate approach, and I hope the Government take it seriously. I look forward to discussing that further with the Minister.
I also hope that the Government support Parliament in enhancing its oversight of the regulators in which so much power is being vested. However expert, independent and professional they may be—I note that my noble friend Lord Grade is not in the Chamber today, as I believe he is overseas this week, but no one respects and admires my noble friend more than I do, and I am not concerned in any way about the expertise and professionalism of Ofcom—none the less we are in a situation where they are being vested with a huge amount of power and we need to make sure that the oversight of them is right. Even if I do not support that which is specifically put forward by the noble Lord, Lord Stevenson, this is an area where we need to move forward but we need the Government to support us in doing so if we are going to make it happen. I look forward to what my noble friend has to say in response to this group.
Viscount Colville of Culross (CB)

My Lords, I have put my name to Amendments 113, 114, 117, 118, 120 and 257. As the noble Baroness, Lady Stowell, has said, it is crucial that Ofcom both has and is seen to have complete independence from political interference when exercising its duty as a regulator.

On Ofcom’s website there is an article titled “Why Independence Matters in Regulating TV and Radio”—for the purposes of the Bill, I suggest that we add “Online”. It states:

“We investigate following our published procedures which contain clear, transparent and fair processes. It’s vital that our decisions are always reached independently and impartially”.

I am sure there are few Members of the Committee who would disagree with that statement. That sentiment is supported by a recent UNESCO conference to create global guidance for online safety regulation, whose concluding statement said that

“an independent authority is better placed to act impartially in the public interest and to avoid undue influence from political or industry interests”.

As the noble Baroness, Lady Stowell, has said, that is what successive Governments have striven to do with Ofcom’s regulation of broadcast and radio. Now the Government and Parliament must succeed in doing the same by setting up this Bill to ensure absolute independence for Ofcom in regulating the digital space.

The codes of practice drawn up by Ofcom will be central to the parameters that the media regulator sets for the tech companies, so it is essential that the regulator, when drawing them up, can act independently of political interference. In my view and that of many noble Lords, Clause 39 does not provide that level of independence from political interference. No impartial observer can think that the clause as drafted allows Ofcom the independence that it needs to shape the limits of the tech platforms’ content. In my view, this is a danger to freedom of expression in our country because it gives the Secretary of State permission to interfere continually and persistently in Ofcom’s work.

Amendments 114 and 115 would ensure a badly needed reinforcement of the regulator’s independence. I see why the Minister would want a Secretary of State to have the right to direct the regulator, but I ask him to bear in mind that it will not always be a Minister he supports who is doing the directing. In those circumstances, surely he would prefer a Secretary of State who could only offer observations on the draft codes of practice, to which Ofcom must have regard. Likewise, the endless ping-pong envisaged by Clause 39(7) and (9) allows huge political pressure and interference to be placed on the regulator. This would not be allowed in broadcast regulation, so why is it allowed for online regulation, which is already the dominant medium and can only become more dominant and more important?

Amendment 114 is crucial. Clause 39(1)(a), allowing the Minister’s direction to cover public policy, covers almost everything and is impossibly broad and vague. If the Government want an independent regulator, can the Minister explain how this power would facilitate that goal? I am unsure of how the Government will approach this issue, but I am told that they want to recognise the concerns about an overmighty Secretary of State by bringing forward their own amendment, limiting the powers of direction to specific policy areas. Can the Minister confirm that he is looking at using the same areas as in the Communications Act 2003, which are

“national security … relations with the government of a country … compliance with international obligations of the United Kingdom … the safety of the public or of public health”?

I worry about any government amendment which might go further and cover economic policy and burden to business. I understand that the Government would want to respond to the concerns that this Bill might create a burden on business and therefore could direct Ofcom to ease regulations in these areas. However, if this area is to be included, surely it will create a lobbyists’ charter. We all know how effective the big tech companies have been at lobbying the Government and slowing down the process of shaping this Bill. The Minister has only to talk to some of the Members who have helped to shape the Bill to know the determination and influence of those lobbying companies.

To allow the DCMS Secretary of State to direct Ofcom continuously to modify the codes of practice until they are no longer a burden to business would dramatically dilute the power and independence of the UK’s world-respected media regulator. Surely this is not what the people of Britain would want; the Minister should not want it either. The words “vague” and “broad” are used repeatedly by freedom of speech campaigners when looking at the powers of political interference in the Bill.

When the draft Bill came out, I was appalled by the extraordinary powers that it gave the Secretary of State to modify the content covered by “legal but harmful”, and I am grateful to the Government for responding to the Joint Committee and many other people’s concerns about this potentially authoritarian power. Clause 39 is not in the same league, but for all of us who want to ensure that Ministers do not have the power to interfere in the independence of Ofcom, I ask the Minister to accept the well-thought-through solutions represented by these amendments and supported by all Benches. I also support the request made by the noble Baroness, Lady Stowell, that Parliament should be involved in the oversight of Ofcom. I ask the Minister to respond to these widely supported amendments, either by accepting them or by tabling amendments of his own which guarantee the independence of the regulator.

Baroness Fox of Buckley (Non-Afl)

My Lords, I broadly support all these amendments in spirit since, as we have heard, they tackle the excessive levels of influence that any Secretary of State would be awarding themselves to shape the strategic priorities and codes of practice of Ofcom. I will speak to Amendments 254 and 260, tabled by the noble Lord, Lord Moylan, who I am glad to see in his place. He will see in Hansard that he was about to be much missed. I cannot do him justice, but I will carry on regardless because I support his amendments.

The main difference between Amendment 254 and other similar amendments is that it requires that any guidance issued to Ofcom—under Clause 157, for example—is

“approved by resolution of each House of Parliament”

rather than by committees. However, the spirit of it, which is to remove the Secretary of State’s power to give wide-ranging guidance or instructions about Ofcom’s functions, and the concerns that we have about that, is broadly in line with everything else we have heard.

It is important to ask whether it is appropriate for our right to freedom of expression to be curtailed by secondary legislation, which cannot be amended, which has little parliamentary oversight and on which challenges are very often reduced to nothing more than rhetorical whinges in this House. That would mean that the power exercised by the Secretary of State would bypass the full democratic process.

In fact, it also weakens our capacity to hold Ofcom to account. One thing that has become apparent throughout our Committee deliberations is that—as I think the noble Baroness, Lady Stowell, indicated—Ofcom will be an uber-regulator. It is itself very powerful, as I have raised, with respect to potentially policing and controlling what UK citizens see, read and have access to online, what they are allowed to search, private messaging and so on. In some ways, I want Ofcom to have much more scrutiny and be accountable to Parliament, elected politicians and the public realm. But that is not the same as saying that Ofcom should be accountable and answerable to the Secretary of State; that would be a whole different ball game. It could be said that the Secretary of State will be elected, but we know that that is a sleight of hand.

I want more accountability and scrutiny of Ofcom by individual users of online services; we even talked about that, the other day, in relation to complaints. I want more democratic scrutiny, but the Bill does the opposite of that by allowing the Government a huge amount of executive power to shape the proposed system of online speech moderation and even influence political discourse in the public square.

I want to move on to that issue. Under the Bill, the Secretary of State will have the power to set Ofcom’s strategic priorities, direct Ofcom to modify its code of practice through secondary legislation, set criteria for platform categorisation and designate priority illegal offences. They will be able to change codes of practice for “reasons of public policy”, which is as vague a phrase as you will ever get. I fear that, frankly, that level of discretion is likely to lead to a highly politicised and—my dread—censorship-heavy approach to regulation.

The Secretary of State could come under extreme pressure to respond to each individual concerning case of digital content—whatever happens to be in the news this week—with an ever-expanding list of areas to be dealt with. I dread that this will inevitably be exploited by highly political lobbyists and organisations, who will say, “You must act on this. This is hate speech. You’ve got to do something about this”. That is a completely arbitrary way to behave.

According to the Bill, Ofcom has no choice but to comply, and that obviously leads to the dangers of politicisation. I do not think it is scaremongering to say that this is politicisation and could compromise the independence of Ofcom. The Secretary of State’s power of direction could mean that the Government are given the ability to shape the permissibility of categories of online content, based on the political mood of the day, and the political whims of a specific Secretary of State to satisfy a short-term moral panic on a particular issue.

One question for the Minister, and the Government, is: should you ever create powers that you would not want to see your political opponents exercising? The Secretary of State today will not always be the Secretary of State tomorrow; they will not always be in the same image. Awarding such overwhelming powers, and the potential politicising of policing speech, might feel comfortable for the Government today; it might be less comfortable when you look at the way that some people view, for example, the tenets of Conservatism.

In recent weeks, since a “National Conservatism” conference was held up the road, I have heard members of opposition parties describe the contents of Conservatism as “Trumpist”, “far-right” and “fascist” hate speech. I am worried—on behalf of the Government—that some of those people might end up as a Secretary of State and it could all blow up in their face, as it were, metaphorically.

In all seriousness, because I am really not interested in the fate of either the Opposition or the Government in terms of their parties, I am trying to say that it is too arbitrary. In a situation where we have such weak commitments to freedom of conscience, thought or speech in this Bill, I really do not want to give the Secretary of State the power to broaden out the targets that might be victim to it.

Finally—and I apologise to the noble Lord, Lord Moylan, who would have been much more professional, specific and hard-hitting on his amendment—from what I have heard, I hope that all the tablers of the amendments from all parties might well have got together by Report and come up with satisfactory amendments that will deal with this. I think we all agree, for once, that something needs to be done to curtail power. I look forward to supporting that later in the process.

17:15
Baroness Kidron (CB)

My Lords, I rise very briefly to support the amendments in the name of the noble Baroness, Lady Stowell, and the noble Lord, Lord Stevenson. Like other speakers, I put on record my support for the regulator being offered independence and Parliament having a role.

However, I want to say one very brief and minor thing about timing—I feel somewhat embarrassed after the big vision of the noble Baroness, Lady Stowell. Having had quite a lot of experience of code-making over the last three years, I found that the amount of time the department was able to take in responding to the regulator became a point of power, a point of lobbying, as others have said, and a point of huge distraction. Those of us who have followed the Bill for five years—and as many Secretaries of State—should be concerned that none of the amendments has quite tackled the question of time.

The idea of acting within a timeframe is not without precedent; the National Security and Investment Act 2021 is just one recent example. What was interesting about that Act was that the reason given for the Secretary of State’s powers being necessary was national security—that is, they were okay and what we all agree should happen—but the reason for the time restriction was business stability. I put it to the Committee that the real prospect of children and other users being harmed requires the same consideration as business stability. Without a time limit, it is possible for inaction to be used to control the process or simply to fritter it away.

Lord Allan of Hallam (LD)

My Lords, I will make a short contribution on this substantive question of whether concerns about ministerial overreach are legitimate. Based on a decade of being on the receiving end of representations from Ministers, the short answer is yes. I want to expand on that with some examples.

My experience of working on the other side, inside a company, was that you often got what I call the cycle of outrage: something is shared on social media that upsets people; the media write a front-page story about it; government Ministers and other politicians get involved; that then feeds back into the media and the cycle spins up to a point where something must be done. The “something” is typically that the Minister summons people, such as me in my old job, and brings them into an office. That itself often becomes a major TV moment, where you are brought in, browbeaten and sent out again with your tail between your legs, and the Minister has instructed you to do something. That entire process takes place in the political rather than the regulatory domain.

I readily concede that, in many cases, something of substance needed to be addressed and there was a genuine problem. It is not that this was illegitimate, but these amendments are talking about the process for what we should do when that outrage is happening. I agree entirely with the tablers of the amendments that, to the extent that that process can be encapsulated within the regulator rather than a Minister acting on an ad hoc basis, it would be a significant improvement.

I also note that this is certainly not UK-specific, and it would happen in many countries with varying degrees of threat. I remember being summoned to the Ministry of the Interior in Italy to meet a gentleman who has now sadly passed. He brought me into his office, sat me down, pointed to his desk and said “You see that desk? That was Mussolini’s desk”. He was a nice guy and I left with a CD of his rhythm and blues band, but it was clear that I was not supposed to say no to him. He made a very clear and explicit political direction about content that was on the platform.

One big advantage of this Bill is that it has the potential to move beyond that world. It could move us on from individual people in companies—the noble Baroness, Lady Stowell of Beeston, made this point very powerfully—and change the accountability model, away from platforms either being accountable only to themselves or doing deals with others, including Ministers, that will have an impact, as the noble Baroness, Lady Fox, and the noble Viscount, Lord Colville, said, on the freedom of expression of people across the country. We do not want that.

We want to move on in the Bill and I think we have a model which could work. The regulator will take on the outrage and go as far as it can under the powers granted in the Bill. If the regulator believes that it has insufficient powers, it will come back to Parliament and ask for more. That is the way in which the system can and should work. I think I referred to this at Second Reading; we have an opportunity to create clear accountability. Parliament instructs Ofcom, which instructs the platforms. The platforms do what Ofcom says, or Ofcom can sanction them. If Ofcom feels that its powers are deficient, it comes back to Parliament. The noble Lord, Lord Stevenson, and others made the point about scrutiny and us continually testing whether Ofcom has the powers and is exercising them correctly. Again, that is entirely beneficial and the Government should certainly be minded to accept those amendments.

With the Secretary of State powers, as drafted in the Bill and without the amendments we are considering today, we are effectively taking two steps forward and one step back on transparency and accountability. We have to ask: why take that step back when we are able to rely on Ofcom to do the job without these directions?

The noble Baroness, Lady Stowell of Beeston, made the point very clearly that there are other ways of doing this. The Secretary of State can express their view. I am sure that the Minister will be arguing that the Secretary of State’s powers in the Bill are better than the status quo because at least what the Secretary of State says will be visible; it will not be a back-room deal. The noble Baroness, Lady Stowell of Beeston, has proposed a very good alternative, where the Secretary of State makes visible their intentions, but not in the form of an order—rather in the form of advice. The public—it is their speech we are talking about—then have the ability to see whether they agree with Ofcom, the companies or the Secretary of State if there is any dispute about what should happen.

It is certainly the case that visible instructions from the Secretary of State would be better, but the powers as they are still leave room for arm-twisting. I can imagine a future scenario in which employees of these platforms are summoned to see the Secretary of State. But now the Secretary of State would have a draft order sitting there. The draft order is Mussolini’s desk. They say to the people from the platforms, “Look, you can do what I say, or I am going to send an order to Ofcom”. That takes us back to this world in which the public are not seeing the kind of instructions being given.

I hope that the Government will accept that some amendment is needed here. All the ones that have been proposed suggest different ways of achieving the same objective. We are trying to protect future Secretaries of State from an unhealthy temptation to intervene in ways that they should not.

Baroness Harding of Winscombe (Con)

My Lords, on day eight of Committee, I feel that we have all found our role. Each of us has spoken in a similar vein on a number of amendments, so I will try to be brief. As the noble Lord, Lord Allan, has spoken from his experience, I will once again reference my experience as the chief executive, for seven years, of a business regulated by Ofcom; as the chair of a regulator; and as someone who sat on the court of, arguably, the most independent of independent regulators, the Bank of England, for eight years.

I speak in support of the amendments in the name of my noble friend Lady Stowell because, as a member of the Communications and Digital Committee, and from my experience both of being regulated and of being a regulator, I know that independent regulators might be independent in name—they might even be independent in statute—but they exist in the political soup. It is tempting to think of them as a sort of granite island, completely immovable in that soup, but they are more like a boat bobbing along in the turbulence of politics.

As the noble Lord, Lord Allan, has just described, they are influenced both overtly and subtly by the regulated companies themselves—I am sure we have both played that game—by politicians on all sides, and by the Government. We have played these roles a number of times in the last eight days; however, this is one of the most important groups of amendments, if we are to send the Bill back in a shape that will really make the difference that we want it to. This group of amendments challenges whether we have the right assignment of responsibility between Parliament, the regulator, government, the regulated and citizens.

It is interesting that we—every speaker so far—are all united that the Bill, as it currently stands, does not get that right. To explain why I think that, I will dwell on Amendment 114 in the name of my noble friend Lady Stowell. The amendment would remove the Secretary of State’s ability to direct Ofcom to modify a draft of the code of practice “for reasons of public policy”. It leaves open the ability to direct in the cases of terrorism, child sexual abuse, national security or public safety, but it stops the Secretary of State directing with regard to public policy. The reason I think that is so important is that, while tech companies are not wicked and evil, they have singularly failed to put internet safety, particularly child internet safety, high enough up their pecking order compared with delivering for their customers and shareholders. I do not see how a Secretary of State will be any better at that.

Arguably, the pressures on a Secretary of State are much greater than the pressures on the chief executives of tech companies. Secretaries of State will feel those pressures from the tech companies and their constituents lobbying them, and they will want to intervene and feel that they should. They will then push that bobbing boat of the independent regulator towards whichever shore they feel they need to in the moment—but that is not the way you protect people. That is not the way that we treat health and safety in the physical world. We do not say, “Well, maybe economics is more important than building a building that’s not going to fall down if we have a hurricane”. We say that we need to build safe buildings. Some 200 years ago, we were having the same debates about the physical world in this place; we were debating whether you needed to protect children working in factories, and what the consequences would be for the economy. How awful that sounds today. Yet that is the reality of what we are saying in the Bill now: that we are giving the Secretary of State the power to claim that the economic priority is greater than protecting children online.

I am starting to sound very emotional because at the heart of this is the suggestion that we are not taking the harms seriously enough. If we really think that we should be giving the Secretary of State the freedom to direct the regulator in such a broad way, we are diminishing the seriousness of the Bill. That is why I wholeheartedly welcome the remark from the noble Lord, Lord Stevenson, that he intends to bring this back with the full force of all of us across all sides of the Committee, if we do not hear some encouraging words from my noble friend the Minister.

The Lord Bishop of Oxford

My Lords, it is a pleasure to follow the noble Baroness, Lady Harding, whose very powerful speech took us to the heart of the principles behind these amendments. I will add my voice, very briefly, to support the amendments for all the key reasons given. The regulator needs to be independent of the Secretary of State and seen to be so. That is the understandable view of the regulator itself, Ofcom; it was the view of the scrutiny committee; and it appears to be the view of all sides and all speakers in this debate. I am also very supportive of the various points made in favour of the principle of proper parliamentary scrutiny of the regulator going forward.

One of the key hopes for the Bill, which I think we all share, is that it will help set the tone for the future global conversation about the regulation of social media and other channels. The Government’s own impact assessment on the Bill details parallel laws under consideration in the EU, France, Australia, Germany and Ireland, and the noble Viscount, Lord Colville, referred to standards set by UNESCO. The standards set in the OSB at this point will therefore be a benchmark across the world. I urge the Government to set that benchmark at the highest possible level for the independence and parliamentary oversight of the regulator.

17:30
Lord Bethell (Con)

My Lords, I speak to support my noble friend Lady Stowell and the noble Lord, Lord Stevenson. I would like to share two insights: one a piece of experience from my role as a junior Minister, and one on how that experience bears on the Bill.

As a junior Health Minister responsible for innovation and life sciences, it was my responsibility to look after 22 arm’s-length bodies, including the MHRA—an incredibly powerful regulator, possibly as powerful and important as Ofcom is, and certainly will be under this new Bill. As the junior Minister, you are under huge pressure from civil society, from the pharma industry and from noble Lords—some of whom I see in the Chamber today—who all have extremely strong opinions about the regulation of medicines. They also have, at times, very important insights about patients and what might be able to be done if certain innovative medicines could be accelerated. The great thing about being the Life Sciences Minister is that there is nothing you can do about it whatever. Your hands are tied. The MHRA obeys science and the regulation of science and not, I am pleased to say, Ministers, because Ministers are not good people to judge the efficacy and safety of medicines.

My advice to the Minister is to embrace the Bethell principle: that it is a huge relief not to be able to interfere in the day-to-day operations of your regulator. I remember speaking at a G7 meeting of Health Ministers to one of my compadres, who expressed huge envy for the British system because he had demonstrators and political donors on his back night and day, trying to get him to fix the regulations one way or the other. That is my point about the day-to-day management and implementation of policy.

When it comes to the objectives of the regulator, the Bill perhaps leaves scope for some improvement. I thought my noble friend put it extremely well: it is where Parliament needs to have a voice. We have seen that on the subject of age verification for porn—a subject I feel very strongly about—where, at the moment, Parliament is leaving it to the regulator to consult industry, users of the internet and wider civic society to determine what the thresholds for age verification should be. That is a mistake; it is not the right way round to do things. It is where Parliament should have a voice, because these are mandatory population-wide impositions. We are imposing them on the population, and that is best done by Parliament, not the regulator. It needs the heft of Parliament when it comes to imposing and enforcing those regulations. If you do not have that parliamentary heft, the regulator may be on a granite island but it would be a very lonely island without the support it needs when taking on extremely powerful vested interests.

Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow the noble Lord, Lord Bethell, who is clearly passionate about this aspect. As the noble Baroness, Lady Harding, said, this is one of the most important groups of amendments that we have to debate on the Bill, even though we are on day eight of Committee. As she said, it is about the right assignment of responsibilities, so it is fundamental to the way that the Bill will operate.

My noble friend Lord Allan brilliantly summed up many of the arguments, and he has graphically described the problem of ministerial overreach, as did the noble Baroness, Lady Harding. We on these Benches strongly support the amendments put forward by the noble Lord, Lord Stevenson, and those put forward by the noble Baroness, Lady Stowell. Obviously, there is some difference of emphasis. They each follow the trail of the different committees of which their proposers were members, which is entirely understandable. I recall that the noble Lord, Lord Gilbert, was the hinge between the two committees—and he did that brilliantly. I very much hope that, when we come back at the next stage, if the Minister has not moved very far, we will find a way to combine those two strands. I think they are extremely close—many noble Lords have set out where we are on accountability and oversight.

We are not trying to escape the frying pan of an overbearing Secretary of State only to land where there is no parliamentary oversight at all. Both the noble Baroness, Lady Stowell, and the noble Lord, Lord Stevenson, are clearly in favour of greater oversight of Ofcom. The question is whether it is oversight of the codes and regulation or of Ofcom itself. I think we can find a way to combine those two strands. In that respect, I entirely agree with the noble Baroness, Lady Fox: it is all about making sure that we have the right kind of oversight.

I add my thanks to Carnegie UK. The noble Lord, Lord Stevenson, and the noble Baroness, Lady Stowell, set out the arguments, and we have the benefit of the noble Baroness’s letter to the Secretary of State of 30 January, which she mentioned in her speech. They have set out very clearly where speakers in this debate unanimously want to go.

The Government have suggested some compromise on Clause 39. As the noble Lord, Lord Stevenson, said, we have not seen any wording for that, but I think it is highly unlikely that that, by itself, will satisfy the House when we come to Report.

There are many amendments here which deal with the Secretary of State’s powers, but I believe that the key ones are the product of both committees and concern the proposal for a Joint Committee. If noble Lords read the Government’s response to our Joint Committee on the draft Bill, they will see that the arguments given by the Government are extremely weak. I think it was the noble Baroness, Lady Stowell, who used the phrase “democratic deficit”. That is exactly what we are seeking to avoid: we are trying to open this out and make sure we have better oversight and accountability. That is the goal of the amendments today. We have heard from the noble Viscount, Lord Colville, about the power of lobbying by companies. Equally, we have heard about how the Secretary of State can be overbearing. That is the risk we are trying to avoid. I very much hope that the Minister sees his way to taking on board at least some of whichever set of amendments he prefers.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the amendments concern the independence of Ofcom and the role of parliamentary scrutiny. They are therefore indeed an important group, as those things will be vital to the success of the regime that the Bill sets up. Introducing a new, ground-breaking regime means balancing the need for regulatory independence with a transparent system of checks and balances. The Bill therefore gives powers to the Secretary of State comprising a power to direct Ofcom to modify a code of practice, a power to issue a statement of strategic priorities and a power to issue non-binding guidance to the regulator.

These powers are important but not novel; they have precedent in the Communications Act 2003, which allows the Secretary of State to direct Ofcom in respect of its network and spectrum functions, and the Housing and Regeneration Act 2008, which allows the Secretary of State to make directions to the Regulator of Social Housing to amend its standards. At the same time, I agree that it is important that we have proportionate safeguards in place for the use of these powers, and I am very happy to continue to have discussions with noble Lords to make sure that we do.

Amendment 110, from the noble Lord, Lord Stevenson, seeks to introduce a lengthier process regarding parliamentary approval of codes of practice, requiring a number of additional steps before they are laid in Parliament. It proposes that each code may not come into force unless accompanied by an impact assessment covering a range of factors. Let me reassure noble Lords that Ofcom is already required to consider these factors; it is bound by the public sector equality duty under the Equality Act 2010 and the Human Rights Act 1998 and must ensure that the regime and the codes of practice are compliant with rights under the European Convention on Human Rights. It must also consult experts on matters of equality and human rights when producing its codes.

Amendment 110 also proposes that any designated Select Committee in either House has to report on each code and impact assessment before they can be made. Under the existing process, all codes must already undergo scrutiny by both Houses before coming into effect. The amendment would also introduce a new role for the devolved Administrations. Let me reassure noble Lords that the Government are working closely with them already and will continue to do so over the coming months. As set out in Schedule 5 to the Scotland Act 1998, however, telecommunications and thereby internet law and regulation is a reserved policy area, so input from the devolved Administrations may be more appropriately sought through other means.

Amendments 111, 113, 114, 115, and 117 to 120 seek to restrict or remove the ability of the Secretary of State to issue directions to Ofcom to modify draft codes of practice. Ofcom has great expertise as a regulator, as noble Lords noted in this debate, but there may be situations where a topic outside its remit needs to be reflected in a code of practice. In those situations, it is right for the Government to be able to direct Ofcom to modify a draft code. This could, for example, be to ensure that a code reflects advice from the security services, to which Ofcom does not have access. Indeed, it is particularly important that the Secretary of State be able to direct Ofcom on matters of national security and public safety, where the Government will have access to information which Ofcom will not.

I have, however, heard the concerns raised by many in your Lordships’ House, both today and on previous occasions, that these powers could allow for too much executive control. I can assure your Lordships that His Majesty’s Government are committed to protecting the regulatory independence of Ofcom, which is vital to the success of the framework. With this in mind, we have built a number of safeguards into the use of the powers, to ensure that they do not impinge on regulatory independence and are used only in limited circumstances and for the appropriate reasons.

I have heard the strong feelings expressed that this power must not unduly restrict regulatory independence, and indeed share that feeling. In July, as noble Lords noted, the Government announced our intention to make substantive changes to the power; these changes will make it clear that the power is for use only in exceptional circumstances and will replace the “public policy” wording in Clause 39 with a defined list of reasons for which a direction can be made. I am happy to reiterate that commitment today, and to say that we will be making these changes on Report when, as the noble Lord, Lord Clement-Jones, rightly said, noble Lords will be able to see the wording and interrogate it properly.

Additionally, in light of the debate we have just had today—

Baroness Harding of Winscombe (Con)

Can my noble friend the Minister clarify what he has just said? When he appeared in front of the Communications and Digital Committee, I think he might have been road-testing some of that language. In the specific words used, he would still have allowed the Secretary of State to direct Ofcom for economic reasons. Is that likely to remain the case? If it is, I feel it will not actually meet what I have heard is the will of the Committee.

17:45
Lord Parkinson of Whitley Bay (Con)

When we publish the wording, we will rightly have an opportunity to discuss it before the debate on Report. I will be happy to discuss it with noble Lords then. On the broader points about economic policy, that is a competency of His Majesty’s Government, not an area of focus for Ofcom. If the Government had access to additional information that led them to believe that a code of practice as drafted could have a significant, disproportionate and adverse effect on the livelihoods of the British people or on the broader economy, and if it met the test for exceptional circumstances, taking action via a direction from the Secretary of State could be warranted. I will happily discuss that when my noble friend and others see the wording of the changes we will bring on Report. I am sure we will scrutinise that properly, as we should.

I was about to say that, in addition to the commitment we have already made, in the light of the debate today we will also consider whether transparency about the use of this power could be increased further, while retaining the important need for government oversight of issues that are genuinely beyond Ofcom’s remit. I am conscious that, as my noble friend Lady Stowell politely said, I did not convince her or your Lordships’ committee when I appeared before it with my honourable friend Paul Scully. I am happy to continue our discussions and I hope that we may reach some understanding on this important area.

Baroness Stowell of Beeston (Con)

I am sorry to interrupt, but may I clarify what my noble friend just said? I think he said that, although he is open to increasing the transparency of the procedure, he does not concede a change from a direction to a letter of guidance which Ofcom should take account of. Is he willing to consider that as well?

Lord Parkinson of Whitley Bay (Con)

I am happy to continue to discuss it, and I will say a bit more about the other amendments in this group, but I am not able to say much more at this point. I will happily follow this up in discussion with my noble friend, as I know it is an issue of interest to her and other members of your Lordships’ committee.

The noble Lord, Lord Stevenson, asked about our international obligations. As noble Lords noted, the Government have recognised the importance of regulatory independence in our work with international partners, such as the Council of Europe’s declaration on the independence of regulators. That is why we are bringing forward the amendments previously announced in another place. Ensuring that powers of direction can be issued only in exceptional circumstances and for a set of reasons defined in the Bill will ensure that the operational independence of Ofcom is not put at risk. That said, we must strike a balance between parliamentary oversight and being able to act quickly where necessary.

Regarding the amendment tabled by my noble friend Lady Stowell, which calls for all codes which have been altered by a direction to go through the affirmative procedure: as drafted, the negative procedure is used only if a direction is made to a code of practice relating to terrorism or child sexual exploitation or abuse, for reasons of national security or public safety. It is important that the parliamentary process be proportionate, particularly in cases involving national security or public safety, where a code might need to be amended quickly to protect people from harm. We therefore think that, in these cases, the negative procedure is more appropriate.

On timing, the Government are committed to ensuring that the framework is implemented quickly, and this includes ensuring that the codes of practice are in force. The threshold of exceptional circumstances for the power to direct can lead to a delay only in situations where there would otherwise be significant consequences for national security or public safety, or for the other reasons outlined today.

My noble friend Lord Moylan was not able to be here for the beginning of the debate on this group, but he is here now. Let me say a little about his Amendment 254. Under Clause 153, the Secretary of State can set out a statement of the Government’s strategic priorities in relation to matters of online safety. This power is necessary, as future technological changes are likely to shape online harms, and the Government must be able to state their strategic priorities in relation to them. My noble friend’s amendment would go beyond the existing precedent for the statement of strategic priorities in relation to telecommunications, management of the radio spectrum, and postal services outlined in the Communications Act. The Secretary of State must consult Ofcom and other appropriate persons when preparing this statement. This provides the opportunity for widespread scrutiny of a draft statement before it can be designated through a negative parliamentary procedure. We consider that the negative procedure is appropriate, in line with comparable existing arrangements.

Amendment 257 from the noble Lord, Lord Stevenson, seeks to remove the Secretary of State’s power to issue guidance to Ofcom about the exercise of its online safety functions. Issuing guidance of this kind, with appropriate safeguards, including consultation and limitations on its frequency, is an important part of future-proofing the regime. New information—for example, resulting from parliamentary scrutiny or technological developments—may require the Government to clarify the intent of the legislation.

Amendments 258 to 260 would require the guidance to be subject to the affirmative procedure in Parliament. Currently, Ofcom must be consulted, and any guidance must be laid before Parliament. The Bill does not subject the guidance to a parliamentary procedure because the guidance does not create any statutory requirements, and Ofcom is required only to have had regard to it. We think that remains the right approach.

The noble Lord, Lord Stevenson, has made clear his intention to question Clause 156, which grants the Secretary of State the power to direct Ofcom’s media literacy activity only in special circumstances. This ensures that the regulatory framework is equipped to respond to significant future threats—for example, to the health or safety of the public, or to national security. I have already set out, in relation to other amendments, why we think it is right that the Secretary of State can direct Ofcom in these circumstances.

The delegated powers in the Bill are crucial to ensuring that the regulatory regime keeps pace with changes in this area. Amendment 290 from the noble Lord, Lord Stevenson, would go beyond the existing legislative process for these powers, by potentially providing for additional committees to be, in effect, inserted into the secondary legislative process. Established committees themselves are able to decide whether to scrutinise parts of a regime in more detail, so I do not think they need a Parkinson rule to do that.

Noble Lords have expressed a common desire to see this legislation implemented as swiftly as possible, so I hope they share our wariness of any amendments which could slow that process down. The process as envisaged in this amendment is an open-ended one, which could delay implementation. Of course, however, it is important that Parliament is able to scrutinise the work of the regulator. Like most other regulators, Ofcom is accountable to Parliament on how it exercises its functions. The Secretary of State is required to present its annual report and accounts before both Houses. Ministers from Scotland, Wales and Northern Ireland must also lay a copy of the report before their respective Parliament or Assembly. Moreover, the officers of Ofcom can be required to appear before Select Committees to answer questions about its operations on an annual basis. Parliament will also have a role in approving a number of aspects of the regulatory framework through its scrutiny of both the primary and secondary legislation. This will include the priority categories for harms and Ofcom’s codes of practice.

More broadly, we want to ensure that this ground-breaking legislation has the impact we intend. Ongoing parliamentary scrutiny of it will be crucial to help to ensure that. There is so much expertise in both Houses, and it has already helped to improve this legislation, through the Joint Committee on the draft Bill, the DCMS Select Committee in another place and, of course, your Lordships’ Communications and Digital Committee.

As my noble friend Lady Stowell said, we must guard against fragmentation and duplication, which we are very mindful of. Although we do not intend to legislate for a new committee—as I set out on previous occasions, including at Second Reading and before the Communications and Digital Committee—we remain happy to discuss possible mechanisms for oversight to ensure that we make best use of the expertise in both Houses of Parliament so that the Bill delivers what we want. With that, I hope that Members of the Committee will be happy to continue the discussions in this area and not press their amendments.

Lord Stevenson of Balmacara (Lab)

I am grateful to the noble Lord for his comprehensive response and for the welcome change in tone and the openness to further debate and discussions. I thank all those who spoke in the debate. The noble Baroness, Lady Harding, was right: we are getting into a routine where we know roughly where our places are and, if we have contributions to make, we make them in the right order and make them comprehensive. We did our bit quite well, but I am afraid that the Minister’s response made me a bit confused. As I said, I welcome the change of tone, the sense of engagement with some of the issues and the ability to meet to discuss ways forward in some of those areas. But he then systematically and rather depressingly shut off just about everything that I thought we were going to discuss. I may be overstating that, so I will read Hansard carefully to make sure that there are still chinks of light in his hitherto impenetrable armour. I really must stop using these metaphors— I thought that the noble Baroness, Lady Harding, had managed to get me off the hook with her question about whether we were an island of concrete rock, and about whether the boat was going to end up in the stormy sea that we were creating. I decided that I could not follow that, so I will not.

We ought to take forward and address three things, which I will briefly go through in the response. One that we did not nail down was the good point made by the noble Baroness, Lady Kidron, that we had focused on regulatory structures in the form of set bodies relating—or not relating—to parliamentary procedures and to Ministers and their operations. She pointed out that, actually, the whole system has a possible drag effect that we also need to think about. I note that good point because we probably need a bit of time to think about how that would work in the structures that come forward.

The noble Lord, Lord Allan, said that we are trying to look at the changing of the accountability model. I disagree with the word “changing” because we are not trying to change anything; we have a model that works, but the new factor that we are trying to accommodate is the intensity of interaction and, as we said, the amplification that comes from the internet. I worry that this was not being picked up enough in the Minister’s response, but we will pick it up later and see if we can get through it.

The three points I wanted to make sure of were as follows. Following the line taken by the noble Baroness, Lady Stowell, one point is on trying to find a proper balance between the independence of the regulator; the Secretary of State’s right, as an elected leader of this aspect of the Government, to make recommendations and proposals to that regulator on how the system can be better; and Parliament’s ability to find a place in that structure, which is still eluding us a little, so we will need to spend more time on it. There is enough there to be reassured that we will find a way of balancing the independence of the regulator and the role of the Secretary of State. It does not need as many mentions in the legislation as it currently has. There is clearly a need for the Secretary of State to be able to issue direction in cases of national security et cetera—but it is the “et cetera” that I worry about: what are these instances? Until they are nailed down and in the Bill, there has to be a question about that.

18:00
How Parliament relates to the process is not yet defined. There is a willingness to debate about that. We do not have to go down any of the models that have been discussed today, but there are interesting ways of trying to do that. I think there is a general sense around the Committee that we want Parliament to have a bigger role complementary to what is in the Bill but different from it in a way that will be helpful and supportive of what we are trying to do.
The noble Lord, Lord Bethell, spoke very well about his experiences as a Minister with a strong regulator. We should listen to him—as we must do on many other issues, of course. But on this one he made the point that there is a strength to be gained by isolating the way in which Parliament, Ministers and the regulator operate. I hope very much we will get there. Are we a rock or are we a boat? I am not sure, but we want to have a good journey and I would like to look forward to having that in future. I beg leave to withdraw the amendment.
Amendment 110 withdrawn.
Amendments 111 and 112 not moved.
Clause 38 agreed.
Clause 39: Secretary of State’s powers of direction
Amendments 113 to 119 not moved.
Clause 39 agreed.
Clause 40: Procedure for issuing codes of practice following direction under section 39
Amendment 120 not moved.
Clause 40 agreed.
Clauses 41 and 42 agreed.
Clause 43: Minor amendments of codes of practice
Amendment 121 not moved.
Clause 43 agreed.
Clause 44: Relationship between duties and codes of practice
Amendments 122 to 122ZB not moved.
Clause 44 agreed.
Clause 45 agreed.
Clause 46: Duties and the first codes of practice
Amendment 122ZC not moved.
Clause 46 agreed.
Clause 47: OFCOM’s guidance about certain duties in Part 3
Amendment 122A
Moved by
122A: Clause 47, page 46, line 10, after “29” insert “, except the duty set out in subsection (8A) of those sections”
Member’s explanatory statement
This amendment ensures that OFCOM need not produce guidance about the new duties in clauses 19 and 29 to supply records of risk assessments to OFCOM.
Amendment 122A agreed.
Clause 47, as amended, agreed.
Clause 48: OFCOM’s guidance: content that is harmful to children and user empowerment
Amendment 123 not moved.
Amendment 123A
Moved by
123A: Clause 48, page 46, line 22, at end insert—
“(c) pornographic material that must be put behind age assurance as set out in section (Ofcom’s guidance about age assurance).”
Member’s explanatory statement
This amendment ensures parity of age assurance between pornographic material on part 3 and part 5 services.
Baroness Kidron (CB)

My Lords, it is a privilege to introduce Amendments 123A, 142, 161 and 184 in my name and those of the noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford. These amendments represent the very best of your Lordships’ House and, indeed, the very best of Parliament and the third sector because they represent an extraordinary effort to reach consensus between colleagues across the House including both opposition parties, many of the Government’s own Benches, a 40-plus group of Back-Bench Conservatives and the Opposition Front Bench in the other place. Importantly, they also enjoy the support of the commercial age check sector and a vast array of children’s charities and, in that regard, I must mention the work of Barnardo’s, CEASE and 5Rights, which have really led the charge.

I will spend the bulk of my time setting out in detail the amendments themselves, and I will leave my co-signatories and others to make the arguments for them. Before I do, I once again acknowledge the work of the noble Baroness, Lady Benjamin, who has been fighting this fight for many years, and the noble Baroness, Lady Harding, whose characteristic pragmatism was midwife to the drafting process. I also acknowledge the time spent talking about this issue with the Secretary of State, the noble Lord the Minister and officials at DSIT. I thank them for their time and their level of engagement.

Let me first say a few words about age assurance and age verification. Age assurance is the collective term for all forms and levels of age verification, which means an exact age, and age estimation, which is an approximate or probable age. Age assurance is not a technology; it is any system that seeks to achieve a level of certainty about the age or age range of a person. Some services with restricted products and services have no choice but to have the very highest level of assurance or certainty—others less so.

To be clear at the outset, checking someone’s age, whether by verification or estimation, is not the same as establishing identity. While it is absolutely the case that you can establish age as a subset of establishing someone’s identity, the reverse is not necessarily true. Checking someone’s age does not need to establish their identity.

Age assurance strategies are multifaceted. As the ICO’s guidance in the age-appropriate design code explains, online services can deploy a range of methods to achieve the necessary level of certainty about age or age range. For example, self-verification, parental authentication, AI estimation and/or the use of passports and other hard identifiers may all play a role in a single age assurance strategy, or any one of them may be a mechanism in itself in other circumstances. This means that the service must consider its product and make sure that the level of age assurance meets the level of risk.

Since we first started debating these issues in the context of the Digital Economy Act 2017, the technology has been transformed. Today, age assurance might just as effectively be achieved by assessing the fluidity of movement of a child dancing in a virtual reality game as by collecting their passport. The former is over 94% accurate within five seconds and is specific to that particular child, while a passport may be absolute but less reliable in associating the check with a particular child. So, in the specific context of that dancing child, it is likely that the former gives the greater assurance. When a service’s risk profile requires absolute or near absolute certainty—for example, any of the risks that are considered primary priority harms, including, but not limited to, pornography—having the highest possible level of assurance must be a precondition of access.

Age assurance can also be used to ensure that children who are old enough to use a service have an age-appropriate experience. This might mean disabling high-risk features such as hosting, livestreaming or private messaging for younger children, or targeting child users or certain age groups with additional safety, privacy and well-being interventions and information. These amendments, which I will get to shortly, are designed to ensure both. To achieve the levels of certainty and privacy which are widely and rightly demanded, the Bill must both reflect the current state of play and anticipate nascent and emerging technology that will soon be considered standard.

That was a long explanation, for which I apologise, but I hope it makes it clear that there is no single approach but, rather, a need to dictate clearly a high bar of certainty for high-risk services. A mixed economy of approaches, all geared towards providing good outcomes for children, is what we should be promoting. Today we have the technology, the political will and the legislative mechanism to make good on our adult responsibilities to protect children online. While age assurance is eminently achievable, those responsible for implementing it and, even more importantly, those subject to it need clarity on standards; that is to say, rules of the road. In an era when data is a global currency, services have shown themselves unable to resist the temptation to repurpose information gleaned about the age of their users, or to facilitate children’s access to industrial amounts of harmful material for commercial gain. As with so many of tech’s practices, this has eroded trust and heightened the need for absolute clarity on how services build their age-assurance systems and what they do—and do not do—with the information they gather, and the efficacy and security of the judgments they make.

Amendment 123A simply underlines the point made frequently in Committee by the noble Baroness, Lady Ritchie of Downpatrick, that the Bill should make it clear that pornography should not be judged by where it is found but by the nature of the material itself. It would allow Ofcom to provide guidance on pornographic material that should be behind an age gate, either in Part 3 or Part 5.

Amendment 142 seeks to insert a new clause setting out matters that Ofcom must reflect in its guidance for effective age assurance; these are the rules of the road. Age assurance must be secure and maintain the highest levels of privacy; this is paramount. I do not believe I need to give examples of the numerous data leaks but I note the excessive data harvesting undertaken by some of the major platforms. Age assurance must not be an excuse to collect users’ personal and sensitive information unnecessarily, and it should not be sold, stored or used for other purposes, such as advertising, or offered to third parties.

Age assurance must be proportionate to the risk, as per the results of the child risk assessment, and let me say clearly that proportionality is not a route to allow a little bit of porn or a medium amount of self-harm, or indeed a lot of both, to a small number of children. In the proposed new clause, proportionality means that if a service is high-risk, it must have the highest levels of age assurance. Equally, if a service is low-risk or no-risk, it may be that no age assurance is necessary, or it should be unobtrusive in order to be proportionate. Age-assurance systems must provide mechanisms to challenge or change decisions, to ensure that everyone can have confidence in their use and that they do not keep individuals—adults or children—out of spaces they have the right to be in. Age assurance must be inclusive and accessible so that children with specific accessibility needs are considered at the point of its design, and it must provide meaningful information so that users can understand the mode of operation. I note that the point about accessibility is of specific concern to the 5Rights young advisers. Systems must be effective. It sounds foolish to say so, but look at where we are now, when laws in the US, Europe, the UK and beyond stipulate age restrictions and they are ignored to the tune of tens of millions of children.

Age assurance is not to rely solely on the user to provide information; a tick box confirming “I am 18” is not sufficient for any service that carries a modicum of risk. It must be compatible with the following laws: the Data Protection Act, the Human Rights Act, the Equality Act and the UNCRC. It must have regard to the risks and opportunities of interoperable age assurance, which, in the future, will see these systems seamlessly integrated into our services, just as opening your phone with your face, or using two-factor authentication when transferring funds, are already normalised. It must consult the Information Commissioner and other persons with relevant technological expertise and an understanding of child development.

On that point, I am in full support of the proposal from the noble Lord, Lord Allan, to require Ofcom to produce regular reports on age-assurance technology, and I see his amendment as a necessary companion piece to these amendments. Importantly, the amendment stipulates that the guidance should come forward within six months and that all systems of age assurance, whether estimated or verified, whether operated in-house or by third-party providers, and all technologies must adhere to the same principles. It allows Ofcom to point to technical standards in its guidance, which I know the ISO and the IEEE are currently drafting with this very set of principles in mind.

18:15
Amendment 161, which I promise I will get through a little more quickly, simply sets out the need for any regulated service to have an appropriate level of confidence in the age or age range of a child relative to risk. It makes clear under what circumstances an age-assurance strategy is required and that any methodology is permitted provided it is adequate to the risk inherent in the service and meets Ofcom’s guidance, which I have already spoken to. Paragraph 3 of the proposed new schedule specifies that the highest standard of age assurance is required for pornographic services covered by Part 5; that is, they must confirm beyond reasonable doubt that the user is not a child. It also makes provision for auditing systems that age-check children. Paragraph 4 deals expressly with pornography accessed via Part 3 services and, crucially, it requires the same high bar of age assurance to access pornography.
I pause for a moment to underline the fact that the impact on children from pornography, which I know other noble Lords will talk to, is not lessened by the route by which they access it. Arguably, pornography that a child sees in the context of a Part 3 service of news, chatter and shopping is normalised by that context and is, therefore, worse. So while we are clear that a Part 3 service must put material that meets the definition of porn in Clause 70(2) behind an age gate, we are not, as some would suggest, age-gating the internet.
Paragraph 5 of the proposed new schedule makes it clear that a company must consider all parts of the service separately; for example, those that are high-risk may require a higher level of age assurance than those that are not. Paragraph 6 makes it clear that existing users, as well as new users, should be given the benefit of the schedule. Paragraph 7 refers to the definition, and Paragraph 8 is a commencement clause that requires this to come into effect within 12 months of the Act receiving Royal Assent, a subject to which I will return in a moment.
Between them, Amendments 142 and 161, together and separately, give the services, Ofcom and children the very best chance of introducing effective, privacy-preserving age verification and estimation. There will be no more avoiding, no more excuses, no more using age checking as a data point for commercial purposes. While the Bill requires age assurance under certain circumstances, age checking as a concept is not brought in by this Bill. It is already widely demanded by US, EU and UK laws, but it is poorly done and largely unregulated, so we continue to see children in their millions accessing platforms they are too young to be on and children who are 13, but do not yet have adult capacity, being offered services designed for adults which do not account for their vulnerabilities.
That brings me to the idea of a commencement clause in both amendments. The failure to implement Part 3 of the DEA means there is simply no trust left in the community that the Government will do as they say or even that the Online Safety Act will do as it says. That is not helped by the repeal of Part 3 last week, hidden in a group of government amendments on devolution. Putting a time limit in the Bill will, if history repeats itself, allow campaigners to go to the courts.
I am encouraged by indications that the Government will bring forward their own amendments and by their willingness to put some of this in the Bill; however, I must express a certain level of frustration at the tendency to reject what we have written in favour of something that clearly does less. I struggle to see why we need to argue that age assurance systems should be secure, that they should not use data for other purposes, that users should have a way of challenging age assurance decisions, or that they should be inclusive or accessible or take account of a child’s need to access certain information. These are not aspirational; they are a critical intervention needed to make age assurance workable. They have had a great deal of expert input over many years, so much so that they have been adopted voluntarily by the commercial age check sector. More importantly, without a transparent and trusted system of age assurance, all the provisions in the Bill aimed at ensuring that children have heightened protections will fall down.
The bereaved parents of Olly, Molly, Breck, Frankie and Sophie have come to your Lordships’ House to urge noble Lords to back these amendments. As they said when meeting Minister Scully, each of them had parental controls, each of them reported problems to companies, schools and/or the police—and still their children are dead. We need this regime for self-harm and pro-suicide material as much as we need it for pornography. If it were not against parliamentary rules, I would get down on my knees and beg the Minister to go back to the department and say that these amendments must be passed with the full force of their meaning. This is robust, practical and much needed to make the Bill acceptable to adults and safe for children.
Amendment 184, also in my name, which seeks to establish the age of porn performers, will be spoken to by others at greater length, but I take the opportunity to tell your Lordships’ House that this is already the case in the United States and it works very well. I suggest we follow suit. Finally, this group should be seen as a companion piece to the harms schedule put forward by the same group of noble Lords, and which has the full support of all those groups and people I mentioned at the outset. The scale and range of expertise that has gone into this package of amendments is astonishing and between them, they would go a very long way to ensure that companies do not profit from purveying harms to children. I beg to move.
Lord Bethell (Con)

My Lords, it is a tremendous honour to speak after the noble Baroness, Lady Kidron. I think it fair to say that we would just not be here today if not for the advocacy she has performed on this issue and on the parallel issue of harms, over a great many years. I say to the Minister and to any in the Chamber who are thinking about being in the regulatory space in the years to come that they are going to have the noble Baroness on their back on these issues and it is very well worth listening to her words.

I echo the thanks and tributes the noble Baroness, Lady Kidron, has already paid, but I also want to single out the Minister and thank him for the leadership he has shown on these issues. I know he is very passionate about transforming our relationship in the digital world and the role the Bill can play, and we all recognise his commitment to making sure that the Bill gets over the line. I also thank those from another place who have passed us the Bill and are now closely watching our proceedings in this Chamber. We know that the Bill will be going back there and it is worth bearing in mind that these provisions have a lot of scrutiny and interest from Members of Parliament.

I am extremely concerned that for all the Bill’s strengths—and it has a great many—the measures on age verification are ill-defined and require tightening up in five particular ways. As the noble Baroness, Lady Kidron, alluded to, I shall talk a little bit about the rationale for the amendments in my name and those of the noble Lord, Lord Stevenson, the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Kidron, and also speak in support of amendments in the same group from the noble Baronesses, Lady Ritchie and Lady Benjamin, and the noble Lord, Lord Allan.

To summarise, I am concerned about a mindset that is in awe of and deeply concerned about a colossal tsunami of legal judicial reviews coming our way as a result of measures to put pornography behind age verification, and that somehow the route out of that is to go into the battlefield with a very loose set of arrangements defined in the Bill and to perform regulation by consultation. Consultations are invaluable when it comes to implementation but are not appropriate for setting objectives. I fear that too often in the Bill at the moment, it is the objectives that are going out to the public, to vested interests and to the industry to consult on, and that is why the amendment seeks to put some of those objectives in the Bill. We want to give a clear definition for age verification, to put in a clear timetable and to ensure that those consultations, which will be contested, can be clear and deliver clear value.

I want to talk about our effective age-assurance schedule; the introduction of a threshold of “beyond reasonable doubt”; the introduction of independent auditing of the performance of age-verification measures; and our wish to erase proportionality provisions for websites that carry pornographic content, to introduce a clear timetable and to protect underage performers. At present, the Bill gives the example of age verification as just one possible measure for protecting children from accessing pornography. However, nowhere in the Bill is a clear definition or standard set out that age verification should meet. The amendments seek to address that.

At the moment the Bill defines age assurance as

“measures designed to estimate or verify the age or age-range of users of a service”.

That is just far too vague. There is no output-based performance standard. It leaves regulated services free to apply age-verification systems that at present have no clear oversight or quality control. We already know that some major platforms have a public position whereby apparently, they welcome child safety measures, but we also know that a great many pornography operators will resist implementing age verification to gain a competitive advantage or because they believe it will be detrimental to their business model. Leaving this wide-open gap for them to negotiate the operational efficiency of their age-verification measures is a big mistake.

That is why we have tabled Amendment 161, which would introduce an effective age-assurance schedule. That is an essential building block for the other amendments in this group, including the important Amendment 142. The effective age-assurance schedule would set out in the Bill the requirement for age checking for pornographic content to be of the absolute strongest kind. The amendment gives an indication of where the benchmark should be:

“‘age verification’ … beyond reasonable doubt”.

The amendment also sets out the underlying principles of age-verification regimes: that they must be independently audited, effective and privacy-preserving. The amendment also includes a commencement date to ensure that the other provisions for age-checking pornography can take effect as quickly as possible after the Bill receives Royal Assent.

We all agree that verifying age should be based not on any particular technology but on an outcome threshold. That is why we have pushed so hard for the “beyond reasonable doubt” threshold. It is an effort to set a clear and high bar for access to pornography. If that does not happen, I fear that, under the cover of proportionality, Ofcom will accept that Part 3 services, the regular websites, will need only to apply estimation techniques rather than demanding an increased level of assurance to ensure that minors cannot access pornography wherever it is found.

We have had feedback from the Bill team that the phrase “beyond reasonable doubt” is more usually found in criminal proceedings than civil legislation. I am very open to a discussion about whether there is a better or alternative phrase. If the Minister would like to address that point from the Dispatch Box, that would be very welcome.

On independent auditing, a number of noble Lords in this Committee have noted their concerns about internet companies marking their own homework. My noble friend Lady Wyld mentioned that and made a comparison with her own daughter’s homework on an earlier day in Committee.

18:30
The fear is that many of the large pornography companies are establishing their own age-verification companies. I admire their enthusiasm but doubt their intentions. It is incumbent on the Government to ensure that the tools they produce and implement meet the requirements of the legislation. For Ofcom to verify the effectiveness of a particular app or piece of software is neither intrusive nor a business restraint; it is simply about giving confidence to parents that the technology is robust and trustworthy. Therefore, we would look for this aspect of the amendment to be built into any age-verification system.
We need to be very clear about what we mean by “proportionate”. We need to make it crystal clear that any pornographic content that meets the Bill’s definition of pornography must always require verification at the outcome standard—we suggest “beyond reasonable doubt”. The Gambling Commission does not apply any proportionality test to age restrictions, and nor do Soho nightclubs or the sellers of pornographic magazines. We have made it clear that the worst content should always meet this bar.
As drafted, the Bill creates a substantial loophole for social media, which my noble friend Lady Harding alluded to. While I appreciate the Minister’s attempts to answer these questions during previous debates, it would be extremely helpful if he could address this point in his comments.
The timetable for these measures is a very significant issue. The noble Baroness, Lady Kidron, made this point extremely well. The Bill leaves the timetable for implementation completely open-ended. There is no clear road map from Ofcom for implementation and there is the opportunity for repeated delays. These measures have an enormous amount of public support and have the attention of both Houses. It is not reasonable to put forward legislation with an uncapped timetable attached.
Age verification is already in place in other countries. In the UK, video-on-demand platforms are already subject to age verification. The pornography industry was already preparing to implement AV in 2019 and therefore is ready to start implementing the changes quickly after measures are enacted. When the regulator and the legislature moved in Louisiana, the pornographic industry was able to bring in an implementation programme extremely quickly indeed.
On this issue, the Minister’s words on timing at Second Reading did not provide any hard dates, sequences or an implementation programme. The Minister said that the Government intend
“to have the regime operational as soon as possible after Royal Assent”.—[Official Report, 1/2/23; col. 774.]
As with Part 3 of the Digital Economy Act, which did not have a statutory deadline, there will always be a tendency to let the best be the enemy of the good, or to fear some kind of backlash from the industry that might delay commencement. That is why Amendment 142, in my name and those of the noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, and the right reverend Prelate the Bishop of Oxford, would direct Ofcom to prepare and issue a code of practice within six months of Royal Assent.
The issue of age verification and consent for performers is one I feel particularly strongly about. It is absolutely unconscionable that we do not have age verification and consent checks for anyone featured in pornographic content. We heard extremely powerful testimony at Second Reading from those who found that videos of themselves when they were underage were repeatedly featured on pornographic channels. It is impossible for them to take down that material or to reverse any decision they made to take part in one of these productions. It is absolutely our responsibility to change that with this Bill.
This was addressed in the House of Commons by Amendment 33 from Dame Diana Johnson. At that time, the Secretary of State, Jeremy Wright, claimed that the proposals were unworkable. I do not accept those points. There are many industries where the publishers establish the provenance of material, intellectual property ownership and the identity of participants. The music, film, gaming and software industries all have very good examples of good practice.
Many pornography sites already have measures in place to achieve these standards; for instance, OnlyFans, one of the largest players, has taken steps in that direction. The pornography industry, often the originator of technological innovation, can surely find mechanisms to address this point—that is my view, and that of over 80% of the UK public. I very much hope that my noble friend the Minister will address this point, which is in Amendment 184, based on the Commons amendment from Dame Diana Johnson.
The measures in this group are not meant to stifle innovation or to hold back the industry—quite the opposite. My noble friend Lady Harding alluded to the Industrial Revolution; taking children out of the pits led to a great investment in, and the growth of, the coal mining industry. Setting clear tracks for progress and putting in place humane provisions create the conditions under which industries can flourish. I fear that, if we do not get this one right, we will be tripping over ourselves; the pornographic industry will become grit in the gears of industry for years to come. By being clearer and more emphatic in these measures, the Bill can be an agent for innovation and encourage a great flourishing of these very important industries.
Baroness Benjamin (LD)

My Lords, I support everything that was said by the intrepid noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. I will speak to Amendment 185, which is in my name and is supported by the noble Lord, Lord Farmer. My amendment seeks to bring the regime for online pornography content in line with what exists offline.

The Video Recordings Act 1984 makes it a criminal offence to have prohibited content offline or to supply any unclassified work. Under this regulation, the BBFC will not classify any pornographic content that is illegal or material that is potentially harmful. That includes material that depicts or promotes child sex abuse, incest, trafficking, torture and harmful sexual acts. This content would not be considered R18, and so would be prohibited for DVD and Blu-ray. This also applies, under the Communications Act 2003, to a wide range of services that are regulated by Ofcom, from large providers such as ITVX or Disney+, to smaller providers including those that produce or provide pornographic content.

However, in the wild west of the online world, there is no equivalent regulation. Online pornography so far has been left to evolve without boundaries or limitations. Devastatingly, this has had a disastrous impact on child protection. Content that would be prohibited offline is widely available on mainstream pornographic websites. This includes material that promotes violent sexual activity, including strangulation; pornography that depicts incest, including that between fathers and daughters or brothers and sisters; and content that depicts sexual activity with adult actors made to look like children. This content uses petite, young-looking adult performers, who are made to look underage through props such as stuffed toys, lollipops and children’s clothing. This blurring of the depiction of sexual activity with adult actors who are pretending to be underage makes it so much harder to spot illegal child sex abuse material.

According to research by Dr Vera-Gray and Professor McGlynn, incest pornography is rife. Online, all of this can be accessed at the click of a button; offline, it would not be sold in sex shops. Surely this Bill should bring an end to such disparities. This content is extremely harmful: promoting violence against girls and women, sexualising children and driving the demand for real child sex abuse material, which of course is illegal.

Depictions of sexual activity with the title “teen” are particularly violent. A study analysing the content of the three most accessed pornographic websites in the UK found that the three most common words in videos containing exploitation were “schoolgirl”, “girl” and “teen”. It is clear that underage sexual activity is implied. How have we as a society arrived at a point where one of the most commonly consumed pornographic genres is sexual violence directed at children?

Our security services can confirm this too. Retired Chief Constable Simon Bailey, the former child protection lead at the National Police Chiefs’ Council, told the Independent Inquiry into Child Sexual Abuse that the availability of pornography was

“creating a group of men who will look at pornography”

so much that they reach

“the point where they are simply getting no sexual stimulation from it … so the next click is child abuse imagery”.

We know that the way pornography affects the brain means that users need more and more extreme content to fulfil themselves. It is like a drug. Pornography sites know this and exploit it. They design their sites to keep users for as long as possible, so as to increase exposure to adverts and therefore their revenue. They do this by presenting a user with ever-more extreme content. In 2021, Dr Vera-Gray and Professor McGlynn found that one in every eight titles advertised to a new user described acts of sexual violence.

I recently hosted a screening of the harrowing documentary “Barely Legal” here in the House of Lords. The documentary demonstrated just how far the pornography industry will go to make a profit, using extremely young-looking adult actors in content that suggests sexual activity with underage girls. Believe it or not, the pornography industry is worth much more than Hollywood; it makes thousands and thousands of dollars per second. Its quest for money comes at the expense of child protection and of society as a whole. This cannot be allowed to continue without regulation. Enough is enough.

Interviews with offenders who view illegal child sex abuse material in the UK indicate that most had not intentionally sought out child sex abuse materials. Nine out of 10 offenders said that they first encountered child sex abuse material through online pop-ups and linked material while looking at pornography sites.

I visited Rye Hill prison in Rugby, which houses over 600 sex offenders. Many said that they were affected by viewing porn, with devastating, life-changing outcomes. The largest ever survey of offenders who watch child sex abuse material online found significant evidence that those who watch illegal material are at high risk of going on to contact or abuse a child directly. Almost half said that they sought direct contact with children through online platforms after viewing child sexual abuse material.

This is an urgent and immediate child protection issue affecting our children. These concerns were shared earlier this year by the Children’s Commissioner for England, whose research found that 79% of children had encountered violent pornography

“depicting … degrading or pain-inducing sex acts”

before they reached the age of 18. The impact that this is having on our children is immeasurable.

18:45
I declare an interest as vice-president of the children’s charity, Barnardo’s. It has shared with many of us how its front-line services are working every day with children who have accessed pornography. It supports children who have taken part in acts that they have seen in pornographic videos, despite feeling uncomfortable and scared. Children see these acts as expected parts of relationships. I will never forget being told about a 10 year-old boy who said to a four year-old girl, “I’m going to rape you and you’re going to like it”. This is why I have campaigned to protect children from pornographic content for over a decade. We are creating a conveyor belt of child sex abusers who will inflict pain and suffering on others.
That is why I am supportive of the package of amendments tabled by the noble Baroness, Lady Kidron—for whom I have the highest regard—to ensure, among other vitally important things, that children are protected from accessing pornographic content. I believe that pornography is a gateway to so many other harms that are included in this Bill. Given the distinct proven harm of pornography, only the highest level of age verification is acceptable, where the age of the user is proved to be 18 or above, beyond reasonable doubt.
I am supportive of the amendment tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, which would place a duty on pornography companies to verify the age and consent of all performers in pornographic content. Pornography companies currently moderate user-uploaded pornographic content at great volume and speed, meaning that it is almost impossible for the moderators to spot content in which performers are underage or do not consent. Again, we cannot allow profit to come before protection, so this duty should be applied.
Without this package of amendments working in unison, the public health emergency of pornography—this devastating plague—will be allowed to create generations of children who are experiencing harm and generations of adults who create harm in their intimate relationships.
To sum up, I support a clearer definition of age assurance and age verification for pornography, and a six-month implementation deadline—we all know what happened with the repeated delays in implementing Part 3 of the Digital Economy Act 2017. We need performer age checks and quicker enforcement, without the need to go into court, and most of all, to cover all porn with the same regulation, whether on social media or on dedicated sites. I urge the Government to accept all these amendments. I look forward to receiving the Minister’s assurances that this regulation of online pornographic content will be included within the scope of the Online Safety Bill. We need to show our children that we truly care.
The Lord Bishop of Oxford

My Lords, it is such a privilege to follow the noble Baroness, Lady Benjamin. I pay tribute to her years of campaigning on this issue and the passion with which she spoke today. It is also a privilege to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, in supporting all the amendments in this group. They are vital to this Bill, as all sides of this Committee agree. They all have my full support.

When I was a child, my grandparents’ home, like most homes, was heated by a coal fire. One of the most vital pieces of furniture in any house where there were children in those days was the fireguard. It was there to prevent children getting too near to the flame and the smoke, either by accident or by design. It needed to be robust, well secured and always in position, to prevent serious physical harm. You might have had to cut corners on various pieces of equipment for your house, but no sensible family would live without the best possible fireguard they could find.

We lack any kind of fireguard at present and the Bill currently proposes an inadequate fireguard for children. A really important point to grasp on this group of amendments is that children cannot be afforded the protections that the Bill gives them unless they are identified as children. Without that identification, the other protections fail. That is why age assurance is so foundational to the safety duties and mechanisms in the Bill. Surely, I hope, the Minister will acknowledge both that we have a problem and that the present proposals offer limited protection. We have a faulty fireguard.

These are some of the consequences. Three out of five 11 to 13 year-olds have unintentionally viewed pornography online. That is most of them. Four out of five 12 to 15 year-olds say they have had a potentially harmful experience online. That is almost universal. Children as young as seven are accessing pornographic content and three out of five eight to 11 year-olds—you might want to picture a nine year-old you know—have a social media profile, when they should not access those sites before the age of 13. That profile enables them to view adult content. The nation’s children are too close to the fire and are being harmed.

There is much confusion about what age assurance is. As the noble Baroness, Lady Kidron, has said, put simply it is the ability to estimate or verify an individual’s age. There are many different types of age assurance, from facial recognition to age verification, which all require different levels of information and can give varying levels of assurance. At its core, age assurance is a tool which allows services to offer age-appropriate experiences to their users. The principle is important, as what might be appropriate for a 16 year-old might be inappropriate for a 13 year-old. That age assurance is absolutely necessary to give children the protections they deserve.

Ofcom’s research shows that more than seven out of 10 parents of children aged 13 to 17 were concerned about their children seeing age-inappropriate content or seeing adult or sexual content online. Every group I have spoken to about the Bill in recent months has shared this concern. Age assurance would enable services to create age-appropriate experiences for children online and can help prevent children’s exposure to this content. The best possible fireguard would be in place.

Different levels of age assurance are appropriate in different circumstances. Amendments 161 and 142 establish that services which use age assurance must do so in line with the basic rules of the road. They set out that age assurance must be proportionate to the level of risk of a service. For high-risk services, such as pornography, sites must establish the age of their users beyond reasonable doubt. Equally, a service which poses no risk may not need to use age assurance or may use a less robust form of age assurance to engage with children in an age-appropriate manner—for example, serving them the terms and conditions in a video format.

As has been said, age assurance must be privacy-preserving. It must not be used as an excuse for services to use the most intrusive technology for data-extractive purposes. These are such common-sense amendments, but vital. They will ensure that children are prevented from accessing the most high-risk sites, enable services to serve their users age-appropriate experiences, and ensure that age assurance is not used inappropriately in a way that contravenes a user’s right to privacy.

As has also been said, there is massive support for this more robust fireguard in the country at large, across this House and, I believe, in the other place. I have not yet been able to understand, or begin to understand, the Government’s reasons for not providing the best protection for our children, given the aim of the Bill. Better safeguards are technically possible and eminently achievable. I would be grateful if the Minister could attempt to explain what exactly he and the Government intend to do, given the arguments put forward today and the ongoing risks to children if these amendments are not adopted.

Baroness Ritchie of Downpatrick (Lab)

My Lords, it is a pleasure to follow the right reverend Prelate the Bishop of Oxford. He used an interesting analogy of the fireguard; what we want in this legislation is a strong fireguard to protect children.

Amendments 183ZA and 306 are in my name, but Amendment 306 also has the name of the noble Lord, Lord Morrow, on it. I want to speak in support of the general principles raised by the amendments in this group, which deal with five specific areas, namely: the definition of pornography; age verification; the consent of those participating in pornographic content; ensuring that content which is prohibited offline is also prohibited online; and the commencement of age verification. I will deal with each of these broad topics in turn, recognising that we have already dealt with many of the issues raised in this group during Committee.

As your Lordships are aware, the fight for age verification has been a long one. I will not relive that history but I remind the Committee that when the Government announced in 2019 that they would not implement age verification, the Minister said:

“I believe we can protect children better and more comprehensively through the online harms agenda”.—[Official Report, Commons, 17/10/19; col. 453.]


Four years later, the only definition for pornography in the Bill is found in Clause 70(2). It defines pornographic content as

“produced solely or principally for the purpose of sexual arousal”.

I remain to be convinced that this definition is more comprehensive than that in the Digital Economy Act 2017.

Amendment 183ZA is a shortened version of the 2017 definition. I know that the Digital Economy Act is out of vogue but it behoves us to have a debate about the definition, since what will be considered pornography is paramount. If we get that wrong, age verification will be meaningless. Everything else about the protections we want to put in place relies on a common understanding of when age verification will be required. Put simply, we need to know what it is we are subjecting to age verification and it needs to be clear. The Minister stated at Second Reading that he believed the current definition is adequate. He suggested that it ensured alignment across different pieces of legislation and other regulatory frameworks. In reviewing other legislation, the only clear thing is this: there is no standard definition of pornography across the legislative framework.

For example, Section 63 of the Criminal Justice and Immigration Act 2008 uses the definition in the Bill, but it requires a further test to be applied: meeting the definition of “extreme” material. Section 368E of the Communications Act 2003 regulates online video on demand services. That definition uses the objective tests of “prohibited material”, meaning material too extreme to be classified by the British Board of Film Classification, and “specially restricted material”, covering R18 material, while also using a subjective test that covers material that

“might impair the physical, mental or moral development”

of under-18s.

19:00
This is wholly different from the definition in Clause 70(2). The Bill’s definition is therefore not somehow aligned with other legislation. It primarily examines intention: why was a particular piece of content created? This is a subjective test that raises the question of whether content providers could argue that the intent of the material is not sexual arousal but, for example, art or education, and therefore that it falls outside the scope of the Bill. Subjectivity must be removed as far as possible. Amendment 183ZA proposes adding an objective test: would the content receive, or has it been given, an R18 or 18 certificate, or would it be considered too extreme for either? The subjective test remains, but the amendment makes it clear that the intention is for the Bill to cover material which, were it physical media, would receive a classification from the British Board of Film Classification that puts it firmly in the adult category. This definition is in line with offline platforms such as cinemas and DVDs and is broadly consistent with the definition already in place for other online platforms, such as video on demand services. Ensuring parity of regulation across platforms should be a key principle.
The Government have stated many times that what is illegal offline should be illegal online. It is illegal for a person to supply age-restricted physical media to someone below the age set by the British Board of Film Classification. This is exactly the same principle the Bill seeks to apply online by using age verification. I know your Lordships will agree that parents will expect the same standards that apply offline to apply online. Content deemed unsuitable for under-18s offline is unsuitable for them online. That is what the public expect this Bill to deliver, and my Amendment 183ZA seeks to do just that. Can the Minister be sure that the current definition will cover all material—18, R18 and unclassified—that would be age-gated for children offline?
That brings me to Amendment 185 in the name of the noble Baroness, Lady Benjamin, which also seeks parity between offline and online content by referencing the video on demand services standards. There is no barrier to finding material online which is illegal or so extreme that it would not receive a BBFC classification offline. Online, such material cannot be included in video on demand services, many of which are pornography providers. A couple of weeks ago, the noble Baroness, Lady Benjamin, hosted a film for us all to watch, and I must say I found it a very daunting depiction of what can actually happen. In her contribution this afternoon, the noble Baroness referred to much of what appeared that day. That was most instructive, and it clearly told me that such material should be banned online and offline.
It seems absurd that if this Bill is passed as it stands, online content delivered by user-to-user services or commercial pornography sites will be held to a lower standard than other online content or content sold offline in shops. Amendment 185 makes it clear that the regulation of pornography should be consistent across all platforms.
I will also make some comments about Amendment 184. Large pornography websites are filled with material that has been uploaded without consent. Material that features children is extremely concerning. Uploaded material also portrays adults who consented to being filmed but did not consent to the material being shared with the world. We also need to remember that pornography production can involve individuals who have been trafficked, coerced, forced or threatened. There should be a way to verify an individual’s age and their consent. Large porn companies need to be held to account. This Bill can deliver that, and I urge the Government to ensure that Amendment 184 is delivered on Report.
Regrettably, Amendments 184 and 185 would apply only to provider content, not user-to-user content—an issue I raised previously in Committee. That cannot be right, and it makes me wonder whether a service such as Pornhub might argue that third-party, studio-produced content that it uploads is not provider content, which could be another loophole in the Bill. I do not wish to reopen a previous debate in Committee, but there needs to be parity across Parts 3 and 5 of the Bill. I note that the noble Baroness, Lady Kidron, who is a leading light on this issue, has tabled Amendment 123A in an attempt to bridge the gap between Parts 3 and 5. However, I do not think the amendment gets to the heart of the issue. Between now and Report, we need to work together across the House to ensure that this issue of parity between Parts 3 and 5 is dealt with.
I turn to the other amendments in the name of the noble Baroness, Lady Kidron, and others. The need for robust age verification for pornography is undisputed. The Bill clearly creates different regulatory frameworks for different types of content, based on who uploads it rather than its impact. I repeat that pornography, wherever it is found, should be regulated to the highest standards, with consistency across the Bill. That is why I welcome these amendments, which seek to ensure that anyone seeking to access pornographic content is verified as being aged 18 or older and that the standard of proof for their age is deemed beyond reasonable doubt. There needs to be consistent age verification, rather than age assurance, for pornographic content.
I have some questions for the noble Baroness about the detail. I am concerned that the amendments do not deal with different regulatory frameworks for pornography in Parts 3 and 5, creating potential loopholes which could be exploited by large pornography providers. I am particularly concerned about the approach taken in paragraphs 3 and 4 of the proposed new schedule in Amendment 161, and subsection (3)(c) of the proposed new clause in Amendment 142. My Amendments 183A and 183B, which we debated previously, would ensure that duties are consistent across the Bill, and I urge the noble Baroness to reflect on the need for clarity and consistency before Report.
Finally, I turn to my Amendment 306, also in the name of the noble Lord, Lord Morrow, which would ensure that age verification is brought into force within six months of the Bill receiving Royal Assent. A whole generation of children have become adults since the Digital Economy Act was passed but never commenced. That cannot be allowed to happen again and is why we need a clear commencement date in the Bill. Parents expect age verification—and swiftly. Given the recent welcome announcement that primary priority content, including pornography, will be named in the Bill, that should be an achievable goal.
I hope the Minister will commit to bringing this back on Report. I was concerned on 25 April when he said that the Government intend to commence Parts 3 and 5 at different times. I ask how this would work in practice, as a pornography provider may have its own provider content and user-to-user content on the same service, which implies that one part of the service would be regulated while the other would not. In theory, the content could be the same images. Logically, that content should be regulated from the same moment.
Thousands of children over the years have been let down. We know the harm that is caused by pornography. We legislated in 2017 to stop that harm, yet if the Ofcom road map is accurate it could be two or three more years before pornography is regulated. That is unacceptable. We must ensure that we do all we can to bring in age verification as quickly as possible.
Baroness Jenkin of Kennington (Con)

My Lords, I too add my support to Amendments 123A, 142, 161, 183, 184, 185, 297, 300 and 306. I am grateful to my noble friend Lord Bethell and the noble Baroness, Lady Kidron, for putting before us such a comprehensive list of amendments seeking to protect children from a host of online harms, including online pornography. I am also grateful to the noble Baroness, Lady Benjamin, who, through her Amendment 185, draws our attention to the horrifying material that is prohibited in the offline world but is inexplicably legal in the online world. I also lend my support to Amendment 306 in the name of the noble Baroness, Lady Ritchie, and the noble Lord, Lord Morrow, in relation to the swift implementation of age verification for pornography. I am sorry to have jumped the queue.

I spoke at Second Reading on the harms of pornography to children, but so much more evidence has come to my notice since then. I recently wrote an article for the Daily Telegraph about age verification, which resulted in my inbox being absolutely flooded by parents saying, “Please keep going”. There are probably noble Lords here who feel that we have spoken enough about pornography over the last few weeks, but anybody who has watched any of this would, I am afraid, beg to differ. I hope noble Lords will forgive me for quoting from another email I received in response to that article, which is relevant to today’s debate. A young man wrote:

“When I first visited online porn, I was about 12”.


Incidentally, that is the average age of first exposure. He said:

“I can remember feeling that this was ‘wrong’ but also that it was something that all boys do. I had no idea about masturbation, but that soon followed, and I was able to shake off the incredibly depressive sensation of having done something wrong after finishing by finding many online resources informing me that the practice was not bad, and actually quite healthy. Only over the past 3 years have I been able to tackle this addiction and I am now 31.


I will try and keep this letter as succinct as possible, but I believe the issue of pornography is at the root of so many issues in society that nobody, no man at least, seems willing to speak about it openly. If you research what happens in the brain of a person viewing pornography”,


especially when so young,

“you see that the dopamine receptors get so fried it’s almost as bad as a heroin experience and far more addictive. Far more addictive, in that I can just log on to my phone and open Pandora’s box at any time, anywhere, and it’s all free.

I’ll tell you that I became alienated from women, in that I became afraid of them. Perhaps out of guilt for looking at pornography. Instead of having the confidence to ask a girl out and experience an innocent teenage romance, I would be in my room looking at all sorts of images.


The human brain requires novelty, mine does at least, so soon you find yourself veering off from the boring vanilla porn into much darker territories.


The internet gives you access to literally everything you could possibly imagine, and the more you get sucked down the rabbit hole, the more alienated you become from your peers. You are like an addict searching for your next hit, your whole world revolves around your libido and you can’t actually look at a woman without fantasising about sex.


Then if you do manage to enter into a relationship, the damage this causes is beyond comprehension. Instead of living each moment with your partner, you end up in a dual relationship with your phone, masturbating behind their back. In fact, your partner can’t keep up with the porn, and you end up with issues with your erections and finding her attractive.


Whenever you would watch information about porn on TV or the internet, you would be told that it should be encouraged and is healthy. You end up trying to watch porn with your partner, and all the weird psychological ramifications that has. You go further down the rabbit-hole, but for some reason nothing feels right and you have this massive crippling depression following you wherever you go in life”.


I hope noble Lords will forgive me for reading that fairly fully. It is a tiny illustration, and it is typical of how pornography steals men’s childhoods and their lives. I discussed this with young men recently, and one told me that, because he had been in Dubai—where there is no access to it—for a month, he feels much better and plans to keep away from this addictive habit. When young men reach out to Peers because they have nowhere else to go, we must surely concede that we have failed them. We have failed generations of boys and girls—girls who are afraid to become women because of what they see—and, if we do not do something now, we will fail future generations.

19:15
Porn addiction is very real and it is growing. As I just read, it triggers the dopamine processes in the brain and, just like addictive products such as tobacco and alcohol, it can create pathways within the brain that lead to cravings, which push consumers to search longer and continually for the same level of high. What is worse is that the amount of dopamine that floods the brain increases only with repeated consumption. Porn can trigger this process endlessly because it is endlessly available.
No one doubts that we have a serious problem on our hands. The Children’s Commissioner report published in January this year made clear that the volume of pornography accessed by children is rising and that pornography exposure is widespread and normalised, to the extent that children cannot opt out. We know that it is shaping sexual scripts and, in the absence of good relationships and sex education, it is teaching children that violent sex is normal, that girls and children like to be subjugated, and that boys and men need to be dominating and violent. We cannot let this continue, which is why I support the package of amendments outlined today by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell.
I strongly appreciate that the noble Baroness, Lady Ritchie, and the noble Lord, Lord Morrow, raised similar issues on the timeline for age verification. I am aware that the Bill makes provision for age verification for pornography, and I was relieved to hear that primary priority content and priority content would be in the Bill. But there is still a massive problem with the intended timeline for age verification: the Bill fails to outline a timeline for the implementation of it—there is no clear road map in the Bill—which allows for repeated delays. This is on top of what is already a repeated delay; as we heard, it could have been introduced in 2017. I and many others in both Houses fear that, without the timetable, it could be several more years before it is put into place. Parents expect age verification to start protecting children soon after Royal Assent, and a long delay will be unpopular and baffling to the public—it is baffling to me. Either it can be introduced now, in which case it should be, or it cannot, in which case I would be grateful if the Minister could explain why, in words that the public can understand.
I am also bewildered by the issue that the noble Baroness, Lady Benjamin, raised. I seek to support her by asking my noble friend the Minister: why is content depicting child sexual abuse allowed to be freely accessed online when it would be prohibited or illegal in the offline world? I watched with horror the documentary “Barely Legal”, screened here last week. It outlined this violent and horrific material, with young women dressed up to look like children, told to look as young as possible and having sex—and worse—with older men. The instructions from the interviewed porn directors and producers who produce this material were that the younger these women could be made to look, the better. This material contributes to the number of porn consumers—mostly men—who, as a result of watching this material, seek out real child sexual abuse material. So I fully support the amendment from the noble Baroness, Lady Benjamin, which would make this content illegal and prohibited online, as it is offline. I ask my noble friend the Minister to explain why this is not already the case.
Lord Morrow (DUP)

My Lords, I wish to direct my comments to Amendments 123A, 142, 161, 183, 297 and 306, and I want to focus mainly on age verification.

The starting point for effective age verification is being able to define what it is that is being regulated. That is why I support Amendment 183ZA in the name of the noble Baroness, Lady Ritchie. Without that clarity on what it is that needs to be age verified, the expectation of parents will not be met, and children who deserve the strongest and fullest protection will still be subject to harm. The noble Baroness made a convincing case for parity of regulation across different media, and that indeed was the principle behind the definition in the Digital Economy Act 2017. I hope that the Minister will set out either today in the Committee or via a letter afterwards how content which would fall under the different British Board of Film Classification ratings of 18, R18 and unclassified would be treated under the definition in this Bill. I cannot stress this enough: a proper definition of pornography is the foundation of good age-verification legislation. I agree with the noble Baroness, Lady Ritchie, that the definition in the Bill is just not robust.

I stood here on 28 January 2022 at the Second Reading of my Private Member’s Bill encouraging the Government to bring into effect Part 3 of the Digital Economy Act 2017, which would have brought in age verification for commercial pornographic websites. Looking back to 2017, that legislation seems pioneering. Other countries have implemented similar measures since, but our children are still waiting despite the Government’s assurances, when they postponed implementation of Part 3 in October 2019 in favour of the Bill we are debating today, that preventing children’s access to pornography is a critically urgent issue. It still is. On 9 May, the Children’s Commissioner again reiterated the importance of age verification. She said,

“I am categorically clear: no child should be able to access or watch pornography. Protecting children from seeing inappropriate material is critical”.


I congratulate the noble Baroness, Lady Kidron, on yet again trying to persuade the Government to protect children in a robust manner. I am also pleased to support the noble Baroness, Lady Ritchie, as a co-signatory to her Amendment 306. She has learned the lessons of 2017 and is seeking to ensure that this policy cannot be abandoned again. A six-month implementation clause is the very least that we should accept.

It is now well documented that early exposure to pornography carries with it a host of harms—we have just heard something about that from the previous speaker—for the children and young people exposed and for society more widely. The Children’s Commissioner has plainly spelled out those harms in the reports she has published this year. These dangers are ones which parents alone are unable to prevent. With pornography now merely one click away for children across the UK, it is no wonder that the majority of children have already been exposed to pornography before they become teenagers. It is heartbreaking that most children report that they have been exposed to it accidentally. The reality is that robust age verification is an effective antidote to this pervasive problem. Shockingly, only 4.5% of the top 200 pornographic websites have any mechanism to prevent or detect children accessing their sites, and it is unlikely that they would meet the bar of robust age verification.

It is clear that there is almost unanimous support for age verification across your Lordships’ House. However, the question before us is whether the Bill as it stands enables a robust enough level of protection. I welcome the duties in Clauses 11 and 22 on age verification. At present, the Bill enables pornographic sites to apply light-touch and potentially entirely inadequate age verification. Without a coherent, consistent approach, we are leaving the door open to those wishing to circumvent the prevention of harm we are putting in place.

Considering what we know and have heard about this industry, while the inclusion of age verification checks is welcome, is it really appropriate to leave the critical task of designing and implementing them to the pornography industry? I feel not. Should they be the ones charged with safeguarding our children and young people? Really? Would many in the pornographic industry prioritise their own web traffic over the welfare of children and young people? I think the answers to those questions are very clear.

Amendments 142 and 161 set higher standards so that the test is that there should be a “beyond reasonable doubt” age verification approach which should apply to Part 5 services or material that meet the definition of pornography in Clause 70(2). This will ensure children and young people are proactively protected from the deeply detrimental impact of online pornography.

While I welcome the new schedule introduced by Amendment 161, one of my key concerns is that there should be a consistent approach to enforcement of age verification when it comes to online pornography. Regardless of which websites it is found on, all forms of pornography should be held to the highest possible regulatory standards. I recognise that the noble Baroness has sought to go some way to addressing this with Amendment 123A, but this covers only Ofcom guidance on what constitutes pornographic material, rather than the requirements of age verification per se and the duty under Clause 11. Indeed, the amendment refers to age assurance, not age verification.

I know that we have already debated these points under earlier amendments tabled by the noble Baroness, Lady Ritchie, but the importance of the point cannot be overstated. Indeed, the Children’s Commissioner said the same in her report of 9 May, saying that the requirement for robust age verification must be

“consistent across all types of regulated services – both user to user sites and pornography providers”.

I must ask the noble Baroness, Lady Kidron—whom I have admired for what she has been doing here—why paragraphs 3 and 4 of the proposed new schedule in Amendment 161 create different barriers under separate regulatory regimes for Part 3 services and Part 5 services. Why does the new clause proposed in Amendment 142 on the Ofcom guidance on age assurance refer only to Part 5 in subsection (3)(c)? Why is it that regulatory regimes for content produced by providers will differ from those where content is uploaded by users?

As it stands, those two regulatory regimes will be treated differently and, as I have said, I am not reassured by Amendment 123A. My concern is that, without change, we will not see all digital pornography treated with parity, serving only to create ambiguity and potential loopholes. I hope your Lordships will take note of the advice of the Children’s Commissioner and prevent that happening.

I would also like to ask the noble Baroness about her plans for bringing this schedule into effect. Paragraph 8 says the schedule should be in effect within 12 months, but there is no obligation elsewhere to bring in Parts 3 and 5 on that same timetable, so age verification may be required but could not be implemented until other parts of the Bill are commenced.

I urge the noble Baroness, Lady Kidron, to take note of Amendment 306 in the name of the noble Baroness, Lady Ritchie, which puts the commencement elements of age verification into the commencement clause with a timetable of three months for the Ofcom guidance and the rest of Part 5 and relevant enforcement powers within six months. Of course, the noble Baroness, Lady Ritchie, in her earlier amendments, intended for Part 3 to have the same duties as Part 5 and I certainly hope we will come back to that on Report. I was concerned that, at the end of that debate, the Minister said about commencement:

“This may mean there will be a limited period of time during which Part 5 protections are in place ahead of those in Part 3”. —[Official Report, 25/4/23; col. 1201.]


This again reiterates a dual approach to Part 3 and Part 5 services which host pornography.

I hope the Minister will assure us that, now that it is the Government’s plan that pornography will be on the face of the Bill as primary priority content, they are making it clear to service providers now that they need to plan to prevent children accessing this material, so that it will be possible to commence the duties at the same time. Indeed, this would be in line with the comments made by the Minister in Committee in the other place, although I regret that he was not as ambitious about the timetable as Amendment 306.

19:30
I conclude by saying that we are well aware of the dangers that online pornography poses and, while age verification on pornographic websites is an important step forward, we must utilise this opportunity to ensure that the age verification processes are robust and able to function in such a way as to prevent children and young people being exposed to pornography. This change should be brought in as soon as possible and consistently across the Bill.
Lord Farmer (Con)

My Lords, I support the noble Baroness, Lady Benjamin, in bringing the need for consistent regulation of pornographic content to your Lordships’ attention and have added my name in support of Amendment 185. I also support Amendments 123A, 142, 161, 183, 184 and 306 in this group.

There should not be separate regimes for how pornographic content is regulated in this country. I remember discussions about this on Report of the Digital Economy Bill around six years ago. The argument for not making rules for the online world consistent with those for the offline world was that the CPS was no longer enforcing laws on offline use anyway. Then as now, this seems simply to be geared towards letting adults continue to have unrestricted access to an internet awash with pornographic material that depicts and/or promotes child sexual abuse, incest, trafficking, torture, and violent or otherwise harmful sexual acts: adult freedoms trumping all else, including the integrity of the legal process. In the offline world, this material is illegal or prohibited for very good reason.

The reason I am back here, arguing again for parity, is that, since 2017, an even deeper seam of academic research has developed which fatally undermines the case for untrammelled cyber-libertarianism. It has laid bare the far-reaching negative impacts that online pornography has had on individuals and relationships. One obvious area is the sharp rise in mental ill-health, especially among teenagers. Research from CEASE, the Centre to End All Sexual Exploitation, found that over 80% of the public would support new laws to limit free and easy access.

Before they get ensnared—and some patients of the Laurel Centre, a private pornography addiction clinic, watch up to 14 hours of pornography a day—few would have been aware that sexual arousal chained to pornography can make intimate physical sex impossible to achieve. Many experience pornography-induced erectile dysfunction and Psychology Today reports that

“anywhere from 17% to 58% of men who self-identify as heavy/compulsive/addicted users of porn struggle with some form of sexual dysfunction”.

As vice-chair of the APPG on Issues Affecting Men and Boys, I am profoundly concerned that very many men and boys are brutalised by depictions of rape, incest, violence and coercion, which are not niche footage on the dark web but mainstream content freely available on every pornography platform that can be accessed online with just a few clicks.

The harms to their growing sons, which include an inability to relate respectfully to girls, should concern all parents enough to dial down drastically their own appetite for porn. There is enormous peer pressure on teenage boys and young men to consume it, and its addictive nature means that children and young people, with their developing brains, are particularly susceptible. One survey of 14 to 18 year-olds found almost a third of boys who used porn said it had become a habit or addiction and a third had enacted it. Another found that the more boys watched porn and were sexually coercive, the less respect they had for girls.

Today’s headlines exposed the neurotoxins in some vaping products used by underage young people. There are neurotoxins in all the porn that would be caught by subsection 368E(2) of the Communications Act 2003 if it were offline—hence the need for parity. Just as with the vapes, children as well as adults will continue to be exposed. Trustworthy age verification will stop children stumbling across it or finding it in searches, but adults who are negligent, or determined to despoil children’s innocence, will facilitate their viewing it if it remains available online. This Bill will not make the UK the safest place in the world for children online if we continue to allow content that should be prohibited, for good reason, to flood into our homes.

Helen Rumbelow, writing in the Times earlier this month, said the public debate—the backdrop to our own discussions in this Bill—is “spectacularly ill-informed” because we only talk about porn’s side-effects and not what is enacted. So here goes. Looking at the most popular pages of the day on Pornhub, she found that 12 out of 32 showed men physically abusing women. One-third of these showed what is known as “facial abuse”, where a woman’s airway is blocked by a penis: a porn version of waterboarding torture. She described how

“in one a woman is immobilised and bound by four straps and a collar tightened around her neck. She ends up looking like a dead body found in the boot of a car. In another a young girl, dressed to look even younger in a pair of bunny ears and pastel socks, is held down by an enormous man pushing his hand on her neck while she is penetrated. The sounds that came from my computer were those you might expect from a battle hospital: cries of pain, suction and “no, no, no”. I won’t tell you the worst video I saw as you may want to stop reading now. I started to have to take breaks to go outside and look at the sky and remember kindness”.

Turning briefly to the other amendments, I thank my noble friend Lord Bethell for his persistence in raising the need for the highest standard of age verification for pornography. I also commend the noble Baroness, Lady Kidron, for her continued commitment to protecting children from harmful online content and for representing so well the parents who have lost children, in the most awful of circumstances, because of online harms. I therefore fully support the package of amendments in this group tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell.

This Bill should be an inflection point in history, and future generations will judge us on the decisions we make now. It is highly likely they will say “Shame on them”. To argue that we cannot put the genie back in the bottle is defeatist and condemns many of our children and grandchildren to the certainty of a dystopic relational future. I say “certain” because it is the current reality of so many addicted adults who wish they could turn back the clock. Therefore, it is humane and responsible, not quaint or retrogressive, to insist that this Government act decisively to make online and offline laws consistent and reset the dial.

Lord Allan of Hallam (LD)

My Lords, I will speak to my Amendment 232, as well as addressing issues raised more broadly by this group of amendments. I want to indicate support from these Benches for the broader package of amendments spoken to so ably by the noble Baroness, Lady Kidron. I see my noble friend Lord Clement-Jones has returned to check that I am following instructions during my temporary occupation of the Front Bench.

The comments I will make are going to focus on an aspect which I think we have not talked about so much in the debate, which is age assurance in the context of general purpose, user-to-user and search services, so-called Part 3, because we like to use confusing language in this Bill, rather than the dedicated pornography sites about which other noble Lords have spoken so powerfully. We have heard a number of contributions on that, and we have real expertise in this House, not least from my noble friend Lady Benjamin.

In the context of age assurance more generally, I start with a pair of propositions that I hope will be agreed to by all participants in the debate and build on what I thought was a very balanced and highly informative introduction from the noble Baroness, Lady Kidron. The first proposition is that knowledge about the age of users can help all online platforms develop safer services than they could absent that information—a point made by the right reverend Prelate the Bishop of Oxford earlier. The second is that there are always some costs to establishing age, including to the privacy of users and through some of the friction they encounter when they wish to use a service. The task before us is to create mechanisms for establishing age that maximise the safety benefits to users while minimising the privacy and other costs. That is what I see laid out in the amendment that the noble Baroness, Lady Kidron, has put before us.

My proposed new clause seeks to inform the way that we construct that balance by tasking Ofcom with carrying out regular studies into a broad range of approaches to age assurance. This is exactly the type of thinking that is complementary to that in Amendment 142; it is not an alternative but complementary to it. We may end up with varying views on exactly where that balance should be struck. Again, I am talking about general purpose services, many of which seek to prohibit pornography—whether or not they do so 100%; a different set of arguments applies to them from those that apply to services which are explicitly dedicated to pornography. We may come to different views about where we eventually strike the balance, but I think we probably have a good, shared understanding of the factors that should be in play. I certainly appreciate the conversations I have had with the noble Baroness, Lady Kidron, and others about that, and think we have a common understanding of what we should be considering.

If we can get this formulation right, age assurance may be one of the most significant measures in the Bill in advancing online safety, but if we get it wrong, I fear we may create a cookie banner scenario, such as the one I warned about at Second Reading. This is my shorthand for a regulatory measure that brings significant costs without delivering its intended benefits. However keen we are to press ahead, we must always keep in mind that we do not want to create legislation that is well-intended but does not have the beneficial effect that we all in this Committee want.

Earlier, the noble Baroness, Lady Harding, talked about the different roles that we play. I think mine is to try to think about what will actually work, and whether the Bill will work as intended, and to try to tease out any grit in it that may get in the way. I want in these remarks to flag what I think are four key considerations that may help us to deliver something that is actually useful and avoid that cookie banner outcome, in the context of these general purpose, Part 3 services.

First, we need to recognise that age assurance is useful for enabling as well as disabling access to content—a point that the noble Baroness, Lady Kidron, rightly made. We rightly focus on blocking access to bad content, but other things are also really important. For example, knowing that a user is very young might mean that the protocol for the reporting system gets to that user report within one hour, rather than 24 hours for a regular report. Knowing that a user is young and is being contacted by an older user may trigger what is known as a grooming protocol. Certainly at Facebook we had that: if we understood that an older user was regularly contacting younger users, that enabled us to trigger a review of those accounts to understand whether something problematic was happening—something that the then child exploitation and online protection unit in the UK encouraged us to implement. A range of different things can then be enabled. The provision of information in terms that a 13 year-old would understand can be triggered if you know the age of that user.

Equally, perfectly legitimate businesses, such as alcohol and online gambling businesses, can use age assurance to make sure that they exclude people who should not be part of that. We in this House are considering measures such as junk food advertising restrictions, which again depend on age being known to ensure that junk food which can be legitimately marketed to older people is not marketed to young people. In a sense, that enables those businesses to be online because, absent the age-gating, they would struggle to meet their regulatory obligations.

Secondly, we need to focus on outcomes, using the risk assessment and transparency measures that the Bill creates for the first time. We should not lose sight of those. User-to-user and search services will have to do risk assessments and share them with Ofcom, and Ofcom now has incredible powers to demand information from them. Rather than asking, “Have you put in an age assurance system?”, we can ask, “Can you tell us how many 11 year-olds or 15 year-olds you estimate access the wrong kind of content?”, and, “How much pornography do you think there is on your service despite the fact that you have banned it?” If the executives of those companies mislead Ofcom or refuse to answer, there are criminal sanctions in the Bill.

The package for user-to-user and search services enables us to really focus on those outcomes and drill down. In many cases, that will be more effective. I do not care whether they have age-assurance type A or type B; I care whether they are stopping 99.9% of 11 year-olds accessing the wrong kind of content. Now, using the framework in the Bill, Ofcom will be able to ask those questions and demand the answers, for the first time ever. I think that a focus on outcomes rather than inputs—the tools that they put in place—is going to be incredibly powerful.

19:45
The third consideration is quite a difficult one. We need to plan for actual behaviour, and how it will change over time and how we adapt to that, rather than relying on assumptions that we make today. My experience is that actual behaviour often differs. Cookie banners are an example. The assumption of the regulator was, “We’ll put these cookie banners in place, and it will be so off-putting that people will stop using cookies”. The reality is completely different: everyone has just carried on using cookies and people click through. The behaviour has not matched up to expectations.
You can put a lot of these tools in place, such as age assurance and age-restricted services. If you build an age-restricted version of your service—there is a YouTube for kids along with many other kids’ services—then you can see whether or not they are going to be acceptable. If people are rejecting them, you need to adapt. There is no point saying, “Well, you should go and use YouTube Kids”. If people are signing up for it but finding it too restrictive and going elsewhere, we need to be able to think about how we can adapt to that and work with it.
The reality today, as the right reverend Prelate the Bishop of Oxford referred to, is that 60% of kids are on social media, and in many cases their parents have bought the phone and enabled that access. How do we deal with that? We cannot just bury our heads in the sand and ignore it; we have to be able to adapt to that behaviour and think about what tools work in that environment.
My last point on adaptation is that we found, working in the industry, that sometimes the law incentivised ignorance, which is the worst possible outcome. We had age-estimation tools that allowed us to understand that children were 11 or 12. They may have passed an age check by providing ID that showed that they were overage, and their parents may have helped them, but we knew they were not. But that knowledge itself created legal risk, so we would bury the knowledge. If kids are going on these online platforms, my view is that I would much rather the platforms use all the tools available—we should not discourage them from understanding that these are 11 year-olds—and we find a way to work with that and make the service as safe as possible. These are hard questions because, while we want the law to work in an absolute way, in practice people are very creative and will work around and through systems.
The final point about making all this work is understanding the key role of usability. I was struck by the compelling vision of the noble Baroness, Lady Kidron, of low-friction age assurance. There are issues of principle and practice when it comes to making sure that age assurance is usable. The argument around principle is that we as a free society do not want to create unnecessary friction for people to access information online. We can put measures in place and impinge on freedom of expression in a way that is necessary and proportionate. With regard to all the arguments that we have heard about access to pornography sites, the case is of course clear and absolute—there is a strong case for putting in place very restrictive measures—but for other general purpose services that younger people may want to use to connect with family and friends, we need to tread quite carefully if we are putting in place things that might infringe on their rights. The UN convention also talks about the right to express oneself. We need to be careful about how we think that through, and usability is critical to that.
I share the noble Baroness’s vision, in a sense. A lot of it could happen at the phone level, when a parent sets a phone up for their 11 year-old. The phone knows that you are an 11 year-old, and there is a system in place for the phone to tell the apps that you install that you are 11. That works in a smooth way and is potentially low friction but very effective. There are exceptions, but, by and large, teenagers have their own phone and the phone is tied to them; if you know the age of the phone then you have a highly reliable understanding of the age of the user. There is a lot in there. It is those kinds of low-friction measures that we should be looking for.
At the other end of the spectrum, if every app that you use asks you to upload your passport, that is not going to work. People will just bypass it, or they will stick to the apps they know already and never install new ones. We would end up concentrating the market, and our friends at the Competition and Markets Authority would not be very pleased with that potential outcome. It is about making something usable for both a principled reason—not blocking access to legitimate services—and a pragmatic reason: making sure we do not create an incentive to circumvent it. One of the phrases the team doing age verification at Facebook would use was, “Our competition is lying”. If people do not like what you are doing, they will simply lie and find their way round it. We need to bear that in mind, even under the new regime.
If we are mindful of these practical considerations, we should be able to deliver very useful age-assurance tools. I hope the Minister will agree with that. I look forward to hearing the Government’s description of how they think all this will work, because that bit is missing from the debate. I know other Lords will have heard from Ofcom, which started putting out information about how it thinks it will work. The age assurance community has started putting out information. The gap in all this is that it is not clear what the Government think this future world will look like. I hope they can at least start to fill that gap today by trying to explain to us, in response to these amendments, their vision for the new world of age assurance, not just for the pornography sites but, critically, for all these other sites that millions of us use that certainly aim to be gateways not to pornography but rather to other forms of communication and information.
Baroness Gohir (CB)

My Lords, I am still getting used to the rules in Committee. I did not get up quickly enough before the noble Lord, Lord Allan, so I hope I am able to add my voice to the amendments.

I support the amendments tabled by the noble Baroness, Lady Kidron, and supported by the noble Lord, Lord Bethell, to bring about robust age verification for pornography wherever it is found online and the need to prove beyond reasonable doubt that the user is above the age of 18. I also support the amendment tabled by the noble Baroness, Lady Benjamin, which would mean that content that is prohibited and illegal offline would also be prohibited online, and the other amendments in this group, tabled by the noble Baroness, Lady Ritchie, and the noble Lord, Lord Allan.

The impact that pornography has on violence against women and girls is well documented, yet the Bill does not address it. What is considered mainstream pornography today would have once been seen as extreme. In fact, swathes of content that is readily available online for all to view in just a few clicks would be illegal and prohibited by the British Board of Film Classification to possess or supply offline, under the powers given to it by the Video Recordings Act 1984. But because online pornography has been allowed to evolve without any oversight, pornographic content that includes overt sexual violence, such as choking, gagging and forceful penetration, is prevalent. This is alongside content that sexualises children, as mentioned by the noble Baroness, Lady Benjamin, which includes petite, young-looking adult actors being made to look like children, and pornography which depicts incest.

This content is not just available in niche corners of the internet or the dark web; it is presented to users on mainstream websites. Research by the academics Clare McGlynn and Fiona Vera-Gray into the titles of videos that were available on the landing page of three of the UK’s most popular pornography websites revealed that one in eight titles used descriptions such as “pain”, “destroy”, “brutal”, “torture”, “violate”, “hurt”, and many others that are too unpleasant to mention in this Chamber. To reiterate, these videos were available on the landing pages and presented to first-time visitors to the site without any further searching necessary. We have to acknowledge that the vast majority of online pornographic content, viewed by millions across the globe, is directly promoting violence against women and girls.

A very real example of this impact is Wayne Couzens, the murderer of Sarah Everard. In court, a former colleague set out how Couzens was attracted to “brutal sexual pornography”. Indeed, it is not hard to find an addiction to violent pornography in the background of many notorious rapists and killers of women. While not all men will jump from violent pornography to real-life harm, we know that for some it acts as a gateway to fulfilling a need for more extreme stimulation.

A study published in 2019 in the National Library of Medicine by Chelly Maes involving 568 adolescents revealed that exposure to pornographic content was related to individuals’ resistance towards the #MeToo movement and increased acceptance of rape myths. Even the Government’s own research found substantial evidence of an association between the use of pornography and harmful attitudes and behaviours towards women and girls, yet the Bill does nothing to bring parity between the way that pornography is regulated online and offline. That is why I support the amendment in the name of the noble Baroness, Lady Benjamin, to address this inconsistency.

We must remember that while this pornography has an effect on the viewer, it also has an effect on the performers taking part. Speaking to the APPG on Commercial Sexual Exploitation’s recent inquiry into pornography, Linda Thompson, the national co-ordinator at the Women’s Support Project, said:

“We know pornography is a form of violence against women. It is not just fantasy; it is the reality for the women involved. It is not just a representation of sex, it is actual sexual violence that is occurring to the women”.


There is currently no obligation for pornography companies to verify that a performer in pornographic content is over the age of 18 and that they consent. Once the videos are uploaded on to platforms, they go through little moderation, meaning that only the most overtly extreme and obviously illegal and non-consensual content is readily identified and reported. This means that for many girls and women, their sexual abuse and rape is readily available and is viewed for pleasure. The Bill is an opportunity to rectify this and to put this duty on pornography companies. For that reason, I am supportive of amendments tabled by the noble Baroness, Lady Kidron, to put this duty on them.

Finally, we must remember the impact that having access to this content is having on our children right now. A recent report by the Children’s Commissioner for England about pornography and subsequent harmful sexual behaviour and abuse demonstrates this. Using data from children’s own testimonies about cases of child sexual abuse committed against them by another child, references to specific acts of sexual violence commonly found in pornography were present in 50% of the cases examined.

Pornography is how children are receiving a sex education. We know the impact that this type of pornographic content is having on adults, yet we are allowing children to continue having unfettered access as their attitudes towards sex and relationships are forming. In 2021, the singer Billie Eilish spoke out about her experiences of watching pornography from the age of 11, saying:

“The first few times I … had sex, I was not saying no to things that were not good. It was because I thought that’s what I was supposed to be attracted to”.


Children are seeing violent sexual acts as normal and expected parts of relationships. This cannot continue. That is why I am supporting the amendments tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, to bring in robust age verification for pornography wherever it is found online. I thank them and the noble Baroness, Lady Benjamin, for their hard work in raising awareness of the online risks posed to children. I also express gratitude to all the children’s charities such as Barnardo’s which work tirelessly day and night to keep children safe. The Government also have a duty to keep children safe.

20:00
Baroness Foster of Aghadrumsee (Non-Afl)

My Lords, I too should have spoken before the noble Lord, Lord Allan; I should have known, given his position on the Front Bench, that he was speaking on behalf of the Liberal Democrats. I was a little reticent to follow him, knowing his expertise in the technical area, but I am very pleased to do so now. I support this very important group of amendments and thank noble Lords for placing them before us. I echo the thanks to all the children’s NGOs that have been working in this area for so long.

For legislators, ambiguity is rarely a friend, and this is particularly true in legislation dealing with digital communications, where, as we all acknowledge, the law struggles to keep pace with technical innovation. Where there is ambiguity, sites will be creative and will evade what they see as barriers—of that I have no doubt. Therefore, I strongly believe that there is a need to have clarity where it can be achieved. That is why it is important to have in the Bill a clear definition of age verification for pornography.

As we have heard this evening, we know that pornography is having a devastating impact on our young people and children: it is impacting their mental health and distorting their views of healthy sexual relationships. It is very upsetting for me that evidence shows that children are replicating the acts they see in pornographic content, thinking that it is normal. It is very upsetting that, in particular, young boys who watch porn think that violence during intimacy is a normal thing to do. The NSPCC has told us that four in 10 boys aged 11 to 16 who regularly view porn say they want to do that because they want to get ideas as to the type of sex they want to try. That is chilling. Even more chilling is the fact that content is often marketed towards children, featuring characters from cartoons, such as “Frozen”, “Scooby Doo” and “The Incredibles”, to try to draw young people on to those sites. Frankly, that is unforgivable; it is why we need robust age verification to protect our children from this content. It must apply to all content, regardless of where it is found; we know, for instance, that Twitter is often a gateway to pornographic sites for young people.

The noble Lord, Lord Bethell, referred to ensuring, beyond all reasonable doubt, that the user is over 18. I know that that is a very high standard—it is the criminal law level—but I believe it is what is needed. I am interested to hear what the Minister has to say about that, because, if we are to protect children and if we take on the role of the fireguard, which the right reverend Prelate referred to, we need to make sure that it is as strong as possible.

Also, this is not just about making sure that users are over 18; we need to make sure that adults, not children, are involved in the content. The noble Baroness, Lady Benjamin, talked about adults being made to look like children, but there is also the whole area of young people being trafficked and abused into pornography production; therefore, Amendment 184 on performer age checks is very important.

I finish by indicating my strong support for Amendment 185 in the name of the noble Baroness, Lady Benjamin. Some, if not most, of the content on mainstream pornography sites is degrading, extremely abusive and violent. Such content would be prohibited in the offline world, where it is illegal to possess; it includes sexual violence such as strangulation, incest and the sexualisation of children. We know that this is happening online because, as we have heard, some of the most frequently searched terms on porn sites are “teens”, “schoolgirls” or “girls”, and the lack of regulation online has allowed content to become more and more extreme and abusive. That is why I support Amendment 185 in the name of the noble Baroness, Lady Benjamin, which seeks to bring parity between the online and offline regulation of pornographic content.

This Bill has been eagerly awaited. There is no doubt about that. It has been long in the gestation—some people would say too long. We have had much discussion in this Committee but let us get it right. I urge the Minister to take on board the many points made this afternoon. That fireguard needs not only to be put in place, but it needs to be put in place so that it does not move, it is not knocked aside and so that it is at its most effective. I support the amendments.

Baroness Harding of Winscombe (Con)

My Lords, I also failed to stand up before the noble Lord, Lord Allan, did. I too am always slightly nervous to speak before or after him for fear of not having the detailed knowledge that he does. There have been so many powerful speeches in this group. I will try to speak swiftly.

My role in this amendment was predefined for me by the noble Baroness, Lady Kidron, as the midwife. I have spent many hours debating these amendments with my noble friend Lord Bethell, the noble Baroness, Lady Kidron, and with many noble Lords who have already spoken in this debate. I think it is very clear from the debate why it is so important to put a definition of age assurance and age verification on the face of the Bill. People feel so passionately about this subject. We are creating the digital legal scaffolding, so being really clear what we mean by the words matters. It really matters and we have seen it mattering even in the course of this debate.

My two friends—they are my friends—the noble Baroness, Lady Kidron, and my noble friend Lord Bethell both used the word “proportionate”, with one not wanting us to be proportionate and the other wanting us to be proportionate. Yet, both have their names to the same amendment. I thought it might be helpful to explain what I think they both mean—I am sure they will interrupt me if I get this wrong—and explain why the words of the amendment matter so much.

Age assurance should not be proportionate for pornography. It should be the highest possible bar. We should do everything in our power to stop children seeing it, whether it is on a specific porn site or on any other site. We do not want our children to see pornography; we are all agreed on that. There should not be anything proportionate about that. It should be the highest bar. Whether “beyond reasonable doubt” is the right wording or it should instead be “the highest possible bar practically achievable”, I do not know. I would be very keen to hear my noble friend the Minister’s thoughts on what the right wording is because, surely, we are all clear it should be disproportionate; it should absolutely be the hardest we can take.

Equally, age assurance is not just about pornography, as the noble Lord, Lord Allan, has said. We need to have a proportionate approach. We need a ladder where age assurance for pornography sits at the top, and where we are making sure that nine year-olds cannot access social media sites if they are age-rated for 13. We all know that we can go into any primary school classroom in the land and find that the majority of nine year-olds are on social media. We do not have good age assurance further down.

As both the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, have said, we need age assurance to enable providers to adapt the experience to make it age-appropriate for children on services we want children to use. It needs to be both proportionate and disproportionate, and that needs to be defined on the face of the Bill. If we do not, I fear that we will fall into the trap that the noble Lord, Lord Allan, mentioned: the cookie trap. We will have very well-intentioned work that will not protect children and will go against the very thing that we are all looking for.

In my role as the pragmatic midwife, I implore my noble friend the Minister to hear what we are all saying and to help us between Committee and Report, so that we can come back together with a clear definition of age assurance and age verification on the face of the Bill that we can all support.

Baroness Stowell of Beeston (Con)

My Lords, about half an hour ago I decided I would not speak, but as we have now got to this point, I thought I might as well say what I was going to say after all. I reassure noble Lords that in Committee it is perfectly permissible to speak after the winder, so no one is breaking any procedural convention. That said, I will be very brief.

My first purpose in rising is to honour a commitment I made last week when I spoke against the violence against women and girls code. I said that I would none the less be more sympathetic to and supportive of stronger restrictions preventing child access to pornography, so I want to get my support on the record and honour that commitment in this context.

My noble friend Lady Harding spoke on the last group about bringing our previous experiences to bear when contributing to some of these issues. As I may have said in the context of other amendments earlier in Committee, as a former regulator, I know that one of the important guiding principles is to ensure that you regulate for a reason. It is very easy for regulators to have a set of rules. The noble Baroness, Lady Kidron, referred to rules of the road for the tech companies to follow. It is very easy for regulators to examine whether those rules are being followed and, having decided that they have, to say that they have discharged their responsibility. That is not good enough. There must be a result, an outcome from that. As the noble Lord, Lord Allan, emphasised, this must be about outcomes and intended benefits.

I support making it clear in the Bill that, as my noble friend Lady Harding said, we are trying to prevent, disproportionately, children accessing pornography. We will do all we can to ensure that it happens, and that should be because of the rules being in place. Ofcom should be clear on that. However, I also support a proportionate approach to age assurance in all other contexts, as has been described. Therefore, I support the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell, and the role my noble friend Lady Harding has played in arriving at a pragmatic solution.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, it is a privilege to be in your Lordships’ House, and on some occasions it all comes together and we experience a series of debates and discussions that we perhaps would never have otherwise reached, and at a level which I doubt could be echoed anywhere else in the world. This is one of those days. We take for granted that every now and again, we get one of these rapturous occasions when everything comes together, but we forget the cost of that. I pay tribute, as others have, to the noble Baroness, Lady Kidron. She has worked so hard on this issue and lots of other issues relating to this Bill and has exhausted herself more times than is right for someone of her still youthful age. I am very pleased that she is going off on holiday and will not be with us for a few days; I wish her well. I am joking slightly, but I mean it sincerely when I say that we have had a very high-quality debate. That it has gone on rather later than the Whips would have wanted is tough, because it has been great to hear and be part of. However, I will be brief.

It was such a good debate that I felt a tension, in that everybody wanted to get in and say what they wanted to say to be sure they were on the record. That can sometimes be a disaster, because everyone repeats everything, but as the noble Baroness, Lady Harding, said, we know our roles, we know what to say and when to say it, and it has come together very nicely. Again, we should congratulate ourselves on that. However, we must be careful about something which we keep saying to each other but sometimes do not do. This is a Bill about systems, not content. The more that we get into the content issues, the more difficult it is to remember what the Bill can do and what the regulator will be able to do if we get the Bill to the right place. We must be sure about that.

I want to say just a few things about where we need to go with this. As most noble Lords have said, we need certainty: if we want to protect our children, we have to be able to identify them. We should not be in any doubt about that; there is no doubt that we must do it, whatever it takes. The noble Lord, Lord Allan, is right to say that we are in the midst of an emerging set of technologies, and there will be other things coming down the line. The Bill must remain open to that; it must not be technology-specific, but we must be certain of what this part is about, and it must drill down to that. I come back to the idea of proportionality: we want everybody who is 18 or under to be identifiable as such, and we want to be absolutely clear about that. I like the idea that this should be focused on the phones and other equipment we use; if we can get to that level, it will be a step forward, although I doubt whether we are there yet.

20:15
Secondly, we must not leave open the idea that there will somehow be different approaches to how we regulate what children see in this space. We are talking about pornography here. Whether it comes through a Part 3 or Part 5 service, or accidentally through a blog or some other piece of information, it has to be stopped. We do not want our children to receive it. That must be at the heart of what we are about, and not just something we think about as we go along. The Bill is complicated and difficult to read. It has to be because the services are different, but that should not be at the expense of the ability to say at the end: “We did it”.
Thirdly, I worry about some things that have crept into the debate on the proportionality issue. If “a small number” means that we will somehow let a few children see something, that will not be acceptable. Everybody has said this. Let us be clear about it: this is either 100% or it is not worth doing. That being so, the question of how we do it is not about finding the right form of words, such as “beyond reasonable doubt”; it is about certainty.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

As the noble Baroness, Lady Kidron, set out at the beginning of this debate, the amendments in this group have involved extensive discussions among Members in both Houses of Parliament, who sit on all sides of both Houses. I am very grateful for the way noble Lords and Members in another place have done that. They have had those preliminary discussions so that our discussions in the debate today and in preparation for it could be focused and detailed. I pay particular tribute to the noble Baroness, Lady Kidron, and my noble friends Lord Bethell and Lady Harding, who have been involved in extensive discussions with others and then with us in government. These have been very helpful indeed; they continue, and I am happy to commit to their continuing.

Age-assurance technologies will play an important role in supporting the child safety duties in this Bill. This is why reference is made to them on the face of the Bill—to make it clear that the Government expect these measures to be used for complying with the duties to protect children from harmful content and activity online. Guidance under Clause 48 will already cover pornographic content. While this is not currently set out in the legislation, the Government intend, as noble Lords know, to designate pornographic content as a category of primary priority content which is harmful to children. As I set out to your Lordships’ House during our debate on harms to children, we will amend the Bill on Report to list the categories of primary and primary priority content on the face of the Bill.

I am very grateful to noble Lords for the engagement we have had on some of the points raised in Amendments 142 and 306 in recent weeks. As we have been saying in those discussions, the Government are confident that the Bill already largely achieves the outcomes sought here, either through existing provisions in it or through duties in other legislation, including data protection legislation, the Human Rights Act 1998 and the Equality Act 2010. That is why we think that re-stating duties on providers which are already set out in the Bill, or repeating duties set out in other legislation, risks causing uncertainty, and why we need to be careful about imposing specific timelines on Ofcom by which it must produce age-assurance guidance. It is essential that we protect Ofcom’s ability robustly to fulfil its consultation duties for the codes of practice. If Ofcom is given insufficient time to fulfil these duties, the risk of legal challenge being successful is increased.

I welcome Ofcom’s recent letter to your Lordships, outlining its implementation road map, which I hope provides some reassurance directly from the regulator on this point. Ofcom will prioritise protecting children from pornography and other harmful content. It intends to publish, this autumn, draft guidance for Part 5 pornography duties and draft codes of practice for Part 3 illegal content duties, including for child sexual exploitation and abuse content. Draft codes of practice for children’s safety duties will follow next summer. These elements of the regime are being prioritised ahead of others, such as the category 1 duties, to reflect the critical importance of protecting children.

Although we believe that the Bill already largely achieves the outcomes sought, we acknowledge the importance of ensuring that there are clear principles for Ofcom to apply when recommending or requiring the use of age-assurance technologies. I am happy to reassure noble Lords that the Government will continue to consider this further and are happy to continue our engagement on this issue, although any amendment must be made in a way that sits alongside existing legislation and within the framework of the Bill.

I turn to Amendments 161 and 183. First, I will take the opportunity to address some confusion about the requirements in Parts 3 and 5 of the Bill. The Bill ensures that companies must prevent children accessing online pornography, regardless of whether it is regulated in Part 3 or Part 5. The Government are absolutely clear on this point; anything less would be unacceptable. The most effective approach to achieving this is to focus on the outcome of preventing children accessing harmful content, which is what the Bill does. If providers do not prevent children accessing harmful content, Ofcom will be able to bring enforcement action against them.

I will address the point raised by my noble friend Lord Bethell about introducing a standard of “beyond reasonable doubt” for age verification for pornography. As my noble friend knows, we think this a legally unsuitable test which would require Ofcom to determine the state of mind of the provider, which would be extremely hard to prove and would therefore risk allowing providers to evade their duties. A clear, objective duty is the best way to ensure that Ofcom can enforce compliance effectively. The Bill sets clear outcomes which Ofcom will be able to take action on if these are not achieved by providers. A provider will be compliant only if it puts in place systems and processes which meet the objective requirements of the child safety duties.

The provisions in the Bill on proportionality are important to ensure that the requirements in the child safety duties are tailored to the size and capacity of providers. Smaller providers or providers with less capacity are still required to meet the child safety duties where their services pose a risk to children. They will need to put in place sufficiently stringent systems and processes that reflect the level of risk on their services and will need to make sure these systems and processes achieve the required outcomes of the child safety duties.

The Government expect companies to use age-verification technologies to prevent children accessing services which pose the highest risk of harm to children, such as online pornography. However, companies may use another approach if it is proportionate to the findings of the child safety risk assessment and a provider’s size and capacity. This is an important element to ensure that the regulatory framework remains risk-based and proportionate.

Age verification may not always be the most appropriate or effective approach for user-to-user companies to comply with their duties. For example, if a user-to-user service such as a social medium does not allow—

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

I am sorry to interrupt. The Minister said that he would bear in mind proportionality in relation to size and capacity. Is that not exactly the point that the noble Baroness, Lady Harding, was trying to make? In relation to children, why will that be proportionate? A single child being damaged in this way is too much.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

The issue was in relation to a provider’s size and capacity; it is an issue of making sure it is effective and enforceable, and proportionate to the size of the service in question. It may also not be the most effective approach for companies to follow to comply with their duties. If there is a company such as a user-to-user social media service that says it does not allow pornography under its terms of service, measures such as content moderation and user reporting might be more appropriate and effective for protecting children than age verification in those settings. That would allow content to be better detected and taken down, while—

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

I understand that, but it is an important point to try to get on the record. It is an outcome-based solution that we are looking for, is it not? We are looking for zero activity where risks to children are there. Clearly, if the risk assessment is that there is no risk that children can be on that site, age verification may not be required— I am extending it to make a point—but, if there is a risk, we need to know that the outcome of that process will be zero. That is my point, and I think we should reflect on that.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I am very happy to, and the noble Lord is right that we must be focused on the outcomes here. I am very sympathetic to the desire to make sure that providers are held to the highest standards, to keep children protected from harmful content online.

Lord Bethell Portrait Lord Bethell (Con)
- Hansard - - - Excerpts

I know the Minister said that outcomes are detailed in the Bill already; I wonder whether he could just write to us and describe where in the Bill those outcomes are outlined.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I shall happily do that, and will happily continue discussions with my noble friend and others on this point and on the appropriate alternative to the language we have discussed.

On the matter of Ofcom independently auditing age-assurance technologies, which my noble friend also raised, the regulator already has the power to require a company to undertake and pay for a report from a skilled person about a regulated service. This will assist Ofcom in identifying and assessing non-compliance, and will develop its understanding of the risk of failure to comply. We believe that this is therefore already provided for.

I reassure noble Lords that the existing definition of pornographic content in the Bill already captures the same content that Amendment 183ZA, in the name of the noble Baroness, Lady Ritchie of Downpatrick, intends to capture. The definition in the Bill shares the key element of the approach Ofcom is taking for pornography on UK-established video-sharing platforms. This means that the industry will be familiar with this definition and that Ofcom will have experience in regulating content which meets it.

The definition is also aligned with that used in existing legislation. I take on board the point she made about her trawl of the statute book for it, but the definition is aligned elsewhere in statute, such as in the Coroners and Justice Act 2009. This means that, in interpreting the existing definition in the Bill, the courts may be able to draw on precedent from the criminal context, giving greater certainty about its meaning. The definition of pornography in Part 5 is also consistent with the British Board of Film Classification’s guidelines for the definition of sex works, which is

“works whose primary purpose is sexual arousal or stimulation”

and the BBFC’s definition of R18. We therefore think it is not necessary to refer to BBFC standards in this legislation. Including the definition in the Bill also retains Parliament’s control of the definition, and therefore also which content is subject to the duties in Part 5. That is why we believe that the definition as outlined in the Bill is more straightforward for both service providers and Ofcom to apply.

I turn to Amendments 184 and 185. The Government share the concerns raised in today’s debate about the wider regulation of online pornography. It is important to be clear that extreme pornography, so-called revenge pornography and child sexual exploitation and abuse are already illegal and are listed as priority offences in the Bill. This means that under the illegal content duties, Part 3 providers, which will include some of the most popular commercial pornography services, must take proactive, preventive measures to limit people’s exposure to this criminal content and behaviour.

20:30
As I have made clear in previous debates, providers will need to protect children from all forms of online pornography, including illegal pornography or content that the British Board of Film Classification refuses to classify. Providers in scope of Part 5 are publishers which directly control the material on their services, and which can already be held liable for existing extreme pornography and child sexual exploitation and abuse offences captured by the criminal law. The most appropriate mechanism for dealing with these services is, rather than a regulatory regime, the criminal law.
I can also reassure noble Lords that the Government’s new offences relating to sharing and sending intimate images without consent will apply to providers in scope of Part 5. They will be criminally liable for any non-consensual intimate images published on their service.
In relation to Amendment 184, as intimate image abuse will already be illegal in criminal law, it is unnecessary to include a specific duty for Part 5 providers to prohibit this content. Any publisher that shares such images on its site would risk breaking the law and could face a prison sentence. The Bill is also not the right mechanism to regulate content produced or published by the adult industry with regard to the consent of performers appearing in pornographic content. Copyright and contract law already gives performers based in the UK the right to authorise the making of a recording of their performance. Any works recorded and made available to the public without the performer’s consent would constitute an infringement of their rights. As a private right, it is for the performer to enforce this, not a broader regulatory regime.
Lord Bethell Portrait Lord Bethell (Con)
- Hansard - - - Excerpts

Does my noble friend the Minister recognise that those laws have been in place for the 30 years of the internet but have not successfully been used to protect the rights of those who find their images wrongly used, particularly those children who have found their images wrongly used in pornographic sites? Does he have any reflections on how that performance could be improved?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I would want to take advice and see some statistics, but I am happy to do that and to respond to my noble friend’s point. I was about to say that my noble friend Lady Jenkin of Kennington asked a number of questions, but she is not here for me to answer them.

I turn to Amendment 232 tabled by the noble Lord, Lord Allan of Hallam. Because of the rapid development of age-assurance technologies, it is right that they should be carefully assessed to ensure that they are used effectively to achieve the outcomes required. I am therefore sympathetic to the spirit of his amendment, but must say that Ofcom will undertake ongoing research into the effectiveness of age-assurance technologies for its various codes and guidance, which will be published. Moreover, when preparing or updating the codes of practice, including those that refer to age-assurance technologies, Ofcom is required by the Bill to consult a broad range of people and organisations. Parliament will also have the opportunity to scrutinise the codes before they come into effect, including any recommendations regarding age assurance. We do not think, therefore, that a requirement for Ofcom to produce a separate report into age-assurance technologies is a necessary extra burden to impose on the regulator.

In relation to this and all the amendments in this group, as I say, I am happy to carry on the discussions that we have been having with a number of noble Lords, recognising that they speak for a large number of people in your Lordships’ House and beyond. I reiterate my thanks, and the Government’s thanks, to them for the way in which they have been going about that. With that, I encourage them not to press their amendments.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I thank everyone for their contributions this evening. As the noble Lord, Lord Stevenson, said, it is very compelling when your Lordships’ House gets itself together on a particular subject and really agrees, so I thank noble Lords very much for that.

I am going to do two things. One is to pick up on a couple of questions and, as has been said by a number of noble Lords, concentrate on outcomes rather than contributions. On a couple of issues that came up, I feel that the principle of pornography being treated in the same way in Parts 3 and 5 is absolute. We believe we have done it. After Committee we will discuss that with noble Lords who feel that is not clear in the amendment to make sure they are comfortable that it is so. I did not quite understand in the Minister’s reply that pornography was being treated in exactly the same way in Parts 3 and 5. When I say “exactly the same way”, like the noble Lord, Lord Allan, I mean not necessarily by the same technology but to the same level of outcome. That is one thing I want to emphasise because a number of noble Lords, including the noble Baroness, Lady Ritchie, the noble Lord, Lord Farmer, and others, are rightly concerned that we should have an outcome on pornography, not concentrate on how to get there.

The second thing I want to pick up very briefly, because it was received so warmly, is the question of devices and on-device age assurance. I believe that is one method, and I know that at least one manufacturer is thinking about it as we speak. However, it is an old battle in which companies that do not want to take responsibility for their services say that people over here should do something different. It is very important that devices, app stores or any of the supposed gatekeepers are not given an overly large responsibility. It is the responsibility of everyone to make sure that age assurance is adequate.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

I hope that what the noble Baroness is alluding to is that we need to include gatekeepers, app stores, device level and sideloading in another part of the Bill.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

But of course—would I dare otherwise? What I am saying is that these are not silver bullets and we must have a mixed economy, not only for what we know already but for what we do not know. We must have a mixed economy, and we must not make any one platform of age assurance overly powerful. That is incredibly important, so I wanted to pick up on that.

I also want to pick up on user behaviour and unintended consequences. I think there was a slight reference to an American law, which is called COPPA and is the reason that every website says 13. That is a very unhelpful entry point. It would be much better if children had an age-appropriate experience from five all the way to 18, rather than on and off at 13. I understand that issue, but that is why age assurance has to be more than one thing. It is not only a preventive thing but an enabling thing. I tried to make that very clear so I will not detain the Committee on that.

On the outcome, I say to the Minister, who has indeed given a great deal of time to this, that more time is needed because we want a bar of assurance. I speak not only for all noble Lords who have made clear their rightful anxiety about pornography but also on behalf of the bereaved parents and other noble Lords who raised issues about self-harming of different varieties. We must have a measurable bar for the things that the Bill says that children will not encounter—the primary priority harms. In the negotiation, that is non-negotiable.

On the time factor, I am sorry to say that we are all witness to what happened to Part 3. It was pushed and pushed for years, and then it did not happen—and then it was whipped out of the Bill last week. This is not acceptable. I am happy, as I believe other noble Lords are, to negotiate a suitable time that gives Ofcom comfort, but it must be possible, with this Bill, for a regulator to bring something in within a given period of time. I am afraid that history is our enemy on this one.

The third thing is that I accept the idea that there has to be more than principles, which is what I believe Ofcom will provide. But the principles have to be 360 degrees, and the questions that I raised about security, privacy and accessibility should be in the Bill so that Ofcom can go away and make some difficult judgments. That is its job; ours is to say what the principle is.

I will tell one last tiny story. About 10 years ago, I met in secret with one of the highest-ranking safety officers in one of the companies that we always talk about. They said to me, “We call it the ‘lost generation’. We know that regulation is coming, but we know that it is not soon enough for this generation”. On behalf of all noble Lords who spoke, I ask the Government to save the next generation. With that, I withdraw the amendment.

Amendment 123A withdrawn.
Clause 48 agreed.
House resumed. Committee to begin again not before 9.22 pm.

Online Safety Bill

Committee (8th Day) (Continued)
21:29
Clause 49: “Regulated user-generated content”, “user-generated content”, “news publisher content”
Amendment 124
Moved by
124: Clause 49, page 47, line 6, at end insert—
“(2A) Subsection (2)(e) does not apply in respect of a regulated user-to-user service which is operated by an organisation which—(a) is a relevant publisher (within the meaning of section 41 of the Crime and Courts Act 2013), and(b) has an annual UK turnover in excess of £100 million.”Member’s explanatory statement
This amendment seeks to ensure the comment sections of the largest newspaper websites are subject to the Online Safety Bill’s regulatory regime.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I shall speak to Amendment 124 but also to Amendments 126 and 227, all of which were tabled by my noble friend Lord McNally and supported by the noble Lord, Lord Lipsey. Sadly, they are both unable to do battle today, for health reasons, and I start by wishing them both a speedy recovery. I hope that I at least partly do justice to their intentions and to these amendments today.

These amendments are designed to address significant loopholes in the Bill which have been very clearly pointed out by Hacked Off, Impress—the press regulator—and the Press Recognition Panel. These loopholes risk enabling extremist publishers to take advantage of the overbroad “recognised news publisher” exemption and allow hatred and other online harms to spread on some of the most popular social media forums online—the newspaper comment sections. Amendment 124 would remove comment sections operated by news websites where the publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content.

Some of the most harmful online content is in newspaper comment sections, which are in fact social media forums themselves and are read by millions of readers every day. Hacked Off has found examples of misogyny, explicit anti-Semitic language, Holocaust denial and more. Women in public life are also the target of misogyny in these comments sections. Professor Corinne Fowler, an academic who was criticised by some newspapers after contributing to a National Trust report, describing her experience, wrote that

“unregulated comments beneath articles, including the Telegraph and The Times as well as the Daily Mail and the Express … contained scores of suggestions about how to kill or injure me. Some were general ideas, such as hanging, but many were gender-specific, saying that I should be burnt at the stake like a witch … without me knowing, my son (then 12 years old) read these reader comments. He became afraid for my safety. The comments were easily accessible: he googled ‘Corinne Fowler National Trust’ and scrolled below the articles. No child should have to deal with hate speech directed at a parent”.

Amendment 126 would have the effect of incentivising newspapers to sign up to an independent regulator. It would expand the definition of a “recognised news publisher” to incorporate any entity that is a member of an approved regulator, while excluding publishers that are not members of such a regulator, unless they are broadcasters and regulated by Ofcom. Recognised news publishers enjoy wide exemptions in the Bill. Their content is not only protected from being taken down by platforms, but a new provision will require platforms to actively consult media publishers before removing their content. As a result, news publishers will enjoy greater free speech rights under the Bill than private citizens.

The criteria to qualify as a “recognised news publisher” are different for broadcasters and other media. For broadcasters, outlets must be regulated by Ofcom. For non-broadcast media, outlets need only meet a list of vague criteria: have a standards code, which could say anything; have a complaints process, which could also say anything; have a UK office; have staff; and not be a sanctioned title. As a result, a host of extremist and disinformation publishing websites may qualify immediately, or with minor administrative changes, for this rather generous exemption. For example, conspiracy theorist and racist David Icke’s website could qualify with minor administrative changes. He would be free to propagate his dangerous and, in many cases, anti-Semitic conspiracies on social media. Heritage and Destiny, an openly racist website, would likewise be able to qualify with minor changes and spread racial hatred on social media. Infowars could open up a UK office, qualify and spread harmful content on social media.

This amendment would replace that vague list of criteria with the simple requirement that, to access the exemption, non-broadcast media publishers must be in a PRP-approved independent regulator. The effect would be that extremists and harmful publishers would not be able to access the exemption. All publishers would have the same free speech rights as everyone else, unless they are otherwise regulated under the charter system or Ofcom in the case of broadcasters.

Amendment 227 requires Ofcom’s reporting on the impact of the regulatory regime on the availability and treatment of news publishers and journalistic content to also cover what impact the news publisher exemption and journalistic content duty have on the regime’s efficacy. The Bill requires Ofcom to publish a report on whether the new regime will harm freedom of the press. This is despite the fact the Bill already goes to extraordinary lengths to protect the interests of the press. This very modest amendment would require Ofcom’s report to also query whether the news publisher exemption is undermining the regulatory regime.

Impress, which is the UK’s only press regulator approved by the Press Recognition Panel under royal charter, says that the Bill leaves the public vulnerable and exposed to online harms and therefore falls short of the Government’s aim of making the UK the safest place to be online. It has summarised the three ways in which the current Bill is in danger of undermining its principal function—to protect the public from online harms—which could be resolved by these amendments.

First, the Bill creates an uneven playing field. A poor definition of what constitutes a news publisher threatens to undermine the public protection benefits of the Bill. Secondly, the Bill misses an opportunity to fight misinformation or disinformation. The Bill undermines industry standards and fails to distinguish journalism from fake news. Thirdly, the Bill could be easily used as a cover to spread serious harms. The Bill’s current journalism exemptions create dangerous loopholes which could easily be exploited to spread misinformation and disinformation. Publishers should be required to demonstrate compliance and oversight in relation to their published code of conduct and complaints policy.

If we needed any more persuasion, a letter to me from David Wolfe KC, the chair of the PRP, provides an additional twist:

“I am writing to draw your attention to the Bill’s potential impact on the regulation of the press and news publishers in the UK. Specifically, to Clause 50 of the Bill, which explains the circumstances in which news publishers are taken out of the proposed Ofcom regulatory regime … it does not specify any minimum standards and does not specify who is to assess publishers. The practical implication, though, is that Ofcom—whose board are appointed by the Secretary of State … and which operates under their direct oversight—will not only set the minimum requirements but also undertake the assessment. Paradoxically, the possibility of political interference, which Lord Leveson and the Royal Charter set out to avoid (in the Royal Charter and PRP framework) might now be directly introduced for all UK news publishers”.


That means that the national press, which has avoided regulation, is coming under the regulation of Ofcom. I will be very interested to hear what a number of noble Lords might have to say on that subject.

Taken together, these amendments would address serious flaws in the Bill, and I very much hope that the Government’s response will be to reflect on them. I beg to move.

Lord Black of Brentwood Portrait Lord Black of Brentwood (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I join the noble Lord in wishing the noble Lords, Lord McNally and Lord Lipsey, well. I hope they are watching us on the television—perhaps as a cure for insomnia at this time of night. I declare my interest as deputy chairman of the Telegraph Media Group and of the Regulatory Funding Company and note my other interests set out in the register. I must admit I was gripped by a sense of déjà vu when I saw these amendments on the Marshalled List, because I fear they risk catapulting us back into the debate over matters which were settled a decade ago in response to events which took place two decades or more ago.

Before coming on to the detail of some of the amendments that the noble Lord set out, I will make a few general points which relate principally to Amendments 126 and 227 but impinge on the whole group.

First, I do not believe that this Bill, which is about the enormous, unaccountable and unregulated platforms and the dangers they pose to the vulnerable, is the place to reopen the debate about press regulation. Later in the year there will be a media Bill, recently published in draft, which will contain provisions to repeal Section 40 of the Crime and Courts Act 2013. If noble Lords want to discuss the whole issue of the royal charter and punitive legislation against the press, I respectfully suggest that that is the time and place to do so.

Secondly, this Bill has widespread support. The vast majority of people agree with its aims, even if we have disagreements at the edges. If the Bill ceases to be the Online Safety Bill and becomes the state regulation of the press Bill, it will become enormously controversial not just here but internationally.

That is my third point: the enormous global ramifications of seeking to use novel online legislation to force state-backed regulation on the press. The Crime and Courts Act 2013 and the establishment of the royal charter were roundly condemned by international press freedom organisations worldwide—the very same press freedom organisations we all claim to support when talking about the safety of journalists or the way in which the press is controlled in authoritarian regimes. Those same organisations condemned it utterly and they would look on with incredulity and horror if this, the first brave piece of legislation in the world to tackle online safety, was corrupted in this way and in a manner which sent the wrong signals to undemocratic regimes worldwide that it is okay to censor the press in the name of making the platforms accountable.

I was going to make a few comments about IPSO, which the noble Lord raised, but I see that the noble Lord, Lord Faulks, is in his place and I am sure he will make them much more effectively than I would.

The other general point is that this group of amendments flies in the face of the most fundamental Leveson recommendation. In his report, he stressed that it was essential that the system of self-regulation remained voluntary. What these proposals do is the antithesis of that. In effect, they hold a gun to the head of the industry and say, “Either you join a state-approved regulator, or you’re subject to the statutory control of Ofcom”. There is no voluntary element in that at all because either route ends up in a form of state regulation. That is Hobson’s choice.

Finally, as I have said to this House before, and I hoped I would never have to say again, the vast majority of the press will not under any circumstances join a regulator which is authorised by a state body and underpinned by the threat of legislation. Even Sir Brian Leveson said that he recognised that this was a matter of principle. That principle is that the press cannot be free if it is subject to any form of statutory control, however craftily concealed. That position has existed for many centuries and is threatened by the amendments. The reason for that is that if Amendment 126, and some of the others, went through, none of the major publishers at national, regional and local level, nor magazines, would be exempt from the terms of the Bill and would become subject to the statutory control of Ofcom—something that Ofcom has always made clear that it wants nothing to do with—and the prospect of unlimited penal sanctions. That is the end of a free press, by any definition.

I will very briefly discuss a few specifics. Amendment 124 seeks to bring the comments sections of basically all national newspaper websites within the Bill’s statutory regime. These are already regulated by IPSO, unless the noble Lord, Lord Faulks, corrects me, and they come under its jurisdiction as soon as a complaint is made to the publishers, even if they are not moderated. Unlike social media, which is entirely different in its reach and impact, editors are legally responsible for what appears on their websites, which is why in most cases there are strong content moderation procedures in place. That is why comments sections rightly fall within the limited functionality exemption in the Bill, because there is such limited scope for harm. The impact of Amendment 124 would be to introduce confusing and complex double regulation of comments sections on websites, to the detriment of the public who wish to engage in legitimate debate.

21:45
Amendment 127 also veers in the direction of extending statutory controls, because it is a subjective test, unlike the others in Clause 50, which would in effect require either the tech platforms or Ofcom to make value judgments about the timeliness of complaints handling, either by publishers or by IPSO. When it comes to media freedom, subjective tests in the hands of state regulators end up making bad law.
Finally, Amendment 227 seeks to extend Ofcom’s powers to include an assessment of whether the news publisher exemption is adversely impacting the online safety regime. That would again place a state regulator in the position of assessing whether independent voluntary self-regulation, of the sort envisaged by Leveson, complied with an online safety regime which was never intended to encompass press regulation. It is, in effect, the royal charter by the backdoor, trying to shoehorn a square peg into a round hole in a way which makes this legislation and the powers of the regulator even more complex and controversial.
At the end of the day, this should not be a press regulation Bill, and it is wrong to try to do that. It is a Bill about the responsibility of the vast unaccountable, unregulated platforms which disseminate so much dangerous and harmful content without anyone having recourse, as we have heard powerfully already this afternoon, and not a Bill about the publishers who produce verifiable, trusted journalism which is the lifeblood of a democracy. We confuse the two at our peril and at the cost of the free press, which I know all your Lordships hold dear.
Lord Faulks Portrait Lord Faulks (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, much of what I would have said has been said by the noble Lord, Lord Black, so I will make my contribution brief. Elegantly dressed up as these amendments were by the noble Lord on behalf of the noble Lords, Lord Lipsey and Lord McNally, to whom I also say get well soon, they are in fact intended to change the way the press is currently regulated. I declare my interest as chairman of IPSO, a post I have held since January 2020. IPSO regulates 95%, by circulation, of the printed press, and that includes online versions of newspapers.

Noble Lords will remember the Leveson inquiry, following the discovery of unacceptable press practices including phone hacking. Parliament’s response was to create the Press Recognition Panel and the concept of an approved regulator. It was not state regulation, but nor was it the status quo ante. Only one regulator has sought and attained approved status: Impress. The Press Recognition Panel was chaired by David Wolfe KC, who provided a quotation to the noble Lord. Impress is funded by the estate of Max Mosley. It does not regulate any of the main national newspapers, which have either, like the Guardian, elected for self-regulation, or, like most of the others, selected IPSO as their regulator. Now, clearly it would be unattractive for me to extol the virtues of IPSO, but to its critics I recommend reading the newly published independent external review, written by Sir Bill Jeffrey, former Permanent Secretary at the MoD. I think readers would generally be reassured by the report.

Section 40 of the Crime and Courts Act was intended as a stick—or was it a carrot—to drive newspapers into the arms of the approved regulator. Even when I had nothing to do with press regulation, I did not like that provision, which has hovered over the newspaper industry like the sword of Damocles. It has never been brought into effect, and I welcome the fact that the Government now intend to repeal Section 40 via the media Bill—although I accept, as the noble Lord, Lord Black, said, that there may be a debate about the proper scope of regulation, and indeed of Section 40, when that comes before Parliament.

As I understand these amendments, regulation of the largest websites would prospectively be the subject of the Online Safety Bill’s regulatory regime. I echo comments already made that this extraordinarily significant Bill is not primarily directed at press regulation at all. It is intended by these amendments that for newspapers to qualify for the recognised news publisher status, they would have to be a member of an approved regulator. This is plainly an attempt to dismantle the current system of press regulation.

It seems something of an irony that newspapers that are regulated by IPSO or even self-regulated have accountability, however imperfect, whereas, pending the passing of the Bill, internet platforms are wholly unregulated—yet it is sought to pass off some of the regulation of newspapers to Ofcom. Is Ofcom ready, willing or even equipped to replicate the complaints system that currently obtains? I think Ofcom would have quite enough to do. Is its horizon-scanning model even appropriate for press complaints? It is very early days to increase the scope of Ofcom’s role. The Government have promised a review of the regulatory framework in two or three years; I suppose then it might be possible to assess whether Ofcom’s role should change or be enlarged. Until then, it seems inappropriate to do so.

I suggest that the current system of press regulation should not be the subject of further statutory provision at this juncture, or indeed at all. There have been some deplorable press practices in the past, but the traditional printed press in this country, albeit a much-reduced animal with diminished circulation and advertising revenues, nevertheless has some real strengths. A free, vigorous and challenging press is part of a functioning democracy. We should be very wary of giving a Government, of whatever colour and by whatever means, greater power to control it.

Baroness Grey-Thompson Portrait Baroness Grey-Thompson (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I speak in favour of Amendments 124, 126 and 227 to which my name is attached. I will reserve my comments mostly to the Bill’s loophole on newspaper comment sections.

These forums would qualify as social media platforms under the Bill’s definition were it not for a special exemption in Clause 49. They have been found to host some of the most appalling and despicable content online. I will paraphrase some examples so as not to subject the Committee to the specific language used, but they include anti-Semitic slurs in comments appearing under articles covering a violent attack on a synagogue; Holocaust denial; and speculation that Covid was created and spread by a secretive global cabal of powerful individuals who control the world’s leaders like puppets.

Some of the worst abuse is reserved for women in public life, which I and others in your Lordships’ House have personally experienced. In an article about a female leader, comments included that she should be struck down or executed by the SAS. Others commented graphically on her appearance and made disturbing sexual remarks. Another woman, Professor Fowler—who the noble Lord, Lord Clement-Jones, has already discussed—was described as having a sick mind and a mental disorder; one comment implied that a noose should be prepared for her. There are many more examples.

Comment sections are in too many cases badly regulated and dangerous places for members of the public. The exemption for them is unwarranted. Specifically, it protects any social media platform where users make comments in response to what the Bill describes as “provider content”. In this case, that means comments posted in response to articles published by the newspaper. This is materially no different from user exchanges of any other kind and should be covered just the same.

The Government have previously argued that there should be a distinction between newspaper comment sections and other platforms, in that other platforms allow for virality because posts that are liked and retweeted do better than the others. But this is exactly the same for many modern comment sections. Lots of these include functionality to upvote certain comments, which can then rise to the top of the comment section on that article.

There are estimated to be around 15 million people on Twitter in the UK—I am one of them—but more than twice that number read newspaper websites every month. These comment sections are social media platforms with the same power, reach and capacity to cause harm as the US giants. We should not treat them any differently on account of the fact that they are based out of Fleet Street rather than Silicon Valley.

There are some concerns that the Bill’s requirements would put an undue burden on small organisations running comment sections, so this amendment would apply only to organisations with an annual turnover in excess of £100 million. This would ensure that only the largest titles, which can surely afford it, are required to regulate their comment sections. Amendment 124 would close the comment section loophole, and I urge the Government to act on it.

It is a great shame that, due to the lateness of the hour, my noble friend Lady Hollins is unable to be here. She would strongly support Amendment 126 on several points but specifically wanted to talk about how the exemption creates double standards between how the public and news publishers are treated, and puts platforms and Ofcom in an impossible situation over whether newspapers meet vague criteria to access exemptions.

I also support Amendments 126 and 227, which would help protect the public from extremist and other dangerous websites by preventing them accessing the separate media exemption. In all these matters, we must not let overbroad exemptions and loopholes undermine what good work this Bill could do.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, while considering this group of amendments, a comment by Index on Censorship came to mind. Critical of aspects of the Bill, it worried out loud about whether this legislation

“will reverse the famous maxim ‘publish and be damned’, to become, ‘consider the consequences of all speech, or be damned’”.

In that context, I am very grateful—relieved at least— that the freedom of the press is given due regard and protections in the Bill. Freedom of the press is one practical form in which freedom of expression exists and is invaluable in a democracy. It is so crucial that it has been at the centre of democratic struggles in this very Parliament for more than five centuries—ever since the first printing press meant that the masses could gain access to the written word. It fuelled the pamphleteers of the English Civil War. It made a hero of MP John Wilkes in the 18th century, his victory giving the press freedom to report on the goings-on of the great and the good, to muckrake and to dig the dirt; long may that continue.

So I welcome that news publishers’ content on their own websites is not in scope of the legislation; that if platforms take down or restrict access to trusted news sources, they will face significant sanctions; that platforms must notify news publishers if they want to take down their content and, if the publisher disputes that, the platform must not remove it until the dispute is resolved; and that Ofcom must also review the efficacy of how well the platforms are protecting news.

I say “Hurrah!” to all that. If only the Bill treated all content with such a liberal and proportionate approach, I would not be standing up and speaking quite so much. But on the press specifically, I strongly oppose Amendments 124 and 126—as well as Amendment 127, now that it has been explained and I understand it; I did not quite before. Amendment 124 would mean that the comment section of the largest newspaper websites were subject to the regulation in the Bill.

It is important to note—as has been explained—that user comments are already regulated by IPSO, the Independent Press Standards Organisation, and that individual publishers have strong content moderation systems and policies, and the editor is ultimately liable for comments. That is the key issue here. This is about protecting editorial independence from state interference. Amendment 124 does the opposite. That amendment would also restrict the ability of UK citizens to discuss and engage with publishers’ content.

It is part of a lively and vital public square to be free to debate and discuss articles in newspapers. We have heard some pretty graphic and grim descriptions from the noble Baroness, Lady Grey-Thompson, and the noble Lord, Lord Clement-Jones, about those comments; but for me, ironically, the comment section in newspapers is a form of accountability of the press to readers and the audience. Although the descriptions were grim, much of that section is intelligent, well-informed and interesting feedback. I will talk a little about hate afterwards.

22:00
What is more, one likely outcome of this amendment is that newspapers could shut down their comments sections. The cost of investing in proactive scanning or child safety technology would be prohibitively expensive, and I think that would be a great loss. Whenever a newspaper article is published and, maybe because it is controversial, the newspaper decides not to have a comments section, all over social media people say, “That’s not fair, I wanted to say something on it”, and they just comment on social media.
I am especially opposed to Amendment 126, which would mean that, to qualify for recognised news-publisher status, news publishers would have to be members of an approved regulator. We have to be clear what is meant by approved here: it means state approved. It would be the return of state licensing of the press and wipe out all those hard-won gains dating back to Milton’s Areopagitica and John Lilburne and the Levellers’ sacrifices for press freedom. I just do not want to throw those away; it would leave publishers in an impossible position of choosing between submitting to state-backed regulation or leaving their media content open to censure or censorship by tech giants, or Ofcom.
I think it is an attempt at coercing or bullying these papers into what is a Hacked Off-inspired, Leveson-style regulation system by the back door that has been rejected by the vast majority of the print media, as has been explained. It would remove vital protections for press freedom built into the Bill, allow anyone who refused to be blackmailed into state licensing and statutory content regulation to be thrown under the bus, and effectively greenlight Silicon Valley censorship of UK journalism.
That said, while it is important to give due respect to press freedom under a specific category, just as we do with academic freedom, I am still a little squeamish about the special-favours approach towards the mainstream media, as it is described, and legacy media. Perhaps that is the one thing on which I agree with the noble Lord, Lord Clement-Jones. Privileging comments sections of newspapers, while offering no parallel protections for some members of the public to comment on social media on the internet itself, is a problem. While we are focusing in this Bill only and largely on the negatives of the internet, we should remember that it has been hugely democratising, removing official gatekeeping and allowing ordinary people to publish and amplify their voices, which were often silenced or ignored in the past. Yet now they are peculiarly subject to censorious measures and, what is more, their reputations are traduced.
I thought it was interesting, listening to the discussion about hate in response to the way the noble Baroness, Lady Grey-Thompson, and the noble Lord, Lord Clement-Jones, discussed the comments section in newspapers—which I am not naive enough to imagine are not full of some horrors, as they described—that there is a danger that we have an impression of the British public as a hate-fuelled mob who, as soon as you let them speak, spew out anti-Semitism, misogyny and all the rest of it. As I constantly try to say throughout this Bill, the whole notion of hate is at least subjective and often quite complicated.
One example that happened just a couple of days ago was when an organisation called Stop Funding Hate, in response to an article about a female sportswoman who legitimately raised concerns about the disputes on sex and gender in sport, and who believes that women’s sport is in danger and on the line, described it as “bigotry” and “hate”. That led to a great deal of abuse of the female athlete. Stop Funding Hate then led a campaign to get a corporate boycott of advertising from the Telegraph, on the basis that the article was hate-fuelled—whereas I think the censorious boycott was hate-fuelled.
Therefore, using big business money, in this instance, as a weapon to dictate editorial content shows that press freedom is on the line in a variety of ways. That women arguing to protect single-sex sport, and then being subjected to vile misogyny, are themselves described as using transphobic hate speech shows me, at least, that in the name of fighting hate we should not allow any attempts to assault press freedom. I will oppose all three of these amendments.
Lord Allan of Hallam (LD)

My Lords, I support Amendment 227 in particular. I am pleased to contribute, as someone who gave evidence to the Leveson inquiry, explaining why social media should not be in scope for any new press regulation scheme. It is entertaining for me now to come through the looking glass and listen to the noble Lords, Lord Black of Brentwood and Lord Faulks, in particular making the kinds of argument I made then, as we discuss whether the press should be in scope for a new social media regulatory scheme.

These amendments are a helpful way to test how the Government expect their decision to afford certain privileges for online activity by journalists and news publishers to work. That is what the regime does, in effect: on the rationale that was explained to us, certain bodies are privileged when using user-to-user services and search engines in ways that they would not be if they were not afforded that status. Again, it is noteworthy that there has often been criticism of social media precisely for giving special treatment to some users, including in stories in some of the press that we are talking about, and here we are creating not just a state-sanctioned but a state-ordered two-tier system that all the social media companies will now have to adopt. That creates some interesting questions in itself.

I want to press the Minister primarily on definitions. It is certainly my experience that definitions of who is a journalist or a news media publisher are challenging and can be highly political. There have been several pressure points, pushing social media companies to try to define journalists and news publishers for themselves, outside of any regulatory scheme—notably following the disputes about misinformation and disinformation in the United States. The European Union also has a code of practice on misinformation and disinformation. Every time someone approaches this subject, they ask social media companies to try to distinguish journalists and news media from other publishers. So these efforts have been going on for some time, and many of them have run into disputes because there is no consistent agreement about who should be in or outside those regimes. This is one of those problems that seems clear and obvious when you stand back from it, but the more that you zoom in, the more complex and messy it becomes. We all say, “Oh yes, journalists and news publishers—that is fine”, and we write that in the legislation, but, in practice, it will be really hard when people have to make decisions about individuals.

Some news organisations are certainly highly problematic. Most terrorist organisations have news outlets and news agencies. They do not advertise themselves as such but, if you work in a social media platform, you have to learn to distinguish them. They are often presented entirely legitimately, and some of the information that you use to understand why they are problematic may be private, which creates all sorts of problems. Arguably, this is the Russia Today situation: it presented itself as legitimate and was registered with Ofcom for a period of time; we accepted that it was a legitimate news publisher, but we changed our view because we regard the Russian Government as a terrorist regime, in some senses. That is happening all of the time, with all sorts of bodies across the world that have created these news organisations. In the Middle East in particular, you have to be extraordinarily careful—you think that something is a news organisation but you then find that it has a Hezbollah connection and, there you go, you have to try to get rid of it. News organisations tied to extremist organisations are one problematic area, and my noble friend has referred to it already.

There is also an issue with our domestic media environment. Certainly, most people would regard Gary Lineker as a journalist who works for a recognised news publisher—the BBC—but not everyone will agree with that definition. Equally, most people regard the gentleman who calls himself Tommy Robinson as not being a journalist; however much he protests that he is in front of judges and others, and however much support he has from recognised news publishers in the United States, most people would say that he is not a journalist. The community of people who agree that Gary Lineker is not a journalist and that of people who think that Tommy Robinson is not a journalist do not overlap much, but I make the point that there is continually this contention about individuals, and people have views about who should be in or out of any category that we create.

This is extraordinarily difficult, as in the Bill we are tasking online services with a very hard job. In a few lines of it, we say: “Create these special privileges for these people we call journalists and news publishers”. That is going to be really difficult for them to do in practice and they are going to make mistakes, either exclusionary or inclusionary. We are giving Ofcom an incredibly difficult role, which is why this debate is important, because it is going to have to adjudicate when that journalist or news publisher says to Ofcom: “I think this online platform is breaching the Online Safety Act because of the way it treated me”. Ofcom is going to have to take a view about whether that organisation or individual is legitimate. Given the individuals I named, you can bet your bottom dollar that someone is going to go to Ofcom and say, “I don’t think that Gary Lineker or the BBC are legitimate”. That one should be quite easy; others across the spectrum will be much more difficult for it to deal with.

That is the primary logic underlying Amendment 227: we have known unknowns. There will be unanticipated effects of this legislation and, until it is in place and those decisions are being made, we do not know how it will work. Frankly, we do not know whether, as a result of legal trickery and regulatory decisions, we have inadvertently created a loophole where some people will be able to go and win court cases by claiming protections that we did not intend them to have. I disagree with the noble Lord, Lord Black: I do not think Amendment 227 undermines press freedom in any sense at all. All it does is to say: “We have created an Online Safety Bill. We expect it to enhance people’s safety and within it we have some known unknowns. We do not know how this exemption is going to work. Why not ask Ofcom to see if any of those unintended consequences happen?”

I know that we are labouring our way through the Online Safety Bill version 1, so we do not want to think about an online safety Bill version 2, but there will at some point have to be a revision. It is entirely rational and sensible that, having put this meaningful exemption in there—it has been defended, so I am sure that the Government will not want to give it up—we should at least take a long, hard look, without interfering with press freedom, and get Ofcom to ask, “Did we see those unintended consequences? Do we need to look at the definitions again?”

Baroness Stowell of Beeston (Con)

My Lords, the noble Lord, Lord Allan, has clearly and comprehensively painted a picture of the complex world in which we now live, and I do not think that anybody can disagree with that or deny it. We are in a world which is going to keep evolving; we have talked in lots of other contexts about the pace of change, and so on. However, in recognising all that, what the noble Lord has just described—the need for constant evaluation of whether this regime is working effectively—is a job for Parliament, not for Ofcom. That is where I come back to in starting my response to this group of amendments.

Briefly—in order that we can get to the wind-ups and conclude business for the day—ensuring that recognised news publishers and organisations are not subject to Ofcom or any form of state regulation is a vital principle. I am pleased that the Government have included the safeguards which they have in the legislation, while also making it much harder for the tech platforms to restrict the freedom of recognised news publishers and users’ access to them.

I reiterate that I understand that this is becoming increasingly complicated, but these are important principles. We have to start in the world that we currently understand and know, ensure that we protect those publications which we recognise as trusted news providers now, and do not give way on those principles. As my noble friend Lord Black said, regarding debates about Section 40 of the Crime and Courts Act, there will be an opportunity to re-evaluate that in due course when we come to the media Bill. For what it is worth, my personal view is that I support the Government’s intention to remove it.

22:15
The only other thing I would add concerns the accountability of news publishers, which I, too, think is important. It is an important element of their proper editorial oversight that they control and oversee online comments sections and that they are subject to the same sort of self-regulation and regulation by IPSO as has been described. However, it is also important to say to these news publishers, and to any organisations that rely on the support and continued use and subscription of their users, that they ought to have in place good-quality customer service regimes. If people want to raise complaints and concerns, not necessarily just about content but about the way in which their subscriptions are managed, their inability to cancel them or indeed a demand for a more flexible approach to them, I would like to see much better accountability in the way that these organisations look after and serve their readers. The future of news publishers ultimately relies on their meeting the expectations of their readers and giving voice to their readers’ perspectives; for as long as they do that, I think they should enjoy the freedom to operate without statutory control, and therefore I do not support the amendments in this group.
Baroness Gohir (CB)

My Lords, I support Amendments 124, 126 and 227. I thank the noble Lords, Lord Lipsey and Lord McNally, for proposing these amendments and I wish them well.

A number of far-right websites already exist across the internet which are capable, with minimal reform, of meeting the requirements to qualify as recognised news publishers and benefit from the exemption. Some of these websites host content from known high-profile racists. These extreme websites feature anti-Semitism, hatred of women and hatred of Muslims. The Centre for Media Monitoring, part of the Muslim Council of Britain, has criticised the Bill’s media exemption. It is not only inevitable but very dangerous that far-right and anti-Muslim websites will argue that they constitute news publishers. As news publishers, they would have the freedom to propagate fake news, disinformation and conspiracy theories about Islam and Muslims.

The thought that UK-based racist outlets would be able to access this exemption is horrific enough, but there is also a risk that extremist news websites currently based in the USA and elsewhere around the world will seek to relocate to Britain to benefit from the exemption in future. This is because, while the exemption does not require publishers to abide by any specific set of standards, it does require publishers to have a UK office. Perversely, this creates an incentive for an extremist website based in the US, for example, from where many of the most popular racially hateful websites internationally currently operate, to establish an office here in the UK. In doing so, it may then be able to post content under the terms of the exemption. Indeed, the exemption risks paving the way for a catastrophic scenario in which the UK becomes less safe. It is critical that the Government listen and engage with these concerns.

Amendment 124 seeks to ensure that newspaper comment sections are properly regulated. Anyone can be a target of hatred in a newspaper comment section, but such sections are most likely to contain Islamophobic, anti-Semitic, racist and misogynistic content. Without the amendment, the Bill’s provisions on the media will endanger those it is intended to protect. These amendments propose a compromise which is the right approach and will ensure that people are protected from abuse while also retaining the media exemption for responsible newspaper publishers. I hope the Government will engage more on these matters and work towards a solution.

Lord Knight of Weymouth (Lab)

My Lords, I regret that my noble friend Lord Lipsey is unable to be here. I wish him and the noble Lord, Lord McNally, well. I also regret that my noble friend Lord Stevenson is not here to wind up this debate and introduce his Amendment 127. Our inability to future-proof these proceedings means that, rather than talking to the next group, I am talking to this one.

I want to make four principal points. First, the principle of press freedom, as discussed by the noble Lords, Lord Black and Lord Faulks, in particular, is an important one. We do not think that this is the right Bill to reopen those issues. We look forward to the media Bill as the opportunity to discuss these things more fully across the House.

Secondly, I have some concerns about the news publisher exemption. In essence, as the noble Lord, Lord Clement-Jones, set out, as long as you have a standards code, a complaints process, a UK address and a team of contributors, the exemption applies. That feels a bit loose to me, and it opens up the regime to some abuse. I hear what the noble Baronesses, Lady Gohir and Lady Grey-Thompson, said about how we already see pretty dodgy outfits allowing racist and abusive content to proliferate. I look forward to the Minister’s comments on whether the bar we have at the moment is too low and whether there is some reflection to be done on that.

The third point is on my noble friend Lord Stevenson’s Amendment 127, which essentially says that we should set a threshold around whether complaints are dealt with in a timely manner. In laying that amendment, my noble friend essentially wanted to probe. The noble Lord, Lord Faulks, is here, so this is a good chance to have him listen to me say that we think that complaints should be dealt with more swiftly and that the organisation that he chairs could do better at dealing with that.

My fourth comment is about comments, particularly after listening to the speech of the noble Baroness, Lady Grey-Thompson, about some of the hateful comment that is hidden away inside the comments that news publishers carry. I was very much struck by what she said in respect of some of the systems of virality that are now being adopted by those platforms. There, I think Amendment 227 is tempting. I heard what the noble Baroness, Lady Stowell, said, and I think I agree that this is better addressed by Parliament.

For me, that just reinforces the need for this Bill, more than any other that I have ever worked on in this place, to have post-legislative scrutiny by Parliament so that we, as a Parliament, can review whether the regime we are setting up is running appropriately. It is such a novel regime, in particular around regulating algorithms and artificial intelligence. It would be an opportunity to see whether, in this case, the systems of virality were creating an amplification of harm away from the editorial function that the news publishers are able to exercise over the comments.

On that basis, and given the hour, I am happy to listen with care to the wise words of the Minister.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I join noble Lords who have sent their best wishes to the noble Lords, Lord Lipsey and Lord McNally.

His Majesty’s Government are committed to defending the invaluable role of a free media. We are clear that our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information.

We have included strong protections for news publishers’ and journalistic content in the Bill, which extend to the exemption from the Bill’s safety duties for users’ comments and reviews on news publishers’ sites. This reflects a wider exemption for comments and reviews on provider content more generally. For example, reviews of products on retailers’ sites are also exempt from regulation. This is designed to avoid a disproportionate regulatory burden on low-risk services.

Amendment 124 intends to modify that exemption, so that the largest news websites no longer benefit and are subject to the Bill’s regulatory regime. Below-the-line comments are crucial for enabling reader engagement with the news and encouraging public debate, as well as for the sustainability—and, as the noble Baroness, Lady Fox, put it, the accountability—of the news media. We do not consider it proportionate, necessary or compatible with our commitment to press freedom to subject these comment sections to oversight by Ofcom.

We recognise that there can sometimes be unpleasant or abusive below-the-line comments. We have carefully considered the risks of this exemption against the need to protect freedom of speech and media freedoms on matters of public interest. Although comment functions will not be subject to online regulation, I reassure the Members of the Committee who raised concerns about some of the comments which have attracted particular attention that sites hosting such comments can, in some circumstances, be held liable for any illegal content appearing on them, where they have actual knowledge of the content in question and fail to remove it expeditiously.

The strong protections for recognised news publishers in the Bill include exempting their content from the Bill’s safety duties, requiring category 1 platforms to notify recognised news publishers and to offer a right of appeal before removing or moderating any of their content. Clause 50 stipulates the clear criteria that publishers will have to meet to be considered a “recognised news publisher” and to benefit from those protections. When drafting these criteria, the Government have been careful to ensure that established news publishers are captured, while limiting the opportunity for bad actors to qualify.

Amendment 126 seeks to restrict the criteria for recognised news publishers in the Bill, so that only members of an approved regulator within the meaning of Section 42 of the Crime and Courts Act 2013 benefit from the protections offered by the Bill. This would create strong incentives for publishers to join specific press regulators. We do not consider this to be compatible with our commitment to a free press. We will repeal existing legislation that could have that effect—specifically, Section 40 of the Crime and Courts Act 2013—through the media Bill which, as noble Lords have noted, has recently been published. Without wanting to make a rod for my own back when we come to that Bill, I agree with my noble friend Lord Black of Brentwood that it would be the opportunity to have this debate, if your Lordships so wished.

The current effect of this amendment would be to force all news publishers to join a single press regulator—namely Impress, the only UK regulator which has sought approval by the Press Recognition Panel—if they were to benefit from the exclusion for recognised news publishers. Requiring a publisher to join specific regulators is, in the view of His Majesty’s Government, not only incompatible with protecting press freedom in the UK but unnecessary given the range of detailed criteria which a publisher must meet to qualify for the additional protections, as set out in Clause 50 of the Bill.

As part of our commitment to media freedom, we are committed to independent self-regulation of the press. As I have indicated, Clause 50 stipulates the clear criteria which publishers will have to meet to be considered a “recognised news publisher” and to benefit from the protections in the Bill. One of those criteria is for entities to have policies and procedures for handling and resolving complaints. Amendment 127 from the noble Lord, Lord Stevenson, adds a requirement that these policies and procedures must cover handling and resolving complaints “in a timely manner”. To include such a requirement would place the responsibility on Ofcom to decide what constitutes “timely” and, in effect, put it in the position of a press regulator. That is not something that we would like. We believe that the criteria set out in Clause 50 are already strong, and we have taken significant care to ensure that established news publishers are captured, while limiting the opportunity for bad actors to benefit.

I turn now to Amendment 227. We recognise that, as legislation comes into force, it will be necessary to ensure that the protections we have put in place for journalistic and news publisher content are effective. We need to ensure that the regulatory framework does not hinder access to such content, particularly in the light of the fact that, in the past, news content has sometimes been removed or made less visible by social media moderators or algorithms for unclear reasons, often at the height of news cycles. That is why we have required Ofcom to produce a specific report, under Clause 144, assessing the impact of the Bill on the availability and treatment of news publisher and journalistic content on category 1 services.

22:30
Amendment 227 would require Ofcom’s report also to assess the impact that the recognised news publisher exemption and journalistic content duties have on the efficacy of the new online safety regulatory framework, and
“the securing of public safety from online harms”.
The Bill currently requires the Secretary of State to review the regulatory framework established by the Bill at least two years after it comes into force, as set out in Clause 159. This review will encompass the elements that these amendments seek to assess, because it requires an evaluation of how effective the new regulatory framework is at minimising the risk of harm to people in the United Kingdom.
The Secretary of State must consult Ofcom in producing this report, as well as any other persons she considers appropriate. Any concerns about the recognised news publisher and journalistic content exemptions can be brought to the Secretary of State’s attention in the course of this review. Requiring Ofcom also to assess these factors in the production of its report under Clause 144 would therefore be duplicative. On that basis, I hope the noble Lord will be willing to—
Lord Knight of Weymouth (Lab)

Before the Minister closes his folder and sits down, perhaps I could say that I listened carefully and would just like him to reflect a little more for us on my question of whether the bar is set too low and there is too much wriggle room in the exemption around news publishers. A tighter definition might be something that would benefit and improve the Bill when we come back to it on Report.

Lord Parkinson of Whitley Bay (Con)

Looking at the length of Clause 50—and I note that the noble Lord, Lord Allan of Hallam, made much the same point in his speech—I think the definitions set out in Clause 50 are extensive. Clause 50(1) sets out a number of recognised news publishers, obviously including

“the British Broadcasting Corporation, Sianel Pedwar Cymru”—

self-evidently, as well as

“the holder of a licence under the Broadcasting Act 1990 or 1996”

or

“any other entity which … meets all of the conditions in subsection (2), and … is not an excluded entity”

as set out in subsection (3). Subsection (2) sets out a number of specific criteria which I think capture the recognised news publishers we want to see.

Noble Lords will be aware of the further provisions we have brought forward to make sure that entities that are subject to a sanction are not able to qualify, such as—

Lord Allan of Hallam (LD)

I think it is actually quite important that there is—to use the language of the Bill—a risk assessment around the notion that people might game it. I thought the noble Baroness, Lady Gohir, made a very good point. People are very inventive and, if you have ever engaged with the people who run some of those big US misinformation sites—let us just call them that—you will know that they have very inventive, very clever people. They will be looking at this legislation and if they figure out that by opening a UK office and ticking all the boxes they will now get some sorts of privileges in terms of distributing their misinformation around the world, they will do it. They will try it, so I certainly think it is worth there being at least some kind of risk assessment against that happening.

In two years’ time we will be able to see whether the bad thing has happened but, even if it is just a matter of the Minister having a conversation with Ofcom now, forewarned is forearmed. We know that this is a possibility, and it would be helpful for some work to be done now to make sure that we do not leave open a loophole that none of us wants.

Lord Parkinson of Whitley Bay (Con)

I am mindful of the examples the noble Lord gave in his speech. Looking at some of the provisions set out in subsection (2) about a body being

“subject to a standards code”

or having

“policies and procedures for handling and resolving complaints”,

I think, as a first response, that those examples he gave would be covered. But I will certainly take on board the comments that he and the noble Baroness, Lady Gohir, have made, and reflect on them. I hope—

Baroness Fox of Buckley (Non-Afl)

On a final point of clarification, in contrast, I think the exemption may be too narrow, not too broad. With the emergence of blogs and different kinds of news organisations—I think the noble Lord, Lord Allan, described well the complexity of what we have—and some of the grimmer, grosser examples of people who might play the system, does the Minister acknowledge that that might be dealt with by the kind of exclusions that have been used for RT? When somebody is really an extremist representative of, I do not know, ISIS, pretending to be a media organisation, the sensible thing to do would be to exclude them, rather than to overtighten the exemption, so that new, burgeoning, widely read online publications can have press freedom protection.

Lord Parkinson of Whitley Bay (Con)

I will certainly take on board the points the noble Baroness raises. Hearing representations in both directions on the point would, on first consideration, reassure me that we have it right, but I will certainly take on board the points which the noble Baroness, the noble Lord and others have raised in our debate on this. As the noble Lord, Lord Allan, suggests, I will take the opportunity to discuss it with Ofcom, as we will do on many of the issues which we are discussing in this Committee, to make sure that its views are taken on board before we return to these and other issues on Report.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for his responses to a number of different issues that he was asked about after he had sat down, so to speak, which I think have been taken on board. I thank the noble Baronesses, Lady Gohir and Lady Grey-Thompson, and the noble Lord, Lord Allan, for their support for these amendments.

It is also a pleasure to see the noble Lord, Lord Black, who has clearly been lured out at this late hour—perhaps not so unwillingly; it gives him a chance to rehearse some of the arguments for the media Bill coming down the track. Along with the noble Lord, Lord Faulks, I am sure he will enjoy the media Bill when it comes. I had many happy hours sitting next door to the noble Lord, Lord Black, on the Joint Committee on the draft Bill, on which I may say that we agreed on most things.

But the fact is that one person’s strong exemptions are another person’s special privileges, and that very much applies in these circumstances as regards what I think the noble Baroness, Lady Fox, would call the mainstream media. I enjoyed what the noble Baroness had to say, because she came round to some kind of agreement at the end on what we might call mainstream media exceptionalism, which is a fair description. There is an element of cakeism about the way that the mainstream media seem to want to have it.

The Minister talked about low-risk services, and I think that was the reason why we had these questions asked about risk assessment. How does the Minister know that these below-the-line comments sections are low-risk unless a risk assessment has been made? We heard from the noble Baronesses, Lady Grey-Thompson and Lady Gohir, about the content of some of those comments sections. That does not sound low-risk to me at all, and I think that is the basis of their support for the amendments. When the Minister says it is not proportionate to regulate these comments sections, he is assuming they are low-risk services without much evidence. He threw a small bone by saying that sites can be held liable for illegal content, but that is a relatively small bone in those circumstances.

I took some comfort from the way that the noble Baroness, Lady Stowell, talked about the need for constant evaluation. Perish the thought, but we may well need an online safety Bill number 2; I just hope it is not too soon and that we have time for a little evaluation of how this Bill operates. That is why I am pretty keen that we should get this Bill into the best possible shape.

I thought what the noble Lord, Lord Knight, had to say about post-legislative scrutiny was very apposite, but the hour is late so I will not go through too many other aspects of this. This has been a good debate. I hope the media Bill is not déjà vu all over again, but we will see what happens when we get to it. In the meantime, I beg leave to withdraw the amendment.

Amendment 124 withdrawn.
House resumed.
Committee (9th Day)
11:52
Relevant document: 28th Report from the Delegated Powers Committee
Clause 49: “Regulated user-generated content”, “user-generated content”, “news publisher content”
Amendment 125
Moved by
125: Clause 49, page 47, line 22, at end insert—
“(c) machine-generated content is to be regarded as user-generated content of a service if—(i) the creation or use of the machine-generated content involves interacting with user-generated content,(ii) it takes the form or identity of a user,(iii) it provides content that constitutes illegal, primary priority content or priority content, or would constitute it if created in another format, or(iv) a user has in any way facilitated any element of the generation by way of a command, prompt, or any other instruction, however minimal.”Member’s explanatory statement
This amendment would add machine-generated content to regulated content in the Bill, give meaning to how it could be regarded as ‘user-generated content’ of the service, and allow virtual and augmented reality material to be treated on an equal basis with material in other formats.
Baroness Harding of Winscombe (Con)

My Lords, I rise to introduce this group. On Tuesday in Committee, I said that having reached day 8 of the Committee we had all found our roles; now, I find myself in a different role. The noble Baroness, Lady Kidron, is taking an extremely well-earned holiday and so is unable to be in the House today. She has asked me to introduce this group and specifically to speak to Amendment 125 in her name.

I strongly support all the amendments in the group, particularly those that would result in a review, but will limit my words to Amendment 125. I also thank the other co-signatories, the noble Baroness, Lady Finlay, who is in her place, and my noble friend Lord Sarfraz, who made such a compelling speech at Second Reading on the need for the Bill to consider emerging technologies but who is also, sadly, abroad on government business.

I start with something said by Lord Puttnam, and I paraphrase: that we were forbidden from incorporating the word “digital” throughout the whole process of scrutiny of the communications Act in 2002. As a number of us observed at the time, he said, it was a terrible mistake not to address or anticipate these issues when it was obvious that we would have to return to it all at some later date. The Online Safety Bill is just such a moment: “Don’t close your eyes and hope”, he said, “but look to the future and make sure that it is represented in the Bill”.

With that in mind, this amendment is very modest. I will be listening carefully, as I am sure the noble Baroness, Lady Kidron, will from a distance, to my noble friend the Minister because if each aspect of this amendment is already covered in the Bill, as I suspect he will want to say, then I would be grateful if he could categorically explain how that is the case at the Dispatch Box, in sufficient detail that a future court of law can clearly understand it. If he cannot state that then I will be asking the House, as I am sure the noble Baroness, Lady Kidron, would, to support the amendment’s inclusion in the Bill.

There are two important supporters of this amendment. If the Committee will forgive me, I want to talk briefly about each of them because of the depth of understanding of the issues they have. The first is an enforcement officer whom I shall not name, but I and the noble Baroness, Lady Kidron, want to thank him and his team for the extraordinary work that they do, searching out child sexual abuse in the metaverse. The second, whom I will come to in a little bit, is Dr Geoff Hinton, a pioneer of neural networks who is most often referred to as “the godfather of AI”, whom the noble Baroness, Lady Kidron, met last week. Both are firm supporters of this amendment.

The amendment is part of a grouping labelled future-proofing but, sadly, this is not in the future. It is with us now. Child sexual abuse in the metaverse is growing phenomenally. Two months ago, at the behest of the Institution of Engineering and Technology, the noble Baroness, Lady Kidron, hosted a small event at which members of a specialist police unit explained to colleagues from both Houses that what they were finding online was amongst the worst imaginable, but was not adequately caught by existing laws. I should just warn those listening to or reading this—I am looking up at the Public Gallery, where I see a number of young people listening to us—that I am about to briefly recount some really horrific stuff from what we saw and heard.

The quality of AI imagery is now at the point where a realistic AI image of a child can be produced. Users are able to produce or order indecent AI images, based on a child known to them. Simply by uploading a picture of a next door neighbour’s child or a family member, or taking a child’s image from social media and putting that face on existing abuse images, they can create a body for that picture or, increasingly, make it 3D and take it into an abuse room. The type of imagery produced can vary from suggestive or naked to penetrative sex; for the most part, I do not think I should be repeating in this Chamber the scenarios that play out.

VR child avatars can be provided with a variety of bespoke abuse scenarios, which the user can then interact with. Tailor-made VR experiences are being advertised for production on demand. They can be made to meet specific fetishes or to feature a specific profile of a child. The production of these VR abuse images is a commercial venture. Among the many chilling facts we learned was that the Oculus Meta Quest 2, which is the best-selling VR headset in the UK, links up to an app that is downloaded on to the user’s mobile phone. Within that app, the user can search for other users to follow and engage with—either through the VR headset or via instant messaging in their mobile app. A brief search through the publicly viewable user profiles on this app shows a huge number of profiles with usernames indicative of a sexual interest in children.

Six weeks after the event, the noble Baroness, Lady Kidron, spoke to the same officer. He said that already the technology was a generation on—in just six weeks. The officer made a terrible and terrifying prediction: he said that in a matter of months this violent imagery, based on and indistinguishable from an actual known child, will evolve to include moving 3D imagery and that at that point, the worlds of VR and AI will meet and herald a whole new phase in offending. I will quote this enforcement officer. He said:

“I hate to think where we will be in six months from now”.


While this group is labelled as future-proofing the Bill, I remind noble Lords that in six months’ time, the provisions of the Bill will not have been implemented. So this is not about the future; it is actually about the now.

12:00
Even though what I am describing is abhorrent, to some it may appear to be a victimless crime or a thought crime that might take the place of real crimes, since it could be argued that nobody gets hurt. There are three points to say against that. First, evidence shows that rehearsing child-abuse fantasies online radically accelerates the offender pathway—the length of time between looking at images and abusing a child. Secondly, the relative anonymity of the online world has enabled and supercharged the spread of such content, and risks normalising its production and consumption. Thirdly, the current advances in AI allow perpetrators to create and share thousands of images of a child in a matter of minutes. That leaves the police overwhelmed with the impossible task of distinguishing between the AI-created children and the real children who are being abused. Under the sheer volume of abuse imagery, real children can remain undiscovered and therefore unreached. This is a perverse and chilling game of whack-a-mole.
A small band of enforcement officers are crying out for our help because they are concerned that existing law does not reach this material and that blurring the role of machine and master risks undermining their ability to enforce the law. While Sections 62 to 69 and Schedule 13 of the Coroners and Justice Act 2009 go some way towards bringing certain computer-generated images into the scope of the law, much of the sexual offences law fails to reach the online world. As a result, the enforcement community is struggling to deal with the new generation of automated and semi-automated systems that create not only abuse images but abusive scenarios at the touch of a button. As the police officer explained to us, the biggest change required is the provision of specific offences covering virtual abuse in the VR social environment, to protect children in those areas against the psychological impact of virtual abuse.
This amendment makes a small change to the definition of “content”, to make clear that machine-generated content is to be regarded as user-generated content of a service, under the following circumstances: first, if the creation or use of the content interacts with user-generated content; secondly, if it takes the form or identity of a user; thirdly, if it provides content that would reach the bar of illegal content, primary priority content or priority content in another format; and finally, if a user has in any way facilitated any element of the generation by way of a command, prompt or any other instruction, however minimal. This would go a long way towards supporting the police in their unenviable task.
When my noble friend the Minister responds, I would ask that he confirms that the scope of the Bill—user-to-user services and search—does not fetter law enforcement. We discussed services of limited functionality being out of scope earlier in Committee, when discussing Amendment 2. For example, would a person or an automated process creating this material at scale, with no user-to-user functionality, be out of scope? The concern must be that existing laws covering child sexual abuse do not address the current state of technology, and this Bill may be drawn too narrowly to catch the abuse that is happening at ever-increasing scale.
Finally, this brings me to Dr Geoff Hinton. After a decade at Google, he retired and has chosen to speak freely about his profound worries concerning the future of AI, joining the chorus of those on the front line who are demanding that we regulate it before it is too late. I am a keen and enthusiastic early adopter of new technology, but we should listen very carefully to his concerns. He says that AI systems can learn and provide a compelling view of the world at such speed and scale that, in the hands of bad actors, they will in the very near future obliterate any version of a common reality. A deluge of fake images, videos and texts will be the data upon which future AI-driven communication will be built, leaving all of us unable to distinguish between fact and fiction. That is a very scary view of the world and we should take his professional concern very seriously, particularly when we focus on this Bill and how we protect our children in this world.
Given the scope of the Bill, we obviously will not be able to address every one of the hopes or fears of AI as it stretches out ahead of us, but it is a huge mistake for the Online Safety Bill to pretend that this future is not already with us. In this amendment and the whole group, we are attempting to put in the Bill the requirements to recognise those future dangers. As Dr Hinton has made clear, it is necessary to treat the fake as if it were real today, because we are no longer certain what is fake and what is real. We do a disservice to our children if we do not recognise that reality today.
I appreciate that I have spoken for far too long on this very small amendment. It closes a loophole so that, if machine-generated material imitates user-to-user behaviour, takes the form of a user, or would in another context meet the bar of illegal content, primary priority content or priority content, it is treated as such under the safety duties of the Bill. That is all it does. This would prevent the police having to stand by as the horrific rise in the use of abuse rooms—which act as a rehearsal for abusing children—continues. It is much needed and an essential first step down this road. I beg to move.
Baroness Finlay of Llandaff (CB)

My Lords, I am very grateful to the noble Baroness, Lady Harding, for the way she introduced this group of amendments. I have added my name to Amendment 125 and have tabled probing Amendments 241 and 301 in an attempt to future-proof the Bill. As the noble Baroness has said, this is not the future but today, tomorrow and forever, going forwards.

I hope that there are no children in the Public Gallery, but from my position I cannot see.

Baroness Finlay of Llandaff (CB)

Then I shall slightly modify some of the things I was going to say.

When this Bill was conceived, the online world was very different from how it is today. It is hard to imagine how it will look in the future. I am very grateful to the noble Baroness, Lady Berridge, and the Dawes Centre for Future Crime at UCL, for information that they have given to me. I am also grateful to my noble friend Lady Kidron, and the enforcement officers who have shared with us images which are so horrific that I wish that I had never seen them—but you cannot unsee what you have seen. I admire how they have kept going and maintained a moral compass in their work.

The metaverse is already disrupting the online world as we know it. By 2024, it is estimated that there will be 1.7 billion mobile augmented-reality user devices worldwide. More than one-fifth of five to 10-year-olds already have a virtual reality headset of their own, or have asked for similar technology as a gift. The AI models are also developing quickly. My Amendment 241 would require Ofcom to be alert to the ways in which emerging technologies allow activities that are illegal in the real world to be carried out online, and to identify where the law is not keeping pace with technological developments.

The metaverse seems to have 10 attributes. It is multiuser and multipurpose; content is user-generated; it is immersive; and spatial interactions occur in virtual reality or in physical environments enhanced by augmented reality. Its digital aspects do not expire when the experience ends, and it is multiplatform and interoperable, as users move between platforms. Avatars are involved, and in the metaverse there is ownership of the avatars or other assets such as virtual property, cryptocurrency et cetera. These attributes allow it to be used to master training scenarios for complex situations, such as surgical training for keyhole surgery, where it can improve accuracy rapidly. On the horizon are brain-computer interfaces, which may be very helpful in rehabilitative adaptation after severe neurological damage.

These developments have great potential. However, dangers arise when virtual and augmented reality devices are linked to such things as wearable haptic suits, which allow the user to feel interactions through physical sensation, and teledildonics, which are electronic devices that simulate sexual interaction.

With the development of deep-fake imagery, it is now possible for an individual to order a VR experience of abusing the image of a child whom they know. The computer-generated images are so realistic that they are almost impossible to distinguish from real images, rather than obviously cartoon-generated ones. An avatar can sexually assault the avatar of a minor, and such an avatar of the minor can be personalised. Worryingly, there have been growing reports of these assaults and rapes happening. Since the intention of VR is to trick the human nervous system into experiencing perceptual and bodily reactions, while such a virtual assault may not involve physical touching, the psychological, neurological and emotional experience can be similar to a physical assault.

This fuels sex addiction and violence addiction, and is altering the offender pathway: once the offender has engaged with VR abuse material, there is no desire to go back to 2D material. Offenders report that they want more: in the case of VR, that would be moving to live abuse, as has been said. The time from the development of abnormal sexual desires to real offending is shortened as the offender seeks ever-increasing and diverse stimulation to achieve the same reward. Through Amendment 125, such content would be regarded as user-generated.

Under Amendment 241, Ofcom could suggest ways in which Parliament may want to update the current law on child pornography to catch such deep-fake imagery, as these problematic behaviours are illegal in the real world but do not appear to be illegal online or in the virtual world.

Difficulties also arise over aspects of terrorism. It is currently a criminal offence to attend a terrorist training ground. Can the Minister confirm that Amendment 136C, which we have debated and which will be moved in a later group, would make attending a virtual training ground illegal? How will Ofcom be placed to identify and close any loopholes?

The Dawes Centre for Future Crime has identified 31 unique crime threats or offences which are risks in the metaverse, particularly relating to child sexual abuse material, child grooming, investment scams, hate crime, harassment and radicalisation.

I hope the Minister can confirm that the Bill already applies to the metaverse, with its definition of user-to-user services and technology-neutral terminology, and that its broad definition of “encountering” includes experiencing content through haptic suits or virtual or augmented reality, via the technology-neutral expression “or other automated tool”. Can the Minister also confirm that the changes made in the other place in Clause 85 require providers of metaverse services to consider the level of risk of the service being used for the commission or facilitation of a priority offence?

The welcome addition to the Bill of a risk assessment duty should, however, be broadened to cover offences beyond priority offences. I ask the Minister: will the list of offences in Schedules 5 to 7 to the Bill be amended to include the option of adding to it other harmful offences such as sexual offences against adults, impersonation scams, and cyber-physical attacks such as cyber burglary, which can lead to planned burglary, attacks on key infrastructure and assault?

The ability to expand the risk assessment criteria could future-proof the Bill against such offences by keeping the list open to other serious offences committed on user-to-user or combined services, rather than closed as it is at the moment. Such duties should apply across all services, not only those in category 1, because the smaller platforms, which are not covered by empowerment duties, may present a particularly high risk of illegal content and harmful behaviours.

Can the Minister therefore please tell us how content that is illegal in the real world will be reported, and how complaints can be made when it is encountered, if it is not a listed priority offence in the Bill? Will the Government expand the scope to cover not only illegal content, as defined in Clauses 207 and 53, but complex activities and interactions that are possible in the metaverse? How will the list of priority offences be expanded? Will the Government amend the Bill to enable Ofcom to take a risk-based approach to identifying who becomes classified as a category 1 provider?

I could go on to list many other ways in which our current laws will struggle to remain relevant against the emerging technologies. The list’s length shows the need for Ofcom to be able to act and report on such areas—and that Parliament must be alive to the need to stay up to date.

Lord Harlech (Con)

My Lords, I am grateful to the noble Baroness, Lady Finlay of Llandaff, for tempering her remarks. On tempering speeches and things like that, I can inform noble Lords that the current school group have been escorted from the Chamber, and no further school groups will enter for the duration of the debate on this group of amendments.

12:15
Baroness Berridge (Con)

My Lords, I rise to support Amendment 241, in the name of the noble Baroness, Lady Finlay, as she mentioned. I also spoke on the Private Member’s Bill, in a similar vein regarding future-proofing, that the noble Baroness previously brought before your Lordships’ House.

The particular issue in Amendment 241 that I wish to address is

“the extent to which new communications and internet technologies allow for behaviours which would be in breach of the law if the equivalent behaviours were committed in the physical world”.

The use of “behaviours” brings into sharp focus the applicability of the Online Safety Bill in the metaverse. Since that Private Member’s Bill, I have learned much about future-proofing from the expert work of the Dawes Centre for Future Crime at UCL. I reached out to the centre as it seemed to me that some conduct and crimes in the physical world would not be criminal if committed in the metaverse.

I will share the example, which seems quite banal, that led me to contact them. The office meeting now takes place in the metaverse. All my colleagues are represented by avatars. My firm has equipped me with the most sophisticated haptic suit. During the meeting, the avatar of one of my colleagues slaps the bum of my avatar. The haptic suit means that I have a physical response to that, to add to the fright and shock. Even without such a suit, I would be shocked and frightened. Physically, I am, of course, working in my own home.

Lord Harlech (Con)

My Lords, I apologise to my noble friend. I ask that we pause the debate to ask this school group to exit the Chamber. We do not think that the subject matter and content will be suitable for that audience. I am very sorry. The House is pausing.

Baroness Finlay of Llandaff (CB)

In this moment while we pause, I congratulate the noble Lord, the Government Whip, for being so vigilant: some of us in the Chamber cannot see the whole Gallery. It is appreciated.

Baroness Berridge (Con)

I, too, thank my noble friend the Government Whip. I also apologise if I have shown any discourtesy to the Committee: I was not sure whose name was on which amendment, so I will continue.

Physically, I am, of course, working in my home. If that behaviour had happened in the office, it would be an offence, an assault: “intentional or reckless application of unlawful force to another person”. It would not be an offence in the metaverse, and it is probably not harassment because it is not a course of conduct.

Although the basic definition of user-to-user content covers the metaverse, as does “encountering” in relation to content under Clause 207—which, as has been mentioned, is broad enough to cover haptic suits—the restriction to illegal content could be problematic, as the metaverse is a complex of live interactions that mimics real life and its behaviours, including criminal ones. Also, the avatar of an adult could sexually assault the avatar of a child in the metaverse, and with haptic technologies this would not be just a virtual experience. Potentially even more fundamentally than Amendment 125, the Bill is premised on the internet being a solely virtual environment when it comes to content that can harm. But what I am seeking to outline is that conduct can also harm.

I recognise that we cannot catch everything in this Bill at this moment. This research is literally hot off the press; it is only a few weeks old. At the very least, it highlights the need for future-proofing. I am aware that some of the issues I have highlighted about the fundamental difference between conduct and content refer to clauses noble Lords may already have debated. However, I believe that these points are significant, and it is just happenstance that the research has only now come out. I would be grateful if the Minister would meet the Dawes Centre urgently to consider whether there are further changes the Government need to make to the Bill to ensure that it covers the harms I have outlined.

Viscount Colville of Culross (CB)

My Lords, I have put my name to Amendments 195, 239 and 263. I also strongly support Amendment 125 in the name of my noble friend Lady Kidron.

During this Committee there have been many claims that a group of amendments is the most significant, but I believe that this group is the most significant. This debate comes after the Prime Minister and the Secretary of State for Science and Technology met the heads of leading AI research companies in Downing Street. The joint statement said:

“They discussed safety measures … to manage risks”


and called for

“international collaboration on AI safety and regulation”.

Surely this Bill is the obvious place to start responding to those concerns. If we do not future-proof this Bill against the changes in digital technology, which are ever increasing at an ever-faster rate, it will be obsolete even before it is implemented.

My greatest concern is the arrival of AI. The noble Baroness, Lady Harding, has reminded us of the warnings from the godfather of AI, Geoffrey Hinton. If he is not listened to, who on earth should we be listening to? I wholeheartedly support Amendment 125. Machine-generated content is present in so much of what we see on the internet, and its presence is increasing daily. It is the future, and it must be within scope of this Bill. I am appalled by the examples that the noble Baroness, Lady Harding, has brought before us.

In the Communications and Digital Committee inquiry on regulating the internet, we decided that horizon scanning was so important that we called for a digital authority to be created which would look for harms developing in the digital world, assess how serious a threat they posed to users and develop a regulated response. The Government did not take up these suggestions. Instead, Ofcom has been given the onerous task of enforcing the triple shield which under this Bill will protect users to different degrees into the future.

Amendment 195 in the name of the right reverend Prelate the Bishop of Oxford will ensure that Ofcom has knowledge of how well the triple shield is working, which must be essential. Surveys of thousands of users undertaken by companies such as Kantar give an invaluable snapshot of what is concerning users now. These must be fed into research by Ofcom to ensure that future developments across the digital space are monitored, updated and brought to the attention of the Secretary of State and Parliament on a regular basis.

Amendment 195 will reveal trends in harms which might not be picked up by Ofcom under the present regime. It will look at the risk arising for individuals from the operation of Part 3 services. Clause 12 on user empowerment duties has a list of content and characteristics from which users can protect themselves. However, the characteristics for which or content with which users can be abused will change over time and these changes need to be researched, anticipated and implemented.

This Bill has proved in its long years of gestation that it takes time to change legislation, while changes on the internet take just minutes or are already here. The regime set up by these future-proofing amendments will at least go some way to protecting users from these fast-evolving harms. I stress to your Lordships’ Committee that this is very much precautionary work. It should be used to inform the Secretary of State of harms which are coming down the line. I do not think it will give power automatically to expand the scope of harms covered by the regime.

Amendment 239 inserts a new clause for an Ofcom future management of risks review. This will help feed into the Secretary of State review regime set out in Clause 159. Clause 159(3)(a) currently looks at ensuring that regulated services are operating using systems and processes which, so far as relevant, are minimising the risk of harms to individuals. The wording appears to mean that the Secretary of State will be viewing all harms to individuals. I would be grateful if the Minister could explain to the Committee the scope of the harms set out in Clause 159(3)(a)(i). Are they meant to cover only the harms of illegality and harms to children, or are they part of a wider examination of the harms regime to see whether it needs to be contracted or expanded? I would welcome an explanation of the scope of the Secretary of State’s review.

The real aim of Amendment 263 is to ensure that the Secretary of State looks at research work carried out by Ofcom. I am not sure how politicians will come to any conclusions in the Clause 159 review unless they are required to look at all the research published by Ofcom on future risk. I would like the Minister to explain what research the Secretary of State would rely on for this review unless this amendment is accepted. I hope Amendment 263 will also encourage the Secretary of State to look at possible harms not only from content, but also from the means of delivering this content.

This aim was the whole point of Amendment 261, which has already been debated. However, it needs to be borne in mind that harms come not just from content but also from the machine technology which delivers it. Every day we read about new developments and threats posed by a fast-evolving internet. Today it is concerns about ChatGPT and the race for the most sophisticated artificial intelligence. The amendments in this group will provide much-needed reinforcement to ensure that the Online Safety Bill remains a beacon for continuing safety online.

Lord Bishop of Chelmsford Portrait The Lord Bishop of Chelmsford
- View Speech - Hansard - - - Excerpts

My Lords, I shall speak in favour of Amendments 195, 239 and 263, tabled in the names of my right reverend friend the Bishop of Oxford, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, who I thank for his comments.

My right reverend friend the Bishop of Oxford regrets that he is unable to attend today’s debate. I know he would have liked to be here. My right reverend friend tells me that the Government’s Centre for Data Ethics and Innovation, of which he was a founding member, devoted considerable resource to horizon scanning in its early years, looking for the ways in which AI and tech would develop across the world. The centre’s analysis reflected a single common thread: new technologies are developing faster than we can track them and they bring with them the risk of significant harms.

This Bill has also changed over time. It now sets out two main duties: the illegal content duty and the children duty. These duties have been examined and debated for years, including by the joint scrutiny committee. They are refined and comprehensive. Risk assessments are required to be “suitable and sufficient”, which is traditional language from 20 years of risk-based regulation. It ensures that the duties are fit for purpose and proportionate. The duties must be kept up to date and in line with any service changes. Recent government amendments now helpfully require companies to report to Ofcom and publish summaries of their findings.

However, in respect of harms to adults, in November last year the Government suddenly took a different tack. They introduced two new groups of duties as part of a novel triple shield framework, supplementing the duty to remove illegal harms with a duty to comply with their own terms of service and a duty to provide user empowerment tools. These new duties are quite different in style to the illegal content and children duties. They have not benefited from the prior years of consultation.

As this Committee’s debates have frequently noted, there is no clear requirement on companies to assess in the round how effective their implementation of these new duties is or to keep track of their developments. The Government have changed this Bill’s system for protecting adults online late in the day, but the need for risk assessments, in whatever system the Bill is designed around, has been repeated again and again across Committee days. Even at the close of day eight on Tuesday, the noble Lords, Lord Allan of Hallam and Lord Clement-Jones, referred explicitly to the role of risk assessment in validating the Bill’s systems of press reforms. Surely this persistence across days and groups of debate reflects the systemically pivotal role of risk assessments in what is, after all, meant to be a systems and processes rather than a content-orientated Bill.

But it seems that many people on many sides of this Committee believe that an important gap in risk assessment for harms to adults has been introduced by these late changes to the Bill. My colleague the right reverend Prelate is keen that I thank Carnegie UK for its work across the Bill, including these amendments. It notes:

“Harms to adults which might trickle down to become harms to children are not assessed in the current Bill”.


The forward-looking parts of its regime need to be strengthened to ensure that Parliament and the Secretary of State review new ways in which harms manifesting as technology race along, and to ensure that they then have the right advice for deciding what to do about them. To improve that advice, Ofcom needs to risk assess the future and then to report its findings.

12:30
As the Committee can see, Amendment 195 is drawn very narrowly, out of respect for concerns about freedom of expression, even though the Government have still not explained how risk assessment poses any such threat. Ofcom would be able to request information from companies, using its information-gathering powers in Clause 91, to complete its future-proofing risk assessment. That is why, as Carnegie again notes,
“A risk assessment required of OFCOM for the purposes of future proofing alone could fill this gap”
in the Bill’s system,
“without even a theoretical threat to freedom of expression”.
Amendment 239 would require Ofcom to produce a forward-looking report, based on a risk assessment, to inform the Secretary of State’s review of the regime.
Amendment 263 would complete this systemic implementation of risk assessment by ensuring that future reviews of the regime by the Secretary of State include a broad assessment of the harms arising from regulated services, not just regulated content. This amendment would ensure ongoing consideration of risk management, including whether the regime needs expanding or contracting. I urge the Minister to support Amendments 195, 239 and 263.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, like others, I thank the Whips for intervening to protect children from hearing details that are not appropriate for the young. I have to say that I was quite relieved because I was rather squirming myself. Over the last two days of Committee, I have been exposed to more violent pornographic imagery than any adult, never mind a child, should be exposed to. I think we can recognise that this is certainly a challenging time for us.

I do not want any of the comments I will now make to be seen as minimising understanding of augmented reality, AI, the metaverse and so on, as detailed so vividly by the noble Baronesses, Lady Harding and Lady Finlay, in relation to child safety. However, I have some concerns about this group, in terms of proportionality and unintended outcomes.

Amendment 239, in the names of the right reverend Prelate the Bishop of Oxford, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, sums up some of my concerns about a focus on future-proofing. This amendment would require Ofcom to produce reports about future risks, which sounds like a common-sense demand. But my question is about us overly focusing on risk and never on opportunities. There is a danger that the Bill will end up recommending that we see these new technologies only in a negative way, and that we in fact give more powers to expand the scope for harmful content, in a way that stifles speech.

Beyond the Bill, I am more generally worried about what seems to be becoming a moral panic about AI. The precautionary principle is being adopted, which could mean stifling innovation at source and preventing the development of great technologies that could be of huge benefit to humanity. The over-focus on the dangers of AI and augmented reality could mean that we ignore the potential large benefits. For example, if we have AI, everyone could have an immediately responsive GP in their pocket—goodness knows that, for those trying to get an appointment, that could be of great use and benefit. It could mean that students have an expert tutor in every subject, just one message away. The noble Baroness, Lady Finlay, spoke about the fantastic medical breakthroughs that augmented reality can bring to handling neurological damage. Last night, I cheered when I saw how someone who has never been able to walk now can, through those kinds of technologies. I thought, “Isn’t this a brilliant thing?” So all I am suggesting is that we have to be careful that we do not see these new technologies only as tools for the most perverted form of activity among a small minority of individuals.

I note, with some irony, that fewer qualms were expressed by noble Lords about the use of AI when it was proposed to scan and detect speech or images in encrypted messages. As I argued at the time, this would be a threat to WhatsApp, Signal and so on. Clauses 110 and 124 have us using AI as a blunt proactive technology of surveillance, despite the high risks of inaccuracy, error and false flags. But there was great enthusiasm for AI then, when it was having an impact on individuals’ freedom of expression—yet, here, all we hear are the negatives. So we need to be balanced.

I am also concerned about Amendment 125, which illustrates the problem of seeing innovation only as a threat to safety and a potential problem. For example, if the Bill considers AI-generated content to be user-generated content, only large technology companies will have the resources—lawyers and engineers—necessary to proceed while avoiding crippling liability.

In practice, UK users risk being blocked out from new technologies if we are not careful about how we regulate here. For example, users in the European Union currently cannot access Google Bard AI assistant because of GDPR regulations. That would be a great loss because Google Bard AI is potentially a great gain. Despite the challenges of the likes of ChatGPT and Bard AI that we keep reading about, with people panicking that this will lead to wide-scale cheating in education and so on, this has huge potential as a beneficial technology, as I said.

I have mentioned that one of the unintended consequences—it would be unintended—of the whole Bill could be that the UK becomes a hostile environment for digital investment and innovation. So start-ups that have been invested in—like DeepMind, a Google-owned and UK-based AI company—could be forced to leave the UK, doing huge damage to the UK’s digital sector. How can the UK be a science and technology superpower if we end up endorsing anti-innovation, anti-progress and anti-business measures by being overly risk averse?

I have the same concerns about Amendment 286, which requires periodic reviews of new technology content environments such as the metaverse and other virtual and augmented reality settings. I worry that it will not be attractive for technology companies to confidently invest in new technologies if there is this constant threat of new regulations and new problems on the horizon.

I have a query that mainly relates to Amendment 125 but that is also more general. If virtual and augmented reality actually involve user-to-user interaction, like in the metaverse, are they not already covered in the Bill? Why do we need to add them in? The noble Baroness, Lady Harding, said that it has got to the point where we are not able to distinguish fake from real, and augmented reality from reality. But she concludes that that means that we should treat fake as real, which seems to me to rather muddy the waters and make it a fait accompli. I personally—

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

I am sorry to interrupt, but I will make a clarification; the noble Baroness is misinterpreting what I said. I was actually quoting the godfather of AI and his concerns that we are fast approaching a space where it will be impossible—I did not say that it currently is—to distinguish between a real child being abused and a machine learning-generated image of a child being abused. So, first, I was quoting the words of the godfather of AI, rather than my own, and, secondly, he was looking forward—only months, not decades—to a very real and perceived threat.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

I personally think that it is a pessimistic view of the future to suggest that humanity cannot rise to the task of being able to distinguish between deep fakes and real images. Organising all our lives, laws and liberties around the deviant predilections of a minority of sexual offenders on the basis that none of us will be able to tell the difference in the future, when it comes to that kind of activity, is rather dangerous for freedom and innovation.

Lord Russell of Liverpool Portrait Lord Russell of Liverpool (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I will speak very briefly. I could disagree with much of what the noble Baroness just said, but I do not need to go there.

What particularly resonates with me today is that, since I first entered your Lordships’ House at the tender age of 28 in 1981, this is the first time I can ever remember us having to rein back what we are discussing because of the presence of young people in the Public Gallery. I reflect on that, because it brings home the gravity of what we are talking about and its prevalence; we cannot run away or hide from it.

I will ask the Minister about the International Regulatory Cooperation for a Global Britain: Government Response to the OECD Review of International Regulatory Cooperation of the UK, published 2 September 2020. He will not thank me for that, because I am sure that he is already familiar and word-perfect with this particular document, which was pulled together by his noble friend, the noble Lord, Lord Callanan. I raise this because, to think that we can in any way, shape or form, with this piece of legislation, stem the tide of what is happening in the online world—which is happening internationally on a global basis and at a global level—by trying to create regulatory and legal borders around our benighted island, is just for the fairies. It is not going to happen.

Can the Minister tell us about the degree to which, at an international level, we are proactively talking to, and learning from, other regulators in different jurisdictions, which are battling exactly the same things that we are? To concentrate the Minister’s mind, I will point out what the noble Lord, Lord Callanan, committed the Government to doing nearly three years ago. First, in relation to international regulatory co-operation, the Government committed to

“developing a whole-of-government IRC strategy, which sets out the policies, tools and respective roles of different departments and regulators in facilitating this; … developing specific tools and guidance to policy makers and regulators on how to conduct IRC; and … establishing networks to convene international policy professionals from across government and regulators to share experience and best practice on IRC”.

I am sure that, between now and when he responds, he will be given a detailed answer by the Bill team, so that he can tell us exactly where the Government, his department and Ofcom are in carrying out the commitments of the noble Lord, Lord Callanan.

Baroness Benjamin Portrait Baroness Benjamin (LD)
- View Speech - Hansard - - - Excerpts

My Lords, although I arrived a little late, I will say, very briefly, that I support the amendments wholeheartedly. I support them because I see this as a child protection issue. People viewing AI-generated content, I believe, will be led to go out and find real children to sexually abuse. I will not take up any more time, but I wholeheartedly agree with everything that has been said, apart from what the noble Baroness, Lady Fox, said. I hope that the Minister will look very seriously at the amendments and take them into consideration.

12:45
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, on behalf of my noble friend Lord Clement-Jones, I will speak in support of Amendments 195, 239, 263 and 286, to which he added his name. He wants me to thank the Carnegie Trust and the Institution of Engineering and Technology, which have been very helpful in flagging relevant issues for the debate.

Some of the issues in this group of amendments will range much more widely than simply the content we have before us in the Online Safety Bill. The right reverend Prelate the Bishop of Chelmsford is right to flag the question of a risk assessment. People are flagging to us known risks. Once we have a known risk, it is incumbent on us to challenge the Minister to see whether the Government are thinking about those risks, regardless of whether the answer is something in the Online Safety Bill or that there needs to be amendments to wider criminal law and other pieces of legislation to deal with it.

Some of these issues have been dealt with for a long time. If you go back and look at the Guardian for 9 May 2007, you will see the headline,

“Second Life in virtual child sex scandal”.


That case was reported in Germany about child role-playing in Second Life, which is very similar to the kind of scenarios described by various noble Lords in this debate. If Second Life was the dog that barked but did not bite, we are in quite a different scenario today, not least because of the dramatic expansion in broadband technology, for which we can thank the noble Baroness, Lady Harding, in her previous role. Pretty much everybody in this country now has incredible access, at huge scale, to high-speed broadband, which allows those kinds of real life, metaverse-type environments to be available to far more people than was possible with Second Life, which tended to be confined to a smaller group.

The amendments raise three significant groups of questions: first, on scope, and whether the scope of the Online Safety Bill will stretch to what we need; secondly, on behaviour, including the kinds of new behaviours, which we have heard described, that could arise as these technologies develop; and, finally, on agency, which speaks to some of the questions raised by the noble Baroness, Lady Fox, on AIs, including the novel questions about who is responsible when something happens through the medium of artificial intelligence.

On scope, the key question is whether the definition of “user-to-user”, which is at the heart of the Bill, covers everything that we would like to see covered by the Bill. Like the noble Baroness, Lady Harding, I look forward to the Minister’s response; I am sure that he has very strongly prepared arguments on that. We should take a moment to give credit to the Bill’s drafters for coming up with these definitions for user-to-user behaviours, rather than using phrases such as, “We are regulating social media or specific technology”. It is worth giving credit, because a lot of thought has gone into this, over many years, with organisations such as the Carnegie Trust. Our starting point is a better starting point than many other legislative frameworks which list a set of types of services; we at least have something about user-to-user behaviours that we can work with. Having said that, it is important that we stress-test that definition. That is what we are doing today: we are stress-testing, with the Minister, whether the definition of “user-to-user” will still apply in some of the novel environments.

It certainly seems likely—and I am sure that the Minister will say this—that a lot of metaverse activity would be in scope. But we need detailed responses from the Minister to explain why the kinds of scenario that have been described—if he believes that this is the case; I expect him to say so—would mean that Ofcom would be able to demand things of a metaverse provider under the framework of the user-to-user requirements. Those are things we all want to see, including the risk assessments, the requirement to keep people away from illegal content, and any other measures that Ofcom deems necessary to mitigate the risks on those platforms.

It will certainly be useful for the Minister to clarify one particular area. Again, we are fortunate in the UK that pseudo-images of child sexual abuse are illegal and have been illegal for a long time. That is not the case in every country around the world, and the noble Lord, Lord Russell, is quite right to say that this an area where we need international co-operation. Having dealt with it on the platforms, some countries have actively chosen not to criminalise pseudo-images; others just have not considered it.

In the UK, we were ahead of the game in saying, “If it looks like a photo of child abuse, we don’t care whether you created it on Photoshop, or whatever—it is illegal”. I hope that the Minister can confirm that avatars in metaverse-type environments would fall under that definition. My understanding is that the legislation refers to photographs and videos. I would interpret an avatar or activity in a metaverse as a photo or video, and I hope that is what the Government’s legal officers are doing.

Again, it is important in the context of this debate and the exchange that we have just had between the noble Baronesses, Lady Harding and Lady Fox, that people out there understand that they do not get away with it. If you are in the UK and you create a child sexual abuse image, you can be taken to court and go to prison. People should not think that, if they do it in the metaverse, it is okay—it is not okay, and it is really important that that message gets out there.

This brings us to the second area of behaviours. Again, some of the behaviours that we see online will be extensions of existing harms, but some will be novel, based on technical capabilities. Some of them we should just call by their common or garden term, which is sexual harassment. I was struck by the comments of the noble Baroness, Lady Berridge, on this. If people go online and start approaching other people in sexual terms, that is sexual harassment. It does not matter whether it is happening in a physical office, on public transport, on traditional social media or in the metaverse—sexual harassment is wrong and, particularly when directed at minors, a really serious offence. Again, I hope that all the platforms recognise that and take steps to prevent sexual harassment on their platforms.

That is quite a lot of the activity that people are concerned about, but others are much more complex and may require updates to legislation. Those are particularly activities such as role-playing online, where people play roles and carry out activities that would be illegal if done in the real world. That is particularly difficult when it is done between consenting adults, when they choose to carry out a role-playing activity that replicates an illegal activity were it to take place in the real world. That is hard—and those with long memories may remember a group of cases around Operation Spanner in the 1990s, whereby a group of men was prosecuted for consensual sadomasochistic behaviour. The case went backwards and forwards, but it talked to something that the noble Baroness, Lady Fox, may be sympathetic to—the point at which the state should intervene on sexual activities that many people find abhorrent but which take place between consenting adults.

In the context of the metaverse, I see those questions coming front and centre again. There are all sorts of things that people could role-play in the metaverse, and we will need to take a decision on whether the current legislation is adequate or needs to be extended to cater for the fact that it now becomes a common activity. Also important is the nature of it. The fact that it is so realistic changes the nature of an activity; you get a gut feeling about it. The role-playing could happen today outside the metaverse, but once you move it in there, something changes. Particularly when children are involved, it becomes something that should be a priority for legislators—and it needs to be informed by what actually happens. A lot of what the amendments seek to do is to make sure that Ofcom collects the information that we need to understand how serious these problems are becoming and whether they are, again, something that is marginal or something that is becoming mainstream and leading to more harm.

The third and final question that I wanted to cover is the hardest one—the one around agency. That brings us to thinking about artificial intelligence. When we try to assign responsibility for inappropriate or illegal behaviour, we are normally looking for a controlling mind. In many cases, that will hold true online as well. I know that the noble Lord, Lord Knight of Weymouth, is looking at bots—and with a classic bot, you have a controlling mind. When the bots were distributing information in the US election on behalf of Russia, that was happening on behalf of individuals in Russia who had created those bots and sent them out there. We still had a controlling mind, in that instance, and a controlling mind can be prosecuted. We have that in many instances, and we can expect platforms to control them and expect to go after the individuals who created the bots in the same way that we would go after things that they do as a first party. There is a lot of experience in the fields of spam and misinformation, where “bashing the bots” is the daily bread and butter of a lot of online platforms. They have to do it just to keep their platforms safe.

We can also foresee a scenario with artificial intelligence whereby it is less obvious that there is a controlling mind or who the controlling mind should be. I can imagine a situation whereby an artificial intelligence has created illegal content, whether that is child sexual abuse material or something else that is in the schedule of illegal content in the Bill, without the user having expected it to happen or the developer having believed or contemplated that it could happen. Let us say that the artificial intelligence goes off and creates something illegal, and that both the user and the developer can show the question that they asked of the artificial intelligence and show how they coded it, showing that neither of them intended for that thing to happen. In the definition of artificial intelligence, it has its own agency in that scenario. The artificial intelligence cannot be fined or sent to prison. There are some things that we can do: we can try to retrain it, or we can kill it. There is always a kill switch; we should never forget that with artificial intelligence. Sam Altman at OpenAI can turn off ChatGPT if it is behaving in an illegal way.

There are some really important questions around that issue. There is the liability for the specific instance of the illegality happening. Who do we hold liable? Even if everyone says that it was not their intention, is there someone that we can hold liable? What should the threshold be at which we can execute that death sentence on the AI? If an AI is being used by millions of people and on a small number of occasions it does something illegal, is that sufficient? At what point do we say that the AI is rogue and that, effectively, it needs to be taken out of operation? Those are much wider questions than we are dealing with immediately in the Bill, but I hope that the Minister can at least point to what the Government are thinking about these kinds of legal questions, as we move from a world of user-to-user engagement to user-to-user-to-machine engagement, when that machine is no longer a creature of the user.

Baroness Berridge Portrait Baroness Berridge (Con)
- Hansard - - - Excerpts

I have had time just to double-check the offences. The problem that exists—and it would be helpful if my noble friend the Minister could confirm this—is that the criminal law is defined in terms of a person. It is not automatic that sexual harassment, particularly if you do not have a haptic suit on, would actually fall within the criminal law, as far as I understand it, which is why I am asking the Minister to clarify. That was the point that I was making. Harassment per se also needs a course of conduct, so a one-off touch of your avatar of a sexual nature clearly falls outside criminal law. That is the point of clarification that we might need on how the criminal law is framed at the moment.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I am grateful to the noble Baroness. That is very helpful.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

That is exactly the same issue with child sexual abuse images—it is about the way in which criminal law is written. Not surprisingly, it is not up to date with the evolution of technology.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I am grateful for that intervention as well. That summarises the core questions that we have for the Minister. Of the three areas that we have for him, the first is the question of scope and the extent to which he can assure us that the Bill as drafted will be robust in covering the metaverse and bots, which are the issues that have been raised today. The second is on behaviours and relates to the two interventions that we have just had. We have been asking whether, for behaviours that are criminal today, that criminality will stretch to new, similar forms of behaviour taking place in new environments—let us put it that way. The behaviour, the intent and the harm are the same, but the environment is different. We want to understand the extent to which the Government are thinking about that, where that thinking is happening and how confident they are that they can deal with that.

Finally, on the question of agency, how do the Government expect to deal with the fact that we will have machines operating in a user-to-user environment when the connection between the machine and another individual user is qualitatively different from anything that we have seen before? Those are just some small questions for the Minister on this Thursday afternoon.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, the debate on this group has been a little longer, deeper and more important than I had anticipated. It requires all of us to reflect before Report on some of the implications of the things we have been talking about. It was introduced masterfully by the noble Baroness, Lady Harding, and her comments—and those from the noble Baronesses, Lady Finlay and Lady Berridge—were difficult to listen to at times. I also congratulate the Government Whip on the way he handled the situation so that innocent ears were not subject to some of that difficult listening. But the questions around the implications of virtual reality, augmented reality and haptic technology are really important, and I hope the Minister will agree to meet with the noble Baroness, Lady Berridge, and the people she referenced to reflect on some of that.

13:00
The noble Baroness, Lady Fox, raised some of the right questions around the balance of this debate. I am a technology enthusiast, so I will quote shortly from my mobile phone, which I use for good, although a lot of this Bill is about how technology is used for bad. I am generally of the view that we have a responsibility to put some safety rails around this technology. I know that the noble Baroness agrees, in respect of children in particular. As ever, in responding to her, I end up saying “It’s all about balance” in the same way as the Minister ends up saying “It’s all about unintended consequences”.
Amendments 283ZZA and 283ZZB in my name are, as the noble Lord, Lord Allan, anticipated, about who controls autonomous bots. I was really grateful to hear his comments, because I put down the amendments on a bit of a hunch without being that confident that I understood what I was talking about technically. He understands what he is talking about much better than I do in this regard, so it is reassuring that I might be on to something of substance.
I was put on to it by reading a New York Times article about Geoffrey Hinton, now labelled the “godfather of AI”. The article stated:
“Down the road, he is worried that future versions of the technology pose a threat to humanity because they often learn unexpected behavior from the vast amounts of data they analyse. This becomes an issue, he said, as individuals and companies allow AI systems not only to generate their own computer code but actually run that code on their own”.
As a result, I went to OpenAI’s ChatGPT and asked whether it could create code. Of course, it replied that it could help me with creating code. I said, “Can you code me a Twitter bot?” It said, “Certainly, I can help you create a basic Twitter bot using Python. Here is an example of a Twitter bot that posts tweets”. Then I got all the instructions on how to do it. The AI will help me get on and create something that can then start to generate autonomous behaviours and activity. It is readily available to all of us now, and that should cause us some concern.
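For the record, and purely as an illustration rather than a verbatim copy of the chatbot’s reply, the boilerplate it offered was broadly of this kind, assuming the commonly used Tweepy library and placeholder credentials in place of real developer keys:

import tweepy  # third-party Python client for the Twitter API

# Placeholder credentials: a real bot would use keys issued through a Twitter developer account
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

# Post a single tweet; running this in a loop or on a schedule is what makes the bot autonomous
api.update_status("Hello, this tweet was posted by a bot")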
The Bill certainly needs to clarify—as the amendment tabled by the noble Baroness, Lady Kidron, and introduced so well by the noble Baroness, Lady Harding, goes to—whether or not a bot is a user. If a bot is a user and the Minister can assure us of that, things get a lot easier. But given that it is possible to code a realistic avatar generating its own content and behaviour in the metaverse, the core question I am driving at is: who is responsible for that behaviour? Is it the person who is deemed to be controlling it, as it says in Clause 170(7), which talks about
“a person who may be assumed to control the bot or tool”?
As the noble Lord, Lord Allan, said, that is not always going to be so straightforward when the AI itself starts to generate behaviours that are not expected by the person who might be perceived to have controlled it. No one really controls it; the creator does not necessarily control it. I am just offering the simple amendment “or owns it” to allow some legal culpability to be clarified. It might be that the supplier of the virtual environment is culpable. These are questions to which, with my amendment, I am seeking answers from the Minister, so that we get clarity on how Ofcom is supposed to regulate all of these potential harms in the future.
Some months ago, I went to a Speaker’s Lecture given by Stuart Russell, who delivered the Reith Lectures around AI. He talked about the programming of an AI-powered vacuum cleaner that was asked to clear up as much dirt as possible. What then plays out is that the vacuum cleaner gets a bit of dirt up off the carpet and then spews it out and picks it up again, because that is the way of maximising the intent of the programming. It is very difficult to anticipate the behaviour of AI if you do not get the instructions exactly right. And that is the core of what we are worried about. Again, when I asked ChatGPT to give me some guidance on a speaking note to this question, it was quite helpful in also guiding me towards an embedded danger of bias and inequity. The AI is trained by data; we know a certain amount about the bias of data, but it is difficult to anticipate how that will play out as the AI feeds and generates its own data.
The equity issues that can then flow are something that we need to be confident that this legislation will be able to deal with. As the right reverend Prelate the Bishop of Chelmsford reminded us, when the legal but harmful elements of the Bill were taken out between draft stage and publication, we lost the assessment of future risk that was in place before, which I think was an unintended consequence of taking those things out. It would be great to see those back, as Amendment 139 and Amendment 195 from the right reverend Prelate the Bishop of Oxford suggest. The reporting that the noble Baroness, Lady Finlay, is proposing in her amendments is important in giving us as Parliament a sense of how this is going. My noble friend Lord Stevenson tabled Amendment 286 to pay particular regard to the metaverse, and I support that.
Ultimately, the key test for the Minister is, as others have said, that tech is changing really fast. It is changing the online environment and our relationship with it as humans very quickly indeed; the business models will change really quickly as a result and they, by and large, are likely to drive quite a lot of the platform behaviour. But can the regulator, as things are currently set out in this legislation, react and change quickly enough in response to that highly dynamic environment? Can we anticipate that what is inconceivable at the moment is going to be regulatable by this Bill? If not, we need to make sure that Parliament has opportunities to revisit this. As I have said before, I strongly support post-legislative scrutiny; I personally think a permanent Joint Committee of both Houses around digital regulation, so that we have some sustained body of expertise of parliamentarians in both Houses to keep up with this, would be extremely useful to Parliament.
As a whole, I think these amendments are really helpful to the Minister and to Parliament in pointing us towards where we can strengthen the future-proofing of the Bill. I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - - - Excerpts

My Lords, this has been a grim but important debate to open the Committee’s proceedings today. As my noble friend Lady Harding of Winscombe and others have set out, some of the issues and materials about which we are talking are abhorrent indeed. I join other noble Lords in thanking my noble friend Lord Harlech for his vigilance and consideration for those who are watching our proceedings today, to allow us to talk about them in the way that we must in order to tackle them, but to ensure that we do so sensitively. I thank noble Lords for the way they have done that.

I pay tribute also to those who work in this dark corner of the internet to tackle these harms. I am pleased to reassure noble Lords that the Bill has been designed in a way that responds to emerging and new technologies that may pose a risk of harm. In our previous debates, we have touched on explicitly naming certain technologies and user groups or making aspects of the legislation more specific. However, one key reason why the Government have been resistant to such specificity is to ensure that the legislation remains flexible and future-proofed.

The Bill has been designed to be technology-neutral in order to capture new services that may arise in this rapidly evolving sector. It confers duties on any service that enables users to interact with each other, as well as search services, meaning that any new internet service that enables user interaction will be caught by it.

Amendment 125, tabled by the noble Baroness, Lady Kidron—whose watchful eye I certainly feel on me even as she takes a rare but well-earned break today—seeks to ensure that machine-generated content, virtual reality content and augmented reality content are regulated content under the Bill. I am happy to confirm to her and to my noble friend Lady Harding who moved the amendment on her behalf that the Bill is designed to regulate providers of user-to-user services, regardless of the specific technologies they use to deliver their service, including virtual reality and augmented reality content. This is because any service that allows its users to encounter content generated, uploaded or shared by other users is in scope unless exempt. “Content” is defined very broadly in Clause 207(1) as

“anything communicated by means of an internet service”.

This includes virtual or augmented reality. The Bill’s duties therefore cover all user-generated content present on the service, regardless of the form this content takes, including virtual reality and augmented reality content. To state it plainly: platforms that allow such content—for example, the metaverse—are firmly in scope of the Bill.

The Bill also ensures that machine-generated content on user-to-user services created by automated tools or machine bots will be regulated by the Bill where appropriate. Specifically, Clause 49(4)(b) means that machine-generated content is regulated unless the bot or automated tool producing the content is controlled by the provider of the service. This approach ensures that the Bill covers scenarios such as malicious bots on a social media platform abusing users, or when users share content produced by new tools, such as ChatGPT, while excluding functions such as customer service chatbots which are low risk. Content generated by an artificial intelligence bot and then placed by a user on a regulated service will be regulated by the Bill. Content generated by an AI bot which interacts with user-generated content, such as bots on Twitter, will be regulated by the Bill. A bot that is controlled by the service provider, such as a customer service chatbot, is out of scope; as I have said, that is low risk and regulation would therefore be disproportionate. Search services using AI-powered features will be in scope of the search duties.

The Government recognise the need to act both to unlock the opportunities and to address the potential risks of this technology. Our AI regulation White Paper sets out the principles for the responsible development of AI in the UK. These principles, such as safety and accountability, are at the heart of our approach to ensuring the responsible development and use of artificial intelligence. We are creating a horizon-scanning function and a central risk function which will enable the Government to monitor future risks.

The Bill does not distinguish between the format of content present on a service. Any service that allows its users to encounter content generated, uploaded or shared by other users is in scope unless exempt, regardless of the format of that content. This includes virtual and augmented reality material. Platforms that allow such content, such as the metaverse, are firmly in scope of the Bill and must take the required steps to protect their users from harm. I hope that gives the clarity that my noble friend and others were seeking and reassurance that the intent of Amendment 125 is satisfied.

The Bill will require companies to take proactive steps to tackle all forms of online child sexual abuse, including grooming, live streaming, child sexual abuse material and prohibited images of children. If AI-generated content amounts to a child sexual exploitation or abuse offence under the Bill, it will be subject to the illegal content duties. Regulated providers will need to take steps to remove this content. We will shortly bring forward, and have the opportunity to debate in Committee, a government amendment to address concerns relating to the sending of intimate images. This will cover the non-consensual sharing of manufactured images—more commonly known as deepfakes. The possession and distribution of altered images that appear to be indecent photographs of children is already covered by the indecent images of children offences, which are very serious offences with robust punishment in law.

13:15
The noble Baroness, Lady Finlay of Llandaff, asked about an issue touched on in Amendment 85C. Under their illegal content safety duties, companies must put in place safety measures that mitigate and manage the risks identified in their illegal content risk assessment. As part of this, in-scope services such as Meta will be required to assess the level of risk of their service being used for the commission or facilitation of a priority offence. They will then be required to mitigate any such risks. This will ensure that providers implement safety by design measures to mitigate a broad spectrum of factors that enable illegal activity on their platforms. This includes when these platforms facilitate new kinds of user-to-user interactions that may result in offences manifesting themselves in new ways online.
Schedules 5, 6 and 7, which list the priority offences, are not static lists and can be updated. To maintain flexibility and to keep those lists responsive to emerging harms and legislative changes, the Secretary of State has the ability to designate additional offences as priority offences via statutory instrument, subject to parliamentary scrutiny. It should be noted that Schedule 7 already contains several sexual offences, including extreme pornography, so-called revenge pornography and sexual exploitation, while Schedule 6 is focused solely on child sexual abuse and exploitation offences. Fraud and financial offences are also listed in Schedule 7. In this way, these offences are already captured, and mean that all in-scope services must take proactive measures to tackle these types of content. These schedules have been designed to focus on the most serious and prevalent offences, where companies can take effective and meaningful action. They are, therefore, primarily focused on offences that can be committed online, so that platforms are able to take effective steps proactively to identify and tackle such offences. If we were to add offences to these lists that could not be effectively tackled, it would risk spreading companies’ resources too thinly and diluting their efforts to tackle the offences we have listed in the Bill.
The Bill establishes a differentiated approach to ensure that it is proportionate to the risk of harm that different services pose. Category 1 services are subject to additional duties, such as transparency, accountability and free speech duties, as well as duties such as protections for journalistic and democratic content. These duties reflect the influence of the major platforms over our online democratic discourse. The designation of category 1 services is based on how easily, quickly and widely user-generated content is disseminated. This reflects how those category 1 services have the greatest influence over public discourse because of their high reach. Requiring all companies to comply with the full range of category 1 duties would impose a disproportionate regulatory burden on smaller companies, which do not exert the same amount of influence over public discourse. This would divert their resources away from the vital task of tackling illegal content and protecting children.
The noble Baroness, Lady Finlay, also asked about virtual training grounds. Instruction or training for terrorism is illegal under existing terrorism legislation, and terrorism is listed as a priority offence in this Bill. Schedule 5 to the Bill lists the terrorism offences that constitute priority offences. These are drawn from existing terrorism legislation, including the Terrorism Act 2000, the Anti-terrorism, Crime and Security Act 2001 and the Terrorism Act 2006. Section 6 of the 2006 Act covers instruction or training for terrorism and Section 2 of that Act covers dissemination of terrorist publications. Companies in scope of the Online Safety Bill will be required to take proactive steps to prevent users encountering content that amounts to an offence under terrorism legislation.
Amendments 195, 239, 263, 241, 301 and 286 seek to ensure that the Bill is future-proofed to keep pace with emerging technologies, as well as ensuring that Ofcom is able to monitor and identify new threats. The broad scope of the Bill means that it will capture all services that enable user interaction as well as search services, enabling its framework to continue to apply to new services that have not yet been invented. In addition, the Government fully agree that Ofcom must assess future risks and monitor the emergence of new technologies. That is why the Bill already gives Ofcom broad horizon-scanning and robust information-gathering powers, and why it requires Ofcom to carry out extensive risk assessments. These will ensure that it can effectively supervise and regulate new and emerging user-to-user services.
Ofcom is already conducting extensive horizon scanning and I am pleased to confirm that it is planning a range of research into emerging technologies in relation to online harms. The Bill also requires Ofcom to review and update its sectoral risk assessments, risk profiles and codes of practice to ensure that those reflect the risks and harms of new and emerging technology. The amendments before us would therefore duplicate existing duties and powers for Ofcom. In addition, as noble Lords will be aware, the Bill already has built-in review mechanisms to ensure that it works effectively.
My right honourable friends the Prime Minister and the Secretary of State for Science, Innovation and Technology are clear that artificial intelligence is the defining technology of our time, with the potential to bring positive changes, but also that the success of this technology is founded on having the right guardrails in place, so that the public can have the confidence that artificial intelligence is being used in a safe and responsible way. The UK’s approach to AI regulation will need to keep pace with the fast-moving advances in this technology. That is why His Majesty’s Government have deliberately adopted an agile response to unlock opportunities, while mitigating the risks of the technology, as outlined in our AI White Paper. We are engaging extensively with international partners on these issues, which have such profound consequences for all humankind.
Clause 159 requires the Secretary of State to undertake a review into the operation of the regulatory framework between two and five years after the provisions come into effect. This review will consider any new emerging trends or technologies, such as AI, which could have the potential to compromise the efficacy of the Bill in achieving its objectives. I am happy to assure the noble Viscount, Lord Colville of Culross, and the right reverend Prelate the Bishop of Chelmsford that the review will cover all content and activity being regulated by the Bill, including legal content that is harmful to children and content covered by user-empowerment tools. The Secretary of State must consult Ofcom when she carries out this review.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

Will the review also cover an understanding of what has been happening in criminal cases where, in some of the examples that have been described, people have tried to take online activity to court? We will at that point understand whether the judges believe that existing offences cover some of these novel forms of activity. I hope the review will also extend not just to what Ofcom does as a regulator but to understanding what the courts are doing in terms of the definitions of criminal activity and whether they are being effective in the new online spaces.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I believe it will. Certainly, both government and Parliament will take into account judgments in the court on this Bill and in related areas of law, and will, I am sure, want to respond.

Baroness Berridge Portrait Baroness Berridge (Con)
- Hansard - - - Excerpts

It is not just the judgments of the courts; it is about how the criminal law as a very basic point has been framed. I invite my noble friend the Minister to please meet with the Dawes Centre, because it is about future crime. We could end up with a situation in which more and more violence, particularly against women and girls, is being committed in this space, and although it may be that the Bill has made it regulated, it may not fall within the province of the criminal law. That would be a very difficult situation for our law to end up in. Can my noble friend the Minister please meet with the Dawes Centre to talk about that point?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I am happy to reassure my noble friend that the director of the Dawes Centre for Future Crime sits on the Home Office’s Science Advisory Council, whose work is very usefully fed into the work being done at the Home Office. Colleagues at the Ministry of Justice keep criminal law under constant review, in light of research by such bodies and what we see in the courts and society. I hope that reassures my noble friend that the points she raised, which are covered by organisations such as the Dawes Centre, are very much in the mind of government.

The noble Lord, Lord Allan of Hallam, explained very effectively the nuances of how behaviour translates to the virtual world. He is right that we will need to keep both offences and the framework under review. My noble friend Lady Berridge asked a good and clear question, to which I am afraid I do not have a similarly concise answer. I can reassure her that generated child sexual abuse and exploitation material is certainly illegal, but she asked about sexual harassment via a haptic suit; that would depend on the specific circumstances. I hope she will allow me to respond in writing, at greater length and more helpfully, to the very good question she asked.

Under Clause 56, Ofcom will also be required to undertake periodic reviews into the incidence and severity of content that is harmful to children on the in-scope services, and to recommend to the Secretary of State any appropriate changes to regulations based on its findings. Clause 141 also requires Ofcom to carry out research into users’ experiences of regulated services, which will likely include experiences of services such as the metaverse and other online spaces that allow user interaction. Under Clause 147, Ofcom may also publish reports on other online safety matters.

The questions posed by the noble Lord, Lord Russell of Liverpool, about international engagement are best addressed in a group covering regulatory co-operation, which I hope we will reach later today. I can tell him that we have introduced a new information-sharing gateway for the purpose of sharing information with overseas regulators, to ensure that Ofcom can collaborate effectively with its international counterparts. That builds on existing arrangements for sharing information that underpin Ofcom’s existing regulatory regimes.

The amendments tabled by the noble Lord, Lord Knight of Weymouth, relate to providers’ judgments about when content produced by bots is illegal content, or a fraudulent advertisement, under the Bill. Clause 170 sets out that providers will need to take into account all reasonably available relevant information about content when making a judgment about its illegality. As we discussed in the group about illegal content, providers will need to treat content as illegal when this information gives reasonable grounds for inferring that an offence was committed. Content produced by bots is in scope of providers’ duties under the Bill. This includes the illegal content duties, and the same principles for assessing illegal content will apply to bot-produced content. Rather than drawing inferences about the conduct and intent of the user who generated the content, the Bill specifies that providers should consider the conduct and the intent of the person who can be assumed to have controlled the bot at the point it created the content in question.

The noble Lord’s amendment would set out that providers could make judgments about whether bot-produced content is illegal, either by reference to the conduct or mental state of the person who owns the bot or, alternatively, by reference to the person who controls it. As he set out in his explanatory statement and outlined in his speech, I understand he has brought this forward because he is concerned that providers will sometimes not be able to identify the controller of a bot, and that this will impede providers’ duties to take action against illegal content produced by them. Even when the provider does not know the identity of the person controlling the bot, however, in many cases there will still be evidence from which providers can draw inferences about the conduct and intent of that person, so we are satisfied that the current drafting of the Bill ensures that providers will be able to make a judgment on illegality.

Lord Knight of Weymouth (Lab)

My concern is also whether or not the bot is out of control. Can the Minister clarify that issue?

Lord Parkinson of Whitley Bay (Con)

It depends on what the noble Lord means by “out of control” and what content the bot is producing. If he does not mind, this may be an issue which we should go through in technical detail and have a more free-flowing conversation with examples that we can work through.

13:30
Providers will consider contextual evidence such as the circumstances in which the content was created or information about how the bot normally behaves on the site. In many cases, the person who “owns” the bot may be the same person who controls it. In such instances, providers will be required to consider the conduct and mental state of the owner when considering whether the relevant bot has produced illegal content. Where the ownership of the bot is relevant, it will already be captured by this clause, but I am very happy to kick the tyres of that with the noble Lord and any others who wish to join us.
Baroness Finlay of Llandaff (CB)

This is a very interesting discussion; the noble Lord, Lord Knight, has hit on something really important. When somebody does an activity that we believe is criminal, we can interrogate them and ask how they came to do it and got to the conclusion that they did. The difficulty is that those of us who are not super-techy do not understand how you can interrogate a bot or an AI which appears to be out of control on how it got to the conclusion that it did. It may be drawing from lots of different places and there may be ownership of lots of different sources of information. I wonder whether that is why we are finding how this will be monitored in future so concerning. I am reassured that the noble Lord, Lord Knight of Weymouth, is nodding; does the Minister concur that this may be a looming problem for us?

Lord Parkinson of Whitley Bay (Con)

I certainly concur that we should discuss the issue in greater detail. I am very happy to do so with the noble Lord, the noble Baroness and others who want to do so, along with officials. If we can bring some worked examples of what “in control” and “out of control” bots may be, that would be helpful.

I hope the points I have set out in relation to the other issues raised in this group and the amendments before us are satisfactory to noble Lords and that they will at this point be content not to press their amendments.

Baroness Harding of Winscombe (Con)

My Lords, I thank all noble Lords who have contributed to a thought-provoking and, I suspect, longer debate than we had anticipated. At Second Reading, I think we were all taken aback when this issue was opened up by my noble friend Lord Sarfraz; once again, we are realising that this requires really careful thought. I thank my noble friend the Minister for his also quite long and thoughtful response to this debate.

I feel that I owe the Committee a small apology. I am very conscious that I talked in quite graphic detail at the beginning when there were still children in the Gallery. I hope that I did not cause any harm, but it shows how serious this is that we have all had to think so carefully about what we have been saying—only in words, without any images. We should not underestimate how much this has demonstrated the importance of our debates.

On the comments of the noble Baroness, Lady Fox, I am a huge enthusiast, like the noble Lord, Lord Knight, for the wonders of the tech world and what it can bring. We are managing the balance in this Bill to make sure that this country can continue to benefit from and lead the opportunities of tech while recognising its real and genuine harms. I suggest that today’s debate has demonstrated the potential harm that the digital world can bring.

I listened carefully—as I am certain the noble Baroness, Lady Kidron, has been doing in the digital world—to my noble friend’s words. I am encouraged by what he has put on the record on Amendment 125, but there are some specific issues that it would be helpful for us to talk about, as he alluded to, after this debate and before Report. Let me highlight a couple of those.

First, I do not really understand the technical difference between a customer service bot and other bots. I am slightly worried that we are defining, in specific terms, one type of bot that would not be captured by this Bill. I suspect that there might be others in future. We must think carefully about whether we are getting too much into the specifics of the technology and not being general enough in making sure we capture where it could go. That is one example.

Secondly, as my noble friend Lady Berridge would say, I am not sure that we have got to the bottom of whether this Bill, coupled with the existing body of criminal law, will really enable law enforcement officers to progress the cases as they see fit and protect vulnerable women—and men—in the digital world. I very much hope we can extend the conversation there. We perhaps risk getting too close to the technical specifics if we are thinking about whether a haptic suit is in or out of scope of the Bill; I am certain that there will be other technologies that we have not even thought about yet that we will want to make sure that the Bill can capture.

I very much welcome the spirit in which this debate has been held. When I said that I would do this for the noble Baroness, Lady Kidron, I did not realise quite what a huge debate we were opening up, but I thank everyone who has contributed and beg leave to withdraw the amendment.

Amendment 125 withdrawn.
Amendment 125A not moved.
Clause 49 agreed.
Clause 50: “Recognised news publisher”
Amendment 126 not moved.
Amendment 126A
Moved by
126A: Clause 50, page 48, line 31, at end insert “, and
(iii) is not a sanctioned entity (see subsection (3A)).”
Member’s explanatory statement
The effect of this amendment, combined with the next amendment in the Minister’s name, is that any entity which is designated for the purposes of sanctions regulations is not a “recognised news publisher” under this Bill, with the result that the Bill’s protections which relate to “news publisher content” don’t apply.
Amendment 126A agreed.
Amendment 127 not moved.
Amendment 127A
Moved by
127A: Clause 50, page 49, line 9, at end insert—
“(3A) A “sanctioned entity” is an entity which—
(a) is designated by name under a power contained in regulations under section 1 of the Sanctions and Anti-Money Laundering Act 2018 that authorises the Secretary of State or the Treasury to designate persons for the purposes of the regulations or of any provisions of the regulations, or
(b) is a designated person under any provision included in such regulations by virtue of section 13 of that Act (persons named by or under UN Security Council Resolutions).”
Member’s explanatory statement
The effect of this amendment, combined with the preceding amendment in the Minister’s name, is that any entity which is designated for the purposes of sanctions regulations is not a “recognised news publisher” under this Bill, with the result that the Bill’s protections which relate to “news publisher content” don’t apply.
Amendment 127A agreed.
Clause 50, as amended, agreed.
Clause 51 agreed.
Clause 52: Restricting users’ access to content
Amendments 127B and 127C
Moved by
127B: Clause 52, page 50, line 23, after second “the” insert “voluntary”
Member’s explanatory statement
This amendment and the next amendment in the Minister’s name ensure that restrictions on a user’s access to content resulting from the user voluntarily activating any feature of a service do not count as restrictions on users’ access for the purposes of Part 3 of the Bill.
127C: Clause 52, page 50, line 25, leave out from “service” to “, or” in line 26 and insert “(for example, features, functionalities or settings included in compliance with the duty set out in section 12(2) or (6) (user empowerment))”
Member’s explanatory statement
This amendment and the previous amendment in the Minister’s name ensure that restrictions on a user’s access to content resulting from the user voluntarily activating any feature of a service do not count as restrictions on users’ access for the purposes of Part 3 of the Bill.
Amendments 127B and 127C agreed.
Clause 52, as amended, agreed.
Clause 53: “Illegal content” etc
Amendments 128 to 130 not moved.
Clause 53 agreed.
Schedule 5 agreed.
Schedule 6: Child sexual exploitation and abuse offences
Amendments 131 to 133 not moved.
Schedule 6 agreed.
House resumed. Committee to begin again not before 2.19 pm.
Committee (9th Day) (Continued)
16:46
Clause 68: Transparency reports about certain Part 3 services
Amendment 160A
Moved by
160A: Clause 68, page 62, line 23, leave out paragraph (d) and insert—
“(d) be made publicly available, subject to appropriate redactions, on the date specified in the notice.”
Member’s explanatory statement
This amendment makes clear that Ofcom guidance under Clause 66 must outline how a platform’s terms of service would be considered “adequate and appropriate”, as required under a new Clause in the name of Lord Stevenson of Balmacara.
Lord Knight of Weymouth (Lab)

My Lords, as we have said many times, this is a complex Bill. As we reflect on the priorities for Report, we can be more relaxed about some of the specifics on how Ofcom may operate, thereby giving it more flexibility—the flexibility it needs to be agile in the online world—if we as a Parliament trust Ofcom. Building trust, I believe, is a triangulation. First, there is independence from government—as discussed in respect of Secretary of State powers. Secondly, we need proper scrutiny by Parliament. Earlier today I talked about my desire for there to be proper post-legislative scrutiny and a permanent Joint Committee to do that. The third leg of the stool is the transparency to assist that scrutiny.

Clause 68 contains the provisions which would require category 1, 2A and 2B services to produce an annual transparency report containing information described by Ofcom in a notice given to the service. Under these provisions, Ofcom would be able to require these services to report on, among other things: information about the incidence of illegal content and content that is harmful to children; how many users are assumed to have encountered this content by means of the service; the steps and processes for users to report this content; and the steps and processes which a provider uses for dealing with this content.

We welcome the introduction of transparency reporting in relation to illegal content and content that is harmful to children. We agree with the Government that effective transparency reporting plays a crucial role in building Ofcom’s understanding of online harms and empowering users to make a more informed choice about the services they use.

However, despite the inclusion of transparency reporting in the Bill representing a step in the right direction, we consider that these requirements could and should be strengthened to do the trust building we think is important. First, the Bill should make clear that, subject to appropriate redactions, companies will be required to make their transparency reports publicly available—to make them transparent—hence Amendment 160A.

Although it is not clear from the Bill whether companies will be required to make these reports publicly available, we consider that, in most instances, such a requirement would be appropriate. As noted, one of the stated purposes of transparency reporting is that it would enable service users to make more informed choices about their own and their children’s internet use—but they can only do so if the reports are published. Moreover, in so far as transparency reporting would facilitate public accountability, it could also act as a powerful incentive for service providers to do more to protect their users.

We also recognise that requiring companies to publish the incidences of CSEA content on their platforms, for instance, may have the effect of encouraging individuals seeking such material towards platforms on which there are high incidences of that content—that must be avoided. I recognise that simply having a high incidence of CSEA content on a platform does not necessarily mean that that platform is problematic; it could just mean that it is better at reporting it. So, as ever with the Bill, there is a balance to be struck.

Therefore, we consider that the Bill should make it explicit that, once provided to Ofcom, transparency reports are to be made publicly available, subject to redactions. To support this, Ofcom should be required to produce guidance on the publication of transparency reports and the redactions that companies should make before making reports publicly accessible. Ofcom should also retain the power to stop a company from publishing a particular transparency report if it considers that the risk of directing individuals to illegal materials outweighs the benefit of making a report public—hence Amendments 160B and 181A.

Amendments 165 and 229 are in my noble friend Lord Stevenson’s name. Amendment 165 would broaden the transparency requirements around user-to-user services’ terms of service, ensuring that information can be sought on the scope of these terms, not just their application. As I understand it, scope is important to understand, as it is significant in informing Ofcom’s regulatory approach. We are trying to guard against minimal terms of service where detail is needed for users and Ofcom.

The proposed clause in Amendment 229 probes how Ofcom will review the effectiveness of the transparency requirements in the Bill. It would require Ofcom to undertake a review of the effectiveness of transparency reports within three years and every five years thereafter, and it would give the Secretary of State powers to implement any recommendations made by the regulator. The Committee should note that we also include a requirement that a Select Committee, charged by the relevant House, must consider and report on the regulations, with an opportunity for Parliament to debate them. So we link the three corners of the triangle rather neatly there.

If we agree that transparency is an important part of building trust in Ofcom in doing this difficult and innovative regulatory job—it is always good to see the noble Lord, Lord Grade, in his place; I know he is looking forward to getting on with this—then this proposed clause is sensible. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I am pleased that the noble Lord, Lord Knight of Weymouth, has given us an opportunity to talk about transparency reports with these amendments, which are potentially a helpful addition to the Bill. Transparency is one of the huge benefits that the legislation may bring. One of the concerns that the public have and that politicians have always had with online platforms is that they appear to be a black box—you cannot see what is going on in them.

In the entire edifice that we are constructing in the Online Safety Bill, there are huge opportunities to change that. The platforms will have to do risk assessments —there are measures in the Bill to make sure that information about these is put out—and they will have to take active steps to mitigate any risks they find. Again, we may get directions and guidance from Ofcom that will explain to the public exactly what is expected of them. The final piece of the jigsaw is the transparency reports that show the outcomes—how a platform has performed and what it has done to meet its obligations in dealing with content and behaviour on its services.

For the record, I previously worked for one of the platforms, and I would have said that I was on the pro-transparency wing of the transparency party inside the company. I believed that it was in the platform’s interest: if you do not tell people what you are doing, they will make things up about you, and what they make up will generally be worse than what you are actually doing. So there are huge advantages to the platforms from being transparent.

The noble Lord, Lord Knight, has picked up on some important points in his Amendment 160B, which talks about making sure that the transparency report is not counterproductive by giving the bad guys information that they could use to ill effect. That is a valid point; it is often debated inside the platforms. Sometimes, I argued furiously with my colleagues in the platforms about why we should disclose information. They would ask, “What about the bad guys?” Sometimes I challenged that, but other times it would have been a genuine and accurate concern. The noble Lord mentioned things such as child sexual abuse material, and we have to recognise that the bad guys are incredibly devious and creative, and if you show them anything that they can use against you to get around your systems, they will try to do that. That is a genuine and valid concern.

The sort of thing that you might put into a transparency report is, for example, whether you have banned particular organisations. I would be in favour of indicating to the public that an organisation is banned, but you can see that the potential impact of that is that all the people you are concerned about would create another organisation with a different name and then get back on to your platform. We need to be alive to those kinds of concerns.

It is also relevant to Amendment 165 and the terms of service that the more granular and detailed your terms of service are, the better they are for public information, but there are opportunities to get around them. Again, we would have that argument internally. I would say, “If we are prohibiting specific hate speech terms, tell people that, and then they won’t use them”. For me, that would be a success, as they are not using those hate speech terms anymore, but, of course, they may then find alternative hate speech terms that they can use instead. You are facing that battle all the time. That is a genuine concern that I hope we will be able to debate. I hope that Ofcom will be able to mitigate that risk by discussing with platforms what these transparency reports should look like. In a sense, we are doing a risk assessment of the transparency report process.

Amendment 229 on effectiveness is really interesting. My experience was that if you did not have a transparency report, you were under huge pressure to produce one and that once you produced one, nobody was interested. For fear of embarrassing anyone in the Committee, I would be curious to know how many noble Lords participating in this debate have read the transparency reports already produced by Meta Platforms, Google and others. If they have not read them, they should not be embarrassed, because my experience was that I would talk to regulators and politicians about something they had asked me to come in to talk about, such as hate speech or child sexual abuse material, and I learned to print off the transparency report. I would go in and say, “Well, you know what we are doing; it’s in our transparency report”. They would ask, “What transparency report?”, and I would have to show them. So, having produced a transparency report, every time we published it, we would expect there to be public interest, but little use was made of it. That is not a reason not to do them—as I said, I am very much in favour of doing them—but, on their own, they may not be effective, and Amendment 229 touches on that.

I was trying to think of a collective noun for transparency reports and, seeing as they shed light, I think it may be a “chandelier”. Where we may get the real benefit is if Ofcom can produce a chandelier of transparency reports, taking all the information it gets from the different platforms, processing it and selecting the most relevant information—the reports are often too long for people to work their way through—so that it can enable comparisons. That is really good and it is quite good for the industry that people know that platform A did this, platform B did that, and platform C did something else. They will take note of that, compare with each other and want to get into the best category. It is also critical that Ofcom puts this into user-friendly language, and Ofcom has quite a good record of producing intelligible reports. In the context of Amendment 229, a review process is good. One of the things that might come out of that, thinking ahead, would be Ofcom’s role in producing meta transparency reports, the chandelier that will shed light on what the whole sector is doing.

Baroness Fox of Buckley (Non-Afl)

My Lords, for once I want to be really positive. I am actually very positive about this whole group of amendments because more transparency is essential in what we are discussing. I especially like Amendment 165 from the noble Lord, Lord Stevenson of Balmacara, because it is around terms of service for user-to-user services and ensures that information can be sought on the scope as well as the application. This is important because so much has been put on user-to-user services as well as on terms of service. You need to know what is going on.

I want particularly to compliment Amendment 229 that says that transparency reports should be

“of sufficient quality to enable service users and researchers to make informed judgements”,

et cetera. That is a very elegant way in which to say that they should not be gobbledegook. If we are going to have them, they should be clear and of a quality that we can read. Obviously, we do not want them to be unreadable and full of jargon and legalistic language. I am hoping that that is the requirement.

17:00
I am positive because I have been worried that so much depends on terms of service and how that can lead to the overremoval of content and, as we discussed the other day, there is no individual complaints mechanism in terms of Ofcom. So I am grasping for ways in which users can have a right of redress. Understanding why something has been taken down is very important. As the noble Lord, Lord Allan of Hallam, said, so much is hidden from users. People will constantly say, “My material has been deboosted”, and it might well have been. They will say things such as, “The algorithms are hiding content, even if they are not removing it”. I have noticed that people can get very paranoid, and it can fuel conspiracy theories because you get people saying, “Nobody has retweeted my tweet. This is a deboosting algorithm”, when actually they have not retweeted it because it was boring. If you could see a bit more clearly what the policies were instead of feeling that they are hidden from your view, it would lessen the paranoia and that kind of accusation.
My only caveat to this proposal relates to what the noble Lord, Lord Allan, said about the bad guys being banned and that, if they are banned, they might emerge somewhere else. We also need to recognise that sometimes people who are called the bad guys can be banned, and they are not the bad guys. They need to be able to say, “We’re not the bad guys”. That is why the more detail, the better. The only other caveat is that I do not want to be in a situation where we demand endless regulatory complexity and reports and impositions make life impossible for the services in terms of red tape and paperwork. That is my only caveat. Generally speaking, however, I am very positive about these amendments, and I hope that by Report they become, one way or another, part of the Bill.
Lord Clement-Jones (LD)

My Lords, I strongly support the amendment in the names of the noble Lords, Lord Knight and Lord Stevenson, as well as my noble friend Lady Featherstone. The essence of the message from the noble Lord, Lord Knight, about the need for trust and the fact that you can gain trust through greater transparency is fundamental to this group.

The Joint Committee’s report is now a historical document. It is partly the passage of time, but it was an extraordinary way in which to work through some of the issues, as we did. We were very impacted by the evidence given by Frances Haugen, and the fact that certain things came to light only as a result of her sharing information with the Securities and Exchange Commission. We said at the time that:

“Lack of transparency of service providers also means that people do not have insight into the prevalence and nature of activity that creates a risk of harm on the services that they use”.


That is very much the sense that the noble Lord, Lord Stevenson, is trying to get to by adding scope as well.

We were very clear about our intentions at the time. The Government accepted the recommendation that we made and said that they agreed with the committee that

“services with transparency reporting requirements should be required to publish their transparency reports in full, and in an accessible and public place”.

So what we are really trying to do is to get the Government to agree to what they have already agreed to, which we would have thought would be a relatively straightforward process.

There are some other useful aspects, such as the review of effectiveness of the transparency requirements. I very much appreciate what my noble friend just said about not reading transparency reports. I read the oversight reports but not necessarily the transparency reports. I am not sure that Frances Haugen was a great advert for transparency reports at the time, but that is a mere aside in the circumstances.

I commend my noble friend Lady Featherstone’s Amendment 171, which is very consistent with what we were trying to achieve with the code of practice about violence against women and girls. That would fit very easily within that. One of the key points that my noble friend Lord Allan made is that this is for the benefit of the platforms as well. It is not purely for the users. Of course it is useful for the users, but not exclusively, and this could be a way of platforms engaging with the users more clearly, inserting more fresh air into this. In these circumstances it is pretty conclusive that the Government should adhere to what they agreed to in their response to the Joint Committee’s report.

The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Viscount Camrose) (Con)

As ever, I thank all noble Lords who have spoken. I absolutely take, accept and embrace the point that transparency is wholly critical to what we are trying to achieve with the Bill. Indeed, the chandelier of transparency reports should be our shared aim—a greenhouse maybe. I am grateful for everyone’s contributions to the debate. I agree entirely with the views expressed. Transparency is vital in holding companies to account for keeping their users safe online. As has been pointed out, it is also to the benefit of the platforms themselves. Confident as I am that we share the same objectives, I would like to try to reassure noble Lords on a number of issues that have been raised.

Amendments 160A, 160B and 181A in the name of the noble Lord, Lord Knight of Weymouth, seek to require providers to make their transparency reports publicly available, subject to appropriate redactions, and to allow Ofcom to prevent their publication where it deems that the risks posed by drawing attention to illegal content outweigh the benefit to the public of the transparency report. Let me reassure the noble Lord that the framework, we strongly believe, already achieves the aim of those amendments. As set out in Clause 68, Ofcom will specify a range of requirements in relation to transparency reporting in a notice to category 1, 2A and 2B services. This will include the kind of information that is required in the transparency report and the manner in which it should be published. Given the requirement to publish the information, this already achieves the intention of Amendment 160A.

The specific information requested for inclusion within the transparency report will be determined by Ofcom. Therefore, the regulator will be able to ensure that the information requested is appropriate for publication. Ofcom will take into account any risks arising from making the information public before issuing the transparency notice. Ofcom will have separate information-gathering powers, which will enable the regulator to access information that is not suitable to be published in the public domain. This achieves the intention of Amendment 160B. There is also a risk of reducing trust in transparency reporting if there is a mechanism for Ofcom to prevent providers publishing their transparency reports.

Amendment 181A would require Ofcom to issue guidance on what information should be redacted and how this should be done. However, Ofcom is already required to produce guidance about transparency reports, which may include guidance about what information should be redacted and how to do this. It is important to provide the regulator with the flexibility to develop appropriate guidance.

Amendment 165 seeks to expand the information within the transparency reporting requirements to cover the scope of the terms of service set out by user-to-user providers. I very much agree with the noble Lord that it is important that Ofcom can request information about the scope of terms of service, as well as about their application. Our view is that the Bill already achieves this. Schedule 8 sets out the high-level matters about which information may be required. This includes information about how platforms are complying with their duties. The Bill will place duties on user-to-user providers to ensure that any required terms of service are clear and accessible. This will require platforms to set out what the terms of service cover—or, in other words, the scope. While I hope that this provides reassurance on the matter, if there are still concerns in spite of what I have said, I am very happy to look at this. Any opportunity to strengthen the Bill through that kind of clarity is worth looking at.

Lord Stevenson of Balmacara (Lab)

I welcome the Minister’s comments. I am interrupting just because this is my amendment rather than my noble friend Lord Knight’s. The word “scope” caused us some disquiet on this Bench when we were trying to work out what we meant by it. It has been fleshed out in slightly different ways around the Chamber, to advantage.

I go back to the original intention—I am sorry for the extensive introduction, but it is to make sure that I focus the question correctly—which was to make sure that we are not looking historically at the terms of reference that have been issued, and whether they are working in a transparency mode, but addressing the question of what is missing or is perhaps not addressed properly. Does the Minister agree that that would be taken in by the word “scope”?

Viscount Camrose (Con)

I think I probably would agree, but I would welcome a chance to discuss it further.

Finally, Amendment 229 intends to probe how Ofcom will review the effectiveness of transparency requirements in the Bill. It would require Ofcom to produce reports reviewing the effectiveness of transparency reports and would give the Secretary of State powers to implement any recommendations made by the regulator. While I of course agree with the sentiment of this amendment, as I have outlined, the transparency reporting power is designed to ensure that Ofcom can continuously review the effectiveness of transparency reports and make adjustments as necessary. This is why the Bill requires Ofcom to set out in annual transparency notices what each provider should include in its reports and the format and manner in which it should be presented, rather than putting prescriptive or static requirements in the Bill. That means that Ofcom will be able to learn, year on year, what will be most effective.

Under Clause 145, Ofcom is required to produce its own annual transparency report, which must include a summary of conclusions drawn from providers’ transparency reports, along with the regulator’s view on industry best practice and other appropriate information—I hope and think that goes to some of the points raised by the noble Lord, Lord Allan of Hallam.

Lord Knight of Weymouth (Lab)

My Lords, just before the Minister moves on—and possibly to save me finding and reading it—can he let us know whether those annual reports by Ofcom will be laid before Parliament and whether Parliament will have a chance to debate them?

17:15
Viscount Camrose (Con)

I believe so, but I will have to confirm that in writing. I am sorry not to be able to give a rapid answer.

Clause 159 requires the Secretary of State to review the operation of the regulatory framework as a whole to ensure it is effective. In that review, Ofcom will be a statutory consultee. The review will specifically require an assessment of the effectiveness of the regulatory framework in ensuring that the systems and processes used by services provide transparency and accountability to users.

The Bill will create what we are all after, which is a new culture of transparency and accountability in the tech sector. For the reasons I have laid out, we are confident that the existing provisions are sufficiently broad and robust to provide that. As such, I hope the noble Lord feels sufficiently reassured to withdraw the amendment.

Lord Knight of Weymouth (Lab)

My Lords, that was a good, quick debate and an opportunity for the noble Viscount to put some things on the record, and explain some others, which is helpful. It is always good to get endorsement around what we are doing from both the noble Lord, Lord Allan, and the noble Baroness, Lady Fox. That is a great spread of opinion. I loved the sense of the challenge as to whether anyone ever reads the transparency reports whenever they are published; I imagine AI will be reading and summarising them, and making sure they are not written as gobbledygook.

On the basis of what we have heard and if we can get some reassurance that strong transparency is accompanied by strong parliamentary scrutiny, then I am happy to withdraw the amendment.

Amendment 160A withdrawn.
Amendment 160B not moved.
Clause 68 agreed.
Amendment 161 not moved.
Schedule 8: Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services
Amendments 162 to 181 not moved.
Schedule 8 agreed.
Clause 69: OFCOM’s guidance about transparency reports
Amendment 181A not moved.
Clause 69 agreed.
Amendment 182 not moved.
Clause 70: “Pornographic content”, “provider pornographic content”, “regulated provider pornographic content”
Amendments 183 and 183ZA not moved.
Clause 70 agreed.
Clause 71: Scope of duties about regulated provider pornographic content
Amendment 183A not moved.
Clause 71 agreed.
Clause 72: Duties about regulated provider pornographic content
Amendments 183B to 185 not moved.
Clause 72 agreed.
Clause 73 agreed.
Amendment 185A
Moved by
185A: After Clause 73, insert the following new Clause—
“Duties on providers of online marketplace services
(1) This section sets out duties that apply in relation to providers of online marketplace services.
(2) A duty to put in place proportionate systems and processes to prevent child users from encountering listings of knives for sale on the platform, including (where appropriate) excluding relevant listings from advertising or other algorithms.
(3) A duty to put in place proportionate systems and processes to identify and remove listings of knives or similar products which are marketed in a manner which would reasonably appear to a user to—
(a) promote violence or threatening behaviour,
(b) encourage self-harm, or
(c) look menacing.
(4) A duty to put in place proportionate systems and processes to ensure, beyond reasonable doubt, that any purchaser of a knife meets or exceeds the minimum legal age for purchasing such items.
(5) For the purposes of this section, the online marketplace may have regard to different age restrictions in different parts of the United Kingdom.
(6) For the purposes of subsection (3)(c), a knife may look menacing if it is, or appears to be similar to, a “zombie knife”, “cyclone knife” or machete.
(7) In this section, “online marketplace service” means a service using software, including a website, part of a website or an application, operated by or on behalf of a trader, which allows consumers to conclude distance contracts with other traders or consumers.”
Member’s explanatory statement
This new Clause would introduce duties on online marketplaces to limit child access to listings of knives, and to take proactive steps to identify and remove any listings of knives or similar products which refer to violence or self-harm. While online sales of knives are not illegal, under-18s (under-16s in Scotland) should not be able to purchase them.
Lord Bassam of Brighton (Lab)

My Lords, I have Amendments 185A and 268AA in this group. They are on different subjects, but I will deal with them in the same contribution.

Amendment 185A is a new clause that would introduce duties on online marketplaces to limit child access to listings of knives and take proactive steps to identify and remove any listings of knives or products such as ornamental zombie knives that are suggestive of acts of violence or self-harm. I am sure the Minister will be familiar with the Ronan Kanda case that has given rise to our bringing this amendment forward. The case is particularly horrible; as I understand it, sentencing is still outstanding. Two young boys bought ninja blades and machetes online and ultimately killed another younger boy with them. It has been widely featured in news outlets and is particularly distressing. We have had some debate on this in another place.

As I understand it, the Government have announced a consultation on this, among other things, looking at banning the sale of machetes and knives that appear to have no practical use other than being designed to look menacing or suitable for combat. We support the consultation and the steps set out in it, but the amendment provides a chance to probe the extent to which this Bill will apply to the dark web, where a lot of these products are available for purchase. The explanatory statement contains a reference to this, so I hope the Minister is briefed on the point. It would be very helpful to know exactly what the Government’s intention is on this, because we clearly need to look at the sites and try to regulate them much better than they are currently regulated online. I am especially concerned about the dark web.

The second amendment relates to racist abuse; I have brought the subject before the House before, but this is rather different. It is a bit of a carbon copy of Amendment 271, which noble Lords have already debated. It is there for probing purposes, designed to tease out exactly how the Government see public figures, particularly sports stars such as Marcus Rashford and Bukayo Saka, and how they think they are supposed to deal with the torrents of racist abuse that they receive. I know that there have been convictions for racist content online, but most of the abuse goes unpunished. It is not 100% clear that much of it will be identified and removed under the priority offence provisions. For instance, does posting banana emojis in response to a black footballer’s Instagram post constitute an offence, or is it just a horrible thing that people do? We need to understand better how the law will act in this field.

There has been a lot of debate about this issue, it is a very sensitive matter and we need to get to the bottom of it. A year and a half ago, the Government responded to my amendment bringing online racist abuse into the scope of what is dealt with as an offence, which we very much welcomed, but we need to understand better how these provisions will work. I look forward to the Minister setting that out in his response. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I rise to speak primarily to the amendments in the name of my noble friend Lord Clement-Jones, but I will also touch on Amendment 268AA at the same time. The amendments that I am particularly interested in are Amendments 200 and 201 on regulatory co-operation. I strongly support the need for this, and I will illustrate that with some concrete examples of why this is essential to bring to life the kinds of challenges that need to be dealt with.

The first example relates to trying to deal with the sexual grooming of children online, where platforms are able to develop techniques to detect it. They can do that by analysing the behaviour of users and trying to detect whether older users are consistently trying to approach younger users, and the kind of content of the messages they may be sending to them where that is visible. These are clearly highly intrusive techniques. If a platform is subject to the general data protection regulation, or the UK version of that, it needs to be very mindful of privacy rights. We clearly have, there, two potentially interested bodies in the UK environment. We have the child protection agencies, and we will have, in future, Ofcom seeking to ensure that the platform has met its duty of care, and we will have the Information Commissioner’s Office.

A platform, in a sense, can be neutral as to what it is instructed to do by the regulator. Certainly, my experience was that the platforms wanted to do those kinds of activities, but they are neutral in the sense that they will do what they are told is legal. There, you need clarity from the regulators together to say, “Yes, we have looked at this and you are not going to do something on the instruction of the child safety agency and then get criticised, and potentially fined, by the Data Protection Agency for doing the thing you have been instructed to do”—so we need those agencies to work together.

The second example is in the area of co-operation around anti-terrorism, another key issue. The platforms have created something called the Global Internet Forum to Counter Terrorism. Within that forum, they share tools and techniques—things such as databases of information about terrorist content and systems that you can use to detect them—and you are encouraged within that forum to share those tools and techniques with smaller platforms and competitors. Clearly, again, there is a very significant set of questions, and if you are in a discussion around that, the lawyers will say, “Have the competition lawyers cleared this?” Again, therefore, something that is in the public interest—that all the platforms should be using similar kinds of technology to detect terrorist content—is something where you need a view not just from the counterterrorism people but also, in our case, from the Competition and Markets Authority. So, again, you need those regulators to work together.

The final example is one which I know is dear to the heart of the noble Baroness, Lady Morgan of Cotes: fraudsters, which we have dealt with. You might have patterns of behaviour where information comes from the telecoms companies regulated by Ofcom, the internet service providers, regulated by Ofcom, and financial institutions, regulated by their own family of regulators—and they may want to share data with each other, which is something that is subject to the Information Commissioner’s Office again. So, again, if we are going to give platforms instructions, which we rightly do in this legislation, and say, “Look, we want you to get tougher on online fraudsters; we want you to demonstrate a duty of care there”, the platforms will need those regulators—certainly the financial regulators, Ofcom and the Information Commissioner’s Office—to sort those things out.

Having a forum such as the one proposed in Amendment 201, where these really difficult issues can be thrashed out and clear guidance can be given to online services, will be much more efficient than what sometimes happened in the past, where you had the left hand and the right hand of the regulatory world pulling you in different directions. I know that we have the Digital Regulation Cooperation Forum. If we can build on those institutions, it is essential and ideal that they have their input before the guidance is issued, rather than have a platform comply with guidance from regulator A and then get dinged by regulator B for doing the thing that they have been instructed to do.

That leads to the very sensible Amendment 201 on skilled persons. Again, Ofcom is going to be able to call in skilled persons. In an area such as data protection, that might be a data protection lawyer, but, equally, it might be that somebody who works at the Information Commissioner’s Office is actually best placed to give advice. Amendment 200—the first of the two that talks about skilled persons being able to come from regulators—makes sense.

Finally, I will touch on the issues raised in Amendment 268AA—I listened carefully and understand that it is a probing amendment. It raises some quite fundamental questions of principle—I suspect that the noble Baroness, Lady Fox, might want to come in on these—and it has been dealt with in the context of Germany and its network enforcement Act: I know the noble Lord, Lord Parkinson of Whitley Bay, can say that in the original German. That Act went in the same direction, motivated by similar concerns around hate speech.

17:30
This raises some fundamental questions about what we want from privacy law and what we want in terms of criminal prosecutions. There is a spectrum of offences, and for some I think we have accepted that platforms should report; on child sexual abuse material, platforms have a duty to report every incidence to the regulator. When it comes to threats to life, the expectation would be clear, so if you have knowledge—this happens—of an imminent terrorist attack or even of somebody who is about to commit suicide, it is clear that you should go to the police or the relevant authorities with that information. Then you have this broad spectrum of other criminal offences which may be problematic. I do not want to minimise the effect on people of hate speech crimes, but they are of a different order, shall we say, from threat-to-life cases, where I think reporting is broadly supported. We have to make a decision there.
My starting point is to be nervous about platforms acting in that policing capacity for offences that are not at the most extreme end of the spectrum. Individuals who are worried about that activity can go to the police directly themselves and can generally take the content to the police—literally; they can print it off—who can make a judgment about whether to go to the Crown Prosecution Service. I worry about the platforms doing it partly from a constitutional point of view, because I am not sure that I want them acting in that quasi-legal capacity, but also, frankly, from a volume point of view. The risk is that if you put this duty on a platform, because it is really hard to understand what is criminal hate speech and what is merely hateful hate speech, the temptation will be to send everything over. If you do that, first, you have a greater violation of privacy, and secondly, you probably have not helped the police, because they get swamped with reports that they cannot manage.
I hope that is a helpful counterargument to the idea that platforms should automatically report material. However, I recognise that it leaves an open question. When people engage in that kind of behaviour online and it has serious real-world consequences, how do we make sure that they do not feel that it is consequence-free—that they understand that there are consequences? If they have broken the law, they should be prosecuted. There may be something in streamlining the process where a complainant goes to the police and the police are able to access the information they need, having first assessed that it is worth prosecuting and illegal, so that we make that loop work first before we head in the direction of having platforms report content en masse because they believe it may have violated laws where we are not at that most serious end of the spectrum.
Baroness Fox of Buckley (Non-Afl)

My Lords, so few of us are involved in this discussion that we are now able to write each other’s speeches. I thank the noble Lord, Lord Allan of Hallam, for articulating some of my concerns, probably more elegantly than I will myself. I will focus on two amendments in this group; in fact, there are lots of interesting things, but I will focus on both the amendments from the noble Lord, Lord Bassam of Brighton.

On the issue of proactive steps to remove listings of knives for young people, I am so sympathetic to this because in a different area of my life I am pretty preoccupied with the problem of knife crime among young people. It really bothers me and I worry about how we tackle it. My concern of course is that the police should be working harder to solve that problem and that we cannot anticipate that the Bill will solve all social problems. There is a danger of removing the focus from law enforcement in a real-world problem, as though removing how you buy the knife is the issue. I am not convinced that that helps us.

I wanted to reflect on the kind of dilemmas I am having around this in relation to the story of Mizzy that is doing the rounds. He is the 18 year-old who has been posting his prank videos on TikTok and has caused quite a stir. People have seen him wandering into strangers’ homes uninvited, asking random people in the street if they want to die, running off with an elderly lady’s dog and making fun of Orthodox Jews—generally speaking, this 18 year-old is obnoxious. His TikTok videos have gone viral; everybody is discussing them.

This cruelty for kicks genre of filming yourself, showing your face full to the camera and so on, is certainly abhorrent but, as with the discussion about knife crime, I have noticed that some people outside this House are attempting to blame the technology for the problem, saying that the videos should have been removed earlier and that it is TikTok’s fault that we have this anti-social behaviour, whereas I think it is a much deeper, broader social problem to do with the erosion of adult authority and the reluctance of grown-ups to intervene clearly when people are behaving badly—that is my thesis. It is undoubtedly a police matter. The police seem to have taken ages to locate Mizzy. They eventually got him and charged him with very low offences, so he was on TV being interviewed the other evening, laughing at how weak the law was. Under the laws he was laughing at, he could freely walk into somebody’s house or be obnoxious and get away with it. He said, “We can do what we want”. That mockery throws up problems, but I do not necessarily think that the Bill is the way to solve it.

That leads me to my concerns about Amendment 268AA, because Mizzy was quoted in the Independent newspaper as saying:

“I’m a Black male doing these things and that’s why there’s such an uproar”.


I then went on a social media thread in which any criticism of Mizzy’s behaviour was described as racist harassment. That shows the complexity of what is being called for in Amendment 268AA, which wants platforms to take additional steps

“to combat incidents of online racially aggravated harassment”.

My worry is that we end up with not only Mizzy’s TikTok videos being removed but his critics being removed for racially harassing him, so we have to be very careful here.

Amendment 268AA goes further, because it wants tech companies to push for prosecution. I really think it is a dangerous step to encourage private companies to get tangled up in deciding what is criminal and so on. The noble Lord, Lord Allan, has exactly described my concerns, so I will not repeat them. Maybe I can probe this probing amendment. It also broadens the issue to all forms of harassment.

By the way, the amendment’s explanatory statement mentions the appalling racist abuse aimed at footballers and public figures, but one of the fascinating things was that when we number-crunched and went granular, we found that the majority of that racist abuse seemed to have been generated by bots, which takes us to the position of the noble Lord, Lord Knight, earlier: who would you prosecute in that instance? Bots not even based in the UK were generating what was assumed to be an outbreak of racist abuse among football fans in the UK, but the numbers did not equate to that. There were some people being racist and vile and some things that were generated in these bot farms.

To go back to the amendment, it goes on to broaden the issue out to

“other forms of harassment and threatening or abusive behaviour”.

Again, this is much more complicated in today’s climate, because those kinds of accusation can be deployed for bad faith reasons, particularly against public figures.

We have an example close to this House. I hope that Members have been following and will show solidarity over what has been happening to the noble Baroness, Lady Falkner of Margravine, who is chair of the Equality and Human Rights Commission and tasked with upholding the equality law but is at the centre of a vicious internal row after her officials filed a dossier of complaints about her. They have alleged that she is guilty of harassment. A KC is being brought in, there are 40 complaints and the whole thing is costing a fortune for both taxpayers and the noble Baroness herself.

It coincided with the noble Baroness, Lady Falkner, advising Ministers to update the definition of sex in the Equality Act 2010 to make clear that it refers to biological sex and producing official advice clarifying that trans women can be lawfully excluded from female-only spaces. We know how toxic that whole debate is.

Many of us feel that a lot of the accusations against the noble Baroness are ideologically and politically motivated vexatious complaints. I am distressed to read newspaper reports that say that she has been close to tears and has asked why anyone would go into public service. All this is for the crime of being a regulator upholding and clarifying the law. I hope it does not happen to the person who ends up regulating Ofcom—ending up close to tears as he stands accused of harassment, abusive behaviour and so on.

The point is that she is the one being accused of harassment. I have seen the vile abuse that she has received online. It is completely defamatory, vicious abuse and yet somehow it ends up being that, because she does not provide psychological safety at work and because of her views, she is accused of harassment and is the one in the firing line. I do not want us to introduce that kind of complexity—this is what I have been worried about throughout—into what is banned, removed or sent to the police as examples of harassment or hate crime.

I know that is not the intention of these amendments; it is the unintended consequences that I dread.

Baroness Bennett of Manor Castle (GP)

My Lords, I will speak chiefly to Amendment 262 in my name, although in speaking after the noble Baroness, Lady Fox, who suggested that the grown-ups should control anti-social behaviour by young people online, I note that there is a great deal of anti-social behaviour online from people of all ages. This is relevant to my Amendment 262.

It is a very simple amendment and would require the Secretary of State to consult with young people by means of an advisory board consisting of people aged 25 and under when reviewing the effectiveness and proportionality of this legislation. This amendment is a practical delivery of some of the discussion we had earlier in this Committee when we were talking about including the Convention on the Rights of the Child in the Bill. There is a commonly repeated phrase, “Nothing about us without us”. It was popularised by disability activists in the 1990s, although in doing a little research for this I found that it originates in Latin in Poland in the 15th century. So it is an idea that has been around for a long while and is seen as a democratic standard. It is perhaps a variation of the old “No taxation without representation”.

This suggestion of an advisory board for the Secretary of State is because we know from the discussion earlier on the children’s rights amendments that globally one in three people online is a child under the age of 18. This comes to the point of the construction of your Lordships’ House. Most of us are a very long way removed in experiences and age—some of us further than others. The people in this Committee thinking about a 12 year-old online now are parents, grandparents and great-grandparents. I venture to say that it is very likely that the Secretary of State is at least a generation older than many of the people who will be affected by the Bill’s provisions.

This reflects something that I also did on the Health and Care Bill. To introduce an advisory panel of young people reporting directly to the Secretary of State would ensure a direct voice for legislation that particularly affects young people. We know that under-18s across the UK do not have any role in elections to the other place, although 16 and 17 year-olds have a role in other elections in Wales and Scotland now. This is really a simple, clear, democratic step. I suspect the Minister might be inclined to say, “We are going to talk to charities and adults who represent children”. I suggest that what we really need here is a direct voice being fed in.

I want to reflect on a recent comment piece in the Guardian that made a very interesting argument: that there cannot be, now or in the future, any such thing as a digital native. Think of the experience of someone 15 or 20 years ago; yes, they already had the internet but it was a very different beast to what we have now. If we refer back to some of the earlier groups, we were starting to ask what an internet with widespread so-called generative artificial intelligence would look like. That is an internet which is very different from even the one that a 20 year-old is experiencing now.

It is absolutely crucial that we have that direct voice coming in from young people with experience of what it is like. They are the experts on what it is like to be a 12 year-old, a 15 year-old or a 20 year-old now, in a way that no one else can possibly be, so that is my amendment.

17:45
I will briefly comment on a couple of other amendments in this group. I am really hoping that the Minister is going to say that the Government will agree with the amendments that replace the gendered term “chairman” with “chair”. I cannot imagine why we are still writing legislation in 2023 with such gendered terms.
I also want to comment on the amendments from the noble Lord, Lord Stevenson of Balmacara, whom we have not heard from yet. They are Amendments 202ZA and 210A, both of which refer to “journalistic material” and sources. What I want to put on record relates to the Minister’s response to my comments on journalistic sources and encryption on day 3 in Committee. He said then that
“there is no intention or expectation that the tools required to be used under this power would result in a compromising of those sources”.—[Official Report, 27/4/23; col. 1325.]
He was referring to journalistic sources. I have had a great number of journalists and their representatives reaching out to me, pointing to the terms used by the Minister. They have said that if those algorithms, searches or de-encryption tools are let loose, there is no way of being able to say, “That’s a bit of journalism, so the tool’s not going to apply to it”. That simply does not add up. The amendments in this group are getting into that much broader issue, so I look forward to hearing from the noble Lord, Lord Stevenson, on them.
Lord Clement-Jones (LD)

My Lords, this is the most miscellaneous of all the groups that we have had, so it has rightly been labelled as such—and the competition has been pretty strong. I want to come back to the amendments of the noble Lord, Lord Stevenson, and of the noble Lord, Lord Bassam, but first I want to deal with my Amendments 200 and 201 and to put on the record the arguments there.

Again, if I refer back to our joint report, we were strongly of the view—alongside the Communications and Digital Committee—that there should be a statutory requirement for regulators

“to cooperate and consult with one another”.

Although we welcomed the formation of the DRCF, it seemed to us that there should be a much firmer duty. I was pleased to hear the examples that my noble friend put forward of the kinds of co-operation that will be needed. The noble Baroness, Lady Morgan, clearly understands that, particularly in the area of fraud, it could be the FCA or ICO, and it could be Ofcom in terms of social media. There is a range of aspects to this—it could be the ASA.

These bodies need to co-operate. As my noble friend pointed out, they can apparently conflict; therefore, co-operating on the way that they advise those who are subject to regulation is rather important. It is not just about the members of the Digital Regulation Cooperation Forum. Even the IWF and the ASA could be included in that, not to mention other regulators in this analogous space. That forum has rightly been labelled as “Digital”, and digital business is now all-pervasive and involves a huge number of regulatory aspects.

Although in this context Ofcom will have the most relevant powers and expertise, and many regulators will look to it for help in tackling online safety issues, effective public protection will be achieved through proper regulatory co-operation. Therefore, Ofcom should be empowered to co-operate with others to share information. As much as it can, Ofcom should be enabled to work with other regulators and share online safety information with them.

It has been very heartening to see the noble Lord, Lord Grade, in his place, even on a Thursday afternoon, and heartening to see how Ofcom has engaged throughout the passage of the Bill. We know the skills that it is bringing on board, and with those skills we want it to bring other regulators into its work. It seems that Ofcom is taking the lead on those algorithmic understanding skills, but we need Ofcom to have the duty to co-operate with the other regulators on this as well.

Strangely, in Clause 103 the Bill gives Ofcom the general ability to co-operate with overseas regulators, but it is largely silent on co-operation with UK regulators. Indeed, the Communications Act 2003 limits the UK regulators with which Ofcom can share information, excluding the ICO, for example, which is rather perverse in these circumstances. However, the Bill has a permissive approach to overseas regulators so, again, it should extend co-operation and information-sharing in respect of online safety to include regulators overseeing the offences in Schedule 7 that we have spent some time talking about today—the enforcement authorities, for instance, those responsible for enforcing the offences in relation to priority harms to children and priority offences regarding adults. Elsewhere in regulation, the Financial Conduct Authority may have a general duty to co-operate. The reverse may be true, so that duty of co-operation will need to work both ways.

As my noble friend Lord Allan said, Amendment 200, the skilled persons provision, is very straightforward. It is just to give the formal power to be able to use the expertise from a different regulator. It is a very well-known procedure to bring skilled persons into inquiries, which is exactly what is intended there.

Both amendments tabled by the noble Lord, Lord Bassam, are rather miscellaneous too, but are not without merit, particularly Amendment 185A. Please note that I agree with the noble Baroness, Lady Fox. I 100% support the intention behind the amendment but wonder whether the Bill is the right vehicle for it. No doubt the Minister will answer regarding the scope and how practical it would be. I absolutely applaud the noble Lord for campaigning on this issue. It is extraordinarily important, because we have seen some tragic outcomes of these weapons being available for sale online.

Amendment 268AA, also tabled by the noble Lord, Lord Bassam, is entirely different. Our Joint Committee heard evidence from Edleen John of the FA and Rio Ferdinand about abuse online. It was powerful stuff. I tend to agree with my noble friend. We have talked about user empowerment, the tools for it and, particularly in the context of violence against women and girls, the need for a way to be able to report that kind of abuse or other forms of content online. This is a candidate for that kind of treatment. While platforms obviously need to prevent illegal content and have systems to prevent it and so on, having assessed risk in the way that we have heard about previously, I do not believe that expecting the platforms to pick it up and report it, turning them into a sort of proto-enforcer, is the most effective way. We have to empower users. I absolutely share the objectives set out.

Lord Bassam of Brighton (Lab)

My Lords, when I brought an amendment to a police Bill, my local football club said to me that it was anticipating spending something like £100,000 a year trying to create and develop filters, which were commercially available, to stop its footballers being able to see the abuse that they were getting online. It did that for a very sensible commercial reason because those footballers’ performance was affected by the abuse they got. I want to know how the noble Lord sees this working if not by having some form of intervention that involves the platforms. Obviously, there is a commercial benefit to providers of filters et cetera, but it is quite hard for those who have been victims to see a way to make this useful to them without some external form of support.

Lord Clement-Jones (LD)

I absolutely take what the noble Lord is saying, and I am not saying that the platforms do not have responsibility. Of course they do: the whole Bill is about the platforms taking responsibility with risk assessment, adhering to their terms of service, transparency about how those terms are operating, et cetera. It is purely on the question of whether they need to be reporting that content when it occurs. They have takedown responsibilities for illegal content or content that may be seen by children and so on, but it is about whether they have the duty to report to the police. It may seem a relatively narrow point, but it is quite important that we go with the framework. Many of us have said many times that we regret the absence of “legal but harmful” but, given where we are, we basically have to go with that architecture.

I very much enjoyed listening to the noble Baroness, Lady Bennett. There is no opportunity lost in the course of the Bill to talk about ChatGPT or GPT-4, and that was no exception. It means that we need to listen to how young people are responding to the way that this legislation operates. I am fully in favour of whatever mechanism it may be. It does not need to be statutory, but I very much hope that we do not treat this just as the end of the process but will see how the Bill works out and will listen and learn from experience, especially from young people, who are particularly vulnerable to much of the content and to the way that the algorithms on social media work.

Lord Clement-Jones (LD)

I am so sorry. With due respect to the noble Lord, Lord Stevenson, the noble Baroness, Lady Bennett, reminded me that his Amendments 202ZA and 210A, late entrants into the miscellaneous group, go very much with the grain of what we are trying to achieve in the area of encryption. We had quite a long debate about encryption on Clause 110. As ever, the noble Lord has rather cunningly produced something that I think will get us through the eye of the free speech needle. They are two very cunning amendments.

Lord Stevenson of Balmacara (Lab)

I thank the noble Lord for that. Free expression, my Lords, not free speech.

Lord Clement-Jones (LD)

Freedom of expression.

Lord Stevenson of Balmacara (Lab)

Yes, freedom of expression. That is right.

I will start where the noble Lord, Lord Clement-Jones, finished, although I want to come back and cover other things. This is a very complicated group. I do not think we can do it quickly, as each issue is important and is worth trying to take forward.

18:00
Amendments in my name that came very late have been included here. Unfortunately, we did not have time to degroup them. I think they would have been better on their own, but they are here, and we will have the debate. Amendments 202ZA and 210A look as if they have come from a very different place, but, as the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Bennett, have said, they are a continuation of the debate we were having a couple of days ago on encryption. They are proposed as a compromise.
I hope that the end result will be that I will not move them, but that we can have an offline meeting about them to see whether there is a way forward on this. We left the debate on the powers the Bill attempts to take on encryption in a slightly unbalanced place. It is clear that, for very good and persuasive reasons, where there may be criminality happening on an encrypted service, powers will have to be available to those responsible for prosecuting that criminal activity so that they can access the necessary evidence. We do not dispute that at all.
How do you do that when it is fully encrypted and breaking the encryption raises dangers and difficulties? How do you do it if you are relying on not fully tested technological solutions which require giving powers to Ofcom to commission and operate through the companies a procedure we do not yet know will exist? It may work for indecent images but almost certainly will not work for counterterrorism. How do you do it in a way which is guided by the principle that the ability to get the data, when the material has been transmitted in an encrypted form, should not be at the expense of freedom of expression? Therefore, a technical solution looks like a possible winner. It may be in the future, though I do not believe we are there yet, but we have been promised a meeting on this topic and I am looking forward to it.
Thinking again about this and having been in receipt of further correspondence from others outside who have been watching this debate very closely, the amendments to Clauses 110 and 112 are suggested as a compromise which might get us to that point. It takes us down a slightly different route—I am not sure that this has been explored with the Bill team and therefore we should have a meeting to discuss it—of trying to dig a bit deeper into what would constitute a reasonable ground for persuading those responsible for hosting encrypted material and arguing convincingly that there is evidence to break their rule of non-interference, at least to the stage of metadata. The key here is not the content itself, but the ability to reach back to those attempting to use encrypted systems for criminal or other illegal behaviour.
In that sense, what is proposed here is a requirement for Ofcom, if it is left with Ofcom. I still believe it would be better if there were a third-party review of that on a judicial level following the RIPA proposals. It is a suggestion which might command support in the industry: it allows for a definitional approach, making sure that we are talking about proper journalistic material, and allows that to be taken as a route forward, so that those who are concerned on the journalistic side would have reassurance that the material would not be accessed and used in a way detrimental to their activity and that it could be protected. I will not take it any further than that, unless others would like it. That is the purpose behind this amendment. I am sorry that it was late, and it should not really have been in this group, but it is good to have got it on the table. I hope it will feed into the discussions we are due to have.
Having said that, I will briefly go back to my noble friend Lord Bassam’s amendments. The noble Lord, Lord Clement-Jones, made a couple of the points I wanted to make, but I will reinforce them. I am particularly glad that my noble friend came, given that his throat is as bad as it is—I am sure that it is entirely due to his overenthusiastic support for his football team. No doubt he celebrated late into the night its success in getting into the Europa competition. I never use sporting metaphors, but I have to use one for my noble friend Lord Bassam.
On knives, I am pleased to see this here—we had a good response to it, and I look forward to the Minister’s response. I first came across this issue when I was relatively new in your Lordships’ House. I had a placement with the Met, over a period of time, to get to know how it worked and everything else. It was a fantastic experience that was organised well; anyone who has not done it should do it. It is a good way of learning a bit more about something that is clearly in the public consciousness at the moment.
One of my visits was to a group of young officers operating in and around Brixton. I spent three days there and experienced a riot that they had not anticipated, which was quite exciting. The main point was that we spoke a lot about knives and their role in society. The evidence I saw, in practice, was that this was a burgeoning problem that the police were not well equipped to deal with—this was five or six years ago. It was not for want of trying; it was just that the way the gang culture operated in Brixton, as I understood it, was that the responsibility for enrolling, for maintaining discipline and, subsequently, for operating a gang there was largely governed by rules well away from those recognised in civilised society. The methods of control were knives being placed into the bodies of persons who were being disciplined. The police had no way of coping with that.
Part of the problem with this, as my noble friend Lord Bassam mentioned, is the supply of very unpleasant weapons coming in, usually ordered through the dark web. Again, the police felt that they did not have the equipment, knowledge, skills or even the time to track them down. They were always chasing their tail and were never catching up—they could never keep ahead of it. A really important issue is buried in this amendment; we need to consider it more broadly and society needs to take account of it. If there is an issue within the Bill that should be addressed, it is that. We would like it discussed and hope it will be thought about and implemented if possible.
On Amendment 268AA, I will go back to what the noble Lord, Lord Clement-Jones, said about the evidence we received in the Joint Committee—it was extraordinarily powerful, particularly that from Rio Ferdinand but also that from others who accompanied him on that occasion—about the impact that the internet was having on the health and well-being of players, particularly those affected by abuse after games. He said—I am sure that he will not mind me referring to this—that, before the internet got to the point where it is now, there was still terrible abuse in the stadiums when you were playing, but, because it did not come with you when you left the stadium, you were able to relax, go home and get away from it. But, with the internet, you see it on your timeline and in tweets, and it is sent to you by your friends—and it became impossible and 24/7. It became a real burden, and he saw the impact on younger players—we have seen plenty of evidence of that.
When we reflected in the committee as a result of that evidence, we were working with a version of the Bill that had Clause 11, on legal but harmful content. We were trying to find ways to get a better sense and balance. The committee clearly said that it did not think that legal but harmful was an appropriate way forward, but we certainly also recognised that that meant that a process would have to be in place to deal with material that is not what society wishes to see circulating and influencing the process and young people in particular. We recognised then that there was a problem with having an online safety Act that does not require companies to act on misogynous abuse or hatred being stirred up against coloured or disabled people, to give but two examples of where the gap would emerge. Although we recommended that the legal but harmful clause should be removed, we said that there had to be
“a statutory requirement on providers to have in place proportionate systems and processes to identify and mitigate reasonably foreseeable risks of harm arising from regulated activities defined under the Bill”.
We are not there yet.
Amendment 268AA in the name of my noble friend Lord Bassam gets us a little into the issues of racially aggravated behaviour, harassment and other forms of abuse, and I am very interested to hear what the Government’s response to it will be. I am not sure that we have the tools yet in the Bill, or in the terms of reference approach that has been taken, that would allow that proposal to happen, but maybe the Minister will be able to help us with this when he responds.
I will touch on other amendments in this group. I hope that we will receive a positive response to the amendment from my noble friend Lady Merron, who unfortunately cannot be with us at the moment, in relation to instances of gendered language.
The amendments proposed and spoken to by the noble Lord, Lord Clement-Jones, are important in themselves, but also play to a bigger point, particularly Amendment 201. We do not have much down on this in relation to the question of how Ofcom will relate to other regulators, but the case he made was very persuasive. I hope that the Minister can say something about that. The idea is that we will muddle on with the existing arrangement of informal networking between very powerful regulators—each of whom will have, as the noble Lord said, sometimes conflicting rules about how things go—which can be brokered through a co-operation agreement. But it would be better if it were accompanied by a set of real powers to work together, including joint powers in cases where there are issues affecting both personal data and the impact that Ofcom will have in relation to companies’ operations. We should also recognise that there will be other regulators joining the club every year that will need to be involved and processed. Some basic understanding of the rules—maybe not in the Bill, but certainly forecast to be brought forward in a future piece of legislation—seems to be vital to give them the context with which they can begin to work together and from which we can learn the lessons that will be necessary when new powers are prepared. I am very supportive of Amendment 201, and I hope that there will be a positive thought about how we might take it forward.
The noble Lord, Lord Bethell, is not in his place so, presumably, will not speak to his Amendment 220D, which would give Ofcom powers to delegate some of its regulatory powers to another body. Other similar amendments are coming up later, so maybe that point will be picked up then. However, I will put on the record, as the noble Baroness, Lady Stowell, has said on other occasions, that there are problems with simply adding other regulators into what is a very powerful statutory body, particularly if they are not, in any way, public bodies themselves. With no disrespect to those currently working in them, I think that, where a charitable body or a private company is engaging with bodies such as Ofcom, there should be no question of statutory powers being delegated to them; that must not happen. A form of contract for particular work to be delivered under the control of the statutory body, Ofcom, is fine, but we should not be talking about coequal powers; that would be wrong.
Finally, I do not want to anticipate the Minister in introducing the amendments in his name, but we have no objections to them. I am sure that they will work exactly as he proposes and that they will be acceptable.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, this has been miscellany, indeed. We must be making progress if we are picking up amendments such as these. I thank noble Lords who have spoken to the amendments and the issues covered in them.

I turn first to Amendment 185A brought to us by the noble Lord, Lord Bassam of Brighton, which seeks to add duties on online marketplaces to limit children’s access to the sale of knives, and proactively to identify and remove listings which appear to encourage the sale of knives for the purposes of violence or self-harm. Tackling knife crime is a priority for His Majesty’s Government; we are determined to crack down on this violent scourge, which is devastating our communities. I hope that he will forgive me for not drawing on the case he mentioned, as it is still sub judice. However, I certainly take the point he makes; we are all too aware of cases like it up and down the country. I received an email recently from Amanda and Stuart Stephens, whose son, Olly, was murdered by two boys, one of whom was armed with a knife. All these cases are very much in our minds as we debate the Bill.

Let me try to reassure them and the noble Lord as well as other Members of the Committee that the Bill, through its existing duties and other laws on the statute book, already achieves what the noble Lord seeks with his amendment. The sale of offensive weapons and the sale of knives to people under the age of 18 are both criminal offences. Any online retailer which directly sells these prohibited items can already be held criminally liable. Once in force, the Bill will ensure that technology platforms, including online marketplaces, prevent third parties from using their platform to sell offensive weapons or knives to people under the age of 18. The Bill lists both these offences as priority offences, meaning that user-to-user services, including online marketplaces, will have a statutory obligation proactively to prevent these offences taking place on their services.

18:15
I am grateful to the noble Lord, Lord Stevenson of Balmacara, for his support for the government amendments. The Government are committed to ensuring that the regime set up by the Bill is cost-neutral to the taxpayer. As such, it will be funded via annual fees on regulated services, the revenue from which is at or above a set revenue threshold. At present, Ofcom is preparing for its new duties as regulator, and funding this by the retention of receipts under the Wireless Telegraphy Act 2006. Once the Bill that we are debating in this Committee is in place, Ofcom will charge fees to recoup this money alongside funding its ongoing costs. As the Bill is still before this Committee, Ofcom has not yet been granted the information-gathering powers which are necessary to prepare the fee regime. This means it cannot issue information requests for financial information from firms. As such, only when the Bill passes will Ofcom be able to begin implementing the fee regime.
In consideration of this, the Government have decided that fees should be charged from the financial year 2025-26 at the earliest. The decision does not affect implementation timings for any other areas of the regime. Amendments 186A to 186C will ensure that the fee regime functions in a fair and practical manner under these timings. The amendments ensure that the costs that Ofcom incurs while preparing for, and exercising, its online safety functions are met by the retention of receipts under the Wireless Telegraphy Act, up until the point when the fee regime is operational. They also ensure that these costs are recovered in a proportionate manner by extending the Schedule 10 recouping regime. I hope that that will have the support of this Committee.
Amendment 200 seeks to expand the definition of “skilled person” to include a regulator or self-regulatory body. I assure the noble Lord, Lord Allan of Hallam, that the Bill’s definition of a “skilled person” is already sufficiently broad to include a regulator or self-regulatory body. As set out in Clause 207(1), the Bill’s existing definition of “person” includes “any organisation or association of persons”. This means that the Bill’s definition of “skilled person” enables Ofcom to appoint an individual, organisation, body of persons or association of persons which appears to it to have the skills necessary to prepare a specific report. That includes a regulator or self-regulatory body, as the noble Lord’s amendment suggests.
On regulatory co-operation, I am conscious that I promised the noble Lord, Lord Russell of Liverpool, further details on this when he asked about it in an earlier grouping, so I shall set them out now so that he can consult them in the official record. I reassure the noble Lord and other noble Lords that Ofcom has strong existing relationships with domestic regulators. That has been supported by the establishment of the Digital Regulation Cooperation Forum, which we have discussed before. Effective regulatory co-ordination is essential for addressing the cross-cutting opportunities and challenges posed by digital technologies and services.
The creation of the forum was a significant step forward in delivering greater coherence at the institutional level and has been widely welcomed by industry and consumer representatives. Its formation has been particularly timely in bringing together the key regulators involved in the proposed new online safety, data and digital competition regimes. Its work has already delivered real and wide-ranging impact, including landmark policy statements clarifying the interactions between digital regulatory regimes, research into cross-cutting issues and horizon-scanning activities on new regulatory challenges. It is important to note that the Information Commissioner’s Office is a member of the forum. We will continue to assess how best to support collaboration between digital regulators and ensure that their approaches are joined up.
In addition, Ofcom already has a statutory footing to share information with UK regulators under the Communications Act 2003. Section 393 of that Act includes provisions for sharing information between Ofcom and other regulators in the UK, such as the Information Commissioner’s Office, the Financial Conduct Authority and the Competition and Markets Authority. So, we believe the issues set out in the amendment are covered.
Let me turn now to the cunning amendments from the noble Lord, Lord Stevenson, which seek to introduce special provisions to apply in cases where a notice issued under Clause 110 would involve the monitoring of journalistic material or material identifying journalistic sources. I appreciate the way he has set those out and I am very happy to have the more detailed discussion with the Bill team that he suggested. Let me just say, though, that the Government are fully committed to protecting the integrity of journalistic material and there is no intention that the technologies required under Clause 110 in relation to private communications would identify anything other than child sexual abuse and exploitation content. These powers are subject to strong safeguards to protect the privacy of all users. Any technologies required on private communications must be accredited by Ofcom as being highly accurate in detecting only child sexual exploitation and abuse content. These minimum standards of accuracy will be approved and published by the Secretary of State following advice from Ofcom and will ensure that it is highly unlikely that journalistic material that is not such content would be erroneously flagged or removed.
Lord Stevenson of Balmacara (Lab)

I am sorry to interrupt. The Minister has twice given a positive response, but he limited it to child sexual exploitation; he did not mention terrorism, which is in fact the bigger issue. Could he confirm that it is both?

Lord Parkinson of Whitley Bay (Con)

Yes, and as I say, I am happy to talk with the noble Lord about this in greater detail. Under the Bill, category 1 companies will have a new duty to safeguard all journalistic content on their platform, which includes citizen journalism. But I will have to take all these points forward with him in our further discussions.

My noble friend Lord Bethell is not here to move his Amendment 220D, which would allow Ofcom to delegate online safety regulatory duties under this legislation to other bodies. We have previously discussed a similar issue relating to the Internet Watch Foundation, so I shall not repeat the points that we have already made.

On the amendments on supposedly gendered language in relation to Ofcom advisory committees in Clauses 139 and 155, I appreciate the intention to make it clear that a person of either sex should be able to perform the role of chairman. The Bill uses the term “chairman” to be consistent with the terminology in the Office of Communications Act 2002, and we are confident that this will have no bearing on Ofcom’s decision-making on who will chair the advisory committees that it must establish, just as, I am sure, the noble Lord’s Amendment 56 does not seek to be restrictive about who might be an “ombudsman”.

I appreciate the intention of Amendment 262 from the noble Baroness, Lady Bennett of Manor Castle. It is indeed vital that the review reflects the experience of young people. Clause 159 provides for a review to be undertaken by the Secretary of State, and published and laid before Parliament, to assess the effectiveness of the regulatory framework. There is nothing in the existing legislation that would preclude seeking the views of young people either as part of an advisory group or in other ways. Moreover, the Secretary of State is required to consult Ofcom and other persons she considers appropriate. In relation to young people specifically, it may be that a number of different approaches will be effective—for example, consulting experts or representative groups on children’s experiences online. That could include people of all ages. The regulatory framework is designed to protect all users online, and it is right that we take into account the full spectrum of views from people who experience harms, whatever their age and background, through a consultation process that balances all their interests.

Amendment 268AA from the noble Lord, Lord Bassam, relates to reporting requirements for online abuse and harassment, including where this is racially motivated—an issue we have discussed in Questions and particularly in relation to sport. His amendment would place an additional requirement on all service providers, even those not in scope of the Bill. The Bill’s scope extends only to user-to-user and search services. It has been designed in this way to tackle the risk of harm to users where it is highest. Bringing additional companies in scope would dilute the efforts of the legislation in this important regard.

Clauses 16 and 26 already require companies to set up systems and processes that allow users easily to report illegal content, including illegal online abuse and harassment. This amendment would therefore duplicate this existing requirement. It also seeks to create an additional requirement for companies to report illegal online abuse and harassment to the Crown Prosecution Service. The Bill does not place requirements on in-scope companies to report their investigations into crimes that occur online, other than child exploitation and abuse. This is because the Bill aims to prevent and reduce the proliferation of illegal material and the resulting harm it causes to so many. Additionally, Ofcom will be able to require companies to report on the incidence of illegal content on their platforms in its transparency reports, as well as the steps they are taking to tackle that content.

I hope that reassures the noble Lord that the Bill intends to address the problems he has outlined and those explored in the exchange with the noble Lord, Lord Clement-Jones. With that, I hope that noble Lords will support the government amendments in this group and be satisfied not to press theirs at this point.

Lord Bassam of Brighton (Lab)

My Lords, I listened very carefully to the Minister’s response to both my amendments. He has gone some way to satisfying my concerns. I listened carefully to the concerns of the noble Baroness, Lady Fox, and noble Lords on the Lib Dem Benches. I am obviously content to withdraw my amendment.

I do not quite agree with the Minister’s point about dilution on the last amendment—I see it as strengthening—but I accept that the amendments themselves slightly stretch the purport of this element of the legislation. I shall review the Minister’s comments and I suspect that I shall be satisfied with what he said.

Amendment 185A withdrawn.
Clauses 74 to 78 agreed.
Clause 79: OFCOM’s fees statements
Amendment 186 not moved.
Amendment 186A
Moved by
186A: Clause 79, page 71, line 20, leave out paragraph (b)
Member’s explanatory statement
This amendment omits a provision about recouping OFCOM’s preparatory costs via fees under Part 6 of the Bill, because it is now intended to recoup all preparatory costs incurred before the fees regime is in operation via the charging of additional fees under Schedule 10 (see also the amendment to Schedule 10 in the Minister’s name).
Amendment 186A agreed.
Clause 79, as amended, agreed.
Clause 80: Recovery of OFCOM’s initial costs
Amendment 186B
Moved by
186B: Clause 80, page 71, line 26, leave out from “incurred” to end of line 27 and insert “before the first day of the initial charging year.”
Member’s explanatory statement
This amendment is to the clause introducing Schedule 10 (recovery of OFCOM’s initial costs). The amendment reflects the change to Schedule 10 proposed by the amendment of that Schedule in the Minister’s name.
Amendment 186B agreed.
Clause 80, as amended, agreed.
Schedule 10: Recovery of OFCOM’s initial costs
Amendment 186C
Moved by
186C: Schedule 10, page 212, line 37, leave out from “before” to end of line 39 and insert “the first day of the initial charging year on—
(a) preparations for the exercise of their online safety functions, or
(b) the exercise of their online safety functions;”
Member’s explanatory statement
Schedule 10 enables OFCOM to charge additional fees to recover certain online safety costs which are met by the retention of receipts under the Wireless Telegraphy Act 2006. This amendment extends the Schedule 10 regime to cover all costs incurred before the main fees regime under Part 6 of the Bill is in operation (as opposed to only covering preparatory costs incurred before the commencement of clause 79).
Amendment 186C agreed.
Schedule 10, as amended, agreed.
Clause 81 agreed.
Clause 82: General duties of OFCOM under section 3 of the Communications Act
Amendment 187 not moved.
Clause 82 agreed.
Amendment 188 not moved.
Clauses 83 and 84 agreed.
Amendments 189 to 191 not moved.
Clause 85 agreed.
Schedule 11: Categories of regulated user-to-user services and regulated search services: regulations
Amendment 192
Moved by
192: Schedule 11, page 216, line 30, after “service” insert “, including significant risk of harm,”
Member’s explanatory statement
There are some platforms which, whilst attracting small user numbers, are hubs for extreme hateful content and should be regulated as larger user-to-user services.
Baroness Morgan of Cotes (Con)

My Lords, I am very grateful to the noble Baronesses, Lady Parminter and Lady Deech, and the noble Lord, Lord Mann, for their support. After a miscellaneous selection of amendments, we now come back to a group of quite tight amendments. Given the hour, those scheduling the groupings should be very pleased because for the first time we have done all the groups that we set out to do this afternoon. I do not want to tempt fate, but I think we will have a good debate before we head off for a little break from the Bill for a while.

18:30
I am very sympathetic to the other amendments in this grouping: Amendments 192A and 194. I think there is a common theme running through them all, unsurprisingly, which I hope my noble friend the Minister will be able to address in his remarks. Some of these amendments come about because we do not know the exact categorisation of the services we are most concerned about in this House and beyond, and how that categorisation process is going to work and be kept under review. That is probably the reason behind this group of amendments.
As noble Lords will be aware, the Bill proposes two major categories of regulated company, category 1 and category 2, and there is another carve-out for search services. Much of the discussion about the Bill has focused on the regulatory requirements for category 1 companies, but—again, we have not seen the list—it is expected that the list of category 1 companies may number only a few dozen, while thousands and thousands of platforms and search engines may not meet that threshold. But some of those other platforms, while attracting small user numbers, are hubs for extremely hateful content. In a previous debate we heard about the vile racist abuse often aimed at particular groups. Some of these platforms are almost outside some of our own experiences. They are deliberately designed to host such hateful content and to try to remain under the radar, but they are undoubtedly deeply influential, particularly to those—often vulnerable—users who access them.
Platforms such as 8kun, 4chan and BitChute are perhaps becoming more well known, whereas Odysee, Rumble and Minds remain somewhat obscure. There are numerous others, and all are easily accessible from anyone’s browser. What does the harm caused by these platforms look like? Some examples are in the public domain. For example, the mass shooting in Buffalo, in America, was carried out by a terrorist whose manifesto was inspired by 4chan’s board and who spoke of its influence on him. Later in this debate we are going to hear about specific content related to suicide, self-harm or eating disorders, which we have already debated in other contexts in these Committee proceedings.
The Center for Countering Digital Hate revealed that the four leading forums it analysed for incels—involuntary celibates—were filled with extreme hatred of women, glorification of violence and active discussion of paedophilia. On Gab, an “anti-Jewish meme repository”, grotesque anti-Semitic caricatures of Jews are shared from an account with an offensive name that seeks to deny the Holocaust. Holocaust denial material is similarly shared across BitChute, where it is also possible to find a video on the supposed
“Jewish Plan To Genocide The White Race”
and, of course, 9/11 conspiracy theories. Meanwhile, on Odysee, other than discussion of the supposed “fake Holocaust” one can find discussion of the “Jewish problem”. On Minds, both President Zelensky and President Putin are condemned for having “kike”—an offensive term for Jews—inner circles, while other posts state that communism is Jewish control and the vessel to destroy our freedom.
The Government and many others know very well that these small, high-harm platforms are a problem. MPs in earlier debates on this Bill raised concerns repeatedly. The noble Lord, Lord Austin, raised this at Second Reading in your Lordships’ House and, nearly a year ago, the then Secretary of State issued a ministerial Statement indicating that, while the Government appreciated that small high-harm platforms do damage,
“more research is required before such platforms can be assigned to the category 1 designation for the online safety regime”.
This was despite Ofcom’s road map for online safety making it clear that it had already identified a number of small platforms that are clearly giving cause for concern.
So the case for action, as set out in my remarks and elsewhere, is proven. The Antisemitism Policy Trust has given evidence to the Joint Committee on the draft Bill and the Bill Committee in another place about this. The Community Security Trust, HOPE not hate and many others have data that demonstrates the level of hateful anti-Semitic and other racist and misogynistic abuse on these platforms. I know others will refer to the work of the Samaritans, the Mental Health Foundation and Beat in raising issues around suicide, self-harm and eating disorder content.
Extraordinarily, these are not platforms where this content is stumbled on or somehow hidden. They are set up deliberately to spread this content, to get people to look at it and to amplify this deeply harmful material. These sites act as feeders for hateful messages and activity on mainstream platforms or as receptors for those directed away from those larger services to niche, hate-filled rabbit holes. We need to think about this as the Bill is implemented. As we hope that the larger platforms will take action and live up to the terms of service they say they have, without action this content will unfortunately move to smaller platforms, which will still be accessed and will still drive action in the online and offline worlds. I hope my noble friend the Minister will say something about post-implementation in relation to these platforms.
Amendment 192 is a small, technical amendment. It does not compel Ofcom to add burdens to all small platforms but provides a specific recourse for the Secretary of State to consider the risks of harm as part of the process of categorisation. A small number of well-known, small high-harm sites would be required to add what will ultimately be minimal friction and other measures proportionate to their size. They will be required to deliver enhanced transparency. This can only be for the good, given that in some cases these sites are designed specifically to spread harm and radicalise users towards extreme and even terrorist behaviours.
The Government accept that there is a problem. Internet users broadly accept that there is a problem. It must be sensible, in deciding on categorisation, to look at the risk of harm caused by the platforms. I beg to move.
Lord Griffiths of Burry Port (Lab)

My Lords, I will speak to Amendment 192A. There can be nothing more comfortable within the terms of parliamentary debate than to find oneself cosseted by the noble Baroness, Lady Morgan, on one side and my noble friend Lord Stevenson on the other. I make no apology for repeating the thrust of the argument of the noble Baroness, but I will narrow the focus to matters that she hinted at which we need to think about in a particular way.

We have already debated suicide, self-harm and eating disorder content hosted by category 1 providers. There is a need for the Bill to do more here, particularly through strengthening the user empowerment duties in Clause 12 so that the safest option is the default. We have covered that ground. This amendment seeks to address the availability of this content on smaller services that will fall outside category 1, as the noble Baroness has said. The cut-off conditions under which services will fall within category 1 are still to be determined. We await further progress on that. However, there are medium-sized and small providers whose activities we need to look at. It is worth repeating—and I am aware that I am repeating—that these include suicide and eating disorder forums, whose main business is the sharing and discussion of methods and encouragement to engage in these practices. In other words, they are set up precisely to do that.

We know that there are smaller platforms where users share detailed information about methods of suicide. One of these in particular has been highlighted by families and coroners as playing a role in the suicides of individuals in the UK. Regulation 28 reports—that is, official requests for action—have been issued to DCMS and DHSC by coroners to prevent future comparable deaths.

A recent systematic review, looking at the impact of suicide and self-harm-related videos and photographs, showed that potentially harmful content was concentrated specifically on sites with low levels of moderation. Much of the material which promotes and glorifies this behaviour is unlikely to be criminalised through the Government’s proposed new offence of encouragement to serious self-harm. For example, we would not expect all material which provides explicit instructional information on how to take one’s life using novel and effective methods to be covered by it.

The content has real-world implications. There is clear evidence that when a particular suicide method becomes better known, the effect is not simply that suicidal people switch from one intended method to the novel one, but that suicides occur in people who would not otherwise have taken their own lives. There are, therefore, important public health reasons to minimise the discussion of dangerous and effective suicide methods.

The Bill’s pre-legislative scrutiny committee recommended that the legislation

“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.

This amendment is in line with that recommendation, seeking to extend category 1 regulation to services that carry a high level of risk.

The previous Secretary of State appeared to accept this argument—but we have had a lot of Secretaries of State since—and announced a deferred power that would have allowed for the most dangerous forums to be regulated; but the removal of the “legal but harmful” provisions from the legislation means that this power is no longer applicable, as its function related to the “adult risk assessment” duty, which is no longer in the Bill.

This amendment would not shut down dangerous services, but it would make them accountable to Ofcom. It would require them to warn their users of what they were about to see, and it would require them to give users control over the type of content that they see. That is, the Government’s proposed triple shield would apply to them. We would expect that this increased regulatory burden on small platforms would make them more challenging to operate and less appealing to potential users, and would diminish their size and reach over time.

This amendment is entirely in line with the Government’s own approach to dangerous content. It simply seeks to extend the regulatory position that they themselves have arrived at to the very places where much of the most dangerous content resides. Amendment 192A is supported by the Mental Health Foundation, the Samaritans and others that we have been able to consult. It is similar to Amendment 192, which we also support, but this one specifies that the harmful material that Ofcom must take account of relates to self-harm, suicide and eating disorders. I would now be more than happy to give way—eventually, when he chooses to do it—to my noble friend Lord Stevenson, who is not expected at this moment to use the true and full extent of his abilities at being cunning.

Baroness Bull (CB)

My Lords, I rise to offer support for all the amendments in this group, but I will speak principally to Amendment 192A, to which I have added my name and which the noble Lord, Lord Griffiths, has just explained so clearly. It is unfortunate that the noble Baroness, Lady Parminter, cannot be in her place today. She always adds value in any debate, but on this issue in particular I know she would have made a very compelling case for this amendment. I will speak principally about eating disorders, because the issues of self-harm have already been covered and the hour is already late.

The Bill as it stands presumes a direct relationship between the size of a platform and its potential to cause harm. This is simply not the case: a systematic review which we heard mentioned confirmed what all users of the internet already know—that potentially harmful content is often and easily found on smaller, niche sites that will fall outside the scope of category 1. These sites are absolutely not hard to find—they come up on the first page of a Google search—and some hide in plain sight, masquerading, particularly in the case of eating disorder forums, as sources of support, solace or factual information when in fact they encourage and assist people towards dangerous practices. Without this amendment, those sites will continue spreading their harm and eating disorders will continue to have the highest mortality rate of all mental illnesses in the UK.

18:45
I was going to say something about the suicide sites, but the point that comes out of the research has already been made: when a novel method of suicide becomes more widely known, it is not just that people intending to kill themselves switch from one method to another but that the prevalence of suicide increases. As the noble Lord said, this is not just about preventing individual tragedies but is indeed a public health issue.
I very much welcome the steps being taken in the Bill to tackle the prevalence of damaging content, particularly as it applies to children. However, I believe that, as the Bill stands, smaller providers will fly under the radar and vulnerable adults will be harmed—the Bill is extremely light on protections for that category of people. Amendment 192A seeks to ensure that the Bill tackles content wherever it gives rise to a very high risk of harm, irrespective of the platform’s size. Arguments about regulatory burden on small sites should not apply when health, well-being and lives are at risk. The pre-legislative committee was absolutely alive to this, and its recommendations highlighted the risks posed by small, high-risk companies. As we heard, the previous Secretary of State announced a deferred power, but that lapsed when the adult risk assessments were removed.
I fear that the current approach in the Bill will push people who promote this kind of content simply to create smaller platforms where they are beyond the reach of the law. It is not clear whether they would be caught instead by the Government’s new offence of encouraging or assisting serious self-harm. I know we have not debated that yet, but I cannot understand whether encouragement to starvation would be covered by that new offence. It is probably too early to ask the Minister to clarify that, but if he has the answer, I would like to understand it.
We have heard the term “rabbit hole”; there is a rabbit hole, where people intent on self-harm, or indeed those who suffer from eating disorders, go from larger platforms to smaller and niche ones where they encounter the very content that feeds their addiction, or which fuels and enables their desire to self-harm. As I said in a previous grouping, this cannot be the intention of the Bill; I do not believe it is the intention of the Government; and I hope that the Minister will listen to the arguments that the noble Baroness, Lady Morgan of Cotes, set out so effectively.
Lord Allan of Hallam (LD)

My Lords, I am a poor substitute for the noble Baroness, Lady Parminter, in terms of the substance of the issues covered by these amendments, but I am pleased that we have been able to hear from the noble Baroness, Lady Bull, on that. I will make a short contribution on the technology and the challenges of classification, because there are some important issues here that the amendments bring out.

We will be creating rules for categorising platforms. As I understand it, the rules will have a heavy emphasis on user numbers but will not be exclusively linked to user numbers. It would be helpful if the Minister could tease out a little more about how that will work. However, it is right even at this stage to consider the possibility that there will need to be exceptions to those rules and to have a mechanism in place for that.

We need to recognise that services can grow very quickly these days, and some of the highest-risk moments may be those when services have high growth but still very little revenue and infrastructure in place to look after their users. This is a problem generally with stepped models, where you have these great jumps; in a sense, a sliding scale would be more rational, so that responsibilities increase over time, but clearly from a practical view it is hard to do that, so we are going to end up with some kind of step model.

We also need to recognise that, from a technical point of view, it is becoming cheaper and easier to build new user-to-user services all the time. That has been the trend for years, but it is certainly the case now. If someone wants to create a service, they can rent the infrastructure from a number of providers rather than buying it, they can use a lot of code that is freely available—they do not need to write as much code as they used to—and they can promote their new service using all the existing social networks, so you can go from zero to significant user numbers in very quick time, and that is getting quicker all the time. I am interested to hear how the Minister expects such services to be regulated.

The noble Baroness, Lady Morgan, referred to niche platforms. There will be some that have no intention to comply, even if we categorise them as a 2B service. The letter will arrive from Ofcom and go in the bin. They will have no interest whatever. Some of the worst services will be like that. The advantage of us ensuring that we bring them into scope is that we can move through the enforcement process quickly and get to business disruption, blocking, or whatever we need to do to get them out of the UK market. Other niche services will be willing to come into line if they are told they are categorised as 2B but given a reasonable set of requirements. Some of Ofcom’s most valuable work might be precisely to work with them: services that are borderline but recognise that they want to have a viable business, and they do not have a viable business by breaking the law. We need to get hold of them and bring them into the net to be able to work with them.

Finally, there is another group which is very mainstream but still in its growth phase, busy growing and not worrying about regulation. For that category of company, we need to work with them as they grow, and the critical thing is to get to them early. I think the amendments would help Ofcom to be able to get to them early—ideally, in partnership with other regulators, including the European Union, which is now regulating in a similar way under the Digital Services Act. If we can work with those companies as they come into 2B, then into category 1—in European speak, that is a VLOP, a very large online platform—and get them used to the idea that they will have VLOP and category 1 responsibilities before they get there, we can make a lot more progress. Then we can deliver what we are all trying to achieve: a safer internet for people in the UK.

Baroness Fox of Buckley (Non-Afl)

I shall speak very briefly at this hour, just to clarify as much as anything. It seems important to me that there is a distinction between small platforms and large platforms, but my view has never been that, if you are small, you have no potential harms, any more than that, if you are large, you are harmful. The exception should not become the rule. We have to be careful of arbitrary categorisation of “small”. We have to decide who is going to be treated as though they are a large category 1 platform. I keep saying this, but I stress it again: do not assume that everybody agrees on what significant risk of harm or hateful content is. It is such highly disputed political territory outside the online world and this House that we must recognise that it is not so straightforward.

I am very sympathetic, by the way, to the speeches made about eating disorders and other issues. I see that very clearly, but other categories of speech are disputed and argued over—I have given loads of examples. We end up in a situation where it is assumed that the manifestos of mass shooters appear on these sites, but if you read any of those manifestos, they will often be quoting from mainstream journalists in mainstream newspapers, the Bible and a whole range of things. The fact that they are on 4Chan, or wherever, is not necessarily the problem; it is much more complicated.

I ask the Minister, and the proposers of the amendment, to some extent: would it not be straightforwardly the case that if there is a worry about a particular small platform, it might be treated differently—

Lord Allan of Hallam (LD)

I just want to react to the manifestos of mass shooters. While source material such as the Bible is not in scope, I think the manifesto of a shooter is clear incitement to terrorism, and any platform that is comfortable carrying that is problematic in my view, and I hope it would be in the noble Baroness’s view as well.

Baroness Fox of Buckley (Non-Afl)

I was suggesting that we have a bigger problem than it appearing on a small site. It quotes from mainstream media, but it ends up being broadly disseminated and not because it is on a small site. I am not advocating that we all go round carrying the manifestos of mass shooters and legitimising them. I was more making the point that it can be complicated. Would not the solution be that you can make appeals that a small site is treated differently? That is the way we deal with harmful material in general and the way we have dealt with, for example, RT as press without compromising on press freedom. That is the kind of point I am trying to make.

I understand lots of the concerns, but I do not want us to get into a situation where we destroy the potential of all smaller platforms—many of them doing huge amounts of social good, part of civil society and all the rest of it—by treating them as though they are large platforms. They just will not have the resources to survive; that is my only point.

Lord Clement-Jones (LD)

My Lords, I am going to be extremely brief given the extremely compelling way that these amendments have been introduced by the noble Baroness, Lady Morgan, and the noble Lord, Lord Griffiths, and contributed to by the noble Baroness, Lady Bull. I thank her for her comments about my noble friend Lady Parminter. I am sure she would have wanted to be here and would have made a very valuable contribution as she did the other day on exactly this subject.

As the noble Baroness, Lady Fox, has illustrated, we have a very different view of risk across this Committee and we are back, in a sense, into that whole area of risk. I just wanted to say that I think we are again being brought back to the very wise words of the Joint Committee. It may sound like special pleading. We keep coming back to this, and the noble Lord, Lord Stevenson, and I are the last people standing on a Thursday afternoon.

We took a lot of evidence in this particular area. We took the trouble to go to Brussels and had a very useful discussion with the Centre on Regulation in Europe and Dr Sally Broughton Micova. We heard a lot about interconnectedness between some of these smaller services and the impact in terms of amplification across other social media sites.

We heard in the UK from some of the larger services about their concerns about the activities of smaller services. You might say “They would say that, wouldn’t they?” but they were pretty convincing. We heard from HOPE not Hate, the Antisemitism Policy Trust and Stonewall, stressing the role of alternative services.

Of course, we know that these amendments today—some of them sponsored by the Mental Health Foundation and the Samaritans, as the noble Lord, Lord Griffiths, said—have a very important provenance. They recognise that these are big problems. I hope that the Minister will think hard about this. The injunction from the noble Lord, Lord Allan, to consider how all this is going to work in practice is very important. I very much hope that, when we come to consider how this works in practical terms, the Minister will think very seriously about the way in which risk is to the fore—the more nuanced approach that we suggested—and the whole way that profiling by Ofcom will apply. I think that is going to be extremely important as well. I do not think we have yet got the Bill to the right place in dealing with these risky sites. I very much hope that the Minister will consider this in the quite long period between now and when we next get together.

Lord Stevenson of Balmacara (Lab)

My Lords, this has been a good little debate with some excellent speeches, which I acknowledge. Like the noble Lord, Lord Clement-Jones, I was looking at the Joint Committee’s report. I concluded that one of the first big issues we discussed was how complicated the categorisation seemed in relation to the task that was being set for Ofcom. We comforted ourselves with the thought that if you believe that this is basically a risk-assessment exercise and that all the work Ofcom will subsequently do is driven by its risk assessments and its constant reviewing of them, then the categorisation is bound to fall down because the risks will reveal the things that need to happen.

19:00
As we have been through the process in our discussions in Committee, we keep coming across issues where proportionality seems to come around. The proportionality that I worry about is that which says, “If only a small number of people are affected by this, then obviously less needs to be happening at Ofcom level”. We debated this earlier in relation to the amendments on children and seemed to come out in two different positions.
I believe that there should be zero tolerance of children accessing material which is illegal for them, but the Bill does not say that. It says that all Ofcom’s work has to be done in proportion to the impact, not only in the direct work of trying to mitigate harms or illegality that could occur but taking into account the economic size of the company and the impact that the work would have on its activities. I do not think we can square that off, so I appeal to the Minister, when he comes to respond, to look at it from the other end. Why is it not possible to have a structure which is driven by the risk? If the risk assessment reveals risks that require action, there should not be a constraint simply because the categorisation hurdle has not been met. The risk is what matters. Does he agree?
Lord Parkinson of Whitley Bay (Con)

I am grateful to noble Lords for helping us to reach our target for the first time in this Committee, especially in a way which has given us a good debate to send us off into the Whitsun Recess. I am off to the Isle of Skye, so I will make a special detour to Balmacara in honour of the noble Lord.

Lord Parkinson of Whitley Bay (Con)

The noble Lord does not believe anything that I say at this Dispatch Box, but I will send a postcard.

As noble Lords are by now well aware, all services in scope of the Bill, regardless of their size, will be required to take action against illegal content and all services likely to be accessed by children must put in place protections for children. Companies designated as category 1 providers have significant additional duties. These include the overarching transparency, accountability and freedom of expression duties, as well as duties on content of democratic importance, news publishers’ content, journalistic content and fraudulent advertising. It is right to put such duties only on the largest platforms with features enabling the greatest reach, as they have the most significant influence over public discourse online.

I turn first to Amendment 192 in the name of my noble friend Lady Morgan of Cotes and Amendment 192A from the noble Lord, Lord Griffiths of Burry Port, which are designed to widen category 1 definitions to include services that pose a risk of harm, regardless of their number of users. Following the removal of the “legal but harmful” provisions in another place, the Bill no longer includes the concept of risk of harm in category 1 designation. As we set out, it would not be right for the Government to define what legal content it considers harmful to adults, and it follows that it would not be appropriate for the Government to categorise providers and to require them to carry out duties based on this definition.

In addition, requiring all companies to comply with the full range of category 1 duties would pose a disproportionate burden on services which do not exert the same influence over public discourse online. I appreciate the point made by the noble Baroness, Lady Bull, with regard to regulatory burden. There is a practical element to this as well. Services, particularly smaller ones, have finite resources. Imposing additional duties on them would divert them from complying with their illegal content and child safety duties, which address the most serious online harms. We do not want to weaken their ability to tackle criminal activity or to protect children.

As we discussed in detail in a previous debate, the Bill tackles suicide and self-harm content in a number of ways. The most robust protections in the Bill are for children, while those for adults strike a balance between adults being protected from illegal content and given more choice over what legal content they see. The noble Lord, Lord Stevenson, asked why we do not start with the highest risk rather than thinking about the largest services, but we do. We start with the most severe harms—illegal activity and harm to children. We are focusing on the topics of greatest risk and then, for other categories, allowing adults to make decisions about the content with which they interact online.

A number of noble Lords referred to suicide websites and fora. We are concerned about the widespread availability of content online which promotes and advertises methods of suicide and self-harm, which can be easily accessed by young or vulnerable people. Under the Bill, where suicide and self-harm websites host user-generated content, they will be in scope of the legislation. These sites will need proactively to prevent users from being exposed to priority illegal content, including content which encourages or assists suicide under the terms of the Suicide Act 1961. Additionally, it is an offence under Section 4(3) of the Misuse of Drugs Act 1971 for a website to offer to sell controlled drugs to consumers in England and Wales. Posting advice on how to obtain such drugs in England and Wales is also likely to be an offence, regardless of where the person providing the advice is located.

The Bill also limits the availability of such content by placing illegal content duties on search services, including harmful content which affects children or where this content is shared on user-to-user services. This will play a key role in reducing traffic that directs people to websites which encourage or assist suicide, and reduce the likelihood of users encountering such content. The noble Baroness, Lady Bull, asked about starvation. Encouraging people to starve themselves or not to take prescribed medication will be covered.

Amendment 194 tabled by the noble Lord, Lord Stevenson of Balmacara, seeks to ensure that Ofcom can designate companies as category 1, 2A or 2B on a provisional basis, when it considers that they are likely to meet the relevant thresholds. This would mean that the relevant duties can be applied to them, pending a full assessment by Ofcom. The Government recognise the concern highlighted by the noble Lord, Lord Allan, about the rapid pace of change in the technology sector and how that can make it challenging to keep the register of the largest and most influential services up to date. I assure noble Lords that the Bill addresses this with a duty which the Government introduced during the Bill’s recommittal in another place. This duty, at Clause 88, requires Ofcom proactively to identify and publish a list of companies which are close to category 1 thresholds. This will reduce any delays in Ofcom adding additional obligations on companies which grow rapidly, or which introduce new high-risk features. It will also ensure that the regime remains agile and adaptable to emerging threats.

Platforms with the largest reach and greatest influence over public discourse will be designated as category 1. The Bill sets out a clear process for determining category 1 providers, based on thresholds relating to these criteria, which will be set by the Secretary of State in secondary legislation. The process has been designed to ensure that it is transparent and evidence-based. We expect the main social media platforms and possibly some others to be designated as category 1 services, but we do not wish to prejudge the process set out above by indicating which specific services are likely to be designated, as I have set out on previous groups.

The amendment would enable Ofcom to place new duties on companies without due process. Under the approach that we take in the Bill, Ofcom can designate companies as belonging to each category based only on an objective assessment of evidence against thresholds approved by Parliament. The Government’s approach also provides greater certainty for companies than is proposed in this amendment. We have heard concerns in previous debates about when companies will have the certainty of knowing their category designation. These amendments would introduce continuous uncertainty and subjectivity into the designation process and would give Ofcom significant discretion over which companies should be subject to which duties. That would create a very uncertain operating environment for businesses and could reduce the attractiveness of the UK as a place to do business.

I hope that explains why we are not taken by these amendments but, in the spirit of the Whitsun Recess, I will certainly think about them on the train as I head north. I am very happy to discuss them with noble Lords and others between now and our return.

Lord Stevenson of Balmacara (Lab)

Before the Minister sits down, he did let slip that he was going on the sleeper, so I do not think that there will be much thinking going on—although I did not sleep a wink the last time I went, so I am sure that he will have plenty of time.

I am sure that the noble Baroness, Lady Morgan, will want to come in—but could he repeat that? Risk assessment drives us, but a company that will not be regarded as a category 1 provider because it does not meet the categorisation thresholds will not, even though it may be higher risk than some of the category 1 companies, be subject to the requirements that would pick up the particular issues raised by the noble Baroness and the noble Lord, and their concerns about those issues, which are clearly social harms, will not really be considered on a par.

Lord Parkinson of Whitley Bay (Con)

In the response I gave, I said that we are making the risk assessment that the riskiest behaviour is illegal content and content which presents a harm to children. That is the assessment and the approach taken in the Bill. In relation to other content which is legal and for adults to choose how they encounter it, there are protections in the Bill to enforce terms of service and empower users to curate their own experience online, but that assessment is made by adult users within the law.

Baroness Morgan of Cotes (Con)

I thank all noble Lords who spoke in this short but important debate. As we heard, some issues relating to risk and harm have been returned to and will no doubt be again, and we note the impact of the absence of “legal but harmful” as a concept. As the noble Baroness, Lady Bull, said, I know that the noble Baroness, Lady Parminter, was very sad that she could not be here this afternoon due to another engagement.

I will not keep the House much longer. I particularly noted the noble Baroness’s point that there should not be, and is not, a direct relationship between the size of the platform and its ability to cause harm. There is a balance to be struck between the regulatory burden placed on platforms versus the health and well-being of those who are using them. As I have said before, I am not sure that we have always got that particular balance right in the Bill.

The noble Lord, Lord Allan, was very constructive: it has to be a good thing that we are now beginning to think about the Bill’s implementation and how it would work in practice, although we have not quite reached the end and I do not want to prejudge any further stages. Of course, he is right to say that some of these platforms have no intention of complying with these rules at all. Ofcom and the Government will have to work out what to do about that.

Ultimately, the Government of the day—whoever it might be—will want the powers to be able to say that a small platform is deeply harmful in terms of its content and reach. When the Bill has been passed, there will be pressure at some point in the future on a platform that is broadcasting, distributing or amplifying content that is deeply harmful. Although I will withdraw the amendment today, my noble friend’s offer of further conversations, and of more detail on categorisation and on any review of the platforms categorised as category 1, 2 and beyond, would be very helpful in due course. I beg leave to withdraw.

Amendment 192 withdrawn.
Amendments 192A and 193 not moved.
Schedule 11 agreed.
Clause 86 agreed.
Amendment 194 not moved.
Clauses 87 and 88 agreed.
Clause 89: OFCOM’s register of risks, and risk profiles, of Part 3 services
Amendments 194A to 197 not moved.
Clause 89 agreed.
Clause 90 agreed.
House resumed.

Online Safety Bill

Committee (10th Day)
12:16
Relevant document: 28th Report from the Delegated Powers Committee
Clause 91: Power to require information
Amendment 198
Moved by
198: Clause 91, page 82, line 14, at end insert—
“(o) the purpose of obtaining information relevant to the death of a child (as defined in section (Duties of OFCOM in certain cases where a child has died)(3)).”
Member’s explanatory statement
This amendment is consequential on Baroness Kidron’s amendment after Clause 117 which would add a new Clause imposing express duties on OFCOM in certain cases where a child has died.
Baroness Kidron (CB)

My Lords, first, I want to recognise the bravery of the families of Olly, Breck, Molly, Frankie and Sophie in campaigning for the amendments we are about to discuss. I also pay tribute to Mia, Archie, Isaac, Maia and Aime, whose families I met this morning on their way to the House. It is a great privilege to stand alongside them and witness their courage and dignity in the face of unimaginable grief. On behalf of myself, my co-signatories—the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Baroness, Lady Morgan—and the huge number of Peers and MPs who have supported these amendments, I thank them for their work and the selflessness they have shown in their determination to ensure that other families do not suffer as they have.

This group includes Amendments 198, 199, 215 and 216, which, together, would create a pathway for coroners and, by extension, families to get access to information relevant to the death of a child from technology services. The amendments would put an end to the inhumane situation whereby coroners and families in crisis are forced to battle faceless corporations to determine whether a child’s engagement with a digital service contributed to their death. Bereaved families have a right to know what happened to their children, and coroners have a duty to ensure that lessons are learned and that those who have failed in their responsibilities are held accountable.

Since the Minister is going to be the bearer of good news this afternoon, I will not take the time to make all the arguments for the amendments as they stand. I simply say that, while parents have been fighting for access to information, those same companies have continued to suggest friends, material and behaviours that drive children into places and spaces in which they are undermined, radicalised into despair and come to harm. In no other circumstance would it be acceptable to withhold relevant information from a court procedure. It is both immoral and a failure of justice if coroners cannot access and review all relevant evidence. For the families, it adds pain to heartbreak as they are unable to come to terms with what has happened because there is still so much that they do not know.

I am grateful to the Government for agreeing to bring forward on Report amendments that will go a very long way towards closing the loopholes that allow companies to refuse coroners’ demands and ignore parents’ entreaties. The Government’s approach is somewhat different from that in front of us, but it covers the same ground. These amendments are the result of the considerable efforts of Ministers and officials from DSIT and the Ministry of Justice, with the invaluable support of the right honourable Sajid Javid MP. I wish to note on the record the leadership of the Secretary of State, who is currently on leave, and the Minister here, the noble Lord, Lord Parkinson.

The Government’s amendments will create an express power for Ofcom to require information from services about a deceased child user’s online activity following the receipt of a Schedule 5 request from a coroner. This will vastly increase the reach and power of that coroner. Information that Ofcom can request from regulated companies under the Online Safety Bill is extremely wide and includes detailed data on what is recommended; the amount of time the child spent on the service when they accessed it; their user journey; what content they liked, shared, rewatched, paused and reported; and whether other users raised red flags about the child’s safety or well-being before their death.

Information notices prompted by a Schedule 5 request from a coroner will be backed by Ofcom’s full enforcement powers and will apply to all regulated companies. If a service fails to comply, it may be subject to enforcement action, including senior management liability and fines of up to £18 million or 10% of global turnover—vastly different from the maximum fine of £1,000 under the Coroners and Justice Act 2009. Moreover, these amendments will give coroners access to Ofcom’s expertise and understanding of how online services work and of online services’ safety duties to children. Also, there will be provisions empowering Ofcom to share information freely to assist coroners in their inquiries. Companies must provide a dedicated means of communication to manage requests for information from bereaved parents and provide written responses to those requests. I look forward to the Minister setting out that these will be operated by a team of experts and backed up by Ofcom in ensuring that the communication is adequate, timely and not obstructive. Importantly, if the communication is not adequate, bereaved families will be able to notify Ofcom.

There are a small number of outstanding questions. We remain concerned that only larger companies will be required to set out their policies on disclosure. Sadly, children are often coerced and nudged into smaller sites that have less robust safety mechanisms. Small is not safe. A further issue is to ensure that a coroner is able, via a Schedule 5 notice given to Ofcom, to compel senior management to appear at an inquest. This is a crucial ask of the legal community, who battled and failed to get companies to attend inquests, notably Wattpad at the Frankie Thomas inquest and Snap Inc at Molly Russell’s inquest. Can the Minister undertake to close these gaps before Report?

A number of matters sit outside the scope of the Online Safety Bill. I am particularly grateful to the Secretary of State for committing in writing to further work beyond the Bill to ensure that the UK’s approach is comprehensive and watertight. The Government will be exploring ways in which the Data Protection and Digital Information (No. 2) Bill can support and complement these provisions, including the potential for a code that requires data preservation if a parent or enforcement officer contacts a helpline or if there is constructive knowledge, such as when a death has been widely reported, even before a Schedule 5 notice has been delivered.

The Government are engaging with the Chief Coroner to provide training in order to ensure that coroners have the knowledge they need to carry out inquests where children’s engagement with online services is a possible factor in their death. I am concerned about the funding of this element of the Government’s plans and urge the Minister to indicate whether this could be part of Ofcom’s literacy duties and therefore benefit from the levy. Possibly most importantly, the Secretary of State has undertaken to approach the US Government to ensure that coroners can review private messages that fall outside the scope of this Bill in cases where a child’s death is being investigated. I am grateful to the noble Lord, Lord Allan, for his support in articulating the issue, and accept the invitation to work alongside the department to achieve this.

There are only two further things to say. First, delivery is in the drafting, and I hope that when he responds, the Minister will assure the House that we will see the proposed amendments well before Report so that we can ensure that this works as we have all agreed. Secondly, the Government are now looking very carefully at other amendments which deal with prevention of harm in one way or another. I share the gratitude of Bereaved Parents for Online Safety for the work that has gone into this set of amendments. However, we want to see safety by design; a comprehensive list of harms to children in the Bill, including harms caused or amplified by the design of service; principles for age assurance which ensure that the systems put in place by regulated services are measurable, secure and fit for purpose; and a proper complaints service, so that children have somewhere to turn when things go wrong. What we have been promised is a radical change of status for the coroner and for the bereaved families. What we want is fewer dead children. I beg to move.

Lord Allan of Hallam (LD)

My Lords, some of the issues that we have been dealing with in this Bill are more abstract or generic harms, but here we are responding to a specific need of families in the UK who are facing the most awful of circumstances.

I want to recognise the noble Baroness, Lady Kidron, for her direct support for many of those families, and for her persistent efforts to use policy and the tools we have available to us here to improve the situation for families who, sadly, will face similar tragedies in future. I appreciate the time that she has spent with me in the spirit of finding workable solutions. It is an alliance that might seem improbable, given our respective responsibilities, which have sometimes placed us in publicly adversarial roles. However, one of the strengths of this Committee process is that it has allowed us to focus on what is important and to find that we have more in common than separates us. Nothing could be more important than the issue we are dealing with now.

I am pleased that it looks like we will be able to use this Bill to make some significant improvements in this area to address the challenges faced by those families, some of whom are here today, challenges which add to their already heart-wrenching distress. The first challenge these families face is to find someone at an online service who is willing and able to answer their questions about their loved one’s use of that platform. This question about contacts at online platforms is not limited to these cases but comes up in other areas.

As noble Lords will know, I used to work for Facebook, where I was often contacted by all sorts of Governments asking me to find people in companies, often smaller companies, concerning very serious issues such as terrorism. Even when they were dealing with the distribution of terrorist content, they would find it very challenging. There is a generic problem around getting hold of people at platforms. A real strength of the Online Safety Bill is that it will necessarily require Ofcom to develop contacts at all online services that offer user-to-user and search services to people in the UK. The Government estimate that 25,000 entities are involved. We are talking about Ofcom building a comprehensive database of pretty much any service that matters to people in the UK.

Primarily, these contacts will be safety focused, as their main responsibility will be to provide Ofcom with evidence that the service is meeting its duties of care under the Bill, so again, they will have the right people in the right companies on their database in future. Importantly, Ofcom will have a team of several hundred people, paid for by a levy on these regulated services, to manage the contacts at the right level. We can expect that, certainly for the larger services, there may be a team of several people at Ofcom dedicated to working with them, whereas for the smaller services it may be a pooled arrangement whereby one Ofcom staff member deals with a group. However, in all cases there will be someone at the regulator with a responsibility for liaising with those companies. We do not expect Ofcom to use those contacts to resolve questions raised by individuals in the UK as a matter of course, but it makes sense to make this channel available where there is a relatively small number of highly impactful cases such as we are dealing with here.

12:30
Having established contact, which we hope will happen in a more orderly way in future with the support from Ofcom, the second challenge lies in the decision about what information a service is willing and able to disclose to assist an inquiry. The frustration that services are unwilling, or declare themselves unable, to disclose the information required has been very widely publicised. We see a potential answer in this Bill, in that it grants Ofcom these new powers to order services to disclose a wide range of information under the clauses listed in Chapter 4. It seems entirely sensible to use these new powers for these specific purposes. Services will understand that they have to respond to information requests. As the noble Baroness pointed out, this will not be discretionary: they will be under serious sanction if they refuse, without good reason, to respond to an information request. It seems to me that this will make the decision quite straightforward for many services and, if they seek legal advice, as is often the case, this will come back to say that they do not have a choice and must disclose.
One way to think about the overall effect of the Bill is that it is replacing discretionary decision-making by online services with a body of instructions from the British state, via our regulator, Ofcom, on how we expect decisions to be made in relation to safety. When platforms have received requests from bereaved families to date, they have been exercising their own discretion, weighing up their views on the potential benefits and harms of certain forms of disclosure. I have been involved in those: a platform weighs up the decision; it is not carried out by any third party. With the information notice process that we expect to be coming forward, the decision about what should be disclosed in which circumstances moves from the platforms to coroners and Ofcom acting on behalf of the British state and affected families. Frankly, if I still worked for an online service, I would welcome this shift of responsibility for these decisions. I hope that we will see the new process—which, as I understand, will be introduced in later amendments—work smoothly, with good co-operation from regulated services.
There are still two provisos to that, which are worth noting at this stage. The noble Baroness, Lady Kidron, has already touched on these, but we should put them on record because they will need to be answered as we get into the debate on the government amendments.
First, we need to ensure that any process for data disclosure has the right checks and balances in place to ensure that it is not used inappropriately. I am certainly very confident in Ofcom’s propriety as a regulator and in the overall legal framework that we have in the Bill and in the Human Rights Act that underpins that, which creates an overall framework for issues such as people’s privacy rights. Everything has to work within that framework. But we should keep stressing the safeguards in place, because these mean that services can feel very safe about complying with orders under the Bill and that it will not involve breaching other rights that they have to users. This is a broader question: there are orders that come from other Governments, shall we say, for disclosure of data that certainly would be problematic. If we are going to order companies to disclose data, we need to make absolutely clear that all of those safeguards are in place for them to feel confident to do so.
Secondly, it must also address any actual or perceived conflicts of law. The noble Baroness touched on this. I particularly note that we have the Data Protection Act and that it is very clear in data protection law that you should not be holding personal data without good reason, so platforms have policies in place that they believe they are instructed to follow by the Information Commissioner’s Office, for example to delete data from accounts that are no longer in use. We need to think about a process for data preservation in particular circumstances. That already works well in law enforcement: for example, there is a well-established process, when somebody is accused of a crime, for requesting the preservation of data relevant to that crime. We need to think about how that principle may apply here and ensure that it is in place. I do not think that the companies are being awkward in this case: they are being told by the data protection regulator to do one thing, and if we are now going to tell them to make an exception, we need to give them a legal basis on which to do that.
There is also the point of potential conflicts with law in other regimes, especially companies outside the UK. The noble Baroness mentioned the Stored Communications Act, which is quite significant for US companies, but there are similar measures in place in other countries—we cannot put companies in a position where they have to choose whose law to break. It should not be beyond our wit to work with a friendly Government such as that of the United States to say, “We all understand that we are trying to help bereaved families here, and we have pieces of law in place; how do we make those work together so that they do not create frustration, which none of us wants them to do and is not what they are intended for?”
I repeat my offer to help if there is anything I can do to try to unblock some of this, work on the detail and make sure that this is effective. This is an area where we could make significant progress and certainly move on from a situation that has frustrated everybody and been unacceptable to date.
Baroness Morgan of Cotes (Con)

My Lords, I am very pleased to support the noble Baroness, Lady Kidron, with these amendments. I also welcome the fact that we have, I hope, reached the final day of this stage of the Bill, which means that it is getting closer to becoming an Act of Parliament. The amendments to these clauses are a very good example of why the Bill needs to become an Act sooner rather than later.

As we heard during our earlier debates, social media platforms have for far too long avoided taking responsibility for the countless harms that children face on their services. We have, of course, heard about Molly Russell’s tragic death and heard from the coroner’s inquest report that it was on Instagram that Molly viewed some of the most disturbing posts. Despite this, at the inquest Meta’s head of health and well-being policy shied away from taking blame and claimed that the posts which the coroner said contributed to Molly’s death

“in a more than minimal way”

were, in Meta’s words, “safe”. Molly’s family and others have to go through the unthinkable when they lose their child in such a manner. Their lives can be made so much harder when they attempt to access their child’s social media accounts and activities only to be denied by the platforms.

The noble Baroness’s various amendments are not only sensible but absolutely the right thing to do. In many ways, it is a great tragedy that we have had to wait for this piece of primary legislation for these companies to start being compelled and told what to do. I understand what the noble Lord, Lord Allan, very rationally said—companies should very much welcome these amendments—but it is a great shame that often they have not behaved better in these circumstances previously.

There is perhaps no point going into the details, because we want to hear from the Minister about what the Government will propose. I welcome the fact that the Government have engaged early-ish on these amendments and on these matters.

The amendments would force platforms to comply with coroners in investigations into the death of a child, have a named senior manager in relation to inquests and allow easier access to a child’s social media account for bereaved families. We will have to see what the Government’s amendments do to reflect that. One of the areas that the noble Baroness said had perhaps not been buttoned down is the responsibility for a named senior manager in relation to an inquest. The requirement is that:

“If Ofcom has issued a notice to a service provider they must name a senior manager responsible for providing material on behalf of the service and to inform that individual of the consequences for not complying”.


The noble Lord, Lord Allan, set out very clearly why having a named contact in these companies is important. Bereaved families find it difficult, if not impossible, to make contact with tech companies: they get lost in the automated systems and, if they are able to access a human being, they are told that the company cannot or will not give that information. We know that different coroners have had widely differing experiences getting information from the social media platforms, some refusing altogether and others obfuscating. Only a couple of companies have co-operated fully, and in only one or two instances. Creating a single point of contact, who understands the law—which, as we have just heard, is not necessarily always straightforward, particularly if it involves different jurisdictions—understands what is technically feasible and has the authority and powers afforded to the regulator will ensure a swifter, more equitable and less distressing process.

I have set this out because we will obviously hear what the Minister proposes, but if it does not reflect having a named senior manager, then I hope very much that we are able to discuss that between this and the next stage.

Social media platforms have a responsibility to keep their users safe. When they fail, they should be obligated to co-operate with families and investigations, rather than seeking to evade them. Seeing what their child was viewing online before their death will not bring that child back, but it will help families on their journey towards understanding what their young person was going through, and towards seeking justice. Likewise, ensuring that platforms comply with inquests will help to ease the considerable strain on bereaved families. I urge noble Lords to support these amendments or to listen to what the Government say. Hopefully, we can come up with a combined effort to put an end to the agony that these families have been through.

Baroness Healy of Primrose Hill (Lab)

My Lords, I strongly support this group of amendments in the name of the noble Baroness, Lady Kidron, and other noble Lords. I, too, acknowledge the campaign group Bereaved Families for Online Safety, which has worked so closely with the noble Baroness, Lady Kidron, 5Rights and the NSPCC to bring these essential changes forward.

Where a child has died, sadly, and social media is thought to have played a part, families and coroners have faced years of stonewalling, often never managing to access data or information relevant to that death; this adds greatly to their grief and delays the finding of some kind of closure. We must never again see a family treated as Molly Russell’s family was treated, when it took five years of campaigning to get partial sight of material that the coroner found so distressing that he concluded that it contributed to her death in a more than minimal way; nor can it be acceptable for a company to refuse to co-operate, as in the case of Frankie Thomas, where Wattpad failed to provide the material requested by the coroner on the grounds that it is not based within the UK’s jurisdiction. With the threat of a fine of only £1,000 to face, companies feel little need to comply. These amendments would mean that tech companies now had to comply with Ofcom’s information notices or face a fine of up to 10% of their global revenue.

Coroners’ powers must be strengthened by giving Ofcom the duty and power to require relevant information from companies in cases where there is reason to suspect that a regulated service provider may hold information relevant to a child’s death. Companies may not want to face up to the role they have played in the death of a child by their irresponsible recommending and pushing of violent, sexual, depressive and pro-suicide material through algorithmic design, but they need to be made to answer when requested by a coroner on behalf of a bereaved family.

Amendment 215 requires a named senior manager, a concept that I am thankful is already enshrined in the Bill, to receive and respond to an information notice from Ofcom to ensure that a child’s information, including their interactions and behaviour and the actions of the regulated service provider, is preserved and made available. This could make a profound difference to how families will be treated by these platforms in future. Too often in the past, they have been evasive and unco-operative, adding greatly to the inconsolable grief of such bereaved parents. As Molly Russell's father Ian said:

“Having lived through Molly’s extended inquest, we think it is important that in future, after the death of a child, authorities’ access to data becomes … a matter of course”


and

“A more compassionate, efficient and speedy process”.


I was going to ask the Government to accept these amendments but, having listened to the noble Baroness, Lady Kidron, I am looking forward to their proposals. We must ensure that a more humane route for families and coroners to access data relating to the death of a child is at last available in law.

Baroness Newlove (Con)

My Lords, I support the amendments standing in the name of the noble Baroness, Lady Kidron, and other noble Lords. I have listened to noble Lords, so I am not going to repeat what has been said. I pay my respects to the families because, as someone who is still going through the criminal justice system, I absolutely feel their anguish.

While we are talking about a digital platform, we are also talking about human lives, and that is what we have to remain focused on. I am not a techno, and all these words in the digital world sound like a lot of Japanese to me. I am not ignorant about what noble Lords are saying, but it has made me realise that, while we have gone forward, for a lot of people and families it still feels like wading through jelly.

I want to speak about how the families will feel and how they will connect through all of these gateways to get what they should quite rightly have about their loved ones’ lives and about what has been said about them online. Surely the platforms should have a duty of care, then perhaps we would not be here discussing these amendments. Noble Lords have spoken about the technical aspects of these amendments. By that, we mean data and the role of the coroner. As a former victims’ commissioner, I had many discussions with the Chief Coroner about other victims who have suffered loss as well. I think that people do not understand how victims’ families feel in the courtroom because you feel alone, and I imagine there are more legal aspects from these mega companies than these families can afford.

12:45
I hope that when the Minister comes to the Dispatch Box it is with good news, but there are lots of other things to address. We need to dot the “i”s and cross the “t”s to make sure that families feel that their voices are heard and that they get legal advice and information, not just from the coroner, and get copies of everything without extra charge.
I want to talk about a humane route for grieving parents and guardians to access data to understand more about the circumstances in which their children died. There are now, horribly, a number of cases, which we have pointed out, where these platforms have left parents in automated loops. They have described it as feeling like contacting a lost property department. Services have refused to engage with inquiries and have refused to appear at inquests. This leaves parents dealing with an already unimaginable situation with nowhere to turn, having to put the pieces together themselves. I am still going through that, and noble Lords cannot imagine the sheer frustration of looking at a blank wall, banging your head against a wall and thinking you have a support system in place but actually it just closes the door in your face.
The noble Baroness, Lady Kidron, has already highlighted the tragedies and issues of these families. In the hours after Olly Stephens was murdered, his parents, Amanda and Stuart Stephens, had to trawl through social media sites to get evidence. This is about their son. We are not talking about data protection; this is about human life. For the Bill to navigate and push forward to support these families, it has to have that humanity, that humane level, to go through and help them. It is all right setting it up to have digital companies comply with Ofcom and the coroner, but there still may be a huge gap to help these families navigate, understand, get the evidence, have copies and feel that we are talking about their loved ones. They are not actually having a voice to understand what emotions they are going through. As my noble friend Lady Morgan has just said, it took Molly Russell’s family five years, and then they were drowned in 36,000 pages of almost impenetrable data just 12 days before the original inquest. That caused a further five months’ delay. That is not acceptable. This is a further trauma for families.
The parents of Frankie Thomas, who took her life in 2018 aged 15, described their desperation in the aftermath of her death. Her mother Judy said she felt like she was in the wilderness contacting somebody at Instagram. I use social media, and I see many people’s sheer frustration at having no response from social media platforms. I agree we should have a single point of contact—a SPOT, as they are known in this world—to help these families, but I would like that single point of contact to go even further. I want that for all victims of crime, but I also do not want it to hinder people if that person has gone on holiday or is off sick and nobody shares that information. We need to ensure that they have a lot of people trained on this who can pick it up. We have a system where the server is not looked at but blocks the families. What is the point of having a single point of contact if that person cannot give other members of staff access to that data to help families?
Grieving families must be given a humane route, facilitated by Ofcom, to access that information from all the platforms where their children have died. We cannot go back after all these experiences and all the energy that the families have put into campaigning for something that should quite rightly be there: information about their loved one. We should not have to have discussions in this place to get that information for families. It is inhumane and shameful and they should not have to go through it any more.
I look forward to listening to the Minister. I pay tribute to the noble Baroness, Lady Kidron, for the work she has done. I am glad that she has had conversations because I am still waiting for the Minister to answer my letters. These families should not be here fighting for justice for their loved ones. We should be trying to make their lives a bit better, to give them a healthier lifestyle and to understand what they are going through. I hope the Minister will come up with something good because if there are further gaps, we need to challenge him once again to ensure that we do not fail the families who are listening to us today.
Lord Russell of Liverpool Portrait Lord Russell of Liverpool (CB)
- View Speech - Hansard - - - Excerpts

My Lords, following on from the excellent points that the noble Baroness has made, I want to pursue the same direction. In this group of amendments we are essentially trying to reduce the incidence of tragedies such as those that the families there in the Gallery have experienced and trying to ensure that no one—that is probably unrealistic, but at least far fewer people—will have the same experience.

I particularly want to focus the Minister and the Bill team on trying to think through how to ensure that, as and when something tragic happens, what happens to the families faced with that—the experience that they have and the help that I hope in future they will be able to receive—will make it a less traumatic, lonely and baffling experience than it clearly has been to date.

At the heart of this, we are talking about communication; about the relationship between Ofcom and the platforms; probably about the relationships between platforms and other platforms, in sharing knowledge; about the relationship between Ofcom and government; about the relationship between Ofcom and regulators in other jurisdictions; and about the relationship between our Government and other Governments, including, most importantly, the Government in the US, where so many of these platforms are based. There is a network of communication that has to work. By its very nature, trying to capture something as all-encompassing as that in primary legislation will in some ways be out of date before it even hits the statute book. It is therefore incredibly important that there is a dynamic information-sharing and analytics process to understand what is going on in the online world, and what the experience is of individuals who are interacting with that world.

That brings me neatly back to an amendment that we have previously discussed, which I suspect the noble Viscount sitting on the Front Bench will remember in painful detail. When we were talking about the possibility of having an independent ombudsman to go to, what we heard from all around the House was, “Where do we go? If we have gone to the platforms and through the normal channels but are getting nowhere, where do we go? Are we on our own?”. The answer that we felt we were getting a few weeks ago was, “That’s it, you’ve got to lump it”. That is simply not acceptable.

I ask the Minister and the Bill team to ensure that there is recognition of the dynamic nature of what we are dealing with. We cannot capture it in primary legislation. I hope we will not try to capture it in secondary instruments either; speaking as a member of the Secondary Legislation Scrutiny Committee, I can say that we have quite enough of those as it is, so we do not want any more, thank you very much. However, it is incredibly important that the Government think about a dynamic means of keeping information up to date so that they and all the other parties in this area know what is going on.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I support this group of amendments. I pay tribute to the families who I see are watching us as we debate this important group. I also pay tribute to my noble friend Lady Newlove, who has just given one of the most powerful speeches in the full 10 days of Committee.

The real sadness is that we are debating what happens when things go horribly wrong. I thank my noble friend the Minister and the Secretary of State, who is currently on leave, for the very collaborative way in which I know they have approached trying to find the right package—we are all waiting for him to stand up and speak to show us this. Very often, Governments do not want to give concessions early in the process of a Bill going through because they worry that those of us campaigning for concessions will then ask for more. In this case, as the noble Lord, Lord Russell, has just pointed to, all we are asking for in this Bill is to remember that a concession granted here helps only when things have gone horribly wrong.

As the noble Baroness, Lady Kidron, said, what we really want is a safer internet, where fewer children die. I reiterate the comments that she made at the end of her speech: as we have gone through Committee, we have all learned how interconnected the Bill is. It is fantastic that we will be able to put changes into it that will enable bereaved families not to have to follow the path that the Russells and all the other bereaved families campaigning for this had to follow—but that will not be enough. We also need to ensure that we put in place the safety-by-design amendments that we have been discussing. I argue that one of the most important is the one that the noble Lord, Lord Russell, has just referenced: the route to help when you already know that your child is in trouble but cannot get it. No one wants the only answer in that situation to be, “It’s okay. Bereaved families have what they need”. We need to do more than that.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, this has been a very moving debate for a very important cause. I thank the noble Baroness, Lady Kidron, for introducing it in the way that she did, along with those who have spoken in the debate.

The good news is that this is very much a cross-party and cross-Bench debate. It clearly appears to be a concern that the Government share, and I appreciate that. I agree with the noble Baroness, Lady Harding, that it is not a weakness for the Government to concede here but very much the logic of where we have now got to. Compared with what is in the Joint Committee report on the draft Bill, what seems to be proposed—and I very much look forward to hearing what the Minister has to say—goes further than what we were proposing, so it may be that we have reached another milestone. However, we wait to hear the detail.

Like other noble Lords, I pay tribute to the bereaved parents. We heard from parents during our consideration of the draft Online Safety Bill and we have heard further since then, particularly as a result of the two notable inquests into the deaths of Frankie Thomas and Molly Russell, which highlighted the difficulties that families and coroners face. Both families talked about the additional toll on their mental health as they battle for information, and the impossibility of finding closure in the absence of answers.

The noble Baroness, Lady Newlove, said in her very moving speech that a humane process must be established for bereaved families and coroners to access data pertinent to the death of a child. That is what we have been seeking, and I pay tribute to the relentless way in which the noble Baroness, Lady Kidron, has pursued this issue on behalf of us all, supported by 5Rights and the NSPCC. We must have a transparent process in which bereaved families and coroners can access information from regulated services in cases where social media may have played a part in the death of a child.

My noble friend Lord Allan—who I am delighted is so plugged in to what could be the practical way of solving some of these issues—expertly described how Ofcom’s powers could and should be used and harnessed for this purpose. That very much goes with the grain of the Bill.

I shall repeat a phrase that the noble Baroness, Lady Kidron, used: the current situation is immoral and a failure of justice. We absolutely need to keep that in mind as we keep ourselves motivated to find the solution as soon as we possibly can. I look forward to good news from the Minister about the use of information notices for the purpose that has been heralded by the noble Baroness, Lady Kidron, but of course the devil is in the detail. We will obviously want to see the detail of the amendment well before Report.

13:00
The noble Baroness, Lady Kidron, asked a number of additional questions about data preservation, and a number of noble Lords, including the noble Lord, Lord Russell, and the noble Baroness, Lady Newlove, talked about the question of help, perhaps with a dedicated helpline for bereaved families. Then there is a question about the obligations of senior management in appearing at inquests. As the noble Lord, Lord Russell, said, we continually need to understand the experience of parents in these circumstances, so the data arising from an independent complaints system is extremely important.
If that was not enough on the architecture of the Online Safety Bill, the noble Baroness also mentioned things that reside outside it, such as data inheritance, given that the Data Protection and Digital Information Bill is coming down the track. It is good that the noble Viscount, Lord Camrose, is sitting on the Front Bench because he will no doubt be dealing with that data protection Bill. We will of course table amendments to that at the time.
There have also been questions about the training for coroners and about approaching the US Government, which is an even larger dimension than anything I have mentioned so far. I very much look forward to hearing what the Minister has to say and hope that we will have achieved the goal that so many families want us to achieve.
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am all that is left between us and hearing from the Minister with his good news, so I will constrain my comments accordingly.

The noble Baroness, Lady Kidron, began by paying tribute to the parents of Olly, Breck, Molly, Frankie and Sophie. I very much join her in doing that; to continually have to come to this place and share their trauma and experience comes at a great emotional cost. We are all very grateful to them for doing it and for continuing to inform and motivate us in trying to do the right thing. I am grateful to my noble friend Lady Healy and in particular to the noble Baroness, Lady Newlove, for amplifying that voice and talking about the lost opportunity, to an extent, of our failure to find a way of imposing a general duty of care on the platforms, as was the original intention when the noble Baroness, Lady Morgan, was the Secretary of State.

I also pay a big tribute to the noble Baroness, Lady Kidron. She has done the whole House, the country and the world a huge service in her campaigning around this and in her influence on Governments—not just this one—on these issues. We would not be here without her tireless efforts, and it is important that we acknowledge that.

We need to ensure that coroners can access the information they need to do their job, and to have proper sanctions available to them when they are frustrated in being able to do it. This issue is not without complication, and I very much welcome the Government’s engagement in trying to find a way through it. I too look forward to the good news that has been trailed; I hope that the Minister will be able to live up to his billing. Like the noble Baroness, Lady Harding, I would love to see him embrace, at the appropriate time, the “safety by design” amendments and some others that could complete this picture. I also look forward to his answers on issues such as data preservation, which the noble Lord, Lord Allan, covered among the many other things in his typically fine speech.

I very much agree that we should have a helpline and do more about that. Some years ago, when my brother-in-law sadly died in his 30s, it fell to me to try to sort out his social media accounts. I was perplexed that the only way I could do it was by fax to these technology companies in California. That was very odd, so to have proper support for bereaved families going through their own grief at that moment seems highly appropriate.

As we have discussed in the debates on the Bill, a digital footprint is an asset that is exploited by these companies. But it is an asset that should be regarded as part of one’s estate that can be bequeathed to one’s family; then some of these issues would perhaps be lessened. On that basis, and in welcoming a really strong and moving debate, I look forward to the Minister’s comments.

Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- Hansard - - - Excerpts

My Lords, this has been a strong and moving debate, and I am grateful to the noble Baroness, Lady Kidron, for bringing forward these amendments and for the way she began it. I also echo the thanks that the noble Baroness and others have given to the families of Breck Bednar, Sophie Parkinson, Molly Russell, Olly Stephens, Frankie Thomas and all the young people whose names she rightly held in remembrance at the beginning of this debate. There are too many others who find themselves in the same position. The noble Lord, Lord Knight, is right to pay tribute to their tirelessness in campaigning, given the emotional toll that we know it has on them. I know that they have followed the sometimes arcane processes of legislation and, as my noble friend Lady Morgan said, we all look forward to the Bill becoming an Act of Parliament so that it can make a difference to families who we wish to spare from the heartache they have had.

Every death is sorrowful, but the death of a child is especially heartbreaking. The Government take the issues of access to information relating to a deceased child very seriously. We have undertaken extensive work across government and beyond to understand the problems that parents, and coroners who are required to investigate such deaths, have faced in the past in order to bring forward appropriate solutions. I am pleased to say that, as a result of that work, and thanks to the tireless campaigning of the noble Baroness, Lady Kidron, and our discussions with those who, very sadly, have first-hand experience of these problems, we will bring forward a package of measures on Report to address the issues that parents and coroners have faced. Our amendments have been devised in close consultation with the noble Baroness and bereaved families. I hope the measures will rise to the expectations they rightly have and that they will receive their support.

The package of amendments will ensure that coroners have access to the expertise and information they need to conduct their investigations, including information held by technology companies, regardless of size, and overseas services such as Wattpad, mentioned by the noble Baroness, Lady Healy of Primrose Hill, in her contribution. This includes information about how a child interacted with specific content online as well as the role of wider systems and processes, such as algorithms, in promoting it. The amendments we bring forward will also help to ensure that the process for accessing data is more straightforward and humane. The largest companies must ensure that they are transparent with parents about their options for accessing data and respond swiftly to their requests. We must ensure that companies cannot stonewall parents who have lost a child and that those parents are treated with the humanity and compassion they deserve.

I take the point that the noble Baroness, Lady Kidron, rightly makes: small does not mean safe. All platforms will be required to comply with Ofcom’s requests for information about a deceased child’s online activity. That will be backed by Ofcom’s existing enforcement powers, so that where a company refuses to provide information without a valid excuse it may be subject to enforcement action, including sanctions on senior managers. Ofcom will also be able to produce reports for coroners following a Schedule 5 request on matters relevant to an investigation or inquest. This could include information about a company’s systems and processes, including how algorithms have promoted specific content to a child. This too applies to platforms of any size and will ensure that coroners are provided with information and expertise to assist them in understanding social media.

Where this Bill cannot solve an issue, we are exploring alternative avenues for improving outcomes as well. For example, the Chief Coroner has committed to consider issuing non-legislative guidance and training for coroners about social media, with the offer of consultation with experts.

Baroness Newlove Portrait Baroness Newlove (Con)
- Hansard - - - Excerpts

I am sorry to interrupt my noble friend. On the coroners’ training and national guidelines, the Chief Coroner has no powers over all the coroners across the nation. How is he or she going to check that coroners are keeping up with their training and are absolutely on the ball? Everything happens in London, and we are talking about outside London. How can we know that no other family will have to suffer, given that we have this legislation?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My noble friend rightly pulled me up for not responding to her letter as speedily as we have been dealing with the questions raised by the noble Baroness, Lady Kidron. We have had some useful meetings with Ministers at the Ministry of Justice, which the noble Baroness has attended. I would be very happy to provide some detail on this to my noble friend—I am conscious of her experience as Victims’ Commissioner—either in writing or by organising a briefing, if she would welcome that.

The noble Lord, Lord Allan of Hallam, rightly raised data protection. Where Ofcom and companies are required to respond to coroners’ requests for information, they are already required to comply with personal data protection legislation, which protects the privacy of other users. This may include the redaction of information that would identify other users. We are also exploring whether guidance from the Information Commissioner's Office could support technology companies to understand how data protection law applies in such cases.

The noble Lord mentioned the challenges of potential conflicts of law around the world. Where there is a conflict of laws—for example, due to data protection laws in other jurisdictions—Ofcom will need to consider the best way forward on a case-by-case basis. For example, it may request alternative information which could be disclosed, and which would provide insight into a particular issue. We will seek to engage our American counterparts to understand any potential and unintended barriers created by the US Stored Communications Act. I can reassure the noble Lord that these matters are in our mind.

We are also aware of the importance of data preservation to both coroners and bereaved parents. The Government agree with the principle of ensuring that such data are preserved. We will be working towards solving this in the Data Protection and Digital Information Bill. In addition, we will explore whether there are further options in that Bill to improve outcomes for parents. I want to assure noble Lords and the families watching this debate closely that we will do all we can to deliver the necessary changes to give coroners and parents the information that they seek and to ensure a more straightforward and humane process in the future.

I turn in detail to the amendments the noble Baroness, Lady Kidron, brought forward. First, Amendments 215 and 216 include new requirements on Ofcom, seeking to ensure that coroners and parents can obtain data from social media companies after the death of a child. Amendment 215 would give Ofcom the ability to impose senior management liability on an individual in cases where a coroner has issued a notice requiring evidence to be provided in an inquest into the death of a child. Amendment 216 would put Ofcom’s powers at the disposal of a coroner or close relatives of a deceased child so that Ofcom would be obliged to require information from platforms or other persons about the social media activity of a deceased child. It also requires service providers to provide a point of contact. Amendments 198 and 199 are consequential to this.

As I said, we agree with the intent of the noble Baroness’s amendments and will address it in the package that we will bring forward before Report. Our changes to the Bill will seek to ensure that Ofcom has the powers it needs to support coroners and their equivalents in Scotland, so that they have access to the information they need to conduct investigations into a child’s death where social media may have played a part.

13:15
We also agree on the importance of greater transparency and accountability from companies and clearer communication with parents. We have had a useful debate about whether it is best to have a single named person or a group of people. As my noble friend Lady Newlove rightly pointed out, with staff turnover and people taking leave, it may be that a team of people is more appropriate. But what is essential is that there is human and humane contact for people to engage with companies. We will introduce further measures that require greater clarity from services and create new lines of communication with parents.
We are separately exploring further solutions which can be brought forward in the Data Protection and Digital Information Bill, which will ensure that the myriad issues bereaved parents face in the context of their child’s death are addressed in line with the proposals that the noble Baroness has brought forward.
Noble Lords rightly asked when they may see the proposals. We will bring forward further details ahead of Report, but I am very happy to commit to sharing the draft clauses with the noble Baroness, Lady Kidron, in the first instance. It would be very helpful to us all, the Government included, for her to cast her expert eye over them. I hope that she will welcome this and I am very grateful to noble Lords for their contributions to the debate.
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

I do indeed welcome it. I do not feel I can do justice to all the speakers; I think I will cry, as I did when the noble Baroness, Lady Newlove, was speaking. I shall not do that, but I will thank all noble Lords from the bottom of my heart and will speak to just a couple of technical matters.

First, I accept the help of the noble Lord, Lord Allan, on the progress of the data protection negotiations with the US Government. That will be very helpful. I want to put on the record that there has been a lot of discussion about the privacy of other users and ensuring that it is central, particularly because other young people are in these interactions and we have to protect them, too. That is very much in our mind.

I welcome and thank the Minister. He said a couple of things, including that he hoped that what he will bring forward will rise to the expectation—so do I. The expectation is set high, and I hope that the Government rise to it. In relation to that, I note that a number of noble Lords carefully planted their expectations in Hansard. I will be giving the noble Lord a highlighter so that he can find them. I note that laying down the things she expected to see was a particular skill of the former Secretary of State for DCMS.

I understood “exploring” and “in our mind”; the Government have certain things in their mind. I understand the context of that because we are talking about other Bills and things that are yet to come. I want to make a statement—I do not know whether it is a promise or a threat; I rather suspect it is both. I will not rest until this entire ecosystem is sorted. This is not about winning an amendment or a concession. This is about putting it right for families and, indeed, for coroners, who are not doing a good job under the current regime.

Finally, I echo those who have pointed out the other amendments that we are seeking on safety by design, age assurance and having the harms in the Bill. I believe I speak for Bereaved Parents for Online Safety; that is what they wish to see come from their pain. It has been the privilege of my life to deal with these parents and these families and I thank the Committee for its support. With my conditions set out, I wish to withdraw my amendment.

Amendment 198 withdrawn.
Clause 91 agreed.
Clause 92: Information notices
Amendment 199 not moved.
Clause 92 agreed.
Clause 93 agreed.
Clause 94: Reports by skilled persons
Amendment 200 not moved.
Clause 94 agreed.
Clauses 95 to 97 agreed.
Amendment 200A
Moved by
200A: After Clause 97, insert the following new Clause—
“Amendment of Criminal Justice and Police Act 2001
(1) The Criminal Justice and Police Act 2001 is amended as follows.
(2) In section 57(1) (retention of seized items), after paragraph (t) insert—
“(u) paragraph 8 of Schedule 12 to the Online Safety Act 2023.”
(3) In section 65 (meaning of “legal privilege”)—
(a) after subsection (8B) insert—
“(8C) An item which is, or is comprised in, property which has been seized in exercise or purported exercise of the power of seizure conferred by paragraph 7(f), (j) or (k) of Schedule 12 to the Online Safety Act 2023 is to be taken for the purposes of this Part to be an item subject to legal privilege if, and only if, the seizure of that item was in contravention of paragraph 17(3) of that Schedule (privileged information or documents).”;
(b) in subsection (9)—
(i) at the end of paragraph (d) omit “or”;
(ii) at the end of paragraph (e) insert “or”;
(iii) before the closing words insert—
“(g) paragraph 7(f), (j) or (k) of Schedule 12 to the Online Safety Act 2023.”
(4) In Part 1 of Schedule 1 (powers of seizure to which section 50 of the Act applies), after paragraph 73U insert—
“Online Safety Act 2023
73V Each of the powers of seizure conferred by paragraph 7(f), (j) and (k) of Schedule 12 to the Online Safety Act 2023.””
Member’s explanatory statement
This amendment has the effect of providing that section 50 of the Criminal Justice and Police Act 2001 (additional powers of seizure from premises) applies to the powers of seizure under paragraph 7(f), (j) and (k) of Schedule 12 to the Bill; and makes related amendments to that Act.
Amendment 200A agreed.
Schedule 12 agreed.
Clauses 98 to 103 agreed.
Amendment 201 not moved.
Clauses 104 to 109 agreed.
Clause 110: Notices to deal with terrorism content or CSEA content (or both)
Amendments 202 to 205 not moved.
Amendment 205A
Moved by
205A: Clause 110, page 95, line 11, leave out “relating to terrorism content present on a service” and insert “that relates to a user-to-user service (or to the user-to-user part of a combined service) and requires the use of technology in relation to terrorism content”
Member’s explanatory statement
This amendment makes it clear that the requirement in clause 110(7) regarding which content is communicated publicly is relevant to user-to-user services and may apply in both the cases mentioned in clause 110(2)(a)(i) and (ii).
Amendment 205A agreed.
Amendment 206 not moved.
Clause 110, as amended, agreed.
Amendments 207 and 208 not moved.
Clause 111 agreed.
Clause 112: Matters relevant to a decision to give a notice under section 110(1)
Amendments 209 and 210 not moved.
Clause 112 agreed.
Amendment 210A not moved.
Clause 113 agreed.
Clause 114: Review and further notice under section 110(1)
Amendment 211 not moved.
Clause 114 agreed.
Clause 115: OFCOM’s guidance about functions under this Chapter
Amendments 212 and 213 not moved.
Clause 115 agreed.
Amendment 214 not moved.
Clauses 116 and 117 agreed.
Amendments 215 and 216 not moved.
Clause 118: Provisional notice of contravention
Amendments 216A to 216C not moved.
Clause 118 agreed.
Amendment 217
Moved by
217: After Clause 118, insert the following new Clause—
“Notice by OFCOM to payment-services providers and ancillary services
(1) Where OFCOM have issued a provisional notice of contravention to a regulated service, which specifies the person has failed, or is failing, to comply with a duty or requirement in section 72 (duties about regulated provider pornographic content), it must give notice of that fact to any payment-services provider or ancillary service.
(2) A notice under subsection (1) must—
(a) identify the regulated service in such manner as OFCOM considers appropriate,
(b) state whether the provisional notice of contravention relates to a duty under subsection (2) or (3) of section 72, or duties under both,
(c) give OFCOM’s reasons for their opinion that the regulated service has failed, or is failing, to comply with it, and
(d) provide such further particulars as OFCOM consider appropriate.
(3) When OFCOM give notice under this section, OFCOM must inform the regulated service, by notice, that they have done so.
(4) In this section—
“ancillary service” has the same meaning as in section 131(11);
“payment-services provider” means a person who appears to OFCOM to provide services, in the course of a business, which enable funds to be transferred in connection with the payment by any person for access to pornographic content made available on the internet by the regulated service;
“pornographic content” has the meaning given by section 70(2);
“provisional notice of contravention” has the same meaning as in section 118(1).”
Member’s explanatory statement
This new Clause requires OFCOM to notify payment-service providers and ancillary services of a regulated service which is found to have breached duties relating to pornographic content.
Lord Bethell Portrait Lord Bethell (Con)
- Hansard - - - Excerpts

My Lords, I will speak to Amendment 217 in my name. I express my deep gratitude to the noble Baronesses, Lady Benjamin and Lady Ritchie of Downpatrick, and the noble Lord, Lord Curry of Kirkharle, for adding their names in support. I will also address other amendments in this group that bring about business-disruption measures that enforce compliance with the important measures on pornography and harm that we have scrutinised already and will debate—briefly, I expect—at the end of today.

Amendment 217 is modest, but I believe it could make a big difference. It seeks to use the commercial interests of the pornography sites to change their behaviours by ensuring that their important supply chains are informed of breaches in regulations when they have happened. We know that this works because we have seen it work already. It has been widely reported that, at the start of December 2020, Pornhub, the famous porn site, said in its search bar that it was hosting 13.5 million clips. Then, on 14 December, that figure was dramatically reduced overnight to 5 million. What had happened was that Pornhub had removed two-thirds of the videos because of a decision by its payment companies, Visa and Mastercard, on 10 December, that they would withdraw payment services from Pornhub’s parent company, MindGeek.

That very important decision followed high-profile press reports, including in the New York Times, that Pornhub hosted vile videos of child abuse, rape and revenge pornography, and videos of people who had not consented to being recorded. These were illegal recordings—Mastercard said that its own investigation confirmed that the site was hosting illegal content. So, quite simply, the scrutiny of the nature of much of Pornhub’s content became too much for those payment companies. To protect themselves and to avoid being tarred by association, Visa and Mastercard had to act, which in turn meant that Pornhub had to act. This is the commercial reality of how the internet will be policed, whether we like it or not. It may well be that commercial interests can drive changes in behaviour much more quickly than blunt regulatory action. At the end of the day, I am interested just in measures that protect children, however they work—and this amendment facilitates effective action.

Payment and ancillary service providers can act in ways that Governments cannot easily do. The Bill could not require such actions as its duties extend only to the platforms themselves and the regulator, not ancillary services essential to the business model; but it can facilitate such interventions by making breaches of regulation transparent to the world. To enable this, the amendment would require Ofcom to notify financial and ancillary services of any breaches of regulations—no ifs or buts, no exemptions and no hiding the bad results. This notification is part and parcel of the process of issuing a provisional notice of contravention in any case, much like when Ofcom gives a notice under Section 110(1). The regulations say that

“OFCOM must carry out a review of the provider’s compliance with the notice”.

The discretion lies at the point of choosing to give the notice; everything that the process then entails is directed.

There is a significant limitation to this version of the amendment: it applies only to pornography providers covered by Part 5. That is deliberate. Of course, I would like to see it apply to all services with any pornographic content, which I hope will be included in changes that we will see in primary priority content. I will take a moment to flag to the Minister that amendments to this amendment may be needed if there are perhaps—I speak hopefully here—government amendments to the Bill that tweak the Part 3 and Part 5 distinctions before Report. Amendment 217 places no duties on providers of payment or ancillary services themselves; it simply gives them a right to be informed. It is about transparency and awareness, which are fundamental tenets of the Bill. For that reason, I very much hope that the Minister will commit to embracing this simple and proportionate measure.

This transparency measure becomes more pertinent and relevant when we look at other measures in this group, particularly those that introduce service-restriction measures. As other noble Lords will explain in more detail, I hope, these will allow Ofcom to require the supply chain of companies that support the internet industry—they are often reputable players that can be reached by our UK courts—to cut off essential support services to those who make transgressions. These might include services like hosting and search and, as I mentioned, payment companies like Mastercard and Visa. Without revenue from UK customers, there is little point in any service trying to find ways around access blocks.

13:30
Amendments 218D, 218F, 218J and 218L in this group seek to address the scale and speed of this action. We really should not apply any discount factor to the cost side of the business case based on the chances of being targeted for enforcement. We need to know that there is 100% compliance. These amendments put what the Government have already said on a legal footing. Ofcom should be able to make multiple applications simultaneously—and I note that my noble friend Lord Grade is here; he has spoken very movingly about the challenge faced by Ofcom in trying to create behaviour change in the internet.
To anticipate my noble friend the Minister, I note that in the Committee in the other place the then Minister Chris Philp said that
“procedures under the existing civil procedure rules already allow so-called multi-party claims to be made”.—[Official Report, Commons, Online Safety Bill Committee, 21/6/2022; col. 501.]
I make it clear that the provisions under the existing Civil Procedure Rules relating to so-called multi-party claims are designed for something other than what we consider today. There may be hundreds, thousands or even tens of thousands of interventions needed to get the sites that need to be sanctioned back within the law. That number of defendants or respondents would be unprecedented under such an administrative procedure, and would be another reason for legal challenge to the whole process.
We must also consider the practicalities of going to court before taking action. I support the case for judicial oversight before draconian measures are taken based on subjective decisions about how harmful a site is or how well it is protecting children from any other broader harms. That seems a very reasonable approach—but for something as clear-cut as child access to pornography, where the decisions are black and white, there needs to be no delay to action. We do not need to waste the court’s time protecting pornographers before every enforcement action. With these powers, Ofcom can avoid lengthy battles with well-funded, high-profile sites, battles that would risk it losing its well-founded reputation for effectiveness. Noble Lords should note that, earlier this month, France’s Digital Minister, Jean-Noël Barrot, announced new legislation to give its regulator, Arcom, the ability to block adult sites without going to court. That speaks very fulsomely of its experience in this area.
By way of conclusion, we know that the Government believe that the access and service restriction orders are a last resort, because they amount to, in effect, unplugging a website from the internet so that people in the United Kingdom cannot access them and so that supporting services such as payment services do not help them. These are very severe interventions, but it is precisely these dramatic measures that will be needed to bring pornographic sites back within the realms of reasonable behaviour of the kind that we expect in the real world.
If a provider outside the UK ignores letters and fines, these measures may well be the only possibility. Many pornography providers probably have absolutely no intention of even trying to comply with the kinds of regulations that are envisaged in the Bill. They are probably not based in the UK, are never going to pay a fine and are probably incorporated in some obscure offshore jurisdiction. Ofcom will need to use these powers in such circumstances, and on a bulk scale. We should not put that enforcement activity at risk of the legal stalling games that these sites will undoubtedly play. For that reason, I ask the Minister to commit to these changes by government amendment before Report next month.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I want to speak to Amendment 218JA in this group, in my name, to which the noble Baroness, Lady Morgan of Cotes, has added her name. This is really trying to understand what the Government’s intentions are in respect of access restriction orders.

Just to take a step back, in the Online Safety Bill we are creating, in effect, a licensing regime for in-scope services and saying that, if you want to operate in the United Kingdom and you are covered by the Bill—whether that is the pornography services that the noble Lord, Lord Bethell, referred to or a user-to-user or search service—here are the conditions to which you must adhere. That includes paying a fee to Ofcom for your supervision, and then following the many thousands of pages of guidance that I suspect we will end up producing and issuing to those companies. So what we are exploring here is what happens if a particular organisation decides not to take up the offer of a licence.

Again, to go back to the previous debate, success for the Bill would be that it has a sufficient deterrent effect that the problems that we are seeking to fix are addressed. I do not think we are looking to block services or for them to fail—we are looking for them to succeed, so stage one is that Ofcom asks them nicely. It says, “You want to operate in the UK, here is what you need to do—it’s a reasonable set of requests we are making”, and the services say, “Fine”. If not, they choose to self-limit—and it is quite trivial for any online service to say, “I’m going to check incoming traffic, and if this person looks like they are coming from the UK, I’m not going to serve them”. That is self-limiting, which is an option that would be preferable if a service chose not to accept the licence condition. But let us assume that it has accepted the licence condition, and Ofcom is going to be monitoring it on a routine basis—and if Ofcom thinks it is not meeting its requirements, whether that is to produce a risk assessment or to fulfil its duty of care, Ofcom will then instruct it to do something. If it fails to follow that instruction, we are in the territory of the amendments that we are considering here: either it has refused to accept the licence conditions and to self-limit, or it has accepted them but has failed to do what we expect it to do. It has signed up but does not take the regime seriously, and it is not doing the things that we expect it to do.

At that point, Ofcom has to consider what it can do. The first stage is quite right, in the group of clauses that we are looking at—Ofcom can bring in these business disruption measures. As the noble Lord, Lord Bethell, rightly pointed out, in many instances that will be effective. Any commercial service—not just pornography services, but an online service that depends on advertising—that is told that it can no longer take credit card payments from UK businesses to advertise on the service, will, one hopes, come into line and say, “That’s the end of my business in the UK—I may as well cut myself off”. But if it wants to operate, it will come into line, because that way it gets its payment services restored. But there will be others for which that is insufficient—perhaps that is not their business model—and they will carry on regardless. At that point, we may want to consider the access restrictions.

In a free society, none of us should take pleasure in the idea that we are going to instruct that internet services be blocked. That is not our first instinct; rather, it is potentially a necessary evil. At some point, there may be services that are so harmful and so oblivious to the regime that we put in place that we need to block them. Here we are trying to explore what would happen in those circumstances. The first kind of block is one that we are used to doing, and we do it today for copyright-infringing sites and a small number of other sites that break the law. We instruct service providers such as BT and TalkTalk to implement a network-level block. There are ways you can do that—various technical ways that we do not need to go into in this debate—whereby we can seek to make it so that an ordinary UK user, when they type in www.whatever, will not get to the website. But increasingly people are using technology that will work around that. Browsers, for example, may encrypt traffic between your web browser and the online service such that TalkTalk or BT or the access provider has no visibility as to where you are going and no capability of blocking it. BT has rightly raised that. There will be different views about where we should go with this, but the question is absolutely legitimate as to what the Government’s intentions are, which is what we want to try to tease out with this amendment.

Again, we should be really candid. Somebody who is determined to bypass all the access controls will do so. There is no world in which we can say that we can guarantee that somebody with a UK internet connection can never get to a particular website. What we are seeking to do is to make violating services unavailable for most of the people most of the time. We would be unhappy if it was only some of the people some of the time, but it is not going to be all of the people all of the time. So the question is: what constitutes a sufficient access restriction to either bring them to heel or to ensure that, over the longer term, the harm is not propagated, because these services are generally not made available? It would be really helpful if the Minister was able to tease that out.

Certainly, in my view, there are services such as TOR—the Onion Router—where there is no entity that you can ask to block stuff, so if someone was using that, there is nothing that you can reasonably do. At the other end of the spectrum, there are services such as BT and TalkTalk, where it is relatively straightforward to say to them that they should block. Then there are people in between, such as browser owners that are putting in place these encrypted tunnels for very good reasons, for privacy, but which can also add value-added stuff—helping to manage bandwidth better, and so on. Is it the Government’s intention that they are going to be served with access restriction orders? That is a valid question. We might have different views about what is the right solution, but it is really important for the sector that it understands and is able to prepare if that is the Government’s intention. So we need to tease that out; that is the area in which we are looking for answers from the Government.

The second piece is to think about the long term. If our prediction—or our hope and expectation—is that most companies will come into line, that is fine; the internet will carry on as it does today but in a safer way. However, if we have misjudged the mood, and a significant numbers of services just stick their thumb up at Ofcom and say, “We are not going to play—block us if you dare”, that potentially has significant consequences for the internet as it will operate in the United Kingdom. It would be helpful to understand from the Minister whether the Government have any projections or predictions as to which way we are going to go. Are we talking about the vast majority of the internet continuing as it is today within the new regime, with the odd player that will be outside that, or is it the Government’s expectation that there may need to be blocking of significant numbers of services, essentially for the foreseeable future?

Other countries such as France and Germany have been dealing with this recently, as the noble Lord, Lord Bethell, is probably aware. They have sought to restrict access to pornography services, and there have been all sorts of consequent knock-on effects and challenges at a technical level. It would be helpful to understand whether our expectation is that we will see the same in the United Kingdom or that something else is going to happen. If the Government do not have that information today, or if they have not made those projections, it would be helpful to know their thinking on where that might happen. Who will be able to inform us as to what the future landscape is likely to look like as it evolves, and as Ofcom gains these powers and starts to instruct companies that they must obtain licences, and then seeks to take enforcement action against those that choose not to play the game?

Lord Curry of Kirkharle Portrait Lord Curry of Kirkharle (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I support Amendment 217 in the name of the noble Lord, Lord Bethell, and very much support the comments that he has made. I will speak to Amendments 218C, 218E, 218H and 218K in my name within this group. I also support the intent of the other amendments in this group tabled by the noble Lord, Lord Bethell.

I appreciate the process helpfully outlined by the noble Lord, Lord Allan. However, when looking at Ofcom’s implementation of existing provisions on video-sharing platforms, the overwhelming impression is of a very drawn-out process, with Ofcom failing to hold providers to account. Despite being told by Ofcom that a simple tick-box declaration by the user confirming that they are over 18 is not sufficient age verification, some providers are still using only that system. Concerningly, Ofcom has not taken decisive action.

When children are at severe risk, it is not appropriate to wait. Why, for example, should we allow porn sites to continue to host 10 million child sexual abuse videos while Ofcom simply reports that it is continuing to partner with these platforms to get a road map of action together? As has been mentioned by the noble Lord, Lord Bethell, Visa and Mastercard did not think it was appropriate to wait in such circumstances—they just acted.

Similarly, when systems are not in place to protect children from accessing pornography, we cannot just sit by and allow all the egregious associated harms to continue. Just as in Formula 1, when a red flag is raised and the cars must stop and go into the pits until the dangerous debris is cleared, sometimes it is too dangerous to allow platforms to operate until the problems are fixed. It seems to me that platforms would act very swiftly to put effective systems and processes in place if they could not operate in the interim.

The Bill already contains this emergency handbrake; the question is when it should be used. My answer is that it should be used when the evidence of severe harm presents itself, and not only when the regulator has a moment of self-doubt that its “road maps”, which it is normally so optimistic about, will eventually fix the problem. Ofcom should not be allowed to sit on the evidence hoping, on a wing and a prayer, that things will fix themselves in the end.

13:45
Amendments 218C, 218E, 218H and 218K provide that Ofcom must—rather than may—apply to the court for an interim access restriction order should the conditions be met to do so. This is important, because it is only when we have a tough regulator that platforms will act.
When I moved previous amendments to the Bill, I mentioned my experience as chair, for six years, of the Better Regulation Executive. During that time, I learned that regulators that had a reputation for acting quickly and decisively, and for being tough, had a much more compliant base as a consequence. A compliant base in turn eases the regulatory burden, as regulators are not constantly having to bring forward provisional notices of contravention and expend resources trying to extract information from companies—and so Ofcom’s first option in the most serious situations would be to apply the brake immediately. This would have the potential to act as a powerful deterrent. If providers know that the regulator will pursue criminal action on those that breach service conditions, they are more likely to comply and make sure that there is no harmful material in the first place. If we settle for vague enforcement mechanisms, they will undoubtedly be open to abuse. Indeed, appropriate sanctions are impactful only if they are enforced decisively.
We have made immense progress in the development of this Bill in ensuring that children will be protected from pornographic and inappropriate content. We now have the responsibility to ensure that those who fail to comply with these measures face proportionate consequences. As regulator and sole enforcer of the Bill, Ofcom must be empowered to protect users online. In the spirit of willingness to respond positively, which the Minister has demonstrated already this afternoon, I hope that he will also do so with these amendments.
Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I will speak briefly to Amendment 218JA, spoken to by the noble Lord, Lord Allan. My name is attached to it online but has not made it on to the printed version. He introduced it so ably and comprehensively that I will not say much more, but I will be more direct with my noble friend the Minister.

This amendment would remove Clause 133(11). The noble Lord, Lord Allan, mentioned that BT has raised with us—I am sure that others have too—that the subsection gives examples of access facilities, such as ISPs and application stores. However, as the noble Lord said, there are other ways that services could use operating systems, browsers and VPNs to evade these access restriction orders. While it is convention for me to say that I would support this amendment should it be moved at a later stage, this is one of those issues that my noble friend the Minister could take off the table this afternoon—he has had letters about it to which there have not necessarily been replies—just by saying that subsection (11) does not give the whole picture, that there are other services and that it is misleading to give just these examples. Will he clarify at the Dispatch Box and on the record, for the benefit of everyone using the Bill now and in future, what broader services are caught? We could then take the issue off the table on this 10th day of Committee.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I will be even more direct than the noble Baroness, Lady Morgan, and seek some confirmation. I understood from our various briefings in Committee that, where content is illegal, it is illegal anywhere in the digital world—it is not restricted simply to user to user, search and Part 5. Can the Minister say whether I have understood that correctly? If I have, will he confirm that Ofcom will be able to use its disruption powers on a service out of scope, as it were, such as a blog or a game with no user-to-user aspect, if it were found to be persistently hosting illegal content?

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, this has been an interesting debate, though one of two halves, if not three.

The noble Lord, Lord Bethell, introduced his amendment in a very measured way. My noble friend Lady Benjamin really regrets that she cannot be here, but she strongly supports it. I will quote her without delivering her speech in its entirety, as we have been admonished for that previously. She would have said that

“credit card companies have claimed ignorance using the excuse of how could they be expected to know they are supporting porn if they were not responsible for maintaining porn websites … This is simply not acceptable”.

Noble Lords must forgive me—I could not possibly have delivered that in the way that my noble friend would have done. However, I very much took on board what the noble Lord said about how this makes breaches transparent to the credit card companies. It is a right to be informed, not an enforcement power. The noble Lord described it as a simple and proportionate measure, which I think is fair. I would very much like to hear from the Minister why, given the importance of credit card companies in the provision of pornographic content, this is not acceptable to the Government.

The second part of this group is all about effective enforcement, which the noble Lord, Lord Bethell, spoke to as well. This is quite technical; it is really important that these issues have been raised, in particular by the noble Lord. The question is whether Ofcom has the appropriate enforcement powers. I was very taken by the phrase

“pre-empt a possible legal challenge”,

as it is quite helpful to get your retaliation in first. Underlying all this is that we need to know what advice the Minister and Ofcom are getting about the enforcement powers and so on.

I am slightly more sceptical about the amendments from the noble Lord, Lord Curry. I am all in favour of the need for speed in enforcement, particularly having argued for it in competition cases, where getting ex-ante powers is always a good idea—the faster one can move, the better. However, restricting the discretion of Ofcom in those circumstances seems to me a bit over the top. Many of us have expressed our confidence in Ofcom as we have gone through the Bill. We may come back to this in future; none of us thinks the Bill will necessarily be the perfect instrument, and it may prove that we do not have a sufficiently muscular regulator. I entirely respect the noble Lord’s track record and experience in regulation, but Ofcom has so far given us confidence that it will be a muscular regulator.

I turn now to the third part of the group. I was interested in the context in which my noble friend placed enforcement; it is really important and supported by the noble Baroness, Lady Morgan. It is interesting what questions have been asked about the full extent of the Government’s ambitions in this respect: are VPNs going to be subject to these kinds of notices? I would hope so; if VPNs are really the gateway to some of the unacceptable harms that we are trying to prevent, we should know about that. We should be very cognisant of the kind of possible culture being adopted by some of the social media and regulated services, and we should tailor our response accordingly. I will be interested to hear what the Government have to say on that.

Baroness Merron (Lab)

My Lords, I am grateful to the noble Lords, Lord Bethell, Lord Curry and Lord Allan for introducing their amendments, to the noble Baroness, Lady Morgan, for her direct question, and to the noble Baroness, Lady Kidron, for her equally direct question. I am sure they will be of great assistance to the Minister when he replies. I will highlight the words of the noble Lord, Lord Allan, who said “We are looking for services to succeed”. I think that is right, but what is success? It includes compliance and enforcement, and that is what this group refers to.

The amendments introduced by the noble Lord, Lord Bethell, seek to strengthen what is already in the Bill about Ofcom’s Chapter 6 powers of enforcement, otherwise known as business disruption powers, and they focus on what happens in the event of a breach; they seek to be more prescriptive than what we already have. I am sure the Minister will remember that the same issue came up in the Digital Economy Bill, around the suggestion that the Government should take specific powers. There, the Government argued they had assurances from credit card companies that, if and when action was required, they would co-operate. In light of that previous discussion, it will be interesting to hear what the Minister has to say.

In respect of the amendments introduced by the noble Lord, Lord Curry, on the need to toughen up requirements on Ofcom to act, I am sure the Minister will say that these powers are not required and that the Bill already makes provision for Ofcom blocking services which are failing in their duties. I echo the concern of the noble Lord, Lord Clement-Jones, about being overly prescriptive and not allowing Ofcom to do its job. The truth is that Ofcom may need discretion but it also needs teeth, and I will be interested to hear what the Minister has to say about whether he feels, in the light of the debate today and other conversations, that there is sufficient toughness in the Bill and that Ofcom will be able to do the job it is required to do. There is an issue of the balance of discretion versus requirement, and I know he will refer to this. I will also be interested to hear from the Minister about the view of Ofcom with respect to what is in the Bill, and whether it feels that it has sufficient powers.

I will raise a final point about the amendments in the name of the noble Lord, Lord Curry. I think they ask a valid question about the level of discretion that Ofcom will have. I ask the Minister this: if, a few years down the line, we find that Ofcom has not used the powers suitably, despite clear failures, what would the Government seek to do? With that, I look forward to hearing from the Minister.

Lord Parkinson of Whitley Bay (Con)

My Lords, where necessary, the regulator will be able to apply to the courts for business disruption measures. These are court orders which will require third-party ancillary services and access facilities to withdraw their services from, or impede users’ access to, non-compliant regulated services. These are strong, flexible powers which will ensure that Ofcom can take robust action to protect users. At the same time, we have ensured that due process is followed. An application for a court order will have to specify the non-compliant provider, the grounds and evidence on which the application is based and the steps that third parties must take to withdraw services or block users’ access. Courts will consider whether business disruption measures are an appropriate way of preventing harm to users and, if an order is granted, ensure it is proportionate to the risk of harm. The court will also consider the interests of all relevant parties, which may include factors such as contractual terms, technical feasibility and the costs of the measures. These powers will ensure that services can be held to account for failure to comply with their duties under the Bill, while ensuring that Ofcom’s approach to enforcement is proportionate and upholds due process.

14:00
The proposed new clause in my noble friend Lord Bethell’s Amendment 217 appears to draw on provisions made in Part 3 of the Digital Economy Act 2017 for the age-verification regulator to notify payment service and ancillary service providers of non-compliant services. The noble Lord, Lord Clement-Jones, is right to point to the absence of the noble Baroness, Lady Benjamin; she is unavoidably absent today because of the Windrush Day commemorations, and I know she would have made her points in a similar but perhaps different style from her noble friend.
I am pleased to reassure her and my noble friend Lord Bethell that the enforcement powers in the Bill are stronger than those in the Digital Economy Act. Ofcom will be able to apply to the courts to require ancillary services and access facilities to withdraw their services from, or block users’ access to, non-compliant regulated services, rather than rely on the voluntary action of third parties as under the Digital Economy Act. Furthermore, Ofcom can publish the details of enforcement action and must publish details related to confirmation decisions and penalty notices. Ofcom can also require that the provider publish details of its enforcement action, or otherwise notify users of that action, or both. This will provide greater transparency to third parties and users about whether a service has been found to be non-compliant. Ofcom cannot require providers to publish provisional notices of contravention. The provider has the right to make representations to Ofcom before it issues a confirmation decision.
The amendments from the noble Lord, Lord Curry of Kirkharle, mandate that Ofcom must seek business disruption court orders in specific circumstances of non-compliance or breach. I want to reassure your Lordships that we have provided Ofcom with a robust range of enforcement powers to use against companies that fail to fulfil their duties, including in the circumstances described in the noble Lord’s amendments. Ofcom will be able to use those powers and sanctions according to what it deems to be the most effective way forward in each case, including issuing enforcement decisions that direct companies to take specific steps to come into compliance or remedy a breach, issuing fines of up to £18 million or 10% of global qualifying revenue—whichever is higher—and applying to the courts for business disruption measures. Ofcom will determine the most effective, proportionate and fair intervention on a case-by-case basis.
The circumstances in which business disruption measures can be sought are set out in Clauses 131 to 135—for example, where a regulated provider has failed to comply with any enforceable requirement, that failure is continuing and the provider has not complied with Ofcom’s confirmation decision, or where the risk of harm warrants an application. This provides both services and the regulator with clarity about when these measures could be used. The Government are confident that Ofcom will apply to the courts for business disruption measures where necessary and proportionate, but it is important that it remains for the independent regulator to determine when to use these powers.
Introducing mandatory requirements would undermine Ofcom’s independence and discretion to manage enforcement on a case-by-case basis. This would also frustrate Ofcom’s ability to regulate in a proportionate way and could make its enforcement processes unnecessarily punitive or inflexible. It could also overwhelm the courts if Ofcom is strictly forced to apply for business disruption measures where any grounds apply, even where the breach may be minor. Instead, Ofcom will act proportionately in performing its regulatory functions, targeting action where it is needed and adjusting timeframes as necessary. I am mindful that on the final day in Committee, the noble Lord, Lord Grade of Yarmouth, continues to be in his place, following the Committee’s deliberations very closely on behalf of the regulator.
Lord Allan of Hallam (LD)

I am reminded by my noble friend Lord Foster of Bath, particularly relating to the gambling sector, that some of these issues may run across various regulators that are all seeking business disruption. He reminded me that if you type into a search engine, which would be regulated and subject to business disruption measures here, “Casinos not regulated by GAMSTOP”, you will get a bunch of people who are evading GAMSTOP’s regulation. Noble Lords can imagine similar for financial services—something that I know the noble Baroness, Lady Morgan of Cotes, is also very interested in. It may not be for answer now, but I would be interested to understand what thinking the Government have on how all the different business disruption regimes—financial, gambling, Ofcom-regulated search services, et cetera—will all mesh together. They could all come before the courts under slightly different legal regimes.

Lord Parkinson of Whitley Bay (Con)

When I saw the noble Lord, Lord Foster of Bath, and the noble Baroness, Lady Armstrong of Hill Top, in their places, I wondered whether they were intending to raise these points. I will certainly take on board what the noble Lord says and, if there is further information I can furnish your Lordships with, I certainly will.

The noble Baroness, Lady Kidron, asked whether the powers can be used on out-of-scope services. “No” is the direct answer to her direct question. The powers can be used only in relation to regulated services, but if sites not regulated by the Bill are publishing illegal content, existing law enforcement powers—such as those frequently deployed in cases of copyright infringement—can be used. I could set out a bit more in writing if that would be helpful.

My noble friend Lord Bethell’s amendments seek to set out in the Bill that Ofcom will be able to make a single application to the courts for an order enabling business disruption measures that apply against multiple platforms and operators. I must repeat, as he anticipated, the point made by my right honourable friend Chris Philp that the civil procedure rules allow for a multi-party claim to be made. These rules permit any number of claimants or defendants and any number of claims to be covered by one claim form. The overriding objective of the civil procedure rules is that cases are dealt with justly and proportionately. I want to reassure my noble friend that the Government are confident that the civil procedure rules will provide the necessary flexibility to ensure that services can be blocked or restricted.

The amendment in the name of the noble Lord, Lord Allan of Hallam, seeks to clarify what services might be subject to access restriction orders by removing the two examples provided in the Bill: internet access services and application stores. I would like to reassure him that these are simply indicative examples, highlighting two kinds of service on which access restriction requirements may be imposed. It is not an exhaustive list. Orders could be imposed on any services that meet the definition—that is, a person who provides a facility that is able to withdraw, adapt or manipulate it in such a way as to impede access to the regulated service in question. This provides Ofcom with the flexibility to identify where business disruption measures should be targeted, and it future-proofs the Bill by ensuring that the power remains functional and effective as technologies develop.

As the noble Lord highlighted, these are significant powers that can require that services be blocked in the UK. Clearly, limiting access to services in this way substantially affects the business interests of the service in question and the interests of the relevant third-party service, and it could affect users’ freedom of expression. It is therefore essential that appropriate safeguards are included and that due process is followed. That is why Ofcom will be required to seek a court order to be able to use these powers, ensuring that the courts have proper oversight.

To ensure that due process is upheld, an application by the regulator for a court order will have to specify the non-compliant provider, the grounds of the order and the steps that Ofcom considers should be imposed on the third parties in order to withdraw services and block users’ access. These requirements will ensure that the need to act quickly to tackle harm is appropriately balanced against upholding fundamental rights.

It might be useful to say a little about how blocking works—

Lord Clement-Jones (LD)

Before the Minister does that, can he say whether he envisages that operating against VPNs as well?

Lord Parkinson of Whitley Bay (Con)

If I may, I will take advice on that and write to the noble Lord.

Lord Clement-Jones (LD)

That would be useful.

Lord Parkinson of Whitley Bay (Con)

Yes; he made a helpful point, and I will come back on it.

Lord Allan of Hallam (LD)

We share a common interest in understanding whether it would be used against VPNs, but we may not necessarily have the same view about whether it should be. Do not take that as an encouragement—take it as a request for information.

Lord Parkinson of Whitley Bay (Con)

I thank the noble Lord.

The term “blocking” is used to describe measures that will significantly impede or restrict access to non-compliant services—for example, internet service providers blocking websites or app stores blocking certain applications. These measures will be used only in exceptional circumstances, where the service has committed serious failures in meeting its duties and where no other action would reasonably prevent online harm to users in the UK.

My noble friend Lord Bethell’s Amendments 218F and 218L seek to ensure that Ofcom can request that an interim service or access restriction order endures for a period of six months in cases where a service hosts pornographic content. I reassure him that the court will already be able to make an order which can last up to six months. Indeed, the court’s interim order can have effect until either the date on which the court makes a service or access restriction order, or an expiry date specified by the court in the order. It is important that sanctions be determined on a case-by-case basis, which is why no limitations are set for these measures in the Bill.

As my noble friend knows, in the Bill there are clear duties on providers to ensure that children are not able to access pornography, which Ofcom will have a robust set of powers to enforce. It is important, however, that Ofcom’s powers and its approach to enforcement apply equally and consistently across the range of harms in scope of the Bill, rather than singling out one form of content in particular.

I hope that that is useful to noble Lords, along with the commitment to write on the further points which were raised. With that, I urge my noble friend to withdraw his amendment.

Lord Bethell (Con)

My Lords, to be honest, this debate has been an incredible relief to me. Here we have been taking a step away from some of the high-level conversations we had about what we mean by the internet and safety, looking at the far horizon, and instead looking at the moment when the Bill has real traction to try to change behaviours and improve the environment of the internet. I am extremely grateful to the Minister for his fulsome reply on a number of the issues.

The reason why it is so important is the two big areas where enforcement and compliance are going to be really tricky. First, there is Ofcom’s new relationship with the really big behemoths of the internet. It has a long tradition of partnership with big companies such as ITV, the radio sector—with the licensed authorities. However, of course it has licences, and it can pull them. I have worked for some of those companies, and it is quite a thing to go to see your regulator when you know that it can pull your licence. Obviously, that is within legal reason, but at the end of the day it owns your licence, and that is different to having a conversation where it does not.

The second class is the Wild West: the people living in open breach of regular societal norms who care not for the intentions of either the regulator, the Government or even mainstream society. Bringing those people back into reasonable behaviour will be a hell of a thing. My noble friend Lord Grade spoke, reasonably but with a degree of trepidation, about the challenge faced by Ofcom there. I am extremely grateful to the Minister for addressing those points.

Ofcom will step up to having a place next to the FCA and the MHRA. The noble Lord, Lord Curry, spoke about some of the qualities needed of one of the big three regulators. Having had some ministerial oversight of the MHRA, I can tell your Lordships that it has absolutely no hesitation about tackling big pharmaceutical companies and is very quick, decisive and clear. It wields a big stick—or, to use the phrase of the noble Baroness, Lady Merron, big teeth—in order to conduct that. That is why I ask the Minister just to keep in mind some of the recommendations embedded in these amendments.

The noble Baroness, Lady Kidron, mentioned illegal content, and I appreciate the candour of the Minister’s reply. However, business disruption measures offer an opportunity to address the challenge of illegal content, which is something that I know the Secretary of State has spoken about very interestingly, in terms of perhaps commissioning some kind of review. If such a thing were to happen, I ask that business disruption measures and some way of employing them might be brought into that.

We should look again at enforcement and compliance. I appreciate the Minister saying that it is important to let the regulator make some of these decisions, but the noble Lord, Lord Allan, was right: the regulator needs to know what the Government’s intentions are. I feel that we have opened the book on this, but there is still a lot more to be said about where the Government see the impact of regulation and compliance ending up. In all the battles in other jurisdictions—France, Germany, the EU, Canada, Louisiana and Utah—it all comes down to enforcement and compliance. We need to know more of what the Government hope to achieve in that area. With that, I beg leave to withdraw my amendment.

Amendment 217 withdrawn.
14:15
Clause 119: Requirements enforceable by OFCOM against providers of regulated services
Amendments 217A and 218 not moved.
Clause 119 agreed.
Clause 120: Confirmation decisions
Amendments 218ZZA to 218ZB not moved.
Clause 120 agreed.
Clauses 121 and 122 agreed.
Clause 123: Confirmation decisions: children’s access assessments
Amendment 218ZC not moved.
Clause 123 agreed.
Clause 124 agreed.
Clause 125: Confirmation decisions: penalties
Amendment 218ZD not moved.
Clause 125 agreed.
Amendment 218A
Moved by
218A: After Clause 125, insert the following new Clause—
“Confirmation decisions: offence
(1) A person to whom a confirmation decision is given commits an offence if, without reasonable excuse, the person fails to comply with a requirement imposed by the decision which—
(a) is of a kind described in section 121(1), and
(b) relates (whether or not exclusively) to a children’s online safety duty.
(2) A “children’s online safety duty” means a duty set out in—
(a) section 11(3)(a),
(b) section 11(3)(b),
(c) section 72(2), or
(d) section 72(3).
(3) A person who commits an offence under this section is liable—
(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);
(b) on summary conviction in Scotland, to imprisonment for a term not exceeding 12 months or a fine not exceeding the statutory maximum (or both);
(c) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding 6 months or a fine not exceeding the statutory maximum (or both);
(d) on conviction on indictment, to imprisonment for a term not exceeding 2 years or a fine (or both).”
Member’s explanatory statement
This amendment creates a new offence of failure to comply with requirements of a confirmation decision that relate to specified duties to protect children’s online safety.
Amendment 218A agreed.
Amendment 218B not moved.
Clause 126: Penalty for failure to comply with confirmation decision
Amendments 218BA and 218BB not moved.
Clause 126 agreed.
Clauses 127 and 128 agreed.
Clause 129: Information to be included in notices under sections 127 and 128
Amendment 218BC not moved.
Clause 129 agreed.
Clause 130 agreed.
Schedule 13 agreed.
Clause 131: Service restriction orders
Amendments 218C and 218D not moved.
Clause 131 agreed.
Clause 132: Interim service restriction orders
Amendments 218E to 218G not moved.
Clause 132 agreed.
Clause 133: Access restriction orders
Amendments 218H to 218JA not moved.
Clause 133 agreed.
Clause 134: Interim access restriction orders
Amendments 218K to 218M not moved.
Clause 134 agreed.
Clause 135 agreed.
Amendment 219 not moved.
Clauses 136 and 137 agreed.
Amendment 220 not moved.
Clause 138: OFCOM’s guidance about enforcement action
Amendments 220A to 220C not moved.
Clause 138 agreed.
Amendments 220D and 220E not moved.
Clause 139: Advisory committee on disinformation and misinformation
Amendments 221 to 224 not moved.
Clause 139 agreed.
Amendment 225 not moved.
Clause 140 agreed.
Clause 141: Research about users’ experiences of regulated services
Amendment 225A not moved.
Clause 141 agreed.
Clause 142 agreed.
Amendment 226 not moved.
Clause 143 agreed.
Clause 144: OFCOM’s reports about news publisher content and journalistic content
Amendment 227 not moved.
Clause 144 agreed.
Clause 145: OFCOM’s transparency reports
Amendment 228 not moved.
Clause 145 agreed.
Amendment 229 not moved.
House resumed. Committee to begin again not before 3.04 pm.

Online Safety Bill

Committee (10th Day) (Continued)
15:04
Clause 146: OFCOM’s report about researchers’ access to information
Amendment 230
Moved by
230: Clause 146, page 128, line 35, leave out from “publish” to end of line 36 and insert “an interim report within the period of three months beginning with the day on which this section comes into force, and a final report within the period of two years beginning on the day on which the interim report is published.”
Member’s explanatory statement
This amendment seeks to accelerate the process relating to Ofcom’s report on researchers’ access to information. Instead of simply requiring a report within two years of Clause 146 being brought into force, this amendment would require an interim report within three months, with a final report to follow two years after that.
Lord Knight of Weymouth (Lab)

My Lords, my noble friend Lord Stevenson, who tabled this amendment, unfortunately cannot be with us today as he is off somewhere drinking sherry, I hope.

This is an important set of amendments about researchers’ access to data. As I have previously said to the Committee, we need to ensure that Ofcom has the opportunity to be as trusted as possible in doing its job, so that we can give it as much flexibility as we can, and so that it can deal with a rapidly changing environment. As I have also said on more than one occasion, in my mind, that trust is built by the independence of Ofcom from Secretary of State powers; the ongoing and post-legislative scrutiny of Parliament, which is not something that we can deal with in this Bill; and, finally, transparency—and this group of amendments goes to that very important issue.

The lead amendment in this group, Amendment 230 in my noble friend Lord Stevenson’s name, seeks to accelerate the process relating to Ofcom’s report on researchers’ access to information. Instead of simply requiring a report within two years of Clause 146 being brought into force, this amendment would require an interim report within three months with a final report to follow two years later. Although it is the lead amendment in the group, I do not think it is the more significant because, in the end, it does not do much about the fundamental problem that we want to deal with in this group, which is the need to do better than just having a report. We need to ensure that there really is access by independent researchers.

Amendments 233 and 234 are, I think, of more significance. These proposed new clauses would assist independent researchers in accessing information and data from providers of regulated services. Amendment 233 would allow Ofcom itself to appoint researchers to undertake a variety of research. Amendment 234 would require Ofcom to issue a code of practice on researchers’ access to data; again, this is important so that the practical and legal difficulties for both researchers and service providers can be overcome through negotiation and consultation by Ofcom. Amendment 233A from the noble Lord, Lord Allan, which I am sure he will speak to in a moment, is helpful in clarifying that no data protection breach would be incurred by allowing the research access.

In many ways, there is not a huge amount more to say. When Melanie Dawes, the head of Ofcom, appeared before the Joint Committee on 1 November 2021—all that time ago—she said that

“tightening up the requirement to work with external researchers would be a good thing in the Bill”.

It is therefore a disappointment that, when the Bill was finally published after the Joint Committee’s consideration of the draft, there was not something more significant and more weighty than just a report. That is what we are trying to address, particularly now that we see, as an example, that Twitter is charging more than £30,000 a month for researchers’ access. That is quite a substantial rate in order for researchers to be able to do their work in respect of that platform. Others are restricting or obscuring some of the information that people want to be able to see.

This is a vital set of measures if this Bill is to be effective. These amendments go a long way towards where we want to get to on this; for the reasons I have set out around ensuring that there is transparency, they are vital. We know from the work of Frances Haugen that the platforms themselves are doing this research. We need that out in the open, we need Ofcom to be able to see it through independent researchers and we need others to be able to see it so that Parliament and others can continue to hold these platforms to account. Given that the Minister is in such a positive mood, I look forward to his positive response.

The Deputy Chairman of Committees (Baroness Barker) (LD)

My Lords, I must advise the Committee that if Amendment 230 is agreed to then I cannot call Amendment 231 because of pre-emption.

Lord Allan of Hallam (LD)

My Lords, we are reaching the end of our Committee debates, but I am pleased that we have some time to explore these important questions raised by the noble Lord, Lord Knight of Weymouth.

I have an academic friend who studies the internet. When asked to produce definitive answers about how the internet is impacting on politics, he politely suggests that it may be a little too soon to say, as the community is still trying to understand the full impact of television on politics. We are rightly impatient for more immediate answers to questions around how the services regulated by this Bill affect people. For that to happen, we need research to be carried out.

A significant amount of research is already being done within the companies themselves—both more formal research, often done in partnership with academics, and more quick-fix commercial analyses where the companies do their own studies of the data. These studies sometimes see the light of day through publication or quite often through leaks; as the noble Lord, Lord Knight, has referred to, it is not uncommon for employees to decide to put research into the public domain. However, I suggest that this is a very uneven and suboptimal way for us to get to grips with the impact on services. The public interest lies in there being a much more rigorous and independent body of research work, which, rightly, these amendments collectively seek to promote.

The key issues that we need to address head-on, if we are actively to promote more research, lie within the data protection area. That has motivated my Amendment 233A—I will explain the logic of it shortly—and is the reason why I strongly support Amendment 234.

A certain amount of research can be done without any access to personal data, bringing together aggregated statistics of what is happening on platforms, but the reality is that many of the most interesting research questions inevitably bring us into areas where data protection must be considered. For example, looking at how certain forms of content might radicalise people will involve looking at what individual users are producing and consuming and the relationships between them. There is no way of doing without it for most of the interesting questions around the harms we are looking at. If you want to know whether exposure to content A or content B led to a harm, there is no way to do that research without looking at the individual and the specifics.

There is a broad literature on how anonymisation and pseudonymisation techniques can be used to try to make those datasets a little safer. However, even if the data can be made safe from a technical point of view, that still leaves us with significant ethical questions about carrying out research on people who would not necessarily consent to it and may well disagree with the motivation behind the sorts of questions we may ask. We may want to see how misinformation affects people and steers them in a bad direction; that is our judgment, but the judgment of the people who use those services and consume that information may well be that they are entirely happy and there is no way on earth that they would consent to be studied by us for something that they perceive to be against their interests.

Those are real ethical questions that have to be asked by any researcher looking at this area. That is what we are trying to get to in the amendments—whether we can create an environment with that balance of equity between the individual, who would normally be required to give consent to any use of their data, and the public interest. We may determine that, for example, understanding vaccine misinformation is sufficiently important that we will override that individual’s normal right to choose whether to participate in the research programme.

My Amendment 233A is to Amendment 233, which rightly says that Ofcom may be in a position to say that, for example, vaccine misinformation is in the overriding public interest and we need research into it. If it decides to do that and the platforms transfer data to those independent researchers, because we have said in the amendment that they must, the last thing we want is for the platforms to feel that, if there is any problem further down the track, there will be comeback on them. That would be against the principle of natural justice, given that they have been instructed to hand the data over, and could also act as a barrier.

15:15
I am raising these concerns because they are not far-fetched; however well-intentioned somebody is and however well they think they are doing data security, the reality of today’s world is that there are data breaches. Once you have given the data over, at some point some independent researcher is going to have a dataset compromised, and Ofcom itself may be in possession of data that is going to be compromised. Amendment 233A seeks to clarify that, in those circumstances, we are not going to go after the company.
People may be aware of a case involving my former employer and a company called Cambridge Analytica, and if you look at the fallout from that case, some of the decisions that were made pointed to the notion that the first party which originally collected the data can almost never say that they are no longer liable; any transfer to a third party carries their liability with it. That is reasonable in most cases; if, for commercial reasons, you are passing data on to somebody else, that is fine. However, in the circumstances where we have said the regulator is going to insist that they provide the data for a valid public purpose, I do not think we should be holding them liable if something goes wrong downstream—that is the rationale for Amendment 233A.
That brings me on to Amendment 234, which is a good way of trying to address the problem more generally. Sometimes there is an assumption that research is good and companies are bad: “Hand over the data and good stuff will happen”. There is a variable community of companies and a variable community of researchers, in terms of the confidence we can have in them to maintain data security and privacy. Having some kind of formal mechanism to approve researchers, and for researchers to sign up to, is extraordinarily helpful.
I refer noble Lords to the work done by the European Digital Media Observatory—this is one of those declarations of interests that is really a confession of expertise. I was on the board of the European Digital Media Observatory, for which I had no remuneration as it was done as a bit of penance having worked in the sector. As part of my penance, I felt I should be helping bodies that try to deal with the misinformation issue. The European Digital Media Observatory is a European Commission-sponsored body trying to deal with these exact questions, asking how we enable more research to happen, looking at misinformation in the context of the EU. It did some excellent work led by Dr Rebekah Tromble, an academic at George Washington University, who convened a working group which has come up with a code of practice that is going through the EU process. As long as we are not divergent from the general data protection regulation, it would have significant applicability here.
The real benefit of such an approach is that everyone knows what they are supposed to do, and we can objectively test whether or not they are doing it: the party that collected the data and handed it over; and the party that receives the data and does the research—everyone has very clear roles and responsibilities. By doing that, we unlock the flows, which is what we want to do collectively in these amendments: we want the data to flow from the regulated services to the independent researchers.
I am not arguing that this will necessarily solve all the problems, but it will certainly flush out whether, when services say they cannot provide data for research, that is a “cannot” or “will not”. Today, they can say they cannot for data protection legal reasons—I think with some justification. If we have the code of conduct in place as proposed in Amendment 234, and the researchers are approved researchers who have signed up to it and committed to doing all the right things, then it is much more reasonable for us to say “Platform, meet researcher; researcher, meet platform—you all know your responsibilities, and there are no legal barriers”, and to expect the data to move in a way that will meet those public interest obligations.
This is an important set of amendments which we are coming to quite late in the day. They touch on some issues that are being dealt with elsewhere, and I hope this is one example where we will feel comfortable learning from the EU, which is a little bit ahead in terms of trying to deal with some of these questions, working within a framework which is still, from a data protection law point of view at least, a pretty consistent framework between us and them.
Baroness Fox of Buckley (Non-Afl)

My Lords, Amendments 233 and 234 from the noble Lord, Lord Knight of Weymouth, were well motivated, so I will be brief. I just have a couple of queries.

First, we need to consider what the criteria are for who is considered worthy of the privileged status of receiving Ofcom approval as a researcher. We are discussing researchers as though they are totally reliable and trustworthy. We might even think that if they are academic researchers, they are bound to be. However, there was an interesting example earlier this week of confirmation bias leading to mistakes when King’s College had to issue a correction to its survey data that was used in the BBC’s “Marianna in Conspiracyland”. King’s College admitted that it had wildly overestimated the numbers of those reading the conspiracy newspaper The Light, and wildly overestimated the numbers of those attending what it dubbed conspiracy demonstrations. By the way, BBC Verify has so far failed to verify the mistake it repeated. I give this example not as a glib point but because we cannot just say that because researchers are accredited elsewhere they should just be allowed in. I also think that the requirement to give the researchers

“all such assistance as they may reasonably require to carry out their research”

sounds like a potentially very time-consuming and expensive effort.

The noble Lord, Lord Allan of Hallam, raised points around “can’t” or “won’t”, and whether this means researchers “must” or “should”, and who decides whether it is ethical that they “should” in all instances. There are ethical questions here that have been raised. Questions of privacy are not trivial. Studying individuals as specimens of “badthink” or “wrongthink” might appear in this Committee to be in the public interest but without the consent of people it can be quite damaging. We have to decide which questions fulfil the public interest so sufficiently that consent could be overridden in that way.

I do not think this is a slam-dunk, though it looks like a sensible point. I do not doubt that all of us want more research, and good research, and data we can use in arguments, whatever side we are on, but it does not mean we should just nod something through without at least pausing.

Lord Bethell (Con)

My Lords, I declare an interest as a trustee of the International Centre for the Study of Radicalisation at the War Studies department of King’s College London. That is somewhere that conducts research using data of the kind addressed in this group, so I have a particular interest in it.

We know from the kind of debates that the noble Lord, Lord Knight, referred to that it is widely accepted that independent researchers benefit hugely from access to relevant information from service providers to research online safety matters. That is why my Amendment 234, supported by the noble Lords, Lord Clement-Jones and Lord Knight, aims to introduce an unavoidable mandatory duty for regulated platforms to give access to that data to approved researchers.

As the noble Lord, Lord Knight, said, there are three ways in which this would be done. First, the timeframe for Ofcom’s report would be accelerated; secondly, proposed new Clause 147 would allow Ofcom to appoint the researchers; and, thirdly, proposed new Clause 148 would require Ofcom to write a code of practice on data access, setting up the fundamental principles for data access—a code which, by the way, should answer some of the concerns quite reasonably voiced by the noble Baroness, Lady Fox.

The internet is absolutely the most influential environment in our society today, but it is a complete black box, and we have practically no idea what is going on in some of the most important parts of it. That has a terrible impact on our ability to devise sensible policies and mitigate harm. Instead, we have a situation where the internet companies decide who accesses data, how much of it and for what purposes.

In answer to his point, I can tell the noble Lord, Lord Allan, who they give the data to—they give it to advertisers. I do not know if anyone has bought advertising on the internet, but it is quite a chilling experience. You can find out a hell of a lot about quite small groups of people if you are prepared to pay for the privilege of trying to reach them with one of your adverts: you can find out what they are doing in their bedrooms, what their mode of transport is to get to work, how old they are, how many children they have and so on. There is almost no limit to what you can find out about people if you are an advertiser and you are prepared to pay.

In fact, only the companies themselves can see the full picture of what goes on on the internet. That puts society and government at a massive disadvantage and makes policy-making virtually impossible. Noble Lords should be in no doubt that these companies deliberately withhold valuable information to protect their commercial interests. They obfuscate and confuse policymakers, and they protect their reputations from criticism about the harms they cause by withholding data. One notable outcome of that strategy is that it has taken years for us to be here today debating the Online Safety Bill, precisely because policy-making around the internet has been so difficult and challenging.

A few years ago, we were making some progress on this issue. I used to work with the Institute for Strategic Dialogue using CrowdTangle, a Facebook product. It made a big impact. We were working on a project on extremism, and having access to CrowdTangle revolutionised our understanding of how the networks of extremists that were emerging in British politics were coming together. However, since then, platforms have gone backwards a long way and narrowed their data-sharing. The noble Lord, Lord Knight, mentioned that CrowdTangle has essentially been closed down, and Twitter has basically stopped providing its free API for researchers—it charges for some access but even that is quite heavily restricted. These retrograde steps have severely hampered our ability to gather the most basic data from otherwise respectable and generally law-abiding companies. It has left us totally blind to what is happening on the rest of the internet—the bit beyond the nice bit; the Wild West bit.

Civil society plays a critical role in identifying harmful content and bad behaviour. Organisations such as the NSPCC, the CCDH, the ISD—which I mentioned—the Antisemitism Policy Trust and King’s College London, with which I have a connection, prove that their work can make a really big difference.

It is not as though other parts of our economy or society have the same approach. In fact, in most parts of our world there is a mixture of public, regulator and expert access to what is going on. Retailers, for instance, publish what is sold in our shops. Mobile phones, hospitals, banks, financial markets, the broadcast media—they all give access, both to the public and to their regulators, to a huge amount of data about what is going on. Once again, internet companies are claiming exceptional treatment—that has been a theme of debates on the Online Safety Bill—as if what happens online should, for some reason, be different from what happens in the rest of the world. That attitude is damaging the interests of our country, and it needs to be reversed. Does anyone think that the FSA, the Bank of England or the MHRA would accept this state of affairs in their regulated market? They absolutely would not.

Greater access to and availability of data and information about systems and processes would hugely improve our understanding of the online environment and thereby protect the innovation, progress and prosperity of the sector. We should not have to wait for Ofcom to be able to identify new issues and then appoint experts to look at them closely; there should be a broader effort to be in touch with what is going on with the internet. It is the nature of regulation that Ofcom will heavily rely on researchers and civil society to help enforce the Online Safety Bill, but this can be achieved only if researchers have sufficient access to data.

As the noble Lord, Lord Allan, pointed out, legislators elsewhere are making progress. The EU’s Digital Services Act gives a broad range of researchers access to data, including civil society and non-profit organisations dedicated to public interest research. The DSA sets out a framework for vetting and access procedures in detail, as the noble Baroness, Lady Fox, rightly pointed out, creating an explicit role for new independent supervisory authorities and digital services co-ordinators to manage that process.

Under Clause 146, Ofcom must produce a report exploring such access within two years of that section of the Bill coming into effect. That is too long. There is no obligation on the part of the regulator or service providers to take this further. No arguments have been put forward for this extended timeframe or relative uncertainty. In contrast, the arguments to speed up the process are extremely persuasive, and I invite my noble friend the Minister to address those.

15:30
To anticipate the Minister, I will just say that the skilled persons reports created in Clause 94 give Ofcom the helpful power to appoint a skilled person to provide a report, assisting Ofcom to identify and assess a failure by a regulated service. This will be an essential tool for Ofcom to access external expertise. However, it does not create a broader ecosystem of inspection, study and accountability that includes academics and civil society institutions with the capability and expertise to reach into the data and identify the harms of the platforms and their broader effects on society. That is why these measures need to be in the Bill. Given the Minister’s good mood today, I invite him to adopt wholesale these amendments in time for Report.
Baroness Kidron (CB)

My Lords, I too want to support this group of amendments, particularly Amendment 234, and will make just a couple of brief points.

First, one of the important qualities of the online safety regime is transparency, and this really speaks to that point. It is beyond clear that we are going to need all hands on deck, and again, this speaks to that need. I passionately agree with the noble Baroness, Lady Fox, on this issue and ask, when does an independent researcher stop being independent? I have met quite a lot on my journey who suddenly find ways of contributing to the digital world other than their independent research. However, the route described here offers all the opportunities to put those balancing pieces in place.

Secondly, I am very much aware of the fear of the academics in our universities. I know that a number of them wrote to the Secretary of State last week saying that they were concerned that they would be left behind their colleagues in Europe. We do not want to put up barriers for academics in the UK. We want the UK to be at the forefront of governance of the digital world; this amendment speaks to that, and I see no reason for the Government to reject it.

Finally, I want to emphasise the importance of research. Revealing Reality did research for 5Rights called Pathways, in which it built avatars for real children and revealed the recommendation loops in action. We could see how children were being offered self-harm, suicide, extreme diets and livestream porn within moments of them arriving online. Frances Haugen has already been mentioned. She categorically proved what we have been asserting for years, namely that Instagram impacts negatively on teenage girls. As we put this regime in place, it is not adequate to rely on organisations that are willing to work in the grey areas of legality to get their research or on whistleblowers—on individual acts of courage—to make the world aware.

One of the conversations I remember happened nearly five years ago, when the then Secretary of State asked me what the most important thing about the Bill was. I said, “To bring a radical idea of transparency to the sector”. This amendment goes some way to doing just that.

Baroness Harding of Winscombe (Con)

My Lords, I, too, support Amendments 233 and 234, and Amendment 233A, from the noble Lord, Lord Allan. As the noble Baroness, Lady Kidron, said, it has been made clear in the past 10 days of Committee that there is a role for every part of society to play to make sure that we see the benefits of the digital world but also mitigate the potential harms. The role that researchers and academics can play in helping us understand how the digital world operates is critical—and that is going to get ever more so as we enter a world of large language models and AI. Access to data in order to understand how digital systems and processes work will become even more important—next week, not just in 10 years’ time.

My noble friend Lord Bethell quite rightly pointed out the parallels with other regulators, such as the MHRA and the Bank of England. A number of people are now comparing the way in which the MHRA and other medical regulators regulate the development of drugs with how we ought to think about the emergence of regulation for AI. This is a very good read-across: we need to set the rules of the road for researchers and ensure, as the noble Baroness, Lady Kidron, said—nailing it, as usual—that we have the most transparent system possible, enabling people to conduct their research in the light, not in the grey zone.

Lord Clement-Jones (LD)

My Lords, as the noble Baroness, Lady Kidron, said, clearly, transparency is absolutely one of the crucial elements of the Bill. Indeed, it was another important aspect of the Joint Committee’s report. Like the noble Lord, Lord Knight—a fellow traveller on the committee—and many other noble Lords, I much prefer the reach of Amendments 233 and 234, tabled by the noble Lord, Lord Bethell, to Amendment 230, the lead amendment in this group.

We strongly support amendments that aim to introduce a duty for regulated platforms to enable access by approved independent researchers to information and data from regulated services, under certain conditions. Of course, there are arguments for speeding up the process under Clause 146, but this is really important because companies themselves currently decide who accesses data, how much of it and for what purposes. Only the companies can see the full picture, and the effect of this is that it has taken years to build a really solid case for this Online Safety Bill. Without a greater level of insight, enabling quality research and harm analysis, policy-making and regulatory innovation will not move forward.

I was very much taken by what the noble Baroness, Lady Harding, had to say about the future in terms of the speeding up of technological developments in AI, which inevitably will make the opening up of data, and research into it, of greater and greater importance. Of course, I also take extremely seriously my noble friend’s points about the need for data protection. We are very cognisant of the lessons of Cambridge Analytica, as he mentioned.

It is always worth reading the columns of the noble Lord, Lord Hague. He highlighted this issue last December, in the Times. He said:

“Social media companies should be required to make anonymised data available to third-party researchers to study the effect of their policies. Crucially, the algorithms that determine what you see—the news you are fed, the videos you are shown, the people you meet on a website—should not only be revealed to regulators but the choices made in crafting them should then be open to public scrutiny and debate”.


Those were very wise words. The status quo leaves transparency in the hands of big tech companies with a vested interest in opacity. The noble Lord, Lord Knight, mentioned Twitter announcing in February that it would cease allowing free research access to its application programming interface. It is on a whim that a billionaire owner can decide to deny access to researchers.

I much prefer Amendment 233, which would enable Ofcom to appoint an approved independent researcher. The Ofcom code of practice proposed in Amendment 234 would be issued for researchers and platforms, setting out the procedures for enabling access to data. I take the point made by the noble Baroness, Lady Fox, about who should be an independent accredited researcher, but I hope that that is exactly the kind of thing that a code of practice would deal with.

Just as a little contrast, Article 40 of the EU’s Digital Services Act gives access to data to a broad range of researchers—this has been mentioned previously—including civil society and non-profit organisations dedicated to public interest research. The DSA sets out in detail the framework for vetting and access procedures, creating an explicit role for new independent supervisory authorities. This is an example that we could easily follow.

The noble Lord, Lord Bethell, mentioned the whole question of skilled persons. Like him, I do not believe that this measure is adequate as a substitute for what is contained in Amendments 233 and 234. It will be a useful tool for Ofcom to access external expertise on a case-by-case basis but it will not provide for what might be described as a wider ecosystem of inspection and analysis.

The noble Lord also mentioned the fact that internet companies should not regard themselves as an exception. Independent scrutiny is a cornerstone of the pharmaceutical, car, oil, gas and finance industries. They are open to scrutiny from research; we should expect that for social media as well. Independent researchers are already given access in many other circumstances.

The case for these amendments has been made extremely well. I very much hope to see the Government, with the much more open approach that they are demonstrating today, accept the value of these amendments.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the Government are supportive of improving data sharing and encouraging greater collaboration between companies and researchers, subject to the appropriate safeguards. However, the data that companies hold about users can, of course, be sensitive; as such, mandating access to data that are not publicly available would be a complex matter, as noble Lords noted in their contributions. The issue must be fully thought through to ensure that the risks have been considered appropriately. I am grateful for the consideration that the Committee has given this matter.

It is because of this complexity that we have given Ofcom the task of undertaking a report on researchers’ access to information. Ofcom will conduct an in-depth assessment of how researchers can currently access data. To the point raised by the noble Lord, Lord Knight, and my noble friend Lord Bethell, let me provide reassurance that Ofcom will assess the impact of platforms’ policies that restrict access to data in this report, including where companies charge for such access. The report will also cover the challenges that constrain access to data and how such challenges might be addressed. These insights will provide an evidence base for any guidance that Ofcom may issue to help improve data access for researchers in a safe and secure way.

Amendments 230 and 231 seek to require Ofcom to publish a report into researchers’ access to data more rapidly than within the currently proposed two years. I share noble Lords’ desire to develop the evidence base on this issue swiftly, but care must be taken to balance Ofcom’s timelines to ensure that it can deliver its key priorities in establishing the core parts of the regulatory framework that the Bill will bring in; for example, the illegal content and child safety duties. Implementing these duties must be the immediate priority for Ofcom to ensure that the Bill meets its objective of protecting people from harm. It is crucial that we do not divert attention away from these areas and that we allow Ofcom to carry out this work as soon as is practicable.

Further to this, considering the complex matter of researchers’ access to data will involve consultation with interested parties, such as the Information Commissioner’s Office, the Centre for Data Ethics and Innovation, UK Research and Innovation, representatives of regulated services and others—including some of those parties mentioned by noble Lords today—as set out in Clause 146(3). This is an extremely important issue that we need to get right. Ofcom must be given adequate time to consult as it sees necessary and undertake the appropriate research.

15:45
Amendments 233 to 234 would introduce a framework for approving independent researchers’ access to data and require Ofcom to issue a code of practice about it. Again, I am sympathetic to the intention, but it is vital that we understand and can fully mitigate the risks of mandating researchers’ access to data which are not publicly available before implementing such a framework. For example, there will be questions of privacy, the protection of personal data, user consent and the disclosure of commercially sensitive information. As the noble Baroness, Lady Fox, pointed out, there are challenges involved in defining who is an independent researcher, particularly as we seek to ensure that such a framework could not be exploited by bad actors.
Exempting service providers from data protection legislation for the purposes of facilitating researcher access to information poses a number of risks. Making derogations from data protection legislation, outside that legislation itself, risks undermining the coherence of the data protection framework. Ofcom’s report on researchers’ access to information will develop the evidence base on current access to information, the challenges associated with accessing it and how these challenges might be overcome.
As we have discussed previously in Committee, researchers will already have access to the publicly available transparency reports which major platforms will be required to publish annually. They may use these data to conduct research into online harms. Additionally, Ofcom will be required to undertake research into UK users’ opinions and experiences relating to regulated services and will have the power to undertake online safety research more broadly. This will help Ofcom and the wider public to understand the experiences of users online and will help to inform policy-making. Ofcom will have wide-ranging powers to require information from companies to support its research activities. Companies will not be able to withhold relevant online safety data when they are required of them by Ofcom. If they do, they will face enforcement measures, including significant fines and the liability of senior managers.
As noted by noble Lords, Ofcom will also have the power to require a report from a skilled person about a regulated service. Ofcom may appoint an independent researcher as the skilled person. This power may be used to assist Ofcom in identifying and assessing non-compliance and, where Ofcom considers there to be a risk of failure, to develop its understanding of the risk. Amendment 234 requires Ofcom to issue a code of practice on researchers’ access to data. I reassure the Committee that following the publication of its report, Ofcom will have the power to issue guidance to providers and researchers about how to facilitate data sharing for research in a safe and responsible way.
In summary, the regulatory framework’s focus on transparency will improve the data which are publicly available to researchers, while Ofcom’s report on the issue will enable the development of the evidence base before further action is considered. At the risk of disappointing noble Lords about the more open-minded attitudes today—
Lord Knight of Weymouth (Lab)

Before the Minister succeeds in disappointing us, can he clarify something for us? Once Ofcom has published the report, it has the power to issue guidance. What requirement is there for platforms to abide by that guidance? We want there to be some teeth at the end of all this. There is a concern that a report will be issued, followed by some guidance, but that nothing much else will happen.

Lord Parkinson of Whitley Bay (Con)

It is guidance rather than direction, but it will be done openly and transparently. Users will be able to see the guidance which Ofcom has issued, to see whether companies have responded to it as they see fit and, through the rest of the framework of the Bill, be empowered to make their decisions about their experiences online. This being done openly and transparently, and informed by Ofcom’s research, will mean that everyone is better informed.

We are sympathetic to the amendment. It is complex, and this has been a useful debate—

Baroness Kidron (CB)

I wonder whether the Minister has an answer to the academic community, who now see their European colleagues getting ahead through being able to access data through other legislation in other parts of the world. Also, we have a lot of faith in Ofcom, but it seems a mistake to let it be the only arbiter of what needs to be seen.

Lord Parkinson of Whitley Bay (Con)

We are very aware that we are not the only jurisdiction looking at the important issues the Bill addresses. The Government and, I am sure, academic researchers will observe the implementation of the European Union’s Digital Services Act with interest, including the provisions about researchers’ access. We will carefully consider any implications for our own online safety regime. As noble Lords know, the Secretary of State will be required to undertake a review of the framework between two and five years after the Bill comes into force. We expect that to include an assessment of how the Bill’s existing transparency provisions facilitate researcher access.

Lord Allan of Hallam (LD)

I do not expect the Minister to have an answer to this today, but it will be useful to get this on the record as it is quite important. Can he let us know the Government’s thinking on the other piece of the equation? We are getting the platforms to disclose the data, and an important regulatory element is the research organisations that receive it. In the EU, that is being addressed with a code of conduct, which is a mechanism enabled by the general data protection regulation that has been approved by the European Data Protection Board and creates this legal framework. I am not aware of equivalent work having been done in the UK, but that is an essential element. We do not want to find that we have the teeth to persuade the companies to disclose the data, but not the other piece we need—probably overseen by the Information Commissioner’s Office rather than Ofcom—which is a mechanism for approving researchers to receive and then use the data.

Lord Parkinson of Whitley Bay (Con)

We are watching with interest what is happening in other jurisdictions. If I can furnish the Committee with any information in the area the noble Lord mentions, I will certainly follow up in writing.

Lord Clement-Jones (LD)

I have a question, in that case, in respect of the jurisdictions. Why should we have weaker powers for our regulator than others?

Lord Parkinson of Whitley Bay (Con)

I do not think that we do. We are doing things differently. Of course, Ofcom will be looking at all these matters in its report, and I am sure that Parliament will have an ongoing interest in them. As jurisdictions around the world continue to grapple with these issues, I am sure that your Lordships’ House and Parliament more broadly will want to take note of those developments.

Lord Clement-Jones (LD)

But surely, there is no backstop power. There is the review but there is no backstop which would come into effect on an Ofcom recommendation, is there?

Lord Parkinson of Whitley Bay (Con)

We will know once Ofcom has completed its research and examination of these complex issues; we would not want to pre-judge its conclusions.

Lord Clement-Jones (LD)

Again, that would require primary legislation.

Lord Parkinson of Whitley Bay (Con)

With that, if there are no further questions, I invite the noble Lord to withdraw his amendment.

Lord Knight of Weymouth (Lab)

My Lords, this was a short but important debate with some interesting exchanges at the end. The noble Baroness, Lady Harding, mentioned the rapidly changing environment generated by generative AI. That points to the need for wider, ecosystem-level research on an independent basis—wider than we fear we might get as things stand, and certainly wider than the skilled persons we are already legislating for. The noble Lord, Lord Bethell, referred to the access that advertisers already have to insight. It seems a shame that we run the risk, as the noble Baroness, Lady Kidron, pointed out, of researchers in other jurisdictions having more privileged access than researchers in this country, leaving us dependent on those researchers and whistleblowers to give us that wider view. We could proceed with a report and guidance as set out in the Bill but add in some reserved powers in order to take action if the report suggests that Ofcom might need and want that. The Minister may want to reflect on that, having listened to the debate. On that basis, I am happy to beg leave to withdraw the amendment.

Amendment 230 withdrawn.
Amendment 231 not moved.
Clause 146 agreed.
Amendments 232 to 234 not moved.
Clause 147 agreed.
Amendments 235 to 241 not moved.
Amendment 242
Moved by
242: Before Clause 148, insert the following new Clause—
“General procedure
(1) An appeal to the Upper Tribunal under section 148 or 149 must be commenced by sending a notice of appeal to the court.
(2) The notice of appeal must set out the grounds of appeal in sufficient detail to indicate—
(a) under which provision of this Act the appeal is to be brought;
(b) to what extent (if any) the appellant contends that the decision against, or with respect to which, the appeal is brought was based on an error of fact or was wrong in law; and
(c) to what extent (if any) the appellant is appealing against OFCOM’s exercise of its discretion in making the disputed decision.
(3) The Upper Tribunal may give an appellant leave to amend the grounds of appeal identified in the notice of appeal.”
Member’s explanatory statement
This amendment introduces additional procedural steps to be followed when the Upper Tribunal considers an appeal under Clauses 148 and 149.
Baroness Merron (Lab)

My Lords, I am pleased to speak to Amendments 242, 243 and 245, which have been tabled in the name of my noble friend Lord Stevenson. The intention of this group is to probe what we consider to be an interesting if somewhat niche area, and I hope the Minister will take it in that spirit.

To give the Committee some idea of the background to this group, when Ofcom was originally set up and was mainly dealing with mobile and fixed telephony cartels, it had a somewhat torrid time, if I can describe it that way. Just about every decision it took was challenged in the courts on the so-called merits of the respective cases and on its powers, as the companies taking it to court had many resources they could call upon. That very much held up Ofcom’s progress and, of course, incurred major costs.

Prior to the Digital Economy Act, the worst of the experiences of this period were over, but Ofcom managed to persuade the Government that challenges made by companies in scope of Ofcom would in future be based on judicial review, rather than on merits. In other words, the test was whether Ofcom had acted within its powers and had not acted irrationally. An area of concern to a number of companies is who can challenge the regulator, even if it is acting within its powers, if it gets it wrong in the eyes of said companies. Perhaps the Minister will reflect on that.

This group of amendments is intended to provide better protections for service providers, their users and the wider public, alongside processes that should mean fewer delays and greater efficiency. The Competition Act 1998 permits appeals of Ofcom’s decisions to be made additionally on account of an error of fact, an error of law or an error of the exercise of its discretion.

16:00
The current wording of the Bill permits challenge only by way of judicial review of Ofcom’s decisions, which habitually leads to somewhat prolonged and drawn-out litigation in a process that, through judicial review, could take some nine to 18 months. That means that no party is able to challenge the evidence or the existence of a factual error. In light of their sensitive nature and the significant impact that detection orders may have on a large number of users of any particular service, these amendments would be something of a proportionate step to permit challenge on an extended basis. I look forward to hearing the response from the Minister, and I beg to move.
Lord Clement-Jones (LD)

My Lords, I congratulate the noble Baroness on having elucidated this arcane set of amendments. Unfortunately, though, it makes me deeply suspicious when I see what the amendments seem to do. I am not entirely clear about whether we are returning to some kind of merits-based appeal. If so, since the main litigators are going to be the social media companies, it will operate for their benefit to reopen every single thing that they possibly can on the basis of the original evidence that was taken into account by Ofcom, as opposed to doing it on a JR basis. It makes me feel quite uncomfortable if it is for their benefit, because I suspect it is not going to be for the ordinary user who has been disadvantaged by a social media company. I hope our brand spanking new independent complaints system—which the Minister will no doubt assure us is well on the way—will deal with that, but this strikes me as going a little too far.

Baroness Kidron (CB)

My Lords, I enter the fray with some trepidation. In a briefing, Carnegie, which we all love and respect, and which has been fantastic in the background in Committee days, shared some concerns. As I interpret its concerns, when Ofcom was created in 2003 its decisions could be appealed on their merits, as the noble Lord has just suggested, to the Competition Appeal Tribunal, and I believe that this was seen as a balancing measure against an untested regime. What followed was that the broad basis on which appeal was allowed led to Ofcom defending 10 appeals per year, which really frustrated its ability as a regulator to take timely decisions. It turned out that the appeals against Ofcom made up more than 80% of the workload of the Competition Appeal Tribunal, whose work was supposed to cover a whole gamut of matters. When there was a consultation in the fringes of the DEA, it was decided to restrict appeal to judicial review and appeal on process. I just want to make sure that we are not opening up a huge and unnecessary delaying tactic.

Viscount Camrose (Con)

I thank all those who have spoken, and I very much appreciate the spirit in which the amendments were tabled. They propose changes to the standard of appeal, the standing to appeal and the appeals process itself. The Government are concerned that enabling a review of the full merits of cases, as proposed by Amendments 243 and 245, could prove burdensome for the courts and the regulator, since a full-merits approach, as we have been hearing, has been used by regulated services in other regulatory regimes to delay intervention, undermining the effectiveness of the enforcement process. With deep-pocketed services in scope, allowing for a full-merits review could incentivise speculative appeals, both undermining the integrity of the system and slowing the regulatory process.

While the Government are fully committed to making sure that the regulator is properly held to account, we feel that there is not a compelling case for replacing the decisions of an expert and well-resourced regulator with those of a tribunal. Ofcom will be better placed to undertake the complex analysis, including technical analysis, that informs regulatory decisions.

Amendment 245 would also limit standing and leave to appeal only to providers and those entities determined eligible to make super-complaints under Clause 150. This would significantly narrow the eligibility requirements for appeals. For appeals against Ofcom notices we assess that the broader, well-established standard in civil law of sufficient interest is more appropriate. Super-complaints fulfil a very different function from appeals. Unlike appeals, which will allow regulated services to challenge decisions of the regulator, super-complaints will allow organisations to advocate for users, including vulnerable groups and children, to ensure that systemic issues affecting UK users are brought to Ofcom’s attention. Given the entirely distinct purposes of these functions, it would be inappropriate to impose the eligibility requirements for super-complaints on the appeals system.

I am also concerned about the further proposal in Amendment 245 to allow the tribunal to replace Ofcom’s decision with its own. Currently, the Upper Tribunal is able to dismiss an appeal or quash Ofcom’s decision. Quashed decisions must be remitted to Ofcom for reconsideration, and the tribunal may give directions that it considers appropriate. Amendment 245 proposes instead allowing the Upper Tribunal to

“impose or revoke, or vary the amount of, a penalty … give such directions or take such other steps as OFCOM could itself have given or taken, or … make any other decision which OFCOM could itself have made”.

The concern is that this risks undermining Ofcom’s independence and discretion in applying its powers and issuing sanctions, and challenging the regulator’s credibility and authority. It may also further incentivise well-resourced providers to appeal opportunistically, with a view to securing a more favourable outcome at a tribunal.

On that basis, I fear that the amendments tabled by the noble Lord would compromise the fundamental features of the current appeals provisions, without any significant benefits, and risk introducing a range of inadvertent consequences. We are confident that the Upper Tribunal’s judicial review process, currently set out in the Bill, provides a proportionate, effective means of appeal that avoids unnecessary expense and delays, while ensuring that the regulator’s decisions can be thoroughly scrutinised. It is for these reasons that I hope the noble Baroness will withdraw the amendment.

Baroness Merron (Lab)

My Lords, I am grateful to the Minister. I will take that as a no—but a very well-considered no, for which I thank him. I say to the noble Lord, Lord Clement-Jones, that we certainly would not wish to make him feel uncomfortable at any time. I am grateful to him and the noble Baroness, Lady Kidron, for their contributions. As I said at the outset, this amendment was intended to probe the issue, which I feel we have done. I certainly would not want to open a can of worms—online, judicial or otherwise. Nor would I wish, as the Minister suggested, to undermine the work, efficiency and effectiveness of Ofcom. I am glad to have had the opportunity to present these amendments. I am grateful for the consideration of the Committee and the Minister, and with that I beg leave to withdraw.

Amendment 242 withdrawn.
Clause 148: Appeals against OFCOM decisions relating to the register under section 86
Amendment 243 not moved.
Clause 148 agreed.
Clause 149: Appeals against OFCOM notices
Amendments 244 to 250 not moved.
Clause 149 agreed.
Amendments 250A and 250B not moved.
Clauses 150 to 153 agreed.
Clause 154: Consultation and parliamentary procedure
Amendments 251 to 254 not moved.
Clause 154 agreed.
Clause 155: Directions about advisory committees
Amendments 255 and 256 not moved.
Clause 155 agreed.
Clause 156 agreed.
Clause 157: Secretary of State’s guidance
Amendments 257 to 259 not moved.
Clause 157 agreed.
Amendment 260 not moved.
Clause 158 agreed.
Clause 159: Review
Amendments 261 to 263 not moved.
Clause 159 agreed.
Amendment 264 not moved.
Clause 160: False communications offence
Amendment 264A
Moved by
264A: Clause 160, page 138, line 10, at end insert “including (but not necessarily) by making use of a stolen identity, credit card or national insurance number,”
Member’s explanatory statement
This amendment, together with the amendment to page 138, line 12 to which Lord Clement-Jones has added his name, seeks to probe the creation of a specific criminal offence of identity theft.
Lord Clement-Jones (LD)

My Lords, even by the standards of this Bill, this is a pretty diverse group of amendments. I am leading the line with an amendment that does not necessarily fit with much of the rest of the group, except for Amendment 266, which the noble Baroness, Lady Buscombe, will be speaking to. I look forward to hearing her speak.

This amendment is designed to probe the creation of a new offence of identity theft in Clause 160. As I argued in my evidence to the consultation on digital identity and attributes in 2021, a new offence of identity theft is required. Under the Fraud Act 2006, the Identity Documents Act 2010, the Forgery and Counterfeiting Act 1981, the Computer Misuse Act 1990 and the Data Protection Act 2018 there are currently the offences of fraud using a false identity, document theft, forging an identity, unauthorised computer access and data protection offences respectively, but no specific crime of digital identity theft.

16:15
I was strongly influenced by the experience of the performer Bennett Arron, the award-winning writer and stand-up comedian, who had his digital ID stolen over 20 years ago and, as a result, became penniless and homeless. I think he was the first major victim of identity theft in the UK. Years later, he wrote a comedy show, surprisingly, about his experience. It is a disturbingly true yet funny account of what it is like to have your identity stolen. It was critically acclaimed—I know the Minister will appreciate this—at the Edinburgh festival and led to Bennett being asked to direct a documentary for Channel 4 Television. In the documentary, “How to Steal an Identity”, Bennett proved, through a series of stunts, how easy the crime of ID theft is to carry out. He also managed to steal the identity of the British Home Secretary—I am not sure which one; that needs further investigation. As he says, having something tangible stolen—your phone, a bike or a car—is awful and upsetting. However, they are replaceable, and their loss is unlikely to affect your whole life. Having your identity stolen is different: you will have great difficulty in restoring it, and the consequences can affect you for ever. He went on to describe his experiences.
Interestingly enough, the ICO has published guidance on identity theft. It says:
“Your identity is one of your most valuable assets. If your identity is stolen, you can lose money and may find it difficult to get loans, credit cards or a mortgage. Your name, address and date of birth provide … information”,
and an
“identity thief can use a number of methods to find out your personal information and will then use it to open bank accounts, take out credit cards”,
et cetera, in your name. The guidance goes on to talk about what signs you should look for and what you should do in the event of identity theft. However, effectively, it says that all you can do if some documents are stolen is tell the police; it does not tell you whether the police can do anything about it. All the guidance does is suggest that you report physical documents as having been stolen.
I have asked many questions of the Government, who, in response to the consultation on digital identity, proved stubbornly reluctant to commit to creating a new offence. It is not at all clear why. I am just sorry that the noble Baroness, Lady Morgan, is not here. She chaired the Select Committee and its terrific report Fighting Fraud: Breaking the Chain, which said:
“Identity theft is a fundamental component of fraud and is routinely used by fraudsters to steal money from legitimate individuals and organisations yet it remains out of scope of criminal offences”.
Its recommendation was:
“The Government should consult on the introduction of legislation to create a specific criminal offence of identity theft. Alternatively, the Sentencing Council should consider including identity theft as a serious aggravating factor in cases of fraud”.
In their response, the Government said:
“We agree that identity theft is a vector”—
that is a great word—
“used by fraudsters to commit fraud, but current legislation provides an effective avenue to prosecute those committing identity frauds”.
That is absolutely not the case. I look forward to what the Minister says about that, but I believe that there is a case for including an identity-theft offence both in the Bill and, later, when we come to the Data Protection and Digital Information Bill, where there will be an even stronger case for it to be included.
I will not be able to wind up on this group because I am speaking to this amendment, but I strongly support my noble friend’s Amendments 268AZB and 268AZC. I believe that the Law Commission’s intention was very much to create a high bar for the offence of encouragement of self-harm. It says:
“Our aim is only to criminalise the most serious examples of encouragement of self-harm”.
However, out there, a lot of the support organisations are worried about the broadness of the offence. They are concerned that it risks criminalising peer-support and harm-reduction resources, and that it may also criminalise content posted by people in distress who are sharing their own experiences of self-harm. That is why I support the amendment that my noble friend will speak to. I beg to move.
Baroness Buscombe (Con)

My Lords, I shall speak to Amendments 266 and 267, to which my noble and learned friend Lord Garnier, my noble friend Lord Leicester and the noble Lord, Lord Clement-Jones, have added their names. They are the final two amendments from a group of amendments that were also supported by the noble Lord, Lord Moore of Etchingham, and the noble Baroness, Lady Mallalieu.

The purpose of this Bill is to make the internet a safer place. The new offence of false communications is just one of the important means it seeks to use with the objective of making it an offence to harm people by telling lies online—and this is welcome. It is right that the Bill should focus on preventing harms to individuals. One of the most important guarantors that a person can have of good health and well-being is their freedom to pursue their livelihood unimpeded by illegitimate hostile action. Attacks on people’s livelihoods have the potential to wreak unimaginable harm on their mental and physical health, but these attacks are also among the easiest to perpetrate through the internet. My amendments seek to prevent such harms by protecting people who run, or work for, businesses that have been targeted with malicious fake reviews posted to online platforms, such as Google Maps or TripAdvisor. These platforms already fall within scope of this Bill in hosting user-generated content.

By referencing fake reviews, I am not referring to legitimate criticism, fair comment or even remarks about extraneous matters such as the owners’ pastimes or opinions, provided that the reviewer is honest about the nature of their relationship with the business. If someone wants to write a review of a business which they admit they have never patronised, and criticise it based on such factors, this would not be illegal, but it would very likely breach the platform’s terms of service and be removed. Review platforms are not the proper venue for such discussions; their role is to let people share opinions about a business’s products and services, but policing that is up to them.

The malicious fake reviews that I am referring to are those that are fundamentally dishonest. People with grudges to bear know that the platforms they use to attack their victims will remove any reviews that are clearly based on malice rather than a subjective assessment of quality. That is why they have come to adopt more insidious tactics. Without mentioning the real reason for their hostility towards a business and/or its staff, they purport to be customers who have had bad experiences. Of course, in almost every case, the reviewer has never so much as gone near the business. The review is therefore founded on lies.

This is not merely an abstract concern. Real people are being really harmed. Noble Lords will know that in earlier debates I used the prism of rural communities to amplify the objective of my amendments. Only yesterday, during Oral Questions in your Lordships’ House, there was an overwhelming collective consensus that we need to do more to protect the livelihoods of those working so hard in rural communities. My simple amendments would make a massive difference to their well-being.

The Countryside Alliance recently conducted a survey that found innumerable instances of ideologically motivated fake reviews targeted at rural businesses; these were often carried out by animal rights extremists and targeted businesses and their employees who sometimes participated in activities to which they objected, such as hosting shoots or serving meat. In April this year, the Telegraph reported on one case of a chef running a rural pub whose business was attacked with fake reviews by a vegan extremist who had verifiably never visited the pub, based initially on the man’s objection to him having posted on social media a picture of a roast chicken. The chef said these actions were making him fear for his livelihood as his business fought to recover from the pandemic. He is supporting my amendments.

Amendment 266 would therefore simply add the word “financial” to “physical” and “psychological” in the Bill’s definition of the types of harm that a message would need to cause for it to amount to an offence. This amendment is not an attempt to make the Bill into something it was not designed to be. It is merely an attempt to protect the physical and mental health of workers whose businesses are at risk of attack through malicious fake reviews. It may be that the victim of such an attack could argue that a fake review has caused them physical or psychological harm, as required under the Bill as currently drafted—indeed, it would likely do so. The reason for adding financial harm is to circumvent the need for victims to make that argument to the police, the police to the Crown Prosecution Service and then the prosecutors in front of the jury.

That links to Amendment 267, which would enlarge the definition of parties who may be harmed by a message for it to amount to an offence. Under the Bill, a message must harm its intended, or reasonably foreseeable, recipient; however, it is vital to understand that a person need not receive the message to be harmed by it. In the case of fake reviews, the victim is harmed because the false information has been seen by others; he or she is not an intended recipient. The amendment would therefore include harms to the person or organisation to which the information—or, in reality, disinformation—contained within it relates.

My principal objective in bringing these amendments is not to create a stick with which to beat those who wish harm to others through malicious fake reviews; rather—call me old-fashioned—it is about deterrence. It is to deter this conduct by making it clear that it is not acceptable and would, if necessary, be pursued by police and through the courts under criminal law. It is about seeing to it that malicious fake reviews are not written and their harm is not caused.

I am aware that the Government have responded to constituents who have contacted their MPs in support of these amendments to say that they intend to act through the Competition and Markets Authority against businesses that pay third parties to write fake disparaging reviews of their competitors. I must stress to my noble friend the Minister, with respect, that this response misunderstands the issue. While there is a problem with businesses fraudulently reviewing their competitors to gain commercial advantage—and it is welcome that the Government plan to act on it—I am concerned with extreme activists and other people with ideological or personal axes to grind. These people are not engaged in any relevant business and are not seeking to promote a competitor by comparison. It is hard to see how any action by the Competition and Markets Authority could offer an effective remedy. The CMA exists to regulate businesses, not individual cranks. Further, this is not a matter of consumer law.

If the Government wish to propose some alternative means of addressing this issue besides my amendments, I and those who have added their names—and those who are supporters beyond your Lordships’ House—would be pleased to engage with Ministers between now and Report. In that regard though, I gently urge the Government to start any conversation from a position of understanding—really understanding—what the problem is. I fully appreciate that the purpose of this Bill is to protect individuals, and that is the key point of my amendments. My focus is upon those running and working in small businesses who are easy targets of this form of bullying and abuse. It is entirely in keeping with the spirit and purpose of the Bill to protect them.

Finally, I must be clear that the prism of what happens in our rural areas translates directly to everything urban across the UK. A practical difference is that people working in remote areas are often very isolated and find this intrusion into their life and livelihood so hard to cope with. We live in a pretty unpleasant world that is diminishing our love of life—that is why this Bill is so necessary.

Lord Garnier (Con)

My Lords, I wish to add to what my noble friend Lady Buscombe has just said, but I can do so a little more briefly, not least because she has made all the points that need to be made.

I would disagree with her on only one point, which is that she said—I am not sure that she wanted to be called old-fashioned, but she certainly wanted to have it explained to us—that the purpose of our amendment was to deter people from making malicious posts to the detriment of businesses and so forth. I think it is about more than deterrence, if I may say so. It is about fairness and justice.

It is very natural for a civilised, humane person to want to protect those who cannot protect themselves because of the anonymity of the perpetrator of the act. Over the last nearly 50 years, I have practised at the media Bar, including in cases based on the tort of malicious falsehood, trade libel or slander of goods. Essentially, my noble friend and I are trying to bring into the criminal law the torts that I have advised on and appeared in cases involving, so that the seriousness of the damage caused by the people who do these anonymous things can be visited by the weight of the state as the impartial prosecutor.

16:30
I say to my noble friend on the Front Bench that this is not just a complaint by those who like eating meat, those who earn a living through country pursuits or those who wish to expand their legal practices from the civil sphere to the criminal. It is a plea for the Government and Parliament to reach out and protect those who cannot help themselves.
Now, there will be evidential difficulties in getting hold of anonymous posters of malicious comments and reviews. It may be said that adding to the criminal law, as we would like to do through amending Clause 160, will interfere with people’s Article 10 rights. However, Article 10 does not permit you to make malicious and deliberately false remarks about others. Section 3 of the Defamation Act 2013, which provides for the defence of honest opinion, is not affected by the criminalisation of false and malicious posts about other people’s businesses or services.
We have a very simple remedy here, which goes with the grain of British fair play, the need for justice to be done and a Government who care for the people they govern and look after, and who make sure they do not fall victim unwittingly and unknowingly—unknowingly in the sense that they do not know who is trying to hurt them, but they know what has happened to them because their profits, turnover and ability to feed their families have been grossly affected by these malicious, dishonest people. This amendment needs careful consideration and deserves wholehearted support across the House.
Lord Moylan (Con)

My Lords, as the noble Lord, Lord Clement-Jones, said, this is a very broad group, so I hope noble Lords will forgive me if I do not comment on every amendment in it. However, I have a great deal of sympathy for the case put forward by my noble friend Lady Buscombe and my noble and learned friend Lord Garnier. The addition of the word “financial” to Clause 160 is not only merited on the case made but is a practical and feasible thing to do in a way that the current inclusion of the phrase “non-trivial psychological” is not. After all, a financial loss can be measured and we know how it stands. I will also say that I have a great deal of sympathy with what the noble Lord, Lord Clement-Jones, said about his amendment. In so far as I understand them—I appreciate that they have not yet been spoken to—I am also sympathetic to the amendments in the names of the noble Baroness, Lady Kennedy of The Shaws, and the noble Lord, Lord Allan of Hallam.

I turn to my Amendment 265, which removes the word “psychological” from this clause. We have debated this already, in relation to other amendments, so I am going to be fairly brief about it. Probably through an oversight of mine, this amendment has wandered into the wrong group. I am going to say simply that it is still a very, very good idea and I hope that my noble friend, when he comes to reflect on your Lordships’ Committee as a whole, will take that into account and respond appropriately. Instead, I am going to focus my remarks on the two notices I have given about whether Clauses 160 and 161 should stand part of the Bill; Clause 161 is merely consequential on Clause 160, so the meat is whether Clause 160 should stand part of the Bill.

I was a curious child, and when I was learning the Ten Commandments—I am sorry to see the right reverend Prelate has left because I hoped to impress him with this—I was very curious as to why they were all sins, but some of them were crimes and others were not. I could not quite work out why this was; murder is a crime but lying is not a crime—and I am not sure that at that stage I understood what adultery was. In fact, lying can be a crime, of course, if you undertake deception with intent to defraud, and if you impersonate a policeman, you are lying and committing a crime, as I understand it—there are better-qualified noble Lords than me to comment on that. However, lying in general has never been a crime, until we get to this Bill, because for the first time this Bill makes lying in general—that is, the making of statements you know to be false—a crime. Admittedly, it is a crime dependent on the mode of transmission: it has to be online. It will not be a crime if I simply tell a lie to my noble and learned friend Lord Garnier, for example, but if I do it online, any form of statement which is not true, and I know not to be true, becomes a criminal act. This is really unprecedented and has a potentially chilling effect on free speech. It certainly seems to be right that, in your Lordships’ Committee, the Government should be called to explain what they think they are doing, because this is a very portentous matter.

The Bill states that a person commits the false communications offence if they send a message that they know to be false, if they intend the message to cause a degree of harm of a non-trivial psychological or physical character, and if they have no reasonable excuse for sending the message. Free speech requires that one should be allowed to make false statements, so this needs to be justified. The wording of the offence raises substantial practical issues. How is a court meant to judge what a person knows to be false? How is a committee of the House of Commons meant to judge, uncontroversially, what a person knows to be false at the time they say it? I say again: what is non-trivial psychological harm and what constitutes an excuse? None of these things is actually defined; please do not tell me they are going to be defined by Ofcom—I would not like to hear that. This can lead to astonishing inconsistency in the courts and the misapplication of criminal penalties against people who are expressing views as they might well be entitled to do.

Then there is the question of the audience, because the likely audience is not just the person to whom the false statement is directed but could be anybody who subsequently encounters the message. How on earth is one going to have any control over how that message travels through the byways and highways of the online world and be able to say that one had some sense of who it was going to reach and what non-trivial psychological harm it might cause when it reached them?

We are talking about this as if this criminal matter is going to be dealt with by the courts. What makes this whole clause even more disturbing is that in the vast majority of cases, these offences will never reach the courts, because there is going to be, inevitably, an interaction with the illegal content duties in the Bill. By definition, these statements will be illegal content, and the platforms have obligations under the Bill to remove and take down illegal content when they become aware of it. So, the platform is going to have to make some sort of decision about not only the truth of the statement but whether the person knows that the statement is false and what their intention is. Under the existing definition of illegal content, they will be required to remove anything they reasonably believe is likely to be false and to prevent it spreading further, because the consequences of it, in terms of the harm it might do, are incalculable by them at that point.

We are placing a huge power of censorship—and mandating it—on to the platforms, which is one of the things that some of us in this Committee have been very keen to resist. Just exploring those few points, I think my noble friend really has to explain what he thinks this clause is doing, how it is operable and what its consequences are going to be for free speech and censorship. As it stands, it seems to me unworkable and dangerous.

Lord Garnier (Con)

Does my noble friend agree with me that our courts are constantly looking into the state of mind of individuals to see whether they are lying? They look at what they have said, what they have done and what they know. They can draw an inference based on the evidence in front of them about whether the person is dishonest. This is the daily bread and butter of court. I appreciate the points he is making but, if I may say so, he needs to dial back slightly his apoplexy. Underlying this is a case to be made in justice to protect the innocent.

Lord Moylan (Con)

I did not say that it would be impossible for a court to do this; I said it was likely to lead to high levels of inconsistency. We are dealing with what is likely to be very specialist cases. You can imagine this in the context of people feeling non-trivially psychologically harmed by statements about gender, climate, veganism, and so forth. These are the things where you see this happening. The idea that there is going to be consistency across the courts in dealing with these issues is, I think, very unlikely. It will indeed have a chilling effect on people being able to express views that may be controversial but are still valid in an open society.

Lord Allan of Hallam (LD)

My Lords, I want to reflect on the comments that the noble Lord, Lord Moylan, has just put to us. I also have two amendments in the group; they are amendments to the government amendment, and I am looking to the Minister to indicate whether it is helpful for me to explain the rationale of my amendments now or to wait until he has introduced his. I will do them collectively.

First, the point the noble Lord, Lord Moylan, raised is really important. We have reached the end of our consideration of the Bill; we have spent a lot of time on a lot of different issues, but we have not spent very much time on these new criminal offences, and there may be other Members of your Lordships’ House who were also present when we discussed the Communications Act back in 2003, when I was a Member at the other end. At that point, we approved something called Section 127, which we were told was essentially a rollover of the dirty phone call legislation we had had previously, which had been in telecoms legislation for ever to prevent that deep-breathing phone call thing.

16:45
It went through almost on the nod, and then it turned out to become a very significant offence later. Noble Lords may be aware of the Twitter joke trial, the standout trial that people have followed, where an individual was prosecuted for saying something on Twitter which was originally taken very seriously as a bomb threat. Later, on appeal, the prosecution was found to be invalid. The debate around Section 127 and its usage has gone backwards and forwards.
There is a genuine question to be asked here about whether we will be coming back in a few years and finding that this offence has been used in ways that we were not expecting or intending. I think we all know what we are trying to get at: the person who very deliberately uses falsehoods to cause serious harm to others. That has been described by the noble Baroness, Lady Buscombe, and the noble and learned Lord, Lord Garnier. However, you can see how the offence could inadvertently capture a whole load of other things where we would either collectively agree that they should not be prosecuted or have quite different opinions about whether they should be prosecuted.
I used to sit in judgment on content at a platform, and people would often say to us, “Why are you allowing that content? It’s false”. We would say, quite rightly, that we had no terms of service that say you cannot lie or issue falsehoods on the platform. They would say, “Why not?”, and we would dig into it and say, “Let’s just deal with some falsehoods”. For example, “The earth is flat”—I think most of us would agree that that is false, but it is entirely harmless. We do not care; it is a lie we do not care about. What about, “My God is the only true God”? Well, that is an opinion; it is not a statement of fact—but we are getting into a zone that is more contested. As for, “Donald Trump won the election”, that is an absolute and outright lie to one group of people but a fundamental issue of political expression for another. You very quickly run into these hotly contested areas. This is going to be a real challenge.
We have often talked in this debate about how we are handing Ofcom a really difficult job across all the measures in the Bill. In this case, I think we are handing prosecutors a really difficult job of having to determine when they should or should not use the new offence we are giving them. I think it will be contested, and we may well want to come back later and look at whether it is being used appropriately. I am sure the noble Baroness, Lady Fox, will have something to say about this.
If we do anything, we should learn from previous experiences. I think everyone would agree that Section 127 of the Communications Act 2003, which has been in place for 20 years, has not been an easy ride. We have moved our view on that around considerably. One good thing is that these new offences replace it—we are replacing this thing we rushed through then with a different version—but we need to test very carefully whether we have got it right.
Moving away from that offence to the new offence of the encouragement of self-harm, which the Minister will introduce shortly, I have two amendments which are quite different from each other, and I want to explain each of them. The first, Amendment 268AZC, seeks to test the threshold for prosecutions under this offence—so, again, it is about when we should or should not prosecute. It follows concerns raised by my noble friend Lord Clement-Jones. Legitimate concerns have been raised by a coalition of about 130 different individuals and organisations supporting survivors of self-harm. The question, really, is what the Government’s intention is. I hope the Minister can put something on the record which will be helpful later in terms of the Government’s intention for where the threshold should lie.
We can see clear instances where somebody maliciously and deliberately encourages self-harm. There are other issues around the way in which systems encourage self-harm, which I think are tackled in other parts of the Bill, but here we are talking about an individual carrying out an action which may be prosecuted. We can see those people, but there is then a spectrum: people who support victims of self-harm, people who provide educational and support materials—some of which might include quite graphic descriptions—people who run online fora, and indeed the individuals themselves who are posting self-harm content and documenting what they have done to themselves.
What I am looking for, and I think other noble Lords are looking for, is an assurance that it is not our intention to capture those people within this new offence. When the Minister outlines the offence, I would appreciate an assurance about why those fears will not be realised. We have suggested in the amendment to have a bar involving the Director of Public Prosecutions. I am sure the Minister will explain why that is not the right gating mechanism—but if not that, what is going to ensure that people in distress are not brought into the scope of this offence?
It would be very helpful to know what discussions the Minister is having with the Ministry of Justice, particularly about working with relevant organisations on the detail of what is being shared. Again, we can talk about it in the abstract; I found that specific cases can be helpful. I hope that there are people, either in his department or in the Ministry of Justice, who are talking to organisations, looking at the fora and at the kind of content that people in distress post, applying themselves and saying, “Yes, we are confident that we will not end up prosecuting that individual or that organisation”, or, if it is likely that they will be prosecuted, “We need to have a longer discussion about that if that is not the intention”.
Moving backwards through the letters, Amendment 268AZB takes us back to a question raised earlier today by the noble Baroness, Lady Kidron, but also on the first day in Committee, way back when—I am starting to feel nostalgic. She proposed an amendment to broaden the scope of this regulation to all online services, whether or not they are in the regulated user-to-user and search bucket for children’s protection purposes. I argued against that. I continue to view this legislation as appropriately targeted, and I worry about broadening the scope—but here I have a lot of sympathy. The noble Baroness raised a specific scenario, which again is a real one, of an individual outside the UK jurisdiction—let us imagine that they are in the US, because they will have first amendment protections. They run a blog, so it is not user to user, which is targeting people in the UK and saying, “You should harm yourself. You should commit suicide”. That individual is doing nothing wrong in their terms. They are now, under the terms of the Bill, committing a criminal offence in the UK, but there is virtually no prospect that they will ever be prosecuted unless they come to the UK. The amendment is seeking to tease out what happens in those scenarios.
I do not expect the Minister to accept my amendment as drafted—I thought it was cheeky even as I was drafting it. It says that for the purposes that we want, we will apply a whole bunch of the Online Safety Bill measures—the disruption, the blocking, the business disruption measures—to websites that promote this kind of content, even though they are not otherwise regulated. I am sure that the Minister has very good legal arguments in his notes as to why that would not work, but I hope he will tell us what else the Government can do. I do not think that people will find it acceptable if we go to all the trouble of passing this legislation but there is a category of online activity—which we know is there; it is real—that we can do nothing about. They are breaking our criminal law; we have taken all this time to construct a new form of criminal law, and yet they are sitting there beyond our reach, ignoring it, and we can do nothing. I hope the Minister can offer some suggestions as to what we might do.
I see that the noble Baroness, Lady Finlay of Llandaff, is in her place; she asked me to raise questions around the extent to which we have been using the powers we have today, but I will not as I think she will do so herself.
The core question that motivates the second amendment is that of what the Government think we might do if there are individuals who are not user-to-user or search services and are therefore outside the regulated bucket, who persistently and deliberately breach this new offence of encouragement to self-harm and yet are outside the UK jurisdiction. I hope the Minister agrees that something should be done about that scenario, and I look forward to hearing his suggestion about what may be done.
Baroness Kennedy of The Shaws (Lab)

My Lords, I have two amendments in this grouping. I am afraid that I did not have time to get others to put their names to them, but I hope that they will find some support in this Committee.

For almost the whole of 2021, I chaired an inquiry in Scotland into misogyny. It was about the fact that many complaints were being made to the devolved Government in Scotland about women’s experiences not just of online harassment but of the way in which the disinhibition that the internet and social media have given people to be abusive online is now also visiting the public square. Many people described the ways in which they are publicly harassed. I know that concerns people in this House too.

When I came to the Bill, I was concerned about something that became part of the evidence we heard. It is no different down here from in Scotland. As we know, many women—I say women, but men receive harassment online too—are sent really vicious, vile things. We all know of parliamentarians and journalists who have received them, their lives made a misery by threats to rape and kill and people saying, “Go and kill yourself”. There are also threats of disfigurement—“Somebody should take that smile off your face”—and suggestions that an acid attack be carried out on someone.

In hearing that evidence, it was interesting that some of the forms of threat are not direct in the way that criminal law normally works; they are indirect. They are not saying, “I’m going to come and rape you”. Sometimes they say that, but a lot of the time they say, “Somebody should rape you”; “You should be raped”; “You deserve to be raped”; “You should be dead”; “Somebody should take you out”; “You should be disfigured”; “Somebody should take that smile off your face, and a bit of acid will do it”. They are not saying, “I’m going to come and do it”, in which case the police go round and, if the person is identifiable, make an arrest—as happened with Joanna Cherry, the Scottish MP, for example, who had a direct threat of rape, and the person was ultimately charged under the Communications Act.

Our review of the kinds of threat taking place showed that it was increasingly this indirect form of threat, which has a hugely chilling effect on women. It creates fear and life changes, because women think that some follower of this person might come and do what is suggested and throw acid at them as they are coming out of their house, and they start rearranging their lives because of it—because they live in constant anxiety about it. It was shocking to hear the extent to which this is going on.

In the course of the past year, we have all become much more familiar with Andrew Tate. What happens with these things is that, because of the nature of social media and people having big followings, you get the pile-on: an expression with which I was not that familiar in the past but now understand only too well. The pile-on is where, algorithmically, many different commentaries are brought together and suddenly the recipient receives not just one, or five, but thousands of negative and nasty threats and comments. Of course, as a public person in Parliament, or a councillor, you are expected to open up your social media, because that is how people will get in touch with you or comment on the things you are doing, but then you receive thousands of these things. This affects journalists, Members of Parliament, councillors and the leaders of campaigns. For example, it was interesting to hear that people involved in the Covid matters received threats. It affects both men and women, but the sexual nature of the threats to women is horrifying.

The Andrew Tate case is interesting because only yesterday I saw in the newspapers that part of the charging in Romania relates to the way in which, because of his enormous following and his encouragement of violence towards women, he is being charged, among many other things directly concerning violence towards and the rape of women, with inciting these behaviours in many of his young male followers. In the report of the inquiry that I conducted, there are a number of recommendations around offences of that sort.

To specifically deal with this business of online threats, my amendments seek to address their indirect nature—not the ones that say, “I’m going to do it”, but the encouragement to others to do it or to create the fear that it will happen—and to look at how the criminal law addresses that.

17:00
I started out with the much lesser suggestion of inserting that the threat be either by the person who is sending the message or by another individual, but the more I reflected on it, the more it seemed to me that that did not go far enough to deal with what one was seeking to address here. This is why I have an alternative, Amendment 267AB.
Noble Lords will see that it says:
“A person commits an offence if they issue a communication concerning death”.
I have written “concerning death” rather than “a threat to kill” because the former can include someone saying, “You should kill yourself”, “You should commit suicide” or “You should harm yourself”. The amendment also refers to “assault (sexual or otherwise)”, which could include all manner of sexual matters but also self-harming and “disfigurement”. I was shocked at the extent to which disfigurement is suggested in these kinds of abusive texts to all manner of people—even campaigners. Even a woman who campaigns on air pollution after the death of her child from a fatal asthmatic attack receives the most horrible threats and abuse online. When people do this, they know the impact that it is going to have. They do it, as my amendment says,
“knowing it will cause alarm or distress to a specific person or specific people”,
rather than making generalised threats to the world.
As the Minister will know, I wrote to him wondering whether his team might put their great legal minds to this because we have to find a way of addressing the fact that people are encouraging others to make threats. We have to look at the effect that this has on the recipients, who are often women in public life in one way or another; the way in which it affects our polity and women’s participation, not just in public life but in politics and civil society generally; and the way in which it deters women from living their lives freely and equally with menfolk.
I hope that the Committee will think on that and that the Minister can come back to me with some positive things, even if he does not accept the particular formulation that I sought to devise. It may be that a different formulation could be sought, perhaps to include that it is done “recklessly”. I am prepared to consider its impact on people, but I think that it is done with knowledge of the impact that it will have, and where it is foreseeable that there will be an impact.
I urge the Committee to consider these matters. As I just heard one of my colleagues in the Committee suggest, this is a moment to seize. You can be sure that we cannot encapsulate everything, but we should be trying to cover as much as possible of the horrors of what is now happening on social media.
Baroness Finlay of Llandaff (CB)

My Lords, I will address my remarks to government Amendment 268AZA and its consequential amendments. I rather hope that we will get some reassurance from the Minister on these amendments, about which I wrote to him just before the debate. I hope that that was helpful; it was meant to be constructive. I also had a helpful discussion with the noble Lord, Lord Allan.

As has already been said, the real question relates to the threshold and the point at which this measure will clock in. I am glad that the Government have recognised the importance of the dangers of encouraging or assisting serious self-harm. I am also grateful for the way in which they have defined it in the amendment, relating it to grievous bodily harm and severe injury. The amendment says that this also

“includes successive acts of self-harm which cumulatively reach that threshold”.

That is important; it means, rather than just one act, a series of them.

However, I have a question about subsection (10), which states that:

“A provider of an internet service by means of which a communication is sent, transmitted or published is not to be regarded as a person who sends, transmits or publishes it”.


We know from bereaved parents that algorithms have been set up which relay this ghastly, horrible and inciteful material that encourages and instructs. That is completely different from those organisations that are trying to provide support.

I am grateful to Samaritans for all its help with my Private Member’s Bill, and for the briefing that it provided in relation to this amendment. As it points out, over 5,500 people in England and Wales took their own lives in 2021 and self-harm is

“a strong risk factor for future suicide”.

Interestingly, two-thirds of those taking part in a Samaritans research project said that

“online forums and advice were helpful to them”.

It is important that there is clarity around providing support and not encouraging and goading people into activity which makes their self-harming worse and drags them down to eventually ending their own lives. Three-quarters of people who took part in that Samaritans research said that they had

“harmed themselves more severely after viewing self-harm content online”.

It is difficult to know exactly where this offence sits and whether it is sufficiently narrowly drawn.

I am grateful to the Minister for arranging for me to meet the Bill team to discuss this amendment. When I asked how it was going to work, I was somewhat concerned because, as far as I understand it, the mechanism is based on the Suicide Act, as amended, which talks about the offence of encouraging or assisting suicide. The problem as I see it is that, as far as I am aware, there has not been a string of prosecutions, despite the suicides of many young people. We have met their families and they have been absolutely clear about how their dead child or sibling—whether a child or a young adult—was goaded, pushed and prompted. I recently had outside experience of a similar situation, which fortunately did not result in a death.

The noble Lord, Lord Allan, has already addressed some of the issues around this, and I would not want the amendment not to be there because we must address this problem. However, if we are to have an offence here, with a threshold that the Government have tried to define, we must understand why, if assisting and encouraging suicide on the internet is already a criminal offence, nothing has happened and there have been no prosecutions.

Why is subsection (10) in there? It seems to negate the whole problem of forwarding on through dangerous algorithms content which is harmful. We know that a lot of the people who are mounting this are not in the UK, and therefore will be difficult to catch. It is the onward forwarding through algorithms that increases the volume of messaging to the vulnerable person and drives them further into the downward spiral that they find themselves in—which is perhaps why they originally went to the internet.

I look forward to hearing the Government’s response, and to hearing how this will work.

Baroness Fox of Buckley (Non-Afl)

My Lords, this group relates to communications offences. I will speak in support of Amendment 265, tabled by the noble Lord, Lord Moylan, and in support of his opposition to Clause 160 standing part of the Bill. I also have concerns about Amendments 267AA and 267AB, in the name of the noble Baroness, Lady Kennedy. Having heard her explanation, perhaps she can come back and give clarification regarding some of my concerns.

On Clause 160 and the false communications offence, unlike the noble Lord, Lord Moylan, I want to focus on psychological harm and the challenge this poses for freedom of expression. I know we have debated it before but, in the context of the criminal law, it matters in a different way. It is worth us dwelling on at least some aspects of this.

The offence refers to what is described as causing

“non-trivial psychological or physical harm to a likely audience”.

As I understand it—maybe I want some clarity here—it is not necessary for the person sending the message to have intended to cause harm, yet there is a maximum sentence of 51 weeks in prison, a fine, or both. We need to have the context of a huge cultural shift when we consider the nature of the harm we are talking about.

J.S. Mill’s harm principle has now been expanded, as previously discussed, to include traumatic harm caused by words. Speakers are regularly no-platformed for ideas that we are told cause psychological harm, at universities and more broadly as part of the whole cancel culture discussion. Over the last decade, harm and safety have come to refer no longer just to physical safety; the two have been conflated. Historically, we understood physical threats and violence as distinct from speech, however aggressive or incendiary that speech was; we did not say that speech was the same as or interchangeable with bullets or knives or violence—and now we do. I want us to at least pause here.

What counts as psychological harm is not a settled question. The worry is that we have an inability to ascertain objectively what psychological harm has occurred. This will inevitably lead to endless interpretation controversies and/or subjective claims-making, at least some of which could be in bad faith. There is no median with respect to how humans view or experience controversial content. There are wildly divergent sensibilities about what is psychologically harmful. The social media lawyer Graham Smith made a really good point when he said that speech is not a physical risk,

“a tripping hazard … a projecting nail … that will foreseeably cause injury … Speech is nuanced, subjectively perceived and capable of being reacted to in as many different ways as there are people.”

That is true.

We have seen an example of the potential disputes over what creates psychological harm in a case in the public realm over the past week. The former Culture Secretary, Nadine Dorries, who indeed oversaw much of this Bill in the other place, had her bullying claims against the SNP’s John Nicolson MP overturned by the standards watchdog. Her complaints had previously been upheld by the standards commissioner. John Nicolson tweeted, liked and retweeted offensive and disparaging material about Ms Dorries 168 times over 24 hours—which, as they say, is a bit OTT. He “liked” tweets describing Ms Dorries as grotesque, a “vacuous goon” and much worse. It was no doubt very unpleasant for her and certainly a personalised pile-on—the kind of thing the noble Baroness, Lady Kennedy, just talked about—and Ms Dorries would say it was psychologically harmful. But new evidence led to the bullying claim being turned down. What was this evidence? Ms Dorries herself was a frequent and aggressive tweeter. So somebody receives something they say causes them psychological harm, and it is now said that this does not matter because they are the kind of person who causes psychological harm to other people. My concern about turning this into a criminal offence is that the courts will be full of those kinds of arguments, which I do not think we want.

17:15
One problem I have is that the Bill does not give any explicit protection in the public interest or for the purposes of debate around the use of such harm, even if, as has been indicated, false allegations are being made. People use hyperbole and exaggeration in political argument. Many of the big political questions of the day are not agreed on, and people accuse each other of lying all the time when it comes to anything from Brexit to gender. I am worried that inadvertently—I do not think anyone is trying to do this—Clause 160 will institutionalise and bake into primary legislation the core of cancel culture and lead to a more toxic climate than any of us would want.
I have some reservations about Amendment 267AB. I recognise and am full of admiration for the intention of the noble Baroness, Lady Kennedy, but the wording seems to make it an offence to issue
“a communication concerning death, rape, assault (sexual or otherwise) or disfigurement, knowing it will cause alarm or distress”.
I query how we would prove that someone knows it will cause offence, an issue that was slightly danced around. Also, a lot of content such as news websites, podcasts and various other communications could be said purposefully to cause alarm, offence or distress, perhaps because the intention is to shock people into realising what a war or a famine is like, or into understanding the dangers of groomers or suicide sites—the kind of things we have been discussing. During lockdown, the nudge unit explicitly issued communications about potential death that caused a great deal of alarm and distress. It had a public interest defence, which was that it was important that people were frightened into complying with the rules of lockdown, whatever one thinks of them. I do not see how that will not be caught up in this.
Amendment 267AA extends the offence to include encouraging someone else to commit harm. I understand that this is an attempt to deal with indirect misogynistic abuse that is not quite incitement but on the other hand seems to be A encouraging B—an indirect threat. I worry that if A encourages another person and that person does something as a consequence, it will lead to a “he told me to do it” defence and an abdication of responsibility. I have that qualm about it. The Member’s explanatory statement makes things even more difficult, using the phrase,
“if an individual sends a message which potentially encourages other individuals to carry out a harmful act”.
That is going to be wide open to abuse by all sorts of bad-faith complainants.
I say all this as someone who is regularly piled on. I noticed when I started to speak that the noble Baroness, Lady Stuart, is in her place. I remember my shock and horror when I saw the abuse she got some seven years ago and subsequently—really vicious, vile, horrible abuse for her political stance. Such abuse often takes a very sexualised form if you are a woman. So I can say from my lived experience that I know what it feels like to be on the receiving end of vile, horrible, misogynistic pile-ons, and so do Joanna Cherry, Rosie Duffield and a lot of people involved in contentious political issues. We need to make politics more civil by having the arguments and the debates and not mischaracterising, delegitimising or demonising people. I am just not sure that a criminal intervention here is going to help. I think it might make matters worse.
Baroness Bennett of Manor Castle (GP)

My Lords, as the noble Lord, Lord Clement-Jones, said in introducing this group some time ago, it is very diverse. I shall comment on two aspects of the amendments in this group. I entirely associate myself with the remarks of the noble Lord, Lord Allan, who really nailed the problems with Amendment 266, and I very much support the amendments in the name of the noble Baroness, Lady Kennedy of The Shaws; I would have signed them if I had caught up with them.

The noble Baroness, Lady Fox, talked about causing alarm and distress. I can draw on my own experience here, thinking about when someone randomly starts to post you pictures of crossbows. I think about what used to happen when I was a journalist in Bangkok, when various people used to get hand grenades posted into their letterbox. That was not actively dangerous—the pin was not pulled; it was still held down—but it was clearly a threat, and the same thing happens on social media.

This is something of which I have long experience. In 2005, when I was the founder of the feminist blog Carnival of Feminists, I saw the kinds of messages that the noble Baronesses have referred to, which in the days before social media used to be posted as comments on people’s blogs. You can still find the blog out there—it ran from 2005 to 2009—but many of its links to other blogs will be dead because they were often run by young women, often young women of colour, who were driven to pull down their blogs and sometimes were driven off the internet entirely by threatening, fearsome messages of the type that the noble Baroness, Lady Kennedy, referred to. We can argue about the drafting here—I will not have any opinion on that in detail—but something that addresses that issue is really important.

Secondly, we have not yet heard the Government’s introduction to Amendment 268AZA, but the noble Lord, Lord Clement-Jones, provided us with the information that it is an amendment to create the offence of encouraging or assisting self-harm. I express support for the general tenor of that, but I want to make one specific point: so far as I can see, the amendment does not have any defence or carve-out for harm-reduction messages, which may be necessary.

To set the context here, figures from the Royal College of Psychiatrists say that about one in 10 young people self-harm at some stage in their youth, and the RCP says those figures are probably an underestimate because they are based only on the cases that medical professionals actually see, so the true number is probably significantly higher. An article in the Journal of Psychiatric and Mental Health Nursing from 2018 entitled “Self-cutting and harm reduction” is focused on in-patient settings, but the arguments in it are important in setting the general tone. It says that

“harm reduction in all its guises starts from the premise that the end goal”—

that is, to end self-harm entirely—

“is neither necessarily nor inevitably abstinence”,

which cannot be the solution for some people. Rather,

“the extinction of some particular form of behaviour may not be realistic for, or even desired by, the individual”.

So you may find messages that say, “If you are going to cut yourself, use a clean blade. If you do cut yourself, look after the wound afterwards”, but there is a risk that those kinds of well-intentioned, well-meaning and indeed expert messages could be caught by the amendment. I googled self-harm and harm reduction, and the websites that came up included Self Injury Support, which provides expert advice; a number of mental health trusts and healthcare trusts; and, indeed, the royal college’s own website.

The noble Lord, Lord Allan of Hallam, was trying to address this issue with Amendment 268AZC, which would allow the DPP to authorise prosecutions, but it seems to me that a better approach would be to have in the government amendment a statement saying, “We acknowledge that there will be cases where people talk about self-harm in ways that seek to minimise harm rather than simply stopping it, and they are not meant to be caught by this amendment”.

Baroness Merron (Lab)

My Lords, as the noble Baroness, Lady Bennett, said, it seems a very long time since we heard the introduction from the noble Lord, Lord Clement-Jones, but it was useful in setting this helpful and well-informed debate on its way. I am sure the whole Committee is keen to hear the Minister introducing the government amendments, even at this very late stage in the debate.

I would like to make reference to a few points. I was completely captivated by the noble Lord, Lord Moylan, who invoked the 10 commandments. I say to him that one can go to no higher order, which I am sure will support the amendments that he and his colleagues have put forward.

I will refer first to the amendments tabled by my noble friend Lady Kennedy. At a minimum, they are interesting because they try to broaden the scope of the current offences. I believe they also try to anticipate the extent of the impact of the government amendments, which in my view would be improved by my noble friend’s amendments. As my noble friend said, so many of the threats that are experienced online by, and directed towards, women and girls are indirect. They are about encouraging others: saying “Somebody should do something terrible to you” is extremely common. I feel that here is an opportunity to address that in the Bill, and if we do not, we will have missed a major aspect. I hope that the Minister will take account of that and be positive. We can all be relaxed about whether the amendments need to be made, but the intent is there.

That part of the debate made a strong case to build on the debate we had on an earlier day in Committee about violence against women and girls, which was led by the noble Baroness, Lady Morgan, and supported by noble Baronesses and noble Lords from all sides of the House. We called upon the Minister then to ensure that the Bill explicitly includes the necessary amendments to make it refer to violence against women and girls because, for all the reasons that my noble friend Lady Kennedy has explained, it is considerably greater for them than for others. Without wishing to dismiss the fact that everybody receives levels of abuse, we have to be realistic here: I believe that my noble friend’s amendments are extremely helpful there.

This is a bit in anticipation of what the Minister will say—I am sure he will forgive me if he already has the answers. The noble Lords, Lord Clement-Jones and Lord Allan, referred particularly to the coalition of some 130 individuals and organisations which have expressed their concerns. I want to highlight those concerns as well, because they speak to some important points. The groups in that coalition include the largest self-harm charity, Self Injury Support, along with numerous smaller self-harm support organisations and, of course, the mental health charity Mind. Their voice is therefore considerable.

To emphasise what has already been outlined, the concern with the current amendments is that they are somewhat broad and equivalent to an offence of glamorising self-harm, which was rejected by the Law Commission in its consultation on the offence. That followed concern from the Magistrates’ Association and the Association of Police and Crime Commissioners that the offence would be ambiguous in application and complex to prosecute. It also risks criminalising people in distress, something that none of us want to see.

In addition, the broadness of the offence risks criminalising peer support and harm reduction resources, by defining them as capable of “encouraging or assisting” when they are in fact intended to help people who self-harm. This was raised by the noble Baroness, Lady Finlay, today and in respect of her Private Member’s Bill, which we debated very recently in this Chamber, and I am sure that it would not be the Minister’s intention.

I would like to emphasise another point that has been made. The offence may also criminalise content posted by people who are in distress and sharing their own experiences of self-harm—the noble Baroness, Lady Finlay, referred to this—by, for example, posting pictures of wounds. We do not want to subject vulnerable people to litigation, so let us not have an offence which ends up harming the very people it aims to protect. I shall be listening closely to the Minister.

17:30
There are a number of mitigations which would help, such as the introduction of defences excluding peer support and harm reduction resources, as well as content which has been posted with a view to reducing one’s own serious self-harm. In addition, there could be a mitigation, which I hope we will see, requiring consent from the DPP for any prosecution to occur. That was also suggested by the Law Commission and was picked up by the noble Lord, Lord Allan. There is also the potential for mitigation by including a requirement of malicious intent. This would ensure that offences apply only to instances of trolling or bullying.
I will leave the Minister with a few questions. It would be helpful to hear what consultation there has been with self-harm specific organisations and how the government amendments differ from the broader “glamorisation” offence, which was rejected by the Law Commission. It would also be helpful to hear examples of content that are intended to be criminalised by the offence. That would be of interest to your Lordships’ Committee and the coalition of very key organisations and individuals who are keen, as we all are, to see this Bill end up in the right form and place. I look forward to hearing from the Minister.
Lord Parkinson of Whitley Bay (Con)

My Lords, this has been a broad and mixed group of amendments. I will be moving the amendments in my name, which are part of it. These introduce the new offence of encouraging or assisting serious self-harm and make technical changes to the communications offences. If there can be a statement covering the group and the debate we have had, which I agree has been well informed and useful, it is that this Bill will modernise criminal law for communications online and offline. The new offences will criminalise the most damaging communications while protecting freedom of expression.

Amendments 264A, 266 and 267, tabled by the noble Lord, Lord Clement-Jones, and my noble friend Lady Buscombe, would expand the scope of the false communications offence to add identity theft and financial harm to third parties. I am very grateful to them for raising these issues, and in particular to my noble friend Lady Buscombe for raising the importance of financial harm from fake reviews. This will be addressed through the Digital Markets, Competition and Consumers Bill, which was recently introduced to Parliament. That Bill proposes new powers to address fake and misleading reviews. This will provide greater legal clarity to businesses and consumers. Where fake reviews are posted, it will allow the regulator to take action quickly. The noble Baroness is right to point out the specific scenarios about which she has concern. I hope she will look at that Bill and return to this issue in that context if she feels it does not address her points to her satisfaction.

Identity theft is dealt with by the Fraud Act 2006, which captures those using false identities for their own benefit. It also covers people selling or using stolen personal information, such as banking information and national insurance numbers. Adding identity theft to the communications offences here would duplicate existing law and expand the scope of the offences too broadly. Identity theft, as the noble Lord, Lord Clement-Jones, noted, is better covered by targeted offences rather than communications offences designed to protect victims from psychological and physical harm. The Fraud Act is more targeted and therefore more appropriate for tackling these issues. If we were to add identity theft to Clause 160, we would risk creating confusion for the courts when interpreting the law in these areas—so I hope the noble Lord will be inclined to side with clarity and simplicity.

Amendment 265, tabled by my noble friend Lord Moylan, gives me a second chance to consider his concerns about Clause 160. The Government believe that the clause is necessary and that the threshold of harm strikes the right balance, robustly protecting victims of false communications while maintaining people’s freedom of expression. Removing “psychological” harm from Clause 160 would make the offence too narrow and risk excluding communications that can have a lasting and serious effect on people’s mental well-being.

But psychological harm is only one aspect of Clause 160; all elements of the offence must be met. This includes a person sending a knowingly false message with an intention to cause non-trivial harm, and without reasonable excuse. It has also been tested extensively as part of the Law Commission’s report Modernising Communications Offences, when determining what the threshold of harm should be for this offence. It thus sets a high bar for prosecution, whereby a person cannot be prosecuted solely on the basis of a message causing psychological harm.

The noble Lord, Lord Allan, rightly recalled Section 127 of the Communications Act and the importance of probing issues such as this. I am glad he mentioned the Twitter joke trial—a good friend of mine acted as junior counsel in that case, so I remember it well. I shall spare the blushes of the noble Baroness, Lady Merron, in recalling who the Director of Public Prosecutions was at the time. But it is important that we look at these issues, and I am happy to speak further with my noble friend Lord Moylan and the noble Baroness, Lady Fox, about this and their broader concerns about freedom of expression between now and Report, if they would welcome that.

My noble friend Lord Moylan said that it would be unusual, or novel, to criminalise lying. The offence of fraud by false representation already makes it an offence dishonestly to make a false representation—to breach the ninth commandment—with the intention of making a gain or causing someone else a loss. So, as my noble and learned friend Lord Garnier pointed out, there is a precedent for lies with malicious and harmful intent being criminalised.

Amendments 267AA, 267AB and 268, tabled by my noble friend Lady Buscombe and the noble Baroness, Lady Kennedy of The Shaws, take the opposite approach to those I have just discussed, as they significantly lower and expand the threshold of harm in the false and threatening communications offences. The first of these would specify that a threatening communications offence is committed even if someone encountering the message did not fear that the sender specifically would carry out the threat. I am grateful to the noble Baroness for her correspondence on this issue, informed by her work in Scotland. The test here is not whether a message makes a direct threat but whether it conveys a threat—which can certainly cover indirect or implied threats.

I reassure the noble Baroness and other noble Lords that Clause 162 already captures threats of “death or serious harm”, including rape and disfigurement, as well as messages that convey a threat of serious harm, including rape and death threats, or threats of serious injury amounting to grievous bodily harm. If a sender has the relevant intention or recklessness, the message will meet the required threshold. But I was grateful to see my right honourable friend Edward Argar watching our debates earlier, in his capacity as Justice Minister. I mentioned the matter to him and will ensure that his officials have the opportunity to speak to officials in Scotland to look at the work being done with regard to Scots law, and to follow the points that the noble Baroness, Lady Bennett, made about pictures—

Baroness Kennedy of The Shaws (Lab)

I am grateful to the Minister. I was not imagining that the formulations that I played with fulfilled all of the requirements. Of course, as a practising lawyer, I am anxious that we do not diminish standards. I thank the noble Baroness, Lady Fox, for raising concerns about freedom of speech, but this is not about telling people that they are unattractive or ugly, which is hurtful enough to many women and can have very deleterious effects on their self-confidence and willingness to be public figures. Actually, I put the bar reasonably high in describing the acts that I was talking about: threats that somebody would kill, rape, bugger or disfigure you, or do whatever to you. That was the shocking thing: the evidence showed that it was often at that high level. It is happening not just to well-known public figures, who can become somewhat inured to this because they can find a way to deal with it; it is happening to schoolgirls and young women in universities, who get these pile-ons as well. We should reckon with the fact that it is happening on a much wider basis than many people understand.

Lord Parkinson of Whitley Bay (Con)

Yes, we will ensure that, in looking at this in the context of Scots law, we have the opportunity to see what is being done there and that we are satisfied that all the scenarios are covered. In relation to the noble Baroness’s Amendment 268, the intentional encouragement or assistance of a criminal offence is already captured under Sections 44 to 46 of the Serious Crime Act 2007, so I hope that that satisfies her that that element is covered—but we will certainly look at all of this.

I turn to government Amendment 268AZA, which introduces the new serious self-harm offence, and Amendments 268AZB and 268AZC, tabled by the noble Lords, Lord Allan and Lord Clement-Jones. The Government recognise that there is a gap in the law in relation to the encouragement of non-fatal self-harm. The new offence will apply to anyone carrying out an act that is intended to, and is capable of, encouraging or assisting another person seriously to self-harm by means of verbal or electronic communications, publications or correspondence.

I say to the noble Baroness, Lady Finlay of Llandaff, that the new clause inserted by Amendment 268AZA is clear that, just as it is an offence when a person sends or publishes such a communication, it is also an offence when a person forwards on another person’s communication. The new offence will capture only the most serious behaviour and avoid criminalising vulnerable people who share their experiences of self-harm. The preparation of these clauses was informed by extensive consultation with interested groups and campaign bodies. The new offence includes two key elements that constrain the offence to the most culpable offending; namely, that a person’s act must be intended to encourage or assist the serious self-harm of another person and that serious self-harm should amount to grievous bodily harm. If a person does not intend to encourage or assist serious self-harm, as will likely be the case with recovery and supportive material, no offence will be committed. The Law Commission looked at this issue carefully, following evidence from the Samaritans and others, and the implementation will be informed by an ongoing consultation as well.

Lord Clement-Jones (LD)

I am sorry to interrupt the Minister, but the Law Commission recommended that the DPP’s consent should be required. The case that the Minister has made on previous occasions in some of the consultations that he has had with us is that this offence that the Government have proposed is different from the Law Commission one, and that is why they have not included the DPP’s consent. I am rather baffled by that, because the Law Commission was talking about a high threshold in the first place, and the Minister is talking about a high threshold of intent. Even if he cannot do so now, it would be extremely helpful to tie that down. As the noble Baroness and my noble friend said, 130 organisations are really concerned about the impact of this.

Lord Parkinson of Whitley Bay (Con)

The Law Commission recommended that the consent, but not the personal consent, of the Director of Public Prosecutions should be required. We believe, however, that, because the offence already has tight parameters due to the requirement for an intention to cause serious self-harm amounting to grievous bodily harm, as I have just outlined, an additional safeguard of obtaining the personal consent of the Director of Public Prosecutions is not necessary. We would expect the usual prosecutorial discretion and guidance to provide sufficient safeguards against inappropriate prosecutions in this area. As I say, we will continue to engage with those groups that have helped to inform the drafting of these clauses as they are implemented to make sure that that assessment is indeed borne out.

17:45
Amendment 268AZB aims to apply business disruption enforcement measures to any internet service that “persistently fails to prevent”, or indeed allows, the illegal encouragement of self-harm. As I mentioned earlier in Committee, the Bill significantly reduces the likelihood of users encountering this material on internet sites. It requires all user-to-user services to remove this content and search services to minimise users’ access to it. I hope that that reassures the noble Lords in relation to their amendments to my amendment.
Lord Allan of Hallam (LD)

I completely accept that, yes, by requiring the regulated services to prevent access to this kind of content, we will make a significant difference, but it is still the case that there will be—we know there will be, because they exist today—these individual websites, blogs or whatever you want to call them which are not regulated user-to-user services and which are promoting self-harm content. It would be really helpful to know what the Government think should happen to a service such as that, given that it is outside the regulation; it may be persistently breaking the law but be outside our jurisdiction.

Lord Parkinson of Whitley Bay (Con)

I will follow up in writing on that point.

Before I conclude, I will mention briefly the further government amendments in my name, which make technical and consequential amendments to ensure that the communications offences, including the self-harm offence, have the appropriate territorial extent. They also set out the respective penalties for the communications offences in Northern Ireland, alongside a minor adjustment to the epilepsy trolling offence, to ensure that its description is more accurate.

I hope that noble Lords will agree that the new criminal laws that we will make through this Bill are a marked improvement on the status quo. I hope that they will continue to support the government amendments. I express my gratitude to the Law Commission and to all noble Lords—

Baroness Finlay of Llandaff (CB)

Just before the Minister sits down—I assume that he has finished his brief on the self-harm amendments; I have been waiting—I have two questions relating to what he said. First, if I heard him right, he said that the person forwarding on is also committing an offence. Does that also apply to those who set up algorithms that disseminate, as opposed to one individual forwarding on to another individual? Those are two very different scenarios. We can see how one individual forwarding to another could be quite targeted and malicious, and we can see how disseminating through an algorithm could have very widespread harms across a lot of people in a lot of different groups—all types of groups—but I am not clear from what he said that that has been caught in his wording.

Secondly—I will ask both questions while I can—I asked the Minister previously why there have been no prosecutions under the Suicide Act. I understood from officials that this amendment creating an offence was to reflect the Suicide Act and that suicide was not included in the Bill because it was already covered as an offence by the Suicide Act. Yet there have been no prosecutions and we have had deaths, so I do not quite understand why I have not had an answer to that.

Lord Parkinson of Whitley Bay (Con)

I will have to write on the second point to try to set that out in further detail. On the question of algorithms, the brief answer is no, algorithms would not be covered in the way a person forwarding on a communication is covered unless the algorithm has been developed with the intention of causing serious self-harm; it is the intention that is part of the test. If somebody creates an algorithm intending people to self-harm, that could be captured, but if it is an algorithm generally passing it on without that specific intention, it may not be. I am happy to write to the noble Baroness further on this, because it is a good question but quite a technical one.

Baroness Finlay of Llandaff (CB)

It needs to be addressed, because these very small websites already alluded to are providing some extremely nasty stuff. They are not providing support to people and helping decrease the amount of harm to those self-harming but seem to be enjoying the spectacle of it. We need to differentiate and make sure that we do not inadvertently let one group get away with disseminating very harmful material simply because it has a small website somewhere else. I hope that will be included in the Minister’s letter; I do not expect him to reply now.

Lord Moylan (Con)

Some of us are slightly disappointed that my noble friend did not respond to my point on the interaction of Clause 160 with the illegal content duty. Essentially, what appears to be the creation of a criminal offence could simply become a channel for hyperactive censorship on the part of the platforms seeking to prevent the criminal offence taking place. He has not explained that interaction. He may say that there is no interaction and that we would not expect the platforms to take any action against offences under Clause 160, or that we expect a large amount of action, but nothing was said.

Lord Parkinson of Whitley Bay (Con)

If my noble friend will forgive me, I had better refresh my memory of what he said—it was some time ago—and follow up in writing.

Lord Clement-Jones (LD)

My Lords, I will be extremely brief. There is much to chew on in the Minister’s speech and this was a very useful debate. Some of us will be happier than others; the noble Baroness, Lady Buscombe, will no doubt look forward to the digital markets Bill and I will just have to keep pressing the Minister on the Data Protection and Digital Information Bill.

There is a fundamental misunderstanding about digital identity theft. It will not necessarily always be fraud that is demonstrated—the very theft of the identity is designed to be the crime, and it is not covered by the Fraud Act 2006. I am delighted that the Minister has agreed to talk further with the noble Baroness, Lady Kennedy, because that is a really important area. I am not sure that my noble friend will be that happy with the response, but he will no doubt follow up with the Minister on his amendments.

The Minister made a very clear statement on the substantive aspect of the group, the new crime of encouraging self-harm, but further clarification is still needed. We will look very carefully at what he said in relation to what the Law Commission recommended, because it is really important that we get this right. I know that the Minister will talk further with the noble Baroness, Lady Finlay, who is very well versed in this area. In the meantime, I beg leave to withdraw my amendment.

Amendment 264A withdrawn.
The Deputy Chairman of Committees (Lord Lexden) (Con)

My Lords, I now have to go through a mass of amendments that are not to be the subject of debate today as they have been debated previously. I will proceed as swiftly as I can.

Amendments 265 to 267 not moved.
Amendment 267A
Moved by
267A: Clause 160, page 138, line 25, leave out from “liable” to end of line 27 and insert “—
(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both);
(b) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding 6 months or a fine not exceeding level 5 on the standard scale (or both).”
Member’s explanatory statement
This amendment sets out the penalties for the false communications offence in Northern Ireland, since the offence is now to extend to Northern Ireland as well as England and Wales.
Amendment 267A agreed.
Clause 160, as amended, agreed.
Clause 161 agreed.
Clause 162: Threatening communications offence
Amendments 267AA and 267AB not moved.
Amendments 267B and 267C
Moved by
267B: Clause 162, page 139, line 38, after “conviction” insert “in England and Wales”
Member’s explanatory statement
This amendment adds a reference to England and Wales to differentiate the provision from the similar provision applying to Northern Ireland (see the next amendment in the Minister’s name).
267C: Clause 162, page 139, line 39, at end insert—
“(aa) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding 6 months or a fine not exceeding the statutory maximum (or both);”Member’s explanatory statement
This amendment sets out the penalties for the threatening communications offence in Northern Ireland, since the offence is now to extend to Northern Ireland as well as England and Wales.
Amendments 267B and 267C agreed.
Clause 162, as amended, agreed.
Clause 163: Interpretation of sections 160 to 162
Amendment 268 not moved.
Clause 163 agreed.
Clause 164: Offences of sending or showing flashing images electronically
Amendment 268A
Moved by
268A: Clause 164, page 142, line 30, leave out subsection (14)
Member’s explanatory statement
This is a technical amendment about extent - the extent of the epilepsy trolling offence in clause 164 is now dealt with by amendments of clause 210 (see the amendments of that clause in the Minister’s name).
Amendment 268A agreed.
Clause 164, as amended, agreed.
Amendment 268AZA
Moved by
268AZA: After Clause 164, insert the following new Clause—
“Offence of encouraging or assisting serious self-harm
(1) A person (D) commits an offence if—
(a) D does a relevant act capable of encouraging or assisting the serious self-harm of another person, and
(b) D’s act was intended to encourage or assist the serious self-harm of another person.
(2) D “does a relevant act” if D—
(a) communicates in person,
(b) sends, transmits or publishes a communication by electronic means,
(c) shows a person such a communication,
(d) publishes material by any means other than electronic means,
(e) sends, gives, shows or makes available to a person—
(i) material published as mentioned in paragraph (d), or
(ii) any form of correspondence, or
(f) sends, gives or makes available to a person an item on which data is stored electronically.
(3) “Serious self-harm” means self-harm amounting to—
(a) in England and Wales and Northern Ireland, grievous bodily harm within the meaning of the Offences Against the Person Act 1861, and
(b) in Scotland, severe injury,
and includes successive acts of self-harm which cumulatively reach that threshold.
(4) The person referred to in subsection (1)(a) and (b) need not be a specific person (or class of persons) known to, or identified by, D.
(5) D may commit an offence under this section whether or not serious self-harm occurs.
(6) If a person (D1) arranges for a person (D2) to do an act that is capable of encouraging or assisting the serious self-harm of another person and D2 does that act, D1 is to be treated as also having done it.
(7) In the application of subsection (1) to an act by D involving an electronic communication or a publication in physical form, it does not matter whether the content of the communication or publication is created by D (so for example, in the online context, the offence under this section may be committed by forwarding another person’s direct message or sharing another person’s post).
(8) In the application of subsection (1) to the sending, transmission or publication by electronic means of a communication consisting of or including a hyperlink to other content, the reference in subsection (2)(b) to the communication is to be read as including a reference to content accessed directly via the hyperlink.
(9) In the application of subsection (1) to an act by D involving an item on which data is stored electronically, the reference in subsection (2)(f) to the item is to be read as including a reference to content accessed by means of the item to which the person in receipt of the item is specifically directed by D.
(10) A provider of an internet service by means of which a communication is sent, transmitted or published is not to be regarded as a person who sends, transmits or publishes it.
(11) Any reference in this section to doing an act that is capable of encouraging the serious self-harm of another person includes a reference to doing so by threatening another person or otherwise putting pressure on another person to seriously self-harm. “Seriously self-harm” is to be interpreted consistently with subsection (3).
(12) Any reference to an act in this section, except in subsection (3), includes a reference to a course of conduct, and references to doing an act are to be read accordingly.
(13) In subsection (3) “act” includes omission.
(14) A person who commits an offence under this section is liable—
(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);
(b) on summary conviction in Scotland, to imprisonment for a term not exceeding 12 months or a fine not exceeding the statutory maximum (or both);
(c) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding 6 months or a fine not exceeding the statutory maximum (or both);
(d) on conviction on indictment, to imprisonment for a term not exceeding 5 years or a fine (or both).”
Member’s explanatory statement
This amendment inserts a new offence of encouraging or assisting another person to seriously self-harm, with intent to do so, by means of verbal or electronic communications, publications or correspondence.
Amendment 268AZA agreed.
Amendments 268AZB and 268AZC not moved.
Amendment 268AA not moved.
Clause 165: Extra-territorial application and jurisdiction
Amendments 268B to 268F
Moved by
268B: Clause 165, page 142, line 32, leave out subsections (1) and (2)
Member’s explanatory statement
This amendment omits provisions which relate to offences that extended to England and Wales only, as the offences in question are now to extend to Northern Ireland as well.
268C: Clause 165, page 142, line 38, leave out “Section 164(1) applies” and insert “Sections 160(1), 162(1) and 164(1) apply”
Member’s explanatory statement
This amendment, regarding extra-territorial application, is needed because of the extension of the offences in clauses 160 and 162 to Northern Ireland.
268CA: Clause 165, page 142, line 44, at end insert—
“(4A) Section (Offence of encouraging or assisting serious self-harm)(1) applies to an act done outside the United Kingdom, but only if the act is done by a person within subsection (4B).
(4B) A person is within this subsection if the person is—
(a) an individual who is habitually resident in the United Kingdom, or
(b) a body incorporated or constituted under the law of any part of the United Kingdom.”
Member’s explanatory statement
This amendment provides for the extra-territorial application of the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164.
268D: Clause 165, page 143, line 1, leave out subsection (5)
Member’s explanatory statement
This amendment omits a provision which relates to offences that extended to England and Wales only, as the offences in question are now to extend to Northern Ireland as well.
268E: Clause 165, page 143, line 4, after “section” insert “160, 162 or”
Member’s explanatory statement
This amendment, regarding extra-territorial jurisdiction, is needed because of the extension of the offences in clauses 160 and 162 to Northern Ireland.
268EA: Clause 165, page 143, line 7, at end insert—
“(6A) Proceedings for an offence committed under section (Offence of encouraging or assisting serious self-harm) outside the United Kingdom may be taken, and the offence may for incidental purposes be treated as having been committed, at any place in the United Kingdom.
(6B) In the application of subsection (6A) to Scotland, any such proceedings against a person may be taken, and the offence may for incidental purposes be treated as having been committed—
(a) in any sheriff court district in which the person is apprehended or is in custody, or
(b) in such sheriff court district as the Lord Advocate may determine.
(6C) In subsection (6B) “sheriff court district” is to be construed in accordance with the Criminal Procedure (Scotland) Act 1995 (see section 307(1) of that Act).”
Member’s explanatory statement
This amendment is required in order to give courts in the United Kingdom jurisdiction to deal with the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164, if the offence is committed outside the United Kingdom.
268F: Clause 165, page 143, line 8, leave out subsection (7)
Member’s explanatory statement
This is a technical amendment about extent - the extent of clause 165 is now dealt with by amendments of clause 210 (see the amendments of that clause in the Minister’s name).
Amendments 268B to 268F agreed.
Clause 165, as amended, agreed.
Clause 166: Liability of corporate officers
Amendments 268FA to 268G
Moved by
268FA: Clause 166, page 143, line 10, leave out “or 164” and insert “, 164 or (Offence of encouraging or assisting serious self-harm)”
Member’s explanatory statement
This amendment ensures that clause 166, which is about the liability of corporate officers for offences, applies in relation to the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164.
268FB: Clause 166, page 143, line 22, at end insert—
“(2A) If an offence under section (Offence of encouraging or assisting serious self-harm) is committed by a Scottish partnership and it is proved that the offence—(a) has been committed with the consent or connivance of a partner of the partnership, or(b) is attributable to any neglect on the part of a partner of the partnership,the partner (as well as the partnership) commits the offence and is liable to be proceeded against and punished accordingly. (2B) “Partner”, in relation to a Scottish partnership, includes any person who was purporting to act as a partner.”
Member’s explanatory statement
This amendment ensures that clause 166, which is about the liability of corporate officers for offences, applies to Scottish partnerships.
268G: Clause 166, page 143, line 23, leave out subsection (3)
Member’s explanatory statement
This is a technical amendment about extent - the extent of clause 166 is now dealt with by amendments of clause 210 (see the amendments of that clause in the Minister’s name).
Amendments 268FA to 268G agreed.
Clause 166, as amended, agreed.
Clause 167: Sending etc photograph or film of genitals
Amendments 269 and 270 not moved.
Clause 167 agreed.
Amendment 271 not moved.
Clause 168: Repeals in connection with offences under sections 160 and 162
Amendments 271A and 271B
Moved by
271A: Clause 168, page 144, line 17, after “Wales” insert “and Northern Ireland”
Member’s explanatory statement
This amendment ensures that section 127(2)(a) and (b) of the Communications Act 2003 is repealed for Northern Ireland as well as England and Wales (because the false communications offence in clause 160 is now to extend to Northern Ireland as well).
271B: Clause 168, page 144, line 22, at end insert—
“(3) The following provisions of the Malicious Communications (Northern Ireland) Order 1988 (S.I. 1988/1849 (N.I. 18)) are repealed—(a) Article 3(1)(a)(ii),(b) Article 3(1)(a)(iii), and(c) Article 3(2).”
Member’s explanatory statement
This amendment amends the specified Northern Ireland legislation in consequence of the extension of the false and threatening communications offences to Northern Ireland.
Amendments 271A and 271B agreed.
Clause 168, as amended, agreed.
Clause 169: Consequential amendments
Amendment 271BA
Moved by
271BA: Clause 169, page 144, line 25, at end insert—
“(1A) Part 1A of Schedule 14 contains amendments consequential on section (Offence of encouraging or assisting serious self-harm).”
Member’s explanatory statement
This amendment introduces a Part of Schedule 14 containing consequential amendments related to the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164.
Amendment 271BA agreed.
Clause 169, as amended, agreed.
Schedule 14: Amendments consequential on offences in Part 10 of this Act
Amendments 271C to 271F
Moved by
271C: Schedule 14, page 231, line 33, leave out from “2003” to “after” in line 34 and insert “, in the list of offences for England and Wales,”
Member’s explanatory statement
This amendment makes it clearer that changes to the Sexual Offences Act 2003 in paragraph 2 of Schedule 14 to the Bill relate to England and Wales only (since the next amendment in the Minister’s name makes equivalent amendments for Northern Ireland).
271D: Schedule 14, page 231, line 38, at end insert—
“2A_ In Schedule 5 to the Sexual Offences Act 2003, in the list of offences for Northern Ireland, after paragraph 171H insert—“171I_ An offence under section 160 of the Online Safety Act 2023 (false communications).171J_ An offence under section 162 of that Act (threatening communications).””
Member’s explanatory statement
This amendment concerns offences relevant to the making of certain orders under the Sexual Offences Act 2003. Now that the false and threatening communications offences under this Bill are to extend to Northern Ireland, this amendment updates the references in Schedule 5 to the Sexual Offences Act that relate to Northern Ireland.
271E: Schedule 14, page 232, line 14, after “sending” insert “or showing”
Member’s explanatory statement
This amendment makes a minor change to the description of the epilepsy trolling offence so that the description is more accurate.
271F: Schedule 14, page 232, line 14, at end insert—
“Part 1A
AMENDMENTS CONSEQUENTIAL ON OFFENCE IN SECTION (ENCOURAGING OR ASSISTING SERIOUS SELF-HARM)
Children and Young Persons Act 1933
4A_ In Schedule 1 to the Children and Young Persons Act 1933 (offences against children and young persons with respect to which special provisions of Act apply), after the entry relating to the Suicide Act 1961 insert—“An offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (encouraging or assisting serious self-harm) where the relevant act is an act capable of, and done with the intention of, encouraging or assisting the serious self-harm of a child or young person.”
Visiting Forces Act 1952
4B_(1) The Schedule to the Visiting Forces Act 1952 (offences referred to in section 3) is amended as follows.(2) In paragraph 1(b), after paragraph (xv) insert—“(xvi) section (Offence of encouraging or assisting serious self-harm) of the Online Safety Act 2023;”.(3) In paragraph 2(b), after paragraph (iv) insert—“(v) section (Offence of encouraging or assisting serious self-harm) of the Online Safety Act 2023;”.
Children and Young Persons Act (Northern Ireland) 1968 (c. 34 (N.I.))
4C_ In Schedule 1 to the Children and Young Persons Act (Northern Ireland) 1968 (offences against children and young persons with respect to which special provisions of Act apply), after the entry relating to the Criminal Justice Act (Northern Ireland) 1966 insert—“An offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (encouraging or assisting serious self-harm) where the relevant act is an act capable of, and done with the intention of, encouraging or assisting the serious self-harm of a child or young person.”
Criminal Attempts Act 1981
4D_ In section 1 of the Criminal Attempts Act 1981 (attempting to commit an offence), in subsection (4), after paragraph (c) insert—“(d) an offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (encouraging or assisting serious self-harm).”
Criminal Attempts and Conspiracy (Northern Ireland) Order 1983 (S.I. 1983/1120 (N.I. 13))
4E_ In Article 3 of the Criminal Attempts and Conspiracy (Northern Ireland) Order 1983 (attempting to commit an offence), in paragraph (4), after sub-paragraph (c) insert—“(ca) an offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (encouraging or assisting serious self-harm);”
Armed Forces Act 2006
4F_ In Schedule 2 to the Armed Forces Act 2006 (“Schedule 2 offences”), in paragraph 12, at the end insert—“(ba) an offence under section (Offence of encouraging or assisting serious self-harm) of the Online Safety Act 2023 (encouraging or assisting serious self-harm).”
Serious Crime Act 2007
4G_(1) The Serious Crime Act 2007 is amended as follows.(2) In section 51A (exceptions to section 44 for encouraging or assisting suicide)—(a) the existing text becomes subsection (1);(b) after that subsection insert—“(2) Section 44 does not apply to an offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (offence of encouraging or assisting serious self-harm).”;(c) in the heading, at the end insert “or serious self-harm”.(3) In Part 1 of Schedule 3 (listed offences: England and Wales and Northern Ireland), after paragraph 24A insert—“Online Safety Act 2023
24B_ An offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (encouraging or assisting serious self-harm).””
Member’s explanatory statement
This amendment makes changes which are consequential on the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164. Among other things, changes are proposed to the Criminal Attempts Act 1981 and the Serious Crime Act 2007 to ensure that offences of attempt and encouragement etc in those Acts do not apply in relation to the new offence, because that offence is itself an inchoate offence.
Amendments 271C to 271F agreed.
Schedule 14, as amended, agreed.
Clause 170: Providers’ judgements about the status of content
Amendments 272 to 283ZA not moved.
Clause 170 agreed.
18:00
Clause 171: OFCOM’s guidance about illegal content judgements
Amendment 283A
Moved by
283A: Clause 171, page 145, line 43, at end insert “, and
(b) judgements by providers about whether news publisher content amounts to a relevant offence (see section 14(5) and (10)).”
Member’s explanatory statement
This amendment, in effect, re-states the provision currently in clause 14(11), requiring OFCOM’s guidance under clause 171 to cover the judgements described in the amendment.
Amendment 283A agreed.
Amendment 284 not moved.
Clause 171, as amended, agreed.
Clauses 172 to 174 agreed.
Schedule 15 agreed.
Clauses 175 and 176 agreed.
Amendment 284A
Moved by
284A: After Clause 176, insert the following new Clause—
“Offence of failure to comply with confirmation decision: supplementary
(1) Where a penalty has been imposed on a person by a penalty notice under section 126 in respect of a failure constituting an offence under section (Confirmation decisions: offence)(failure to comply with certain requirements of a confirmation decision), no proceedings may be brought against the person for that offence.(2) A penalty may not be imposed on a person by a penalty notice under section 126 in respect of a failure constituting an offence under section (Confirmation decisions: offence) if—(a) proceedings for the offence have been brought against the person but have not been concluded, or(b) the person has been convicted of the offence.(3) Where a service restriction order under section 131 or an access restriction order under section 133 has been made in relation to a regulated service provided by a person in respect of a failure constituting an offence under section (Confirmation decisions: offence), no proceedings may be brought against the person for that offence.”
Member’s explanatory statement
This amendment ensures, among other things, that a person cannot be prosecuted for the new offence created by the new clause to be inserted after clause 125 in the Minister’s name if OFCOM have imposed a financial penalty for the same conduct instead, and vice versa.
Amendment 284A agreed.
Clauses 177 to 179 agreed.
Clause 180: Extra-territorial application
Amendments 284B and 284C
Moved by
284B: Clause 180, page 150, line 23, leave out “Section 121(7)” and insert “Sections 121(7) and 137(11)”
Member’s explanatory statement
This amendment adds a reference to clause 137(11) so that that provision (which is about enforcement by civil proceedings) has extra-territorial application.
284C: Clause 180, page 150, line 24, leave out “applies” and insert “apply”
Member’s explanatory statement
This amendment is consequential on the preceding amendment in the Minister’s name.
Amendments 284B and 284C agreed.
Clause 180, as amended, agreed.
Clause 181: Information offences: extra-territorial application and jurisdiction
Amendments 284D to 284F
Moved by
284D: Clause 181, page 150, line 29, at end insert—
“(2A) Section (Confirmation decisions: offence) applies to acts done by a person in the United Kingdom or elsewhere (offence of failure to comply with confirmation decision).”
Member’s explanatory statement
This amendment gives wide extra-territorial effect to the new offence created by the new clause to be inserted after clause 125 in the Minister’s name (failure to comply with certain requirements of a confirmation decision).
284E: Clause 181, page 150, line 31, after “subsection (1)” insert “or (2A)”
Member’s explanatory statement
This amendment extends the extra-territorial effect of the new offence of failure to comply with certain requirements of a confirmation decision in the case of senior managers etc who may commit the offence under clause 178(2) or 179(5).
284F: Clause 181, page 150, line 34, leave out “or 101” and insert “, 101 or (Confirmation decisions: offence)”
Member’s explanatory statement
This amendment is required in order to give United Kingdom courts jurisdiction to deal with the new offence of failure to comply with certain requirements of a confirmation decision if it is committed elsewhere.
Amendments 284D to 284F agreed.
Clause 181, as amended, agreed.
Clauses 182 to 184 agreed.
Amendments 285 and 286 not moved.
The Deputy Chairman of Committees (Lord Lexden) (Con)

Land is in sight! I call Amendment 286ZA.

Amendment 286ZA

Moved by
286ZA: After Clause 184, insert the following new Clause—
“Artificial intelligence: labelling of machine-generated content
Within the period of six months beginning with the day on which this Act is passed, the Secretary of State must publish draft legislation with provisions requiring providers of regulated services to put in place systems and processes for—(a) identifying content on their service which is machine-generated, and(b) informing users of the service that such content is machine-generated.”
Member’s explanatory statement
This probing amendment is to facilitate a discussion around the potential labelling of machine-generated content, which is a measure being considered in other jurisdictions.
Lord Knight of Weymouth (Lab)

My Lords, that was a bravura performance by the noble Lord, Lord Lexden. We thank him. To those listening in the Public Gallery, I should say that we debated most of those; it was not quite as on the nod as it looked.

Amendment 286ZA, in the name of my noble friend Lord Stevenson, seeks to address a critical issue in our digital landscape: the labelling of AI-generated content on social media platforms.

As we navigate the ever-evolving world of technology, it is crucial that we uphold transparency, safeguarding the principles of honesty and accountability. Social media has become an integral part of our lives, shaping public discourse, disseminating information and influencing public opinion. However, the rise of AI-powered algorithms and tools has created a new challenge: an increasing amount of content generated by artificial intelligence without explicit disclosure.

We live in an age where AI is capable of creating incredibly realistic text, images and even videos that can be virtually indistinguishable from those generated by humans. While this advancement holds immense potential, it also raises concerns regarding authenticity, trust and the ethical implications of AI-generated content. The proposed amendment seeks to address this concern by advocating for a simple but powerful solution—labelling AI-generated content as such. By clearly distinguishing human-generated content from AI-generated content, we empower individuals to make informed decisions about the information they consume, promoting transparency and reducing the potential for misinformation or manipulation.

Labelling AI-generated content serves several crucial purposes. First and foremost, it allows individuals to differentiate between information created by humans and that generated by algorithms in an era where misinformation and deep fakes pose a significant threat to public trust. Such labelling becomes a vital tool to protect and promote digital literacy.

Secondly, it enables users to better understand the potential biases and limitations of AI-generated content. AI algorithms are trained on vast datasets, and without labelling, individuals might unknowingly attribute undue credibility to AI-generated information, assuming it to be wholly objective and reliable. Labelling, however, helps users to recognise the context and provides an opportunity for critical evaluation.

Furthermore, labelling AI-generated content encourages responsible behaviour from the platforms themselves. It incentivises social media companies to develop and implement AI technologies with integrity and transparency, ensuring that users are aware of the presence and influence of AI in their online experiences.

Some may argue that labelling AI-generated content is an unnecessary burden or that it could stifle innovation. However, the intention behind this amendment is not to impede progress but to foster a healthier digital ecosystem built on trust, integrity and informed decision-making. By promoting transparency, we can strike a balance that allows innovation to flourish while safeguarding the interests of individuals and society as a whole.

In conclusion, the amendment to label AI-generated content on social media platforms represents a crucial step forward in addressing the challenges of the digital age. By embracing transparency and empowering individuals, we can foster a more informed and discerning society. Let us lead by example and advocate for a digital landscape that values accountability, integrity and the rights of individuals. I urge your Lordships to support this amendment as we strive to build a future where technology works hand-in-hand with humanity for the betterment of all.

In the spirit of the amendment, I must flag that my entire speaking note was generated by AI, as the noble Lord, Lord Allan, from his expression, had clearly guessed. I used this tool not to belittle the amendment but to illustrate that these tools are already infiltrating everyday life and can supercharge misinformation. We need to do something to help internet users trust what they read.

Lord Clement-Jones (LD)

Does the noble Lord agree that the fact that we did not notice his speech was generated by AI somewhat damages his argument?

Lord Knight of Weymouth (Lab)

The fact that I labelled it as being AI-generated helped your Lordships to understand, and the transparency eases the debate. I beg to move.

Baroness Kidron (CB)

My Lords, I thank the noble Lord, Lord Knight, for laying out the amendment and recognise that there was a very thoughtful debate on the subject of machine-generated content on Amendment 125 in my name on a previous day of Committee.

I appreciate that the concept of labelling or watermarking machine-generated material is central to recent EU legislation, but I am equally aware that there is more than one school of thought on the efficacy of that approach among AI experts. On the one hand, as the noble Lord, Lord Knight, beautifully set out—with the help of his artificial friend—there are those who believe that visibly marking the division of real and altered material is a clue for the public to look more carefully at what they are seeing and that labelling it might provide an opportunity for both creators and digital companies to give greater weight to “human-created material”. For example, it could be that the new BBC Verify brand is given greater validity by the public, or that Google’s search results promote it above material labelled as machine-generated as a more authentic source. There are others who feel that the scale of machine-generated material will be so vast that this labelling will be impossible or that labelling will downgrade the value of very important machine-generated material in the public imagination, when in the very near future it is likely that most human activity will be a blend of generated material and human interaction.

I spent the first part of this week locked in a room with others at the Institute for Ethics in AI in Oxford debating some of these issues. While this is a very live discussion, one thing is clear: if we are to learn from history, we must act now before all is certain, and we should act with pragmatism and a level of humility. It may be that either or both sets of experts are correct.

Industry has clearly indicated that there is an AI arms race, and many companies are launching services that they do not understand the implications of. This is not my view but one told to me by a company leader, who said that the speed of distribution was so great that the testing was confined to whether deploying large language models crashed the platforms; there was no testing for safety.

The noble Lord, Lord Stevenson, says in his explanatory statement that this is a probing amendment. I therefore ask the Minister whether we might meet before Report and look once again at the gaps that might be covered by some combination of Amendment 125 and the amendment in front of us, to make certain that the Bill adequately reflects the concerns raised by the enforcement community and reflects the advice of those who best understand the latest iterations of the digital world.

The Communications Act 2003 made a horrible mistake in not incorporating digital within it; let us not do the same here. Adding explicit safety duties to AI and machine learning would not slow down innovation but would ensure that innovation is not short-sighted and dangerous for humanity. It is a small amendment for what may turn out to be an unimaginably important purpose.

Lord Allan of Hallam (LD)

My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron. I will try to keep my remarks brief.

It is extremely helpful that we have the opportunity to talk about this labelling question. I see it more as a kind of aperitif for our later discussion of AI regulation writ large. Given that it is literally aperitif hour, I shall just offer a small snifter as to why I think there may be some challenges around labelling—again, perhaps that is not a surprise to the noble Baroness.

When we make rules, as a general matter we tend to assume that people are going to read them and respond in a rationalist, conformist way. In reality, particularly in the internet space, we often see that there is a mixed environment and there will be three groups. There are the people who will look at the rules and respond in that rational way to them; a large group of people will just ignore them—they will simply be unaware and not at all focused on the rules; and another group will look for opportunities to subvert them and use them to their own advantage. I want to comment particularly on that last group by reference to cutlery and call centres, two historic examples of where rules have been subverted.

On the cutlery example, I am a Sheffielder, and “Made in Sheffield” used to mean that you had made the entire knife in Sheffield. Then we had this long period when we went from knives being made in Sheffield to bringing them to Sheffield and silver-plating them, to eventually just sharpening them and putting them in boxes. That is relevant in the context of AI. Increasingly, if there is an advantage to be gained by appearing to be human, people will look at what kind of finishing you need, so: “The content may have been generated by AI but the button to post it was pushed by a human, therefore we do not think it is AI because we looked at it and posted it”. On the speech of the noble Lord, Lord Knight, does the fact that my noble friend intervened on him and the noble Lord had to use some of his own words now mean that his speech in Hansard would not have to be labelled “AI-generated” because we have now departed from it? Therefore, there is that question of individuals who will want something to appear human-made even if it was largely AI-generated, and whether they will find the “Made in Sheffield” way of bypassing it.

Interestingly, we may see the phenomenon flipping the other way, and this is where my call centres come in. If people go to a popular search engine and type in “SpinVox”, they will see the story of a tech company that promised to transcribe voicemails into written text. This was a wonderful use of technology, and it was valued on the basis that it had developed that fantastic technology. However, it turned out—or at least there were claims, which I can repeat here under privilege—that it was using call centres in low-cost, low-wage environments to type those messages out. Therefore, again, we may see, curiously, some people seeing an advantage to presenting content as AI-generated when it is actually made by humans. That is just to flag that up—as I say, it is a much bigger debate that we are going to have. It is really important that we are having it, and labelling has a role to play. However, as we think about it, I urge that we remember those communities of people who will look at whatever rules we come up with and say, “Aha! Where can I get advantage?”, either by claiming that something is human when it is generated by AI or claiming that it is generated by AI if it suits them when it was actually produced by humans.

18:15
Baroness Bennett of Manor Castle (GP)

My Lords, it is a pleasure to follow the noble Lord, Lord Allan. He reminded me of significant reports of the huge amount of exploitation in the digital sector that has come from identification of photos. A great deal of that is human labour, even though it is often claimed to have been done through machine intelligence.

In speaking to this late but important amendment, I thank the noble Lords, Lord Stevenson and Lord Knight, for giving us the chance to do so, because, as every speaker has said, this is really important. I should declare my position as a former newspaper editor. I distinctly recall teasing a sports journalist in the early 1990s when it was reported that journalists were going to be replaced by computer technology. I said that the sports journalists would be the first to go because they just wrote to a formula anyway. I apologise to sports journalists everywhere.

The serious point behind that is that a lot of extreme, high claims are now being made about so-called artificial intelligence. I declare myself an artificial-intelligence sceptic. What we have now—so-called generative AI—is essentially big data. To quote the science fiction writer, Ted Chiang, what we have is applied statistics. Generative AI relies on looking at what already exists, and it cannot produce anything original. In many respects, it is a giant plagiarism machine. There are huge issues, beyond the scope of the Bill, around intellectual property and the fact that it is not generating anything original.

None the less, it is generating what people in the sector like to describe as hallucinations, which might otherwise be described as errors, falsehoods or lies. This is where quotes are made up; ideas are presented which, at first glance, look as though they make sense but fall apart under examination; and data is actively invented. There is one rather famous case where a lawyer got himself into a great deal of trouble by producing a whole lot of entirely false cases that a bot generated for him. We need to be really careful, and this amendment shows us a way forward in attempting to deal with some of the issues we are facing.

To pick up the points made by the noble Lord, Lord Allan, about the real-world impacts, I was at an event in Parliament this week entitled “The Worker Experience of the AI Revolution”, run by the TUC and Connected by Data. It highlighted what has happened with a lot of the big data exercises already in operation: rather than humans being replaced by robots, people are being forced to act like robots. We heard from Royal Mail and Amazon workers, who are monitored closely and expected to act like machines. That is just one example of the unexpected outcomes of the technologies we have been exercising in recent years.

I will make two final comments. First, I refer to 19th-century Luddite John Booth, who was tortured to death by the state. He was a Luddite, but he was also on the record as saying that new machinery

“might be man’s chief blessing instead of his curse if society were differently constituted”.

History is not pre-written; it is made by the choices, laws and decisions we make in this Parliament. Given where we are at the moment with so-called AI, I urge that caution really is warranted. We should think about putting some caution in the Bill, which is what this amendment points us towards.

My final point relates to an amendment I was not allowed to table because, I was told, it was out of scope. It asked the Secretary of State to report on the climate emissions coming from the digital sector, specifically from artificial intelligence. The noble Baroness, Lady Kidron, said that it will operate on a vast scale. I point out that, already, the digital sector is responsible for 3% of the world’s electricity use and 2% of the world’s carbon emissions, which is about the same as the airline sector. We really need to think about caution. I very much agree with everyone who said that we need to have more discussions on all these issues before Report.

Lord Clement-Jones (LD)

My Lords, this is a real hit-and-run operation from the noble Lord, Lord Stevenson. He has put down an amendment on my favourite subject in the last knockings of the Bill. It is totally impossible to deal with this now—I have been thinking and talking about the whole area of AI governance and ethics for the past seven years—so I am not going to try. It is important, and the advisory committee under Clause 139 should take it into account. Actually, this is much more a question of authenticity and verification than of content. Trying to work out whether something is ChatGPT or GPT-4 content is a hopeless task; you are much more likely to be able to identify whether these are automated users such as chatbots than you are to know about the content itself.

I will leave it there. I missed the future-proofing debate, which I would have loved to have been part of. I look forward to further debates with the noble Viscount, Lord Camrose, on the deficiencies in the White Paper and to the Prime Minister’s much more muscular approach to AI regulation in future.

Lord Parkinson of Whitley Bay (Con)

I am sure that the noble Lord, Lord Stevenson of Balmacara, is smiling over a sherry somewhere about the debate he has facilitated. His is a useful probing amendment and we have had a useful discussion.

The Government certainly recognise the potential challenges posed by artificial intelligence and digitally manipulated content such as deepfakes. As we have heard in previous debates, the Bill ensures that machine-generated content on user-to-user services created by automated tools or machine bots will be regulated where appropriate. Clause 49(4)(b) means that machine-generated content is regulated unless the bot or automated tool producing the content is controlled by the provider of the service.

The labelling of this content via draft legislation is not something to which I can commit today. The Government’s AI regulation White Paper sets out the principles for the responsible development of artificial intelligence in the UK. These principles, such as safety, transparency and accountability, are at the heart of our approach to ensuring the responsible development and use of AI. As set out in the White Paper, we are building an agile approach that is designed to be adaptable in response to emerging developments. We do not wish to introduce a rigid, inflexible form of legislation for what is a flexible and fast-moving technology.

The public consultation on these proposals closed yesterday so I cannot pre-empt our response to it. The Government’s response will provide an update. I am joined on the Front Bench by the Minister for Artificial Intelligence and Intellectual Property, who is happy to meet with the noble Baroness, Lady Kidron, and others before the next stage of the Bill if they wish.

Beyond labelling such content, I can say a bit to make it clear how the Bill will address the risks coming from machine-generated content. The Bill already deals with many of the most serious and illegal forms of manipulated media, including deepfakes, when they fall within scope of services’ safety duties regarding illegal content or content that is potentially harmful to children. Ofcom will recommend measures in its code of practice to tackle such content, which could include labelling where appropriate. In addition, the intimate image abuse amendments that the Government will bring forward will make it a criminal offence to send deepfake images.

In addition to ensuring that companies take action to keep users safe online, we are taking steps to empower users with the skills they need to make safer choices through our work on media literacy. Ofcom, for example, has an ambitious programme of work through which it is funding several initiatives to build people’s resilience to harm online, including initiatives designed to equip people with the skills to identify disinformation. We are keen to continue our discussions with noble Lords on media literacy and will keep an open mind on how it might be a tool for raising awareness of the threats of disinformation and inauthentic content.

With gratitude to the noble Lords, Lord Stevenson and Lord Knight, and everyone else, I hope that the noble Lord, Lord Knight, will be content to withdraw his noble friend’s amendment.

Lord Knight of Weymouth (Lab)

My Lords, I am grateful to everyone for that interesting and quick debate. It is occasionally one’s lot that somebody else tables an amendment but is unavoidably detained in Jerez, drinking sherry, and monitoring things in Hansard while I move the amendment. I am perhaps more persuaded than my noble friend might have been by the arguments that have been made.

We will return to this in other fora in response to the need to regulate AI. However, in the meantime, I enjoyed in particular the John Booth quote from the noble Baroness, Lady Bennett. In respect of this Bill and any of the potential harms around generative AI, if we have a Minister who is mindful of the need for safety by design when we have concluded this Bill then we will have dealt with the bits that we needed to deal with as far as this Bill is concerned.

Lord Allan of Hallam (LD)

Can the noble Lord confirm whether he generated those comments himself, or was he on his phone while we were speaking?

Lord Knight of Weymouth (Lab)

I do not have an invisible earpiece feeding me my lines—that was all human-generated. I beg leave to withdraw the amendment.

Amendment 286ZA withdrawn.
Clause 185 agreed.
Schedule 16 agreed.
Clauses 186 and 187 agreed.
Schedule 17: Video-sharing platform services: transitional provision etc
Amendment 286A
Moved by
286A: Schedule 17, page 239, line 36, after “19(2)” insert “and (8A)”
Member’s explanatory statement
This amendment ensures that, during the transitional period when video-sharing platform services continue to be regulated by Part 4B of the Communications Act 2003, providers of such services are not exempt from the new duty in clause 19 to supply records of risk assessments to OFCOM.
Amendment 286A agreed.
Schedule 17, as amended, agreed.
Clause 188: Repeals: Digital Economy Act 2017
Amendment 286B
Moved by
286B: Clause 188, page 154, line 1, after “119(10)” insert “and (11)”
Member’s explanatory statement
This amendment effects the repeal of a provision of the Digital Economy Act 2017 which solely relates to another provision of that Act being repealed.
Amendment 286B agreed.
Clause 188, as amended, agreed.
Clauses 189 to 196 agreed.
Clause 197: Parliamentary procedure for regulations
Amendments 287 to 289 not moved.
Clause 197 agreed.
Amendment 290 not moved.
Clauses 198 to 201 agreed.
Clause 202: “Proactive technology”
Amendments 290A to 290G
Moved by
290A: Clause 202, page 166, line 3, leave out “moderation” and insert “identification”
Member’s explanatory statement
This amendment re-names “content moderation technology” as “content identification technology” as that term is more accurate.
290B: Clause 202, page 166, line 7, leave out “moderation” and insert “identification”
Member’s explanatory statement
This amendment is consequential on the first amendment of clause 202 in the Minister’s name.
290C: Clause 202, page 166, line 9, leave out from “analyses” to end of line 11 and insert “content to assess whether it is content of a particular kind (for example, illegal content).”
Member’s explanatory statement
This amendment revises the definition of content identification technology so that the restrictions in the Bill on OFCOM recommending or requiring the use of proactive technology apply to content identification technology operating on any kind of content.
290D: Clause 202, page 166, line 12, leave out “moderation” and insert “identification”
Member’s explanatory statement
This amendment is consequential on the first amendment of clause 202 in the Minister’s name.
290E: Clause 202, page 167, line 4, leave out “moderation” and insert “identification”
Member’s explanatory statement
This amendment is consequential on the first amendment of clause 202 in the Minister’s name.
290F: Clause 202, page 167, line 9, leave out “moderation” and insert “identification”
Member’s explanatory statement
This amendment is consequential on the first amendment of clause 202 in the Minister’s name.
290G: Clause 202, page 167, leave out lines 15 to 18
Member’s explanatory statement
This amendment is consequential on the first amendment of clause 202 in the Minister’s name.
Amendments 290A to 290G agreed.
Clause 202, as amended, agreed.
Clause 203: Content communicated “publicly” or “privately”
Amendment 290H
Moved by
290H: Clause 203, page 167, line 38, at end insert “, or
(ii) users of another internet service.”
Member’s explanatory statement
This amendment concerns the factors that OFCOM must particularly consider when deciding if content is communicated publicly or privately. The change ensures that one such factor is how easily the content may be shared with users of another service.
Amendment 290H agreed.
Clause 203, as amended, agreed.
Clause 204: “Functionality”
Amendments 291 to 293 not moved.
Clause 204 agreed.
Clause 205: “Harm” etc
Amendments 294 and 295 not moved.
Clause 205 agreed.
Clause 206 agreed.
Amendment 296 not moved.
18:30
Clause 207: Interpretation: general
Amendment 297
Moved by
297: Clause 207, page 170, line 13, leave out from “means” to end of line 14 and insert “any system of checking age or age range (including age estimation and age verification);
“age estimation” includes reference to an age range or an age expressed in years;
“age verification” means the exact age of a person in years, months, and days or an established date of birth;”
Member’s explanatory statement
This amendment defines the meaning of age assurance in the Bill to recognise it includes any test of age including but not limited to verification. Age verification means the exact age of a person in years, months, and days or a date of birth. Age estimation may refer to an age range or an age expressed in years. This is a definition of terms only: the intention is that Ofcom will produce guidance of what level of assurance is required in different settings.
Baroness Kidron (CB)

My Lords, we already had a long debate on this subject earlier in Committee. In the interim, many noble Lords associated with these amendments have had conversations with the Government, which I hope will bear some fruit before Report. Today, I want to reiterate a few points that I hope are clarifying to the Committee and the department. In the interests of everyone’s evening plans, the noble Lord, Lord Bethell, and the noble Baroness, Lady Harding, wish to associate themselves with these remarks so that they represent us in our entirety.

For many years, we thought age verification was a gold standard, primarily because it involved a specific government-issued piece of information such as a passport. By the same token, we thought age estimation was a lesser beast, given that it is an estimate by its very nature and that the sector primarily relied on self-declarations with very few checks and balances. In recent years, many approaches to age checking have flourished. Some companies provide age assurance tokens based on facial recognition; others use multiple signals of behaviour, friendship group, parental controls and how you move your body in gameplay; and, only yesterday, I saw the very impressive on-device privacy-preserving age-verification system that Apple rolled out in the US two weeks ago. All of these approaches, used individually and cumulatively, have a place in the age-checking ecosystem, and all will become more seamless over time. But we must ensure that, when they are used, they are adequate for the task they are performing and are quality controlled so that they do not share information about a child, are secure and are effective.

That is why, at the heart of the package of measures put forward in my name and that of the noble Lords, Lord Stevenson and Lord Bethell, and the right reverend Prelate the Bishop of Oxford, are two concepts. First, the method of measuring age should be tech neutral so that all roads can be used. Secondly, there must be a robust mechanism for measuring effectiveness, so that only effective systems can be used in high-risk situations, particularly those involving primary priority harms such as self-harm and pornography, and so that such measurement is determined by Ofcom, not industry.

From my work over the last decade and from recent discussions with industry, I am certain that any regime of age assurance must be measurable and hold to certain principles. We cannot create a situation where children’s data is loosely held and liberally shared; we cannot have a system that discriminates against, or lacks automatic appeal mechanisms for, children of colour or those who are 17 or 19, for whom the likelihood of error is greatest. Systems should aim to be interoperable and private, and should not leave traces as children go from one service to another.

Each of the principles of our age-verification package set out in the schedule is of crucial importance. I hope that the Government will see the sense in that because, without them, this age checking will not be trusted. Equally, I urge the Committee to embrace the duality of age verification and estimation that the Government have put forward. If a child uses an older sibling’s form of verification and a company understands through the child’s behaviour that they are indeed a child, we do not want a perverse situation in which the verification is considered of a higher order and the company cannot take action based on estimation; likewise, if estimation in gameplay is more accurate than tokens that verify whether someone is over or under 18, it may well be that estimation gives greater assurance that the company will treat the child according to their age.

I hope and believe that, in his response, the Minister will confirm that definitions of age assurance and age estimation will be on the face of the Bill. I also urge him to make a generous promise to accept the full gamut of our concerns about age checking and bring forward amendments in his name on Report that reflect them in full. I beg to move.

Lord Clement-Jones (LD)

My Lords, I associate these Benches with the introduction by the noble Baroness, Lady Kidron, support her amendments and, likewise, hope that they form part of the package that is trundling on its way towards us.

Lord Knight of Weymouth (Lab)

My Lords, what more can I say than that I wish to be associated with the comments made by the noble Baroness and then by the noble Lord, Lord Clement-Jones? I look forward to the Minister’s reply.

Lord Parkinson of Whitley Bay (Con)

I am very grateful to the noble Baroness for her amendment, which is a useful opportunity for us to state publicly and share with the Committee the progress we have been making in our helpful discussions on these issues in relation to these amendments. I am very grateful to her and to my noble friends Lord Bethell and Lady Harding for speaking as one on this, including, as is well illustrated, in this short debate this evening.

As the noble Baroness knows, discussions continue on the precise wording of these definitions. I share her optimism that we will be able to reach agreement on a suitable way forward, and I look forward to working with her, my noble friends and others as we do so.

The Bill already includes a definition of age assurance in Clause 207, which is

“measures designed to estimate or verify the age or age-range of users of a service”.

As we look at these issues, we want to avoid using words such as “checking”, which suggests that providers need to take a proactive approach to checking age, as that may inadvertently preclude the use of technologies which determine age through other means, such as profiling. It is also important that any definition of age assurance does not restrict the current and future use of innovative and accurate technologies. I agree that it is important that there should be robust definitions for terms which are not currently defined in the Bill, such as age verification, and recommit to the discussions we continue to have on what terms need to be defined and the best way to define them.

This has been a very helpful short debate with which to end our deliberations in Committee. I am very grateful to noble Lords for all the points that have been raised over the past 10 days, and I am very glad to be ending in this collaborative spirit. There is much for us still to do, and even more for the Office of the Parliamentary Counsel to do, before we return on Report, and I am grateful to it and to the officials working on the Bill. I urge the noble Baroness to withdraw her amendment.

Baroness Kidron (CB)

I beg leave to withdraw the amendment.

Amendment 297 withdrawn.
Amendments 298 to 304 not moved.
Clause 207 agreed.
Clauses 208 and 209 agreed.
Clause 210: Extent
Amendments 304A and 304B
Moved by
304A: Clause 210, page 175, line 24, leave out “Except as provided by subsections (2) to (7)” and insert “Subject to the following provisions of this section”
Member’s explanatory statement
This amendment avoids any implication that the power proposed to be inserted by the amendment of the extent clause in the Minister’s name giving power to extend provisions of the Bill to the Crown Dependencies, and related provisions, are limited in extent to the United Kingdom.
304B: Clause 210, page 175, line 26, leave out subsection (2)
Member’s explanatory statement
This amendment omits a provision in the extent clause which is now dealt with by text inserted by the next three amendments in the Minister’s name.
Amendments 304A and 304B agreed.
Amendment 304C had been withdrawn from the Marshalled List.
Amendment 304CA
Moved by
304CA: Clause 210, page 175, line 29, leave out subsection (3) and insert—
“(3) The following provisions extend to England and Wales and Northern Ireland—(a) sections 160 to 164;(b) section 168(1).”
Member’s explanatory statement
This amendment revises the extent clause as a result of changes to the extent of the communications offences in Part 10 of the Bill.
Amendment 304CA agreed.
Amendment 304D had been withdrawn from the Marshalled List.
Amendments 304E to 304K
Moved by
304E: Clause 210, page 175, line 35, leave out subsection (6) and insert—
“(6) The following provisions extend to Northern Ireland only—(a) section 168(3);(b) section 190(7) to (9).”
Member’s explanatory statement
This amendment revises the extent clause so that the amendments of Northern Ireland legislation in clause 168 extend to Northern Ireland only.
304F: Clause 210, page 176, line 2, at end insert—
“(7A) His Majesty may by Order in Council provide for any of the provisions of this Act to extend, with or without modifications, to the Bailiwick of Guernsey or to the Isle of Man.(7B) Subsections (1) and (2) of section 196 apply to an Order in Council under subsection (7A) as they apply to regulations under this Act.”
Member’s explanatory statement
This amendment provides a power for His Majesty by Order in Council to extend any of the provisions of the Bill to Guernsey or the Isle of Man.
304G: Clause 210, page 176, line 4, leave out from second “to” to end of line 5 and insert “the Bailiwick of Guernsey or the Isle of Man any amendment or repeal made by or under this Act of any part of that Act (with or without modifications).”
Member’s explanatory statement
This amendment has the effect that the power conferred by section 411(6) of the Communications Act 2003 may be exercised so as to extend to Guernsey or the Isle of Man the amendment or repeal of provisions of that Act made by the Bill.
304H: Clause 210, page 176, line 7, leave out “any of the Channel Islands” and insert “the Bailiwick of Guernsey”
Member’s explanatory statement
This amendment has the effect that the power conferred by section 338 of the Criminal Justice Act 2003 may be exercised so as to extend to Guernsey (but not Jersey) the amendment of provisions of that Act made by paragraph 7 of Schedule 14 to the Bill.
304J: Clause 210, page 176, line 10, leave out “any of the Channel Islands” and insert “the Bailiwick of Guernsey”
Member’s explanatory statement
This amendment has the effect that the power conferred by section 60(6) of the Modern Slavery Act 2015 may be exercised so as to extend to Guernsey (but not Jersey) the amendment of Schedule 4 to that Act made by paragraph 9 of Schedule 14 to the Bill.
304K: Clause 210, page 176, line 13, leave out “any of the Channel Islands” and insert “the Bailiwick of Guernsey”
Member’s explanatory statement
This amendment has the effect that the power conferred by section 415(1) of the Sentencing Act 2020 may be exercised so as to extend to Guernsey (but not Jersey) the amendment of Schedule 18 to that Act made by paragraph 10 of Schedule 14 to the Bill.
Amendments 304E to 304K agreed.
Clause 210, as amended, agreed.
Clause 211: Commencement and transitional provision
Amendments 305 and 306 not moved.
Clause 211 agreed.
Clause 212 agreed.
House resumed.
Bill reported with amendments.

Online Safety Bill

Report (1st Day)
12:05
Relevant documents: 28th and 38th Reports from the Delegated Powers Committee and 15th Report from the Constitution Committee. Scottish and Welsh legislative consent granted.
Amendment 1
Moved by
1: Before Clause 1, insert the following new Clause—
“Introduction
(1) This Act provides for a new regulatory framework which has the general purpose of making the use of internet services regulated by this Act safer for individuals in the United Kingdom.(2) To achieve that purpose, this Act (among other things)—(a) imposes duties which, in broad terms, require providers of services regulated by this Act to identify, mitigate and manage the risks of harm (including risks which particularly affect individuals with a certain characteristic) from—(i) illegal content and activity, and(ii) content and activity that is harmful to children, and(b) confers new functions and powers on the regulator, OFCOM.(3) Duties imposed on providers by this Act seek to secure (among other things) that services regulated by this Act are—(a) safe by design, and(b) designed and operated in such a way that—(i) a higher standard of protection is provided for children than for adults,(ii) users’ rights to freedom of expression and privacy are protected, and(iii) transparency and accountability are provided in relation to those services.”
Member’s explanatory statement
This amendment provides for a new introductory Clause.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I am pleased that we are on Report, and I thank all noble Lords who took part in Committee and those with whom I have had the pleasure of discussing issues arising since then, particularly for their constructive and collaborative approach, which we have seen throughout the passage of the Bill.

In Committee, I heard the strength of feeling and the desire for an introductory clause. It was felt that this would help make the Bill less complex to navigate and make it less easy for providers to use this complexity to try to evade their duties under it. I have listened closely to these concerns and thank the noble Lord, Lord Stevenson of Balmacara, the noble Baroness, Lady Merron, and others for their work on this proposal. I am particularly grateful for their collaborative approach to ensuring the new clause has the desired effect without causing legal uncertainty. In that spirit, I am pleased to introduce government Amendment 1. I am grateful too to the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, who have signed their names to it. That is a very good start to our amendments here on Report.

Amendment 1 inserts an introductory clause at the start of the Bill, providing an overarching statement about the main objectives of the new regulatory framework. The proposed new clause describes the main broad objectives of the duties that the Bill imposes on providers of regulated services and that the Bill confers new functions and powers on Ofcom.

The clause makes clear that regulated services must identify, mitigate and manage risks that particularly affect people with a certain characteristic. This recognises that people with certain characteristics, or more than one such characteristic, are disproportionately affected by online harms and that providers must account for and protect them from this. The noble Baroness, Lady Merron, raised the example of Jewish women, as did the noble Baroness, Lady Anderson of Stoke-on-Trent. Sadly, they have first-hand experience of the extra levels of abuse and harm that some groups of people can face when they have more than one protected characteristic. It could just as easily be disabled women or queer people of colour. The noble Baroness, Lady Merron, has tabled several amendments highlighting this problem, which I will address further in response to the contribution I know she will make to this debate.

Subsection 3 of the proposed new clause outlines the main outcomes that the duties in the Bill seek to secure. It is a fundamental principle of the legislation that the design of services can contribute to the risk of users experiencing harm online. I thank the noble Lord, Lord Russell of Liverpool, for continuing to raise this issue. I am pleased to confirm that this amendment will state clearly that a main outcome of the legislation is that services must be safe by design. For example, providers must choose and design their functionalities so as to limit the risk of harm to users. I know this is an issue to which we will return later on Report, but I hope this provides reassurance about the Government’s intent and the effect of the Bill’s framework.

Services must also be designed and operated in a way which ensures that a higher standard of protection is provided for children than for adults, that users’ rights to freedom of expression and privacy are protected and that transparency and accountability are enhanced. It should be noted that we have worked to ensure that this clause provides clarity to those affected by the Bill without adversely affecting the interpretation or effect of the substantive provisions of the rest of the Bill. As we debated in Committee, this is of the utmost importance, to ensure that this clause does not create legal uncertainty or risk with the interpretation of the rest of the Bill’s provisions.

I hope that your Lordships will welcome this amendment and I beg to move.

Amendment 2 (to Amendment 1)

Moved by
2: Before Clause 1, in subsection (2)(a), after “characteristic” insert “, or a combination of characteristics”
Member’s explanatory statement
This amendment to the Minister’s introductory Clause makes it clear that some internet users experience a higher level of harm than others, as a result of having multiple characteristics.
Baroness Merron (Lab)

My Lords, I would like to start on a positive note by thanking the Minister for responding to the clear signals that were expressed across the House that a new introductory clause, which is before us in government Amendment 1, would enhance the Bill and set it on its way to being in the best shape that can be achieved by noble Lords working together. I am glad to acknowledge the contribution of my noble friend Lord Stevenson of Balmacara, who has worked to get this in the right place—as the Minister acknowledged. He has been supported in his endeavours by the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones. It is a great step forward, which I hope shows how we all mean to go on.

This new clause gives a real lift to what was essentially a straightforward summary of various parts of the Bill. I sense that noble Lords shared my disappointment that what was in place originally did not harness what the Bill seeks to do. To have left it unamended would have been a missed opportunity and it is in the spirit, if not the exact recommendation, of the Joint Committee that the government amendment has come forward. So I am glad to welcome this new introductory clause that sets out the purpose, duties and powers—among other things—that will be invested in the Act. This new clause sets out what it will really mean to people and organisations and I hope that this can be a template for other Bills that come before the House.

Following through on this theme of clarity, I am glad to speak to the amendments in my name—Amendment 2, which has also been signed by the noble Lord, Lord Clement-Jones, and Amendments 54 and 173. They all have the same intent of responding to the indisputable evidence that having more than one protected characteristic greatly increases the level of harm experienced online. Amendment 2 seeks to amend the new and very welcome introductory clause further, by making that clear up front.

I am grateful to the Minister for his willingness to engage on this subject. I know that he accepts the premise of the point that I have been pressing. As he mentioned, and to give just one example, Jewish women find themselves at the intersection of both anti-Semitic and misogynistic abuse. It is as though online abusers multiply the vitriol by at least the number of protected characteristics, such that it feels that the abuse knows no bounds, manifesting in far too many examples of Jewish women in the public eye on the receiving end of death, rape and other serious threats.

In our discussions, the Minister referred me to Section 6 of the Interpretation Act 1978, which says that when interpreting statute,

“words in the singular include the plural and words in the plural include the singular”.

This was as much an education for the Minister as it was for me and, judging by the response, for other noble Lords. However, the key point is that this is not just about semantics. Those looking to the Online Safety Bill for protection will not be cross-referencing to a section of a 1978 Act.

I hope that the Minister will be forthcoming with agreement to make the necessary changes in order that we can get to the place which we all want to get to. I beg to move.

12:15
Baroness Kidron (CB)

My Lords, I will speak to Amendment 1, to which I was happy to add my name alongside that of the Minister. I too thank the noble Lord, Lord Stevenson, for tabling the original amendment, and my noble and learned friend Lord Neuberger for providing his very helpful opinion on the matter.

I am especially pleased to see that ensuring that services are safe by design and offer a higher standard of protection for children is foundational to the Bill. I want to say a little word about the specificity, as I support the noble Baroness, Lady Merron, in trying to get to the core issue here. Those of your Lordships who travel to Westminster by Tube may have seen TikTok posters saying that

“we’re committed to the safety of teens on TikTok. That’s why we provide an age appropriate experience for teens under 16. Accounts are set to private by default, and their videos don’t appear in public feeds or search results. Direct messaging is also disabled”.

It might appear to the casual reader that TikTok has suddenly decided unilaterally to be more responsible, but each of those things is a direct response to the age-appropriate design code passed in this House in 2018. So regulation does work and, on this first day on Report, I want to say that I am very grateful to the Government for the amendments that they have tabled, and “Please do continue to listen to these very detailed matters”.

With that, I welcome the amendment. Can the Minister confirm that having safety by design in this clause means that all subsequent provisions must be interpreted through that lens and will inform all the decisions on Report and those of Ofcom, and the Secretary of State’s approach to setting and enforcing standards?

Baroness Harding of Winscombe (Con)

My Lords, I too thank my noble friend the Minister for tabling Amendment 1, to which I add my support.

Very briefly, I want to highlight one word in it, to add to what the noble Baroness, Lady Kidron, has just said. The word is “activity”. It is extremely important that in Clause 1 we are setting out that the purpose is to

“require providers of services regulated by this Act to identify, mitigate and manage”

not just illegal or harmful content but “activity”.

I very much hope that, as we go through the few days on Report, we will come back to this and make sure that in the detailed amendments that have been tabled we genuinely live up to the objective set out in this new clause.

The Lord Bishop of Manchester

My Lords, I too support the Minister’s Amendment 1. I remember vividly, at the end of Second Reading, the commitments that we heard from both Front-Benchers to work together on this Bill to produce something that was collaborative, not contested. I and my friends on these Benches have been very touched by how that has worked out in practice and grateful for the way in which we have collaborated across the whole House. My plea is that we can use this way of working on other Bills in the future. This has been exemplary and I am very grateful that we have reached this point.

Lord Moylan (Con)

My Lords, I am grateful to my noble friend the Minister for the meeting that he arranged with me and the noble Baroness, Lady Fox of Buckley, on Monday of this week.

Although we are on Report, I will start with just one preliminary remark of a general character. The more closely one looks at this Bill, the clearer it is that it is the instrument of greatest censorship that we have introduced since the liberalisation of the 1960s. This is the measure with the greatest capacity for reintroducing censorship. It is also the greatest assault on privacy. These principles will inform a number of amendments that will be brought forward on Report.

Turning now to the new clause—I have no particular objection to there being an introductory clause—it is notable that it has been agreed by the Front Benches and by the noble Baroness, Lady Kidron, but that it has not been discussed with those noble Lords who have spoken consistently and attended regularly in Committee to speak up in the interests of free speech and privacy. I simply note that as a fact. There has been no discussion about it with those who have made those arguments.

Now, it is true that the new clause does refer to both free speech and privacy, but it sounds to me very much as though these are written almost as add-ons and afterthoughts. We will be testing, as Report stage continues, through a number of amendments, whether that is in fact the case or whether that commitment to free speech and privacy is actually being articulated and vindicated in the Bill.

Lord Clement-Jones (LD)

My Lords, needless to say, I disagree with what the noble Lord, Lord Moylan, has just been saying precisely because I believe that the new clause that the Minister has put forward, which I have signed and has support across the House, expresses the purpose of the Bill in the way that the original Joint Committee wanted. I pay tribute to the Minister, who I know has worked extremely hard, in co-operation with the noble Lord, Lord Stevenson of Balmacara, to whom I also pay tribute for getting to grips with a purpose clause. The noble Baronesses, Lady Kidron and Lady Harding, have put their finger on it: this is more about activity and design than it is about content, and that is the reason I fundamentally disagree with the noble Lord, Lord Moylan. I do not believe that will be the impact of the Bill; I believe that this is about systemic issues to do with social media, which we are tackling.

I say this slightly tongue-in-cheek, but if the Minister had followed the collective wisdom of the Joint Committee originally, perhaps we would not have had to work at such breakneck speed to get everything done for Report stage. I believe that the Bill team and the Minister have worked extremely hard in a very few days to get to where we are on many amendments that we will be talking about in the coming days.

I also want to show my support for the noble Baroness, Lady Merron. I do not believe it is just a matter of the Interpretation Act; I believe this is a fundamental issue and I thank her for raising it, because it was not something that was immediately obvious. The fact is that a combination of characteristics is a particular risk in itself; it is not just about having several different characteristics. I hope the Minister reflects on this and can give a positive response. That will set us off on a very good course for the first day of Report.

Lord Parkinson of Whitley Bay (Con)

My Lords, this has indeed set us on a good course, and I am grateful to noble Lords for their questions and contributions. I apologise to my noble friend Lord Moylan, with whom I had the opportunity to discuss a number of issues relating to freedom of expression on Monday. We had tabled this amendment, and I apologise if I had not flagged it and sought his views on it explicitly, though I was grateful to him and the noble Baroness, Lady Fox of Buckley, for their time in discussing the issues of freedom of expression more broadly.

I am grateful to my noble friend Lady Harding and to the noble Baroness, Lady Kidron, for their tireless work over many months on this Bill and for highlighting the importance of “content” and “activity”. Both terms have been in the Bill since its introduction, for instance in Clauses 5(2) and (3), but my noble friend Lady Harding is right to highlight it in the way that she did. The noble Baroness, Lady Kidron, asked about the provisions on safety by design. The statement in the new clause reflects the requirements throughout the Bill to address content and activity and ensure that services are safe by design.

On the amendments tabled by the noble Baroness, Lady Merron, which draw further attention to people who have multiple characteristics and suffer disproportionately because of it, let me start by saying again that the Government recognise that this is, sadly, the experience for many people online, and that people with multiple characteristics are often at increased risk of harm. The Bill already accounts for this, and the current drafting captures people with multiple characteristics because of Section 6 of the Interpretation Act 1978. As she says, this was a new one to me—other noble Lords may be more familiar with this legacy of the Callaghan Government—but it does mean that, when interpreting statute, words in the singular include the plural and words in the plural include the singular.

If we simply amended the references that the noble Baroness highlights in her amendments, we would risk some uncertainty about what those provisions cover. I sympathise with the concern which lies behind her amendments, and I am grateful for her time in discussing this matter in detail. I agree that it would be helpful to make it clearer that the Bill is designed to protect people with multiple characteristics. This clause is being inserted to give clarity, so we should seek to do that throughout.

We have therefore agreed to add a provision in Clause 211—the Bill’s interpretation clause—to make clear that all the various references throughout the Bill to people with a certain characteristic include people with a combination of characteristics. This amendment was tabled yesterday and will be moved at a later day on Report, so your Lordships’ House will have an opportunity to look at and vote on that. I hope that that provision clarifies the intention of the wording used in the Bill and puts the issue beyond doubt. I hope that the noble Baroness will be satisfied, and I am grateful to all noble Lords for their support on this first amendment.

Baroness Merron (Lab)

My Lords, I am grateful to the Minister for his response. It is a very practical response and certainly one that I accept as a way forward. I am sure that the whole House is glad to hear of his acknowledgement of the true impact that having more than one protected characteristic can have, and of his commitment to wanting the Bill to do the job it is there to do. With that, I am pleased to withdraw the amendment in my name.

Amendment 2 (to Amendment 1) withdrawn.
Amendment 1 agreed.
Clause 162: False communications offence
Amendment 2A
Moved by
2A: Clause 162, page 141, line 32, after “psychological” insert “, financial”
Member’s explanatory statement
This amendment, along with the other amendment to Clause 162 in the name of Baroness Buscombe, would widen the scope of the offence to include financial harm and harm to the subject of the false message arising from its communication to third parties.
Lord Garnier (Con)

My Lords, I shall speak briefly to Amendments 2A, 2B and 5A, which are in my name but perhaps more importantly in the names of my noble friends Lady Buscombe and Lord Leicester. I want to make it quite clear that this is not a contentious debate, in the sense that I had a very useful meeting with my noble friend the Minister on Monday 3 July, in which we set out to each other our respective concerns about the content of the Bill and how it does not protect the people that my noble friends and I seek to protect. My noble friend the Minister explained the practical difficulties faced in trying to introduce these provisions into this Bill. I think we probably agreed to differ. I hope I do not misinterpret what he told me the other day, but, essentially, I think the Government’s view is that an amendment along the lines that we propose might sit more suitably within the digital markets Bill. I am not entirely sure about that, but I am not going to have a fight about it this afternoon.

I will make some short points. Having listened to the debate on the Government’s Amendment 1, I suggest that our proposal that “financial” should be included in the types of damage referred to in Clause 162(1)(c)—that a person commits an offence if

“at the time of sending it, the person intended the message, or the information in it, to cause non-trivial psychological”,

we would then add in “financial”,

“or physical harm to a likely audience”—

fits in very well with Amendment 1 and the point raised by my noble friend Lady Harding on proposed new subsection (2), which says:

“To achieve that purpose, this Act (among other things) … imposes duties which, in broad terms, require providers of services … to … mitigate and manage the risks of harm … from … illegal content and activity”.

12:30
At our meeting the other day and in Committee, we talked about making it a criminal act to post damaging material about, for example, a business, such as one that sells meat, which becomes the victim of a vegan pile-on. It could be a local bed and breakfast, an Airbnb, a restaurant or pub that is owned by people who, for example, enjoy sports that others disapprove of—to take an easy and obvious example, hunting. In so far as hunting is still a permitted and legal activity, there are none the less people who vehemently disapprove of it, and they take their disapproval to the extent that they pile on abusive, damaging and false accusations on the internet, to the financial damage, quite apart from the mental upset, of those who run those businesses. We are talking not about large corporations but about individuals who own pubs, restaurants, small shops or whatever it might be, whose livelihoods and psychological well-being could well be affected by this.
We think it would be a good idea not only if this sort of activity were criminalised but if the providers and operators of the websites and so on took on the moral responsibility for what those who use their sites are doing. That duty, which I would say goes beyond morality—it is a social, moral and legal duty—translates across to the Government. It is the duty of the Government to protect small business people running perfectly legitimate businesses from this sort of mob activity—in the jargon, pile-ons. I therefore urge the Government, although I accept their concerns about the difficulty of using this particular Bill, to think imaginatively and positively about how they can protect the victims and potential victims of the activity I refer to.
Nothing that we propose would affect the legitimate rights to freedom of expression or privacy set out in Amendment 1. The law as it stands—I refer to the Defamation Act 2013, Article 10 of the European Convention on Human Rights and the common law—allows for restriction of people’s “right” to tell damaging lies. We do not need to get too prissy about protecting the rights of liars who wish to go around damaging other people’s businesses.
That, in essence, is the long and the short of it. I look forward to the Government coming forward in short order with some positive proposals about what they want to do, and how they propose to do it, to protect this group of people who have had their lives and their businesses damaged and who will continue to be at risk until Parliament does something about it. I beg to move.
Lord Allan of Hallam (LD)

My Lords, I will speak to Amendment 5B in my name and that of my noble friend Lord Clement-Jones. I am reminded that this is a new stage of the Bill, so I should declare my interests. I have no current financial interests in the tech sector, but until 2019 I worked for one of the large technology companies that will be regulated, doing the kind of censorship job that the noble Lord, Lord Moylan, is concerned about. We clearly did not do it very well or we would not be here today replacing people like me with Ofcom.

Amendment 5B concerns an issue that we raised in Committee: the offence of encouragement of self-harm. That new offence was broadly welcomed, including on these Benches. We believe that there is scope, in some circumstances, to seek criminal prosecution of individuals who, online or otherwise, maliciously seek to encourage other people to harm themselves. The concern we raised in Committee, which we come back to today, is that we want the offence to be used in a way that we would all agree is sensible. We do not want people who are trying to help individuals at risk of self-harm to become concerned about and afraid of it, and to feel that they need to limit activities that would otherwise be positive and helpful.

In Committee we suggested that one way to do this would be to have a filter where the Director of Public Prosecutions looked at potential prosecutions under the new offence. With this amendment we take a different, and in some senses more effective, approach: listing explicitly in the Bill the three categories of activity that would not render an individual liable to prosecution.

The first is people who provide an educational resource. We should be clear that some educational resources that are intended to help people recognise self-harm and turn away from it can contain quite explicit material. Those people are concerned that they might, in publishing that material with good intent, accidentally fall foul of the offence.

The second category is those who provide support—individuals providing peer support networks, such as an online forum where people discuss their experience of self-harm and seek to turn away from it. They should not be inadvertently caught up in the offence.

The third category is people posting information about their own experience of self-harm. Again, that could be people sharing quite graphic material about what they have been doing to themselves. I hope that there would be general agreement that we would not interpret, for example, a distressed teenager sharing material about their own self-harm, with the intent of seeking advice and support from others, as in some way encouraging or assisting others to commit self-harm themselves.

There is a genuine effort here to try to find a way through so that we can provide assurances to others. If the Minister cannot accept the amendment as it is, I hope he will reaffirm that the categories of people that I described are not the target of the offence and that he will be able to offer some kind of assurance as to how they can feel confident that they would not fall foul of prosecution.

Additionally, some of these groups feel with some conviction that their voices have not been as prominent in the debate as those of other organisations. The work they do is quite sensitive, and they are often quite small organisations. Between Report and the Bill becoming law, I hope that those who will be responsible for doing the detailed work around guidance on prosecutions will meet with those people on the front line—again, specificity is all—and that those who are trying to work out how to make this legislation work will meet with the people doing that work, running those fora and engaging with the young people who seek help around self-harm to look in detail at what they are doing. That would be extraordinarily helpful.

Those are my two asks. Ideally, the Government would accept the amendment that we have tabled, but if not I hope that they can give the assurance that the three groups I listed are not the target and that they will commit to having relevant officials meet with individuals working on the front line, so that we can make sure that we do not end up prosecuting individuals without intending to.

Baroness Burt of Solihull (LD)

My Lords, I support all the amendments in this group. However, what I have to say on my own amendments will take up enough time without straying on to the territory of others. I ask noble colleagues to please accept my support as read. I thank the Minister for meeting me and giving context and explanation regarding all the amendments standing in my name. I also welcome the government amendments on intimate image abuse in another group and on digitally altered images, which impinge directly on the cyberflashing amendments.

It is clear that the Government’s heart is in the right place, even if their acceptance of a consent-based law is not. I also thank the Law Commission for meeting me and explaining the thinking behind and practicalities of how the new law in relation to cyberflashing will work, and how the existing court system can help, such as juries deciding whether or not they believe the defendant. Last but definitely not least, I acknowledge the help that I have received from Professor Clare McGlynn, and Morgane Taylor from Bumble—both immensely knowledgeable and practical people who have inspired, informed and helped throughout.

I start with Amendments 5C and 7A in my name and that of the noble Baroness, Lady Finlay. I understand that the Government are following the advice of the Law Commission in refusing to accept a consent-based offence, but I point out gently that this is something that the Government choose, and sometimes choose not, to do. Although the Law Commission consulted widely, that consultation did not show support for its proposals from victims and victims’ organisations. I am still of the view that a consent-based requirement would have prevented many unsolicited images being received by women and girls. I still worry that young girls may be socialised and sexualised by their peers who say that they are sending these images for a laugh. These girls do not have the maturity to say that they do not find it funny, but pretend it is okay while cringing with humiliation inside. Consent-based legislation would afford them the best protection and educate young girls and men that not only are women and girls frequently not interested in seeing a picture of a man’s willy, but girls think differently from boys about this. Who knew?

I also believe that a consent-based law would provide the most suitable foundation for education and prevention initiatives. However, I have listened to the Minister and the Law Commission. I have been told that, if it got to court, the complainant would not be humiliated all over again by having to give evidence in court and admit the distress and humiliation they felt. But, according to the Minister, as with the new intimate image amendment tabled by the Government themselves, it is up to the Crown Prosecution Service to follow it up; my understanding is that, after making their statement of complaint, the complainant does not have to take part further—more of that later. However, given that in intimate image abuse cases the rate of even charging alleged perpetrators is currently only 4%, I worry that not only will victims continue to be reluctant to come forward but the chances of prosecution will be so slim that it will not act as a deterrent. We know from experience of the offence of sharing sexual images without consent that motivation thresholds have limited police investigations and prosecutions because of the evidential challenges. That is why the Law Commission has recommended the introduction of a consent-based intimate image offence.

12:45
The only thing that would give me hope is a major education or re-education campaign in schools and society. Will the Minister confirm that such an education and publicity campaign will happen? Will there be a budget allocated to carry it out? I should like it on the record that the Government will produce the campaign within six months of the passing of the Act. Similarly, the Minister has assured me that he will monitor carefully the success of the implemented Act. Please will he make those assurances in his reply to this group and give some kind of timescale?
I come to the recklessness amendment, Amendment 6, which is new. It was originally drafted by Professor McGlynn and Maria Miller at the Commons stage to give a kind of compromise on a recklessness standard, but it has not yet been considered by either House. The specific wording follows recent laws on upskirting and down-blousing in Northern Ireland in the Justice (Sexual Offences and Trafficking Victims) Act (Northern Ireland) 2022. The wording has been tested, reviewed and approved by the justice ministry and the Northern Ireland Assembly. In the Bill, there are two ways to prove the cyberflashing offence: first, if it is proved that the defendant intended to cause distress; or, secondly, if the defendant was motivated by sexual gratification and was reckless as to causing distress. My amendment adds a third option. The offence will be made out if the defendant was
“reckless as to whether B will be caused alarm, distress or humiliation”
and the victim was so harmed. This third option would cover a wider range of cases, meaning that there would be more opportunities for prosecuting this harmful practice and therefore greater protection for women and girls.
Recklessness means showing that a defendant was aware of a risk of causing harm but went on to take that risk anyway. There are two arms to the recklessness amendment. First, a defendant is reckless as to causing distress, alarm or humiliation and, secondly, the victim is alarmed, humiliated, et cetera. The first arm, the recklessness, is easier to prove than direct intention. The perpetrator can intend to have a laugh with his friends or send an image for a dare but is reckless as to causing distress. That means that he recognised there was a risk of causing distress but carried on anyway. In many situations, recklessness may be relatively easy to prove, as you would say that, of course, most people would know that sending images of this kind would be likely to cause distress and so on, unless you knew that the recipient would be receptive to it because you checked before sending. I am not going to talk any more about consent-based matters. I am done there. I have made my point. What many men do not get, though, is that girls—particularly girls—and women do not want to receive these images. This is why I have been arguing for a you-know-what consent-based law. The second arm states that the victim is alarmed, humiliated, et cetera. This means that the victim would need to make a statement to that effect. It is included here and in the Northern Ireland draft to raise that threshold just a little. It should not be too high a threshold to meet.
The recklessness amendment is a good compromise, especially when faith in the criminal justice system is at an all-time low among women. Otherwise, women will report cyberflashing and find that they fail at the first hurdle because of the need to prove that the person who sent the image intended to cause direct harm.
In my meeting with the Minister, he gave an example of why a consent-based offence would not work. He used an image of a complainant having to give evidence in a court. That went a long way to swinging it for me but I have taken advice on this and, as I now understand it, the complainant would need only to give a statement, which would of course be crucial to any prosecution. This means that the prosecution would not go ahead unless the complainant supported it, which is fair enough; I had visions of a victim trembling in the dock when facing her abuser. I would be most grateful if the Minister could clarify this because, as I understand it, the court issue made the difference between having a consent-based offence and the Government’s proposal. It has been known on rare occasion for me to get muddled up but I would appreciate clarification on which version is correct.
I appreciate that the Minister and the Law Commission have not had time to consider this recklessness amendment fully so I certainly do not intend to press it to a vote today. The best outcome would be the Government and the Law Commission looking at this and the Government bringing forward their own amendment before Third Reading. I am ever hopeful and thank noble Lords for their patience.
Baroness Kennedy of The Shaws (Lab)

My Lords, Amendments 3 to 5 to Clause 164 are in my name. They relate to a matter that I raised in Committee: threats of a more indirect nature. As I explained at that time, I chaired an inquiry in Scotland into misogyny and the manifestations of deeply unpleasant behaviour that women experience, some of it in the public arena and some of it online.

Based on that experience, I came to realise that many women who are parliamentarians, are in local authorities, head up NGOs or are journalists and who, for some reason, annoy or irritate certain users of social media receive horrible threats. We know about those from the ugly nature of the threats that Diane Abbott and many women parliamentarians have received. Sometimes, the person making the threat does not directly say, “I’m going to rape you”—although sometimes they do: Joanna Cherry, a Scottish Member of Parliament here in Westminster, received a direct threat of rape and the person who threatened was convicted under the Communications Act. Very often, threats of rape, death or disfigurement sound like, “You think you’re so pretty. We can fix that. Somebody should fix that”. It is the indirect nature of the threat that provides comfort to the person making it. They imagine that they cannot be prosecuted because they are not saying that they will do it; they are saying, “Somebody should rape you. Somebody should just eliminate you. Somebody should take that smile off your face; a bit of acid could do it”. That is how many of the threats presented by witnesses to the inquiry—we saw them on their phones and computers—were made; they were of an indirect nature.

One woman in my own chambers is acting for Jimmy Lai, the Hong Kong publisher who is currently in custody awaiting trial under the national security law. She has received death threats, threats against her children and threats of rape. I do not imagine that we can inhibit what is done by people under the auspices of the Chinese Government with this legislation; all I can say is that these sorts of threats are experienced by many women and are not always of a direct nature so the law often does not encapsulate them. I am seeking to introduce some way in which we could, through careful drafting, cover the possibilities.

Take someone such as Andrew Tate: he is a good example of someone with a massive following who clearly puts out to boys and young men horrible ideas about how women should be treated, much of which involves detriment to women. As has been described by others in this House, a pile-on happens in relation to this. Women do not just receive a message saying, “Somebody should rape you”; they receive thousands of messages from the followers of the contributor and communicator.

I have had the benefit of meeting the ministerial team. I am grateful to the Minister and his team, including the lawyers who advise him. We sought a way of dealing with this issue. I particularly wanted to include specific mention of “rape, disfigurement or other” in terms of threats because, in the language of statutes, they are sometimes missed by young junior prosecutors or young policemen. When they see messaging and women come forward with complaints, they do not automatically think that the threat is covered because of the rather oblique nature of statutory language. I wanted it really spelled out, with rape and disfigurement specifically included in my amendment. However, I am persuaded that this issue was in the minds of those who drafted this Bill.

I am pleased that it has been recognised that this specific issue is of a different nature when it is applied to women and girls. It is happening in schools and universities. Young women put their heads above the parapet—they express a view about feminism or describe the fact that they are a lesbian—then, suddenly, they receive a whole range of horrible insults, abuse and threats on social media. I am mindful of the contribution made by the noble Baroness, Lady Fox, in Committee. She was concerned, in essence, about people being rather wet about this and how this measure would inhibit free speech; really, it was about protecting rather gentle feelings. However, that is not what this is about. It is about threats of serious behaviour and serious conduct towards women. The indirect nature of it is not something that should put us off attempting to have law to deal with it.

As I said, I have had an opportunity to meet the ministerial team. We came to the conclusion that we might be able to insert something covering the fact that the carrying out of the threat could be done by persons other than the person who is sending the message. That is the important thing: women receiving these messages saying, “Somebody should rape you”, know that the message is carefully drafted in that way by the Andrew Tates of this world because they imagine that the police cannot then do anything about it, but they also know that these people have followers who may well decide to carry out the suggestion. It is really important that we find a way to deal with this.

As a result of our discussions, I hope that the House will see that this issue is something that we must deal with in this Bill because the opportunity will not come again. This is happening day in, day out to girls and women. If we are going to send a message about what is unacceptable, it is important that the law declares what is unacceptable. These threats are serious, as is the way in which women then have to change their lives. They stop staying out late. They worry about being in places where they might be subjected to some of these threats. They start limiting their behaviour.

Just earlier this morning, someone told me that his niece was a member of a football team’s fan club and had been elected to the board. She suddenly received a whole range of threats from men who felt that no woman should be in that position. She received a pile-on of a horrible kind, and said to her uncle that she wanted to step down and did not want to be on the board if she was going to receive that kind of messaging.

13:00
Women start changing the opportunities in their lives and stop doing things that they might want to do: they stop deciding to be Members of Parliament or to stand for election in any capacity, or, if they are lawyers, to take cases that will be inflammatory. They start inhibiting and limiting their own potential because of this kind of threat coming from men who resent the idea that women should be aspiring to hold positions and be equal to men. Some of it is of a very unpleasant and nasty nature, and law has its place in sending out clear messages of what is acceptable and unacceptable. It is then up to us—in schools, other educational settings and everywhere else—to spread the word among our young men and young women about what is acceptable and what they must not accept, and about the right way to behave decently towards other human beings.
Baroness Fox of Buckley (Non-Afl)

My Lords, first, I welcome the amendment from the noble Lord, Lord Allan, and his motivation, because I am concerned that, throughout the Bill, the wrong targets are being caught up. I was grateful to hear his recognition that people who talk about their problems with self-harm could end up being targeted, which nobody would ever intend. These things need to be taken seriously.

In that sense, I was slightly concerned about the motivation of the noble Baroness, Lady Burt of Solihull, in the “reckless” amendment. The argument was that the recklessness standard is easier to prove. I am always worried about things that make it easier to prosecute someone, rather than there being a just reason for that prosecution. As we know, those involved in sending these images are often immature and very foolish young men. I am concerned about lowering the threshold at which we criminalise them—potentially destroying their lives, by the way, because if you have a criminal record it is not good—even though I in no way tolerate what they are doing and it is obviously important that we take that on.

There is a danger that this law will become a mechanism through which people try to resolve a whole range of social problems—which brings me on to responding to the speech just made by the noble Baroness, Lady Kennedy of The Shaws. I continue to be concerned about the question of trying to criminalise indirect threats. The point about somebody who sends a direct threat is that we can at least see the connection between that direct threat and the possibility of action. It is the same sort of thing that we have historically considered in relation to incitement. I understand that, where your physical being is threatened by words, physically a practical thing can happen, and that is to be taken very seriously. The problem I have is with the indirect threat from somebody who says, for example, “That smile should be taken off your face. It can be arranged”, or other indirect but incredibly unpleasant comments. There is clearly no link between that and a specific action. It might use violent language but it is indirect: “It could be arranged”, or “I wish it would happen”.

Anyone on social media—I am sure your Lordships all are—will know that I follow very carefully what people from different political parties say about each other. I do not know if you have ever followed the kind of things that are said about the Government and their Ministers, but the threats are not indirect and are often named. In that instance, it is nothing to do with women, but it is pretty violent and vile. By the way, I have also followed what is said about the Opposition Benches, and that can be pretty violent and vile, including language that implies that they wish those people were the subject of quite intense violence—without going into detail. That happens, and I do not approve of it—obviously. I also do not think that pile-ons are pleasant to be on the receiving end of, and I understand how they happen. However, if we criminalise pile-ons on social media, we are openly imposing censorship.

What is worse in my mind is that we are allowing the conflation of words and actions, where what people say or think is the same as acting on it, as the criminal law would see it. We have seen a very dangerous trend recently, which is particularly popular in the endless arguments and disputes over identity politics, where people will say that speech is violence. This has happened to a number of gender-critical feminists, in this instance women, who have gone in good faith to speak at universities, having been invited. They have been told that their speech was indistinguishable from violence and that it made students at the university feel under threat and unsafe and that it was the equivalent of being attacked. But guess what? Once you remove that distinction, the response to that speech can be to use violence, because you cannot tell the difference between them. That has happened around a number of university actions, where speakers and their supporters were physically assaulted by people who said that they were using self-defence against speech that was violent. I get nervous that this is a slippery slope, and we certainly should not go anywhere near it in legislation.

Finally, I agree that we should tackle the culture of people piling on and using this kind of language, but it is a cultural and social question. What we require is moral leadership and courage in the face of it—calling it out, arguing against it and so on. It is wrong to use the law to send messages; it is an abdication of moral leadership and a cop-out, let alone dangerous in what is criminalised. I urge your Lordships to reject those amendments.

Baroness Morgan of Cotes (Con)

My Lords, I will speak briefly to Amendments 5C and 7A in this group. I welcome the Government’s moves to criminalise cyberflashing. It is something that many have campaigned for in both Houses and outside for many years. I will not repeat the issues so nobly introduced by the noble Baroness, Lady Burt, and I say yet again that I suspect that the noble Baroness, Lady Featherstone, is watching, frustrated that she is still not able to take part in these proceedings.

It is worth making the point that, if actions are deemed to be serious enough to require criminalisation and for people potentially to be prosecuted for them, I very much hope that my noble friend the Minister will be able to say in his remarks that this whole area of the law will be kept under review. There is no doubt that women and girls’ faith in the criminal justice system, both law enforcement and the Crown Prosecution Service, is already very low. If we trumpet the fact that this offence has been introduced, and then there are no prosecutions because the hurdles have not been reached, that is even worse than not introducing the offence in the first place. So I hope very much that this will be kept under review, and no doubt there will be opportunities to return to it in the future.

I do not want to get into the broader debate that we have just heard, because we could be here for a very long time, but I would just say to the noble Baronesses, Lady Kennedy and Lady Fox, that we will debate this in future days on Report and there will be specific protection and mention of women and girls on the face of the Bill—assuming, of course, that Amendment 152 is approved by this House. The guidance might not use the words that have been talked about, but the point is that that is the place to have the debate—led by the regulator with appropriate public consultation—about the gendered nature of abuse that the noble Baroness, Lady Kennedy, has so eloquently set out. I hope that will also be a big step forward in these matters.

I look forward to hearing from the Minister about how this area of law will be kept under review.

Baroness Kidron (CB)

My Lords, I understand that, as this is a new stage of the Bill, I have to declare my interests: I am the chair of 5Rights Foundation, a charity that works around technology and children; I am a fellow at the computer science department at Oxford University; I run the Digital Futures Commission, in conjunction with the 5Rights Foundation and the London School of Economics; I am a commissioner on the Broadband Commission; I am an adviser for the AI ethics institute; and I am involved in Born in Bradford and the Lancet commission, and I work with a broad number of civil society organisations.

Noble Lords

Hear, hear!

Baroness Kidron (CB)

My comments will be rather shorter. I want to make a detailed comment about Amendment 5B, which I strongly support and which is in the name of the noble Lord, Lord Allan. It refers to,

“a genuine medical, scientific or educational purpose, … the purposes of peer support”.

I would urge him to put “genuine peer support”. That is very important because there is a lot of dog whistling that goes on in this area. So if the noble Lord—

Lord Allan of Hallam (LD)

My working assumption would be that that would be contestable. If somebody claimed the peer support defence and it was not genuine, that would lead to them becoming liable. So I entirely agree with the noble Baroness. It is a very helpful suggestion.

Baroness Kidron (CB)

I also want to support the noble Baroness, Lady Kennedy. The level of abuse directed at women online and the gendered nature of it has been minimised; the perpetrators have clearly felt immune to the consequences of law enforcement. What worries me a little in this discussion is the idea or conflation that anything said to a woman is an act of violence. I believe that the noble Baroness was being very specific about the sorts of language that could be caught under her suggestions. I understand from what she said that she has been having conversations with the Minister. I very much hope that something is done in this area, and that it is explored more fully, as the noble Baroness, Lady Morgan, said, in the guidance. However, I just want to make the point that online abuse is also gamified: people make arrangements to abuse people in groups in particular ways that are not direct. If they threaten violence, that is quite different to a pile-on saying that you are a marvellous human being.

Lord Clement-Jones (LD)

My Lords, I too must declare my interests on the register—I think that is the quickest way of doing it to save time. We still have time, and I very much hope that the Minister will listen to this debate and consider it. Although we are considering clauses that, by and large, come at the end of the Bill, there is still time procedurally—if the Minister so decides—to come forward with an amendment later on Report or at Third Reading.

We have heard some very convincing arguments today. My noble friend explained that the Minister did not like the DPP solution. I have looked back again at the Law Commission report, and I cannot for the life of me see the distinction between what was proposed for the offence in its report and what is proposed by the Government. There is a cigarette paper, if we are still allowed to use that analogy, between them, but the DPP is recommended—perhaps not on a personal basis, although I do not know quite what distinction is made there by the Law Commission, but certainly the Minister clearly did not like that. My noble friend has come back with some specifics, and I very much hope that the Minister will put on the record that, in those circumstances, there would not be a prosecution. As we heard in Committee, 130 different organisations had strong concerns, and I hope that the Minister will respond to those concerns.

As regards my other noble friend’s amendment, again creatively she has come back with a proposal for including reckless behaviour. The big problem here is that many people believe that, unless you include “reckless” or “consent”, the “for a laugh” defence operates. As the Minister knows, quite expert advice has been had on this subject. I hope the Minister continues his discussions. I very much support my noble friend in this respect. I hope he will respond to her in respect of timing and monitoring—the noble Baroness, Lady Morgan, mentioned the need for the issue to be kept under review—even if at the end of the day he does not respond positively with an amendment.

Everybody believes that we need a change of culture—even the noble Baroness, Lady Fox, clearly recognises that—but the big difference is whether or not we believe that these particular amendments should be made. We very much welcome what the Law Commission proposed and what the Government have put into effect, but the question at the end of the day is whether we truly are making illegal online what is illegal offline. That has always been the Government’s test. We must be mindful of that in trying to equate online behaviour with offline behaviour. I do not believe that we are there yet, however much moral leadership we are exhorted to display. I very much take the point of the noble Baroness, Lady Morgan, about the violence against women and girls amendment that the Government are coming forward with. I hope that will have a cultural change impact as well.

As regards the amendments of the noble Baroness, Lady Kennedy, I very much take the point she made, both at Committee and on Report. She was very specific, as the noble Baroness, Lady Kidron, said, and was very clear about the impact, which as men we severely underestimate if we do not listen to what she said. I was slightly surprised that the noble Baroness, Lady Fox, really underestimates the impact of that kind of abuse—particularly that kind of indirect abuse.

I was interested in what the Minister had to say in Committee:

“In relation to the noble Baroness’s Amendment 268, the intentional encouragement or assistance of a criminal offence is already captured under Sections 44 to 46 of the Serious Crime Act 2007”.—[Official Report, 22/6/23; col. 424.]


Is that still the Government’s position? Has that been explained to the noble Baroness, Lady Kennedy, who I would have thought was pretty expert in the 2007 Act? If she does not agree with the Minister, that is a matter of some concern.

Finally, I agree that we need to consider the points raised at the outset by the noble and learned Lord, Lord Garnier, and I very much hope that the Government will keep that under review.

Lord Stevenson of Balmacara (Lab)

My Lords, this has been an interesting debate that in a curious way moves us from the debate on the first group, which was about the high level of aspiration for this Bill, for the work of those involved in it and indeed for Parliament as a whole, down to some of the nitty-gritty points that emerge from some of the Bill’s proposals. I am very much looking forward to the Minister’s response.

In a sense, where the noble Lord, Lord Clement-Jones, ends, I want to start. The noble and learned Lord, Lord Garnier, did a good job of introducing the points made previously by his colleague, the noble Baroness, Lady Buscombe, in relation to those unfortunate exercises of public comment on businesses, and indeed individuals, that have no reason to receive them. There does not seem to be a satisfactory sanction for that. In a sense he was drawn by the overarching nature of Clause 1, but I think we have established between us that Clause 1 does not have legal effect in the way that he would like, so we would probably need to move further forward. The Government probably need to pick up his points in relation to some of the issues that are raised further down, because they are in fact not dissimilar and could be dealt with.

The key issue is the one that my noble friend Lady Kennedy ended on, in the sense that the law online and the law offline, as mentioned by the noble Lord, Lord Clement-Jones, seem to be at variance about what you can and cannot do in relation to threats issued, whether or not they are general, to a group or groups in society. This is a complex area that needs further thought of the nature that has been suggested, and may well refer back to the points made by the noble Baroness, Lady Morgan. There is something here that we are not tackling correctly. I look forward to the Government’s response. We would support movement in that area should that agreement be made.

Unfortunately, the noble Lord, Lord Russell, whom I am tempted to call my noble friend because he is a friend, has just moved out of his seat—I do not need to give him a namecheck any more—but he and I went to a meeting yesterday, I think, although I have lost track of time. It was called by Luke Pollard MP and related to the incel movement or, as the meeting concluded, what we should call the alleged incel movement, because by giving it a name we somehow give it a position. I wanted to make that point because a lot of what we are talking about here is in the same territory. It was an informal research-focused meeting to hear all the latest research being done on the group of activities going under the name of the alleged incel movement.

I mention that because it plays into a lot of the discussion here. The way in which those who organise it do so—the name Andrew Tate has already been mentioned—was drawn into the debate in a much broader context by that research, particularly because representatives from the Home Office made the interesting point that the process by which the young men who are involved in this type of activity are groomed to join groups and are told that by doing so they are establishing a position that has been denied to them by society in general, and allegedly by women in particular, is very similar to the methods used by those who are cultivating terrorism activity. That may seem to be a big stretch but it was convincing, and the argument and debate around that certainly said to me that there are things operating within the world of social media, with its ability to reach out to those who often feel alone, even if they are not, and who feel ignored, and to reach them in a way that causes them to overreact in the way they deal with the issues they face.

That point was picked up by others, including my noble friend Lady Kennedy and the noble Baroness, Lady Burt, in relation to the way in which the internet itself is in some way gendered against women. I do not in any sense want to apportion blame anywhere for that; it is a much more complex issue than single words can possibly address, but it needs to be addressed. As was said in the meeting and has been said today, there are cultural, educational and holistic aspects here. We should not tackle only the symptoms or the effects of it; we should also look at what causes people to act in the way they have because of, or through the agency of, the internet.

Having said that, I support the amendments from the noble Lord, Lord Allan, and I look forward to the Government’s response to them. Amendment 5B raises the issue that it will be detrimental to society if people stop posting and commenting on things because they fear that they will be prosecuted—or not even prosecuted but attacked. The messages that they want to share will be lost as a result, and that is a danger that we do not want to encourage. It will be interesting to hear the Minister’s response to that.

The noble Baroness, Lady Burt, made powerful points about the way in which the offence of cyberflashing is going to be dealt with, and the differences between that and the intimate image abuse that we are coming on to in the next group. It may well be that this is the right way forward, and indeed we support the Government in the way that they are going, but it is important to recognise her point that we need a test of whether it is working. The Government may well review the impact of the Bill in the normal way of things, but this aspect needs particular attention; we need to know whether there are prosecutions and convictions and whether people understand the implication of the change in practice. We need publicity, as has been said, otherwise it will not be effective in any case. These issues, mentioned by the noble Baroness, Lady Burt, and picked up by the noble Baroness, Lady Morgan, are important. We will have other opportunities to discuss them, but at this stage we should at least get a response to that.

If it is true that in Northern Ireland there is now a different standard for the way in which cyberflashing offences are to be dealt with—taking into account the points made very well by the noble Baroness, Lady Fox, and the worry about creating more offences for which criminal sanctions may not necessarily be appropriate at this stage, particularly on the question of recklessness—do the Government not have a slight problem here? First, do we really accept that we want differences between the various regions and nations of our country on these important issues? We support devolution, but we also need a sense of what the United Kingdom as a whole stands for in its relationship with these types of criminal offence, if they are criminal. If that divergence happens, do we need a better understanding of why one part of the country has moved in a particular way, and is that something we are missing in picking up action that is perhaps necessary in other areas? As my noble friend Lady Kennedy has also said, some of the work she has been doing in Scotland is ahead of the work that we have been doing in this part of the United Kingdom, and we need to pick up the lessons from that as well.

As I said at the beginning, this is an interesting range of amendments. They are not as similar as the grouping might suggest, but they point in a direction that needs government attention, and I very much look forward to the Minister’s comments on them.

Lord Parkinson of Whitley Bay (Con)

I am grateful to my noble friends Lady Buscombe and Lord Leicester and my noble and learned friend Lord Garnier for the amendments that they have tabled, with which we began this helpful debate, as well as for their time earlier this week to discuss them. We had a good debate on this topic in Committee and I had a good discussion with my noble friend Lady Buscombe and my noble and learned friend Lord Garnier on Monday. I will explain why the Government cannot accept the amendments that they have brought forward today.

I understand my noble friends’ concerns about the impact that fake reviews can have on businesses, but the Bill and the criminal offences it contains are not the right place to address this issue. The amendments would broaden the scope of the offences and likely result in overcriminalisation, which I know my noble friends would not want to see.

13:30
I reassure my noble friends that, as we have discussed, the Digital Markets, Competition and Consumers Bill will address these issues by including a power to take stronger action against fake and misleading reviews. Schedule 18 to that Bill sets out a power for the Secretary of State to that effect. While that Bill does not place duties on private individuals acting in a personal capacity, the proposals are likely to require traders hosting reviews to take reasonable and proportionate steps to ensure that they represent a genuine consumer experience.
The Government will also consult on what is reasonable and proportionate for businesses to do to ensure that reviews are genuine and do not unduly harm businesses or the people who own them. My noble friends’ Amendment 5A would represent a significant expansion of the communications offences in the Bill. It would criminalise a wide range of conduct other than sending messages. Criminalising conduct which is merely capable of encouraging someone else to send a message would represent a significant risk to freedom of expression and is beyond the scope of the offence we have drafted. While I remain sympathetic to my noble friends about the malignant behaviour they have highlighted and the impact on the people who own the businesses affected, I continue to agree to disagree with my noble and learned friend Lord Garnier about whether this is a matter for this Bill. I continue to point him and my noble friends in the direction of the Digital Markets, Competition and Consumers Bill, which my noble friend Lord Camrose will take through; he has heard the points my noble friends have raised today and in earlier stages of the Online Safety Bill.
Amendment 3, tabled by the noble Baroness, Lady Kennedy of The Shaws, seeks to amend the definition of the offence in Clause 164(1) to add the threats of “rape” and “disfigurement” to the existing description, which includes “a threat of death”. Her Amendment 5 is consequential. I am very grateful to the noble Baroness for her time yesterday to discuss her amendments. The Government agree that threats of rape and disfigurement are truly abhorrent—she set out some harrowing examples—and should be captured in criminal law, which is why the offence, as drafted, already covers these threats. Rape is included in the definition of “serious harm”. As I discussed with the noble Baroness yesterday, disfigurement would also be captured under the definition of “serious harm”, as it would constitute grievous bodily harm.
I know that the noble Baroness has come across some very distressing examples of threats to disfigure in her work on tackling misogyny, including the review she mentioned that she chaired for the Scottish Government, but if disfigurement were specified separately in this offence, it could introduce ambiguity about the ambit of serious harm. Grievous bodily harm is an established and well-understood legal concept; singling out disfigurement could lead to uncertainty in the law about other harms which amount to grievous bodily harm.
The noble Baroness’s Amendment 4 seeks to clarify that a person who sends a threatening message would meet the threshold of this offence, even if the threat were carried out by “another individual”. The offence, as drafted, does not require the threat to be carried out by a particular person, but following the helpful discussions with the noble Baroness yesterday, I am happy to acknowledge the need for greater clarity here. While her amendment would make it clearer that the offence will capture scenarios where the recipient feared that the threat would be carried out by the sender or a different individual, it could restrict this to specific or identifiable individuals. This would apply only where there was an intention, and not where a sender is reckless as to causing the recipient to fear the threat being carried out. As the noble Baroness knows, while we cannot accept the amendment as she has drafted it, we are happy to commit to bringing forward a government amendment at Third Reading to clarify that the offence is committed whether or not the threat would be carried out by the person who sent the message. I am very grateful to her for pressing this issue.
Amendment 5B, tabled by the noble Lords, Lord Allan of Hallam and Lord Clement-Jones, seeks to ensure that the new serious self-harm offence does not lead to the prosecution of people sharing content to support people at risk of self-harm. I fully understand the concern which has prompted their amendment, and I reassure them that the offence has been developed with the aim of ensuring that it does not criminalise the sorts of people that they mentioned. The Law Commission addressed this issue and was confident that the inclusion of the two key elements it recommended—an intention to encourage or assist another person to harm themselves, and a threshold of harm consistent with grievous bodily harm—will constrain the offence to only the most culpable offending.
We expect these tight parameters and the usual prosecutorial discretion to provide sufficient safeguards against inappropriate prosecutions. The defence of necessity may also serve to ensure that actions undertaken in extraordinary circumstances to mitigate more serious harm should not be criminal. The offence of encouraging or assisting suicide has not led to the prosecution of vulnerable people who talk about suicidal feelings online or those who offer them support, and there is no reason to suppose that this offence will criminalise those whom this amendment seeks to protect. However, the noble Lords raise an important issue and I assure them that we will keep the operation of the offence under review. The Government have committed to expanding it to cover all ways of encouraging or assisting self-harm so there will be an opportunity to revisit it in due course.
Lord Allan of Hallam (LD)

I appreciate the Minister’s response. Could he also respond to my suggestion that it would be helpful for some of the people working on the front line to meet officials to go through their concerns in more detail?

Lord Parkinson of Whitley Bay (Con)

I am very happy to make that commitment. It would be useful to have their continued engagement, as we have had throughout the drafting of the Bill.

The noble Baroness, Lady Burt of Solihull, has tabled a number of amendments related to the new offence of cyberflashing. I will start with her Amendment 6. We believe that this amendment reduces the threshold of the new offence to too great an extent. It could, for example, criminalise a person sending a picture of naked performance art to a group of people, where one person might be alarmed by the image but the sender sends it anyway because he or she believes that it would be well received. That may be incorrect, unwise and insensitive, but we do not think it should carry the risk of being convicted of a serious sexual offence.

Crucially, the noble Baroness’s amendment requires that the harm against the victim be proven in court. Not only does this add an extra step for the prosecution to prove in order for the perpetrator to be convicted, but it also creates an undue burden on the victim, who would be cross-examined about his or her—usually her—experience of harm. For example, she might have to explain why she felt humiliated; this in itself could be retraumatising and humiliating for the victim. By contrast, Clause 170 as drafted means that the prosecution has only to prove and focus on the perpetrator’s intent.

Baroness Burt of Solihull (LD)

I am very grateful for the Minister’s comments. This is the crux of my confusion: I am not entirely sure why it is necessary for the victim to appear in court. In intimate image abuse, is it not the case that the victim does not have to make an appearance in court? What is the difference between intimate image abuse and cyberflashing abuse? I do not get why one attracts a physical court appearance and the other does not. They seem to be different sides of the same coin to me.

Lord Parkinson of Whitley Bay (Con)

If a defendant said that he—usually he—had sent an image believing that the consent of the recipient was implied, the person making the complaint would be cross-examined on whether or not she had indeed given that consent. To make out an offence predicated on proof of non-consent or proof of harm, the victim could be called to give evidence and be cross-examined in court. The defence would be likely to lead evidence challenging the victim’s characteristics and credibility. We do not want that to be a concern for victims; we do not want that to be a barrier to victims coming forward and reporting abuse for fear of having their sexual history or intentions cross-examined.

Lord Clement-Jones (LD)

My Lords, we are coming to this in the next group, but that is a consent-based offence, is it not?

Lord Parkinson of Whitley Bay (Con)

It is—and I shall explain more in that group why we take that approach. But the offence of cyberflashing matches the existing offence of flashing, which is not a consent-based offence. If somebody flashes at someone in public, it does not matter whether the person who sees that flashing has consented to it—it is the intent of the flasher that is the focus of the court. That is why the Law Commission and we have brought the cyberflashing offence forward in the same way, whereas the sharing of intimate images without somebody’s consent relies on the consent to sharing. But I shall say a bit more when we get to that group, if the noble Lord will allow.

Lord Clement-Jones (LD)

I am sure that the noble and learned Lord, Lord Garnier, is going to come in, and he knows a great deal more about this than I do. But we are getting into the territory where we talk about whether or not somebody needs to appear in court in order to show consent. That was all that I was trying to point out, in a way—that, if the Minister accepted the amendment on behalf of my noble friend, and then the complainant had to appear in court, why is that not the case with intimate abuse?

Lord Parkinson of Whitley Bay (Con)

Perhaps I can respond to the point about intimate abuse when we come on to the next group—that might be helpful.

Lord Clement-Jones (LD)

It might be helpful—except for the refusal to accept my noble friend’s amendment.

Lord Parkinson of Whitley Bay (Con)

If the defendant said that they had sent an image because they thought that consent had been obtained, the person whose consent was under question would find themselves cross-examined on it in a way that we do not want to see. We do not want that to be a barrier to people reporting this, in the same way that it is not for people who report flashing on the streets.

Lord Garnier (Con)

My Lords, I do not want to interfere in private grief, but the courts have powers to protect witnesses, particularly in cases where they are vulnerable or will suffer acute distress, by placing screens in the way and controlling the sorts of cross-examinations that go on. I accept the concern expressed by the noble Baroness, Lady Burt, but I think that my noble friend the Minister will be advised that there are protective measures in place already for the courts to look after people of the sort that she is worried about.

Lord Parkinson of Whitley Bay (Con)

There are indeed but, as my noble and learned friend’s interjection makes clear, those are still means for people to be cross-examined and give their account in court, even with those mitigations and protections. That is really the crux of the issue here.

We have already debated the risk that the approach the noble Baroness sets out in her Amendments 5C and 7A would criminalise the sending of messages, and the people sending them, whom we would not deem to be criminal. I want to reassure her and your Lordships’ House that the intent-based offence, as drafted at Clause 170, provides the comprehensive protections for victims that we all want to see, including situations where the perpetrator claims it was “just for a joke”. The offence is committed if a perpetrator intended to cause humiliation, and that captures many supposed “joke” motives, as the perverted form of humour in this instance is often derived from the victim’s humiliation, alarm or distress.

Indeed, it was following consultation with victims’ groups and others that the Law Commission added humiliation as a form of intent to the offence to address those very concerns. Any assertions made by a defendant in this regard would not be taken at face value but would be considered and tested by the police and courts in the usual way, alongside the evidence. The Crown Prosecution Service and others are practised in prosecuting intent, and juries and magistrates may infer intention from the context of the behaviour and its foreseeable consequences.

The addition of defences, as the noble Baroness suggests in her Amendment 7A, is unfortunately still not sufficient to ensure that we are not overcriminalising here. Even with the proposed defences, sending a picture of genitalia without consent for medical reasons would still risk being considered a criminal act and could potentially compel a medical professional to justify that he or she has an adequate defence.

13:45
Baroness Burt of Solihull (LD)

On the various protections already within that original amendment, if it went to court, why would the person who had sent the image get prosecuted if he or she had a good reason for having sent it?

Lord Parkinson of Whitley Bay (Con)

It is about the burden on medical professionals and the question of whether the matter comes to court once the police have investigated it and the prosecution case is made out. We do not want to see that sort of behaviour being overly criminalised, or the risk of prosecution hanging over people where it is not needed. We want to make sure that the offence is focused on the behaviour that we all want to tackle here.

The Law Commission has looked at this extensively—and I am glad the noble Baroness has had the opportunity to speak to it directly—and brought forward these proposals, which mirror the offence of flashing that already exists in criminal law. We think that is the right way of doing it and not risking the overcriminalisation of those whom noble Lords would not want to capture.

Contrary to some concerns that have been expressed, the onus is never on the victim to marshal evidence or prove the intent of the perpetrator. It is for the police and the Crown Prosecution Service when investigating the alleged offence or prosecuting the case in court. That is why we and the Law Commission consulted the police and the CPS extensively in bringing the offence forward.

By contrast, as I say, the consent-based approach is more likely to put onerous pressure on the victim by focusing the case on his or her behaviour and sexual history instead of the behaviour of the perpetrator. I know and can tell from the interjections that noble Lords still have some concerns or questions about this offence as drafted. I reassure them, as my noble friend Lady Morgan of Cotes urged, that we will be actively monitoring and reviewing the implementation of this offence, along with the Crown Prosecution Service and the police, to ensure that it is working effectively and bringing perpetrators to justice.

The noble Baroness, Lady Burt, also raised the importance of public engagement and education in this regard. As she may know, the Government have a long-term campaign to tackle violence against women and girls. The Enough campaign covers a range of online and offline forms of abuse, including cyberflashing. The campaign includes engaging with the public to deepen understanding of this offence. It focuses on educating young people about healthy relationships, on targeting perpetrators and on ensuring that victims of violence against women and girls can access support. Future phases of the Enough campaign will continue to highlight the abusive nature and unacceptability of these behaviours, and methods for people safely to challenge them.

In addition, in our tackling violence against women and girls strategy, the Government have committed to investing £3 million to understand better what works to prevent violence against women and girls, and to invest in high-quality, evidence-informed prevention projects, including in schools, which aim to educate and inform children and young people about violence against women and girls, healthy relationships and the consequences of abuse.

With that commitment to keep this under review—to ensure that it is working in the way that the Law Commission and the Government hope and expect it to—and with that explanation of the way we will be encouraging the public to know about the protections that are there through the law and more broadly, I hope noble Lords will be reassured and will not press their amendments.

Baroness Kennedy of The Shaws (Lab)

Before the Minister sits down, I express my gratitude that he has indicated that my amendment would have some serious impact. I thank the noble Lord, Lord Clement-Jones, for saying that there should be some learning among men in the House and in wider society about what puts real fear in the hearts of women and how it affects how women conduct their lives. I thank those who said that some change is necessary.

We have to remember that this clause covers a threatening communications offence. I know that something is going to be said about the particular vulnerability of women and girls—the noble Baroness, Lady Morgan, mentioned it, and I am grateful for that—but this offence is not specific to one gender. It is a general offence that someone commits if a message they send conveys a threat of death or serious harm.

I reassure the noble Baroness, Lady Fox, that we are not talking about a slight—saying to a woman that she is ugly or something. This is not about insults but about serious threats. The business about it being reckless as to whether or not it is going to be carried out is vital. Clause 164(1)(c)(i) says an offence is committed if it is intended that an individual encountering the message would fear that the threat would be carried out. I would like to see added the words, “whether or not by the person sending the message”.

Just think of this in the Irish context of years gone by. If someone sent a message saying, “You should be kneecapped”, it is very clear that we would be talking about something that would put someone in terror and fear. It is a serious fear, so I am glad that this is supported by the Minister, and I hope we will progress it to the next stage.

Lord Harlech (Con)

My Lords, without wishing to disrupt the very good nature of this debate, I remind the House that the Companion advises against speaking more than once on Report, except for specific questions or points of elucidation.

Lord Parkinson of Whitley Bay (Con)

None the less, I am grateful to the noble Baroness for her clarification and expansion of this point. I am glad that she is satisfied with the approach we have set out.

Baroness Kennedy of The Shaws (Lab)

It is not specific to women; it is general.

Lord Parkinson of Whitley Bay (Con)

The issue the noble Baroness has highlighted will protect all victims against people trying to evade the law, and I am grateful to her. We will bring forward an amendment at Third Reading.

Lord Garnier (Con)

My Lords, I will be incredibly brief because everything that needs to be said has been said at least twice. I am grateful to those who have taken the trouble to listen to what I had to say, and I am grateful to the Minister for his response. I beg leave to withdraw my amendment.

Amendment 2A withdrawn.
Amendment 2B not moved.
Clause 164: Threatening communications offence
Amendments 3 to 5 not moved.
Clause 165: Interpretation of sections 162 to 164
Amendment 5A not moved.
Clause 167: Offence of encouraging or assisting serious self-harm
Amendment 5B not moved.
Clause 170: Sending etc photograph or film of genitals
Amendments 5C and 6 not moved.
Amendment 7
Moved by
7: Clause 170, page 149, line 25, after “made” insert “or altered”
Member’s explanatory statement
This amendment provides that “photograph” and “film” in the new offence of sending a photograph or film of genitals (and, by extension the new offences of sharing an intimate photograph or film) includes an image which has been altered and which appears to be a photograph or film.
Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful for the opportunity to continue some of the themes we touched on in the last group and the debate we have had throughout the passage of the Bill on the importance of tackling intimate image abuse. I shall introduce the government amendments in this group that will make a real difference to victims of this abhorrent behaviour.

Before starting, I take the opportunity again to thank the Law Commission for the work it has done in its review of the criminal law relating to the non-consensual taking, making and sharing of intimate images. I also thank my right honourable friend Dame Maria Miller, who has long campaigned for and championed the victims of online abuse. Her sterling efforts have contributed greatly to the Government’s approach and to the formulation of policy in this sensitive area, as well as to the reform of criminal law.

As we announced last November, we intend to bring forward a more expansive package of measures based on the Law Commission’s recommendations as soon as parliamentary time allows, but the Government agree with the need to take swift action. That is why we are bringing forward these amendments now, to deliver on the recommendations which fall within the scope of the Bill, thereby ensuring justice for victims sooner.

These amendments repeal the offence of disclosing private sexual photographs and films with intent to cause distress and replace it with four new sexual offences in the Sexual Offences Act 2003. The first is a base offence of sharing an intimate photograph or film without consent or reasonable belief in consent. This recognises that the sharing of such images, whatever the intent of the perpetrator, should be considered a criminal violation of the victim’s bodily autonomy.

The amendments create two more serious offences of sharing an intimate photograph or film without consent with intent to cause alarm, distress or humiliation, or for the purpose of obtaining sexual gratification. Offenders committing the latter offence may also be subject to notification requirements, commonly referred to as being on the sex offenders register. The amendments also create an offence of threatening to share an intimate image. These new sharing offences are based on the Law Commission’s recommended approach, under which intimate photographs or films include images which show or appear to show a person nude or partially nude, or which depict sexual or toileting activity. This will protect more victims than the current Section 33 offence, which protects only images of a private and sexual nature.

Finally, these clauses will, for the first time, make it a criminal offence to share a manufactured or so-called deepfake image of another person without his or her consent. This form of intimate image abuse is becoming more prevalent, and we want to send a clear message that it will not be tolerated.

By virtue of placing these offences in the Sexual Offences Act 2003, we are extending to these offences also the current special measures, so that victims can benefit from them in court, and from anonymity provisions, which are so important when something so intimate has been shared without consent. This is only the first stage in our reform of the law in this area. We are committed to introducing additional changes, giving effect to further recommendations of the Law Commission’s report which are beyond the scope of the Bill, when parliamentary time allows.

I hope that noble Lords from across your Lordships’ House will agree that these amendments represent an important step forward in tackling intimate image abuse and protecting victims. I commend them to the House, and I beg to move.

Lord Allan of Hallam (LD)

My Lords, I welcome these new offences. From my professional experience, I know that what came to be known as “sextortion” created some of the most distressing cases you could experience, where an individual would obtain intimate images, often by deception, and then use them to make threats. This is where a social network is particularly challenging; it enables people to access a network of all the family and friends of an individual whose photo they now hold and to threaten to distribute it to their nearest and dearest. This affects men and women; many of the victims were men who were honey-potted into sharing intimate images and in the worst cases it led to suicide. It was not uncommon that people would feel that there was no way out; the threat was so severe that they would take their own lives. It is extremely welcome that we are doing something about it, and making it more obvious to anyone who is thinking about committing this kind of offence that they run the risk of criminal prosecution.

I have a few specific questions. The first is on the definitions in proposed new Section 66D, inserted by government Amendment 8, where the Government are trying to define what “intimate” or “nudity” represents. This takes me back again to my professional experience of going through slide decks and trying to decide what was on the right or wrong side of a nudity policy line. I will not go into the detail of everything it said, not least because I keep noticing younger people in the audience here, but I will leave you with the thought that you ended up looking at images that involved typically fishnets, in the case of women, and socks, in the case of men—I will leave the rest to your Lordships’ imaginations to determine at what point someone has gone from being clothed to nude. I can see in this amendment that the courts are going to have to deal with the same issues.

The serious point is that, where there is alignment between platform policies, definitions and what we do not want to be distributed, that is extremely helpful, because it then means that if someone does try to put an intimate image out across one of the major platforms, the platform does not have to ask whether there was consent. They can just say that it is in breach of their policy and take it down. It actually has quite a beneficial effect on slowing transmission.

The other point that comes out of that is that some of these questions of intimacy are quite culturally subjective. In some cultures, even a swimsuit photo could be used to cause humiliation and distress. I know this is extremely difficult; we do not want to be overly censorious but, at the same time, we do not want to leave people exposed to threats, and if you come from a culture where a swimsuit photo would be a threat, the definitions may not work for you. So I hope that, as we go through this, there will be a continued dialogue between experts in the platforms who have to deal with these questions and people working on the criminal offence side. To the extent that we can achieve it, there should be alignment and the message should go out that if you are thinking of distributing an image like this, you run the risk of being censored by the platforms but also of running into a criminal prosecution. That is on the mechanics of making it work.

14:00
I have two questions on the specifics of implementation. I am sure the Minister is going to confirm this, but will our definitions of photographs and films stretch to novel settings such as virtual reality? This is where somebody takes an image of an individual and creates a virtual reality avatar. Our expectation is that that is still within the definition of a photograph and will not escape the threat of prosecution. I hope he can confirm that.
Secondly, on the cross-jurisdictional questions that regularly come up, from experience, many of these sextortion cases occur cross-border. There are rings in particular countries that are well known, and law enforcement will be able to share information on those. It is well known where these rings are. If this offence is going to be effective, we have to make sure there is that cross-border co-operation between law enforcement agencies in each country. Otherwise, the problem we have today, which is that people feel they can do this with impunity, continues. If there is that cross-border co-operation, some of the regimes within which some of the perpetrators live will not treat them as nicely as we would if those convictions happen. Having created this offence, let us make sure it is effective, whether or not the perpetrator is in the United Kingdom. I hope that on those points the Minister can give some additional assurances.
Baroness Kidron (CB)

I also welcome these amendments and want to pay tribute to Maria Miller in the other place for her work on this issue. It has been extraordinary. I too was going to raise the issue of the definition of “photograph”, so perhaps the Minister could say or, even better, put it in the Bill. It does extend to those other contexts.

My main point is about children. We do not want to criminalise children, but this is pervasive among under-18s. I do want to make the distinction between those under-18s who intentionally harm another under-18 and have to be responsible for what they have done in the meaning of the law as the Minister set it out, and those who are under the incredible pressure—I do not mean coercion, because that is another out-clause—of oversharing that is inherent in the design of many of these services. That is an issue I am sure we are going to come back to later today. I would love to hear the Minister say something about the Government’s intention from the Dispatch Box: that it is preventive first and there is a balance between education and punishment for under-18s who find themselves unavoidably in this situation.

Baroness Morgan of Cotes (Con)

Very briefly, before I speak to these amendments, I want to welcome them. Having spoken on, and introduced, some of the provisions on threats to share intimate images under the Domestic Abuse Act 2021, I think it is really welcome that everything has been brought together in one place. Again, I pay tribute to the work of Dame Maria Miller and many others outside who have raised these issues. I also want to pay tribute to the Ministry of Justice Minister Edward Argar, who has also worked with my noble friend the Minister on this.

I have one specific question. The Minister did mention this in his remarks, but could he be absolutely clear about it, given that these amendments do not specifically mention the lifetime anonymity of complainants or the special measures in relation to giving evidence that apply to witnesses? That came up in the last group of amendments as well. Because they are not actually in this drafting, it would be helpful if he could put on record the relationship with the provisions in the Sexual Offences Act 2003. I know that would be appreciated by campaigners.

Lord Clement-Jones (LD)

My Lords, I have very little to add to the wise words that we have heard from my noble friend and from the noble Baronesses, Lady Kidron and Lady Morgan. We should thank all those who have got us to this place, including the Law Commission. It was a separate report. In that context, I would be very interested to hear a little more from the Minister about the programme of further offences that he mentioned. The communication offences that we have talked about so far are either the intimate images offences, which there was a separate report on, or other communications offences, which are also being dealt with as part of the Bill. I am not clear what other offences are in the programme.

Finally, the Minister himself raised the question of deepfakes. I have rustled through the amendments to see exactly how they are caught. The question asked by the noble Baroness, Lady Kidron, is more or less the same but put a different way. How are these deepfakes caught in the wording that is now being included in the Bill? This is becoming a big issue and we must be absolutely certain that it is captured.

Baroness Merron (Lab)

My Lords, I am grateful to the Minister for introducing this suite of government amendments. From these Benches we welcome them. From the nature of the debate, this seems to be very much a work in progress. I wish the Minister well as he and the Justice Minister continue to pick their way through a route to get us to where we need to be. I too thank the Law Commission, Dame Maria Miller MP and so many other campaigners who, as noble Lords have said, have got us to this important point.

However, as I am sure is recognised, the government amendments, with the best of intentions, still leave some areas as yet unresolved, particularly on sharing images with others: matters such as revenge porn and sending unwanted pictures on dating apps. There are areas still to be explored. The Minister and the Justice Minister said in a letter that, when parliamentary time allows, a broader package of offences will be brought forward. I realise that the Minister cannot be precise, but I would appreciate some sense of urgency or otherwise in terms of parliamentary time and when that might be.

We are only just starting to understand the impact of, for example, artificial intelligence, which we are about to come on to. That will be relevant in this regard too. We all understand that this is a bit of a moveable feast. The test will be whether this works. Can the Minister say a bit more about how this suite of measures will be kept under review and, in so doing, will the Government be looking at keeping an eye on the number of charges that are brought? How will this be reported to the House?

In line with this, will there be some consideration of the points that were raised in the previous group? I refer particularly to the issues raised in the amendments tabled by the noble Baroness, Lady Burt, especially where there may not be the intent, or the means, to obtain sexual gratification. They might be about “having a bit of a laugh”, as the noble Baroness said—which might be funny to some but really not funny to others.

In welcoming this, I hope that the Minister will indicate that this is just one step along the way and when we will see further steps.

Lord Parkinson of Whitley Bay (Con)

I am happy to respond clearly to that. As my right honourable friend Edward Argar MP and I said in our letter, this is just the first step towards implementing the changes which the Law Commission has recommended and which we agree are needed. We will implement a broader package of offences, covering, for instance, the taking of intimate images without consent, which were also part of the Law Commission’s report. The parameters of this Bill limit what we can do now. As I said in my opening remarks, we want to bring those forward now so that we can provide protections for victims in all the ways that the Bill gives us scope to do. We will bring forward further provisions when parliamentary time allows. The noble Baroness will understand that I cannot pre-empt when that is, although if we make good progress on the Bill, parliamentary time may allow for it sooner.

The noble Baroness also asked about our review. We will certainly take into account the number of prosecutions and charges that are brought. That is always part of our consideration of criminal law, but I am happy to reassure her that this will be the case here. These are new offences, and we want to make sure that they are leading to prosecutions to deter people from doing it.

The noble Lord, Lord Allan of Hallam, asked whether images will include those shared on virtual reality platforms and in other novel ways. As he knows, the Bill is written in a technologically neutral way to try to be future-proof and capture those technologies which have not yet been invented. I mentioned deepfakes in my opening remarks, which we can envisage. An image will be included on whatever platform it is shared, if it appears to be a photograph or film—that is to say, if it is photo-real. I hope that reassures him.

Lord Clement-Jones (LD)

If the Minister has time, can he actually direct us to that, because it is important that we are clear that it really is captured?

Lord Parkinson of Whitley Bay (Con)

In the amendments, if I can, I will. In the meantime, I reassure my noble friend Lady Morgan of Cotes that, as I said in opening, placing these offences in the Sexual Offences Act means that we are also extending the current special measures provisions to these offences, as we heard in our debate on the last group, so that victims can benefit from those in court. The same applies to anonymity provisions, which are so important when something so intimate has been shared without someone’s consent.

I promised in the previous group to outline the difference in the consent basis between this offence and the cyberflashing offence. Both are abhorrent behaviours which need to be addressed in criminal law. Although the levels of harm and distress may be the same in each case, the Law Commission recommended different approaches to take into account the different actions of the perpetrator in each offence. Sharing an intimate image of somebody without their consent is, in and of itself, wrongful, and a violation of their bodily privacy and sexual autonomy. Sending a genital image without the consent of the recipient is not, in and of itself, wrongful; consider, for instance, the example I gave in the previous debate of an artistic performance, or a photograph which depicts a naked protester. If such an image is sent without the consent of the recipient, it is not always or necessarily harmful. This is an issue which the Law Commission looked at in some detail.

The criminal law must take the culpability of the perpetrator into account. I reassure noble Lords that both we and the Law Commission have looked at these offences considerably, working with the police and prosecutors in doing so. We are confident that the Bill provides the comprehensive protection for victims that we all want to see, including in situations where a perpetrator may claim that it was just a joke.

The terms “photograph” and “film” are defined in proposed new Section 66D(5). That refers to the definition in new Section 66A, which refers to an image which is made or altered in any way

“which appears to be a photograph or film”.

That is where the point I make about photo-reality is captured.

The noble Baroness, Lady Kidron, is right to highlight that this is a matter not just for the criminal law. As we discussed on the previous group, it is also a matter for public education, so that young people and users of any age are aware of the legal boundaries and legal issues at stake here. That is why we have the public education campaigns to which I alluded in the previous group.

14:15
Baroness Kidron (CB)

I believe I misspoke when I asked my question. I referred to under-18s. Of course, if they are under 18 then it is child sexual abuse. I meant someone under the age of 18 with an adult image. I put that there for the record.

Lord Parkinson of Whitley Bay (Con)

If the noble Baroness misspoke, I understood what she intended. I knew what she was getting at.

With that, I hope noble Lords will be content not to press their amendments and that they will support the government amendments.

Amendment 7 agreed.
Amendment 7A not moved.
Amendment 8
Moved by
8: After Clause 170, insert the following new Clause—
“Sharing or threatening to share intimate photograph or film
In the Sexual Offences Act 2003, after section 66A (inserted by section 170), insert—“66B Sharing or threatening to share intimate photograph or film(1) A person (A) commits an offence if—(a) A intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state,(b) B does not consent to the sharing of the photograph or film, and(c) A does not reasonably believe that B consents.(2) A person (A) commits an offence if—(a) A intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state,(b) A does so with the intention of causing B alarm, distress or humiliation, and(c) B does not consent to the sharing of the photograph or film.(3) A person (A) commits an offence if—(a) A intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state, (b) A does so for the purpose of A or another person obtaining sexual gratification,(c) B does not consent to the sharing of the photograph or film, and(d) A does not reasonably believe that B consents.(4) A person (A) commits an offence if—(a) A threatens to share a photograph or film which shows, or appears to show, another person (B) in an intimate state, and(b) A does so—(i) with the intention that B or another person who knows B will fear that the threat will be carried out, or(ii) being reckless as to whether B or another person who knows B will fear that the threat will be carried out.(5) Subsections (1) to (4) are subject to section 66C (exemptions).(6) For the purposes of subsections (1) to (3) and section 66C(3)(b)—(a) “consent” to the sharing of a photograph or film includes general consent covering the particular act of sharing as well as specific consent to the particular act of sharing, and(b) whether a belief is reasonable is to be determined having regard to all the circumstances including any steps A has taken to ascertain whether B consents.(7) Where a person is charged with an offence under subsection (4), it is not necessary for the prosecution to prove—(a) that the photograph or film mentioned in the threat exists, or(b) if it does exist, that it is in fact a photograph or film which shows or appears to show a person in an intimate state.(8) It is a defence for a person charged with an offence under subsection (1) to prove that the person had a reasonable excuse for sharing the photograph or film.(9) A person who commits an offence under subsection (1) is liable on summary conviction to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both).(10) A person who commits an offence under subsection (2), (3) or (4) is liable—(a) on summary conviction, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);(b) on conviction on indictment, to imprisonment for a term not exceeding 2 years.(11) In subsection (9) “the maximum term for summary offences” means—(a) if the offence is committed before the time when section 281(5) of the Criminal Justice Act 2003 comes into force, six months;(b) if the offence is committed after that time, 51 weeks.(12) If on the trial of a person charged with an offence under subsection (2) or (3) a magistrates’ court or jury finds the person not guilty of the offence charged, the magistrates’ court or jury may find the person guilty of an offence under subsection (1).(13) The Crown Court has the same powers and duties in relation to a person who is by virtue of subsection (12) convicted 
before it of an offence under subsection (1) as a magistrates’ court would have on convicting the person of the offence. 66C Sharing or threatening to share intimate photograph or film: exemptions(1) A person (A) who shares a photograph or film which shows, or appears to show, another person (B) in an intimate state does not commit an offence under section 66B(1), (2) or (3) if—(a) the photograph or film was taken in a place to which the public or a section of the public had or were permitted to have access (whether on payment or otherwise),(b) B had no reasonable expectation of privacy from the photograph or film being taken, and(c) B was, or A reasonably believes that B was, in the intimate state voluntarily.(2) For the purposes of subsection (1)(b), whether a person had a reasonable expectation of privacy from a photograph or film being taken is to be determined by reference to the circumstances that the person sharing the photograph or film reasonably believes to have existed at the time the photograph or film was taken.(3) A person (A) who shares a photograph or film which shows, or appears to show, another person (B) in an intimate state does not commit an offence under section 66B(1), (2) or (3) if—(a) the photograph or film had, or A reasonably believes that the photograph or film had, been previously publicly shared, and(b) B had, or A reasonably believes that B had, consented to the previous sharing.(4) A person (A) who shares a photograph or film which shows, or appears to show, another person (B) in an intimate state does not commit an offence under section 66B(1) if—(a) B is a person under 16,(b) B lacks, or A reasonably believes that B lacks, capacity to consent to the sharing of the photograph or film, and(c) the photograph or film is shared—(i) with a healthcare professional acting in that capacity, or(ii) otherwise in connection with the care or treatment of B by a healthcare professional.(5) A person who shares a photograph or film which shows, or appears to show, a child in an intimate state does not commit an offence under section 66B(1) if the photograph or film is of a kind ordinarily shared between family and friends.(6) A person who threatens to share a photograph or film which shows, or appears to show, another person in an intimate state does not commit an offence under section 66B(4) if, by reason of this section, the person would not commit an offence under section 66B(1), (2) or (3) by sharing the photograph or film in the circumstances conveyed by the threat.66D Sharing or threatening to share intimate photograph or film: interpretation(1) This section applies for the purposes of sections 66B and 66C.(2) A person “shares” something if the person, by any means, gives or shows it to another person or makes it available to another person.(3) But a provider of an internet service by means of which a photograph or film is shared is not to be regarded as a person who shares it.(4) “Photograph” and “film” have the same meaning as in section 66A (see subsections (3) to (5) of that section). 
(5) Except where a photograph or film falls within subsection (8), a photograph or film “shows, or appears to show, another person in an intimate state” if it shows or appears to show—(a) the person participating or engaging in an act which a reasonable person would consider to be a sexual act,(b) the person doing a thing which a reasonable person would consider to be sexual,(c) all or part of the person’s exposed genitals, buttocks or breasts,(d) the person in an act of urination or defecation, or(e) the person carrying out an act of personal care associated with the person’s urination, defecation or genital or anal discharge.(6) For the purposes of subsection (5)(c) the reference to all or part of a person’s “exposed” genitals, buttocks or breasts includes—(a) a reference to all or part of the person’s genitals, buttocks or breasts visible through wet or otherwise transparent clothing,(b) the case where all or part of the person’s genitals, buttocks or breasts would be exposed but for the fact that they are covered only with underwear, and(c) the case where all or part of the person’s genitals, buttocks or breasts would be exposed but for the fact that they are obscured, provided that the area obscured is similar to or smaller than an area that would typically be covered by underwear worn to cover a person’s genitals, buttocks or breasts (as the case may be).(7) In subsection (6)(c) “obscured” means obscured by any means, other than by clothing that a person is wearing, including, in particular, by an object, by part of a person’s body or by digital alteration.(8) A photograph or film falls within this subsection if (so far as it shows or appears to show a person in an intimate state) it shows or appears to show something, other than breastfeeding, that is of a kind ordinarily seen in public.(9) For the purposes of subsection (8) “breastfeeding” includes the rearranging of clothing in the course of preparing to breastfeed or having just finished breastfeeding.””Member’s explanatory statement
This amendment provides for new offences of sharing or threatening to share intimate photographs or films.
Amendment 8 agreed.
Amendment 9
Moved by
9: After Clause 171, insert the following new Clause—
“Repeals in connection with offences under section (Sharing or threatening to share intimate photograph or film)
Sections 33 to 35 of the Criminal Justice and Courts Act 2015 (disclosing or threatening to disclose private sexual photographs and films with intent to cause distress) are repealed.”Member’s explanatory statement
This amendment is consequential on the new Clause creating offences of sharing or threatening to share intimate photographs or films.
Amendment 9 agreed.
Clause 172: Consequential amendments
Amendments 10 and 11
Moved by
10: Clause 172, page 150, line 15, leave out “section 170” and insert “sections 170 and (Sharing or threatening to share intimate photograph or film)”
Member’s explanatory statement
This amendment provides that Part 3 of Schedule 14 also makes consequential amendments on the new Clause creating offences of sharing and threatening to share intimate photographs or films.
11: Clause 172, page 150, line 15, at end insert—
“(4) Part 4 of Schedule 14 contains amendments consequential on section (Repeals in connection with offences under section (Sharing or threatening to share intimate photograph or film)).”Member’s explanatory statement
This amendment introduces a new Part of Schedule 14 which makes consequential amendments on the new Clause in my name repealing sections 33 to 35 of the Criminal Justice and Courts Act 2015.
Amendments 10 and 11 agreed.
Schedule 14: Amendments consequential on offences in Part 10 of this Act
Amendments 12 to 26
Moved by
12: Schedule 14, page 240, line 24, after first “the” insert “first”
Member’s explanatory statement
This is a technical amendment ensuring that the amendments made under Schedule 14 to Schedule 1 to the Children and Young Persons Act 1933 are inserted in the correct place in that Act.
13: Schedule 14, page 240, line 25, after “66A” insert “, 66B”
Member’s explanatory statement
This amendment adds a reference to the new offences of sharing and threatening to share an intimate photograph or film to Schedule 1 to the Children and Young Persons Act 1933 (offences to which certain provisions of that Act apply).
14: Schedule 14, page 240, line 25, at end insert—
“13A_ In section 65A of the Police and Criminal Evidence Act 1984 (“qualifying offences” for the purposes of Part 5 of that Act), in subsection (2)(p) after “61 to” insert “66A, 66B(2) and (3),”.”Member’s explanatory statement
This amendment adds a reference to certain of the new offences of sharing an intimate photograph or film to section 65A(2) of the Police and Criminal Evidence Act 1984 (meaning of “qualifying offence” for the purposes of Part 5 of that Act).
15: Schedule 14, page 240, line 25, at end insert—
“13A_ In section 6 of the Sexual Offences (Amendment) Act 1992 (interpretation), after subsection (2A) insert—
“(2B) For the purposes of this Act, where it is alleged or there is an accusation that an offence under section 66B(4) of the Sexual Offences Act 2003 (threatening to share intimate photograph or film) has been committed, the person against whom the offence is alleged to have been committed is to be regarded as—
(a) the person to whom the threat mentioned in that subsection is alleged to have been made, and
(b) (if different) the person shown, or who appears to be shown, in an intimate state in the photograph or film that is the subject of the threat.””
Member’s explanatory statement
This amendment has the effect of applying the provisions of the Sexual Offences (Amendment) Act 1992 to the person shown or who appears to be shown in an intimate photograph or film where a threat to share the photograph or film is made to a person other than that person.
16: Schedule 14, page 240, line 27, at end insert—
“(1A) In section 78 (meaning of “sexual”), after “15A” insert “, 66B to 66D”.”
Member’s explanatory statement
This amendment provides that the existing definition of “sexual” in section 78 of the Sexual Offences Act 2003 does not apply to the new offences of sharing and threatening to share an intimate photograph or film (on account of a separate definition applying to those offences).
17: Schedule 14, page 240, line 29, after “66A” insert “, 66B(2) and (3)”
Member’s explanatory statement
This amendment adds a reference to certain of the new offences of sharing an intimate photograph or film to section 136A(3A) of the Sexual Offences Act 2003 (offences specified as child sex offences for the purposes of Part 2A of that Act when committed against a person under 18).
18: Schedule 14, page 241, line 4, at end insert—
“33B_ An offence under section 66B(3) of this Act (sharing intimate photograph or film for purpose of obtaining sexual gratification) if—
(a) where the offender was under 18, the offender is or has been sentenced in respect of the offence to imprisonment for a term of at least 12 months;
(b) in any other case—
(i) the victim was under 18, or
(ii) the offender, in respect of the offence or finding, is or has been—
(a) sentenced to a term of imprisonment,
(b) detained in a hospital, or
(c) made the subject of a community sentence of at least 12 months.””
Member’s explanatory statement
This amendment adds a reference to the new offence of sharing an intimate photograph or film for the purpose of obtaining sexual gratification to Schedule 3 to the Sexual Offences Act 2003 (offences to which certain provisions of that Act apply).
19: Schedule 14, page 241, line 10, at end insert—
“149B_ An offence under section 66B(2) or (3) of that Act (sharing intimate photograph or film with intent to cause alarm, distress or humiliation or for purpose of obtaining sexual gratification).””
Member’s explanatory statement
This amendment adds a reference to certain of the new offences of sharing an intimate photograph or film to Schedule 15 to the Criminal Justice Act 2003 (specified sexual offences for the purposes of section 325 of that Act).
20: Schedule 14, page 241, line 12, after “66A” insert “, 66B(2) or (3)”
Member’s explanatory statement
This amendment adds a reference to certain of the new offences of sharing an intimate photograph or film to Schedule 34A to the Criminal Justice Act 2003 (child sex offences for the purposes of section 327A of that Act).
21: Schedule 14, page 241, line 12, at end insert “, and
(b) after “exposure” insert “, sending etc photograph or film of genitals, sharing intimate photograph or film with intent to cause alarm, distress or humiliation or for purpose of obtaining sexual gratification”.”
Member’s explanatory statement
This amendment is consequential on the other amendment to Schedule 34A to the Criminal Justice Act 2003 made in my name.
22: Schedule 14, page 241, line 17, after “66A” insert “, 66B(2) and (3)”
Member’s explanatory statement
This amendment adds a reference to certain of the new offences of sharing an intimate photograph or film to section 116 of the Anti-social Behaviour, Crime and Policing Act 2014 (conduct constituting offence amounting to “child sexual exploitation” when committed against a person under 18 for the purposes of that section).
23: Schedule 14, page 241, line 17, at end insert “, and
(b) after “exposure” insert “, sending etc photograph or film of genitals, sharing intimate photograph or film with intent to cause alarm, distress or humiliation or for purpose of obtaining sexual gratification”.”
Member’s explanatory statement
This amendment is consequential on the other amendment to section 116 of the Anti-social Behaviour, Crime and Policing Act 2014 made in my name.
24: Schedule 14, page 241, line 22, at end insert—
“section 66B(2) (sharing intimate photograph or film with intent to cause alarm, distress or humiliation)
section 66B(3) (sharing intimate photograph or film for purpose of obtaining sexual gratification)”.”
Member’s explanatory statement
This amendment adds a reference to certain of the new offences of sharing an intimate photograph or film to paragraph 33 of Schedule 4 to the Modern Slavery Act 2015 (offences to which the defence in section 45 does not apply).
25: Schedule 14, page 241, line 27, at end insert—
“(axb) section 66B(2) (sharing intimate photograph or film with intent to cause alarm, distress or humiliation);
(axc) section 66B(3) (sharing intimate photograph or film for purpose of obtaining sexual gratification);”.”
Member’s explanatory statement
This amendment adds a reference to certain of the new offences of sharing an intimate photograph or film to Part 2 of Schedule 18 to the Sentencing Act 2020 (specified sexual offences for the purposes of section 306 of that Act).
26: Schedule 14, page 241, line 32, at end insert—
“PART 4
AMENDMENTS CONSEQUENTIAL ON SECTION (REPEALS IN CONNECTION WITH OFFENCES UNDER SECTION (SHARING OR THREATENING TO SHARE INTIMATE PHOTOGRAPH OR FILM))
Criminal Justice and Courts Act 2015
20_(1) The Criminal Justice and Courts Act 2015 is amended as follows.
(2) In section 96 (extent), in subsection (6), omit paragraphs (c) and (g).
(3) Omit Schedule 8 (disclosing or threatening to disclose private sexual photographs or films: providers of information society services).
Domestic Abuse Act 2021
21_(1) The Domestic Abuse Act 2021 is amended as follows.
(2) Omit section 69 (threats to disclose private sexual photographs and films with intent to cause distress) and the italic heading before it.
(3) In section 85 (power to make consequential provision), in subsection (1)(b), omit “69,”.
(4) In section 86 (power to make transitional or saving provision), in subsection (1)(b), omit “69,”.
Overseas Operations (Service Personnel and Veterans) Act 2021
22_ In Part 1 of Schedule 1 to the Overseas Operations (Service Personnel and Veterans) Act 2021 (“excluded offences” for the purposes of section 6 of that Act), omit paragraph 11.
Criminal Justice (Electronic Commerce) (Amendment) (EU Exit) Regulations 2021 (S.I. 2021/835)
23_ In the Criminal Justice (Electronic Commerce) (Amendment) (EU Exit) Regulations 2021, omit regulation 8 (amendment of the Criminal Justice and Courts Act 2015).”
Member’s explanatory statement
This amendment inserts a new Part into Schedule 14 consequential on the new Clause in my name repealing sections 33 to 35 of the Criminal Justice and Courts Act 2015.
Amendments 12 to 26 agreed.
14:17
Consideration on Report adjourned until not before 2.48 pm.

Online Safety Bill

Report (1st Day) (Continued)
14:48
Schedule 1: Exempt user-to-user and search services
Amendment 27
Moved by
27: Schedule 1, page 185, line 11, leave out from “provider” to end of line 13 and insert “, including where the publication of the content is effected or controlled by means of—
(a) software or an automated tool or algorithm applied by the provider or by a person acting on behalf of the provider, or
(b) an automated tool or algorithm made available on the service by the provider or by a person acting on behalf of the provider.”
Member’s explanatory statement
This amendment is about what counts as “provider content” for the purposes of the exemption in paragraph 4 of Schedule 1 of the Bill (which provides that limited functionality services are exempt). Words are added to expressly cover the case where an automated tool or algorithm is made available on the service by a provider, such as a generative AI bot.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the Government are committed to protecting children against accessing pornography online. As technology evolves, it is important that the regulatory framework introduced by the Bill keeps pace with emerging risks to children and exposure to pornography in new forms, such as generative artificial intelligence.

Part 5 of the Bill has been designed to be future-proof, and we assess that it would already capture AI-generated pornography. Our Amendments 206 and 209 will put beyond doubt that content is “provider pornographic content” where it is published or displayed on a Part 5 service by means of an automated tool or algorithm, such as a generative AI bot, made available on the service by a provider. Amendments 285 and 293 make clear that the definition of an automated tool includes a bot. Amendment 276 clarifies the definition of a provider of a Part 5 service, to make clear that a person who controls an AI bot that generates pornography can be regarded as the provider of a service.

Overall, our amendments provide important certainty for users, providers and Ofcom on the services and content in scope of the Part 5 duties. This will ensure that the new, robust duties for Part 5 providers to use age verification or age estimation to prevent children accessing provider pornographic content will also extend to AI-generated pornography. I beg to move.

Baroness Harding of Winscombe (Con)

My Lords, the noble Baroness, Lady Kidron, has unfortunately been briefly detained. If you are surprised to see me standing up, it is because I am picking up for her. I start by welcoming these amendments. I am grateful for the reaction to the thought-provoking debate that we had in Committee. I would like to ask a couple of questions just to probe the impact around the edges.

Amendment 27 looks as if it implies that purely content-generating machine-learning or AI bots could be excluded from the scope of the Bill, rather than included, which is the opposite of what we were hoping to achieve. That may be us failing to understand the detail of this large body of different amendments, but I would welcome my noble friend the Minister’s response to make sure that in Amendment 27 we are not excluding harm that could be generated by some form of AI or machine-learning instrument.

Maybe I can give my noble friend the Minister an example of what we are worried about. This is a recent scenario that noble Lords may have seen in the news, of a 15 year-old who asked, “How do I have sex with a 30 year-old?”. The answer was given in forensic detail, with no reference to the fact that it would in fact be statutory rape. Would the regulated service, or the owner of the regulated service that generated that answer, be included or excluded as a result of Amendment 27? That may be my misunderstanding.

This group is on AI-generated pornography. My friend, the noble Baroness, Lady Kidron, and I are both very concerned that it is not just about pornography, and that we should make sure that AI is included in the Bill. Specifically, many of us with teenage children will now be learning how to navigate the Snap AI bot. Would harm generated by that bot be captured in these amendments, or is it only content that is entirely pornographic? I hope that my noble friend the Minister can clarify both those points, then we will be able to support all these amendments.

Lord Allan of Hallam (LD)

My Lords, I rise briefly to welcome the fact that there is a series of amendments here where “bot” is replaced by

“bot or other automated tool”.

I point out that there is often a lot of confusion about what a bot is or is not. It is something that was largely coined in the context of a particular service—Twitter—where we understand that there are Twitter bots: accounts that have been created to pump out lots of tweets. In other contexts, on other services, there is similar behaviour but the mechanism is different. It seems to me that the word “bot” may turn out to be one of those things that was common and popular at the end of the 2010s and in the early 2020s, but in five years we will not be using it at all. It will have served its time, it will have expired and we will be using other language to describe what it is that we want to capture: a human being has created some kind of automated tool that will be very context dependent, depending on the nature of the service, and they are pumping out material. It is very clear that we want to make sure that such behaviour is in scope and that the person cannot hide behind the fact that it was an automated tool, because we are interested in the mens rea of the person sitting behind the tool.

I recognise that the Government have been very wise in making sure that whenever we refer to a bot we are adding that “automated tool” language, which will make the Bill inherently much more future-proof.

Lord Clement-Jones (LD)

My Lords, I just want to elucidate whether the Minister has any kind of brief on my Amendment 152A. I suspect that he does not; it is not even grouped—it is so recent that it is actually not on today’s groupings list. However, just so people know what will be coming down the track, I thought it would be a good idea at this stage to say that it is very much about exactly the question that the noble Baroness, Lady Harding, was asking. It is about the interaction between a provider environment and a user, with the provider environment being an automated bot—or “tool”, as my noble friend may prefer.

It seems to me that we have an issue here. I absolutely understand what the Minister has done, and I very much support Amendment 153, which makes it clear that user-generated content can include bots. But this is not so much about a human user using a bot or instigating a bot; it is much more about a human user encountering content that is generated in an automated way by a provider, and then the user interacting with that in a metaverse-type environment. Clearly, the Government are apprised of that with regard to Part 5, but there could be a problem as regards Part 3. This is an environment that the provider creates, but it is interacted with by a user as if that environment were another user.

I shall not elaborate or make the speech that I was going to make, because that would be unfair to the Minister, who needs to get his own speaking note on this matter. But I give him due warning that I am going to degroup and raise this later.

The Lord Bishop of Oxford

My Lords, I warmly welcome this group of amendments. I am very grateful to the Government for a number of amendments that they are bringing forward at this stage. I want to support this group of amendments, which are clearly all about navigating forward and future-proofing the Bill in the context of the very rapid development of artificial intelligence and other technologies. In responding to this group of amendments, will the Minister say whether he is now content that the Bill is sufficiently future-proofed, given the hugely rapid development of technology, and whether he believes that Ofcom now has sufficient powers to risk assess for the future and respond, supposing that there were further parallel developments in generative AI such as we have seen over the past year?

Lord Stevenson of Balmacara (Lab)

My Lords, this is a quick-fire debate on matters where most of us probably cannot even understand the words, let alone the purpose and particularity of the amendments. I want to raise points already raised by others: it seems that the Government’s intention is to ensure that the Bill is future-proofed. Why then are they restricting this group to Part 5 only? It follows that, since Part 5 is about pornography, it has to be about only pornography—but it is rather odd that we are not looking at the wider context under which harm may occur, involving things other than simply pornography. While the Bill may well be currently able to deal with the issues that are raised in Part 3 services, does it not need to be extended to that as well? I shall leave it at that. The other services that we have are probably unlikely to raise the sorts of issues of concern that are raised by this group. None the less, it is a point that we need reassurance on.

15:00
Lord Parkinson of Whitley Bay (Con)

My Lords, this has been a short but important debate and I am grateful to noble Lords for their broad support for the amendments here and for their questions. These amendments will ensure that services on which providers control a generative tool, such as a generative AI bot, are in scope of Part 5 of the Bill. This will ensure that children are protected from any AI-generated pornographic content published or displayed by provider-controlled generative bots. These changes will not affect the status of any non-pornographic AI-generated content, or AI-generated content shared by users.

We are making a minor change to definitions in Part 3 to ensure that comments or reviews on content generated by a provider-controlled artificial intelligence source are not regulated as user-generated content. This is consistent with how the Bill treats comments and reviews on other provider content. These amendments do not have any broader impact on the treatment of bots by Part 3 of the Bill’s regime beyond the issue of comments and reviews. The basis on which a bot will be treated as a user, for example, remains unchanged.

I am grateful to the noble Lord, Lord Clement-Jones, for degrouping his Amendment 152A so that I can come back more fully on it in a later group and I am grateful for the way he spoke about it in advance. I am grateful too for my noble friend Lady Harding’s question. These amendments will ensure that providers which control a generative tool on a service, such as a generative AI bot, are in scope of Part 5 of the Bill. A text-only generative AI bot would not be in scope of Part 5. It is important that we focus on areas which pose the greatest risk of harm to children. There is an exemption in Part 5 for text-based provider pornographic content because of the limited risks posed by published pornographic content. This is consistent with the approach of Part 3 of the Digital Economy Act 2017 and its provisions to protect children from commercial online pornography, which did not include text-based content in scope.

The right reverend Prelate the Bishop of Oxford is right to ask whether we think this is enough. These changes certainly help. The fact that the Bill is written in a technology-neutral way will help us to future-proof it but, as we have heard throughout the passage of the Bill, we all know that this area of work will need constant examination and scrutiny. That is why the Bill is subject to the post-Royal Assent review and scrutiny that it is, and why we are grateful for the attention noble Lords and Members of Parliament in the other place have already given to ensuring that it delivers on what we want to see. I believe these amendments, which put beyond doubt important provisions relating to generative AI, are a helpful addition and I beg to move.

Amendment 27 agreed.
Amendment 28
Moved by
28: Schedule 1, page 185, line 23, at end insert—
“Public information services
5A A user-to-user service is exempt if its primary purpose is the creation of public information resources and it has the following characteristics—
(a) user-to-user functions are limited to those necessary for the creation and maintenance of a public information resource,
(b) OFCOM has determined that there is minimal risk of users sharing harmful content on the service, and
(c) it is non-commercial.”
Member’s explanatory statement
This amendment would allow OFCOM to exempt services like Wikipedia from regulation where it deems them to be low risk.
Lord Allan of Hallam (LD)

My Lords, as we enter the final stages of consideration of this Bill, it is a good time to focus a little more on what is likely to happen once it becomes law, and my Amendment 28 is very much in that context. We now have a very good idea of what the full set of obligations that in-scope services will have to comply with will look like, even if the detailed guidance is still to come.

With this amendment I want to return to the really important question that I do not believe we answered satisfactorily when we debated it in Committee. That is that there is a material risk that, without further amendment or clarification, Wikipedia and other similar services may feel that they can no longer operate in the United Kingdom.

Wikipedia has already featured prominently in our debates, but there are other major services that might find themselves in a similar position. As I was discussing the definitions in the Bill with my children yesterday—this may seem an unusual dinner conversation with teenagers, but I find mine to be a very useful sounding board—they flagged that OpenStreetMap, to which we all contribute, also seems to be in the scope of how we have defined user-to-user services. I shall start by asking some specific questions so that the Minister has time to find the answers in his briefing or have them magically delivered to him before summing up: I shall ask the questions and then go on to make the argument.

First, is it the Government’s view that Wikipedia and OpenStreetMap fall within the definition of user-to-user services as defined in Clause 2 and the content definition in Clause 211? We need to put all these pieces together to understand the scope. I have chosen these services because each is used by millions of people in the UK and their functionality is very well known, so I trust that the Government had them in mind when they were drafting the legislation, as well as the more obvious services such as Instagram, Facebook et cetera.

Secondly, can the Minister confirm whether any of the existing exemptions in the Bill would apply to Wikipedia and OpenStreetMap such that they would not have to comply with the obligations of a category 1 or 2B user-to-user service?

Thirdly, does the Minister believe that the Bill as drafted allows Ofcom to use its discretion in any other way to exempt Wikipedia and OpenStreetMap, for example through the categorisation regulations in Schedule 11? As a spoiler alert, I expect the answers to be “Yes”, “No” and “Maybe”, but it is really important that we have the definitive government response on the record. My amendment would seek to turn that to “Yes”, “Yes” and therefore the third would be unnecessary because we would have created an exemption.

The reason we need to do this is not in any way to detract from the regulation or undermine its intent but to avoid facing the loss of important services at some future date because of situations we could have avoided. This is not hyperbole or a threat on the part of the services; it is a natural consequence if we impose legal requirements on a responsible organisation that wants to comply with the law but knows it cannot meet them. I know it is not an intended outcome of the Bill that we should drive these services out, but it is certainly one intended outcome that we want other services that cannot meet their duties of care to exit the UK market rather than continue to operate here in defiance of the law and the regulator.

We should remind ourselves that at some point, likely to be towards the end of 2024, letters will start to arrive on the virtual doormats of all the services we have defined as being in scope—these 25,000 services—and their senior management will have a choice. I fully expect that the Metas, the Googles and all such providers will say, “Fine, we will comply. Ofcom has told us what we need to do, and we will do it”. There will be another bunch of services that will say, “Ofcom, who are they? I don’t care”, and the letter will go in the bin. We have a whole series of measures in the Bill by which we will start to make life difficult for them: we will disrupt their businesses and seek to prosecute them and we will shut them out of the market.

However, there is a third category, which is the one I am worried about in this amendment, who will say, “We want to comply, we are responsible, but as senior managers of this organisation”, or as directors of a non-profit foundation, “we cannot accept the risk of non-compliance and we do not have the resources to comply. There is no way that we can build an appeals mechanism, user reporting functions and all these things we never thought we would need to have”. If you are Wikipedia or OpenStreetMap, you do not need to have that infrastructure, yet as I read the Bill, if they are in scope and there is no exemption, then they are going to be required to build all that additional infrastructure.

The Bill already recognises that there are certain classes of services where it would be inappropriate to apply this new regulatory regime, and it describes these in Schedule 1, which I am seeking to amend. My amendment just seeks to add a further class of exempted service and it does this quite carefully so that we would exclude only services that I believe most of us in this House would agree should not be in scope. There are three tests that would be applied.

The first is a limited functionality test—we already have something similar in Schedule 1—so that the user-to-user functions are only those that relate to the production of what I would call a public information resource. In other words, users engage with one another to debate a Wikipedia entry or a particular entry on a map on OpenStreetMap. So, there is limited user-to-user functionality all about this public interest resource. They are not user-to-user services in the classic sense of social media; they are a particular kind of collective endeavour. These are much closer to newspaper publishers, which we have explicitly excluded from the Bill. It is much more like a newspaper; it just happens to be created by users collectively, out of good will, rather than by paid professional journalists. They are very close to that definition, but if you read Schedule 1, I do not think the definition of “provider content” in paragraph 4(2) includes at the moment these collective-user endeavours, so they do not currently have the exemption.

I have also proposed that Ofcom would carry out a harm test to avoid the situation where someone argues that their services are a public information resource, while in practice using it to distribute harmful material. That would be a rare case, but noble Lords can conceive of it happening. Ofcom would have the ability to say that it recognises that Wikipedia does not carry harmful content in any meaningful way, but it would also have the right not to grant the exemption to service B that says it is a new Wikipedia but carries harmful content.

Thirdly, I have suggested that this is limited to non-commercial services. There is an argument for saying any public information resource should benefit, and that may be more in line with the amendment proposed by the noble Lord, Lord Moylan, where it is defined in terms of being encyclopaedic or the nature of the service. I recognise that I have put in “non-commercial” as belt and braces because there is a rationale for saying that, while we do not really want an encyclopaedic resource to be in the 2B service if it has got user-to-user functions, if it is commercial, we could reasonably expect it to find some way to comply. It is different when it is entirely non-commercial and volunteer-led, not least because the Wikimedia Foundation, for example, would struggle to justify spending the money that it has collected from donors on compliance costs with the UK regime, whereas a commercial company could increase its resources from commercial customers to do that.

I hope this is a helpful start to a debate in which we will also consider Amendment 29, which has similar goals. I will close by asking the Minister some additional questions. I have asked him some very specific ones to which I hope he can provide answers, but first I ask: does he acknowledge the genuine risk that services like Wikipedia and OpenStreetMap could find themselves in a position where they have obligations under the Bill that they simply cannot comply with? It is not that they are unwilling, but there is no way for them to do all this structurally.

Secondly, I hope the Minister would agree that it is not in the public interest for Ofcom to spend significant time and effort on the oversight of services like these; rather, it should spend its time and effort on services, such as social media services, that we believe to be creating harms and are the central focus of the Bill.

Thirdly, will the Minister accept that there is something very uncomfortable about a government regulator interfering with the running of a neutral public resource like Wikipedia, when there is so much benefit from it and little or no demonstrable harm? It is much closer to the model that exists for a newspaper. We have debated endlessly in this House—and I am sure we will come back to it—that there is, rightly, considerable reluctance to have regulators going too far and creating this relationship with neutral public information goods. Wikipedia falls into that category, as does OpenStreetMap and others, and there would be fundamental in-principle challenges around that.

I hope the Government will agree that we should be taking steps to make sure we are not inadvertently creating a situation where, in one or two years’ time, Ofcom will come back to us saying that it wrote to Wikipedia, because the law told it to do so, and told Wikipedia all the things that it had to do; Wikipedia took it to its senior management and then came back saying that it is shutting shop in the UK. Because it is sensible, Ofcom would come back and say that it did not want that and ask to change the law to give it the power to grant an exemption. If such things deserve an exemption, let us make it clear they should have it now, rather than lead ourselves down this path where we end up effectively creating churn and uncertainty around what is an extraordinarily valuable public resource. I beg to move.

Lord Moylan (Con)

My Lords, Amendments 29 and 30 stand in my name. I fully appreciated, as I prepared my thoughts ahead of this short speech, that a large part of what I was going to say might be rendered redundant by the noble Lord, Lord Allan of Hallam. I have not had a discussion with him about this group at all, but it is clear that his amendment is rather different from mine. Although it addresses the same problem, we are coming at it slightly differently. I actually support his amendment, and if the Government were to adopt it I think the situation would be greatly improved. I do prefer my own, and I think he put his finger on why to some extent: mine is a little broader. His relates specifically to public information, whereas mine relates more to what can be described as the public good. So mine can be broader than information services, and I have not limited it to non-commercial operations, although I fully appreciate that quite a lot of the services we are discussing are, in practice, non-commercial. As I say, if his amendment were to pass, I would be relatively satisfied, but I have a moderate preference for my own.

15:15
Wikipedia has been mentioned frequently, and the Government cannot say that they have not had notice of this problem, because it was frequently mentioned in Committee. The fact that the Government have not come forward with any suggestions or amendments to address this at all—I can summarise what I think is their response in a moment—is truly remarkable. There has been an open letter signed recently drawing attention to this problem, which includes not only OpenStreetMap, which the noble Lord referred to, but the Heritage Alliance; INSPIRE, which is a physics research platform operated by CERN; the Wellcome Sanger Institute; the Chartered Institute of Library and Information Professionals in Scotland; and Liberty. They are all concerned about how they are going to operate. They will be caught. Curiously, the Taliban will not be caught, because the Taliban will benefit from the exemption that exists in paragraph 9(1)(c) of Schedule 1 for a “foreign sovereign power”. So the Taliban will not be in Ofcom’s scope at all, but all these organisations doing perfectly decent work are going to be chased down by our regulator.
The Minister has said from the Dispatch Box that he does not think that Wikipedia would be in scope. But when pressed on this, both at the Dispatch Box and in private conversation—for which I am grateful—he said that that was his opinion but it was going to be decided by Ofcom; his assurance at the Dispatch Box, in other words, carries no weight, because the decision is to be made by Ofcom. When I say that we can still pass these amendments, he says we cannot tell Ofcom what to do because it is independent. But the entire Bill is telling Ofcom what to do, and of course Parliament can tell it that services of this character are not in scope. The Bill specifies what services are and are not in scope. So I think it is pretty much a nonsense answer.
Although these organisations, in many cases, are non-profits, that does not mean they are not businesses, and businesses have to plan, invest and think about what they are going to do next. They are hanging there, waiting and absolutely uncertain until Ofcom make a regulatory decision of an existential character, because we in Parliament cannot possibly take a stance on it and the Government cannot have a view. I know that businesses are constantly subject to the possibility of regulatory change in the future, but I do not know of any regulatory changes that, in the normal course of events, threaten the entire business model. They might threaten how it is you plan to make a particular product or what insulation you might have to put in a house you are building; they do not put you entirely out of business, which is what this threatens to do. So I think there is a very strong argument indeed for an amendment that takes these services out of scope, not only because it is a nonsense not to but because it really does threaten investment, planning, jobs and the things that go with that.
My Amendments 29 and 30 should be taken together. Amendment 29 creates an exemption, the terms of which are stated very plainly so I do not need to read them out. Amendment 30 creates what is being referred to as a “rescue clause”; in other words, it says that Ofcom has the discretion to withdraw the exemption if it sees that it is in the public good to do so. If any of these services were to start going rogue and behaving in a way that was contrary to the public interest, or objectionable in terms of how the Bill operates and is intended to operate, Ofcom would be able to intervene and say, “That exemption no longer applies; if you get the exemption under Amendment 29, we can take it away under Amendment 30”. This is not unprecedented. This rescue clause has been almost cut and pasted from the Gambling Act. This process of creating an exemption which can be withdrawn is not unprecedented and has great merit. This is why I recommend Amendments 29 and 30 while not wanting to be exclusive to Amendment 28.
This is not going away. Germany exempts non-profit organisations. France has recently passed laws with similar scope to ours and has exempted those entities which operate in the public interest. There is nothing strange about doing what I and the noble Lord, Lord Allan, want to do. This will not go away, because the consequences could be very severe. It is not a question of whether Wikipedia will close. Wikipedia in the English language will probably survive all of this. It has a lot of people supporting it and a lot of volunteers working for it. However, what of many of the minor languages? Wait until the Government find out whether Welsh Wikipedia, the largest Welsh-language website in the world, will survive, and what the consequences will be. What will the Government say when they start getting letters? “Oh, it’s nothing to do with us, we have no answer for that, it’s all a matter for Ofcom”. This is a completely unsustainable position. It is indefensible for us in Parliament to let it pass and it is completely unsustainable for the Government. Some action must be taken.
Lord Russell of Liverpool (CB)

My Lords, I will speak to my Amendments 281 to 281B. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to them. I will deal first with Amendments 281 and 281B, then move to 281A.

On Amendments 281 and 281B, the Minister will recall that in Committee we had a discussion around how functionality is defined in the Bill and that a great deal of the child risk assessments and safety duties must have regard to functionality, as defined in Clause 208. However, as it is currently written, this clause appears to separate out functionalities of user-to-user services and search services. These two amendments are designed to adjust that slightly, to future-proof the Bill.

Why is this necessary? First, it reflects that it is likely that in the future, many of the functionalities that we currently see on user-to-user services will become present on search services and possibly vice versa. Therefore, we need to try to take account of how the world is likely to move. Secondly, this is already happening, and it poses a risk to children. Some research done by the 5Rights Foundation has found that “predictive search”, counted in the Bill as a search service functionality, is present on social media websites, leading one child user using a search bar to be presented in nanoseconds with prompts associated with eating disorders. In Committee, the Minister noted that the functionalities listed in this clause are non-exhaustive. At the very least, it would be helpful to clarify this in the Bill language.

Amendment 281A would add specific functionalities which we know are addictive or harmful to children and put them in the Bill. We have a great deal of research and evidence which demonstrates how persuasive certain design strategies are with children. These are features which are solely designed to keep users on the platform, at any cost, as much as possible and for as long as possible. The more that children are on the platform, the more harm they are likely to suffer. Given that the purpose of this Bill is for services to be safe by design, as set out usefully in Amendment 1, please can we make sure that where we know—and we do know—that risk exists, we are doing our utmost to tackle it?

The features that are listed in this amendment are known as “dark patterns”—and they are known as “dark patterns” for a very good reason. They have persuasive and pervasive design features which are deliberately baked into the design of the digital services and products, to capture and hold, in this case, children’s attention, and to create habitual, even compulsive behaviours. The damage this does to children is proven and palpable. For example, one of the features mentioned is infinite scroll, which is now ubiquitous on most major social media platforms. The inventor of infinite scroll, a certain Aza Raskin, who probably thought it was a brilliant idea at the time, has said publicly that he now deeply regrets ever introducing it, because of the effect it is having on children.

One of the young people who spoke to the researchers at 5Rights said of the struggle they have daily with the infinite scroll feature:

“Scrolling forever gives me a sick feeling in my stomach. I’m so aware of how little control I have and the feeling of needing to be online is overwhelming and consuming”.


Features designed to keep users—adults, maybe fine, but children not fine—online at any cost are taking a real toll. Managing public and frequent interactions online, which the features encourage, creates the most enormous pressures for young people, and with that comes anxiety, low self-esteem and mental health challenges. This is only increasing, and unless we are very specific about these, they are going to continue.

We have the evidence. We know what poses harm and risk to children. Please can we make sure that this is reflected accurately in the Bill?

Baroness Kidron (CB)

My Lords, I rise briefly to support many of the amendments in this group. I will start with Amendments 281, 281A and 281B in the name of my noble friend Lord Russell, to which I have added my name. The noble Lord set out the case very well. I will not reiterate what he said, but it is simply the case that the features and functionalities of regulated companies should not be separated by search and user-to-user but should apply across any regulated company that has that feature. There is no need to worry about a company that does not have one of the features on the list, but it is far more dangerous for a feature to be missing from the list than it is to have a single list and to hold companies responsible for whichever of those features they have.

Only this morning, Meta released Threads as its challenger to Twitter. In the last month, Snapchat added generative AI to its offering. Instagram now does video, and TikTok does shopping. All these companies are moving into a place where they would like to be the one that does everything. That is their commercial endgame, and that is where the Bill should set its sights.

Separating out functionality and, as the noble Lord, Lord Russell, said, failing to add what we already know, puts the Bill in danger of looking very old before the ink is dry. I believe it unnecessarily curtails Ofcom in being able to approach the companies for what they are doing, rather than for what the Bill thought they might be doing at this point. So, if the Minister is not in a position to agree to the amendment, I urge him at least to take it away and have a look at it, because it is a technical rather than an ideological matter. It would be wonderful to fix it.

15:30
I support the other amendments in the group: Amendments 28 and 29, and the very interesting Amendment 30. We come back to a very similar issue, which is about design. The thing about Wikipedia is that it does not stand at the doorway and grab your attention, and it does not follow you for six months after you visit it. It does not have that algorithmic push. So, although I freely admit that there are some unsavoury things on Wikipedia, it does not push them at you or at young people. That is a really interesting thing for us to hold in mind when we talk about the next group of amendments on harm.
I am bound to say that, although the noble Lord, Lord Allan, might prefer his amendment and the noble Lord, Lord Moylan, might prefer his, I prefer Amendment 245 in the name of the noble Baroness, Lady Morgan, which says that all services should be judged according to risk. This would stop this endless game of taking things out and putting things in, in case they behave badly, or taking things out for companies that we recognise now although we do not know what the companies of the future will be. We all have to remember that, even when we had the pre-legislative committee, we were not talking about large language models and when we started this Bill we were not talking about TikTok. Making laws for individual services is not a grand idea, but saying that it is not the size but the risk that should determine the category of a regulated service, and therefore its duties, seems a comprehensive way of getting to the same place.
Baroness Fox of Buckley (Non-Afl)

My Lords, there is a danger of unanimity breaking out. The noble Lord, Lord Moylan, and I are not always on the same page as others, but this is just straightforward. I hope the Government listen to the fact that, even though we might be coming at this in different ways, there is concern on all sides.

I also note that this is a shift from what happened in Committee, when I tabled an amendment to try to pose the same dilemmas by talking about the size of organisations. Many a noble Lord said that size did not matter and that that did not work—but it was trying to get at the same thing. I do feel rather guilty that, to move the core philosophy forward, I have dumped the small and micro start-ups and SMEs that I also wanted to protect from overregulation—that is what has happened in this amendment—but now it seems an absolute no-brainer that we should find a way to exempt public interest organisations. This is where I would go slightly further. We should have a general exemption for public interest organisations, but with the ability for Ofcom to come down hard if they look as though they have moved from being low risk to being a threat.

As the noble Lord, Lord Moylan, noted, public interest exemptions happen throughout the world. Although I do not want to waste time reading things out, it is important to look at the wording of Amendment 29. As it says, we are talking about:

“historical, academic, artistic, educational, encyclopaedic, journalistic, or statistical content”.

We are talking about the kind of online communities that benefit the public interest. We are talking about charities, user-curated scientific publications and encyclopaedias. They are surely not what this Bill was designed to thwart. However, there is a serious danger that, if we put on them the number of regulatory demands in the Bill, they will not survive. That is not what the Government intend but it is what will happen.

Dealing with the Bill’s complexity will take much time and money for organisations that do not have it. I run a small free-speech organisation called the Academy of Ideas and declare my interest in it. I am also on the board of the Free Speech Union. When you have to spend so much time on regulatory issues it costs money and you will go under. That is important. This could waste Ofcom’s time. The noble Lord, Lord Allan of Hallam, has explained that. It would prevent Ofcom concentrating on the nasty bits that we want it to. It would be wasting its time trying to deal with what is likely to happen.

I should mention a couple of other things. It is important to note that there is sometimes controversy over the definition of a public interest organisation. It is not beyond our ken to sort it out. I Googled it—it is still allowed—and came up with a Wikipedia page that still exists. That is always good. If one looks, the term “public interest” is used across a range of laws. The Government know what kind of organisations they are talking about. The term has not just been made up for the purpose of an exemption.

It is also worth noting that no one is talking about public interest projects and organisations not being regulated at all but this is about an exemption from this regulation. They still have to deal with UK defamation, data protection, charity, counterterrorism and pornography laws, and the common law. Those organisations’ missions and founding articles will require that they do some good in the world. That is what they are all about. The Government should take this matter seriously.

Finally, on the rescue clauses, it is important to note—there is a reference to the Gambling Act—the Bill states that if there is a problem, Ofcom should intervene. That was taken from what happens under the Gambling Act, which allows UK authorities to strip one or more gambling businesses of their licensing exemptions when they step out of line. No one is trying to say do not look at those exemptions at all but they obviously should not be in the scope of the Bill. I hope that when we get to the next stage, the Government will, on this matter at least, accept the amendment.

Baroness Harding of Winscombe (Con)

My Lords, I also speak in support of Amendments 281, 281A and 281B, to which I have added my name, tabled by the noble Lord, Lord Russell. He and, as ever, the noble Baroness, Lady Kidron, have spoken eloquently. I am not going to spend much time on these amendments, but I wanted to emphasise Amendment 281A.

In the old world of direct marketing—I am old enough to remember that when I was a marketing director it was about sending magazines, leaflets and letters—one spent all of one’s time working out how to build loyalty: how to get people to engage longer as a result of one’s marketing communication. In the modern digital world, that dwell time has been transformed into a whole behavioural science of its own. It has developed a whole set of tools. Today, we have been using the word “activity” at the beginning of the Bill in the new Clause 1 but also “features” and “functionality”. The reason why Amendment 281A is important is that there is a danger that the Bill keeps returning to being just about content. Even in Clause 208 on functionality, almost every item in subsection (2) mentions content, whereas Amendment 281A tries to spell out the elements of addiction-driving functionality that we know exist today.

I am certain that brilliant people will invent some more but we know that these ones exist today. I really think that we need to put them in the Bill to help everyone understand what we mean because we have spent days on this Bill—some of us have spent years, if not decades, on this issue—yet we still keep getting trapped in going straight back to content. That is another reason why I think it is so important that we get some of these functionalities in the Bill. I very much hope that, if he cannot accept the amendment today, my noble friend the Minister will go back, reflect and work out how we could capture these specific functionalities before it is too late.

I speak briefly on Amendments 28 to 30. There is unanimity of desire here to make sure that organisations such as Wikipedia and OpenStreetMap are not captured. Personally, I am very taken—as I often am—by the approach of the noble Baroness, Lady Kidron. We need to focus on risk rather than using individual examples, however admirable they are today. If Wikipedia chose to put on some form of auto-scroll, the risk of that service would go up; I am not suggesting that Wikipedia is going to do so today but, in the digital world, we should not assume that, just because organisations are charities or devoted to the public good, they cannot inadvertently cause harm. We do not make that assumption in the physical world either. Charities that put on physical events have to do physical risk assessments. I absolutely think that we should hold all organisations to that same standard. However, viewed through the prism of risk, Wikipedia—brilliant as it is—does not pose a risk to child safety and therefore should not be captured by the Bill.

The Lord Bishop of Oxford

My Lords, I broadly support all the amendments in this group but I will focus on the three amendments in the names of the noble Lord, Lord Russell, and others; I am grateful for their clear exposition of why these amendments are important. I draw particular attention to Amendment 281A and its helpful list of functions that are considered to be harmful and to encourage addiction.

There is a very important dimension to this Bill, whose object, as we have now established, is to encourage safety by design. An important aspect of it is cleaning up, and setting right, 20 years or more of tech development that has not been safe by design and has in fact been found to be harmful by way of design. As the noble Baroness, Lady Harding, just said, in many conversations and in talking to people about the Bill, one of the hardest things to communicate and get across is that this is about not only content but functionality. Amendment 281A provides a useful summary of the things that we know about in terms of the functions that cause harm. I add my voice to those encouraging the Minister and the Government to take careful note of it and to capture this list in the text of the Bill in some way so that this clean-up operation can be about not only content for the future but functionality and can underline the objectives that we have set for the Bill this afternoon.

Baroness Stowell of Beeston (Con)

My Lords, I start by saying amen—not to the right reverend Prelate but to my noble friend Lady Harding. She said that we should not assume that, just because charities exist, they are all doing good; as a former chair of the Charity Commission, I can say that that is very true.

The sponsors of Amendments 281 to 281B have made some powerful arguments in support of them. They are not why I decided to speak briefly on this group but, none the less, they made some strong points.

I come back to Amendments 28 to 30. Like others, I do not have a particular preference for which of the solutions is proposed to address this problem but I have been very much persuaded by the various correspondence that I have received—I am sure that other noble Lords have received such correspondence—which often uses Wikipedia as the example to illustrate the problem.

However, I take on board what my noble friend said: there is a danger of identifying one organisation and getting so constrained by it that we do not address the fundamental problems that the Bill is about, which is making sure that there is a way of appropriately excluding organisations that should not be subject to these various regulations because they are not designed for them. I am open to the best way of doing that.

15:45
Lord Clement-Jones (LD)

My Lords, this has been a very interesting debate, as it is a real contrast. We have one set of amendments which say that the net is too wide and another which says that the net is not wide enough, and I agree with both of them. After all, we are trying to fine-tune the Bill to get it to deal with the proper risks—the word “risk” has come up quite a lot in this debate—that it should. Whether or not we make a specific exemption for public interest services, public information services, limited functionality services or non-commercial services, we need to find some way to deal with the issue raised by my noble friend and the noble Lord, Lord Moylan, in their amendments. All of us are Wikipedia users; we all value the service. I particularly appreciated what was said by the noble Baroness, Lady Kidron: Wikipedia does not push its content at us—it is not algorithmically based.

What the noble Lord, Lord Russell, said, resonated with me, because I think he has found a thundering great hole in the Bill. This infinite scrolling and autoplay is where the addiction of so much of social media lies, and the Bill absolutely needs systemically and functionally to deal with it. So, on the one hand, we have a service which does not rely on that infinite scrolling and algorithmic type of pushing of content and, on the other hand, we are trying to identify services which have that quality.

I very much hope the Minister is taking all this on board, because on each side we have identified real issues. Whether or not, when we come to the light at the end of the tunnel of Amendment 245 from the noble Baroness, Lady Morgan, it will solve all our problems, I do not know. All I can say is that I very much hope that the Minister will consider both sets of amendments and find a way through this that is satisfactory to all sides.

Lord Stevenson of Balmacara (Lab)

My Lords, much like the noble Lord, Lord Clement-Jones, I started off being quite certain I knew what to say about these amendments. I even had some notes—unusual for me, I know—but I had to throw them away, which I always do with my notes, because the arguments have been persuasive. That is exactly why we are here in Parliament discussing things: to try to reach common solutions to difficult problems.

We started with a challenge to the Minister to answer questions about scope, exemptions and discretion in relation to a named service—Wikipedia. However, as the debate went on, we came across the uncomfortable feeling that, having got so far into the Bill and agreed a lot of amendments today improving it, we are still coming up against quite stubborn issues that do not fit neatly into the categorisation and structures that we have. We do not seem to have the right tools to answer the difficult questions before us today, let alone the myriad questions that will come up as the technology advances and new services come in. Why have we not already got solutions to the problems raised by Amendments 281, 281A and 281B?

There is also the rather difficult idea we have from the noble Lord, Lord Russell, of dark patterns, which we need to filter into our thinking. Why does that not fit into what we have got? Why is it that we are still worried about Wikipedia, a service for public good, which clearly has risks in it and is sometimes capable of making terrible mistakes but is definitely a good thing that should not be threatened by having to conform with a structure and a system which we think is capable of dealing with some of the biggest and most egregious companies that are pushing stuff at us in the way that we have talked about?

I have a series of questions which I do not have the answers to. I am looking forward to the Minister riding to my aid on a white charger of enormous proportions and great skill which will take us out without having to fall over any fences.

If I may, I suggest to the Minister a couple of things. First, we are stuck on the word “content”. We will come back to that in the future, as we still have an outstanding problem about exactly where the Bill sets it. Time and again in discussions with the Bill team and with Ministers we have been led back to the question of where the content problem lies and where the harms relate to that, but this little debate has shown beyond doubt that harm can occur independent of and separate from content. We must have a solution to that, and I hope it will be quick.

Secondly, when approaching anybody or anything or any business or any charity that is being considered in scope for this Bill, we will not get there if we are looking only at the question of its size and its reach. We have to look at the risks it causes, and we have to drill down hard into what risks we are trying to deal with using our armoury as we approach these companies, because that is what matters to the children, vulnerable people and adults who would suffer otherwise, and not the question of whether or not these companies are big or small. I think there are solutions to that and we will get there, but, when he comes to respond, the Minister needs to demonstrate to us that he is still willing to listen and think again about one or two issues. I look forward to further discussions with him.

Lord Parkinson of Whitley Bay (Con)

I am grateful to noble Lords for their contributions during this debate. I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, and particularly that the Bill should not inhibit services from providing valuable information which is of benefit to the public. However, I want to be clear that that is why the Bill has been designed in the way that it has. It has a broad scope in order to capture a range of services, but it has exemptions and categorisations built into it. The alternative would be a narrow scope, which would be more likely inadvertently to exempt risky sites or to displace harm on to services which we would find are out of scope of the Bill. I will disappoint noble Lords by saying that I cannot accept their amendments in this group but will seek to address the concerns that they have raised through them.

The noble Lord, Lord Allan, helpfully asked me three questions at the outset, to which the answers are yes, no and maybe. Yes, Wikipedia and OpenStreetMap will be in scope of the Bill because they allow users to interact online; no, we do not believe that they would fall under any of the current exemptions in the Bill; and the maybe is that Ofcom does not have the discretion to exempt services but the Secretary of State can create additional exemptions for further categories of services if she sees fit.

I must also say maybe to my noble friend Lord Moylan on his point about Wikipedia—and with good reason. Wikipedia, as I have just explained, is in scope of the Bill and is not subject to any of its exemptions. I cannot say how it will be categorised, because that is based on an assessment made by the independent regulator, but I reassure my noble friend that it is not the regulator but the Secretary of State who will set the categorisation thresholds through secondary legislation; that is to say, a member of the democratically elected Government, accountable to Parliament, through legislation laid before that Parliament. It will then be for Ofcom to designate services based on whether or not they meet those thresholds.

It would be wrong—indeed, nigh on impossible—for me to second-guess that designation process from the Dispatch Box. In many cases it is inherently a complex and nuanced matter since, as my noble friend Lady Harding said, many services change over time. We want to keep the Bill’s provisions flexible as services change what they do and new services are invented.

Lord Parkinson of Whitley Bay (Con)

I would just like to finish my thought on Wikipedia. Noble Lords are right to mention it and to highlight the great work that it does. My honourable friend the Minister for Technology and the Digital Economy, Paul Scully, met Wikipedia yesterday to discuss its concerns about the Bill. He explained that the requirements for platforms in this legislation will be proportionate to the risk of harm, and that as such we do not expect the requirements for Wikipedia to be unduly burdensome.

Lord Allan of Hallam (LD)

I am computing the various pieces of information that have just been given, and I hope the Minister can clarify whether I have understood them correctly. These services will be in scope as user-to-user services and do not have an exemption, as he said. The Secretary of State will write a piece of secondary legislation that will say, “This will make you a category 1 service”—or a category 2 or 2B service—but, within that, there could be text that has the effect that Wikipedia is in none of those categories. So it and services like it could be entirely exempt from the framework by virtue of that secondary legislation. Is that a correct interpretation of what he said?

Lord Parkinson of Whitley Bay (Con)

The Secretary of State could create further exemptions but would have to bring those before Parliament for it to scrutinise. That is why there is a “maybe” in answer to his third question in relation to any service. It is important for the legislation to be future-proofed that the Secretary of State has the power to bring further categorisations before Parliament for it to discuss and scrutinise.

Lord Allan of Hallam (LD)

My Lords, I will keep pressing this point because it is quite important, particularly in the context of the point made by the noble Baroness, Lady Kidron, about categorisation, which we will debate later. There is a big difference when it comes to Schedule 11, which defines the categorisation scheme: whether in the normal run of business we might create an exemption in the categorisation secondary legislation, or whether it would be the Secretary of State coming back with one of those exceptional powers that the Minister knows we do not like. He could almost be making a case for why the Secretary of State has to have these exceptional powers. We would be much less comfortable with that than if the Schedule 11 categorisation piece effectively allowed another class to be created, rather than it being an exceptional Secretary of State power.

Baroness Fox of Buckley (Non-Afl)

To follow on from that, we are talking about the obligation to bring exemptions to Parliament. Well, we are in Parliament and we are bringing exemptions. The noble Lord is recommending that we bring very specific exemptions while those that the noble Lord, Lord Moylan, and I have been recommending may be rather broad—but I thought we were bringing exemptions to Parliament. I am not being facetious. The point I am making is, “Why can’t we do it now?” We are here now, doing this. We are saying, as Parliament, “Look at these exemptions”. Can the Minister not look at them now instead of saying that we will look at them some other time?

Baroness Kidron (CB)

I may as well intervene now as well, so that the Minister can get a good run at this. I too am concerned at the answer that has been given. I can see the headline now, “Online Safety Bill Age-Gates Wikipedia”. I cannot see how it does not, by virtue of some of the material that can be found on Wikipedia. We are trying to say that there are some services that are inherently in a child’s best interests—or that are in their best interests according to their evolving capacity, if we had been allowed to put children’s rights into the Bill. I am concerned that that is the outcome of the answer to the noble Lord, Lord Allan.

16:00
Lord Parkinson of Whitley Bay (Con)

I do not think that it is, but it will be helpful to have a debate on categorisation later on Report, when we reach Amendment 245, to probe this further. It is not possible for me to say that a particular service will certainly be categorised one way or another, because that would give it carte blanche and we do not know how it may change in the future—estimable though I may think it is at present. That is the difficulty of setting the precise parameters that the noble Baroness, Lady Fox, sought in her contribution. We are setting broad parameters, with exemptions and categorisations, so that the burdens are not unduly heavy on services which do not cause us concern, and with the proviso for the Secretary of State to bring further exemptions before Parliament, as circumstances strike her as fit, for Parliament to continue the debate we are having now.

The noble Baroness, Lady Kidron, in her earlier speech, asked about the functionalities of user-to-user services. The definitions of user-to-user services are broad and flexible, to capture new and changing services. If a service has both user-to-user functionality and a search engine, it will be considered a combined service, with respective duties for the user-to-user services which form part of its service and search duties in relation to the search engine.

I reassure my noble friend Lady Harding of Winscombe that the Bill will not impose a disproportionate burden on services, nor will it impede the public’s access to valuable content. All duties on services are proportionate to the risk of harm and, crucially, to the capacity of companies. The Bill’s proportionate design means that low-risk services will have to put in place only measures which reflect the risk of harm to their users. Ofcom’s guidance and codes of practice will clearly set out how these services can comply with their duties. We expect that it will set out a range of measures and steps for different types of services.

Moreover, the Bill already provides for wholesale exemptions for low-risk services and for Ofcom to exempt in-scope services from requirements such as record-keeping. That will ensure that there are no undue burdens to such services. I am grateful for my noble friend’s recognition, echoed by my noble friend Lady Stowell of Beeston, that “non-profit” does not mean “not harmful” and that there can be non-commercial services which may pose harms to users. That is why it is important that there is discretion for proper assessment.

Amendment 30 seeks to allow Ofcom to withdraw the exemptions listed in Schedule 1 from the Bill. I am very grateful to my noble friend Lord Moylan for his time earlier this week to discuss his amendment and others. We have looked at it, as I promised we would, but I am afraid that we do not think that it would be appropriate for Ofcom to have this considerable power—my noble friend is already concerned that the regulator has too much.

The Bill recognises that it may be necessary to remove certain exemptions if there is an increased risk of harm from particular types of services. That is why the Bill gives the Secretary of State the power to remove particular exemptions, such as those related to services which have limited user-to-user functionality and those which offer one-to-one live aural communications. These types of services have been carefully selected as areas where future changes in user behaviour could necessitate the repeal or amendment of an exemption in Schedule 1. This power is intentionally limited to only these types of services, meaning that the Secretary of State will not be able to remove exemptions for comments on recognised news publishers’ sites. That is in recognition of the Government’s commitment to media freedom and public debate. It would not be right for Ofcom to have the power to repeal those exemptions.

Amendments 281 and 281B, in the name of the noble Lord, Lord Russell of Liverpool, are designed to ensure that the lists of features under the definition of “functionality” in the Bill apply to all regulated services. Amendment 281A aims to add additional examples of potentially addictive functionalities to the Bill’s existing list of features which constitute a “functionality”. I reassure him and other noble Lords that the list of functionalities in the Bill is non-exhaustive. There may be other functionalities which could cause harm to users and which services will need to consider as part of their risk assessment duties. For example, if a provider’s risk assessment identifies that there are functionalities which risk causing significant harm to an appreciable number of children on its service, the Bill will require the provider to put in place measures to mitigate and manage that risk.

He and other noble Lords spoke about the need for safety by design. I can reassure them that this is already built into the framework of the Bill, which recognises how functionalities, including many of those mentioned today, can increase the risk of harm to users, and which will encourage the safe design of platforms.

Amendments 281 and 281B have the effect that regulated services would need to consider the risk of harm of functionalities that are not relevant for their kind of service. For example, sharing content with other users is a functionality of user-to-user services, which is not as relevant for search services. The Bill already outlines specific features that both user-to-user and search services should consider, which are the most relevant functionalities for those types of service. Considering these functionalities would create an unnecessary burden for regulated services which would detract from where their efforts can best be focused. That is why I am afraid I cannot accept the amendments that have been tabled.

Lord Clement-Jones (LD)

My Lords, surely it is the role of the regulators to look at functionalities of this kind. The Minister seemed to be saying that it would be an undue burden on the regulator. Is not that exactly what we are meant to be legislating about at this point?

Lord Parkinson of Whitley Bay (Con)

Perhaps I was not as clear as I could or should have been. The regulator will set out in guidance the duties that fall on the businesses. We do not want the burden on the business to be unduly heavy, but there is an important role for Ofcom here. I will perhaps check—

Lord Clement-Jones (LD)

But these functionalities are a part of their business model, are they not?

Lord Parkinson of Whitley Bay (Con)

Hence Ofcom will make the assessments about categorisation based on that. Maybe I am missing the noble Lord’s point.

Lord Clement-Jones (LD)

I think we may need further discussions on the amendment from the noble Lord, Lord Russell.

Lord Parkinson of Whitley Bay (Con)

I will check what I said but I hope that I have set out why we have taken the approach that we have with the broad scope and the exemptions and categorisations that are contained in it. With that, I urge the noble Lord to withdraw his amendment.

Lord Allan of Hallam (LD)

My Lords, that was a very useful debate. I appreciate the Minister’s response and his “yes, no, maybe” succinctness, but I think he has left us all more worried than when the debate started. My noble friend Lord Clement-Jones tied it together nicely. What we want is for the regulator to be focused on the greatest areas of citizen risk. If there are risks that are missing, or things that we will be asking the regulator to do that are a complete waste of time because they are low risk, then we have a problem. We highlighted both those areas. The noble Lord, Lord Russell, rightly highlighted that we are not content with just “content” as the primary focus of the legislation; it is about a lot more than content. In my amendment and those by the noble Lord, Lord Moylan, we are extremely worried—and remain so—that the Bill creates a framework that will trap Wikipedia and services like it, without that being our primary intention. We certainly will come back to this in later groups; I will not seek to press the amendment now, because there is a lot we all need to digest. However, at the end of this process, we want to get to a point where the regulator is focused on things that are high risk to the citizen and not wasting time on services that are very low risk. With that, I beg leave to withdraw my amendment.

Amendment 28 withdrawn.
Amendment 29 not moved.
Amendment 30 not moved.
Clause 5: Overview of Part 3
Amendment 31
Moved by
31: Clause 5, page 4, line 40, leave out “section 54” and insert “sections 54 to (“Priority content that is harmful to children”)”
Member’s explanatory statement
This amendment is consequential on the new Clauses proposed to be inserted after Clause 54 in my name setting out which kinds of content count as primary priority content and priority content harmful to children.
Lord Parkinson of Whitley Bay (Con)

My Lords, the government amendments in this group relate to the categories of primary priority and priority content that is harmful to children.

Children must be protected from the most harmful online content and activity. As I set out in Committee, the Government have listened to concerns about designating primary priority and priority categories of content in secondary legislation and the need to protect children from harm as swiftly as possible. We have therefore tabled amendments to set out these categories in the Bill. I am grateful for the input from across your Lordships’ House in finalising the scope of these categories.

While it is important to be clear about the kinds of content that pose a risk of harm to children, I acknowledge what many noble Lords raised during our debates in Committee, which is that protecting children from online harm is not just about content. That is why the legislation takes a systems and processes approach to tackling the risk of harm. User-to-user and search service providers will have to undertake comprehensive, mandatory risk assessments of their services and consider how factors such as the design and operation of a service and its features and functionalities may increase the risk of harm to children. Providers must then put in place measures to manage and mitigate these risks, as well as systems and processes to prevent and protect children from encountering the categories of harmful content.

We have also listened to concerns about cumulative harm. In response to this, the Government have tabled amendments to Clause 209 to make it explicit that cumulative harm is addressed. This includes cumulative harm that results from algorithms bombarding a user with content, or where combinations of functionality cumulatively drive up the risk of harm. These amendments will be considered in more detail under a later group of amendments, but they are important context for this discussion.

I turn to the government amendments, starting with Amendment 171, which designates four categories of primary priority content. First, pornographic content has been defined in the same way as in Part 5—to give consistent and comprehensive protection for children, regardless of the type of service on which the pornographic content appears. The other three categories capture content which encourages, promotes or provides instructions for suicide, self-harm or eating disorders. This will cover, for example, glamorising or detailing methods for carrying out these dangerous activities. Designating these as primary priority content will ensure that the most stringent child safety duties apply.

Government Amendment 172 designates six categories of priority content. Providers will be required to protect children from encountering a wide range of harmful violent content, which includes depictions of serious acts of violence or graphic injury against a person or animal, and the encouragement and promotion of serious violence, such as content glamorising violent acts. Providers will also be required to protect children from encountering abusive and hateful content, such as legal forms of racism and homophobia, and bullying content, which sadly many children experience online.

The Government have heard concerns from the noble Baronesses, Lady Kidron and Lady Finlay of Llandaff, about extremely dangerous activities being pushed to children as stunts, and content that can be harmful to the health of children, including inaccurate health advice and false narratives. As such, we are designating content that encourages dangerous stunts and challenges as a category of priority content, and content which encourages the ingestion or inhalation of, or exposure to, harmful substances, such as harmful abortion methods designed to be taken by a person without medical supervision.

Amendment 174, from the noble Baroness, Lady Kidron, seeks to add “mis- and disinformation” and “sexualised content” to the list of priority content. On the first of these, I reiterate what I said in Committee, which is that the Bill will protect children from harmful misinformation and disinformation where it intersects with named categories of primary priority or priority harmful content—for example, an online challenge which is promoted to children on the basis of misinformation or disinformation, or abusive content with a foundation in misinformation or disinformation. However, I did not commit to misinformation and disinformation forming its own stand-alone category of priority harmful content, which could be largely duplicative of the categories that we have already included in the Bill and risks capturing a broad range of legitimate content.

We have already addressed key concerns related to misinformation and disinformation content which presents the greatest risk to children by including content which encourages the ingestion or inhalation of, or exposure to, harmful substances to the list of priority categories. However, the term “mis- and disinformation”, as proposed by Amendment 174, in its breadth and subjectivity risks inadvertently capturing a wide range of content resulting in disproportionate, excessive censorship of the content children see online, including in areas of legitimate debate. The harm arising from misinformation or disinformation usually arises from the context or purpose of the content, rather than the mere fact that it is untrue. Our balanced approach ensures that children are protected from the most prevalent and concerning harms associated with misinformation and disinformation.

16:15
I turn to sexualised and adult content. Again, we must tread carefully here. What might constitute this is subjective and presents challenges for both providers and Ofcom to interpret. It is important that what constitutes priority content is sufficiently well defined so it is clear both to providers and to Ofcom what their obligations under the Bill are. Amendment 174 sets an extremely broad scope and gives rise to a risk of censorship if providers take an excessively broad interpretation of what is sexualised and adult content. It is important that we safeguard children’s freedom of expression through the Bill and do not inadvertently limit their access to innocuous and potentially helpful content.

The duties are, of course, not limited to the content in the Government’s amendments. The Bill requires providers to identify and act on any “non-designated content” which meets the Bill’s threshold of

“content that is harmful to children”,

even where it has not been designated as primary priority or priority content. Therefore, I hope the noble Baroness will understand why we cannot accept Amendment 174.

Amendment 237 in my name introduces a delegated power to update and amend these lists. This is essential for ensuring that the legislation remains flexible to change and that new and emerging risks of harm can be captured swiftly. Amendment 238, also in my name, ensures that the draft affirmative procedure will apply except in cases where there is an urgent need to update the lists, when the made affirmative procedure can be used. This ensures that Parliament will retain the appropriate degree of oversight over any changes. I beg to move.
Baroness Ritchie of Downpatrick (Lab)

My Lords, we spent a lot of time in Committee raising concerns about how pornography and age verification were going to operate across all parts of the Bill. I have heard what the Minister has said in relation to this group, priority harms to children, which I believe is one of the most important groups under discussion in the Bill. I agree that children must be protected from the most harmful content online and offline.

I am grateful to the Government for having listened carefully to the arguments put forward by the House in this regard and commend the Minister for all the work he and his team have done since then. I also commend the noble Lord, Lord Bethell. He and I have been in some discussion between Committee and now in relation to these amendments.

In Committee, I argued for several changes to the Bill which span three groups of amendments. One of my concerns was that pornography should be named as a harm in the Bill. I welcome the Government’s Amendment 171, which names pornography as primary priority content. I also support Amendment 174 in the name of the noble Baroness, Lady Kidron. She is absolutely right that sexualised content can be harmful to children if not age appropriate and, in that regard, before she even speaks, I ask the Minister to reconsider his views on this amendment and to accept it.

Within this group are the amendments which move the definition of “pornographic content” from Part 5 to Clause 211. In that context, I welcome the Government’s announcement on Monday about a review of the regulation, legislation and enforcement of pornography offences.

In Committee, your Lordships were very clear that there needed to be a consistent approach across the Bill to the regulation of pornography. I am in agreement with the amendments tabled in Committee to ensure that consistency applies across all media. In this regard, I thank the noble Baroness, Lady Benjamin, for her persistence in raising this issue. I also thank my colleagues on the Opposition Front Bench, the noble Lord, Lord Stevenson, and the noble Baroness, Lady Merron.

I appreciate that the Government made this announcement only three days ago, but I hope the Minister will set out a timetable for publishing the terms of reference and details of how this review will take place. The review is too important to disappear into the long grass over the Summer Recess, never to be heard of again, so if he is unable to answer my question today, will he commit to writing to your Lordships with the timeframe before the House rises for the summer? Will he consider the active involvement of external groups in this review, as much expertise lies outside government in this area? In that regard, I commend CARE, CEASE and Barnardo’s for all their input into the debates on the Bill.

Lord Parkinson of Whitley Bay (Con)

My Lords, I think the noble Baroness’s comments relate to the next group of amendments, on pornography. She might have skipped ahead, but I am grateful for the additional thinking time to respond to her questions. Perhaps she will save the rest of her remarks for that group.

Baroness Ritchie of Downpatrick (Lab)

I thank the Minister for that. In conclusion, I hope he will reflect on those issues and come back, maybe at the end of the next group. I remind the House that in February the APPG on Commercial Sexual Exploitation, in its inquiry on pornography, recommended that the regulation of pornography should be consistent across all online platforms and between the online and offline spheres. I hope we can incorporate the voices I have already mentioned in the NGO sphere in order to assist the Government and both Houses in ensuring that we regulate the online platforms and that children are protected from any harms that may arise.

Baroness Kidron (CB)

My Lords, I shall speak briefly to Amendment 174 in my name and then more broadly to this group—I note that the Minister got his defence in early.

On the question of misinformation and disinformation, I recognise what he said and I suppose that, in my delight at hearing the words “misinformation and disinformation”, I misunderstood to some degree what he was offering at the Dispatch Box, but I make the point that this poses an enormous risk to children. As an example, children are the fastest-growing group of far-right believers/activists online, and there are many areas in which we are going to see an exponential growth in misinformation and disinformation as large language models become the norm. So I ask him, in a tentative manner, to look at that.

On the other issue, I have to push back at the Minister’s explanation. Content classification around sexual content is a well-established norm. The BBFC does it and has done it for a very long time. There is an absolute understanding that what is suitable for a U, a PG, a 12 or a 12A are different things, and that as children’s capacities evolve, as they get older, there are things that are more suitable for older children, including, indeed, stronger portrayals of sexual behaviour as the age category rises. So I cannot accept that this opens a new can of worms: this is something that we have been doing for many, many years.

I think it is a bit wrongheaded to imagine that if we “solve” the porn problem, we have solved the problem—because there is still sexualisation and the commercialisation of sex. Now, if you say something about feet to a child, they start to giggle uproariously because, in internet language, you get paid for taking pictures of feet and giving them to strange people. There are such detailed and different areas that companies should be looking at. This amendment in my name and the names of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, should be taken very seriously. It is not new ground, so I would ask the Minister to reconsider it.

More broadly, the Minister will have noticed that I liberally added my name to the amendments he has brought forward to meet some of the issues we raised in Committee, and I have not added my name to the schedule of harms. I want to be nuanced about this and say I am grateful to the Government for putting them in the Bill, I am grateful that the content harms have been discussed in this Chamber and not left for secondary legislation, and I am grateful for all the conversations around this. However, harm cannot be defined only as content, and the last grouping got to the core of the issue in the House. Even when the Minister was setting out this amendment, he acknowledged that the increase in harm to users may be systemic and by design. In his explanation, he used the word “harm”; in the Bill, it always manifests as “harmful content”.

While the systemic risk of increasing the presence of harmful content is consistently within the Bill, which is excellent, the concept that the design of service may in and of itself be harmful is absent. In failing to do that, the Government, and therefore the Bill, have missed the bull’s-eye. The bull’s-eye is what is particular about this method of communication that creates harm—and what is particular are the features, functionalities and design. I draw noble Lords back to the debate about Wikipedia. It is not that we all love Wikipedia adoringly; it is that it does not pursue a system of design for commercial purposes that entraps people within its grasp. Those are the harms we are trying to get at. I am grateful for the conversations I have had, and I look forward to some more. I have laid down some other amendments for Monday and beyond that would, I hope, deal with this—but until that time, I am afraid this is an incomplete picture.

Lord Moylan (Con)

My Lords, I have a comment about Amendment 174 in the name of the noble Baroness, Lady Kidron. I have no objection to the insertion of subsection (9B), but I am concerned about (9A), which deals with misinformation and disinformation. It is far too broad and political, and if we start at this late stage to try to run off into these essentially political categories, we are going to capsize the Bill altogether. So I took some heart from the fact that my noble friend on the Front Bench appeared disinclined to accept at least that limb of the amendment.

I did want to ask briefly some more detailed questions about Amendment 172 and new subsection (2) in particular. This arises from the danger of having clauses added at late stages of the Bill that have not had the benefit of proper discussion and scrutiny in Committee. I think we are all going to recognise the characteristics that are listed in new subsection (2) as mapping on to the Equality Act, which appears to be their source. I note in passing that it refers in that regard to gender reassignment. I would also note that most of the platforms, in their terms and conditions, refer not to gender reassignment but to various other things such as gender identity, which are really very different, or at least different in detail. I would be interested to ask my noble friend how effectively he expects it to be enforced that the words used in English statute are actually applied by what are, essentially, foreign platforms when they are operating for an audience in the United Kingdom—I am going to come back to this in a further amendment later.

16:30
I also want to make a more general point. I have looked up the Equality Act, to make sure I am not mistaken about this. The Equality Act is essentially about discrimination against people as individuals or as groups. In defining direct discrimination, it says:

“A person (A) discriminates against another (B) if, because of a protected characteristic, A treats B less favourably than A treats or would treat others”.

So, there has to be a “B”—there has to be a person—for the Equality Act to be engaged. But this Bill says content is abusive when it

“targets any of the following characteristics”.

“Targets” is an interesting word. It does not say “attacks” and it does not relate to treatment. One can “target” something favourably. One can be positive in targeting something. It does not have to be a negative thing.

On the question of religion, if I, A, treat B adversely because they adhere to a particular religion, I fall foul of the Equality Act. But this appears to cover religion as a phenomenon. So, if I say that I am going to treat somebody badly because they are Jewish, of course I fall foul of the Equality Act. But this appears to say that if I say something adverse and abusive about the Jewish religion without reference to any particular individual, I will fall foul of this clause. I know that sounds like a minor point of detail, but it is actually very significant. I want to hear my noble friend explain how in detail this is going to operate. If I say something adverse or abusive about gender reassignment and disability, that would not necessarily fall foul of the Equality Act, but it would fall foul of the Bill, as far as I can see. Are we creating a new blasphemy offence here, in effect, in relation to religion, as opposed to what the Equality Act does? I would like my noble friend to be able to expand on this. I know this is a Committee stage-type query, but this is our first opportunity to ask these questions.
Baroness Fox of Buckley (Non-Afl)

My Lords, interestingly, because I have not discussed this at all with the noble Lord, Lord Moylan, I have some similar concerns to his. I have always wanted this to be a children’s online safety Bill. My concerns generally have been about threats to adults’ free speech and privacy and the threat to the UK as the home of technological innovation. I have been happy to keep shtum on things about protecting children, but I got quite a shock when I saw the series of government amendments.

I thought what most people in the public think: the Bill will tackle things such as suicide sites and pornography. We have heard some of that very grim description, and I have been completely convinced by people saying, “It’s the systems”. I get all that. But here we have a series of amendments all about content—endless amounts of content and highly politicised, contentious content at that—and an ever-expanding list of harms that we now have to deal with. That makes me very nervous.

On the misinformation and disinformation point, the Minister is right. Whether for children or adults, those terms have been weaponised. They are often used to delegitimise perfectly legitimate if contrary or minority views. I say to the noble Baroness, Lady Kidron, that the studies that say that youth are the fastest-growing far-right group are often misinformation themselves. I was recently reading a report about this phenomenon, and things such as being gender critical or opposing the small boats arriving were considered to be evidence of far-right views. That was not to do with youth, but at least you can see that this is quite a difficult area. I am sure that many people even in here would fit in the far right as defined by groups such as HOPE not hate, whose definition is so broad.

My main concerns are around the Minister’s Amendment 172. There is a problem: because it is about protected characteristics—or apes the protected characteristics of the Equality Act—we might get into difficulty. Can we at least recognise that, even in relation to the protected characteristics as noted in the Equality Act, there are raging rows politically? I do not know how appropriate it is that the Minister has tabled an amendment dragging young people into this mire. Maya Forstater has just won a case in which she was accused of being opposed to somebody’s protected characteristics and sacked. Because of the protected characteristics of her philosophical views, she has won the case and a substantial amount of money.

I worry when I see this kind of list. It is not just inciting hatred—in any case, what that would mean is ambivalent. It refers to abuse based on race, religion, sex, sexual orientation, disability and so on. This is a minefield for the Government to have wandered into. Whether you like it or not, it will have a chilling effect on young people’s ability to debate and discuss. If you worry that some abuse might be aimed at religion, does that mean that you will not be able to discuss Charlie Hebdo? What if you wanted to show or share the Charlie Hebdo cartoons? Will that count? Some people would say that is abusive or inciteful. This is not where the Bill ought to be going. At the very least, it should not be going there at this late stage. Under race, it says that “nationality” is one of the indicators that we should be looking out for. Maybe it is because I live in Wales, but there is a fair amount of abuse aimed at the English. A lot of Scottish friends dole it out as well. Will this count for young people who do that? I cannot get it.

My final question is in relation to proposed subsection (11). This is about protecting children, yet it lists a person who

“has the characteristic of gender reassignment if the person is proposing to undergo, is undergoing or has undergone a process (or part of a process) for the purpose of reassigning the person’s sex by changing physiological or other attributes of sex”.

Are the Government seriously accepting that children have not just proposed to reassign but have been reassigned? That is a breach of the law. That is not meant to be happening. Your Lordships will know how bad this is. Has the Department for Education seen this? As we speak, it is trying to untangle the freedom for people not to have to go along with people’s pronouns and so on.

This late in the day, on something as genuinely important as protecting children, I just want to know whether there is a serious danger that this has wandered into the most contentious areas of political life. I think it is very dangerous for a government amendment to affirm gender reassignment to and about children. It is genuinely irresponsible and goes against the guidance the Government are bringing out at the moment for us to avoid. Please can the Minister clarify what is happening with Amendment 172?

Baroness Harding of Winscombe (Con)

My Lords, I am not entirely sure how to begin, but I will try to make the points I was going to make. First, I would like to respond to a couple of the things said by the noble Baroness, Lady Fox. With the greatest respect, I worry that the noble Baroness has not read the beginning of the proposed new clause in Amendment 172, subsection (2), which talks about “Content which is abusive”, as opposed to content just about race, religion or the other protected characteristics.

One of the basic principles of the Bill is that we want to protect our children in the digital world in the same way that we protect them in the physical world. We do not let our children go to the cinema to watch content as listed in the primary priority and priority content lists in my noble friend the Minister’s amendments. We should not let them in the digital world, yet the reality is that they do, day in and day out.

I thank my noble friend the Minister, not just for the amendments that he has tabled but for the countless hours that he and his team have devoted to discussing this with many of us. I have not put my name to the amendments either because I have some concerns but, given the way the debate has turned, I start by thanking him and expressing my broad support for having the harms in the Bill, the importance of which this debate has demonstrated. We do not want this legislation to take people by surprise. The important thing is that we are discussing some fundamental protections for the most vulnerable in our society, so I thank him for putting those harms in the Bill and for allowing us to have this debate. I fear that it will be a theme not just of today but of the next couple of days on Report.

I started with the positives; I would now like to bring some challenges as well. Amendments 171 and 172 set out priority content and primary priority content. It is clear that they do not cover the other elements of harm: contact harms, conduct harms and commercial harms. In fact, it is explicit that they do not cover the commercial harms, because proposed new subsection (4) in Amendment 237 explicitly says that no amendment can be made to the list of harms that is commercial. Why do we have a perfect crystal ball that means we think that no future commercial harms could be done to our children through user-to-user and search services, such that we are going to expressly make it impossible to add those harms to the Bill? It seems to me that we have completely ignored the commercial piece.

I move on to Amendment 174, which I have put my name to. I am absolutely aghast that the Government really think that age-inappropriate sexualised content does not count as priority content. We are not necessarily talking here about a savvy 17 year-old. We are talking about four, five and six year-olds who are doomscrolling on various social media platforms. That is the real world. To suggest that somehow the digital world is different from the old-fashioned cinema, and a place where we do not want to protect younger children from age-inappropriate sexualised material, just seems plain wrong. I really ask my noble friend the Minister to reconsider that element.

I am also depressed about the discussion that we had about misinformation. As I said in Committee several times, I have two teenage girls. The reality is that we are asking today’s teenagers to try to work out what is truth and what is misinformation. My younger daughter will regularly say, “Is this just something silly on the internet?” She does not use the term “misinformation”; she says, “Is that just unreal, Mum?” She cannot tell about what appears in her social media feeds because of the degree of misinformation. Failing to recognise that misinformation is a harm for young people who do not yet know how to validate sources, which was so much easier for us when we were growing up than it is for today’s generations, is a big glaring gap, even in the content element of the harms.

I support the principle behind these amendments, and I am pleased to see the content harms named. We will come back next week to the conduct and contact harms—the functionality—but I ask my noble friend the Minister to reconsider on both misinformation and inappropriate sexualised material, because we are making a huge mistake by failing to protect our children from them.

16:45
The Lord Bishop of Oxford

My Lords, I too welcome these amendments and thank the Minister and the Government for tabling them. The Bill will be significantly strengthened by Amendment 172 and the related amendments, which put these clearly described harms in the Bill itself. I identify with the comments of others that we also need to look at functionality. I hope we will do that in the coming days.

I also support Amendment 174, to which I added my name. Others have covered proposed new subsection (9B) very well; I add my voice to those encouraging the Minister to give it more careful consideration. I will also speak briefly to proposed new subsection (9A), on misinformation and disinformation content. With respect to those who have spoken against it and argued that those are political terms, I argue that they are fundamentally ethical terms. For me, the principle of ethics and the online world is not the invention of new ethics but finding ways to acknowledge and support online the ethics we acknowledge in the offline world.

Truth is a fundamental ethic. Truth builds trust. It made it into the 10 commandments:

“You shall not bear false witness against your neighbour”.


It is that ethic that would be translated across in proposed new subsection (9A). One of the lenses through which I have viewed the Bill throughout is the lens of my eight grandchildren, the oldest of whom is eight years old and who is already using the internet. Proposed new subsection (9A) is important to him because, at eight years old, he has very limited ways of checking out what he reads online—fewer even than a teenager. He stands to be fundamentally misled in a variety of ways if there is no regulation of misinformation and disinformation.

Also, the internet, as we need to keep reminding ourselves in all these debates, is a source of great potential good and benefit, but only if children grow up able to trust what they read there. If they can trust the web’s content, they will be able to expand their horizons, see things from the perspective of others and delve into huge realms of knowledge that are otherwise inaccessible. But if children grow up necessarily imbued with cynicism about everything they read online, those benefits will not accrue to them.

Misinformation and disinformation content is therefore harmful to the potential of children across the United Kingdom and elsewhere. We need to guard against it in the Bill.

Lord Allan of Hallam (LD)

My Lords, Amendment 172 is exceptionally helpful in putting the priority harms for children on the face of the Bill. It is something that we have asked for and I know the pre-legislative scrutiny committee asked for it and it is good to see it there. I want to comment to make sure that we all have a shared understanding of what this means and that people out there have a shared understanding.

My understanding is that “primary priority” is, in effect, a red light—platforms must not expose children to that content if they are under 18—while “priority” is rather an amber light and, on further review, for some children it will be a red light and for other children it will be a green light, and they can see stuff in there. I am commenting partly having had the experience of explaining all this to my domestic focus group of teenagers and they said, “Really? Are you going to get rid of all this stuff for us?” I said, “No, actually, it is quite different”. It is important in our debate to do that because otherwise there is a risk that the Bill comes into disrepute. I look at something like showing the harms to fictional characters. If one has seen the “Twilight” movies, the werewolves do not come off too well, and “Lord of the Rings” is like an orc kill fest.

As regards the point made by the noble Baroness, Lady Harding, about going to the cinema, we allow older teenagers to go to the cinema and see that kind of thing. Post the Online Safety Bill, they will still be able to access it. When we look at something like fictional characters, the Bill is to deal with the harm that is there and is acknowledged regarding people pushing quite vile stuff, whereby characters have been taken out of fiction and a gory image has been created, twisted and pushed to a younger child. That is what we want online providers to do—to prevent an 11 year-old seeing that—not to stop a 16 year-old enjoying the slaughter of werewolves. We need to be clear that that is what we are doing with the priority harms; we are not going further than people think we are.

There are also some interesting challenges around humour and evolving trends. This area will be hard for platforms to deal with. I raised the issue of the Tide pod challenge in Committee. If noble Lords are not familiar, it is the idea that one eats the tablets, the detergent things, that one puts into washing machines. It happened some time ago. It was a real harm and that is reflected here in the “do not ingest” provisions. That makes sense but, again talking to my focus group, the Tide pod challenge has evolved and for older teenagers it is a joke about someone being stupid. It has become a meme. One could genuinely say that it is not the harmful thing that it was. Quite often one sees something on the internet that starts harmful—because kids are eating Tide pods and getting sick—and then over time it becomes a humorous meme. At that point, it has ceased to be harmful. I read it as that filter always being applied. We are not saying, “Always remove every reference to Tide pods” but “At a time when there is evidence that it is causing harm, remove it”. If at a later stage it ceases to be harmful, it may well move into a category where platforms can permit it. It is a genuine concern.

To our freedom of expression colleagues, I say that we do not want mainstream platforms to be so repressive of ordinary banter by teenagers that they leave those regulated mainstream platforms because they cannot speak any more, even when the speech is not harmful, and go somewhere else that is unregulated—one of those platforms that took Ofcom’s letter, screwed it up and threw it in the bin. We do not want that to be an effect of the Bill. Implementation has to be very sensitive to common trends and, importantly, as I know the noble Baroness, Lady Kidron, agrees, has to treat 15, 16 and 17 year-olds very differently from 10, 11 or 12 year-olds. That will be hard.

The other area that jumped out was about encouraging harm through challenges and stunts. That immediately brought “Jackass” to mind, or the Welsh version, “Dirty Sanchez”, which I am sure is a show that everyone in the House watched avidly. It is available on TV. Talking about equality, one can go online and watch it. It is people doing ridiculous, dangerous things, is enjoyed by teenagers and is legal and acceptable. My working assumption has to be that we are expecting platforms to distinguish between a new dangerous stunt such as the choking game—such things really exist—from a ridiculous “Jackass” or “Dirty Sanchez” stunt, which has existed for years and is accessible elsewhere.

The point that I am making in the round is that it is great to have these priority harms in the Bill but it is going to be very difficult to implement them in a meaningful way whereby we are catching the genuinely harmful stuff but not over-restricting. But that is the task that we have set Ofcom and the platforms. The more that we can make it clear to people out there what we are expecting to happen, the better. We are not expecting a blanket ban on all ridiculous teenage humour or activity. We are expecting a nuanced response. That is really helpful as we go through the debate.

Baroness Fox of Buckley (Non-Afl)

I just have a question for the noble Lord. He has given an excellent exposé of the other things that I was worried about but, even when he talks about listing the harms, I wonder how helpful it is. Like him, I read them out to a focus group. Is it helpful to write these things, for example emojis, down? Will that not encourage the platforms to over-panic? That is my concern.

Lord Allan of Hallam (LD)

On the noble Baroness’s point, that is why I intervened in the debate: so that we are all clear that, for priority content, it is an amber light and not a red light. We are not saying, “Just remove all this stuff”; it would be a wrong response to the Bill to say, “It’s a fictional character being slaughtered so remove it”, because now we have removed “Twilight”, “Watership Down” and whatever else. We are saying, “Think very carefully”. If it is one of those circumstances where this is causing harm—they exist; we cannot pretend that they do not—it should be removed. However, the default should not be to remove everything on this list; that is the point I am really trying to make.

Baroness Stowell of Beeston (Con)

My Lords, our debate on this group is on the topic of priority harms to children. It is not one that I have engaged in so I tread carefully. One reason why I have not engaged in this debate is because I have left it to people who know far more about it than I do; I have concentrated on other parts of the Bill.

In the context of this debate, one thing has come up on which I feel moved to make a short contribution: misinformation and disinformation content. There was an exchange between my noble friend Lady Harding and the noble Baroness, Lady Fox, on this issue. Because I have not engaged on the topic of priority harms, I genuinely do not have a position on what should and should not be featured. I would not want anybody to take what I say as support for or opposition to any of these amendments. However, it is important for us to acknowledge that, as much as misinformation and disinformation are critical issues—particularly for children and young people because, as the right reverend Prelate said, the truth matters—we cannot, in my view, ignore the fact that misinformation and disinformation have become quite political concepts. They get used in a way where people often define things that they do not agree with as misinformation—that is, opinions are becoming categorised as misinformation.

We are now putting this in legislation and it is having an impact on content, so it is important, too, that we do not just dismiss that kind of concern as not relevant because it is real. That is all I wanted to say.

Lord Russell of Liverpool (CB)

My Lords, I will speak briefly as I know that we are waiting for a Statement.

If you talk to colleagues who know a great deal about the harm that is happening and the way in which platforms operate, as well as to colleagues who talk directly to the platforms, one thing that you commonly hear from them is a phrase that often recurs when they talk to senior people about some of the problems here: “I never thought of that before”. That is whether it is about favourites on Snapchat, which cause grief in friendship groups, about the fact that, when somebody leaves a WhatsApp group, it flags up who that person is—who wants to be seen as the person who took the decision to leave?—or about the fact that a child is recommended to other children even if the company does not know whether they are remotely similar.

If you are 13, you are introduced as a boy to Andrew Tate; if you are a girl, you might be introduced to a set of girls who may or may not share anorexia content, but they dog-whistle and blog. The companies are not deliberately orchestrating these outcomes—it is the way they are designed that is causing those consequences—but, at the moment, they take no responsibility for what is happening. We need to reflect on that.

I turn briefly to a meeting that the noble Lord, Lord Stevenson, and I were at yesterday afternoon, which leads neatly on to some of the comments the noble Baroness, Lady Fox, made a few moments ago about the far right. The meeting was convened by Luke Pollard MP and was on the strange world known as the manosphere, which is the world of incels—involuntary celibates. As your Lordships may be aware, on various occasions, certain individuals who identify as that have committed murder and other crimes. It is a very strange world.

17:00
I was introduced to two terms that I was not aware of. If you are an incel, you refer to males who are fortunate enough to get on well with ladies as Chads, and ladies who are fortunate enough to get on well with men or boys are apparently known as Stacys. That was something I did not particularly want to learn, but I did. This was the second meeting that Luke Pollard had convened; the people at that meeting were from a huge variety of groups and they all said that it is the only forum they have found in the United Kingdom that is pulling all these different elements together. They are acutely aware of how much of a problem this is, and they find that forum incredibly helpful, because nobody else is doing it.
I have asked some of the people who gave evidence in that meeting to forward some of their concerns to us, which the noble Lord, Lord Stevenson, and I would like to forward to the Minister and the Bill team. I raise this so as to encourage the Minister to understand the scale of this, what is happening and its effects, and then to look at how harms are defined in the Bill and how functionality is looked at, to see whether this live, growing area is dealt with effectively by the Bill.
This takes me into the area of media literacy, which I flag so that the Minister and the Bill team can do some homework. In Scotland, there is a very sensible scheme called Mentors in Violence Prevention. Essentially, it uses the older pupils in the sixth form, who have learned beforehand from other mentors how to talk about and understand the harms they might experience online. They deliver to the younger children the information that they have accumulated. The evidence is that this is infinitely more effective than teachers or outside experts doing it. It is almost peer-to-peer—perhaps a very appropriate approach for your Lordships’ House.
Lord Clement-Jones (LD)

I shall be brief, my Lords, because I know we have a Statement to follow. It is a pleasure to follow the noble Lord, Lord Russell. I certainly share his concern about the rise of incel culture, and this is a very appropriate point to raise it.

This is all about choices and the Minister, in putting forward his amendments in response not only to the Joint Committee but also to the overwhelming view in Committee on the Bill that this was the right thing to do, has done the right thing. I thank him for that, with the qualification that we must make sure that the red and amber lights are used—just as my noble friend Lord Allan and the noble Baroness, Lady Stowell, qualified their support for what the Minister has done. At the same time, I make absolutely clear that I very much support the noble Baroness, Lady Kidron. I was a bit too late to get my name down to her amendment, but it would be there otherwise.

I very much took to what the right reverend Prelate had to say about the ethics of the online world and nowhere more should they apply than in respect of children and young people. That is the place where we should apply these ethics, as strongly as we can. With some knowledge of artificial intelligence, how it operates and how it is increasingly operating, I say that what the noble Baroness wants to add to the Minister’s amendment seems to be entirely appropriate. Given the way in which algorithms are operating and the amount of misinformation and disinformation that is pouring into our inboxes, our apps and our social media, this is a very proportionate addition. It is the future. It is already here, in fact. So I very strongly support Amendment 174 from the noble Baroness and I very much hope that after some discussion the Minister will accept it.

Lord Stevenson of Balmacara (Lab)

My Lords, like the noble Baroness, Lady Harding, I want to make it very clear that I think the House as a whole welcomes the change of heart by the Government to ensure that we have in the Bill the two sides of the question of content that will be harmful to children. We should not walk away from that. We made a big thing of this in Committee. The Government listened and we have now got it. The fact that we do not like it—or do not like bits of it—is the price we pay for having achieved something which is, probably on balance, good.

The shock comes from trying to work out why it is written the way it is, and how difficult it is to see what it will mean when companies working to Ofcom’s instructions take this and make it happen in practice. That lies behind, I think I am right in saying, the need for the addition to Amendment 172 from the noble Baroness, Lady Kidron, which I have signed, along with the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford. Both of them have spoken well in support of it and I do not need to repeat those points.

Somehow, in getting the good of Amendments 171 and 172, we have lost the flexibility that we also want in order to get that through. The flexibility does exist, because the Government have retained powers to amend and change both primary priority content that is harmful to children and priority content. Therefore, subject to approval through the secondary legislation process, this House will continue to have a concern about that—indeed, both Houses will.

Somehow, however, that does not get to quite where the concern comes from. The concern takes in both the good points made by the noble Lord, Lord Russell—I should have caught him up in the gap and said I had already mentioned the fact that we had been together at the meeting. He found some additional points to make which I hope will also be useful to future discussion. I am glad he has done that. He is making a very good point in relation to cultural context and the work that needs to go on—which we have talked about in earlier debates—in order to make this live: in other words, to get both the people who are responsible for delivering this through Ofcom and those who are delivering it through companies to understand the wider context. In that sense, clearly we need the misinformation/disinformation side of that stuff. It is part and parcel of the problems we have got. But more important even than that is the need to see about the functionality issues. We have come back to that. This Bill is about risk. The process that we will be going through is about risk assessment and making sure that the risks are understood by those who deliver services, with penalties that follow failures in the risk assessment process, delivering the change that we want to see in society.

However, it is not just about content. We keep saying that, but we do not see the changes around it. The best thing that could happen today would be if the Minister in responding accepted that these clauses are good—“Tick, we like them”—but could we just not finalise them until we have seen the other half of that, which is: what are the other risks to which the users of the services we have referred to and discussed are exposed through the systemic design processes that are designed to take them in different directions? It is only when we see the two together that we will have a proper concern.

I may have got this wrong, but the only person who can tell us is the Minister because he is the only one who really understands what is going on in the Bill. Am I not right in saying—I am going to say I am right; he will say no, I am not, but I am, aren’t I?—that we will get to Clauses 208 and 209, or the clauses that used to be 208 and 209, one of which deals with harms from content and the other deals with functionality? We may need to look at the way in which those are framed in order to come back and understand better how these lie and how they interact with that. I may have got the numbers wrong—the Minister is looking a bit puzzled, so I probably have—but the sense is that this will probably not come up until day 4. While I do not want to hold back the Bill, we may need to look at some of the issues that are hidden in the interstices of this set of amendments in order to make sure that the totality is better for those who have to use it.

Lord Parkinson of Whitley Bay (Con)

My Lords, this has been a useful debate. As the noble Baroness, Lady Kidron, says, because I spoke first to move the government amendments, in effect I got my response in first to her Amendment 174, the only non-government amendment in the group. That is useful because it allows us to have a deeper debate on it.

The noble Baroness asked about the way that organisations such as the British Board of Film Classification already make assessments of sexualised content. However, the Bill’s requirement on service providers and the process that the BBFC takes to classify content are not really comparable. Services will have far less time and much more content to consider than the BBFC does, so will not be able to take the same approach. The BBFC is able to take an extended time to consider maybe just one scene, one image or one conversation, and therefore can apply nuance to its assessments. That is not possible to do at the scale at which services will have to apply the child safety duties in the Bill. We therefore think there is a real risk that they would excessively apply those duties and adversely affect children’s rights online.

I know the noble Baroness and other noble Lords are rightly concerned with protecting rights to free expression and access to information online for children and for adults. It is important that we strike the right balance, which is what we have tried to do with the government amendments in this group.

Baroness Kidron (CB)

To be clear, the point that I made about the BBFC was not to suggest a similar arrangement but to challenge the idea that we cannot categorise material of a sexualised nature. Building on the point made by the noble Lord, Lord Allan, perhaps we could think about it in terms of the amber light rather than the red light—in other words, something to think about.

Lord Parkinson of Whitley Bay (Con)

I certainly will think about it, but the difficulty is the scale of the material and the speed with which we want these assessments to be made and that light to be lit, in order to make sure that people are properly protected.

My noble friend Lord Moylan asked about differing international terminology. In order for companies to operate in the United Kingdom they must have an understanding of the United Kingdom, including the English-language terms used in our legislation. He made a point about the Equality Act 2010. While the Bill uses the same language, it does not extend the Equality Act to this part of the Bill. In particular, it does not create a new offence.

The noble Baroness, Lady Fox, also mentioned the Equality Act when she asked about the phraseology relating to gender reassignment. We included this wording to ensure that the language used in the Bill matches Section 7(1) of the Equality Act 2010 and that gender reassignment has the same meaning in the Bill as it does in that legislation. As has been said by other noble Lords—

Baroness Fox of Buckley (Non-Afl)

I clarify that what I said was aimed at protecting children. Somebody corrected me and asked, “Do you know that this says ‘abusive’?”—of course I do. What I suggested was that this is an area that is very contentious when we talk about introducing it to children. I am thinking about safeguarding children in this instance, not just copying and pasting a bit of an Act.

17:15
Lord Moylan (Con)

I take this opportunity to ask my noble friend the Minister a question; I want some clarity about this. Would an abusive comment about a particular religion—let us say a religion that practised cannibalism or a historical religion that sacrificed babies, as we know was the norm in Carthage—count as “priority harmful content”? I appreciate that we are mapping the language of the Equality Act, but are we creating a new offence of blasphemy in this Bill?

Lord Parkinson of Whitley Bay (Con)

As was pointed out by others in the debate, the key provision in Amendment 172 is subsection (2) of the proposed new clause, which relates to:

“Content which is abusive and which targets any of the following characteristics”.


It must both be abusive and target the listed characteristics. It does not preclude legitimate debate about those things, but if it were abusive on the basis of those characteristics—rather akin to the debate we had in the previous group and the points raised by the noble Baroness, Lady Kennedy of The Shaws, about people making oblique threats, rather than targeting a particular person, by saying, “People of your characteristic should be abused in the following way”—it would be captured.

Baroness Fox of Buckley (Non-Afl)

I will keep this short, because I know that everyone wants to get on. It would be said that it is abusive to misgender someone; in the context of what is going on in sixth forms and schools, I suggest that this is a problem. It has been suggested that showing pictures of the Prophet Muhammad in an RE lesson—these are real-life events that happen offline—is abusive. I am suggesting that it is not as simple as saying the word “abusive” a lot. In this area, there is a highly contentious and politicised arena that I want to end, but I think that this will exacerbate, not help, it.

Lord Moylan (Con)

My noble friend seemed to confirm what I said. If I wish to be abusive—in fact, I do wish to be abusive—about the Carthaginian religious practice of sacrificing babies to Moloch, and I were to do that in a way that came to the attention of children, would I be caught as having created “priority harmful content”? My noble friend appears to be saying yes.

Lord Parkinson of Whitley Bay (Con)

Does my noble friend wish to do that and direct it at children?

Lord Moylan (Con)

With respect, it does not say “directed at children”. Of course, I am safe in expressing that abuse in this forum, but if I were to do it, it came to the attention of children and it were abusive—because I do wish to be abusive about that practice—would I have created “priority harmful content”, about which action would have to be taken?

Baroness Kidron (CB)

I will leap to the Minister’s defence on this occasion. I remind noble colleagues that this is not about individual pieces of content; there would have to be a consistent flow of such information being proffered to children before Ofcom would ask for a change.

Lord Moylan (Con)

My Lords, these words have obviously appeared in the Bill in one of those unverified sections; I have clicked the wrong button, so I cannot see them. Where does it say in Amendment 172 that it has to be a consistent flow?

Baroness Harding of Winscombe (Con)

May I attempt to assist the Minister? This is the “amber” point described by the noble Lord, Lord Allan: “priority content” is not the same as “primary priority content”. Priority content is our amber light. Even the most erudite and scholarly description of baby eating is not appropriate for five year-olds. We do not let it go into “Bod” or any of the other programmes we all grew up on. This is about an amber warning: that user-to-user services must have processes that enable them to assess the risk of priority content and primary priority content. It is not black and white, as my noble friend is suggesting; it is genuinely amber.

Lord Parkinson of Whitley Bay (Con)

My Lords, we may be slipping back into a Committee-style conversation. My noble friend Lord Moylan rightly says that this is the first chance we have had to examine this provision, which is a concession wrung out of the Government in Committee. As the noble Lord, Lord Stevenson, says, sometimes that is the price your Lordships’ House pays for winning these concessions, but it is an important point to scrutinise in the way that my noble friend Lord Moylan and the noble Baroness, Lady Fox, have done.

I will try to reassure my noble friend and the noble Baroness. This relates to the definition of a characteristic with which we began our debates today. To be a characteristic it has to be possessed by a person; therefore, the content that is abusive and targets any of the characteristics has to be harmful to an individual to meet the definition of harm. Further, it has to be material that would come to the attention of children in the way that the noble Baronesses who kindly leapt to my defence and added some clarity have set out. So my noble friend would be able to continue to criticise the polytheistic religions of the past and their tendencies to his heart’s content, but there would be protections in place if what he was saying was causing harm to an individual—targeting them on the basis of their race, religion or any of those other characteristics—if that person was a child. That is what noble Lords wanted in Committee, and that is what the Government have brought forward.

My noble friend and others asked why mis- and disinformation were not named as their own category of priority harmful content to children. Countering mis- and disinformation where it intersects with the named categories of primary priority or priority harmful content, rather than as its own issue, will ensure that children are protected from the mis- and disinformation narratives that present the greatest risk of harm to them. We recognise that mis- and disinformation is a broad and cross-cutting issue, and we therefore think the most appropriate response is to address directly the most prevalent and concerning harms associated with it; for example, dangerous challenges and hoax health advice for children to self-administer harmful substances. I assure noble Lords that any further harmful mis- and disinformation content will be captured as non-designated content where it presents a material risk of significant harm to an appreciable number of children.

In addition, the expert advisory committee on mis- and disinformation, established by Ofcom under the Bill, will have a wide remit in advising on the challenges of mis- and disinformation and how best to tackle them, including how they relate to children. Noble Lords may also have seen that the Government have recently tabled amendments to update Ofcom’s statutory media literacy duty. Ofcom will now be required to prioritise users’ awareness of and resilience to misinformation and disinformation online. This will include children and their awareness of and resilience to mis- and disinformation.

My noble friend Lady Harding of Winscombe talked about commercial harms. Harms exacerbated by the design and operation of a platform—that is, their commercial models—are covered in the Bill already through the risk assessment and safety duties. Financial harm, as used in government Amendment 237, is dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This exemption ensures that there is no regulatory overlap.

The noble Lord, Lord Russell of Liverpool, elaborated on remarks made earlier by the noble Lord, Lord Stevenson of Balmacara, about their meeting looking at the incel movement, if it can be called that. I assure the noble Lord and others that Ofcom has a review and report duty and will be required to stay on top of changes in the online harms landscape and report to government on whether it recommends changes to the designated categories of content because of the emerging risks that it sees.

The noble Baroness, Lady Kidron, anticipated the debate we will have on Monday about functionalities and content. I am grateful to her for putting her name to so many of the amendments that we have brought forward. We will continue the discussions that we have been having on this point ahead of the debate on Monday. I do not want to anticipate that now, but I undertake to carry on those discussions.

In closing, I reiterate what I know is the shared objective across your Lordships’ House—to protect children from harmful content and activity. That runs through all the government amendments in this group, which cover the main categories of harmful content and activity that, sadly, too many children encounter online every day. Putting them in primary legislation enables children to be swiftly protected from encountering them. I therefore hope that noble Lords will be heartened by the amendments that we have brought forward in response to the discussion we had in Committee.

Amendment 31 agreed.

Online Safety Bill

Report (1st Day) (Continued)
18:08
Clause 6: Providers of user-to-user services: duties of care
Amendment 32
Moved by
32: Clause 6, page 5, line 29, at end insert—
“(ba) the duties about assessments related to adult user empowerment set out in section (Assessment duties: user empowerment),”
Member’s explanatory statement
This amendment ensures that the new duties in the new Clause proposed after Clause 11 in my name are imposed on providers of Category 1 services.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, as noble Lords will be aware, the Government removed the legal but harmful provisions from the Bill in another place, given concerns about freedom of expression. I know that many noble Lords would not have taken that approach, but I am grateful for their recognition of the will of the elected House in this regard as well as for their constructive contributions about ways of strengthening the Bill while continuing to respect that.

I am therefore glad to bring forward a package of amendments tabled in my name relating to adult safety. Among other things, these strengthen our existing approach to user empowerment and terms of service by rebalancing the power over the content adults see and interact with online, moving the choice away from unaccountable technology companies and towards individual users.

First, we are introducing a number of amendments, which I am pleased to say have the support of the Opposition Front Bench, which will introduce a comprehensive duty on category 1 providers to carry out a full assessment of the incidence of user empowerment content on their services. The amendments will mean that platforms can be held to account by Ofcom and their users when they fail to assess the incidence of this kind of content on their services or when they fail to offer their users an appropriate ability to control whether or not they view it.

Amendments 19 to 21 and 26—I am grateful to noble Lords opposite for putting their names to them—will strengthen the user empowerment content duty. Category 1 providers will now need proactively to ask their registered adult users how they would like the control features to be applied. We believe that these amendments achieve two important aims that your Lordships have been seeking from these duties: first, they ensure that the control features are more visible for registered adult users; and, secondly, they offer better protection for young adult users.

Amendments 55 and 56, tabled by the noble Lord, Lord Clement-Jones, my noble friend Lord Moylan and the noble Baroness, Lady Fox of Buckley, seek to provide users with a choice over how the tools are applied for each category of content set out in Clause 12(10), (11) and (12). The legislation gives platforms the flexibility to decide what tools they offer in compliance with Clause 12(2). A blanket approach is unlikely to be consistent with the duty on category 1 services to have particular regard to the importance of protecting users’ freedom of expression when putting these features in place. Additionally, the measures that Ofcom will recommend in its code of practice must consider the impact on freedom of expression so are unlikely to be a blanket approach.

Amendments 58 and 63 would require providers to set and enforce consistent terms of service on how they identify the categories of content to which Clause 12(2) applies; and to apply the features to content only when they have reasonable grounds to infer that it is user empowerment content. I assure noble Lords that the Bill’s freedom of expression duties will prevent providers overapplying the features or adopting an inconsistent or capricious approach. If they do, Ofcom can take enforcement action.

Amendments 59, 64 and 181, tabled by the noble Lord, Lord Clement-Jones, seek to require that the user empowerment and user verification features are provided at no cost. I reassure the noble Lord that the effect of these amendments is already achieved by the drafting of Clause 12. Category 1 providers will be compliant with their duties only if they proactively ask all registered users whether or not they want to use the user empowerment content features, which would not be possible with a paywall. Amendment 181 is similar and applies to user verification. While the Bill does not specify that verification must be free of charge, category 1 providers can meet the duties in the Bill only by offering all adult users the option to verify themselves.

Turning to Amendment 204, tabled by the noble Baroness, Lady Finlay of Llandaff, I share her concern about the impact that self-harm and suicide content can have. However, as I said in Committee, the Bill goes a long way to provide protections for both children and adults from this content. First, it includes the new criminal offence of encouraging or assisting self-harm. This then feeds through into the Bill’s illegal content duties. Companies will be required to take down such content when it is reported to them by users.

Beyond the illegal content duties, there are specific protections in place for children. The Government have tabled amendments designating content that encourages, promotes or provides instructions for suicide or self-harm as a category of primary priority content, meaning that services will have to prevent children of all ages encountering it. For adults, the Government listened to concerns and, as mentioned, have strengthened the user empowerment duties to make it easier for adult users to opt in to using them by offering a forced choice. We have made a careful decision, however, to balance these protections with users’ right to freedom of expression and therefore cannot require platforms to treat legal content accessed by adults in a prescribed way. That is why, although I share the noble Baroness’s concerns about the type of content that she mentions, I cannot accept her amendment and hope that she will agree.

The Bill’s existing duties require category 1 platforms to offer users the ability to verify their identity. Clause 12 requires category 1 platforms to offer users the ability to filter out users who have not verified their identity. Amendment 183 from my noble friend Lord Moylan seeks to give Ofcom the discretion to decide when it is and is not proportionate for category 1 services to offer users the ability to verify their identity. We do not believe that these duties will be excessively burdensome, given that they will apply only to category 1 companies, which have the resource and capacity to offer such tools.

Amendment 182 would require platforms to offer users the option to make their verification status visible. The existing duty in Clause 57, in combination with the duty in Clause 12, will already provide significant protections for adults from anonymous abuse. Adult users will now be able to verify their own status and decide to interact only with other verified users, whether or not their status is visible. We do not believe that this amendment would provide additional protections.

The Government carefully considered mandating that all users display their verification status, which may heighten some users’ safety, but it would be detrimental to vulnerable users, who may need to remain anonymous for perfectly justifiable reasons. Further government amendments in my name will expand the types of information that Ofcom can require category 1, 2A and 2B providers to publish in their transparency reports in relation to user empowerment content.

Separately, but also related to transparency, government Amendments 189 and 202 make changes to Clause 67 and Schedule 8. These relate to category 1 providers’ duties to create clear and accessible terms of service and apply them consistently and transparently. Our amendments tighten these parts of the Bill so that all the providers’ terms through which they might indicate that a certain type of content is not allowed on their service are captured by these duties.

I hope that noble Lords will therefore accept the Government amendments in this group and that my anticipatory remarks about their amendments will give them some food for thought as they make their contributions. I beg to move.

18:15
Lord Moylan (Con)

My Lords, I speak to Amendments 56, 58, 63 and 183 in my name in this group. I have some complex arguments to make, but time is pressing, so I shall attempt to do so as briefly as possible. I am assisted in that by the fact that my noble friend on the Front Bench very kindly explained that the Government are not going to accept my worthless amendments, without actually waiting to hear what it is I might have said on their behalf.

None the less, I turn briefly to Amendment 183. The Bill has been described, I think justly, as a Twitter-shaped Bill: it does not take proper account of other platforms that operate in different ways. I return to the question of Wikipedia, but also platforms such as Reddit and other community-driven platforms. The requirement for a user-verification tool is of course intended to lead to the possibility that ordinary, unverified users—people like you and me—could have the option to see only that content which comes from those people who are verified.

This is broadly a welcome idea, but when we combine that with the fact that there are community-driven sites such as Wikipedia where there are community contributions and people who contribute to those sites are not always verified—sometimes there are very good reasons why they would want to preserve their anonymity—we end up with the possibility of whole articles having sentences left out and so on. That is not going to happen; the fact is that a body such as Wikipedia cannot operate a site like that, so it is another one of those existential questions that the Government have not properly grappled with and really must address before we come to Third Reading, because this will not work the way it is.

As for my other amendments, they are supportive of and consistent with the idea of user verification, and they recognise—as my noble friend said—that user verification is intended to be a substitute for the abandoned “legal but harmful” clause. I welcome the abandonment of that clause and recognise that this provision is more consistent with individual freedom and autonomy and the idea that we can make choices of our own, but it is still open to the possibility of abuse by the platforms themselves. The amendments that I have put forward address, first, the question of what should be the default position. My argument is that the default position should be that filtering is not on and that one has to opt into it, because that seems to me the adult proposition, the adult choice.

The danger is that the platforms themselves will either opt you into filtering automatically as the default, so that you do not see what might be called the full-fat milk that is available on the internet, or harass you to do so with constant pop-ups, which we already get. If you go on the Nextdoor website, you constantly get the pop-up saying, “You should switch on notifications”. I do not want notifications; I want to look at it when I want to look at it. I do not want notifications, but I am constantly being driven into pressing the button that says, “Switch on notifications”. You could have something similar here—constantly being driven into switching on the filters—because the platforms themselves will be very worried about the possibility that you might see illegal content. We should guard against that.

Secondly, on Amendment 58, if we are going to have user verification—as I say, there is a lot to be said for that approach—it should be applied consistently. If the platform decides to filter out racist abuse and you opt in to filtering out racist abuse or some other sort of specified abuse, it has to filter all racist abuse, not simply racist abuse that comes from people it does not like; or, with gender reassignment abuse, it cannot filter out stuff from only one side or other of the argument. The word “consistently” that is included here is intended to address that, and to require policies that show that, if you opt in to having something filtered out, it would be done on a proper, consistent and systematic basis and not influenced by the platform’s own particular political views.

Finally, we come to Amendment 63 and the question of how this is communicated to users of the internet. This amendment would force the platforms to make these policies about how user verification will operate a part of their terms and conditions in a public and visible way and to ensure that those provisions are applied consistently. It goes a little further than the other amendments—the others could stand on their own—but would also add a little bit more by requiring public and consistent policies that people can see. This works with the grain of what the Government are trying to do; I do not see that the Government can object to any of this. There is nothing wrecking here. It is trying to make everything more workable, more transparent and more obvious.

I hope, given the few minutes or short period of time that will elapse between my sitting down and the Minister returning to the Dispatch Box, that he will have reflected on the negative remarks that he made in his initial speech and will find it possible to accept these amendments now that he has heard the arguments for them.

Lord Clement-Jones (LD)

My Lords, I will not engage with the amendments of the noble Lord, Lord Moylan, since mine are probably the diametric opposite of what he has been saying.

I say, first, on behalf of the noble Baroness, Lady Finlay, that she regrets very much not being able to be here. Amendment 204 in her name is very much a Samaritans amendment. The Samaritans have encouraged her to put it forward and encourage us to support it. It is clear that the Minister has got his retaliation in first and taken the wind out of all our sails right at the beginning. Nevertheless, that does not mean that we cannot come back at the Minister and ask for further and better particulars of what he has to say.

Clearly the Government’s decision to bring in the new offence of encouraging or assisting self-harm is welcome. However—certainly in the view of the Samaritans—this will only bring into the remit of the Bill content that encourages serious self-harm, which must reach the high threshold amounting to grievous bodily harm. Their view, therefore, is that much harmful content will still be left untouched and available to criminals online. This could include information, depictions, instructions and advice on methods of self-harm and suicide. It would also include content that portrays self-harm and suicide as positive or desirable, and graphic descriptions or depictions of self-harm and suicide.

Perhaps the Minister could redouble his efforts to assure us as to how the Bill will take a comprehensive approach to placing duties on all platforms to reduce all dangerous suicide and self-harm content, such as detailed instructions on how people can harm themselves, for adults as well as children. This should also be in respect of smaller sites; it is not just the larger category 1 sites that will need to proactively remove priority illegal content, whatever the level of detail in their risk assessment. I hope I have done my duty by the noble Baroness, Lady Finlay, who very much regrets that she was not able to be here.

My own Amendments 55, 59, 64 and 181 are about changes in social media. The Bill really began its life at the high point of the phase where services were free to the user and paid for by adverts. The noble Lord talked about this being a Twitter Bill. Well, to some extent we are influenced by what Twitter has been doing over the last 12 months: it has begun to charge for user-verification services and some features, and other services are adopting versions of what you might call this premium model. So there is a real concern that Clause 12 might not be as comprehensive as the Minister seems to be asserting. I assume that it is covered by the “proportionate” wording in Clause 12, and therefore it would not be proportionate—to put it the other way round—if they charged for this service. I would very much like the Minister to give the detail of that, so I am not going to cover the rest of the points that I would otherwise have made.

The Minister said that a blanket approach would not be appropriate for user-empowerment control features. The thought that people have had is that a platform might choose to have a big red on/off button that would try to cover all the types of content that could be subject to this kind of user-empowerment tool. I do not think the contents of Clause 12 are as clear as the Minister perhaps considers they could be, but they go with the grain of the new government amendments. I should have said right at the beginning—although many of us regret the deletion of “legal but harmful” from the original draft Bill—that the kind of assessment that is going to be made is a step in the right direction and demonstrates that the Minister was definitely listening in Committee. However, if a blanket approach of this kind is taken, that would not be in the spirit of where these user-empowerment tools are meant to go. I welcome what the Minister had to say, but again I would like the specifics of where he thinks the wording is helpful in making sure that we have a much more granular form of user-empowerment control feature when this eventually comes into operation.

Finally, I return to user verification. This is very much in the footsteps of the Joint Committee. The noble Baroness, Lady Merron, spoke very well in Committee to what was then Amendment 41, which was in the name of the noble Lord, Lord Stevenson. It would have required category 1 services to make visible to users whether another user was verified or non-verified.

18:30
Amendment 182 to Clause 57 is a rather different animal, but we are again trying to get improvements to what is in the clause at the moment. It tries to focus even more on empowering users by giving them choice. Alongside offering UK users a choice to verify, it will ensure that users are also offered a choice to make that verification visible to others. In a sense, it goes very much with the grain of what the Government have been moving towards with their approach to the use of user-empowerment tools and giving choice at the outset. That is, in a sense, the compromise between default and non-default, as we discussed in Committee. This offers users a different kind of choice, but nevertheless an important choice.
Just as the Bill would not force any UK users to verify, so this amendment would not force any UK users to make their choice to verify visible. All it would do is require that platforms offer them an option. Research suggests that most UK users would choose to verify and to make that visible; I am sure that the Minister is familiar with some of the research. New research published this week by Clean Up the Internet, based on independent opinion polling conducted by Opinium, found that 78% of UK social media users say that it would be helpful to be able to see which social media accounts have been verified to help them avoid scams. Almost as many—77%—say that being able to see which accounts have been verified would help with identifying bullies or trolls. Some 72% say it would help with spotting false or misleading news stories, and 68% say it would help with buying products or services.
Ofcom’s own research into online fraud, published in March this year, found:
“A warning from the platform that content or messages come from an unverified source”
is the single most popular measure platforms could introduce to help users avoid getting drawn into scams. So it would be an extremely popular move for the Minister to accept my amendment, as I am sure he would appreciate.
Lord Stevenson of Balmacara (Lab)

Is he not outrageous, trying to make appeals to one’s good humour and good sense? But I support him.

I will say only three things about this brief but very useful debate. First, I welcome the toggle-on, toggle-off resolution: that is a good move. It makes sure that people make a choice and that it is made at an appropriate time, when they are using the service. That seems to be the right way forward, so I am glad that that has come through.

Secondly, I still worry that terms of service, even though there are improved transparency measures in these amendments, will eventually need some form of power for Ofcom to set de minimis standards. So much depends on the ability of the terms of service to carry people’s engagement with the social media companies, including the decisions about what to see and not to see, and about whether they want to stay on or keep off. Without some power behind that, I do not think that the transparency will take it. However, we will leave it as it is; it is better than it was before.

Thirdly, user ID is another issue that will come back. I agree entirely with what the noble Lord, Lord Clement-Jones, said: this is at the heart of so much of what is wrong with what we see and perceive as happening on the internet. To reduce scams, to be more aware of trolls and to be aware of misinformation and disinformation, you need some sense of who you are talking to, or who is talking to you. There is a case for having that information verified, whether or not it is done on a limited basis, because we need to protect those who need to have their identities concealed for very good reason—we know all about that. As the noble Lord said, it is popular to think that you would be a safer person on the internet if you were able to identify who you were talking to. I look forward to hearing the Minister’s response.

Baroness Morgan of Cotes (Con)

My Lords, I will speak very briefly to Amendments 55 and 182. We are now at the stage of taking the lead entirely from the Minister and the noble Lords opposite—the noble Lords, Lord Stevenson and Lord Clement-Jones—in accepting these amendments, because we now need to see how this will work in practice. That is why we all think that we will be back here talking about these issues in the not too distant future.

My noble friend the Minister rightly said that, as we debated in Committee, the Government made a choice in taking out “legal but harmful”. Many of us disagree with that, but that is the choice that has been made. So I welcome the changes that have been made by the Government in these amendments to at least allow there to be more empowerment of users, particularly in relation to the most harmful content and, as we debated, in relation to adult users who are more vulnerable.

It is worth reminding the House that we heard very powerful testimony during the previous stage from noble Lords with personal experience of family members who struggle with eating disorders, and how difficult these people would find it to self-regulate the content they were looking at.

In Committee, I proposed an amendment about “toggle on”. Anyone listening to this debate outside who does not know what we are talking about will think we have gone mad, talking about toggle on and toggle off, but I proposed an amendment for toggle on by default. Again, I take the Government’s point, and I know my noble friend has put a lot of work into this, with Ministers and others, in trying to come up with a sensible compromise.

I draw attention to Amendment 55. I wonder if my noble friend the Minister is able to say anything about whether users will be able to have specific empowerment in relation to specific types of content, where they are perhaps more vulnerable if they see it. For example, the needs of a user might be quite different between those relating to self-harm and those relating to eating disorder content or other types of content that we would deem harmful.

On Amendment 182, my noble friend leapt immediately to abusive content coming from unverified users, but, as we have heard, and as I know, having led the House’s inquiry into fraud and digital fraud last year, there will be, and already is, a prevalence of scams. The Bill is cracking down on fraudulent advertisements but, as an anti-fraud measure, being able to see whether an account has been verified would be extremely useful. The view now is that, if this Bill is successful—and we hope it is—in cracking down on fraudulent advertising, then there will be even more reliance on what is called organic reach, which is the use of fake accounts, where verification therefore becomes more important. We have heard from opinion polling that the public want to see which accounts are or are not verified. We have also heard that Amendment 182 is about giving users choice, in making clear whether their accounts are verified; it is not about compelling people to say whether they are verified or not.

As we have heard, this is a direction of travel. I understand that the Government will not want to accept these amendments at this stage, but it is useful to have this debate to see where we are going and what Ofcom will be looking at in relation to these matters. I look forward to hearing what my noble friend the Minister has to say about these amendments.

Lord Allan of Hallam (LD)

My Lords, I speak to Amendment 53, on the assessment duties, and Amendment 60, on requiring services to provide a choice screen. It is the first time we have seen these developments. We are in something of a see-saw process over legal but harmful. I agree with my noble friend Lord Clement-Jones when he says he regrets that it is no longer in the Bill, although that may not be a consistent view everywhere. We have been see-sawing backwards and forwards, and now, like the Schrödinger’s cat of legal but harmful, it is both dead and alive at the same time. The amendments that we are dealing with today make it a little more alive than it was previously.

In this latest incarnation, we will insist that category 1 services carry out an assessment of how they will comply with their user-empowerment responsibility. Certainly, this part seems reasonable to me, given that it is limited to category 1 providers, which we assume will have significant resources. Crucially, that will depend on the categorisations—so we are back to our previous debate. If we imagine category 1 being the Meta services and Twitter, et cetera, that is one thing, but if we are going to move others into category 1 who would really struggle to do a user empowerment tool assessment—I have to use the right words; it is not a risk assessment—then it is a different debate. Assuming that we are sticking to those major services, asking them to do an assessment seems reasonable. From working on the inside, I know that even if it were not formalised in the Bill, they would end up having to do it as part of their compliance responsibilities. As part of the Clause 8 illegal content risk assessment, they would inevitably end up doing that.

That is because the categories of content that we are talking about in Clauses 12(10) to (12) are all types of content that might sometimes be illegal and sometimes not illegal. Therefore, if you were doing an illegal content risk assessment, you would have to look at it, and you would end up looking at types of content and putting them into three buckets. The first bucket is that it is likely illegal in the UK, and we know what we have to do there under the terms of the Bill. The second is that it is likely to be against your terms of service, in which case you would deal with it there. The third is that it is neither against your terms of service nor against UK law, and you would make a choice about that.

I want to focus on what happens once you have done the risk assessment and you have to have the choice screen. I particularly want to focus on services where all the content in Clause 12 is already against their terms of service, so there is no gap. The whole point of this discussion about legal but harmful is imagining that there is going to be a mixed economy of services and, in that mixed economy, there will be different standards. Some will wish to allow the content listed in Clause 12—self-harm-type content, eating disorder content and various forms of sub-criminal hate speech. Some will choose to do that—that is going to be their choice—and they will have to provide the user empowerment tools and options. I believe that many category 1 providers will not want to; they will just want to prohibit all that stuff under their terms of service and, in that case, offering a choice is meaningless. That will not make the noble Lord, Lord Moylan, or the noble Baroness, Lady Fox, very happy, but that is the reality.

Most services will just say that they do not want that stuff on their platform. In those cases, I hope that what we are going to say is that, in their terms of service, when a user joins a service, they can say that they have banned all that stuff anyway, so they are not going to give the user a user empowerment tool and, if the user sees that stuff, they should just report it and it will be taken down under the terms of service. Throughout this debate I have said, “No more cookie banners, please”. I hope that we are not going to require people, in order for them to comply with this law, to offer a screen that people then click through. It is completely meaningless and ineffective. For those services that have chosen under their terms of service to restrict all the content in Clause 12, I hope that we will be saying that their version of the user empowerment tool is not to make people click anything but to provide education and information and tell them where they can report the content and have it taken down.

Then there are those who will choose to permit that content and allow it on their service. I agree with the noble Lord, Lord Moylan, that this is, in some sense, Twitter-focused or Twitter-driven legislation, because Twitter tends to be more in the freedom of speech camp and to allow hate speech and some of that stuff. It will be more permissive than Facebook or Instagram in its terms, and it may choose to maintain that content and it will have to offer that screen. That is fine, but we should not be making services do so when they have already prohibited such content.

The noble Lord, Lord Moylan, mentioned services that use community moderators to moderate part of the service and how this would apply there. Reddit is the obvious example, but there are others. If you are going to have user empowerment—and Reddit is more at the freedom of expression end of things—then if there are some subreddits, or spaces within Reddit that allow hate speech or the kind of speech that is in Clause 12, it would be rational to say that user empowerment in the context of Reddit is to be told that you can join these subreddits and you are fine or you can join those subreddits and you are allowing yourself to be exposed to this kind of content. What would not make sense would be for Reddit to do it individual content item by content item. When we are thinking about this, I hope that the implementation would say that, for a service with community-moderated spaces, and subspaces within the larger community, user empowerment means choosing which subspaces you enter, and you would be given information about them. Reddit would say to the moderators of the subreddits, “You need to tell us whether you have any Clause 12-type content”—I shall keep using that language—“and, if you are allowing it, you need to make sure that you are restricted”. But we should not expect Reddit to restrict every individual content item.

Finally, as a general note of caution, noble Lords may have detected that I am not entirely convinced that these will be hugely beneficial tools, perhaps other than for a small subset of Twitter users, for whom they are useful. There is an issue around particular kinds of content on Twitter, and particular Twitter users, including people in prominent positions in public life, for whom these tools make sense. For a lot of other people, they will not be particularly meaningful. I hope that we are going to keep focused on outcomes and not waste effort on things that are not effective.

As I say, many companies, when they are faced with this, will look at it and say, “I have limited engineering time. I could build all these user empowerment tools or I could just ban the Clause 12 stuff in my terms of service”. That would not be a great outcome for freedom of expression; it might be a good outcome for the people who wanted to prohibit legal but harmful content in the first place. Companies will make that choice as a hard-headed business decision. It is much more expensive to try to maintain these different regimes and flag all this content and so on. It is simpler to have one set of standards.

18:45
I think most services will just adopt the Clause 12 content restrictions into their terms of service and have done with it. I do not think we want to create a perverse situation where we say you must allow some of this content in order to have a tool to block it. I certainly had the experience at Facebook where people were saying to me, “Why does Facebook not have safe search to prevent nudity?” I would say, “Our terms ban nudity”, and then they would say, “But you need safe search”. I would say, “It is banned. You are not supposed to have it. Why would I have a tool to block something that should not be there in the first place?” I hope we are not going to go down that path and create those perverse incentives.
This is crucially about deploying resources, so I hope that, if we are going ahead with the user empowerment tools, we will assess them and be ruthless about deploying resources where they work best. I do not think anyone is going to cry for category 1 companies. They have plenty of resources; they can build stuff. But the pool of engineers is not infinite and, if we are asking them to spend their time on user empowerment tools that very few people use and that are not producing huge safety benefits, frankly, I would rather they take those engineers and put them on something else, such as scanning algorithms which can pick up the priority content for children.
I hope we keep all that in mind as we do this. We are going to build the user empowerment tools. It is a logical response once we decided to take legal but harmful out, but I think we should approach it with a note of caution and not assume it is necessarily going to be a fix everywhere and in the same way on all platforms. For some platforms, it might be quite meaningless; for others, potentially, it is something people will want to use.
Baroness Fox of Buckley (Non-Afl)

My Lords, I am happy to acknowledge and recognise what the Government did when they created user empowerment duties to replace legal but harmful. I think they were trying to counter the dangers of over-paternalism and illiberalism in obliging providers to protect adult users from content that would allegedly cause them harm.

At least the new provisions brought into the Bill have a completely different philosophy. They enhance users’ freedom as individuals and allow them to apply voluntary content filters, giving them freedom of choice on the principle that adults can make decisions for themselves.

In case anyone panics, I am not making a philosophical speech. I am reminding the Government that that is what they said to us—to everybody—“We are getting rid of legal but harmful because we believe in this principle”. I am worried that some of the amendments seem to be trying to backtrack from that different basis of the Bill—and that more liberal philosophy—to go back to the old legal but harmful. I say to the noble Lord, Lord Allan of Hallam, that the cat is distinctly not dead.

The purpose of Amendment 56 is to try to ensure that providers also cannot thwart the purpose of Clause 12 and make it more censorious and paternalistic. I am not convinced that the Government needed to compromise on this as I think Amendment 60 just muddies the waters and fudges the important principle that the Government themselves originally established.

Amendment 56 says that the default must be no filtering at all. Then users have to make an active decision to switch on the filtering. The default is that you should be exposed to a full flow of ideas and, if you do not want that, you have to actively decide not to and say that you want a bowdlerised or sanitised version.

Amendment 56 takes it a bit further, in paragraph (b), and applies different levels of filtering in terms of content of democratic importance and journalistic content. In the Bill itself, the Government accept the exceptional nature of those categories of content, and this just allows users to be able to do the same and say, “No; I might want to filter some things out but bear in mind the exceptional importance of democratic and journalistic content”. I worry that the government amendments signal to users that certain ideas are dangerous and must be hidden. That is my big concern. In other words, they might be legal but they are harmful: that is what I think these amendments try to counter.

One of the things that worries me about the Bill is the danger of echo chambers. I know we are concentrating on harms, but I think echo chambers are harmful. I started today quite early at Blue Orchid at 55 Broadway with a big crowd of sixth formers involved in Debating Matters. I complimented Keir Starmer on his speech on the importance of oracy and encouraging young people to speak. I stressed to all the year 12 and year 13 young people that the important thing was that they spoke out but also that they listened to contrary opinions and got out of their safe spaces and echo chambers. They were debating very difficult topics such as commercial surrogacy, cancel culture and the risks of contact sports. I am saying all that to them and then I am thinking, “We have now got a piece of legislation that says you can filter out all the stuff you do not want to hear and create your own safe space”. So I just get anxious that we do not inadvertently encourage in the young—I know this is for all adults—that antidemocratic tendency to not want to hear what you do not want to hear, even when it would be good to hear as many opinions as possible.

I also want to press the Minister on the problem of filtering material that targets race, religion, sex, sexual orientation, disability and gender reassignment. I keep trying to raise the problem that it could lead to diverse philosophical views around those subjects also being removed by overzealous filtering. You might think that you know what you are asking to be filtered out. If you say you want to filter out material that is anti-religion, you might not mean that you do not want any debates on religious tolerance. For example, there was that major controversy over the film “The Lady of Heaven”. I know the Minister was interested, as I was, in the dangers of censorship in relation to that. You would not want, because you said, “Don’t target me for my religion”, to find that you could not access that debate.

I think there is a danger that we are handing a lot of power to filterers to make filtering decisions based on their values when we are not clear about what those values are. Look at what has happened with the banks in the last few days: they have closed down people’s bank accounts because they disagree with those people’s values. Again, we say “Don’t target on race”, but I have been having lots of arguments with people recently who have accused the Government, through their Illegal Migration Bill, of being racist. I think we just need to know that we are not accepting an ideological filtering of what we see.

Amendment 63 is key because it requires providers’ terms of service to include provisions about how content to which Clause 12(2) applies is identified, precisely to try to counter these problems. It imposes a duty on providers to apply those provisions consistently, as the noble Lord, Lord Moylan, explained. The point that providers have to set out how they identify content that is allegedly hostile, for example, to religion, or racially abusive, is important because this is about empowering users. Users need to know whether this will be done by machine learning or by a human. Do they look for red flags and, if so, what are the red flags? How are these things decided? That means that providers have to state clearly and be accountable for their definition of any criteria that could justify them filtering out and disturbing the flow of democratic information. It is all about transparency and accountability in that sense.

Finally, in relation to Amendment 183, I am worried about the notion of filtering out content from unverified users for a range of reasons. It indicates somehow that there is a direct link between being unverified or anonymous and harm or being dodgy, which I think is illegitimate. It has already been explained that there will be a detrimental impact on certain organisations—we have talked about Reddit, but I like to remember Mumsnet. There are quite a lot of organisations with community-centred models, where the structure is that influencers broadcast to their followers and where there are pseudonymous users. Is the requirement to filter out those contributors likely to lead to those models collapsing? I need to be reassured on this because I am not convinced at all. As has been pointed out, there will be a two-tier internet because those who are unable or unwilling to disclose their identity online or to be verified by someone would be or could be shut out from public discussions. That is a very dangerous place to have ended up, even though I am sure it is not what the Government intend.

Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful for the broad, if not universal, support for the amendments that we have brought forward following the points raised in Committee. I apologise for anticipating noble Lords’ arguments, but I am happy to expand on my remarks in light of what they have said.

My noble friend Lord Moylan raised the question of non-verified user duties and crowdsourced platforms. The Government recognise concerns about how the non-verified user duties will work with different functionalities and platforms, and we have engaged extensively on this issue. These duties are applicable only to category 1 platforms, those with the largest reach and influence over public discourse. It is therefore right that such platforms have additional duties to empower their adult users. We anticipate that these features will be used in circumstances where vulnerable adults wish to shield themselves from anonymous abuse. If users decide that these features are restricting their experience on a particular platform, they can simply choose not to use them. In addition, before these duties come into force, Ofcom will be required to consult affected providers regarding the codes of practice, at which point it will consider how these duties might interact with various functionalities.

My noble friend and the noble Lord, Lord Allan of Hallam, raised the potential for being bombarded with pop-ups because of the forced-choice approach that we have taken. These amendments have been carefully drafted to minimise unnecessary prompts or pop-ups. That is why we have specified that the requirement to proactively ask users how they want these tools to be applied is applicable only to registered users. This approach ensures that users will be prompted to make a decision only once, unless they choose to ignore it. After a decision has been made, the provider should save this preference and the user should not be prompted to make the choice again.

The noble Lord, Lord Clement-Jones, talked further about his amendments on the cost of user empowerment tools as a core safety duty in the Bill. Category 1 providers will not be able to put the user empowerment tools in Clause 12 behind a pay wall and still be compliant with their duties. That is because they will need to offer them to users at the first possible opportunity, which they will be unable to do if they are behind a pay wall. The wording of Clause 12(2) makes it clear that providers have a duty to include user empowerment features that an adult user may use or apply.

Lord Allan of Hallam (LD)

The Minister may not have the information today, but I would be happy to get it in writing. Can he clarify exactly what will be expected of a service that already prohibits all the Clause 12 bad stuff in its terms of service?

Lord Parkinson of Whitley Bay

I will happily write to the noble Lord on that.

Clause 12(4) further sets out that all such user empowerment content tools must be made available to all adult users and be easy to access.

The noble Lord, Lord Clement-Jones, on behalf of the noble Baroness, Lady Finlay, talked about people who will seek out suicide, self-harm or eating-disorder content. While the Bill will not prevent adults from seeking out legal content, it will introduce significant protections for adults from some of the most harmful content. The duties relating to category 1 services’ terms of service are expected hugely to improve companies’ own policing of their sites. Where this content is legal and in breach of the company’s terms of service, the Bill will force the company to take it down.

We are going even further by introducing a new user empowerment content-assessment duty. This will mean that where content relates to eating disorders, for instance, but is not illegal, category 1 providers need fully to assess the incidence of this content on their service. They will need clearly to publish this information in accessible terms of service, so users will be able to find out what they can expect on a particular service. Alternatively, if they choose to allow suicide, self-harm or eating disorder content which falls into the definition set out in Clause 12, they will need proactively to ask users how they would like the user empowerment content features to be applied.

My noble friend Lady Morgan was right to raise the impact on vulnerable people or people with disabilities. While we anticipate that the changes we have made will benefit all adult users, we expect them particularly to benefit those who may otherwise have found it difficult to find and use the user empowerment content features independently—for instance, some users with certain types of disability. That is because the onus will now be on category 1 providers proactively to ask their registered adult users whether they would like these tools to be applied at the first possible opportunity. The requirement also remains to ensure that the tools are easy to access and to set out clearly what tools are on offer and how users can take advantage of them.

19:00
On the granularity of choice for different tools, as pressed by the noble Lord, Lord Clement-Jones, the forced choice user empowerment amendment has been drafted in such a way as to ensure that, should platforms offer users a range of tools to comply with their duties, users will get a choice about each tool that they offer. For instance, if a provider offers users one tool that will reduce the likelihood that they see certain categories of content and another that alerts them to the nature of it, they will get separate choices about whether they want these tools to be applied. This will ensure that users have even more control over their experience online. A blanket on/off choice for all user empowerment features is unlikely to be consistent with the duty on category 1 services to have particular regard to the importance of protecting users’ freedom of expression when putting in place these features, which can be found in Clause 18. Additionally, duties under the Human Rights Act 1998 and the requirement to consult experts on freedom of expression mean that the measures Ofcom will recommend in its codes of practice must consider the impact on freedom of expression, so they are unlikely to take a blanket approach.
I hope all that goes some way towards reassuring the noble Baroness, Lady Fox, that freedom of expression is baked into all these amendments. Many of the questions she raises come down to the choice of users. We are not forcing people by having a default on or off. We are encouraging all users to make a decision about what material they see as adults on the internet within the law. If, like her and like me, they want to continue to see that, they should continue to keep their settings broad and know that they will encounter things with which they may disagree or that may offend them.
Lord Clement-Jones (LD)

My Lords, does the Minister have any more to say on identity verification?

Lord Parkinson of Whitley Bay (Con)

I am being encouraged to be brief so, if I may, I will write to the noble Lord on that point.

Amendment 32 agreed.
Amendment 33
Moved by
33: Clause 6, page 5, line 37, leave out “duty about record-keeping set out in section 19(9)” and insert “duties about record-keeping set out in section 19(8A) and (9)”
Member’s explanatory statement
This amendment ensures that the new duties in Clause 19 proposed by amendments in my name to that clause are imposed on providers of Category 1 services.
Amendment 33 agreed.
Clause 10: Children’s risk assessment duties
Amendment 34
Moved by
34: Clause 10, page 9, line 13, after “8” insert “and, in the case of services likely to be accessed by children which are Category 1 services, the duties about assessments set out in section (Assessment duties: user empowerment)”
Member’s explanatory statement
This amendment inserts a signpost to the new duties imposed on providers of Category 1 services by the new Clause proposed after Clause 11 in my name.
Lord Parkinson of Whitley Bay (Con)

My Lords, I will speak to the government amendments now but not anticipate the non-government amendments in this group.

As noble Lords know, protecting children is a key priority for this Bill. We have listened to concerns raised across your Lordships’ House about ensuring that it includes the most robust protections for children, particularly from harmful content such as pornography. We also recognise the strength of feeling about ensuring the effective use of age-assurance measures, by which we mean age verification and age estimation, given the important role they will have in keeping children safe online.

I thank the noble Baroness, Lady Kidron, and my noble friends Lady Harding of Winscombe and Lord Bethell in particular for their continued collaboration over the past few months on these issues. I am very glad to have tabled a significant package of amendments on age assurance. These are designed to ensure that children are prevented from accessing pornography, whether it is published by providers in scope of the Part 5 duties or allowed by user-to-user services that are subject to Part 3 duties. The Bill will be explicit that services will need to use highly effective age verification or age estimation to meet these new duties.

These amendments will also ensure that there is a clear, privacy-preserving and future-proof framework governing the use of age assurance, which will be overseen by Ofcom. Our amendments will, for the first time, explicitly require relevant providers to use age verification or age estimation to protect children from pornography. Publishers of pornographic content, which are regulated in Part 5, will need to use age verification or age estimation to ensure that children are not normally able to encounter content which is regulated provider pornographic content on their service.

Further amendments will ensure that, where such tools are proactive technology, Ofcom may also require their use for Part 5 providers to ensure compliance. Amendments 279 and 280 make further definitional changes to proactive technology to ensure that it can be recommended or required for this purpose. To ensure parity across all regulated pornographic content in the Bill, user-to-user providers which allow pornography under their terms of service will also need to use age verification or age estimation to prevent children encountering pornography where they identify such content on their service. Providers covered by the new duties will also need to ensure that their use of these measures meets a clear, objective and high bar for effectiveness. They will need to be highly effective at correctly determining whether a particular user is a child. This new bar will achieve the intended outcome behind the amendments which we looked at in Committee, seeking to introduce a standard of “beyond reasonable doubt” for age assurance for pornography, while avoiding the risk of legal challenge or inadvertent loopholes.

To ensure that providers are using measures which meet this new bar, the amendments will also require Ofcom to set out, in its guidance for Part 5 providers, examples of age-verification and age-estimation measures which are highly effective in determining whether a particular user is a child. Similarly, in codes of practice for Part 3 providers, Ofcom will need to recommend age-verification or age-estimation measures which can be used to meet the new duty to use highly effective age assurance. This will meet the intent of amendments tabled in Committee seeking to require providers to use measures in a manner approved by Ofcom.

I confirm that the new requirement for Part 3 providers will apply to all categories of primary priority content that is harmful to children, not just pornography. This will mean that providers which allow content promoting or glorifying suicide, self-harm and eating disorders will also be required to use age verification or age estimation to protect children where they identify such content on their service.

Further amendments clarify that a provider can conclude that children cannot access a service—and therefore that the service is not subject to the relevant children’s safety duty—only if it uses age verification or age estimation to ensure that children are not normally able to access the service. This will ensure consistency with the new duties on Part 3 providers to use these measures to prevent children’s access to primary priority content. Amendment 34 inserts a reference to the new user empowerment duties imposed on category 1 providers in the child safety duties.

Amendment 214 will require Part 5 providers to publish a publicly available summary of the age-verification or age-estimation measures that they are using to ensure that children are not normally able to encounter content that is regulated provider pornographic content on their service. This will increase transparency for users on the measures that providers are using to protect children. It also aligns the duties on Part 5 providers with the existing duties on Part 3 providers to include clear information in terms of service on child protection measures or, for search engines, a publicly available statement on such measures.

I thank the noble Baroness, Lady Kidron, for her tireless work relating to Amendment 124, which sets out a list of age-assurance principles. This amendment clearly sets out the important considerations around the use of age-assurance technologies, which Ofcom must have regard to when producing its codes of practice. Amendment 216 sets out the subset of principles which apply to Part 5 guidance. Together, these amendments ensure that providers are deploying age-assurance technologies in an appropriate manner. These principles appear as a full list in Schedule 4. This ensures that the principles can be found together in one place in the Bill. The wider duties set out in the Bill ensure that the same high standards apply to both Part 3 and Part 5 providers. These principles have been carefully drafted to avoid restating existing duties in the Bill. In accordance with good legislative drafting practice, the principles also do not include reference to other legislation which already directly applies to providers. In its relevant guidance and codes, however, Ofcom may include such references as it deems appropriate.

Finally, I highlight the critical importance of ensuring that users’ privacy is protected throughout the age-assurance processes. I make it clear that privacy has been represented in these principles to the furthest degree possible, by referring to the strong safeguards for user privacy already set out in the Bill.

In recognition of these new principles and to avoid duplication, Amendment 127 requires Ofcom to refer to the age-assurance principles, rather than to the proactive technology principles, when recommending age-assurance technologies that are also proactive technology.

We have listened to the points raised by noble Lords about the importance of having clear and robust definitions in the Bill for age assurance, age verification and age estimation. Amendment 277 brings forward those definitions. We have also made it clear that self-declared age, without additional, more robust measures, is not to be regarded as age verification or age estimation for compliance with duties set out in the Bill. Amendment 278 aligns the definition of proactive technology with these new definitions.

The Government are clear that the Bill’s protections must be implemented as quickly as is feasible. This entails a complex programme of work for the Government and Ofcom, as well as robust parliamentary scrutiny of many parts of the regime. All of this will take time to deliver. It is right, however, that we set clear expectations for when the most pressing parts of the regulation—those targeting illegal content and protecting children—should be in place. These amendments create an 18-month statutory deadline from the day the Bill is passed for Ofcom’s implementation of those areas. By this point, Ofcom must submit draft codes of practice to the Secretary of State to be laid in Parliament and publish its final guidance relating to illegal content duties, duties about content harmful to children and duties about pornography content in Part 5. This also includes relevant cross-cutting duties, such as content reporting procedures, which are relevant to illegal content and content harmful to children.

In line with convention, most of the Bill’s substantive provisions will be commenced two months after Royal Assent. These amendments ensure that a set of specific clauses will commence earlier—on the day of Royal Assent—allowing Ofcom to begin vital implementation work sooner than it otherwise would have done. Commencing these clauses early will enable Ofcom to launch its consultation on draft codes of practice for illegal content duties shortly after Royal Assent.

Amendment 271 introduces a new duty on Ofcom to produce and publish a report on in-scope providers’ use of age-assurance technologies, and for this to be done within 18 months of the first date on which both Clauses 11 and 72(2), on pornography duties, are in force. I thank the noble Lord, Lord Allan of Hallam, for the amendment he proposed in Committee, to which this amendment responds. We believe that this amendment will improve transparency in how age-assurance solutions are being deployed by providers, and the effectiveness of those solutions.

Finally, we are also making a number of consequential and technical amendments to the Bill to split Clauses 11 and 25 into two parts. This is to ensure these do not become unwieldy and that the duties are clear for providers and for Ofcom. I beg to move.

Debate on Amendment 34 adjourned.
Consideration on Report adjourned.
House adjourned at 7.12 pm.

Online Safety Bill

Report (2nd Day)
15:56
Relevant documents: 28th and 38th Reports from the Delegated Powers Committee, 15th Report from the Constitution Committee. Scottish and Welsh Legislative Consent granted.
Clause 10: Children’s risk assessment duties
Debate on Amendment 34 resumed.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

We began this group on the previous day on Report, and I concluded my remarks, so it is now for other noble Lords to contribute on the amendments that I spoke to on Thursday.

Lord Bethell (Con)

My Lords, I rise emphatically to welcome the government amendments in this group. They are a thoughtful and fulsome answer to the serious concerns expressed from the four corners of the Chamber by a great many noble Lords at Second Reading and in Committee about the treatment of age verification for pornography and online harms. For this, I express my profound thanks to my noble friend the Minister, the Secretary of State, the Bill team, the Ofcom officials and all those who have worked so hard to refine this important Bill. This is a moment when the legislative team has clearly listened and done everything it possibly can to close the gap. It is very much the House of Lords at its best.

It is worth mentioning the exceptionally broad alliance of noble Lords who have worked so hard on this issue, particularly my compadres, my noble friend Lady Harding, the noble Baroness, Lady Kidron, and the right reverend Prelate the Bishop of Oxford, who all signed many of the draft amendments. There are the Front-Benchers, including the noble Lords, Lord Stevenson, Lord Knight, Lord Clement-Jones and Lord Allan of Hallam, and the noble Baroness, Lady Merron. There are the Back-Benchers behind me, including my noble friends Lady Jenkin and Lord Farmer, the noble Lords, Lord Morrow, Lord Browne and Lord Dodds, and the noble Baroness, Lady Foster. Of those in front of me, there are the noble Baronesses, Lady Benjamin and Lady Ritchie, and there is also a number too large for me to mention, from all across the House.

I very much welcome the sense of pragmatism and proportionality at the heart of the Online Safety Bill. I welcome the central use of risk assessment as a vital tool for policy implementation and the recognition that some harms are worse than others, that some children need more protection than others, that we are legislating for future technologies that we do not know much about and that we must engage industry to achieve effective implementation. As a veteran of the Communications Act 2003, I strongly support the need for enabling legislation that has agility and a broad amount of support to stand the test of time.

16:00
However, there is also a time when it is essential that we are absolute about things and that we say that there are some contents and functions that children should never encounter anywhere on the internet. If we want to be taken seriously as a Parliament, and if we want to bring about meaningful behavioural change, we need to tether the regulation of the internet to at least some certainties, and that is why this package of government amendments is so very welcome. We need to make it clear to anyone doing business on the internet that, sometimes, there are no loopholes, no mitigations, no legal cop-outs, no consultations and no sliding scales. Sometimes Parliament makes choices and decides that some things are just plain wrong and beyond the pale.
It was the moral relativism and misguided sense of proportionality and consultative handling, when it came to the age verification measures for damaging and violent pornography, that so alarmed so many people about the draft Bill. It is very much the clarity of the newly introduced government amendments that make them so powerful. They make it crystal clear that no child should ever see any pornography anywhere on the internet. That is a massive relief to those who have campaigned so hard and for so long for final age verification for pornography and priority harms. On this, I am particularly thankful to the noble Baroness, Lady Benjamin, who I know is seething with frustration that she is not speaking, and to my noble friend Lord Farmer. By introducing clear, concrete and definitive measures, this government package provides a tether that anchors the Bill to some certainty.
The government amendments are an essential step to ending the corrosive sense of exceptionalism that has hung around the regulation of online spaces for too long. We will no longer consult with industry about what it might or might not be expected to do to protect children, as was first intended. Instead, we are defining a high bar and applying it to all pornographic content and priority harms, wherever they are on the internet, the metaverse or any future technology. We are legislating for all online businesses, wherever they are, and whether they are big, small, mobile, meta or whatever.
Thank goodness that our Government have recognised that we have reached an inflection point in the history of the online world, where access to internet content and functions is in the pockets and bedrooms of our children. We should no longer victim-blame our children by calling for more education; we cannot scapegoat parents by setting implausible expectations about how families can police or manage their children’s time on devices. Instead, with these amendments, the Government have recognised that we need internet companies to take responsibility for what is on their platforms; there will be no more dodging or obfuscation. If you have horrible, violent porn content on your service, you need a system to keep children away—full stop; no haggling.
This is a profound challenge to Twitter, Instagram, Snapchat, TikTok, WhatsApp, Reddit, Facebook and all the services that the Children’s Commissioner rightly identified as gateways to pornography: businesses that have, for too long, happily recruited kids to their algorithms with porn and have taken advertisers’ money for clicks from kids watching porn, regardless of the consequences to society. This is also a challenge to Pornhub and all the other professional pornographers that have benefited from the constructive ambiguity of the last 30 years and will now be called to account.
This is a huge victory. My noble friend Lord Grade thoughtfully and movingly told us that he would judge Ofcom’s mission to be a success if platforms finally took responsibility for what was on their services, and I take that very seriously. The government amendments in this group provide a powerful illustration of that principle and a tool for bringing it about. Effective enforcement by Ofcom is essential for giving tech bosses, who are too often happy to pay the fines and move on, a certain clarity of mind. I welcome the government amendments on senior management liability, which introduce the threat of prison to those who egregiously breach Ofcom’s standards. However, as the House knows, several enforcement measures are yet to reach us, and I flag in advance to my noble friend the Minister that these are considered critical to the success of the new regime by several noble Lords.
The government amendments are a massive step towards applying the common-sense principle that the rules about what is illegal and inappropriate for children in the real world should be applied equally to the online world, with equal vigour and equal scope. For that reason, I very much welcome the announcement of a porn review to investigate gaps in UK regulation that allow exploitation or abuse to occur online, in clear breach of long-standing criminal and civil laws, and to identify barriers to enforcing criminal law.
This is a knotty issue that involves several cross-departmental dependencies, including the allocation of resources by the Home Office, the writing of proper guidelines by the Ministry of Justice, the prioritisation of prosecutions by the CPS, and the building of a relevant skills base in our police forces. I therefore ask the Minister for guidance on the timetable, the terms of reference and the appointment of a chair for this review. I would also ask that a wide range of voices are heard and prioritised for this review.
The Government deserve considerable praise for their bold steps: not just to protect children from the harms of pornography, and to put Britain at the forefront of the global response to online safety, but also to nurture a benign environment for our critically important tech sector. These government amendments will create legislative certainty: an essential foundation for innovation. They will help rehabilitate the reputation of a tech sector reeling from the excesses of bad actors and the misplaced moral relativism of this young, exciting and vibrant industry. That will have benign consequences for investment and recruitment.
With a final word of optimism, I ask my noble friend the Minister what work will be done to bring alignment with other jurisdictions and to promote Britain as a well-regulated destination for investment, much as we do for life sciences.
Baroness Ritchie of Downpatrick (Lab)

My Lords, I reiterate what the noble Lord, Lord Bethell, has said and thank him for our various discussions between Committee and Report, particularly on this set of amendments to do with age verification. I also welcome the Government’s responsiveness to the concerns raised in Committee. I welcome these amendments, which are a step forward.

In Committee, I was arguing that there should be a level playing field for regulating any online platform with pornographic content, whether it falls under Part 3 or Part 5 of the Bill. I welcome the Government’s significant changes to Clauses 11 and 72 to ensure that robust age verification or estimation must be used and that standards are consistent across the Bill.

I have a few minor concerns that I wish to highlight. I am thoughtful about whether Clause 25 requires enough of search services in preventing young people from accessing pornography. I recognise that the Government believe they have satisfied the need. They may have done enough in the short term, but there is a real concern that this clause is not sufficiently future-proofed. Of course, only time will tell. Maybe the Minister could advise us further in that particular regard.

In Committee, I also argued that the duties in respect of pornography in Parts 3 and 5 must come into effect at the same time. I welcome the government commitment to placing a timeframe for the codes of practice and guidance on the face of the Bill through amendments including Amendment 230. I hope that the Minister will reassure us today that it is the Government’s intention that the duties in Clauses 11 and 72 will come into effect at the same time. Subsection (3) of the new clause proposed in Amendment 271 specifically states that the duties could come into effect at different times, which leaves a loophole for pornography to be regulated differently, even if only for a short time, between Part 3 and Part 5 services. This would be extremely regrettable.

I would also like to reiterate what I said last Thursday, in case the Minister missed my contribution when he intervened on me. I say once again that I commend the Minister for the announcement of the review of the regulation, legislation and enforcement of pornography offences, which I think was this time last week. I once again ask the Minister: will he set out a timetable for publishing the terms of reference and details of how this review will take place? If he cannot set out that timetable today, will he write to your Lordships setting out the timetable before the Recess, and ensure a copy is placed in the Library?

Finally, all of us across the House have benefited from the expertise of expert organisations as we have considered this Bill. I repeat my request to the Minister that he consider establishing an external reference group to support the review, consisting of those NGOs with particular and dedicated expertise. Such groups would have much to add to the process—they have much learning and advice, and there is much assistance there to the Government in that regard.

Once again, I thank the Minister for listening and responding. I look forward to seeing the protections for children set out in these amendments implemented. I shall watch implementation very closely, and I trust and hope that the regulator will take robust action once the codes of practice and guidance are published. Children above all will benefit from a safer internet.

Baroness Kidron (CB)

My Lords, I welcome the government amendments in this group, which set out the important role that age assurance will play in the online safety regime. I particularly welcome Amendment 210, which states that companies must employ systems that are “highly effective” at correctly determining whether a particular user is a child to prevent access to pornography, and Amendment 124, which sets out in a code of practice principles which must be followed when implementing age assurance—principles that ensure alignment of standards and protections with the ICO’s age appropriate design code and include, among other things, that age assurance systems should be easy to use, proportionate to the risk and easy to understand, including to those with protected characteristics, as well as aiming to be interoperable. The code is a first step from current practice, in which age verification is opaque, used to further profile children and related adults and highly ineffective, to a world in which children are offered age-appropriate services by design and default.

I pay tribute again to the noble Lord, Lord Bethell, and the noble Baroness, Lady Benjamin, and I associate myself with the broad set of thanks that the noble Lord, Lord Bethell, gave in his opening speech. I also thank colleagues across your Lordships’ House and the other place for supporting this cause with such clarity of purpose. On this matter, I believe that the UK is world-beating, and it will be a testament to all those involved to see the UK’s age verification and estimation laws built on a foundation of transparency and trust so that those impacted feel confident in using them—and we ensure their role in delivering the online world that children and young people deserve.

I have a number of specific questions about government Amendment 38 and Amendment 39. I would be grateful if the Minister were able to answer them from the Dispatch Box and in doing so give a clear sign of the Government’s intent. I will also speak briefly to Amendments 125 and 217 in my name and those of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, as well as Amendment 184 in the names of the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan. All three amendments address privacy.

Government Amendment 38, to which I have added my name, offers exemptions in new subsections (3A) and (3B) that mean that a regulated company need not use age verification or estimation to prevent access to primary priority content if it already prevents such access by means of its terms of service. First, I ask the Minister to confirm that these exemptions apply only if a service effectively upholds its terms of service on a routine basis, and that failure to do so would trigger enforcement action and/or an instruction from Ofcom to apply age assurance.

16:15
Secondly, will the Minister further explain how these exemptions impact on the child safety duties for priority content, which is not prohibited but must be age appropriate, or how they might account for aspects of the service that create harm but are not associated directly with the content? In trying to work out the logic, it appeared to me that either a company would have to identify the end user was a child by means of age assurance, but perhaps not the highest bar, or it would have to design a service that was not harmful to children even if that harm was not primary priority content. It would be good to hear what the intention is to make sure there is not—inadvertently, I am sure—a loophole by which companies can fail in their duties by ignoring children on their service because they do not allow primary priority content.
Amendments 49 and 93 would ensure that
“a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it”.
Somewhat related to my previous question, can the noble Lord confirm that a service—for example, a financial service provider that offers adult-facing products, or a gambling site that requires adults to identify themselves—will be taken as having age-verified so that age verification is no longer necessary?
I welcome the 18-month deadline for Ofcom to produce guidance on child access assessments, but can the Minister confirm that the timeframe is a backstop and the Government’s ambition is to get age assurance much quicker? Also, when he responds, will he confirm that Schedule 4 will be published as part of the child safety code, which is why it is not mentioned in Amendment 271? While I enthusiastically welcome the code of practice, it is simply the fact that many adults and children have concerns about privacy and security.
Amendments 125 and 217 address a gap in the Bill by making it clear
“that data collected for age assurance must be stored securely, deleted as soon as possible and not used for other purposes”.
Similarly, Amendment 184, in the name of the noble Lord, Lord Moylan, addresses the privacy issue in a detailed way that may be better suited to Ofcom’s fully fleshed out code of conduct but none the less speaks to the same gap. I do not expect the Minister to accept these amendments as written, and I understand that there is an overarching requirement for privacy in the Bill, but in the public discourse about the online world, safety is always put in binary opposition to privacy. If the Government were to acknowledge in the Bill the seriousness of the need for privacy and security of information relating to age verification and estimation, it would send a clear message that they have understood the validity of the privacy concerns and be an enormous contribution to ending the unhelpful binary. I hope that on this matter the Minister will agree to take these amendments away and include wording to the same effect.
Finally, last week, at the invitation of the right reverend Prelate the Bishop of Gloucester, the Minister and I attended an event at which we were addressed by children about the pressures they felt from social media. I thank all the young people present for the powerful and eloquent way in which they expressed the need for politicians and religious, civic and business leaders to do more to detoxify the digital world. If they are listening, as they said they would, I want to assure them that all of us in this Chamber hear their concerns. Importantly, when I asked Oliver, aged 12, and Arthur, aged 13, what one thing we could and should do to make their online world better, they said, “Make age checking meaningful”. Today, we are doing just that.
Lord Allan of Hallam (LD)

My Lords, I shall follow on directly from some of the comments of the noble Baroness, Lady Kidron, around privacy. I shall not repeat the arguments around children and pornography but touch on something else, which is the impact of these amendments on the vast majority of internet users, the 85%-plus who are 18 or older. Of course, when we introduce age assurance measures, they will affect everyone: we should not kid ourselves that it is only about children, because everyone will have to pass through these gateways.

I shall speak particularly to Amendments 184 and 217 on privacy. I am sure that most adults will support extra safety measures for children, but they also want to be able to access a wide range of online services with the least possible friction and the lowest risk to their own personal data. We can explore how this might work in practice by looking at something that we might all do in this Chamber. Looking round, I believe that we are all at least 18 years old, and we might find ourselves idly passing the time creating an account on a new user-to-user or search service that has been recommended. We should consider this group of amendments in terms of how that might play out. In future, the services will have to check that we are in the United Kingdom—there is a range of ways in which they can do that. Having confirmed that, they will need to understand whether we are 18-plus or a child user so that they can tailor their service appropriately.

I hope we all agree that the services should not be asking us for passports or driving licences, for example, as that would be entirely contrary to the thrust of privacy regulations and would be a huge gateway to fraud and other problems. The most efficient way would be for them to ask us for some sort of digital certificate—a certificate that we have on our devices where we have proven to a trusted third party that we are 18-plus. The certificate does not need to contain any personal data but simply confirms that we are of age. That is very similar to the way in which secure websites work: they send a digital certificate to your browser and you verify that certificate with a trusted third party—a certificate authority—and then you can open an encrypted connection. We are reversing the flow: the service will ask the user for a certificate and then verify that before granting access. A user may have a setting on their device in future where they confirm that they are happy for their 18-plus certificate to be given to anybody, or whether they would like to be asked every time—there will be a new set of privacy controls.
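To make the flow described above concrete, the following is a minimal sketch of how a service might check an over-18 credential presented by a user’s device and verify it against a trusted issuer, learning nothing beyond a yes or no answer. The issuer name, credential format, keys and function names are illustrative assumptions only; they are not taken from the Bill, from Ofcom guidance or from any existing certification scheme, and a real deployment would use an established signed-token standard with public-key verification rather than this toy construction.

```python
import json
import hmac
import hashlib

# Issuers this service trusts. In a real scheme these would be issuer public
# keys and proper signature verification, not a shared HMAC secret.
TRUSTED_ISSUER_KEYS = {
    "example-age-issuer": b"shared-demo-key",
}

def verify_age_credential(credential_json: str) -> bool:
    """Return True only if the credential is a valid 'over 18' attestation."""
    credential = json.loads(credential_json)
    key = TRUSTED_ISSUER_KEYS.get(credential.get("issuer"))
    if key is None:
        return False  # unknown issuer: treat the user as unverified
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, credential.get("signature", "")):
        return False  # attestation is not genuine or has been tampered with
    # The service learns a single boolean, never a name, date of birth or document.
    return credential["claims"].get("over_18") is True

# Example: the device presents a credential when a new account is created.
claims = {"over_18": True}
signature = hmac.new(b"shared-demo-key",
                     json.dumps(claims, sort_keys=True).encode(),
                     hashlib.sha256).hexdigest()
device_credential = json.dumps(
    {"issuer": "example-age-issuer", "claims": claims, "signature": signature})
print(verify_age_credential(device_credential))  # True: serve the adult experience
```

The design point this is meant to illustrate is that the service never handles a passport, driving licence or date of birth; it checks a signed attestation that says only “over 18” and can discard it once the account is created.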

Building the infrastructure for this is non-trivial. Many things could go wrong, but at least the kind of model I am describing has some hope of achieving widespread adoption. It is very good for adult users, as they can continue to have a frictionless experience as long as they are happy for their device to send a certificate to new services. It is good for the market of internet services if new services can bring users on easily. It is good for privacy by avoiding lots of services each collecting personal data, as most people access a multiplicity of services. Perhaps most importantly in terms of the Bill’s objectives, it is good for children if services can separate out the vast majority of their users who are 18-plus and then focus their real efforts on tailoring the services for the minority of users who will be children, in respect of whom the Bill will introduce a whole set of new obligations.

We should not underestimate the scale of the challenge in practice; it will work only if major internet companies are willing to play the game and get into the market of offering 18-plus certificates. Companies such as Google, Meta, Amazon, Apple and Microsoft—the ones we normally love to hate—will potentially provide the solution, as well as not-for-profits. There will be foundations for those who object to the big internet companies, but it is those big internet companies which will have the reach; they each have millions of users in the United Kingdom. This is not to fly the flag for those companies; it is simply a question of efficiency. I suspect that everyone in the Chamber uses a combination of services from those big providers. We already share with them the personal data necessary for age assurance, and there would be no additional sharing of data. If they were willing to provide a certificate, they could do so at the kind of scale necessary for the 50 million or so adult internet users in the United Kingdom to be able to get one easily and then pass it to services when they choose to access them.

There may be some discomfort with big tech playing this role, but I cannot see the kind of aggressive targets that we are setting in the amendments working unless we take advantage of those existing platforms and use them to make this work. Amendment 230 tells us that we have about 18 months, which is very soon in terms of trying to build something. We should be clear that if we are to deliver this package it will depend on persuading some of those big names in tech to create age certification schemes for UK users.

For this to have widespread adoption and a competitive market, we need it to be free of direct financial costs to individual users and to services choosing to age-verify, as we have asked them to do. We need to think very carefully about that, as it raises a whole series of competition questions that I am sure Ofcom and the Competition and Markets Authority will have to address, not least because we will be asking companies to provide, free of charge, age certification that will be used by their existing and future competitors to meet their compliance requirements.

There may be some listening who think that we can rely on small age-assurance start-ups. Some of them have a really important role to play and we should be proud of our homegrown industry, but we should be realistic that they will reach scale only if they work with and through the large service providers. Many of them are already seeking those kinds of relationship.

As a test case, we might think of an application such as Signal, a messaging app that prides itself on being privacy-first. It does not want to collect any additional information from its users, which is perfectly reasonable, given where it is coming from. It will be really interesting to see how comfortable such a service will be with working with certification schemes, under which it can prove that users are over 18 by taking advantage of the data held by other services which collect significant amounts of data and have a very good idea of how old we are.

I have not focused on under-18s but, once this system is in place, application providers will be thinking very carefully about the pros and cons of allowing under-18s on at all. I know that the noble Baroness, Lady Kidron, is also concerned about this. There will be services that will think very carefully, if they find that the vast majority of their users are 18-plus, about the extent to which they want to put time and effort into tailoring them for users under 18. We do not intend that outcome from the Bill, but we need realistically to consider it.

Baroness Kidron (CB)

Just to be clear, I say that the purpose of my question to the Minister was to get at the fact that, for low-risk situations, there can be age assurance that is a lot less effective or intrusive, for that very reason.

Lord Allan of Hallam (LD)

I agree; that is very helpful. I think Amendments 74, 93 and 99 also talk about the exclusion, as the noble Baroness raised, of services from the child safety duties if they can show that they are only 18-plus. It will be quite material and critical to know at what level they can demonstrate that.

I have avoided talking about pornography services directly, but there are interesting questions around what will happen if this model develops, as it likely will. If big tech is now starting to provide age certification for the kinds of mainstream services we may all want to access, they may be much less comfortable providing that same certification to pornography providers, for reputational reasons. A mainstream provider would not want to enter that market. Ofcom will need to take a view on this. We have talked about interoperability in the framework we have created, but it is a big question for Ofcom whether it wants to steer all age certification providers also to provide 18-plus certification for pornography providers or, effectively, to allow two markets to develop—one for mainstream certification and one for certification for pornography.

I have taken a few minutes because this is a very high-risk area for the Bill. There are material risks in willing into existence a model that depends on technical infrastructure that has not yet been built. The noble Lord, Lord Bethell, referred to prior experience; one of the reasons why we have not delivered age assurance before is that the infrastructure was not there. We now want it built, so we must recognise that it is quite a high-risk endeavour. That does not mean it is not worth attempting, but we must recognise the risks and work on them.

If the implementation is poor, it will frustrate adult users, which may bring the Bill into disrepute. We need to recognise that as a genuine risk. There are people out there already saying that the Bill means that every internet service in the world will ask you for your passport. If that is not the case, we need to stress that we do not expect that to happen. There are also potentially significant impacts on the market for online services available to both adults and children in the UK, depending on the design of this system.

The purpose of thinking about some of these risks today is not to create a doom-laden scenario and say that it will not work. It is entirely the opposite—to say that, if we are to move ahead into a world in which children are protected from harmful content, for which very good reasons have been articulated and a huge amount of work has gone ahead, and in which services can tailor and gear access to the age of the child, we have to be able to take the 18-plus out of that, put it into a separate box and do so in a really easy, straightforward manner. If not, the 18-plus will end up dragging down what we want to do for the underage.

I hope that explanation helps in the context of these amendments. We will need to test them against it as implementation happens over the next few months.

16:30
Lord Farmer (Con)

My Lords, I thank the Minister for engaging with the amendment in my name and that of the noble Baroness, Lady Benjamin, in Committee, to ensure parity between the regulation of online and offline pornography. We did not table it for Report because of the welcome news of the Government’s review. At this point, I would like to give my backing to all that my noble friend Lord Bethell said and would like to thank him for his great encouragement and enthusiasm on our long journey, as well as the noble Baroness, Lady Kidron. I would particularly like to mention the noble Baroness, Lady Benjamin, who, as my noble friend Lord Bethell mentioned, must be very frustrated today at not being able to stand up and benefit us with her passion on this subject, which has kept a lot of us going.

I have some questions and comments about the review, but first I want to stand back and state why this review is so necessary. Our society must ask how pornography was able to proliferate so freely, despite all the warnings of the danger and consequences of this happening when the internet was in its infancy. Human appetites, the profit motive and the ideology of cyberlibertarianism flourished freely in a zeitgeist where notions of right and wrong had become deeply unfashionable. Pre-internet, pornography was mainly on top shelves, in poky and rather sordid sex shops, or in specialist cinemas. There was recognition that exposure to intimate sex acts should never be accidental but always the result of very deliberate decisions made by adults—hence the travesty of leaving children exposed to the danger of stumbling across graphic, violent and frequently misogynistic pornography by not bringing Part 3 of the Digital Economy Act 2017 into force.

I have talked previously in this House about sociology professor Christie Davies’ demoralisation of society thesis: what happens when religiously reinforced moralism, with its totemic notion of free will, is ditched along with God. Notions of right and wrong become subjective, individually determined, and a kind of blindness sets in; how else can we explain why legislators ignored the all-too-predictable effects of unrestrained access to pornography on societal well-being, including but not limited to harms to children? For this Bill to be an inflection point in history, this review, birthed out of it, must unashamedly call out the immorality of what has gone before. How should we define morality? Well, society simply does not work if it is governed by self-gratification and expressive individualism. Relationships—the soil of society—including intimate sexual relationships, are only healthy if they are self-giving, rather than self-gratifying. These values did not emerge from the Enlightenment but from the much deeper seam of our Judeo-Christian foundations. Pornography is antithetical to these values.

I turn to the review’s terms of reference. Can the Minister confirm that the lack of parity between online and offline regulation will be included in the legal gaps it will address? Can he also confirm that the review will address gaps in evidence? As I said in Committee, a deep seam of academic research already exists on the harmful effects of the ubiquity of pornography. The associations with greater mental ill health, especially among teenagers, are completely unsurprising; developing brains are being saturated with dark depictions of child sexual abuse, incest, trafficking, torture, rape, violence and coercion. As I mentioned earlier, research shows that adults whose sexual arousal is utterly dependent on pornography can be catastrophically impaired in their ability to form relationships with flesh-and-blood human beings, let alone engage in intimate physical sex.

Will the review also plug gaps in areas that remain under-researched and controversial and where vested interests abound? On that point, whoever chairs this review will have to be ready, willing and able to take on powerful, ideologically motivated and profit-driven lobbies.

Inter alia, we need to establish through research the extent to which some young women are driven to change their gender because of hyper-sexualised, porn-depicted female stereotypes. Anecdotally, some individuals have described their complete inability to relate to their natal sex. It can be dangerous and distasteful to be a woman in a world of pornified relationships which expects them to embrace strangulation, degradation and sexual violence. One girl who transitioned described finding such porn as a child: “I am ashamed that I was fascinated by it and would seek it out. Despite this interest in watching it, I hated the idea of myself actually being in the position of the women. For a while, I even thought I was asexual. Sex is still scary to me, complicated”.

Finally, the Government’s announcement mentioned several government departments but did not make it clear that the review will also draw in the work of the DfE and the DHSC—the departments responsible for children’s and adult mental health—for reasons I have already touched on. Can the Minister confirm that the remit will include whatever areas of government responsibility are needed so that the review is genuinely broad enough to look across society at how to protect not just children but adults?

Baroness Fox of Buckley (Non-Afl)

My Lords, I rise to speak to Amendment 184 in my name—

Lord Harlech (Con)

My Lords, the guidance in the Companion states that Peers who were not present for the opening of this debate last week should not speak in the debate today, so I will have to ask the noble Baroness to reserve her remarks on this occasion.

Lord Clement-Jones (LD)

My Lords, that neatly brings me to the beginning of my own speech. I have expressed to the Chief Whip and the Minister my great regret that my noble friend Lady Benjamin is not able to take part in today’s debate because of the rather arbitrary way the group was started at the very end of proceedings on Thursday. The Minister is very much aware of that; it is a very sad thing.

I pay huge tribute to my noble friend, as the noble Lords, Lord Bethell and Lord Farmer, have. She is sitting behind me, yet she cannot make her contribution after a decade of campaigning so passionately on these issues. That includes pushing for age verification for pornographic content. We stood shoulder to shoulder on Part 3 of the Digital Economy Act, and she has carried that passion through into the debates on this Bill.

My noble friend believes that the Minister’s amendments in particular are a huge step forward. She describes this as a landmark moment from her point of view. She wants me to thank Barnardo’s, CARE and CEASE for their support and for bringing evidence and research to us on pornography. She would like to thank the Secretary of State and the Minister in particular for taking us to this point.

My noble friend also welcomes the review that was announced last week but, like the noble Lords, Lord Bethell and Lord Farmer, she has some questions that have to be asked. This review is a good opportunity to examine the gaps in regulation, but it is proposed that the review will take a year. Is that the proposal and is it a firm year? What happens thereafter? Is there a commitment by the Government to legislate on this, if they are still the Government in a year’s time? What are their intentions and what is the road map to legislation? For instance, the gambling review started four years ago and we have not seen real change yet, so I think it is important to have some assurance in that respect.

Who will be involved in the review? Will the third sector and charity organisations working in this space be involved? The noble Lord, Lord Farmer, asked about scientific and medical research, which are all important aspects. I know that my noble friend would want to pay her own tribute to the noble Lords, Lord Farmer and Lord Bethell, to others involved in this exercise—“exercise” should be what it is called as it certainly feels like exercise—and in particular to the noble Baroness, Lady Kidron. I hope that the Minister will give my noble friend those assurances, despite the fact that she is not able to take part in this debate today.

From my point of view, I welcome the Government’s decision to strengthen the Bill’s age-verification requirements for online pornography, especially in respect of the principles for age assurance. But—and there always is a “but”—we absolutely need that age assurance to be privacy protecting. Amendment 125 is crucial and I am disappointed that it has not been included so far.

My noble friend Lord Allan referred to one of the major objections. We had a huge argument and debate about the efficacy of age verification when we discussed Part 3. There were great fears that age verification was going to be privacy invading and there was not a great deal of certainty about the kind of technology that was available for this kind of privacy-protecting age verification. I personally preferred and wanted to see third-party age verification; at the time, I thought it far better and safer to have third parties, such as Yoti, being responsible for our certification rather than the big tech companies, for all kinds of reasons and not just competitive ones. If we do not have some privacy-protecting language, we will be back in that situation of suspicion if we are not very careful.

Like my noble friend, I welcome the announcement of a review on the issue. There is a huge gap currently, and I give credit to the Secretary of State for understanding that that gap between the treatment of online pornography and offline pornography is very large indeed, as the BBFC can say from its experience. There is a wealth of evidence showing the link between violent pornography and real-life violence against women and girls. That is one of the reasons that I am so pleased that this review is taking place.

I mentioned the BBFC and have mentioned it before. It was going to be the regulator under Part 3 of the Digital Economy Act. I very much hope that the Government will consult the BBFC, as it has a great deal of experience in offline certification, so I hope it will be heavily involved in a review of this kind.

I listened to my noble friend very intently and I think he made many points that resonate about the practical way in which we will need to age-verify to make it simple for the public who are 18 and over. I much prefer the idea of third-party age verification to putting myself in the hands of big tech. I hope that Ofcom and the Government will do everything they can to make sure that those kinds of services are readily available and are not just controlled by the big tech companies in an anti-competitive way.

16:45
I think we have done a great service here in this group of amendments. The noble Lord, Lord Bethell, and other noble Lords have achieved a level playing field between Part 3 and Part 5 and, in doing so, have introduced a much more robust and safer form of age verification and age assurance which, nevertheless, as the noble Baroness, Lady Kidron, pointed out, is proportionate in the circumstances. So I pay tribute to all those involved, including the Minister for his flexibility and the Secretary of State likewise, but we must have that privacy-protecting aspect to it.
Lord Stevenson of Balmacara (Lab)

My Lords, this has been a good debate, perhaps unfairly curtailed in terms of the range of voices we have heard, but I am sure the points we wanted to have on the table are there and we can use them in summarising the debate we have had so far.

I welcome the Government’s amendments in this group. They have gone a long way to resolving a number of the difficulties that were left after the Digital Economy Act. As the noble Lord, Lord Clement-Jones, has said, we now have Part 3 and Part 5 hooked together in a consistent and effective way and definitions of “age verification” and “age estimation”. The noble Lord, Lord Grade, is sadly not in his place today—I normally judge the quality of the debate by the angle at which he resides in that top corner there. He is not here to judge it, but I am sure he would be upright and very excited by what we have been hearing so far. His point about the need for companies to be clearly responsible for what they serve up through their services is really important in what we are saying here today.

However, despite the welcome links across to the ICO age-appropriate design code, with the concerns we have been expressing on privacy there are still a number of questions which I think the Minister will want to deal with, either today or in writing. Several noble Lords have raised the question of what “proportionate” means in this area. I have mentioned it in other speeches in other groups. We all want the overall system to be proportionate in the way in which it allocates the powers, duties and responsibilities on the companies providing us with the services they do. But there is an exception for the question of whether children should have access to material which they should not get because of legal constraints, and I hope that “proportionate” is not being used in any sense to evade that.

I say that particularly because the concern has been raised in other debates—and I would be grateful if the Minister could make sure when he comes to respond that this issue is addressed—that smaller companies with less robust track records in terms of their income and expenditures might be able to plead that some of the responsibilities outlined in this section of the Bill do not apply to them because otherwise it would bear on their ability to continue. That would be a complete travesty of where we are trying to get to here, which is an absolute bar on children having access to material that is illegal or in the lists now in the Bill in terms of priority content.

The second worry that people have raised is: will the system that is set up here actually work in practice, particularly if it does not apply to all companies? That relates perhaps to the other half of the coin that I have just mentioned.

The third point, raised by a number of Peers, is: where does all this sit in relation to the review of pornography which was announced recently? A number of questions have been asked about issues which the Minister may be unable to respond to, but I suspect he may also want to write to us on the wider issue of timing and the terms of reference once they are settled.

I think we need to know this as we reach the end of the progress on this Bill, because you cannot expect a system being set up with the powers that are being given to Ofcom to work happily and well if Ofcom knows it is being reviewed at the same time. I hope that some consideration will be given to how we get the system up and running, even if the timescale is now tighter than it was, if at the same time a review rightly positioned to try to look at the wider range of pornography is going to impact on its work.

I want to end on the question raised by a large number of noble Lords: how does all this work sit with privacy? Where information and data are being shared on the basis of assuring access to services, there will be a worry if privacy is not ensured. The amendments tabled by the noble Baroness, Lady Kidron, are very salient to this. I look forward to the Minister’s response to them.

Lord Parkinson of Whitley Bay (Con)

My Lords, I am sorry that the noble Baroness, Lady Benjamin, was unable to be here for the start of the debate on Thursday and therefore that we have not had the benefit of hearing from her today. I am very glad that she was here to hear the richly deserved plaudits from across the House for her years of campaigning on this issue.

I am very glad to have had the opportunity to discuss matters directly with her including, when it was first announced, the review that we have launched. I am pleased that she gave it a conditional thumbs up. Many of her points have been picked up by other noble Lords today. I did not expect anything more than a conditional thumbs up from her, given her commitment to getting this absolutely right. I am glad that she is here to hear some of the answers that I am able to set out, but I know that our discussions would have continued even if she had been able to speak today and that her campaigns on this important issue will not cease; she has been tireless in them. I am very grateful to her, my noble friends Lord Bethell and Lady Harding, the noble Baroness, Lady Kidron, and many others who have been working hard on this.

Let me pick up on their questions and those of the noble Baroness, Lady Ritchie of Downpatrick, and others on the review we announced last week. It will focus on the current regulatory landscape and how to achieve better alignment of online and offline regulation of commercial pornography. It will also look at the effectiveness of the criminal law and the response of the criminal justice system relating to pornography. This would focus primarily on the approach taken by law enforcement agencies and the Crown Prosecution Service, including considering whether changes to the criminal law would address the challenges identified.

The review will be informed by significant expert input from government departments across Whitehall, the Crown Prosecution Service and law enforcement agencies, as well as through consultation with the industry and with civil society organisations and regulators including, as the noble Baroness, Lady Ritchie, rightly says, some of the many NGOs that do important work in this area. It will be a cross-government effort. It will include but not be limited to input from the Ministry of Justice, the Home Office, the Department for Science, Innovation and Technology and my own Department for Culture, Media and Sport. I assure my noble friend Lord Farmer that other government departments will of course be invited to give their thoughts. It is not an exhaustive list.

I detected the enthusiasm for further details from noble Lords across the House. I am very happy to write as soon as I have more details on the review, to keep noble Lords fully informed. I can be clear that we expect the review to be complete within 12 months. The Government are committed to undertaking it in a timely fashion so that any additional safeguards for protecting UK users of online services can be put in place as swiftly as possible.

My noble friend Lord Bethell asked about international alignment and protecting Britain as a destination for investment. We continue to lead global discussions and engagement with our international partners to develop common approaches to online safety while delivering on our ambition to make the UK the safest place in the world to be online.

The noble Baroness, Lady Kidron, asked about the new requirements. They apply only to Part 3 providers, which allow pornography or other types of primary priority content on their service. Providers that prohibit this content under their terms of service for all users will not be required to use age verification or age estimation. In practice, we expect services that prohibit this content to use other measures to meet their duties, such as effective content moderation and user reporting. This would protect children from this content instead of requiring measures that would restrict children from seeing content that is not allowed on the service in the first place.

These providers can still use age verification and age estimation to comply with the existing duty to prevent children encountering primary priority content. Ofcom can still recommend age-verification and age-estimation measures in codes of practice for these providers where proportionate. On the noble Baroness’s second amendment, relating to Schedule 4, Ofcom may refer to the age-assurance principles set out in Schedule 4 in its children’s codes of practice.

On the 18-month timetable, I can confirm that 18 months is a backstop and not a target. Our aim is to have the regime in force as quickly as possible while making sure that services understand their new duties. Ofcom has set out in its implementation road map that it intends to publish draft guidance under Part 5 this autumn and draft children’s codes next spring.

The noble Baroness, Lady Ritchie, also asked about implementation timetables. I can confirm that Part 3 and Part 5 duties will be implemented at the same time. Ofcom will publish draft guidance shortly after Royal Assent for Part 5 duties and codes for the illegal content duties in Part 3. Draft codes for Part 3 children’s duties will follow in spring next year. Some Part 3 duties relating to category 1 services will be implemented later, after the categorisation thresholds have been set in secondary legislation.

The noble Lord, Lord Allan of Hallam, asked about interoperability. We have been careful to ensure that the Bill is technology neutral and to allow for innovation across the age-assurance market. We have also included a principle on interoperability in the new list of age-assurance principles in Schedule 4 and the Part 5 guidance.

At the beginning of the debate, on the previous day on Report, I outlined the government amendments in this group. There are some others, which noble Lords have spoken to. Amendments 125 and 217, from the noble Baroness, Lady Kidron, seek to add additional principles on user privacy to the new lists of age-assurance principles for both Part 3 and 5, which are brought in by Amendments 124 and 216. There are already strong safeguards for user privacy in the Bill. Part 3 and 5 providers will need to have regard to the importance of protecting users’ privacy when putting in place measures such as age verification or estimation. Ofcom will be required to set out, in codes of practice for Part 3 providers and in guidance for Part 5 providers, how they can meet these duties relating to privacy. Furthermore, companies that use age-verification or age-estimation solutions will need to comply with the UK’s robust data protection laws or face enforcement action.

Adding the proposed new principles would, we fear, introduce confusion about the nature of the privacy duties set out in the Bill. Courts are likely to assume that the additions are intended to mean something different from the provisions already in the Bill relating to privacy. The new amendments before your Lordships imply that privacy rights are unqualified and that data can never be used for more than one purpose, which is not the case. That would introduce confusion about the nature of—

Lord Clement-Jones (LD)

My Lords, I apologise to the Minister. Can he write giving chapter and verse for that particular passage by reference to the contents of the Bill?

Lord Parkinson of Whitley Bay (Con)

I am very happy to do that. That would probably be better than me trying to do so at length from the Dispatch Box.

Government Amendment 124 also reinforces the importance of protecting children’s privacy, including data protection, by ensuring that Ofcom will need to have regard to standards set out under Section 123 of the Data Protection Act 2018 in the age-appropriate design code. I hope that explains why we cannot accept Amendments 125 or 217.

The noble Baroness, Lady Fox, has Amendment 184 in this group and was unable to speak to it, but I am very happy to respond to it and the way she set it out on the Marshalled List. It seeks to place a new duty on Ofcom to evaluate whether internet service providers, internet-connected devices or individual websites should undertake user-identification and age-assurance checks. This duty would mean that such an evaluation would be needed before Ofcom produces guidance for regulated services to meet their duties under Clauses 16 and 72.

Following this evaluation, Ofcom would need to produce guidance on age-verification and age-assurance systems, which consider cybersecurity and a range of privacy considerations, to be laid before and approved by Parliament. The obligation for Ofcom to evaluate age assurance, included in the noble Baroness’s amendment, is already dealt with by Amendment 271, which the Government have tabled to place a new duty on Ofcom to publish a report on the effectiveness of age-assurance solutions. That will specifically include consideration of cost to business, and privacy, including the processing of personal data.

17:00
Lord Allan of Hallam (LD)

I have just realised that I forgot to thank the Government for Amendment 271, which reflected something I raised in Committee. I will reflect back to the Minister that, as is reinforced by his response now, it goes precisely where I wanted it to. That is to make sure—I have raised this many times—that we are not implementing another cookie banner, but are implementing something and then going back to say, “Did it work as we intended? Were the costs proportionate to what we achieved?” I want to put on the record that I appreciate Amendment 271.

Lord Parkinson of Whitley Bay (Con)

I appreciate the noble Lord’s interjection and, indeed, his engagement on this issue, which has informed the amendments that we have tabled.

In relation to the amendment of the noble Baroness, Lady Fox, as I set out, there are already robust safeguards for user privacy in the Bill. I have already mentioned Amendment 124, which puts age-assurance principles in the Bill. These require Ofcom to have regard, when producing its codes of practice on the use of age assurance, to the principle of protecting the privacy of users, including data protection. We think that the noble Baroness’s amendment is also unnecessary. I hope that she and the noble Baroness, Lady Kidron, will be willing to not move their amendments and to support the government amendments in the group.

Amendment 34 agreed.
Amendment 35
Moved by
35: Clause 10, page 9, line 37, at end insert—
“(iv) features, functionalities or behaviours (including those enabled or created by the design or operation of the service) that are harmful to children”
Member’s explanatory statement
This amendment ensures that in carrying out risk assessments, user to user services must consider the potential for the design and operation of services to create harm separately and additionally to harm relating to the dissemination of or encountering harmful content.
Baroness Kidron (CB)

My Lords, I rise to speak to all the amendments in this group. It is a cause of great regret that, despite many private meetings with officials, government lawyers and Ministers, we have not yet come to an agreement that would explicitly include in the Bill harm that does not derive from content. I will be listening very carefully to the Minister, if he should change his mind during the debate.

The amendments in this group fall into three categories. First, there is a series of amendments in my name and those of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford: Amendments 35, 36, 37A and 85. I hope the Government will accept them as consequential because, in meetings last week, they would not accept that harm to children can arise from the functionality and design of services and not just from the content. Each of these amendments simply makes it clear that harm can arise in the absence of content: nothing more, nothing less. If the Minister agrees that harm may derive from the design of products and services, can he please explain, when he responds, why these amendments are not acceptable? Simply put, it is imperative that the features, functionalities or behaviours that are harmful to children, including those enabled or created by the design or operation of the service, are in scope of the Bill. This would make it utterly clear that a regulated company has a duty to design its service in a manner that does not harm children.

The Government have primary priority harmful content, priority content or non-designated harmful content, the latter being a category that is yet to be defined, but not the harm that emerges from how the regulated company designs its service. For example, there are the many hundreds of small reward loops that make up a doomscroll or make a game addictive; commercial decisions such as the one Pokémon famously made for a time, to end every game in a McDonald’s car park; or, more sinister still, the content-neutral friend recommendations that introduce a child to other children like them, while pushing children into siloed groups. For example, they deliberately push 13-year-old boys towards Andrew Tate—not for any content reason, but simply on the basis that 13-year-old boys are like each other and one of them has already been on that site.

The impact of a content-neutral friend recommendation has rocked our schools as female teachers and girls struggle with the attitudes and actions of young boys, and has torn through families, who no longer recognise their sons and brothers. To push hundreds of thousands of children towards Andrew Tate for no reason other than to benefit commercially from the network effect is a travesty for children and it undermines parents.

The focus on content is old-fashioned and looks backwards. The Bill is drafted as if it has particular situations and companies in mind but does not think about how fast the business moves. When we started the Bill, none of us thought about the impact of TikTok; last week, we saw a new service, Threads, go from zero to 70 million users in a single day. It is an act of stunning hubris to be so certain of the form of harm. To be unprepared to admit that some harm is simply design means that, despite repeated denials, this is just a content Bill. The promise of systems and processes being at the heart of the Bill has been broken.

The second set of amendments in this group are in the name of my noble friend Lord Russell. Amendments 46 and 90 further reveal the attitude of the Government, in that they are protecting the companies rather than putting them four-square in the middle of their regime. The Government specifically exempt the manner of dissemination from the safety duties. My noble friend Lord Russell’s amendment would leave that out and ensure that the manner of dissemination, which is fundamental to the harm that children experience, is included. Similarly, Amendment 240 would take out “presented by content” so that harm that is the result of the design decisions is included in the Bill.

The third set are government Amendments 281C and 281D, and Amendment 281F, in my name. For the avoidance of doubt, I am totally supportive of government Amendments 281C to 281E, which acknowledge the cumulative harms; for example, those that Molly Russell experienced as she was sent more and more undermining and harmful content. Insofar as they are a response to my entreaties, and those of other noble Lords, that we ensure that cumulative harmful content is the focus of our concerns, I am grateful to the Government for tabling them. However, I note that the Government have conceded only the role of cumulative harm for content. Amendments 281D and 281E once again talk about content as the only harm to children.

The noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names to Amendment 281F, and I believe I am right in saying that if there were not a limit to four names, there were a great many Peers who would have added their names also. For the benefit of the House, I will quote directly from the amendment:

“When in relation to children, references to harm include the potential impact of the design and operation of a regulated service separately and additionally from harms arising from content, including the following considerations … the potential cumulative impact of exposure to harm or a combination of harms … the potential for harm to result from features, functionalities or behaviours enabled or created by the design and operation of services … the potential for some features and functionalities within a service to be higher risk than other aspects of the service … that a service may, when used in conjunction with other services, facilitate harm to a child on a different service … the potential for design strategies that exploit a child’s developmental vulnerabilities to create harm, including validation metrics and compulsive reward loops … the potential for real time services, features and functionalities such as geolocation, livestream broadcasts or events, augmented and virtual environments to put children at immediate risk … the potential for content neutral systems that curate or generate environments, content feeds or contacts to create harm to children … that new and emerging harms may arise from artificial intelligence, machine generated and immersive environments”.


Before I continue, I ask noble Lords to consider which of those things they would not like for their children, grandchildren or, indeed, other people’s children. I have accepted that the Government will not add the schedule of harms as I first laid it: the four Cs of content, conduct, contact and commercial harms. I have also accepted that the same schedule, written in the less comfortable language of primary priority, priority and non-designated harms, has also been rejected. However, the list that I just set out, and the amendment to the duties that reflect those risks, would finally put the design of the system at the heart of the Bill. I am afraid that, in spite of all our conversations, I cannot accept the Government’s argument that all harm comes from content.

Even if we are wrong today—which we are most definitely not—in a world of AI, immersive tech and augmented reality, is it not dangerous and, indeed, foolish, to exclude harm that might come from a source other than content? I imagine that the Minister will make the argument that the features are covered in the risk assessment duties and that, unlike content, features may be good or bad so they cannot be characterised as harmful. To that I say: if the risk assessment is the only process that matters, why do the Government feel it necessary to define the child safety duties and the interpretation of harm? The truth is, they have meaning. In setting out the duty of a company to a child, why would the Government not put the company’s design decisions right at the centre of that duty?

As for the second part of the argument, a geolocation feature may of course be great for a map service but less great if it shows the real-time location of a child to a predator, and livestreaming from a school concert is very different from livestreaming from your bedroom. Just as the noble Lord, Lord Allan, explained on the first day on Report, there are things that are red lines and things that are amber; in other words, they have to be age-appropriate. This amendment does not seek—nor would it mean—that individual features or functionalities would be prevented, banned or stopped. It would mean that a company had a duty to make sure that their features and functionalities were age-appropriate and did not harm children—full stop. There would be no reducing this to content.

Finally, I want to repeat what I have said before. Sitting in the court at Molly Russell’s inquest, I watched the Meta representative contest content that included blood cascading down the legs of a young woman, messages that said, “You are worthless”, and snippets of film of people jumping off buildings. She said that none of those things met the bar of harmful content according to Meta’s terms and conditions.

Like others, I believe that the Online Safety Bill could usher in a new duty of care towards children, but it is a category error not to see harm in the round. Views on content can always differ but the outcome on a child is definitive. It is harm, not harmful content, that the Bill should measure. If the Minister does not have the power to accede, I will, with great regret, be testing the opinion of the House. I beg to move.

Baroness Harding of Winscombe (Con)

My Lords, as so often in the course of the Bill, I associate myself wholeheartedly with the comments that the noble Baroness, Lady Kidron, just made. I, too, thank my noble friend the Minister and the Secretary of State for listening to our debates in Committee on the need to be explicit about the impact of cumulative harmful content. So I support Amendments 281C, 281D and 281E, and I thank them for tabling them.

17:15
In the previous group, we talked about and debated the hugely important topic of content harm, most particularly pornography, and the need to ensure an absolutely firm bar that prevents our children seeing such content. As my friend the noble Baroness, Lady Kidron, said, content is not the only harm on the internet—quite the opposite. Internet algorithms on social media platforms do not care what the content is. The computer does not read the content; the algorithm simply drives addiction. So the functionality is the root of an awful lot of the harms that our children are experiencing today, completely unregulated—whether it is driving addiction, creating dangerous friendship groups, connecting people who should not be able to be connected to our underage minors, or tracking individuals in real time.
My teenage daughter is currently in America on a school tour, and I have been stalking and tracking her to see where she is. But, each time I do, a shiver runs down my spine as I think how easy it would be for a predator to do the same thing, and how wrong it would be not to recognise that non-content harm is a real and present danger to our children. As the noble Baroness, Lady Kidron, said, this is not to say that these functionalities are not brilliant. It makes me, as her mum, feel good that I can track her. As the noble Lord, Lord Allan, said last week, we need to remember that this is about priority harm and not primary priority harm. It is not black and white that it is always bad; it is a functionality, and we should require companies to assess the risk that it imposes on young people. That is why it is so important that we recognise this as a part of the Bill.
I know that my noble friend the Minister will want to say, “This is all included in the Bill anyway. Why have you all got your knickers in a twist about this? We’re all on track, and we’re going to do it. Ofcom has done all of the pre-work. It’s there”. My worry is that this is a complex and technical Bill. We have all got ourselves tangled up in the structure of it, and, if it is not in the Bill that non-content harms are real harms, the risk of it not being clear in the future is very great. I do not understand the argument—presented to us many times over the last few weeks—that, by putting it in the Bill, we make it worse, not better. I am no lawyer, but it seems strange to me that we are now specifying every other element of harm clearly in the Bill, but, together, we have not been able to find a wording that puts this in.
I am willing to accept that the amendments that I put my name to and that the noble Baroness, Lady Kidron, introduced so powerfully might not be the best way to do this. We might well have unintentionally fallen on to a landmine in this complex Bill. But I cannot accept that it is not necessary to put it in the Bill, so I urge my noble friend the Minister to accept the principle behind these amendments. If he cannot accept them today, I ask him to firmly commit to bring back government amendments that put non-content harms in the Bill. Otherwise, I will need to follow the noble Baroness, Lady Kidron, through the Lobbies.
The Lord Bishop of Oxford

My Lords, as often, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Kidron, and to support this group of amendments, especially those to which I put my name. I thank the Minister and the Secretary of State for the many amendments they are introducing, including in the last group, on which I was not able to speak for similar reasons to other noble Lords. I especially note Amendment 1, which makes safety by design the object of the Bill and makes implicit the amendments that we are speaking to this afternoon, each of which is consistent with that object of safety by design running through the Bill.

As others have said, this is an immensely complex Bill, and anything which introduces clarity for the technology companies and the users is to be welcomed. I particularly welcome the list in Amendment 281F, which the noble Baroness, Lady Kidron, has already read aloud and which spells out very clearly the harm which results from functionality as well as content. It is imperative to have that in the Bill.

In Committee, I referred to the inequality of arms between the user of a service and the forces arrayed against them. You may like to imagine a child of eight, 12 or 15 using one of the many apps we are discussing this afternoon. Now imagine the five As as forces arrayed against them; they are all about functionality, not content. We must consider: the genius of the advertising industry, which is designed on a commercial basis for sales and profit; the fact that processes, applications and smartphones mean that there is 24/7 access to those who use these services and that there is no escape from them; the creation of addictions by various means of rewarding particular features, which have little to do with content and everything to do with design and function; the creative use of algorithms, which will often be invisible and undetectable to adult users and certainly invisible to children; and the generation of more harms through artificial intelligence, deep fakes and all the harms resulting from functionality. Advertising, access, addiction, algorithms and artificial intelligence are multiplying harms in a range of ways, which we have heard discussed so movingly today.

The quantity of harm means the socialisation, normalisation and creation of environments which are themselves toxic online and which would be completely unacceptable offline. I very much hope, alongside others, that the Government will give way on these amendments and build the naming of functionality and harm into the Bill.

Lord Russell of Liverpool (CB)

My Lords, I will speak, in part, to two amendments with my name on them and which my noble friend Lady Kidron referred to: Amendments 46 and 90 on the importance of dissemination and not just content.

A more effective way of making the same point is to personalise it by trying to give your Lordships an understanding of the experience taking place, day in, day out, for many young people. I address this not only to the Minister and the Bill team but, quite deliberately, to the Office of the Parliamentary Counsel. I know full well that the Bill has been many years in gestation and, because the online world, technology and now AI are moving so fast, it is almost impossible for the Bill and its architecture to keep pace with them. But that is not a good reason for not listening to and accepting the force of the argument which my noble friend Lady Kidron and many others have put forward.

Last week, on the first day on Report, when we were speaking to a group of amendments, I spoke to your Lordships about a particular functionality called dark patterns, which are a variety of different features built into the design of these platforms to drive more and more volume and usage.

The individual whose journey I will be describing is called Milly. Milly is online and she accepts an automatic suggestion that is on a search bar. Let us say it is about weight loss. She starts to watch videos that she would not otherwise have found. The videos she is watching are on something called infinite scroll, so one just follows another that follows another, potentially ad infinitum. To start off, she is seeing video after video of people sharing tips about dieting and showing how happy they are after losing weight. As she scrolls and interacts, the women she sees mysteriously seem to get thinner and thinner. The platform’s content dispersal strategy—if indeed it has one, because not all do—that tempers the power of the algorithm has not yet kicked in. The Bill does not address this because, individually, not a single one of the videos Milly has been watching violates the definition of primary priority content. Coding an algorithm to meet a child’s desire to view increasingly thin women is what they are doing.

The videos that Milly sees are captioned with a variety of hashtags such as #thinspo, #thighgap and #extremeweightloss. If she clicks on those, she will find more extreme videos and will start to click on the accounts that have posted the content. Suddenly, she is exposed to the lives of people who are presenting disordered eating not just as normal but as aspirational. Developmentally, Milly is at an age where she does not have the critical thinking skills to evaluate what she is seeing. She has entered a world that she is too young to understand and would never have found were it not for the design of the platform. Throughout her journey thus far, she has yet to see a single video that meets the threshold of primary priority harm content. This world is the result of cumulative design harms.

She follows some of the accounts, which prompts the platform to recommend similar accounts. Many of the accounts recommended to her are even more extreme. They are managed by people who have active eating disorders but see what is known as their pro-ana status—that is, pro-anorexia—as a lifestyle choice rather than a mental health issue. These accounts are very savvy about the platform’s community guidelines, so the videos and the language they use are coded specifically to avoid detection.

Every aspect of the way Milly is interacting with the platform has now been polluted. It is not just the videos she sees. It is the autocomplete suggestions she gets on searches. It is the algorithmically determined account recommendations. It is the design strategies that make it impossible for her to stop scrolling. It is the notifications she receives encouraging her back to the platform to watch yet another weight-loss video or follow yet another account. It is the filters and effects she is offered before she posts. It is the number of likes her videos get. It goes on and on, and the Bill as it is stands will fail Milly. This is why I am talking directly to the Minister and the Office of the Parliamentary Counsel, because they need to sort this out.

Earlier on this afternoon, before we began this debate, I was talking to an associate professor in digital humanities at UCL, Dr Kaitlyn Regehr. We were talking about incels—involuntary celibates—and the strange world they live in, and she made a comment. This is a quote that I wrote down word for word because it struck me. She said:

“One off-day seeds the algorithm. The algorithm will focus on that and amplify that one off-day”—


that one moment when we click on something and suddenly it takes us into a world, and in a direction, that we had no idea existed and, more importantly, because of the way these services are designed, that we feel we have no control over. We really must do something about this.

Baroness Benjamin (LD)

My Lords, I rise to support the amendments in the names of the intrepid noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford. They go hand in hand with the amendments that have just been debated in the previous group. Sadly, I was unable to take part in that debate because of a technical ruling, but I thank the Minister for his kind words and thank other noble Lords for what they have said. But my heart is broken, because they included age verification, for which I have campaigned for the past 12 years, and I wanted to thank the Government for finally accepting that children need to be protected from online harmful content, pornography being one example; it is the gateway to many other harms.

17:30
As we have heard around the House, the Government need to make it clear that harm to children can arise from the functionality and design of online services too—not only content. These amendments would show the tech industry that there is no place to hide when it comes to fulfilling its obligations to protect children, especially as AI is emerging. The consequences of this could open a whole new Pandora’s box of harms, which, to our horror, have already started to spread. These amendments are an excellent and golden opportunity to protect children from them.
Many of us from across the House have been fighting for years for this day, and it has been good to see that the Government have finally listened—I say, “Hallelujah”. But why they should stop the Bill from being absolutely clear about harm escapes me. If they are saying that it is covered in the Bill, what is the objection to making it explicit? These amendments would send a loud, long message to the industry that it is responsible for the design of its products. Surely, the Government should be on the side of children who have suffered for far too long from being exposed to harmful content, not on the side of the multinational tech companies.
As the children’s charity Barnardo’s said—and I declare an interest as vice president—children do not have a voice. I feel that we have a responsibility to protect them, and we must expect the Government to take children into consideration and show that they have a holistic view about protecting them from harm. I hope that the Government will embrace these amendments by continuing to listen to common sense and will support them.
Viscount Colville of Culross (CB)

My Lords, it is a great pleasure to follow the veteran campaigner on this issue, the noble Baroness, Lady Benjamin. I, too, rise briefly to support Amendments 35 to 37A, 85 and 240 in the name of my noble friend Lady Kidron.

In Committee, I put my name to amendments that aimed to produce risk assessments on harms to future-proof the Bill. Sadly, they were thought unnecessary by the Government. Now the Minister has another chance to make sure that Ofcom will be able to assess and respond to potential harms from one of the fastest-changing sectors in the world in order to protect our children. I praise the Minister for having come so far but, if this Bill is to stand the test of time, we will have to be prepared for the ever-changing mechanisms that would deliver that content to children. Noble Lords have already told the House about the fast-changing algorithms and the potential of AI to create harms. Many tech companies do not even understand how their algorithms work; a risk assessment of their functions would ensure that they found out soon enough.

In the Communications and Digital Select Committee inquiry into regulating the internet, we recommended that, because the changes in digital delivery and technology were happening so fast, a specific body needed to be set up to horizon scan. In these amendments, we would build these technological changes into this Bill’s regulatory mechanism to safeguard our children in future. I hope that noble Lords will support the amendment.

Lord Bethell (Con)

My Lords, I also support the amendments from the noble Baroness, Lady Kidron. It is relatively easy to stand here and make the case for age verification for porn: it is such a black and white subject and it is disgusting pornography, so of course children should be protected from it. Making the case for the design of the attention economy is more subtle and complex—but it is incredibly important, because it is the attention economy that is driving our children to extreme behaviours.

I know this from my own personal life; I enjoy incredibly lovely online content about wild-water swimming, and I have been taken down a death spiral towards ice swimming and have become a compulsive swimmer in extreme temperatures, partly because of the addiction generated by online algorithms. This is a lovely and heart-warming anecdote to give noble Lords a sense of the impact of algorithms on my own imagination, but my children are prone to much more dangerous experiences. The plasticity of their brains is so much more subtle and malleable; they are, like other children, open to all sorts of addiction, depression, sleeplessness and danger from predators. That is the economy that we are looking at.

I point noble Lords to the intervention from the surgeon general in America, Admiral Vivek Murthy—an incredibly impressive individual whom I came across during the pandemic. His 25-page report on the impact of social media on the young of America is incredibly eye-opening reading. Some 95% of American children have come across social media, and one-third of them see it almost constantly, he says. He attributes to the impact of social media depression, anxiety, compulsive behaviours and sleeplessness, as well as what he calls the severe impact on the neurological development of a generation. He calls for a complete bar on all social media for the under-13s and says that his own children will never get anywhere near a mobile phone until they are 16. That is the state of the attention economy that the noble Baroness, Lady Kidron, talks about, and that is the state of the design of our online applications. It is not the content itself but the way in which it is presented to our children, and it traps their imagination in the kind of destructive content that can lead them into all kinds of harms.

Admiral Murthy calls on legislators to act today—and that was followed on the same day by a commitment from the White House to look into this and table legislation to address the kind of design features that the noble Baroness, Lady Kidron, is looking at. I think that we should listen to the surgeon general in America and step up to the challenge that he has given to American legislators. I am enormously grateful to my noble friend the Minister for the incredible amount of work that he has already done to try to bridge the gap in this matter, but there is a way to go. Like my noble friend Lady Harding, I hope very much indeed that he will be able to tell us that he has been able to find a way across the gap, or else I shall be supporting the noble Baroness, Lady Kidron, in her amendment.

Baroness Morgan of Cotes (Con)

I rise briefly to speak to this group of amendments. I want to pick up where my noble friend Lord Bethell has just finished. The Government have listened hugely on this Bill and, by and large, the Bill, and the way in which Ministers have engaged, is a model of how the public wants to see their Parliament acting: collaboratively and collegiately, listening to each other and with a clear sense of purpose that almost all of us want to see the Bill on the statute book as soon as possible. So I urge my noble friend the Minister to do so again. I know that there have been many conversations and I think that many of us will be listening with great care to what he is about to say.

There are two other points that I wanted to mention. The first is that safety by design was always going to be a critical feature of the Bill. I have been reminding myself of the discussions that I had as Culture Secretary. Surely, in general, we want to prevent our young people in particular from encountering harms in the first place, rather than always having to think about the moderation of harmful content once it has been posted.

Secondly, I would be interested to hear what the Minister has to say about why the Government find it so difficult to accept these amendments. Has there been some pushback from those who are going to be regulated? That would suggest that, while they can cope with the regulation of content, there is still secrecy surrounding the algorithms, functionalities and behaviours. I speak as the parent of a teenager who, if he could, would sit there quite happily looking at YouTube. In fact, he may well be doing that now—he certainly will not be watching his mother speaking in this House. He may well be sitting there and looking at YouTube and the content that is served up automatically, time after time.

I wonder whether this is, as other noble Lords have said, an opportunity. If we are to do the Bill properly and to regulate the platforms—and we have decided we need to do that—we should do the job properly and not limit ourselves to content. I shall listen very carefully to what my noble friend says but, with regret, if there is a Division, I will have to support the indomitable noble Baroness, Lady Kidron, as I think she was called.

Lord Clement-Jones (LD)

My Lords, I very strongly support the noble Baroness, Lady Kidron, in her Amendments 35, 36 and 281F and in spirit very much support what the noble Lord, Lord Russell, said in respect of his amendments. We have heard some very powerful speeches from the noble Baroness, Lady Kidron, herself, from the noble Baronesses, Lady Harding and Lady Morgan, from the right reverend Prelate the Bishop of Oxford, from my noble friend Lady Benjamin and from the noble Lords, Lord Russell and Lord Bethell. There is little that I can add to the colour and the passion that they brought to the debate today.

As the noble Baroness, Lady Kidron, said at the outset, it is not just about content; it is about functionalities, features and behaviours. It is all about platform design. I think the Government had pretty fair warning throughout the progress of the Bill that we would be keen to probe this. If the Minister looks back to the Joint Committee report, he will see that there was a whole chapter titled “Societal harm and the role of platform design”. I do not think we could have been clearer about what we wanted from this legislation. One paragraph says:

“We heard throughout our inquiry that there are design features specific to online services that create and exacerbate risks of harm. Those risks are always present, regardless of the content involved, but only materialise when the content concerned is harmful”.


It goes on to give various examples and says:

“Tackling these design risks is more effective than just trying to take down individual pieces of content (though that is necessary in the worst cases). Online services should be identifying these design risks and putting in place systems and process to mitigate them before people are harmed”.


That is the kind of test that the committee put. It is still valid today. As the noble Baroness said, platforms are benefiting from the network effect, and the Threads platform is an absolutely clear example of how that is possible.

The noble Lord, Lord Russell, gave us a very chilling example of the way that infinite scrolling worked for Milly. A noble Lord on the Opposition Bench, a former Home Secretary whose name I momentarily forget, talked about the lack of empathy of AI in these circumstances. The algorithms can be quite relentless in pushing this content; they lack human qualities. It may sound over the top to say that, but that is exactly what we are trying to legislate for. As the noble Lord, Lord Russell, says, just because we cannot always anticipate what the future holds, there is no reason why we should not try. We are trying to future-proof ourselves as far as possible, and it is not just the future but the present that we are trying to proof against through these amendments. We know that AI and the metaverse are coming down the track, but there are present harms that we are trying to legislate for as well. The noble Baroness, Lady Kidron, was absolutely right to keep reminding us about Molly Russell. It is this kind of algorithmic amplification that is so dangerous to our young people.

The Minister has a chance, still, to accede to these amendments. He has heard the opinion all around the House. It is rather difficult to understand what the Government’s motives are. The noble Baroness, Lady Morgan, put her finger on it: why is it so difficult to accede to these? We have congratulated the Government, the Minister and the Secretary of State throughout these groups over the last day and a bit; they have been extremely consensual and have worked very hard at trying to get agreement on a huge range of issues. Most noble Lords have never seen so many government amendments in their life. So far, so good; why ruin it?

17:45
Lord Stevenson of Balmacara (Lab)

There is always a simple question. We are in a bit of a mess—again. When I said at Second Reading that I thought we should try to work together, as was picked up by the noble Baroness in her powerful speech, to get the best Bill possible out of what we had before us, I really did not know what I was saying. Emotion caught me and I ripped up a brilliant speech which will never see the light of day and decided to wing it. I ended up by saying that I thought we should do the unthinkable in this House—the unthinkable in politics, possibly—and try to work together to get the Bill to come right. As the noble Lord, Lord Clement-Jones, pointed out, I do not think I have ever seen, in my time in this House, so many government amendments setting out a huge number of what we used to call concessions. I am not going to call them concessions—they are improvements to the Bill. We should pay tribute to the Minister, who has guided his extensive team, who are listening anxiously as we speak, in the good work they have been doing for some time, getting questioned quite seriously about where it is taking us.

The noble Lord, Lord Clement-Jones, is quite right to pick up what the pre-legislative scrutiny committee said about this aspect of the work we are doing today and what is in the Bill. We have not really nailed the two big issues that social media companies pose: this amplification effect, where a single tweet—or thread, let us call it now—can go spinning around the world and gather support, comment, criticism, complaint, anger and all sorts of things that we probably do not really understand in the short period of time it takes to be read and reacted to. That amplification is not something we see in the real world; we do not really understand it and I am not quite sure we have got to the bottom of where we should be going at this stage.

The second most important point—the point we are stuck on at the moment; this rock, as it were, in the ocean—is the commercial pressure which, of course, drives the way in which companies operate. They are in it for the money, not the social purpose. They did not create public spaces for people to discuss the world because they think it is a good thing. There is no public service in this—this is a commercial decision to get as much money as possible from as many people as possible and, boy, are they successful.

But commercial pressures can have harms; they create harms in ways that we have discussed, and the Bill reflects many of those. This narrow difference between the way the Bill describes content, which is meant to include many of the things we have been talking about today—the four Cs that have been brought into the debate helpfully in recent months—does not really deal with the commercial pressures under which people are placed because of the way in which they deal with social media. We do not think the Bill is as clear as it could be; nor does it achieve as much as it should in trying to deal with that issue.

That is in part to do with the structure. It is almost beyond doubt that the sensibility of what we are trying to achieve here is in the Bill, but it is there at such a level of opacity that it does not have the clarity of the messages we have heard today from those who have spoken about individuals—Milly and that sort of story—and the impact on people. Even the noble Lord, Lord Bethell, whose swimming exploits we must admire, is an unwitting victim of the drive of commercial pressures that sees him in his underwear at inappropriate moments in order that they should seek the profits from that. I think it is great, but I wonder why.

I want to set the Minister a task: to convince us, now that we are at the bar, that when he says that this matter is still in play, he realises what that must imply and will give us a guarantee that we will be able to gain from the additional time that he seeks to get this to settle. There is a case, which I hope he will agree to, for having in the Bill an overarching statement about the need to separate out the harms that arise from content and the harms that arise from the system, which we have been discussing and debating today, where content is absent. I suggest that, in going back to Clause 1, the overarching objectives clause, it might well be worth seeing whether it might be strengthened so that it covers this impact, so that the first thing one reads in the Bill is a sense that we embrace, understand and will act on this question of harm arising absent content. There is a case for putting into Clauses 10, 11, 25 and 82 the wording in Amendments 35, 36, 37A and 240, in the name of the noble Baroness, Lady Kidron, and for using those as a way of making sure that every aspect of the journey through which social media companies must go to fulfil the duties set out in the Bill by Ofcom reflects both the content that is received and the design choices made by those companies, covering both material content harms and the harms that arise from design choices. Clauses 208 and 209 also have to provide a better consideration of how one describes harms, so that they are not always apparently linked to content.

That is a very high hurdle, particularly because my favourite topic of how this House works will be engaged. We have, technically, already passed Clause 1; an amendment was debated and approved, and now appears in versions of the Bill. We are about to finish with Clauses 10 and 11 today, so we are effectively saying to the Minister that he must accept that there are deficiencies in the amendments that have already been passed or would be, if we were to pass Amendments 35, 36, 37A, 85 and 240 in the name of the noble Baroness, Lady Kidron, and others. It is not impossible, and I understand that it would be perfectly reasonable, for the Government to bring back a series of amendments on Third Reading reflecting on the way in which the previous provisions do not fulfil the aspirations expressed all around the House, and therefore there is a need to change them. Given the series of conversations throughout this debate—my phone is red hot with the exchanges taking place, and we do not have a clear signal as to where that will end up—it is entirely up to the Minister to convince the House whether these discussions are worth it.

To vote on this when we are so close seems ridiculous, because I am sure that if there is time, we can make this work. But time is not always available, and it will be up to the Minister to convince us that we should not vote and up to the noble Baroness to decide whether she wishes to test the opinion of the House. We have a three-line Whip on, and we will support her. I do not think that it is necessary to vote, however—we can make this work. I appeal to the Minister to get over the bar and tell us how we are to do it.

Lord Parkinson of Whitley Bay (Con)

My Lords, I am very grateful for the discussion we have had today and the parallel discussions that have accompanied it, as well as the many conversations we have had, not just over the months we have been debating the Bill but over the past few days.

I will turn in a moment to the amendments which have been the focus of the debate, but let me first say a bit about the amendments in this group that stand in my name. As noble Lords have kindly noted, we have brought forward a number of changes, informed by the discussions we have had in Committee and directly with noble Lords who have taken an interest in the Bill for a long time.

Government Amendments 281C, 281D, 281E and 281G relate to the Bill’s interpretation of “harm”, which is set out in Clause 209. We touched on that briefly in our debate on Thursday. The amendments respond to concerns which I have discussed with many across your Lordships’ House that the Bill does not clearly acknowledge that harm and risk can be cumulative. The amendments change the Bill to make that point explicit. Government Amendment 281D makes it clear that harm may be compounded in instances where content is repeatedly encountered by an individual user. That includes, but is not limited to, instances where content is repeatedly encountered as a result of algorithms or functionalities on a service. Government Amendment 281E addresses instances in which the combination of multiple functionalities on a service cumulatively drives up the risk of harm.

Those amendments go hand in hand with other changes that the Government have made on Report to strengthen protections for children. Government Amendment 1, for instance, which we discussed at the beginning of Report, makes it clear that services must be safe by design and that providers must tackle harms which arise from the design and operation of their service. Government Amendments 171 and 172 set out on the face of the Bill the categories of “primary priority” and “priority” content which is harmful to children to allow the protections for children to be implemented as swiftly as possible following Royal Assent. As these amendments demonstrate, the Government have indeed listened to concerns which have been raised from all corners of your Lordships’ House and made significant changes to strengthen the Bill’s protections for children. I agree that it has been a model of the way in which your Lordships’ House operates, and the Bill has benefited from it.

Let me turn to the amendments in the name of the noble Baroness, Lady Kidron. I am very grateful for her many hours of discussion on these specific points, as well as her years of campaigning which led to them. We have come a long way and made a lot of progress on this issue since the discussion at the start of Committee. The nature of online risk versus harm is one which we have gone over extensively. I certainly accept the points that the noble Baroness makes; I know how heartfelt they are and how they are informed by her experience sitting in courtrooms and in coroners’ inquests and talking to people who have had to be there because of the harms they or their families have encountered online. The Government are firmly of the view that it is indisputable that a platform’s functionalities, features or wider design are often the single biggest factor in determining whether a child will suffer harm. The Bill makes it clear that functions, features and design play a key role in the risk of harm occurring to a child online; I draw noble Lords’ attention to Clause 11(5), which makes it clear that the child safety duties apply across all areas of a service, including the way it is designed, operated and used, as well as content present on the service. That makes a distinction between the design, operation and use, and the content.

In addition, the Bill’s online safety objectives include that regulated services should be designed and operated so as to protect from harm people in the United Kingdom who are users of the service, including with regard to algorithms used by the service, functionalities of the services and other features relating to the operation of the service. There is no reference to content in this section, again underlining that the Bill draws a distinction.

This ensures that the role of functionalities is properly accounted for in the obligations on providers and the regulator, but I accept that noble Lords want this to be set out more clearly. Our primary aim must be to ensure that the regulatory framework can operate as intended, so that it can protect children in the way that they deserve and which we all want to see. Therefore, we cannot accept solutions that, however well meaning, may inadvertently weaken the Bill’s framework or allow providers to exploit legal uncertainty to evade their duties. We have come back to that point repeatedly in our discussions.

18:00
I will address the problems with the amendments as drafted; as the noble Baroness knows, if she presses them to a vote, we will not be able to accept them, although we are very happy to continue to discuss the concerns lying behind them. I am happy to reassure noble Lords that the Bill recognises and addresses that services can be risky by design and that features and functionalities can exacerbate the risk of harm to users, including children.
First, I have mentioned the new introductory clause that your Lordships have put into the Bill, which establishes safety by design as a key objective of it. As such, features and functionalities are captured in the existing children’s risk assessment and safety duties. I am grateful to the noble Lord, Lord Stevenson, for his suggestion that, if there is interest from the noble Baroness, Lady Kidron, we could use the time between now and Third Reading, in addition to our continuing discussions, to look at that again and try to make it clearer. However, its inclusion in the Bill has already been of benefit.
Secondly, providers must comprehensively consider and assess the risk presented by the design and operation of their service, including the risk of their design choices, which, as many noble Lords have highlighted, are often motivated by commercial aims rather than safety. These assessments also require providers to assess the risk that a service’s features and functionalities pose. Once this mandatory risk assessment is completed, they are required to mitigate and manage the risks to children that they have identified. For example, if a service has a direct messaging function, it will need to consider how this increases the risk of users encountering harms such as bullying and to follow steps in codes of practice, or take equivalent measures, to mitigate this.
It is not right to say that functionalities are excluded from the child safety duties. Clause 11(5) clearly sets out that safety duties apply across all areas of a service, including the way it is designed, operated and used, and not only to content that is present on the service.
The noble Lord, Lord Russell, spoke to his Amendments 46 and 90. They seek to remove the provisions in Clauses 11(15) and 25(13), which limit corresponding duties elsewhere in the Bill to cases where the risk of harm is presented by the nature of the content rather than the fact of its dissemination. Clause 209 is clear that harm from content may arise from the fact or manner of its dissemination. As I have mentioned, the Government’s amendments to Clause 209 make it clear that this includes instances where algorithms bombard a user with content, such as in the scenario the noble Lord set out. As such, user-to-user and search service providers must take action to address this as part of their child safety duties.
The duties in Clauses 11(2) and 25(2) apply to content that is harmful due to the manner of its dissemination, requiring providers to design and operate their services so as to mitigate the risks of harm identified in their risk assessments. This includes risks such as an algorithm pushing content at high volume to a user. If Clauses 11(15) and 25(13) were removed, Clause 11(3) and (6) and Clause 25(3) would require children to be protected from inherently harmless content on the grounds that harm could be caused if that content were encountered repeatedly over time. I am sure that is not what the noble Lord, Lord Russell, has in mind with his amendments, but that is why, if he pushes them to a vote, we will not be able to accept them.
We have talked about this at great length. If we can use the time between now and Third Reading fruitfully to address the points I have raised on these amendments—the noble Baroness, Lady Kidron, has heard them repeatedly; I make them for the benefit of the rest of your Lordships’ House, because we have had much discussion—I am very willing to look at that and bring forward points to address this at Third Reading. However, I have set out our concerns about the approach taken in the amendments she has tabled. I am very grateful to her for her time and for discussing this. Procedurally, if she presses them to a vote now, the matter will have been dealt with on Report and we will not be able to look at this again at Third Reading. I hope she may yet have found comfort in what I have said and be willing to continue those discussions, but if she wishes to press her amendments to a Division now, the Government will not be able to accept them and I would recommend that noble Lords vote against.
Baroness Kidron (CB)

My Lords, I thank everybody who has spoken for these amendments. I also thank the Minister for our many discussions and apologise to the House for the number of texts that I sent while we were trying to get stand-alone harms into the Bill—unfortunately, we could not; we were told that it was a red line.

It is with some regret that I ask the House to walk through the Lobbies. Before I do so, I acknowledge that the Government have met me on very many issues, for which I am deeply grateful. There are no concessions on this Bill, only making it better. From my perspective, there is no desire to push anybody anywhere, only to protect children and give citizens the correct relationship with the digital world.

I ask those who were not here when I said this before: please think about your children and grandchildren and other people’s children and grandchildren before you vote against these amendments. They are not only heartfelt, as the Minister said, but have been drafted with reference to many experts and people in the business, who, in their best practice, meet some of these things already. We do not want the Bill, by concentrating on content, to be a drag on what we are pushing forward. We want it to be aspirational and to push the industry into another culture and another place. At a personal level, I am very sorry to the Minister, for whom I have a great deal of respect, but I would like to test the opinion of the House.

18:08

Division 1

Ayes: 240

Noes: 168

18:20
Clause 11: Safety duties protecting children
Amendment 36
Moved by
36: Clause 11, page 10, line 38, at end insert—
“(c) mitigate the impact of harm to children in different age groups presented by features, functionalities or behaviours enabled or created by the design or operation of the service.”
Member’s explanatory statement
This amendment ensures that User to user services’ duty to protect children from harm includes the ways in which the design and operation of services may create harm separately and additionally to harm relating to the dissemination of or encountering harmful content.
Amendment 36 agreed.
Amendment 37
Moved by
37: Clause 11, page 10, line 42, leave out “(for example, by using age verification)”
Member’s explanatory statement
This amendment is consequential on the next amendment of Clause 11 in my name.
Amendment 37 agreed.
Amendment 37A
Moved by
37A: Clause 11, page 10, line 46, at end insert—
“(c) protect children in age groups judged to be at risk of harm from features, functionalities or behaviours enabled or created by the design or operation of the service”
Member’s explanatory statement
This amendment ensures that user to user services’ duty to protect children from harm includes the ways in which the design and operation of services may create harm separately and additionally to harm relating to the dissemination or encountering harmful content.
Amendment 37A agreed.
Amendment 38
Moved by
38: Clause 11, page 10, line 46, at end insert—
“(3A) The duty set out in subsection (3)(a) requires a provider to use age verification or age estimation (or both) to prevent children of any age from encountering primary priority content that is harmful to children which the provider identifies on the service.
(3B) That requirement applies to a provider in relation to a particular kind of primary priority content that is harmful to children in every case except where—
(a) a term of service indicates (in whatever words) that the presence of that kind of primary priority content that is harmful to children is prohibited on the service, and
(b) that policy applies in relation to all users of the service.
(3C) If a provider is required by subsection (3A) to use age verification or age estimation for the purpose of compliance with the duty set out in subsection (3)(a), the age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child.”
Member’s explanatory statement
This amendment requires providers of user-to-user services to use age verification or age estimation to prevent children from encountering identified primary priority content that is harmful to children, unless the terms of service indicate that that kind of content is prohibited; and where that requirement applies, new subsection (3C) provides that the age verification or age estimation must be highly effective.
Lord Parkinson of Whitley Bay (Con)

I beg to move.

Amendment 39 (to Amendment 38)

Moved by
39: Clause 11, at end insert—
“(3D) If the duty in subsection (3)(a) relates to pornographic content, the duty applies regardless of the size and capacity of a service.”
Member’s explanatory statement
This amendment does not allow a service to determine age verification or age estimation is not needed because of their size and capacity.
Lord Bethell (Con)

My Lords, I commend the Minister for the great strides forward which have been made since Committee. There remains one concern, which has necessitated a further amendment in my name relating to this group. In Committee, I and others probed whether pornographic content would be caught by the Bill. It is the opening words of Clause 11(3) which give rise to this concern: while the amendments helpfully put forward by the Government—which I wholeheartedly support—bolster the age-verification duties, those duties are still subject to qualification.

The Government’s amendments leave the beginning of Clause 11(3) unchanged. User-to-user services now have a duty to use age verification or age estimation, or both, to prevent children of any age from encountering primary priority content that is harmful to children. This duty is qualified by the words

“using proportionate systems and processes”.

It is that word “proportionate” that gives rise to concern, and which Amendment 39 seeks to address for pornographic content.

In a document produced by the Government in January 2021, the British Board of Film Classification said that there were literally millions of pornographic websites. This study did not include social media websites, some of which also host pornographic content—a point made by the Children’s Commissioner in her powerful recent report.

When the Government announced the new age-verification and age-estimation amendments on 30 June, their press release said that

“pornography companies, social media platforms and other services”

will

“be explicitly required to use age verification or estimation measures to prevent children accessing pornography”.

My question to the Minister is this: will all websites and social media be covered by the Bill? With millions of sites on the internet, it is not unreasonable to think that some sites will argue that, despite hosting pornographic content, they are not of a size or capacity that requires them to invest in age-verification or age-estimation technology.

A further concern relates to large, particularly social media, providers. A proportionality clause may leave it open to them to claim that while they host pornographic content, the amount of pornography or the number of children accessing the platform simply does not warrant age verification as it is statistically a small part of what they provide. I think most people expect that the Bill will ensure that all pornographic content, wherever it is found, is subject to age verification or estimation. In fact, I congratulated my noble friend the Minister on that point earlier this afternoon.

In Committee, many noble Lords across the House argued that Parts 3 and 5 should be subject to the same duties. I am pleased to say that this is the last anomaly regarding pornographic content in the Bill. The Government have gone a very long way to ensure that the duties across Parts 3 and 5 are identical, which is very welcome. However, websites which fall under the scope of Part 5 do not have any exceptions. There is no proportionality test: they must have age verification or estimation to meet that duty. All I am seeking to do with Amendment 39 is to ensure parity of regulation across the Bill.

Baroness Ritchie of Downpatrick (Lab)

My Lords, the final issue I raised in Committee is dealt with in this group on so-called proportionality. I tabled amendments in Committee to ensure that under Part 3 no website or social media service with pornographic content could argue that it should be exempt from implementing age verification under Clause 11 because to do so would be disproportionate based on its size and capacity. I am pleased today to be a co-signatory to Amendment 39 tabled by the noble Lord, Lord Bethell, to do just that.

The noble Lord, Lord Russell, and the noble Baroness, Lady Kidron, have also tabled amendments which raise similar points. I am disappointed that despite all the amendments tabled by the Minister, the issue of proportionality has not been addressed; maybe he will give us some good news on that this evening. It feels like the job is not quite finished and leaves an unnecessary and unhelpful loophole.

I will not repeat all the arguments I made in Committee in depth but will briefly recap that we all know that in the offline world, we expect consistent regulation regardless of size when it comes to protecting children. We do not allow a small corner shop to act differently from a large supermarket on the sale of alcohol or cigarettes. In a similar online scenario, we do not expect small or large gambling websites to regulate children’s access to gambling in a different way.

We know that the impact of pornographic content on children is the same whether it is accessed on a large pornographic website or a small social media platform. We know from the experience of France and Germany that pornographic websites will do all they can to evade age verification. As the noble Lord, Lord Stevenson, said on the eighth day of Committee, whether pornography

“comes through a Part 3 or Part 5 service, or accidently through a blog or some other piece of information, it has to be stopped. We do not want our children to receive it. That must be at the heart of what we are about, and not just something we think about as we go along”.—[Official Report, 23/5/23; col. 821.]

By not shutting off the proportionality argument, the Government are allowing different-sized online services to act differently on pornography and all the other primary priority content, as I raised in Committee. At that stage, the noble Baroness, Lady Kidron, said,

“we do not need to take a proportionate approach to pornography”.—[Official Report, 2/5/23; col. 1481.]

Amendment 39 would ensure that pornographic content is treated as a separate case with no loopholes for implementing age verification based on size and capacity. I urge the Minister to reflect on how best we can close this potential loophole, and I look forward to his concluding remarks.

Lord Russell of Liverpool (CB)

My Lords, I will briefly address Amendments 43 and 87 in my name. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to these amendments. They are complementary to the others in this group, on which the noble Lord, Lord Bethell, and the noble Baroness, Lady Ritchie, have spoken.

In Committee the Minister argued that it would be unfair to place the same child safety duties across all platforms. He said:

“This provision recognises that what it is proportionate to require of providers at either end of that scale will be different”.—[Official Report, 2/5/23; col. 1443.]


Think back to the previous group of amendments we debated. We talked about functionality and the way in which algorithms drive these systems. They drive you in all directions—to a large platform with every bell and whistle you might anticipate because it complies with the legislation, but also, willy-nilly, without any conscious thought because that is how it is designed, to a much smaller site. If we do not amend the legislation as it stands, they will take you to smaller sites that do not require the same level of safety duties, particularly towards children. I think we all fail to understand the logic behind that argument.

18:30
Child safety duties are based on the risk identified in the child risk assessments, which all services must carry out. If the risk is found to be low, the duties on that service will not be too onerous. If the risk is high, the duties should be onerous. Is the Minister seriously saying that, if the platform is small but the risk to children is high, because the platform is small it does not need the same level of safety duties as a large platform? That completely goes against the spirit and direction of the Bill.
Smaller is not safer. We know that, even with the very smallest platforms, real harm can transfer into the real world. I mentioned that I had been talking earlier today with an associate professor at UCL who has recently been looking at the world of incels—involuntary celibates. These forums often have very small memberships and numbers of visitors, but their ability to draw users into a very unpleasant world of anti-immigrant hate speech, stirring up communities against refugees and migrants and, in the case of incels, against women, is a real-world problem.
I simply ask the Minister to reflect and look carefully at this and, frankly, the illogicality of the Government’s current approach to see whether we can yet again improve the Bill—as he has on so many occasions.
Lord Allan of Hallam (LD)

My Lords, I follow the noble Lord, Lord Russell, particularly in talking about Amendments 43, 87 and 242, which raise some interesting and quite profound questions on what we are expecting from the market of internet services once the Online Safety Bill is in place.

It is worth taking a moment to remind ourselves of what we do and do not want from the Bill. We want services that are causing harm and are unwilling to take reasonable steps to address that to leave the UK market. That is clear. As a result of this legislation, it will be likely that some services leave the UK market, because we have asked them to do reasonable things and they have said no; they are not willing to comply with the law and therefore they need to be out. There is a whole series of measures in the Bill that will lead to that.

Equally, we want services that are willing to take reasonable steps to stay in the UK market, do the risk assessments, work at improvements and have the risks under control. They may not all be resolved on day one—otherwise, we would not need the legislation—but they should be on a path to address the risks that have been identified. We want those people to be in the market, for two reasons.

The first is that we want choice for people; we do not take pleasure in shutting people who are providing services out of the market. Also, from a child safety point of view, there is a genuine concern that, if you limit choice too far, you will end up creating more of a demand for completely unregulated services that sit outside the UK and will fill the gap. There is a balance in making sure that there is a range of attractive services, so that teenagers in particular feel that their needs are being met. We want those services to be regulated and committed to improvement.

Something that is in between will be a hard decision for Ofcom—something that is not great today, but not so bad that we want it out tomorrow. Ofcom will have to exercise considerable judgment in how it deals with those services. This is my interpretation of where proportionality and capacity come in. If you are running a very large internet service, something such as PhotoDNA, which is the technology that allows you to scan photos and detect child abuse images, is relatively straightforward to implement. All the major providers do it, but there are costs to that for smaller services. There are some real capacity challenges around implementing those kinds of technology. It is getting better over time and we would like them to do it, but you would expect Ofcom to engage in a conversation as a smaller service—smaller not in terms of its users but in its engineers and capacity—may need a little longer to implement such a technology.

A larger service could do proactive investigations. If it has a large team, once it has identified that something is problematic, it can investigate proactively. Again, a smaller service may not have the bodies on the ground to do that, but you would hope it would develop that capacity. It is important to recognise something about capacity if we are to encourage those that are half way between to come to the light side rather than slip off to the dark side.

I am interested in the Minister’s interpretation of these words and the instruction to Ofcom. We will be dependent on Ofcom, which will sit on the other side of a real or virtual table with the people who run these companies, as Ofcom can insist that they come in and talk to it. It will have to make these judgments, but we do not want it to be conned or to be a walkover for an organisation that has the capacity or could do things that are helpful, but is simply refusing to do them or somehow trying to pull the wool over Ofcom’s eyes.

Equally, we do not want Ofcom to demand the impossible of a service that genuinely struggles to meet a demand and that has things broadly under control. That is the balance and the difficult judgment. I think we are probably aiming for the same thing, and I hope the Minister is able to clarify these instructions and the way the Government expect Ofcom to interpret them. We are looking for that point at which Ofcom is seriously demanding but does not get overbearing and unnecessarily drive out of the market people who are making best efforts to do their risk assessments and then work hard to resolve those risks.

Baroness Harding of Winscombe (Con)

My Lords, as somebody who is only five feet and two inches, I have felt that size does not matter for pretty much all my life and have long wanted to say that in a speech. This group of amendments is really about how size does not matter; risk does. I will briefly build on the speech just given by the noble Lord, Lord Allan, very eloquently as usual, to describe why risk matters more than size.

First, there are laws for which size does matter—small companies do not need to comply with certain systems and processes—but not those concerned with safety. I have in my mind’s eye the small village fête, where we expect a risk assessment if we are to let children ride on rides. That was not the case 100 years ago, but is today because we recognise those dangers. One of the reasons why we stumbled into thinking that size should matter in this Bill is that we are not being honest about the scale of the risk for our children. If the risk is large enough, we should not be worrying about size; we should be worrying about that risk. That is the first reason why we have to focus on risk and not size.

The second reason follows from what I have just said—the principles of the physical world should apply to the online world. That is one of the core tenets of this Bill. That means that if you recognise the real and present risks of the digital world, you have to say that it does not matter that only a small number of people may be affected. If it is a small business, it still has an obligation not to put people in harm’s way.

Thirdly, small becomes big very quickly—unfortunately, that has not been true for me, but it is true in the digital world as Threads has just shown us. Fourthly, we also know that in the digital world re-engineering something once it has got very big is really difficult. There is also a practical reason why you want engineers to think about the risks before they launch services rather than after the event.

We keep being told, rightly, that this is a Bill about systems and processes. It is a Bill where we want not just the outcomes that the noble Lord, Lord Allan, has referred to in terms of services in the UK genuinely being safer; we are trying to effect a culture change. I would argue one of the most important culture changes is that any bright, young tech entrepreneur has to start by thinking about the risks and therefore the safety procedures they need to put in place as they build their tech business from the ground up and not once they have reached some artificial size threshold.

Baroness Kidron (CB)

My Lords, I have to admit that it was incompetence rather than lack of will that meant I did not add my name to Amendment 39 in the name of the noble Lord, Lord Bethell, and I would very much like the Government to accept his argument.

In the meantime, I wonder whether the Minister would be prepared to make it utterly clear that proportionality does not mean a little bit of porn to a large group of children or a lot of porn to a small group of children; rather, it means that high-risk situations require effective measures and low-risk situations should be proportionate to that. On that theme, I say to the noble Lord, Lord Allan, whose points I broadly agree with, that while we would all wish to see companies brought into the fold rather than being out of the fold, it rather depends on their risk.

This brings me neatly to Amendments 43 and 87 from the noble Lord, Lord Russell, to which I managed to add my name. They make a very similar point to Amendment 39 but across safety duties. Amendment 242 in my name, to which the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names, makes the same point—yet again—in relation to Ofcom’s powers.

All these things are pointing in the same direction as Amendment 245 in the name of the noble Baroness, Lady Morgan, which I keep on trumpeting from these Benches and which offers an elegant solution. I urge the Minister to consider Amendment 245 before day four of Report because, if the Government were to accept it, it would focus company resources, focus Ofcom resources and, as we discussed on the first day of Report, permit companies which do not fit the risk profile of the regime, and which are unable to comply with something that does not fit their model yet leaves them vulnerable to enforcement, to be treated in an appropriate way.

Collectively, the ambition is to make sure that we are treating things in proportion to the risk and that proportionate does not start meaning something else.

Lord Clement-Jones (LD)

My Lords, I agree with the noble Baroness, Lady Kidron, that all these amendments are very much heading in the same direction, and from these Benches I am extremely sympathetic to all of them. It may well be that this is very strongly linked to the categorisation debate, as the noble Baroness, Lady Kidron, said.

The amendment from the noble Lord, Lord Bethell, matters even more when we are talking about pornography in the sense that child safety duties are based on risks. I cannot for the life of me see why we should try to contradict that by adding in capacity and size and so on.

My noble friend made a characteristically thoughtful speech about the need for Ofcom to regulate in the right way and make decisions about risk and the capacity challenges of new entrants and so on. I was very taken by what the noble Baroness, Lady Harding, had to say. This is akin to health and safety and, quite frankly, it is a cultural issue for developers. What after all is safety by design if it is not advance risk assessment of the kinds of algorithm that you are developing for your platform? It is a really important factor.

18:45
I hope that we adopt that in AI regulation more broadly than simply online safety of this kind for social media platforms. We need a change of culture so that this is not just a question of developing without thinking about the ethical aspects of it. It is really important that we start with this kind of debate talking about assessing risk upfront. That should be the key test and not the size or capacity of a particular platform.
I support these amendments. I hope the Minister can give us some indication that we are all heading in the same direction as he is or that he is heading in the same direction as us. That would be enormously helpful.
Lord Knight of Weymouth (Lab)

My Lords, as we have heard, this is a small group of amendments concerned with preventing size and lack of capacity being used as a reasonable excuse for allowing children to be unsafe. Part of the problem is the complexity of the Bill and the way it has been put together.

For example, Clause 11, around user-to-user services, is the pertinent clause and it is headed “Safety duties protecting children”. Clause 11(2) is preceded in italics with the wording “All services” so anyone reading it would think that what follows applies to all user-to-user services regardless of size. Clause 11(3) imposes a duty on providers

“to operate a service using proportionate systems and processes”

to protect children from harm. That implies that there will be judgment around what different providers can be expected to do to protect children; for example, by not having to use a particular unaffordable technical solution on age assurance if they can show the right outcome by doing things differently. That starts to fudge things a little.

The noble Lord, Lord Bethell, who introduced this debate so well with Amendment 39, supported by my noble friend Lady Ritchie, wants to be really sure that the size of the provider can never be used to argue that preventing all children from accessing porn is disproportionate and that a few children slipping through the net might just be okay.

The clarity of Clause 11 unravels even further at the end of the clause, where in subsection (12)(b) it reads that

“the size and capacity of the provider of a service”

is relevant

“in determining what is proportionate”.

At that point the clause falls apart quite thoroughly; anyone reading it would struggle to be clear about what is supposed to happen.

Amendment 43 seeks to take that paragraph out, as we have heard from the noble Lord, Lord Russell, and would do the same for search in Amendment 87. I have added my name to these amendments because I fear that the ambiguity in the wording of this clause will give small and niche platforms an easy get out from ensuring that children are safe by design.

I use the phrase “by design” deliberately. We need to make a choice with this Bill even at this late stage. Is the starting point in the Bill children’s safety by design? Or is the starting point one where we do not want to overly disrupt the way providers operate their business first—which is to an extent how the speech from the noble Lord, Lord Allan, may have been heard—and then overlay children’s safety on top of that?

Yesterday, I was reading about how children access inappropriate and pornographic content, not just on Twitter, Instagram, Snapchat, TikTok and Pinterest but on Spotify and “Grand Theft Auto”—the latter being a game with an age advisory of “over 17” but which is routinely played by teenaged children. Wherever we tolerate children being online, there are dangers which must be tackled. Listening to the noble Baroness, Lady Harding, took me to where a big chunk of my day job in education goes to—children’s safeguarding. I regularly have to take training in safeguarding because of the governance responsibilities that I have. Individual childminders looking after one or two children have an assessment and an inspection around their safeguarding. In the real world we do not tolerate a lack of safety for children in this context. We should not tolerate it in the online world either.

The speech from the noble Lord, Lord Russell, reminded me of the breadcrumbing from big platforms into niche platforms that is part of that incel insight that he referenced. Content that is harmful to children can also be what some children are looking for, which keeps them engaged. Small, emergent services aggressively seeking growth could set algorithms accordingly. They must not be allowed to believe that engaging harmful content is okay until they get to the size that they need to be to afford the age-assurance technology which we might envisage in the Bill. I hope that the Minister shares our concerns and can help us with this problem.

Lord Parkinson of Whitley Bay (Con)

My Lords, short debates can be helpful and useful. I am grateful to noble Lords who have spoken on this group.

I will start with Amendment 39, tabled by my noble friend Lord Bethell. Under the new duty at Clause 11(3)(a), providers which allow pornography or other forms of primary priority content under their terms of service will need to use highly effective age verification or age estimation to prevent children encountering it where they identify such content on their service, regardless of their size or capacity. While the size and capacity of providers is included as part of a consideration of proportionality, this does not mean that smaller providers or those with less capacity can evade the strengthened new duty to protect children from online pornography. In response to the questions raised by the noble Baronesses, Lady Ritchie of Downpatrick and Lady Kidron, and others, no matter how much pornographic content is on a service, where providers do not prohibit this content they would still need to meet the strengthened duty to use age verification or age estimation.

Proportionality remains relevant for the purposes of providers in scope of the new duty at Clause 11(3)(a) only in terms of the age-verification or age-estimation measures that they choose to use. A smaller provider with less capacity may choose to go for a less costly but still highly effective measure. For instance, a smaller provider with less capacity might seek a third-party solution, whereas a larger provider with greater capacity might develop their own solution. Any measures that providers use will need to meet the new high bar of being “highly effective”. If a provider does not comply with the new duties and fails to use measures which are highly effective at correctly determining whether or not a particular user is a child, Ofcom can take tough enforcement action.

The other amendments in this group seek to remove references to the size and capacity of providers in provisions relating to proportionality. The principle of proportionate, risk-based regulation is fundamental to the Bill’s regulatory framework, and we consider that the Bill as drafted already strikes the correct balance. The Bill ultimately will regulate a large number of services, ranging from some of the biggest companies in the world to smaller, voluntary organisations, as we discussed in our earlier debate on exemptions for public interest services.

The provisions regarding size and capacity recognise that what it is proportionate to require of companies of various sizes and business models will be different. Removing this provision would risk setting a lowest common denominator standard which does not create incentives for larger technology companies to do more to protect their users than smaller organisations. For example, it would not be proportionate for a large multinational company which employs thousands of content moderators and which invests in significant safety technologies to argue that it is required to take only the same steps to comply as a smaller provider which might have only a handful of employees and a few thousand UK users.

While the size and capacity of providers is included as part of a consideration of proportionality, let me be clear that this does not mean that smaller providers or those with less capacity do not need to meet the child safety duties and other duties in the Bill, such as the illegal content safety duties. These duties set out clear requirements for providers. If providers do not meet these duties, they will face enforcement action.

I hope that is reassuring to my noble friend Lord Bethell and to the other noble Lords with amendments in this group. I urge my noble friend to withdraw his amendment.

Lord Bethell (Con)

My Lords, I thank my noble friend the Minister for that reassurance. He put the points extremely well. I very much welcome his words from the Dispatch Box, which go a long way towards clarifying and reassuring.

This was a short and perfectly formed debate. I will not go on a tour d’horizon of everyone who has spoken but I will mention the noble Lord, Lord Allan of Hallam. He is entirely right that no one wants gratuitously to hound out businesses from the UK that contribute to the economy and to our life here. There are good regulatory principles that should be applied by all regulators. The five regulatory principles of accountability, transparency, targeting, consistency and proportionality are all in the Legislative and Regulatory Reform Act 2006. Ofcom will embrace them and abide by them. That kind of reassurance is important to businesses as they approach the new regulatory regime.

I take on board what my noble friend the Minister said in terms of the application of regulations regardless of size or capacity, and the application of these strengthened duties, such as “highly effective”, regardless of any economic or financial capacity. I feel enormously reassured by what he has said. I beg leave to withdraw my amendment.

Amendment 39 (to Amendment 38) withdrawn.
Amendment 38 agreed.
Amendment 40 had been withdrawn from the Marshalled List.
Amendments 41 and 42
Moved by
41: Clause 11, page 11, line 1, leave out from beginning to “may” in line 2 and insert “Age verification or age estimation to identify who is or is not a child user or which age group a child user is in are examples of measures which (if not required by subsection (3A))”
Member’s explanatory statement
This amendment refers to age verification and age estimation as mentioned in the preceding amendment in my name, and clarifies the relationship between Clause 11(4) and new subsection (3A) of Clause 11 inserted by that amendment.
42: Clause 11, page 12, line 6, leave out “this section” and insert “section 11”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
Amendments 41 and 42 agreed.
Amendment 43 not moved.
Amendments 44 and 45
Moved by
44: Clause 11, page 12, line 12, leave out “this section” and insert “section 11”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
45: Clause 11, page 12, line 16, leave out “subsections (3)(b)” and insert “section 11(3)(b)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
Amendments 44 and 45 agreed.
Amendment 46 not moved.
Amendments 47 to 52
Moved by
47: Clause 11, page 12, line 21, leave out “subsections (3)” and insert “section 11(3)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
48: Clause 11, page 12, line 24, leave out “this section” and insert “section 11”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
49: Clause 11, page 12, line 27, leave out from “if” to “the” in line 29 and insert “age verification or age estimation is used on the service with”
Member’s explanatory statement
This amendment provides that a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it.
50: Clause 11, page 12, line 31, after “In” insert “section 11 and”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
51: Clause 11, page 12, line 33, leave out “this section” and insert “section 11”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 11 into two Clauses.
52: Clause 11, divide Clause 11 into two clauses, the first (Safety duties protecting children) to consist of subsections (1) to (11) and the second (Safety duties protecting children: interpretation) to consist of subsections (12) to (19)
Member’s explanatory statement
This amendment splits up Clause 11 into two Clauses.
Amendments 47 to 52 agreed.
Amendment 53
Moved by
53: After Clause 11, insert the following new Clause—
“Assessment duties: user empowerment
(1) This section sets out the duties about assessments related to adult user empowerment which apply in relation to Category 1 services (in addition to the duties about risk assessments set out in section 8 and, in the case of Category 1 services likely to be accessed by children, section 10).
(2) A duty to carry out a suitable and sufficient assessment for the purposes of section 12(2) at a time set out in, or as provided by, Schedule 3.
(3) A duty to take appropriate steps to keep such an assessment up to date.
(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient assessment for the purposes of section 12(2) relating to the impacts of that proposed change.
(5) An assessment of a service “for the purposes of section 12(2)” means an assessment of the following matters—
(a) the user base;
(b) the incidence of relevant content on the service;
(c) the likelihood of adult users of the service encountering, by means of the service, each kind of relevant content (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;
(d) the likelihood of adult users with a certain characteristic or who are members of a certain group encountering relevant content which particularly affects them;
(e) the likelihood of functionalities of the service facilitating the presence or dissemination of relevant content, identifying and assessing those functionalities more likely to do so;
(f) the different ways in which the service is used, and the impact of such use on the likelihood of adult users encountering relevant content;
(g) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to strengthen adult users’ control over their interaction with user-generated content, and other systems and processes) may reduce or increase the likelihood of adult users encountering relevant content.
(6) In this section “relevant content” means content to which section 12(2) applies (content to which user empowerment duties set out in that provision apply).
(7) See also—
(a) section 19(8A) and (9) (records of assessments), and
(b) Schedule 3 (timing of providers’ assessments).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to carry out and update as necessary an assessment about how likely it is that adult users will encounter content to which Clause 12(2) applies (suicide and self-harm content and so on - see Clause 12(10), (11) and (12)).
Amendment 54 (to Amendment 53) not moved.
Amendment 53 agreed.
Clause 12: User empowerment duties
Amendments 55 and 56 not moved.
Amendment 57
Moved by
57: Clause 12, page 13, line 9, after “(2)” insert “(“control features”)”
Member’s explanatory statement
This amendment is a technical drafting change related to the next amendment in my name.
Amendment 57 agreed.
Amendments 58 and 59 not moved.
Amendments 60 to 62
Moved by
60: Clause 12, page 13, line 10, at end insert—
“(4A) A duty to operate a service using a system or process which seeks to ensure that all registered adult users are offered the earliest possible opportunity, in relation to each control feature included in the service, to take a step indicating to the provider that—
(a) the user wishes to retain the default setting for the feature (whether that is that the feature is in use or applied, or is not in use or applied), or
(b) the user wishes to change the default setting for the feature.
(4B) The duty set out in subsection (4A)—
(a) continues to apply in relation to a user and a control feature for so long as the user has not yet taken a step mentioned in that subsection in relation to the feature;
(b) no longer applies in relation to a user once the user has taken such a step in relation to every control feature included in the service.”
Member’s explanatory statement
This amendment imposes a new duty on providers of Category 1 services to proactively ask all registered adult users whether they wish to opt in or opt out of any features offered in compliance with the duty in subsection (2), until a choice is made.
61: Clause 12, page 13, line 12, leave out from “which” to “and” in line 13 and insert “control features are offered”
Member’s explanatory statement
This amendment is a technical drafting change related to the preceding amendment in my name.
62: Clause 12, page 13, line 13, at end insert—
“(5A) A duty to summarise in the terms of service the findings of the most recent assessment of a service under section (Assessment duties: user empowerment) (assessments related to the duty set out in subsection (2)).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to summarise in their terms of service the findings of their latest assessment under the new clause proposed after Clause 11 in my name.
Amendments 60 to 62 agreed.
Amendments 63 and 64 not moved.
Amendments 65 to 73
Moved by
65: Clause 12, page 13, line 24, leave out “subsection (2)” and insert “section 12(2)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
66: Clause 12, page 13, line 26, leave out paragraph (a) and insert—
“(a) all the findings of the most recent assessment under section (Assessment duties: user empowerment), and”
Member’s explanatory statement
This amendment makes it clear that the findings of the latest assessment under the new Clause proposed after Clause 11 in my name are a relevant factor for the purposes of determining what it is proportionate for a provider to do to comply with the duty under Clause 12(2).
67: Clause 12, page 13, line 29, leave out “Subsection (2)” and insert “Section 12(2)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
68: Clause 12, page 14, line 3, at end insert—
“(12A) The duty set out in section 12(4A) applies in relation to all registered adult users, not just those who begin to use a service after that duty begins to apply.”
Member’s explanatory statement
This amendment makes it clear that the new duty on providers to offer registered users a choice about whether to use the user empowerment tools applies to existing as well as new users.
69: Clause 12, page 14, line 4, after “In” insert “section 12 and”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
70: Clause 12, page 14, line 12, after “In” insert “section 12 and”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
71: Clause 12, page 14, line 16, after first “of” insert “section 12 and”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
72: Clause 12, page 14, line 21, at end insert—
“(16) See also, in relation to duties set out in section 12, section 18 (duties about freedom of expression and privacy).”
Member’s explanatory statement
This amendment inserts a signpost to Clause 18, to which the duties in Clause 12 are relevant.
73: Clause 12, divide Clause 12 into two clauses, the first (User empowerment duties) to consist of subsections (1) to (7) and the second (User empowerment duties: interpretation) to consist of subsections (8) to (16)
Member’s explanatory statement
This amendment splits up Clause 12 into two Clauses.
Amendments 65 to 73 agreed.
Clause 16: Duty about content reporting
Amendment 74
Moved by
74: Clause 16, page 19, line 26, leave out from “if” to “the” in line 28 and insert “age verification or age estimation is used on the service with”
Member’s explanatory statement
This amendment provides that a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it.
Amendment 74 agreed.
Clause 17: Duties about complaints procedures
Amendments 75 and 76
Moved by
75: Clause 17, page 21, line 2, leave out “11(3)” and insert “11(2) or (3)”
Member’s explanatory statement
This amendment is about complaints of content being blocked because of an incorrect assessment of a user’s age. A reference to Clause 11(2) is inserted, as the duty in that provision can also be complied with by using age verification or age estimation.
76: Clause 17, page 21, line 16, leave out from “if” to “the” in line 18 and insert “age verification or age estimation is used on the service with”
Member’s explanatory statement
This amendment provides that a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it.
Amendments 75 and 76 agreed.
19:00
Clause 18: Duties about freedom of expression and privacy
Amendment 77
Moved by
77: Clause 18, page 21, line 30, after “implementing,” insert “terms of service,”
Member’s explanatory statement
This amendment, and others in the name of Baroness Fox of Buckley, ensure free speech is not just considered at an abstract policy level but is included in providers’ terms of service.
Baroness Fox of Buckley (Non-Afl)

My Lords, I am rather disappointed that, while this is a large group on freedom of expression, it is dominated by amendments by myself and the noble Lord, Lord Moylan. I welcome the noble Baroness, Lady Fraser of Craigmaddie, and the noble Lord, Lord Stevenson of Balmacara, dipping their toes in the free-expression water here and I am glad that the Minister has added his name to their amendment, although it is a shame that he did not add his name to one of mine.

Earlier today we heard a lot of congratulations to the Government for listening. I have to say, it depends who you are, because the Government have not listened to all of us. It is notable that, of the hundreds of new government concessions that have taken the form of amendments on Report, none relates to free speech. Before I go through my amendments, I want to note that, when the noble Lord, Lord Moylan, and I raise concerns about free speech, it can be that we get treated as being slightly eccentric. There has been a generally supportive and generous mood from the regulars in this House. I understand that, but I worry that free speech is being seen as peripheral.

This country, our country, that we legislate for and in, has a long history of boasting that it is the home of liberty and adopts the liberal approach that being free is the default position: that free speech and the plurality and diversity of views it engenders are the cornerstone of democracy in a free society and that any deviation from that approach must require extraordinary and special justification. A comprehensive piece of law, such as the one we are dealing with, that challenges many of those norms, deserves thorough scrutiny through the prism of free speech.

When I approached this Bill, which I had been following long before I arrived in this House, I assumed that there would be packed Benches—as there are on the Illegal Migration Bill—and that everybody, including all these Law Lords, would be in, quoting the European Court of Human Rights on Article 8 and Article 10. I assumed there would be complaints about Executive power grabs and so on. But it has been a bit sparse.

That is okay; I can live with that, even if it is a bit dispiriting. But I am concerned when the Government cite that the mood of the Committee has been reflected in their amendments, because it has not been a very large Committee. Many of the amendments that I, the noble Lord, Lord Moylan, and others tabled about free expression represent the concerns of a wide range of policy analysts, civil rights groups, academics, lawyers, free speech campaigners and industry representatives. They have been put forward in good faith—I continue to do that—to suggest ways of mitigating some of the grave threats to free speech in this Bill, with constructive ideas about how to tackle flaws and raising some of the problems of unintended consequences. I have, at times, felt that those concerns were batted away with a certain indifference. Despite the Minister being very affable and charming, none the less it can be a bit disappointing.

Anyway, I am here to bat again. I hope that the Government now will listen very closely and consider how to avoid the UK ending up with the most restrictive internet speech laws of any western democracy at the end of this. I have a lot of different amendments in my name in this group. I wholeheartedly support the amendments in the name of the noble Lord, Lord Moylan, requiring Ofcom to assess the impact of its codes on free speech, but I will not speak to them.

I will talk about my amendments, starting with Amendments 77, 78, 79, 80 and 81. These require platforms to have particular regard to freedom of expression, not just when implementing safety measures and policies but when writing their terms of service. This is to ensure that freedom of expression is not reduced to an abstract “have regard to” secondary notion but is visible in drafting terms of service. This would mean that users know their rights in clear and concrete terms. For example, a platform should be expected to justify how a particular term of service, on something such as religious hatred, will be balanced with consideration of freedom of expression and conscience, in order to allow discussions over different beliefs to take place. Users need to be able to point to specific provisions in the terms of service setting out their free speech protections.

This is all about parity between free speech and safety. Although the Government—and I welcome this—have attempted some balance, via Clause 18, to mitigate the damage to individual rights of free expression from the Bill, it is a rather weak, poor cousin. We need to recognise that, if companies are compelled to prevent and minimise so-called harmful content via operational safety duties, these amendments are saying that there should be parity with free expression. They should be compelled to do the same on freedom of expression, with a clear and positive duty, rather than Clause 64, which is framed rather negatively.

Amendment 188 takes on the issue of terms of service from a different direction, attempting to ensure that duties with regard to safety must not be allowed to restrict lawful expression or that protected by Article 10 of the European Convention on Human Rights. That states that interference in free speech rights is not lawful unless it is a last resort. I note, in case anyone is reading the amendment carefully, and for Hansard, that the amendment cites Article 8—a rather Freudian slip on my part that was not corrected by the Table Office. That is probably because privacy rights are also threatened by the Bill, but I meant Article 10 of course.

Amendment 188 addresses a genuine dilemma in terms of Ofcom enforcing safety duties via terms and conditions. These will transform private agreements between companies and users into statutory duties under Clause 65. This could mean that big tech companies would be exercising public law functions by state-backed enforcement of the suppression of lawful speech. One worry is that platforms’ terms of service are not neutral; they can change due to external political or commercial pressures. We have all been following with great interest what is happening at Twitter. They are driven by values which can be at odds with UK laws. So I hope the Minister will answer the query that this amendment poses: how is the UK able to uphold its Article 10 obligations if state regulators are legally instructed to enforce terms of service attitudes to free speech, even when they censor far more than UK domestic law requires?

Amendment 162 has a different focus and removes offences under Section 5 of the Public Order Act from the priority offences to be regulated as priority illegal content, as set out in Schedule 7. This amendment is prompted by a concern that the legislation enlists social media companies to act as a private online police force and to adjudicate on the legality of online content. This is especially fraught in terms of the legal limits on speech, where illegality is often contested and contentious—offline as well as online.

The inclusion of Section 5 would place a duty on service providers to take measures to prevent individuals ever encountering content that includes

“threatening or abusive words or behaviour, or disorderly behaviour”

that is likely to cause “harassment, alarm or distress”. It would also require service providers to minimise the length of time such content is present on the service.

I am not sure whether noble Lords have been following the dispute that broke out over the weekend. There is a film on social media doing the rounds of a trans speaker, Sarah Jane Baker, at the Trans Pride event screaming pretty hysterically “If you see a TERF, punch them in the effing face”—and I am being polite. You would think that that misogynistic threat would be the crime people might be concerned about, yet some apologists for Trans Pride claim that those women—TERFs such as myself—who are outraged, and have been treating the speech as saying that, are the ones who are stirring up hate.

Now, that is a bit of a mess, but asking service providers, or indeed algorithms, to untangle such disputes can surely lead only to the over-removal of online expression, or even more of a muddle. As the rule of law charity Justice points out, this could also catch content that depicts conflict or atrocities, such as those taking place in the Russia-Ukraine war. Justice asks whether the inclusion of Section 5 of the POA could lead to the removal of posts by individuals sharing stories of their own abuse or mistreatment on internet support forums.

Additionally, under Schedule 7 to the Bill, versions of Section 5 could also be regulated as priority illegal conduct, meaning that providers would have to remove or restrict content that, for instance, encourages what is called disorderly behaviour that is likely to cause alarm. Various organisations are concerned that this could mean that content that portrayed protest activity, that might be considered disorderly by some, was removed unless you condemned it, or even that content which encouraged people to attend protests would be in scope.

I am not a fan of Section 5 of the Public Order Act, which criminalises stirring up hatred, at the best of times, but at least those offences have been and are subject to the full rigour of the criminal justice system and case law. Of course, the courts, the CPS and the police are also bound, for example by Article 10, to protect free speech. But that is very different to compelling social media companies, their staff or automated algorithms to make such complex assessments of the Section 5 threshold of illegality. Through no fault of their own, those companies are just not qualified to make such determinations, and it is obvious that that could mean that legitimate speech will end up being restricted. Dangerously, it also makes a significant departure from the UK’s rule of law in deciding what is legal or illegal speech. It has the potential to limit UK users’ ability to engage in important aspects of public life, and prevent victims of abuse from sharing their stories, as I have described.

I turn finally to the last amendment, Amendment 275—I will keep this short, for time’s sake. I will not go into detail, but I hope that the Minister will take a look at it, see that there is a loophole, and discuss it with the department. In skeleton form, the Free Speech Union has discovered that the British Board of Film Classification runs a mobile classification network, an agreement with mobile network providers that means that it advises mobile providers on what content should be filtered because it is considered suitable for adults only. This arrangement is private, not governed by statute, and as such means that even the weak free speech safeguards in this Bill can be sidestepped. This affects not only under-18s but anyone with factory settings on their phone. It led to a particularly bizarre outcome when last year the readers of the online magazine, “The Conservative Woman”, reported that the website was inaccessible. This small online magazine was apparently blacklisted by the BBFC because of comments below the line on its articles. The potential for such arbitrary censorship is a real concern, and the magazine cannot even appeal to the BBFC, so I ask the Minister to take this amendment back to the DCMS, which helped set up this mobile classification network, and find out what is going on.

That peculiar tale illustrates my concerns about what happens when free speech is not front and centre, even when you are concerned about safety and harm. I worry that when free speech is casually disregarded, censorship and bans can become the default, and a thoughtless option. That is why I urge the Minister before Third Reading to at least make sure that some of the issues and amendments in this group are responded to positively.

Lord Moylan (Con)

My Lords, my noble friend on the Front Bench said at various points when we were in Committee that the Bill struck an appropriate balance between protecting the rights of children and the rights of those wishing to exercise their freedom of expression. I have always found it very difficult indeed to discern that point of balance in the Bill as originally drafted, but I will say that if there were such a point, it has been swamped by the hundreds of amendments tabled to the Bill by my noble friend since Committee which push the Bill entirely in the opposite direction.

Among those amendments, I cannot find—it may be my fault, because I am just looking by myself; I have no help to find these things—a single one which seeks to redress the balance back in favour of freedom of expression. My Amendments 123, 128, 130, 141, 148 and 244 seek to do that to some extent, and I am grateful to the noble Baroness, Lady Fox of Buckley, for the support she has expressed for them.

19:15
The Bill includes various provisions requiring providers to have regard to freedom of speech. The group of amendments I am about to speak to addresses the fact that nowhere in the Bill is there any obligation placed on Ofcom to have regard to freedom of speech, unless—it is just possible—it is in that swathe of amendments that has been tabled which I have been swimming in over the weekend looking for a few pearls but finding none. I will take noble Lords through those amendments very briefly, and say what they actually do. I cannot see that there could be any objection to them from any Member of the House, but it seems that the Government are uninterested.
Amendment 123 appears in Chapter 6 of the Bill, which deals with codes of conduct, and imposes a new duty on Ofcom in exercising the functions listed in that regard to
“have special regard to the importance of protecting the rights of users of a service and … interested persons to freedom of expression within the law”.
Who can object to that? Amendment 128 requires Ofcom to issue a statement when it is issuing a code of conduct, showing how it has complied with this new duty. Amendment 130 requires the Secretary of State to lay that statement before Parliament alongside the draft code of conduct, when he lays it.
Amendment 141 relates to how Ofcom responds to a direction—we are moving away from codes of conduct now—from the Secretary of State made under Clause 39. It requires the document Ofcom submits to the Secretary of State in response to that direction to specify how it has complied with the new duty. Amendment 148 requires Ofcom to issue a similar statement to accompany minor amendments to a code of conduct. Amendment 244 imposes a similar duty on Ofcom in relation to the issuance of guidance, which is separate from codes of conduct and from responding to a direction from the Secretary of State. Finally, Amendment 269, which is slightly different, relates to Ofcom’s guidance to providers on enforcement activities, because Ofcom is required to give guidance to providers on how they conduct enforcement activities. Again, it requires Ofcom to have regard to freedom of speech in doing that.
These are not wild and woolly demands. One of the briefs I have received in relation to Report stage reads as follows:
“The proposed duties on providers of Category 1 Services … that seek to protect freedom of expression … should be replaced with a single comprehensive duty to protect the right to freedom of expression. This duty should require Category 1 Service providers to take all reasonable steps to ensure that freedom of expression is not infringed by measures taken to comply with the other duties in the Bill. This should include giving the duty to protect freedom of expression similar status and form as the duties on illegal content”.
That does not come from some strange right-wing think tank; it comes from the Equality and Human Rights Commission.
If I may briefly trespass on the House by quoting a little bit more, the commission goes further:
“The duty to protect the right to freedom of expression should be included in the list of relevant duties for which Ofcom will be required to develop a code of practice”.
I think I referred to it as a code of conduct in my remarks so far; code of practice is the correct term. This is precisely what part of one of my amendments seeks to do, so these recommendations have a very good pedigree. I cannot see, for the life of me, why the Government would want to resist them.
Before I sit down, I will turn briefly to some of the amendments in the name of the noble Baroness, Lady Fox of Buckley, to which I have added my name. Amendment 162 seeks to remove the offence in Section 5 of the Public Order Act from the list of priority illegal content. This provision criminalises
“threatening or abusive words or behaviour, or disorderly behaviour”
likely to lead to “harassment, alarm or distress”. In this House, we spent a considerable amount of time last year, in relation to the then Police, Crime, Sentencing and Courts Bill, seeking new statutory guidance—which has subsequently arrived—to the College of Policing about how this complex and difficult offence should be enforced in the non-virtual world. Here we are, in effect, simply handing it over to private companies, many of them abroad, under the strange and remote supervision of Ofcom, and assuming it is all going to work very well. It is not. The example of transgender disputes, given by the noble Baroness, is a particularly rich example of how difficult it is going to be for private companies to enforce it online. There is a strong case for removing it altogether from the Bill.
Amendment 188 relates to Clause 65, which is about providers’ terms of service. It requires Ofcom to enforce terms of service in a way that is compatible with our rights to freedom of expression under the European Convention on Human Rights. Noble Lords may remember that many of these companies come from countries that are not European. They do not live under the legal cosh of the European Convention on Human Rights. Why should we be enforcing terms of service that might be perfectly legal in California, or Russia, or Kazakhstan—we do not know where the next popular phenomena on the web are going to come from—without having the restriction placed on Ofcom that it cannot be done in a way that contravenes, or at least does not uphold, our rights under the European convention?
I turn to Amendment 275 and the peculiar discovery that the British Board of Film Classification is running its own parallel censorship system—they used to be called censors, so I think I can call them that fairly—in an entirely private arrangement that has no supervision from Ofcom at all. The suggestion is that perhaps, if we are going to have one system supervised by Ofcom, everything might be brought within it so that we have a degree of consistency. Again, I find it very hard to understand why the Government would resist an amendment that is so pellucidly commonsensical.
With that, I will sit down. I do not think these issues are going to go away; there is a very strong public interest in this, as there is, increasingly in recent days, in various other amendments that are going to come up later in the Bill. By pushing things through in the way we have with the amendments the Government have conceded to those who argue for stronger enforcement and more restriction on access to the internet, it may all pass through the Commons and simply become law. I seriously have my doubts, as I have expressed in relation to, for example, Wikipedia and the threat to Welsh Wicipedia, whether some of this is going to survive first contact with reality. The amendments I propose would make it easier to do so.
Lord Hope of Craighead (CB)

My Lords, I speak to Amendments 286 and 294, which are the last two amendments in this group, and I will explain what they are about. They are in the name of the noble Baroness, Lady Fraser of Craigmaddie, who unfortunately cannot be here this evening, to which I and the noble Lord, Lord Stevenson of Balmacara, have added our names, as has the Minister, for which we are very grateful. They serve a simple purpose: they seek to insert a definition of the phrase “freedom of expression” into the list of definitions in Clause 211 and add it to the index of defined expressions in Clause 212.

They follow an amendment which I proposed in Committee. My amendment at that stage was to insert the definition into Clause 18, where the phrase

“freedom of expression within the law”

appears. It was prompted by a point made by the Constitution Committee in its report on the Bill, which said that the House might wish to consider defining that expression in the interests of legal certainty.

The same point arose when the House was considering the then Higher Education (Freedom of Speech) Bill. Following a similar amendment by me, a government amendment on Report, to achieve the same result, was agreed to that Bill. My amendment in Committee on this Bill adopted the same wording as the government amendment to that Bill. In his response to what I said in Committee, the Minister pointed out, quite correctly, that the Higher Education (Freedom of Speech) Act and this Bill serve quite different purposes, but he did say that the Bill team—and he himself—would consider our amendment closely between then and Report.

What has happened since is the amendment we are now proposing, which has undergone some changes since Committee. They are the product of some very helpful discussions with the Bill team. The most important is that the definition placed in Clause 211 extends to the use of the expression “freedom of expression” wherever it appears in the Bill, which is obviously a sensible change. It also now includes the word “receive” as well as the word “impart”, so that it extends to both kinds of communication that are within the scope of the Bill. The words “including in electronic form”, which are in my amendment, have been removed as unnecessary, as the Bill is concerned with communications in electronic form only.

There are also two provisions in the Bill which refer to freedom of expression to which, as the definition now makes clear, this definition is not to apply. They are in Clauses 36(6)(f) and 69(2)(d). This is because the context in which the expression is used there is quite different. They require Ofcom to consult people with expertise as to this right when preparing codes of conduct. They are not dealing with the duties of providers, which is what the definition aims to do.

As the discussion in Committee showed, and as the noble Baroness, Lady Fox, demonstrated again this evening, we tend to use the phrases “freedom of speech” and “freedom of expression” interchangeably, perhaps without very much thought as to what they really mean and how they relate to other aspects of the idea. That is why legal certainty matters when they appear in legislation. The interests of legal certainty will be met if this definition finds a place in the Bill, and it makes it clear that the reference is to the expression referred to in Article 10(1) of the convention as it has effect for the purposes of the Human Rights Act. That is as generous and comprehensive a definition as one would wish to have for the purposes of the Bill.

I am grateful to the Minister for his support and to the Bill team for their help. When the time comes, either the noble Baroness, Lady Fraser, or I will move the amendment; it comes at the very end of the Bill so it will be at the last moment of the last day, when we are finishing Report. I look forward to that stage, as I am sure the Minister does himself.

Lord Allan of Hallam (LD)

My Lords, I want to respond to some of the comments made by the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan. I have been looking forward to this debate equally, as it touches on some crucial issues. One of the mistakes of the Bill that I place on the Government is that it was sold as somehow a balancing Bill. It is not; it is a speech-limiting Bill, as all Bills of this kind are. Its primary purpose is to prevent people in the United Kingdom encountering certain types of content.

If you support the Bill, it is because you believe that those restrictions are necessary and proportionate in the context of Article 8. Others will disagree. We cannot pretend that it is boosting free speech. The United States got it right in its first amendment. If you want to maximise speech, you prohibit your parliament regulating on speech: “Congress shall make no law that limits speech”. As soon as you start regulating, you tend towards limitations; the question in the UK and European contexts is whether those limitations are justified and justifiable.

19:30
I happen to think that certain limitations are, and there are reasons for that—not least, as we have to remind ourselves, because the Bill does not regulate the entire internet. As we discussed when we talked about exemptions, most direct speech by an individual in the United Kingdom remains unaffected. Email is unaffected; personal websites are unaffected. It regulates search and user to user. If you have concerns, as perhaps the noble Baroness, Lady Fox, does, you may feel that it goes too far, but we should be careful not to equate social media with the entire internet. When you are thinking about one’s right to speak, all those channels matter, not just the channels we are talking about. There is a case for saying that restrictions are necessary and proportionate with respect to Article 8, in the context of a regulation that regulates part—albeit an important part—of the internet.
Another thing to recognise—and this is where I perhaps depart from the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan—is that we are in a sense dealing with privately managed public spaces on the internet. There is a lot of debate around this but, for me, they are functionally equivalent to other privately managed public spaces such as pubs, hotels or sports grounds. In none of those contexts do we expect all legal speech to be permissible. Rather, they all have their own norms and they enforce them. I cannot go into a sports ground and say what I like; I will get thrown out if I carry out certain actions within most of those public spaces. We are talking about privately managed public spaces; anyone can go in but, in entering that space, you have to conform to the norms of that space. As I said, I am not aware of many spaces where all legal speech is permitted.
Lord Moylan (Con)

I understand the point the noble Lord is making but, if he were thrown out, sacked or treated in some other way that was incompatible with his rights to freedom of expression under Article 10 of the European convention, he would have cause for complaint and, possibly, cause for legal redress.

Lord Allan of Hallam (LD)

That point is well made. In support of that, if the public space treated me in a discriminatory way, I would expect to have redress, but I do not think I have a right in every public space to say everything I like in the classic Article 8 sense. My right vis-à-vis the state is much broader than my right vis-à-vis any public space that I am operating in where norms apply as well as my basic legal rights. Again, to take the pub example, if I went in and made a racist speech, I may well be thrown out of the pub even though it is sub-criminal and the police are never called; they do not need to be as the space itself organises it.

I am making the point that terms of service are about managing these privately managed public services, and it would be a mistake to equate them entirely with our right to speak or the point at which the state can step in and censor us. I understand the point about state interference but it cuts both ways: both the state interfering in excessively censoring what we can say but also the state potentially interfering in the management of what is, after all, a private space. To refer back to the US first amendment tradition, a lot of that was about freedom of religion and precisely about enabling heterodoxy. The US did not want an orthodoxy in which one set of rules applied everywhere to everybody. Rather, it wanted people to have the right to dissent, including in ways that were exclusive. You could create your own religious sect and you could not be told not to have those beliefs.

Rolling that power over to the online world, online services, as long as they are non-discriminatory, can have quite different characters. Some will be very restrictive of speech like a restrictive religious sect; some will be very open and catholic, with a small “c”, in the sense of permitting a broad range of speech. I worry about some of the amendments in case there is a suggestion that Ofcom would start to tell a heterodox community of online services that there is an orthodox way to run their terms of service; I would rather allow this to be a more diverse environment.

Having expressed some concerns, I am, though, very sympathetic to Amendment 162 on Section 5 of the Public Order Act. I have tried in our debates to bring some real experience to this. There are two major concerns about the inclusion of the Public Order Act in the Bill. One is a lack of understanding of what that means. If you look at the face of the language that has been quoted at us, and go back to that small service that does not have a bunch of lawyers on tap, it reads as though it is stopping any kind of abusive content. Maybe you will google it, as I did earlier, and get a little thing back from the West Yorkshire Police. I googled: “Is it illegal to swear in the street?”. West Yorkshire Police said, “Yes, it is”. So if you are sitting somewhere googling to find out what this Public Order Act thing means, you might end up thinking, “Crikey, for UK users, I have to stop them swearing”. There is a real risk of misinterpretation.

The second risk is that of people deliberately gaming the system; again, I have a real-life example from working in one of the platforms. I had people from United Kingdom law enforcement asking us to remove content that was about demonstrations by far-right groups. They were groups I fundamentally disagree with, but their demonstrations did not appear to be illegal. The grounds cited were that, if you allow this content to go ahead and the demonstration happens, there will be a Public Order Act offence. Once you get that on official notepaper, you have to be quite robust to say, “No, I disagree”, which we did on occasion.

I think there will be other services that receive Public Order Act letters from people who seem official and they will be tempted to take down content that is entirely legal. The critical thing here is that that content will often be political. In other parts of the Bill, we are saying that we should protect political speech, yet we have a loophole here that risks that.

I am sure the Minister will not concede these amendments, but I hope he will concede that it is important that platforms are given guidance so that they do not think that somebody getting upset about a political demonstration is sufficient grounds to remove the content as a Public Order Act offence. If you are a local police officer it is much better to get rid of that EDL demonstration, so you write to the platform and it makes your life easier, but I do not think that would be great from a speech point of view.

Finally, I turn to the point made by the noble Lord, Lord Moylan, on Amendment 188 about the ECHR Article 8 exemption. As I read it, if your terms of service are not consistent with ECHR Article 8—and I do not think they will be for most platforms—you then get an exemption from all the other duties around appeals and enforcing them correctly. It is probably a probing amendment but it is a curious way of framing it; it essentially says that, if you are more restrictive, you get more freedom in terms of the Ofcom relationship. I am just curious about the detail of that amendment.

It is important that we have this debate and understand this relationship between the state, platforms and terms of service. I for one am persuaded that the general framework of the Bill makes sense; there are necessary and proportionate restrictions. I am strongly of the view that platforms should be allowed to be heterodox in their terms of service. Ofcom’s job is very much to make sure that they are done correctly but not to interfere with the content of those terms of service beyond that which is illegal. I am persuaded that we need to be extraordinarily careful about including Public Order Act offences; that particular amendment needs a good hearing.

Baroness Stowell of Beeston (Con)

My Lords, I have said several times when we have been debating this Bill—and I will probably say it again when we get to the group about powers—that, for me, the point of the Online Safety Bill is to address the absence of accountability for the extraordinary power that the platforms and search engines have over what we see online and, indeed, how we live and engage with each other online. Through this Bill, much greater responsibility for child safety will be placed on the platforms. That is a good thing; I have been very supportive of the measures to ensure that there are strong protections for children online.

The platforms will also have responsibility, though, for some measures to help adults protect themselves. We must not forget that, the more responsibility that platforms have to protect, the more power we could inadvertently give them to influence what is an acceptable opinion to hold, or to shape society to such an extent that they can even start to influence what we believe to be right or wrong—we are talking about that significant amount of power.

I was of the camp that was pleased when the Government removed the legal but harmful aspects of the Bill, because for me they represented a serious risk to freedom of expression. As I just described, I felt that they risked too much inadvertent power, as it were, going to the platforms. But, with the Government having done that, we have seen through the passage of the Bill some push-back, which is perfectly legitimate and understandable—I am not criticising anyone—from those who were concerned about that move. In response to that, the Government amended the Bill to provide assurances and clarifications on things like the user-empowerment tools. As I said, I do not have any problem; although I might not necessarily support some of the specific measures that were brought forward, I am okay with that as a matter of principle.

However, as was explained by my noble friend Lord Moylan and the noble Baroness, Lady Fox, there has not been a similar willingness from the Government to reassure those who remain concerned about the platforms’ power over freedom of expression. We have to bear in mind that some people’s concerns in this quarter remained even when the legal but harmful change was made—that is, the removal of legal but harmful was a positive step, but it did not go far enough for some people with concerns about freedom of expression.

I am sympathetic to the feeling behind this group, which was expressed by my noble friend and the noble Baroness, Lady Fox. I am sympathetic to many of the amendments. As the noble Lord, Lord Allan of Hallam, pointed out, specifically Amendment 162 in relation to the Public Order Act seems worthy of further consideration by the Government. But the amendments in the group that caught my attention place a specific duty on Ofcom in regard to freedom of expression when drawing up or amending codes of practice or other guidance—these amendments are in my noble friend Lord Moylan’s name. When I looked at them, I did not think that they undermined anything else that the Government brought forward through the amendments to the Bill, as he said, but I thought that they would go a long way towards enforcing the importance of freedom of expression as part of this regulatory framework—one that we expect Ofcom to attach serious importance to.

I take on board what the noble Lord, Lord Allan, said about the framework of this legislation being primarily about safeguarding and protection. The purpose of the Bill is not to enhance freedom of expression, but, throughout its passage, that has none the less always been a concern. It is right that the Government seek to balance these two competing fundamental principles. I ask whether more can be done—my noble friend pointed to the recommendations of the Equality and Human Rights Commission and how they reinforce some of what he proposed. I would like to think that my noble friend the Minister could give some greater thought to this.

As was said, it is to the Government’s credit how much they have moved on the Bill during its passage, particularly between Committee and Report. That was quite contrary to the sense that I think a lot of us felt during the early stages of our debates. It would be a shame if, once the Bill leaves the House, it is felt that the balance is not as fine—let me put it like that—as some people feel it needs to be. I just wanted to express some support and ask my noble friend the Minister to give this proper and serious consideration.

19:45
Baroness Kidron (CB)

My Lords, I rise briefly to note that, in the exchange between the noble Lords, Lord Allan and Lord Moylan, there was this idea about where you can complain. The independent complaints mechanism would be as advantageous to people who are concerned about freedom of speech as it would be for any other reason. I join and add my voice to other noble Lords who expressed their support for the noble Baroness, Lady Fox, on Amendment 162 about the Public Order Act.

Lord Clement-Jones (LD)

My Lords, we are dangerously on the same page this evening. I absolutely agree with the noble Baroness, Lady Kidron, about demonstrating the need for an independent complaints mechanism. The noble Baroness, Lady Stowell, captured quite a lot of the need to keep the freedom of expression aspect under close review, as we go through the Bill. The noble Baroness, Lady Fox, and the noble Lord, Lord Moylan, have raised an important and useful debate, and there are some crucial issues here. My noble friend captured it when he talked about the justifiable limitations and the context in which limitations are made. Some of the points made about the Public Order Act offences are extremely valuable.

I turn to one thing that surprised me. It was interesting that the noble Lord, Lord Moylan, quoted the Equality and Human Rights Commission, which said it had reservations about the protection of freedom of expression in the Bill. As we go through the Bill, it is easy to keep our eyes on the ground and not to look too closely at the overall impact. In its briefing, which is pretty comprehensive, paragraph 2.14 says:

“In a few cases, it may be clear that the content breaches the law. However, in most cases decisions about illegality will be complex and far from clear. Guidance from Ofcom could never sufficiently capture the full range or complexity of these offences to support service providers comprehensively in such judgements, which are quasi-judicial”.


I am rather more optimistic than that, but we need further assurance on how that will operate. Its life would probably be easier if we did not have the Public Order Act offences in Schedule 7.

I am interested to hear what the Minister says. I am sure that there are pressures on him, from his own Benches, to look again at these issues to see whether more can be done. The EHRC says:

“Our recommendation is to create a duty to protect freedom of expression to provide an effective counterbalance to the duties”.


The noble Lord, Lord Moylan, cited this. There is a lot of reference in the Bill but not to the Ofcom duties. So this could be a late contender to settle the horses, so to speak.

This is a difficult Bill; we all know that so much nuance is involved. We really hope that there is not too much difficulty in interpretation when it is put into practice through the codes. That kind of clarity is what we are trying to achieve, and, if the Minister can help to deliver that, he will deserve a monument.

Lord Stevenson of Balmacara (Lab)

It is always nice to be nice to the Minister.

I will reference, briefly, the introduction of the amendments in the name of the noble Baroness, Lady Fraser of Craigmaddie, which I signed. They were introduced extremely competently, as you would expect, by my noble and learned kinsman Lord Hope. It is important to get the right words in the right place in Bills such as this. He is absolutely right to point out the need to be sure that we are talking about the right thing when we say “freedom of expression”—that we do mean that and not “freedom of speech”; we should not get them mixed up—and, also, to have a consistent definition that can be referred to, because so much depends on it. Indeed, this group might have run better and more fluently if we had started with this amendment, which would have then led into the speeches from those who had the other amendments in the group.

The noble Baroness is not present today, not for bad news but for good news: her daughter is graduating and she wanted to be present at that; it is only right that she should do that. She will be back to pick up other aspects of the devolution issues she has been following very closely, and I will support her at that time.

The debate on freedom of expression was extremely interesting. It raised issues that, perhaps, could have featured more fully had this been timetabled differently, as both noble Lords who introduced amendments on this subject said. I will get my retaliation in first: a lot of what has been asked for will have been done. I am sure that the Minister will say that, if you look at the amendment to Clause 1, the requirement there is that freedom of expression is given priority in the overall approach to the Bill, and therefore, to a large extent, the requirement to replace that at various parts of the Bill may not be necessary. But I will leave him to expand on that; I am sure that he will.

Other than that, the tension I referred to in an earlier discussion, in relation to what we are made to believe about the internet and the social media companies, is that we are seeing a true public square, in which expressions and opinions can be exchanged as freely and openly as they would be in a public space in the real world. But, of course, neither of those places really exists, and no one can take the analogy further than has been done already.

The change, which was picked up by the noble Baroness, Lady Stowell, in relation to losing “legal but harmful”, has precipitated an issue which will be left to social media companies to organise and police—I should have put “policing” in quotation marks. As the noble Baroness, Lady Kidron, said, the remedy for much of this will be an appeals mechanism that works both at the company level and for the issues that need rebalancing in relation to complexity or because they are not being dealt with properly. We will not know that for a couple of years, but at least that has been provided for and we can look forward to it. I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay (Con)

My Lords, I hope that the noble Baroness, Lady Fox, and my noble friend Lord Moylan do feel that they have been listened to. It was striking, in this debate, that they had support from all corners of your Lordships’ House. I know that, at various points in Committee, they may have felt that they were in a minority, but they have been a very useful and welcome one. This debate shows that many of the arguments that they have made throughout the passage of the Bill have resonated with noble Lords from across the House.

Although I have not signed amendments in the names of the noble Baroness and my noble friend Lord Moylan, in many cases it is not because I disagree with them but because I think that what they do is already covered in the Bill. I hope to reassure them of that in what I say now.

Amendments 77 to 81 from the noble Baroness, Lady Fox, would require services to have particular regard to freedom of expression and privacy when deciding on their terms of service. Services will already need to have particular regard to users’ rights when deciding on safety systems to fulfil their duties. These requirements will be reflected in providers’ terms of service, as a result of providers’ duties to set out their safety measures in their terms of service. The framework will also include a range of measures to allow scrutiny of the formulation, clarity and implementation of category 1 providers’ own terms of service.

However, there are some points on which we disagree. For instance, we do not think that it would be appropriate for all providers to have a general duty to have particular regard to freedom of expression when deciding on their own terms of service about content. We believe that the Bill achieves the right balance. It requires providers to have regard to freedom of expression when carrying out their safety duties, and it enables public scrutiny of terms of service, while recognising providers’ own freedom of expression rights, as private entities, to set the terms of service that they want. It is of course up to adults to decide which services to use based on the way those services are drawn up and the way the terms of service set out what is permissible in them.

Nothing in the Bill restricts service providers’ ability to set their own terms and conditions for legal content accessed by adults—that is worth stressing. Ofcom will not set platforms’ terms and conditions, nor will it take decisions on whether individual pieces of content should, or should not, be on a platform. Rather, it will ensure that platforms set clear terms and conditions, so that adults know what to expect online, and ensure that platforms have systems and processes in place to enforce those terms and conditions themselves.

Amendment 226 from the noble Baroness, Lady Fox, would require providers to use all relevant information that is reasonably available to them whenever they make judgments about content under their terms of service; that is, where they have included or drafted those terms of service in compliance with duties in the Bill. Her amendment would amend an existing requirement in Clause 173, which already requires providers to take this approach whenever they implement a system or process to comply and that system makes judgments about certain content. For example, Clause 173 already covers content judgments made via systems and processes that a category 1 provider implements to fulfil its Clause 65 duties to enforce its own terms of service consistently. So we feel that Clause 173 is already broad enough to achieve the objectives that the noble Baroness, Lady Fox, seeks.

My noble friend Lord Moylan’s amendments seek to require Ofcom to have special regard to the importance of protecting freedom of expression when exercising its enforcement duties and when drafting codes or guidance. As we discussed in Committee, Ofcom has existing obligations to protect freedom of expression, and the Bill will include additional measures in this regard. We are also making additional amendments to underline the importance of freedom of expression. I am grateful to the noble and learned Lord, Lord Hope of Craighead, and my noble friend Lady Fraser of Craigmaddie for their work to define “freedom of expression” in the Bill. The Bill’s new overarching statement at Clause 1, as the noble Lord, Lord Stevenson, rightly pointed out, lists “freedom of expression”, signalling that it is a fundamental part of the Bill. That is a helpful addition.

Amendment 188 in the name of the noble Baroness, Lady Fox, seeks to disapply platforms’ Clause 65 duties when platforms’ terms of service restrict lawful expression, or expression otherwise protected by Article 10 of the European Convention on Human Rights. Her amendment would mean that category 1 providers’ Clause 65 duties to enforce clear, accessible terms of service in a consistent manner would not apply to any of their terms of service, where they are making their own decisions restricting legal content. That would greatly undermine the application of these provisions in the Bill.

Article 10 of the European Convention on Human Rights concerns individuals’ and entities’ rights to receive and impart ideas without undue interference by public authorities, not private entities. As such, it is not clear how a service provider deciding not to allow a certain type of content on its platform would engage the Article 10 rights of a user.

Beyond the legal obligations regarding the treatment of certain kinds of user-generated content imposed by this Bill and by other legislation, platforms are free to decide what content they wish, or do not wish, to have on their services. Provisions in the Bill will set out important duties to ensure that providers’ contractual terms on such matters are clear, accessible and consistently enforced.

20:00
Moreover, as we have discussed before, Ofcom is bound by the Human Rights Act 1998. So, when carrying out all its functions under this Bill, including the preparation of guidance and codes, it will need to ensure that freedom of expression is protected. There is already a range of other measures in the Bill which ensure that Ofcom protects freedom of expression; for instance, it has a duty in Clause 143 to set out the steps it has taken, and the processes it operates, to ensure that its online safety functions have been exercised compatibly with Articles 8 and 10 of the European Convention on Human Rights. As such, my noble friend Lord Moylan’s amendments would be largely duplicative, since Ofcom already has an obligation to set out similar information in an annual statement.
The illegal content duties, in relation to the points raised about Section 5 of the Public Order Act in Schedule 7, remain risk-based and proportionate. Platforms must use proportionate systems and processes designed to prevent users encountering illegal content and to minimise the length of time that any priority illegal content is present on the service. We are not requiring platforms to ensure that users never encounter illegal content. Companies could take proportionate measures, such as user reporting, user empowerment and enforcing policies which prohibit threats or abuse, but the Bill also creates strong safeguards to protect freedom of expression. All services will need to have particular regard to freedom of expression when implementing safety duties. I certainly agree with the noble Lord, Lord Allan of Hallam, when he says that good and clear guidance is vital here. That is why we have put in place a requirement through Clause 174 for Ofcom to produce guidance about how to make judgments about illegal content.
Amendment 162 from the noble Baroness, Lady Fox, seeks to remove offences under Section 5 of the Public Order Act 1986 from the priority offences list. Section 5 of the Public Order Act makes it an offence to use
“threatening or abusive words or behaviour, or disorderly behaviour”
or to display any
“visible representation which is threatening or abusive”.
Given that that activity can cause harm, it is right that companies have duties to tackle it and, subject to the guidance that I have just mentioned, we think that the Bill sets that out appropriately.
The noble Baroness’s Amendment 275 would require Ofcom to ensure that content classification frameworks created by the British Board of Film Classification, which act as a reference for providers’ online safety duties, do not undermine the Bill’s safeguards for freedom of expression. If a content classification scheme produced by the BBFC is unsuitable to be used as a reference for whether content falls within the scope of providers’ new online safety duties, Ofcom should not recommend it in its codes of practice. Ofcom has specific duties in the Bill to protect freedom of expression when drafting its codes of practice, which will ensure that any measures it recommends are designed in that light. However, I will take the point and the case study she raised back to the department to see whether I can find out any further detail about what went on in that instance.
Amendments 286 and 294 would insert a definition of “freedom of expression” into the Bill. As I mentioned, I am grateful to the noble and learned Lord, Lord Hope, and my noble friend Lady Fraser for proposing these amendments, which align the definition of freedom of expression in the Bill with that in the European Convention on Human Rights. We agree with them that it will increase clarity about freedom of expression in the Bill, which is why I have added my name to their amendments and, when we come to the very end of Report—to which I look forward as well—I will be very glad to support them.
Lord Moylan (Con)

My Lords, before my noble friend sits down, perhaps I could seek a point of clarification. I think I heard him say, at the beginning of his response to this short debate, that providers will be required to have terms of service which respect users’ rights. May I ask him a very straightforward question: do those rights include the rights conferred by Article 10 of the European Convention on Human Rights? Put another way, is it possible for a provider operating in the United Kingdom to have terms and conditions that abridge the rights conferred by Article 10? If it is possible, what is the Government’s defence of that? If it is not possible, what is the mechanism by which the Bill achieves that?

Lord Parkinson of Whitley Bay (Con)

As I set out, I think my noble friend and the noble Baroness, Lady Fox, are not right to point to the European Convention on Human Rights here. That concerns individuals’ and entities’ rights

“to receive and impart ideas without undue interference”

by public authorities, not private entities. We do not see how a service provider deciding not to allow certain types of content on its platform would engage the Article 10 rights of the user, but I would be very happy to discuss this further with my noble friend and the noble Baroness in case we are talking at cross-purposes.

Lord Allan of Hallam (LD)

On that point specifically, having worked inside one of these companies, I can say that they fear legal action under all sorts of laws, but not under the European Convention on Human Rights. As the Minister explained, it is for public bodies; if people are going to take a case on Article 10 grounds, they will be taking it against a public body. There are lots of other grounds on which to go after a private company, but not ECHR compliance.

Baroness Fox of Buckley (Non-Afl)

My Lords, I genuinely appreciate this debate. The noble Lord, Lord Clement-Jones, made what I thought was a very important point: in going through the weeds of the Bill—and some people have been involved in it for many years, looking at the detail—it can be easy to forget the free speech point. It is important that it has been raised, but it also constantly needs to be raised. That is the point: this is, as the noble Lord, Lord Allan of Hallam, admitted, a speech-restricting Bill, and we are working out the balance.

I apologise to the noble and learned Lord, Lord Hope of Craighead, for not acknowledging that he has constantly emphasised the distinction between free speech and free expression. He and I will not agree on this; it is that we do not have time for the argument now, rather than that I do not understand it. But he has been diligent and persistent in at least raising the issues, and that is important.

I was a bit surprised by the Minister’s response because, for the first time ever, since I have been here, there has been some enthusiasm across the House for one of my amendments—it really is unprecedented—Amendment 162 on the public order offences. I thought that the Minister might have noted that, because he has noted it every other time there has been a consensus across the House. I think he ought to look again at Amendment 162.

To indicate the muddle one gets into in terms of public order offences and illegality, the police force in Cheshire, where I am from, has put out a film online today saying that misgendering is a crime. That is the police who have said that. It is not a crime, and the difficulty we are concerned with is that people are being asked to remove and censor material, on grounds of illegality or public order offences, that they should not be removing. That is my concern: censorship.

To conclude, I absolutely agree with the noble Lord, Lord Allan of Hallam, that of course free speech does not mean saying whatever you want wherever you want. That is not free speech, and I am a free speech absolutist. Even subreddits—if people know what they are—think they are policing each other’s speech. There are norms that are set in place. That is fine with me—that multitude.

My concern is that a state body such as Ofcom is going to set norms of what is acceptable free speech that are lower than free speech laws, by demanding, on pain of breach of the law, with fines and so on, that these private companies impose their own terms of service. That can set a norm, leading them to be risk-averse and to set levels of acceptable speech that are very dangerous. For example, when you go into work, you cannot just say anything, but there are people such as Maya Forstater, who said something at work, was disciplined and lost her job, and has just won more than £100,000, because she was expressing her views and opinions. The Equality Act ran to her aid and she has been shown to be right. You cannot do that if your words have disappeared and are censored.

I could talk about this for a long time, as noble Lords know. I hope that at least, as the Bill progresses, even when it becomes an Act, the Government could just stamp on its head, “Don’t forget free speech”—but before then, as we end this process, they could come back with some concessions to some of the amendments that have been raised here today. That would be more than just words. I beg leave to withdraw the amendment.

Amendment 77 withdrawn.
Amendments 78 to 81 not moved.
Clause 19: Record-keeping and review duties
Amendments 82 and 83
Moved by
82: Clause 19, page 23, line 30, at end insert—
“(8A) A duty to make and keep a written record, in an easily understandable form, of all aspects of every assessment under section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2)), including details about how the assessment was carried out and its findings.”
Member’s explanatory statement
This amendment requires providers of Category 1 services to keep full records of their assessments under the new Clause proposed after Clause 11 in my name.
83: Clause 19, page 23, line 31, leave out “a risk assessment as required by subsection (2)” and insert “an assessment as required by subsection (2) or (8A)”
Member’s explanatory statement
This amendment requires providers of Category 1 services to supply OFCOM with copies of records of their assessments under the new Clause proposed after Clause 11 in my name.
Amendments 82 and 83 agreed.
Consideration on Report adjourned until not before 8.42 pm.

Online Safety Bill

Report (2nd Day) (Continued)
21:01
Amendment 84
Moved by
84: Clause 19, page 24, line 4, at end insert “, and (Disclosure of information about use of service by deceased child users) (deceased child users).”
Member’s explanatory statement
This amendment has the effect that OFCOM have a duty to review compliance by user-to-user service providers with the new duties imposed by the Clause proposed after Clause 67 in my name.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, as I set out in Committee, the Government are bringing forward a package of amendments to address the challenges that bereaved parents and coroners have faced when seeking to access data after the death of a child.

These amendments have been developed after consultation with those who, so sadly, have first-hand experience of these challenges. I thank in particular the families of Breck Bednar, Sophie Parkinson, Molly Russell, Olly Stephens and Frankie Thomas for raising awareness of the challenges they have faced when seeking access to information following the heartbreaking cases involving their children. I am also grateful to the noble Baroness, Lady Kidron, for championing this issue in Parliament and more widely. I am very happy to say that she is supporting the government amendments in this group.

The loss of any life is heartbreaking, but especially so when it involves a child. These amendments will create a more straightforward and humane process for accessing data and will help to ensure that parents and coroners receive the answers they need in cases where a child’s death may be related to online harms. We know that coroners have faced challenges in accessing relevant data from online service providers, including information about a specific child’s online activity, where that might be relevant to an investigation or inquest. It is important that coroners can access such information.

As such, I turn first to Amendments 246, 247, 249, 250, 282, 283 and 287, which give Ofcom an express power to require information from regulated services about a deceased child’s online activity following a request from a coroner. This includes the content the child had viewed or with which he or she had engaged, how the content came to be encountered by the child, the role that algorithms and other functionalities played, and the method of interaction. It also covers any content that the child generated, uploaded or shared on the service.

Crucially, this power is backed up by Ofcom’s existing enforcement powers, so that, where a company refuses to provide information requested by Ofcom, companies may be subject to enforcement action, including senior management liability. To ensure that there are no barriers to Ofcom sharing information with coroners, first, Amendment 254 enables Ofcom to share information with a coroner without the prior consent of a business to disclose such information. This will ensure that Ofcom is free to provide information it collects under its existing online safety functions to coroners, as well as information requested specifically on behalf of a coroner, where that might be useful in determining whether social media played a part in a child’s death.

Secondly, coroners must have access to online safety expertise, given the technical and fast-moving nature of the industry. As such, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest, following a request from a coroner. This may include, for example, information about a company’s systems and processes, including how algorithms have promoted specific content to a child. To this end, the Chief Coroner’s office will consider issuing non-statutory guidance and training for coroners about social media as appropriate, subject to the prioritisation of resources. We are confident that this well-established framework provides an effective means to provide coroners with training on online safety issues.

It is also important that we address the lack of transparency from large social media services about their approach to data disclosure. Currently, there is no common approach to this issue, with some services offering memorialisation or contact-nomination processes, while others seemingly lack any formal policy. To tackle this, a number of amendments in this group will require the largest services—category 1, 2A and 2B services—to set out policies relating to the disclosure of data regarding the online activities of a deceased child in a clear, accessible and sufficiently detailed format in their terms of service. These companies will also be required to provide a written response to data requests in a timely manner and must provide a dedicated helpline, or similar means, for parents to communicate with the company, in order to streamline the process. This will address the painful radio silence experienced by many bereaved parents. The companies must also offer options so that parents can complain when they consider that a platform is not meeting its obligations. These must be easy to access, easy to use and transparent.

The package of amendments will apply not only to coroners in England and Wales but also to coroners in Northern Ireland and to equivalent investigations in Scotland, where similar sad events have occurred.

The Government will also address other barriers which are beyond the scope of this Bill. For example, we will explore measures to introduce data rights for bereaved parents who wish to request information about their deceased children through the Data Protection and Digital Information Bill. We are also working, as I said in Committee, with our American counterparts to clarify and, where necessary, address unintended barriers to information sharing created by the United States Stored Communications Act. I beg to move.

Baroness Kidron (CB)

My Lords, I thank the Minister and indeed the Secretary of State for bringing forward these amendments in the fulsome manner that they have. I appreciate it, but I know that Bereaved Families for Online Safety also appreciates it. The Government committed to bringing forward these amendments on the last day in Committee, so they have been pre-emptively welcomed and discussed at some length. One need only read through Hansard of 22 June to understand the strength of feeling about the pain that has been caused to families and the urgent need to prevent others experiencing the horror faced by families already dealing with the loss of their child.

I will speak briefly on three matters only. First, I must once again thank bereaved families and colleagues in this House and in the other place for their tireless work in pressing this issue. This is one of those issues that does not allow for celebration. As I walked from the Chamber on 22 June, I asked one of the parents how they felt. They said: “It is too late for me”. It was not said in bitterness but in acknowledgement of their profound hurt and the failure of companies voluntarily to do what is obvious, moral and humane. I ask the Government to see the sense in the other amendments that noble Lords brought forward on Report to make children safer, and to offer the same pragmatic, thoughtful solution to those as they have to this group of amendments. It makes a huge difference.

Secondly, I need to highlight just one gap; I have written to the Secretary of State and the Minister on this. I find it disappointing that the Government did not find a way to require senior management to attend an inquest to give evidence. Given that the Government have agreed that senior managers should be subject to criminal liability under some circumstances, I do not understand their objections to summoning them to co-operate with legal proceedings. If a company submits information in response to Ofcom and at the coroner’s request the company’s senior management is invited to attend the inquest, it makes sense that someone should be required to appear to answer and follow up those questions. Again, on behalf of the bereaved families and specifically their legal representatives, who are very clear on the importance of this part of the regime, I ask the Government to reconsider this point and ask the Minister to undertake to speak to the department and the MoJ, if necessary, to make sure that, if senior managers are asked to attend court, they are mandated to do so.

Thirdly, I will touch on the additional commitments the Minister made beyond the Bill, the first of which is the upcoming Data Protection and Digital Information Bill. I am glad to report that some of the officials working on the Bill have already reached out, so I am grateful to the Minister that this is in train, but I expect it to include guidance for companies that will, at a minimum, cover data preservation orders and guidance about the privacy of other users in cases where a child has died. I think that privacy for other users is central to this being a good outcome for everybody, and I hope we are able to include that.

I am pleased to hear about the undertaking with the US regarding potential barriers, and I believe—and I would love to hear from the Minister—that the objective is to make a bilateral agreement that would allow data to be shared between the two countries in the case of a child’s death. It is a very specific requirement, not a wide-ranging one. I believe that, if we can do it on a bilateral basis, it would be easier than a broad attempt to change the data storage Act.

I turn finally to training for coroners. I was delighted that the Chief Coroner made a commitment to consider issuing non-legislative guidance and training on social media for coroners and the offer of consultation with experts, including Ofcom, the ICO and bereaved families and their representatives, but this commitment was made subject to funding. I ask the Minister to agree to discuss routes to funding from the levy via Ofcom’s digital literacy duty. I have proposed an amendment to the government amendment that would make that happen, but I would welcome the opportunity to discuss it with the Minister. Coroners must feel confident in their understanding of the digital world, and I am concerned that giving this new route to regulated companies via Ofcom without giving them training on how to use it may create a spectre of failure or further frustration and distress for bereaved families. I know there is not a person in the House who would want that to be the outcome of these welcome government amendments.

Lord Allan of Hallam (LD)

My Lords, I also welcome this group of amendments. I remember a debate led by the noble Baroness, Lady Kidron, some time ago in the Moses Room, where we discussed this, and I said at the time I thought it would get fixed in the Online Safety Bill. I said that in a spirit of hope, not knowing any of the detail, and it is really satisfying to see the detail here today. As she said, it is testimony to the families, many of whom got in touch with me at that time, who have persisted in working to find a solution for other families—as the noble Baroness said, it is too late for them, but it will make a real difference to other families—and it is so impressive that, at a time of extreme grief and justifiable anger, people have been able to channel that into seeking these improvements.

The key in the amendments, which will make that difference, is that there will be a legal order to which the platforms know they have to respond. The mechanism that has been selected—the information notice—is excellent because it will become well known to every one of the 25,000 or so platforms that operate in the United Kingdom. When they get an information notice from Ofcom, that is not something that they will have discretion over; they will need to comply with it. That will make a huge difference.

21:15
The noble Baroness made an important point on this around privacy. Importantly, the platforms will be handing the data over to a public authority in the United Kingdom. So it will go to Ofcom and then, through Ofcom, to a coroner’s court. Again, legal order and public authority are quite critical and we have a mechanism that deals with that. If we want to test the amendments, we can look at the practical effect they will have in light of what the barriers have been to date. The companies have been stubborn in taking an extreme position on non-disclosure, to a point that seems completely irrational to anyone outside.
There have been three barriers. I will outline them briefly so that we understand how the amendments tackle them. The first barrier is legal concerns. This is not just the “I can’t give you the data because of data protection” type of limp excuse that we are all used to getting these days from different authorities. There are serious lawyers with genuine concerns that disclosure of data would lead a business into a risky area; businesses will tend to be risk-averse on data disclosures that they perceive to be optional or voluntary as opposed to those that they perceive to be mandatory. People may feel an ethical or moral compulsion to disclose but, often, that has not been sufficient and legal concerns have been raised.
The second barrier concerns real fears about what may happen with the data. I come back to the point raised by the noble Baroness, Lady Kidron. I know that people have been through this stuff and they want to see what has been going on. Much of the content on social media sites is hard to fully anonymise without it failing to fulfil the purpose of helping people to understand what was going on. There is a difficult line to tread here. I know that people who work on this will sometimes find that the content they are being asked to disclose feels very sensitive. Their overwhelming fear is that, by disclosing that data, they will create a knock-on effect where, because of the disclosure, other individuals will become distressed and may even harm themselves. Again, that is a genuine fear.
The third barrier is embarrassment or shame at what has been going on and the platform not wanting to give transparency. I will not shy away from that; it is of course there as a strong motivation. People are sitting there defending their organisation and thinking, “Oh my goodness, we can’t disclose this”. So all three of these barriers exist at once. There are genuine legal fears, genuine fears about what may happen down the track—which is unknown—and then this corporate defensiveness which says, “Let’s not disclose”.
If we look at the orders here and the mechanisms that have been proposed, the legal barriers are overcome, at least in relation to the basic compulsion to disclose. There are some interesting issues about the Stored Communications Act, and when you look at the history of the Cloud Act, which was an attempt to reform the Stored Communications Act, you see that it is all quite messy when you are dealing with disclosing personal data from the United States to Europe. It becomes particularly problematic if the data is about Americans; of course, if you are using a social media platform, it may be that some of the content that people or families want to see is associated with an American. So it is gratifying to hear from the Minister that the Government are going to look at that.
Essentially, at least for much of the data, we will now have a straightforward legal mechanism that works. It cannot altogether fix this question of what happens with the data downstream. For that we have to trust Ofcom, working with the Information Commissioner’s Office and the coroners, to do the right thing and, when they get the data, to look after it. Then, if a platform is handing data over to agents of the British state with that kind of authority, it will feel that it can trust them. Let us hope that nothing happens and the trust will be there to move the data over and rely on their professionalism. The third issue of corporate embarrassment then becomes irrelevant, because they have no choice; that is then resolved and no longer a sufficient barrier to disclosure.
I have a few questions on the specifics of this. The first is on Amendment 190, on platforms declaring in their terms how they will handle the data of deceased children. Again, I lived through this, and most of the stuff to do with memorialisation and bereavement was entirely ad hoc. Platforms were built in an incredibly optimistic fantasy California world where no one died, so they did not think about that. But then a friend of someone did, and they got in touch and said, “What should we do?” and they replied, “Oh, we’ll come up with something”. That is literally the origin story of a lot of these memorialisation policies and things like that. Someone then said, “Well, that doesn’t work very well”, and someone else asked, “Who constitutes a family member who can make the request?” It all happened in this very ad hoc way, so Amendment 190 is welcome in making sure that that is more consistent.
It would be interesting if the Minister has any thoughts on how that extends to other people. Clearly, we are focused on deceased children, but some of the same considerations, certainly around transparency, apply to any deceased family member, whatever their age. Having a dedicated hotline specifically for children is right, but there is something interesting in Amendment 190 about filling the gaps around disclosure, transparency and memorialisation more generally, and making that consistent.
The second question is on Amendment 273, on reports into a death. My understanding is that this relates to a death of anyone of any age and that it is not limited to children specifically. Again, it would be interesting to hear more about how the Government see that working because, as I understand it, the volume through that channel could be much greater if we are talking about any death at any age, which is how I read Amendment 273, unless I have misunderstood it.
I have another question, on Amendment 249, which is on information notices specifically about child deaths. I do not want to broaden this out, but we need to flag that we will need some clarity around what assistance can be given to people where the death is of someone who is not a child. There will be situations that are important to families and where everyone has a huge amount of sympathy but where we are not dealing with a child. Again, it is right that we have this specific set of measures around deceased children, but we should expect that Ofcom will be asked, “What about other circumstances?” We need a reasonable answer to that: that other things are in place. I hope that the answer will be that, if it is a serious enough case, without the information notice powers Ofcom could still, under Amendment 273 as I read it, look into other deaths that involve adults, as well as the specific powers it has in relation to children. I would appreciate clarification from the Minister.
Baroness Harding of Winscombe (Con)

My Lords, given the hour, I will be brief. I wanted to thank my noble friend the Minister and the Secretary of State, and to congratulate my friend the noble Baroness, Lady Kidron, on such an important group. It is late at night and not many of us are left in the Chamber, but this is an important thing that they have succeeded in doing together, and it is important that we mark that. It is also a hugely important thing that the bereaved families for justice have achieved, and I hope that they have achieved a modicum of calm from having made such a big difference for future families.

I will make one substantive point, referencing where my noble friend the Minister talked about future Bills. In this House and in this generation, we are building the legal scaffolding for a digital world that already exists. The noble Lord, Lord Allan of Hallam, referenced the fact that much of this was built without much thought—not maliciously but just without thinking about the real world, life and death. In Committee, I was taken by the noble Lord, Lord Knight, mentioning the intriguing possibility of using the Data Protection and Digital Information Bill to discuss data rights and to go beyond the dreadful circumstances that these amendments cover to make the passing on of your digital assets something that is a normal part of our life and death. So I feel that this is the beginning of a series of discussions, not the end.

I hope that my noble friend the Minister and whichever of his and my colleagues picks up the brief for the forthcoming Bill can take to heart how we have developed all this together. I know that today has perhaps not been our most wholly collaborative day, but, in general, I think we all feel that the Bill is so much the better for the collaborative nature that we have all brought to it, and on no more important a topic than this amendment.

Lord Clement-Jones (LD)

My Lords, I will be extremely brief. We have come a very long way since the Joint Committee made its recommendations to the Government, largely, I think, as a result of the noble Baroness, Lady Kidron. I keep mistakenly calling her “Baroness Beeban”; familiarity breeds formality, or something.

I thank the Minister and the Secretary of State for what they have done, and the bereaved families for having identified these issues. My noble friend Lord Allan rightly identified the sentiments as grief and anger at what has transpired. All we can do is try to do, in a small way, what we can to redress the harm that has already been done. I was really interested in his insights into how a platform will respond and how this will help them through the process of legal order and data protection issues with a public authority.

My main question to the Minister is in that context—the relationship with the Information Commissioner’s Office—because there are issues here. There is, if you like, an overlap of jurisdiction with the ICO, because the potential or actual disclosure of personal data is involved, and therefore there will necessarily have to be co-operation between the ICO and Ofcom to ensure the most effective regulatory response. I do not know whether that has emerged on the Minister’s radar, but it certainly has emerged on the ICO’s radar. Indeed, in the ideal world, there probably should be some sort of consultation requirement on Ofcom to co-operate with the Information Commissioner in these circumstances. Anything that the Minister can say on that would be very helpful.

Again, this is all about reassurance. We must make sure that we have absolutely nailed down all the data protection issues involved in the very creative way the Government have responded to the requests of the bereaved families so notably championed by the noble Baroness, Lady Kidron.

Lord Knight of Weymouth (Lab)

My Lords, first, I associate myself with the excellent way in which the noble Baroness, Lady Harding, paid tribute to the work of the noble Baroness, Lady Kidron, on behalf of Bereaved Families for Online Safety, and with the comments she made about the Minister and the Secretary of State in getting us to this point, which were echoed by others.

I have attached my name, on behalf of the Opposition, to these amendments on the basis that if they are good enough for the noble Baroness, Lady Kidron, they ought to be good enough for me. We should now get on with implementing them. I also hope to learn that the Minister has been liaising with the noble Baroness, Lady Newlove, to ensure that the amendments relating to coroners’ services, and the equivalent procurator fiscal service in Scotland, will satisfy her sense of what will work for victims. I am interested, also, in the answer to the question raised by the noble Baroness, Lady Kidron, regarding a requirement for senior managers to attend inquests. I liked what she had to say about the training for coroners being seen as media literacy and therefore fundable from the levy.

All that remains is for me to ask three quick questions to get the Minister’s position clear regarding the interpretation of the new Chapter 3A, “Deceased Child Users”. First, the chapter is clear that terms of service must clearly and easily set out the policy for dealing with the parents of a deceased child, and must provide a dedicated helpline and a complaints procedure. In subsection (2), does a helpline or similar—the “similar” being particularly important—mean that the provider must offer an accessible, responsive and interactive service? Does that need to be staffed by a human? I think it would be helpful for the Minister to confirm that it is his intention that it should be, so that parents are not fobbed off with solely an automated bot-type service.

21:30
Secondly, the requirement to provide a complaints service is clear. The duties on Ofcom in this group are also clear enough. Can he make sure that he has summarised on the record the consequences for a provider if it fails in its duties and, in particular, if the platform’s complaints service is insufficient?
Finally, in the circumstance of a complaints service failing a parent, what should they then do? Do they have direct recourse to Ofcom? Will the regulator need to offer individual parents a channel to report problems if they have satisfied all the provider’s own processes, as set out in these clauses?
Again, I repeat my thanks to all across the House who have worked so hard to get substantial progress on this key issue.
Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful for the recognition of the work that has been done here, led by the noble Baroness, Lady Kidron, but involving many others, including officials who have worked to bring this package forward.

Noble Lords took the opportunity to ask a number of questions. The noble Baroness, Lady Kidron, asked about senior management liability. Ofcom will have extensive enforcement powers at its disposal if service providers do not comply with its information requests issued on behalf of a coroner. The powers will include the ability to hold senior managers criminally liable for non-compliance. Those powers are in line with Ofcom’s existing information-gathering powers in the Bill. Where Ofcom has issued an information request to a company, that company may be required to name a senior manager who is responsible for ensuring compliance with the requirements of the notice. If the named senior manager is found to have failed to comply with that information notice, or has failed to take all reasonable steps to prevent a failure to comply with the notice, that individual will be held personally liable and could be subject to imprisonment.

On the point about them not appearing in court, coroners have well-established powers to require senior managers to attend court. The enforcement powers available to Ofcom are in line with Ofcom’s existing information-gathering powers in the Bill. They do not extend to Ofcom requiring senior managers to appear in court as part of a coronial investigation. We do not think that would be appropriate for Ofcom, given that the coroner’s existing remit already covers this. The noble Baroness raised many specific instances that had come to her attention, and if she has specific examples of people not attending court that she would like to share with us and the Ministry of Justice, of course we would gladly follow those up.

The noble Lord, Lord Knight, rightly mentioned my noble friend Lady Newlove. I can reassure him that I have discussed this package of amendments with her, and had the benefit of her experience as a former Victims’ Commissioner.

On the training for coroners, which is an issue she raised, as did the noble Baroness, Lady Kidron, in her remarks just now, the Chief Coroner for England and Wales has statutory responsibility for maintaining appropriate arrangements for the training of coroners. That is of course independent of government, and exercised through the Judicial College, but the training is mandatory and the Chief Coroner is aware of the issues we are debating now.

The noble Lords, Lord Allan of Hallam and Lord Knight of Weymouth, raised the helpline for parents. Yes, we expect that our approach of requiring a dedicated helpline or similar means will involve a human. As we say, we want a more humane process for those who need to use it; we think it will be more effective than requiring a company to provide a named individual contact. We touched on this briefly in Committee, where the point was understandably raised about staff turnover or people being absent on leave—that a requirement for a named individual could hinder the contact that families need.

The noble Lord, Lord Allan, also asked some questions about deaths of people other than a child. First, Ofcom’s report in connection with investigations into a death covers any coronial inquest, not just children. More broadly, of course, social media companies may have their own terms and conditions or policies in place setting out when they will share information after somebody has passed away. Companies based outside the UK may have to follow the laws of the jurisdiction in which they are based, which may limit the sharing of data without a court order. While we recognise the difficulty that refusing to disclose data may cause for bereaved relatives in other circumstances, the right to access must, of course, be balanced with the right to privacy. Some adult social media users may be concerned, for instance, about the thought of family members having access to information about their private life after their deaths, so there is a complexity here, as I know the noble Lord understands.

The noble Baroness, Lady Kidron, asked about data preservation orders. I am very glad that officials from another Bill team are already in touch with her, as they should be. As we set out in Committee, we are aware of the importance of data preservation to coroners and bereaved parents, and the Government agree with the principle of ensuring that those data are preserved. We will work towards a solution through the Data Protection and Digital Information Bill. My noble friend Lord Camrose—who is unable to be with us today, also for graduation reasons—and I will be happy to keep the House and all interested parties updated about our progress in resolving the issue of data preservation as we work through this complex problem.

The noble Lord, Lord Clement-Jones, asked about the Information Commissioner’s Office. We expect Ofcom to consult the ICO on all the guidance where its expertise will be relevant, including on providers’ new duties under these amendments. I am grateful, as I say, for the support that they have had and the recognition that this has been a long process since these issues were first raised in the pre-legislative committee. We believe that it is of the utmost importance that coroners and families can access information about a child’s internet use following a bereavement, and that companies’ responses are made in a humane and transparent way.

This group of amendments should be seen alongside the wider protections for children in the Bill, and I hope they will help bereaved parents to get the closure that they deserve. The noble Lord, Lord Allan, was right to pay tribute to how these parents, who have campaigned so bravely, have turned their grief and frustration into a determination to make sure that no other parents go through the sorts of ordeals that they have. That is both humbling and inspiring, and I am glad that the Bill can help to be a part of the change that they are seeking. I share my noble friend Lady Harding’s wish that it may bring them a modicum of calm. I beg to move.

Amendment 84 agreed.
Clause 25: Safety duties protecting children
The Deputy Speaker (Lord McNicol of West Kilbride) (Lab)

Amendment 85 is consequential to Amendment 35, which was previously agreed.

Amendment 85

Moved by
85: Clause 25, page 28, line 33, at end insert—
“(c) mitigate the impact of harm to children in different age groups presented by search functions that expose children to features, functionalities or behaviours that are harmful to children.”
Member’s explanatory statement
This amendment ensures that Search services’ duty to protect children from harm includes the ways in which the design and operation of services may create harm separately and additionally to harm relating to the dissemination or encountering harmful content.
Amendment 85 agreed.
Amendment 86
Moved by
86: Clause 25, page 29, line 28, leave out “this section” and insert “section 25”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 25 into two Clauses.
Amendment 86 agreed.
Amendment 87 not moved.
Amendments 88 and 89
Moved by
88: Clause 25, page 29, line 34, leave out “this section” and insert “section 25”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 25 into two Clauses.
89: Clause 25, page 29, line 38, leave out “subsection (3)(b)” and insert “section 25(3)(b)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 25 into two Clauses.
Amendments 88 and 89 agreed.
Amendment 90 not moved.
Amendments 91 to 96
Moved by
91: Clause 25, page 29, line 42, leave out “subsection (3)” and insert “section 25(3)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 25 into two Clauses.
92: Clause 25, page 30, line 1, leave out “this section” and insert “section 25”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 25 into two Clauses.
93: Clause 25, page 30, line 4, leave out from “if” to “the” in line 6 and insert “age verification or age estimation is used on the service with”
Member’s explanatory statement
This amendment provides that a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it.
94: Clause 25, page 30, line 8, after “In” insert “section 25 and”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 25 into two Clauses.
95: Clause 25, page 30, line 10, leave out “this section” and insert “section 25”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 25 into two Clauses.
96: Clause 25, divide Clause 25 into two clauses, the first (Safety duties protecting children) to consist of subsections (1) to (9) and the second (Safety duties protecting children: interpretation) to consist of subsections (10) to (17)
Member’s explanatory statement
This amendment splits up Clause 25 into two Clauses.
Amendments 91 to 96 agreed.
Clause 27: Duties about complaints procedures
Amendment 97
Moved by
97: Clause 27, page 32, line 2, leave out “25(3)” and insert “25(2) or (3)”
Member’s explanatory statement
This amendment is about complaints of content being blocked because of an incorrect assessment of a user’s age. A reference to Clause 25(2) is inserted, as the duty in that provision can also be complied with by using age verification or age estimation.
Amendment 97 agreed.
Clause 29: Record-keeping and review duties
Amendment 98
Moved by
98: Clause 29, page 33, line 41, at end insert “,
and for the purposes of subsection (6), also includes the duties set out in section (Disclosure of information about use of service by deceased child users) (deceased child users).”
Member’s explanatory statement
This amendment has the effect that OFCOM have a duty to review compliance by search service providers with the new duties imposed by the Clause proposed after Clause 67 in my name.
Amendment 98 agreed.
Clause 30: Children’s access assessments
Amendment 99
Moved by
99: Clause 30, page 34, line 12, leave out from “if” to “the” in line 13 and insert “age verification or age estimation is used on the service with”
Member’s explanatory statement
This amendment provides that a provider can only conclude that children cannot access a service if age verification or age estimation is used on the service with the result that children are not normally able to access it.
Amendment 99 agreed.
Amendment 100
Moved by
100: Clause 30, page 34, line 23, after “significant” insert “in itself or”
Member’s explanatory statement
This amendment aligns the definition of “significant” with the ICO’s Age Appropriate Design Code and draft guidance to ensure regulatory alignment and to ensure the protection of the greatest number of children.
Baroness Kidron (CB)

My Lords, I apologise for speaking once more today. I shall introduce Amendments 100 and 101 on the child user condition. They are very technical in nature and simply align the definition of “significant” in the Bill with the ICO’s age-appropriate design code to ensure regulatory alignment and to ensure the protection of the greatest number of children.

The Minister has stated on the record that the child-user condition is the same as the age-appropriate design code; however, in Clause 30(3) of the Bill, a service is “likely to be accessed” by children if

“(a) there is a significant number of children who are users of the service or of that part of it, or (b) the service, or that part of it, is of a kind likely to attract a significant number of users who are children”.

At Clause 30(4),

“the reference to a ‘significant’ number includes a reference to a number which is significant in proportion to the total number of United Kingdom users of a service or … part of a service”.

That is a key issue: “in proportion”. By contrast, the ICO’s age-appropriate design code states that a service is “likely to be accessed” if

“children form a substantive and identifiable user group”.

That is quite a different threshold.

In addition, the ICO’s draft guidance on “likely to be accessed” sets out a list of factors that should be taken into consideration when making this assessment. These factors are far more extensive than Clause 30(4) and specifically state:

“‘Significant’ in this context does not mean that a large number of children must be using the service or that children form a substantial proportion of your users. It means that there are more than a de minimis or insignificant number of children using the service”.


In other words, it is possibly quite a small group, or a stand-alone group, that is not in proportion to the users. I will stop here to make the point that sometimes users are in their millions or tens of millions, so a small proportion could be many hundreds of thousands of children—just to be really clear that this matters and I am not quite dancing on the head of a pin here.

Amendment 101 mirrors the ICO’s draft guidance on age assurance on this point. I really struggle to see, if the intention of the Government is that these two things align, why this would not be just a technical amendment that they can just say yes to and we can move on.

I finish by reminding the House that the legal opinion of my noble and learned friend Lord Neuberger, the former head of the Supreme Court, which I shared with the Government, highlights the importance of regulatory alignment, clarity and consistency, particularly in new areas of law where concepts such as “likely to be accessed” are becoming a phrase that is in more than one Act.

My noble and learned friend states:

“As the Minister rightly says, simplicity and clarity are desirable in a statute, and it serves both simplicity and clarity if the same expression is used in the two statutes, and it is made clear that the same meaning is intended … The currently drafted reference in the Bill to ‘a significant number of children’ appears to me to be something of a recipe for uncertainty, especially when compared with the drafting of section 123 of the DPA”.


With that, I beg to move.

21:45
Lord Clement-Jones (LD)

My Lords, very briefly, I commend these two amendments. Again, the provenance is very clear; the Joint Committee said:

“This regulatory alignment would simplify compliance for businesses, whilst giving greater clarity to people who use the service, and greater protection to children.”


It suggested that the Information Commissioner’s Office and Ofcom should issue a joint statement on how these two regulatory systems will interact once the Online Safety Bill has been enacted. That still sounds eminently sensible, a year and a half later.

Lord Parkinson of Whitley Bay (Con)

My Lords, Amendments 100 and 101 seek further to define the meaning of “significant” in the children’s access assessment, with the intention of aligning this with the meaning of “significant” in the Information Commissioner’s draft guidance on the age-appropriate design code.

I am grateful to the noble Baroness, Lady Kidron, for the way in which she has set out the amendments and the swiftness with which we have considered them. The test in the access assessment in the Bill is already aligned with the test in the code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for all providers. The Information Commissioner’s Office has liaised with Ofcom on its new guidance on the “likely to be accessed” test for the code, with the intention of aligning the two regulatory regimes while reflecting that they seek to do different things. In turn, the Bill will require Ofcom to consult the ICO on its guidance to providers, which will further support alignment between the tests. So, while we agree about the importance of alignment, we think that it is already catered for.

With regard to Amendment 100, Clause 30(4)(a) already states that

“the reference to a ‘significant’ number includes a reference to a number which is significant in proportion to the total number of United Kingdom users of a service”.

There is, therefore, already provision in the Bill for this being a significant number in and of itself.

On Amendment 101, the meaning of “significant” must already be more than insignificant by its very definition. The amendment also seeks to define “significant” with reference to the number of children using a service rather than seeking to define what is a significant number.

I hope that that provides some reassurance to the noble Baroness, Lady Kidron, and that she will be content to withdraw the amendment.

Baroness Kidron (CB)

I am not sure that, at this late hour, I completely understood what the Minister said. On the basis that we are seeking to align, I will withdraw my amendment, but can we check that we are aligned, as my speech came directly from a note from officials that showed a difference? On that basis, I am happy to withdraw.

Amendment 100 withdrawn.
Amendment 101 not moved.
Clause 31: Duties about children’s access assessments
Amendment 102
Moved by
102: Clause 31, page 35, line 1, leave out from “of” to “as” in line 2 and insert “age verification or age estimation that is used on the service”
Member’s explanatory statement
This amendment is consequential on the amendment of clause 30 in my name.
Amendment 102 agreed.
Schedule 3: Timing of providers’ assessments
Amendments 103 to 122
Moved by
103: Schedule 3, page 195, line 34, at end insert—
“5A (1) In this paragraph “the relevant day”, in relation to a regulated user-to-user service, means—(a) the first day on which the service is a Category 1 service, or (b) the first day on which the service again becomes a Category 1 service (following a period during which the service was not a Category 1 service).(2) If, on the relevant day, section 12(2) guidance is available, a section 12(2) assessment of the service must be completed within the period of three months beginning with that day.(3) Sub-paragraph (4) applies if—(a) on the relevant day, the first section 12(2) guidance has not yet been published, and(b) immediately before the publication of that guidance, the service is still a Category 1 service.(4) The first section 12(2) assessment of the service must be completed within the period of three months beginning with the day on which the first section 12(2) guidance is published.”
Member’s explanatory statement
This amendment and the rest of the amendments of Schedule 3 in my name provide for the timing of the first assessments under the new Clause proposed after Clause 11 in my name.
104: Schedule 3, page 196, line 36, leave out “and 12” and insert “to 12A”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
105: Schedule 3, page 196, line 43, at end insert—
“(2A) If the effect of paragraph 5A is that the period within which the first section 12(2) assessment of the service must be completed begins on a day before the assessment start day, the time for carrying out that assessment is extended as set out in paragraph 12A.”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
106: Schedule 3, page 196, line 44, leave out “and 12” and insert “to 12A”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
107: Schedule 3, page 197, line 14, at end insert—
“12A (1) If section 12(2) guidance is available on the assessment start day, the first section 12(2) assessment of the service must be completed within the period of three months beginning with that day.(2) If, on the assessment start day, the first section 12(2) guidance has not yet been published, the first section 12(2) assessment of the service must be completed within the period of three months beginning with the day on which the first section 12(2) guidance is published.”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
108: Schedule 3, page 197, line 24, after “1” insert “or paragraph 5A”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
109: Schedule 3, page 197, line 25, leave out “or CAA” and insert “, CAA or section 12(2) assessment”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
110: Schedule 3, page 197, line 28, leave out “or 15” and insert “, 15 or 15A”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
111: Schedule 3, page 197, line 30, leave out “applies” and insert “and paragraph 5A apply”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
112: Schedule 3, page 198, line 8, at end insert—
“15A (1) If section 12(2) guidance is available on the assessment start day, a section 12(2) assessment of the Part 4B part must be completed within the period of three months beginning with that day.(2) If, on the assessment start day, the first section 12(2) guidance has not yet been published, a section 12(2) assessment of the Part 4B part must be completed within the period of three months beginning with the day on which the first section 12(2) guidance is published.”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
113: Schedule 3, page 198, line 13, at end insert—
“(b) a section 12(2) assessment of the regulated service if a section 12(2) assessment is due to be carried out in relation to the Part 4B part of the service in accordance with paragraph 15A.”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
114: Schedule 3, page 198, line 15, leave out “or a CAA” and insert “, a CAA or a section 12(2) assessment”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
115: Schedule 3, page 198, line 25, leave out “or a CAA” and insert “, a CAA or a section 12(2) assessment”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
116: Schedule 3, page 198, line 26, after “1” insert “or paragraph 5A”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
117: Schedule 3, page 198, line 37, at end insert—
“(c) a section 12(2) assessment is not required to be carried out at the time provided for by paragraph 5A.”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
118: Schedule 3, page 198, line 38, leave out “or CAA” and insert “, CAA or section 12(2) assessment”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
119: Schedule 3, page 198, line 39, at end insert “or paragraph 5A.”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
120: Schedule 3, page 199, line 18, at end insert—
“section 12(2) assessment” means OFCOM’s assessment under section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2)); “section 12(2) guidance” means OFCOM’s guidance under section 47(A1).”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
121: Schedule 3, page 200, line 6, after “CAA” insert “, a section 12(2) assessment”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
122: Schedule 3, page 200, line 12, after “CAAs” insert “, section 12(2) assessments”
Member’s explanatory statement
See the explanatory statement for the first amendment to Schedule 3 in the Minister’s name.
Amendments 103 to 122 agreed.
Amendment 123 not moved.
Schedule 4: Codes of practice under section 36: principles, objectives, content
Amendment 124
Moved by
124: Schedule 4, page 203, line 23, at end insert—
“Content of codes of practice: age assurance
11A (1) This paragraph is about the inclusion of age assurance in a code of practice as a measure recommended for the purpose of compliance with any of the duties set out in section 11(2) or (3) or 25(2) or (3), and sub-paragraph (2) sets out some further principles, in addition to those in paragraphs 1 and 2 (general principles) and 10(2) (freedom of expression and privacy), which are particularly relevant.(2) In deciding whether to recommend the use of age assurance, or which kinds of age assurance to recommend, OFCOM must have regard to the following—(a) the principle that age assurance should be effective at correctly identifying the age or age-range of users;(b) relevant standards set out in the latest version of the code of practice under section 123 of the Data Protection Act 2018 (age-appropriate design code);(c) the need to strike the right balance between—(i) the levels of risk and the nature, and severity, of potential harm to children which the age assurance is designed to guard against, and(ii) protecting the right of users and (in the case of search services or the search engine of combined services) interested persons to freedom of expression within the law;(d) the principle that more effective kinds of age assurance should be used to deal with higher levels of risk of harm to children;(e) the principle that age assurance should be easy to use, including by children of different ages and with different needs;(f) the principle that age assurance should work effectively for all users regardless of their characteristics or whether they are members of a certain group;(g) the principle of interoperability between different kinds of age assurance. (3) In a code of practice that describes measures for the purpose of compliance with the duty set out in section 11(3)(a), OFCOM must recommend (among other things) age verification or age estimation which is of such a kind, and which is to be used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child (see section 11(3C)).(4) In deciding which kinds and uses of age verification or age estimation to recommend for the purpose of compliance with the duty set out in section 11(3)(a), OFCOM must have regard to their guidance under section 73 that gives examples of kinds and uses of age verification and age estimation that are, or are not, highly effective at correctly determining whether or not a particular user is a child.(5) Nothing in sub-paragraph (2) is to be read as allowing OFCOM to recommend, for the purpose of compliance with the duty set out in section 11(3)(a) by providers subject to the requirement in section 11(3A), a kind or use of age verification or age estimation which does not meet the requirement to be highly effective as mentioned in section 11(3C).(6) A code of practice that recommends the use of age assurance for the purpose of compliance with the duties set out in section 11(2) or (3) must also describe measures recommended for the purpose of compliance with the duties set out in—(a) section 11(6), (8) and (10) (inclusion of clear information in terms of service), and(b) section 17(2) and (3)(see, in particular, section 17(5)(e) (complaints about age assurance)).(7) A code of practice that recommends the use of age assurance for the purpose of compliance with the duties set out in section 25(2) or (3) must also describe measures recommended for the purpose of compliance with the duties set out in—(a) section 25(5) and (8) (inclusion of clear information in publicly available statement), and(b) section 27(2) and (3)(see, in particular, section 27(5)(d) (complaints about age assurance)).(8) A code of practice may—(a) refer to industry or technical standards for age assurance (where they exist);(b) elaborate on the principles mentioned in paragraphs (a) and (c) to (g) of sub-paragraph (2).(9) In this paragraph “age assurance” means age verification or age estimation, and see in particular section (“Age verification” and “age estimation”) (4) (self-declaration of age not to be regarded as age verification or age estimation).”
Member’s explanatory statement
This amendment contains provisions which relate to OFCOM’s recommendation of age assurance in codes of practice for the purposes of Part 3 of the Bill. It includes some relevant principles and makes it clear that OFCOM must recommend highly effective age assurance in connection with the duty in Clause 11(3)(a) (preventing children from encountering primary priority content that is harmful to children).
Amendment 125 (to Amendment 124) not moved.
Amendment 124 agreed.
Amendments 126 and 127
Moved by
126: Schedule 4, page 204, line 10, leave out “existing”
Member’s explanatory statement
This amendment is a minor drafting change to omit a superfluous word.
127: Schedule 4, page 204, line 14, at end insert—
“(7) Sub-paragraph (6) does not apply in relation to proactive technology which is a kind of age verification or age estimation technology.”
Member’s explanatory statement
This amendment carves out age assurance technologies from the paragraph of Schedule 4 which is about proactive technology, because age assurance principles etc are covered by new paragraph 11A proposed to be inserted by the amendment in my name above.
Amendments 126 and 127 agreed.
Clause 38: Procedure for issuing codes of practice
Amendment 128 not moved.
Consideration on Report adjourned.
House adjourned at 9.51 pm.
Report (3rd Day)
Relevant documents: 28th and 38th Reports from the Delegated Powers Committee, 15th Report from the Constitution Committee. Scottish and Welsh Legislative Consent granted.
16:00
Clause 38: Procedure for issuing codes of practice
Amendment 129
Moved by
129: Clause 38, page 40, line 29, after “39” insert “(A1), (B1) or”
Member’s explanatory statement
This amendment is consequential on the amendments made to Clause 39 in my name.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the amendments in this group consider regulatory accountability and the roles of Ofcom, the Government and Parliament in overseeing the new framework. The proposals include altering the powers of the Secretary of State to direct Ofcom, issue guidance to Ofcom and set strategic priorities. Ofcom’s operational independence is key to the success of this framework, but the regime must ensure that there is an appropriate level of accountability to government. Parliament will also have important functions, in particular scrutinising and approving the codes of practice which set out how platforms can comply with their duties and providing oversight of the Government’s powers.

I heard the strength of feeling expressed in Committee that the Bill’s existing provisions did not get this balance quite right and have tabled amendments to address this. Amendments 129, 134 to 138, 142, 143, 146 and 147 make three important changes to the power for the Secretary of State to direct Ofcom to modify a draft code of practice. First, these amendments replace the public policy wording in Clause 39(1)(a) with a more defined list of reasons for which the Secretary of State can make a direction. This list comprises: national security, public safety, public health and the UK’s international obligations. This is similar to the list set out in a Written Ministerial Statement made last July but omits “economic policy” and “burden to business”.

This closely aligns the reasons in the Bill with the existing power in Section 5 of the Communications Act 2003. The power is limited to those areas genuinely beyond Ofcom’s remit as a regulator and where the Secretary of State might have access to information or expertise that the regulator does not. Secondly, the amendments clarify that the power will be used only for exceptional reasons. As noble Lords know, this has always been our intent and the changes we are tabling today put this beyond doubt. Thirdly, the amendments increase the transparency regarding the use of the power by requiring the Secretary of State to publish details of a direction at the time the power is used. This will ensure that Parliament has advance sight of modifications to a code and I hope will address concerns that several directions could be made on a single code before Parliament became aware.

This group also considers Amendments 131 to 133, which create an 18-month statutory deadline for Ofcom to submit draft codes of practice to the Secretary of State to be laid in Parliament relating to illegal content, safety duties protecting children and other cross-cutting duties. These amendments sit alongside Amendment 230, which we debated on Monday and which introduced the same deadline for Ofcom’s guidance on Part 5 of the regime.

I am particularly grateful to my noble friend Lady Stowell of Beeston, with whom I have had the opportunity to discuss these amendments in some detail as they follow up points that she and the members of her committee gave particular attention to. I beg to move.

Baroness Stowell of Beeston (Con)

My Lords, I will speak to the amendments in this group in my name: Amendments 139, 140, 144 and 145. I thank the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Viscount, Lord Colville, for signing those amendments and for their continued support on this group. I am also grateful to my noble friend the Minister and his team for engaging with me on the issue of Secretary of State powers. He has devoted a lot of time and energy to this, which is reflected in the wide-ranging group of amendments tabled by him.

Before I go any further, it is worth emphasising that the underlying concern here is making sure that we have confidence, through this new regulatory regime, that the Bill strikes the right balance of power between government, Parliament, the regulator and big tech firms. The committee that I chair—the Communications and Digital Select Committee of your Lordships’ House—has focused most on that in our consideration of the Bill. I should also say that the amendments I have brought forward in my name very much have the support of the committee.

These amendments relate to Clause 39, which is where the main issue lies in the context of Secretary of State powers, and we have three broad concerns. First, as it stood, the Bill handed the Secretary of State unprecedented powers to direct the regulator on pretty much anything. Secondly, these powers allowed the Government to conduct an infinite form of ping-pong with the regulator, enabling the Government to prevail in a dispute. Thirdly, this ping-pong could take place in private, with no possibility of parliamentary oversight or of Parliament being able to intervene, as would be appropriate in the event of a breakdown in the relationship between executive and regulator.

This matters because the Online Safety Bill creates a novel framework for regulating the internet and what we can or cannot see online, in particular political speech, and it will apply well into the future. It is one thing for the current Government, whom I support, to say that they would never use the powers in this way. That is great but, as we know, current Governments cannot speak for whoever is in power in the generations to come, so it is important that we get this right.

As my noble friend said, he has brought forward amendments to Clause 39 that help to address this. I support and commend him for that. The original laundry list of powers to direct Ofcom has been shortened and now follows the precedent set out in the Communications Act 2003. The government amendments also say that the Secretary of State must now publish their directions to Ofcom, which will improve transparency, and, once the code is agreed, Ofcom will publish the changes so that Parliament can see what changes have been made and why. These are all very welcome and, as I say, they go a long way to addressing some of our concerns, but two critical issues remain.

First, the Government retain an opt-out, which means that they do not have to publish their directions if the Secretary of State believes that doing so would risk

“national security or public safety”,

or international relations. However, those points are now the precise grounds on which the Secretary of State may issue a direction and, if history is any guide, there is a real risk that we will never hear about the directions because the Government have decided that they are a security issue.

My Amendments 139 and 140 would require the Secretary of State to at least notify Parliament of the fact that a direction has been issued and what broad topic it relates to. That would not require any details to be published, so it does not compromise security, but it does give assurance that infinite, secretive ping-pong is not happening behind the scenes. My noble friend spoke so quickly at the beginning that I was not quite sure whether he signalled anything, but I hope that he may be able to respond enthusiastically to Amendments 139 and 140.

Secondly, the Government still have powers for infinite ping-pong. I appreciate that the Government have reservations about capping the number of exchanges between the Secretary of State and Ofcom, but they must also recognise the concern that they appear to be preparing the ground for any future Government to reject infinitely the regulator’s proposals and therefore prevail in a dispute about a politically contentious topic. My Amendments 144 and 145 would clarify that the Government will have a legally binding expectation that they will use no more than the bare minimum number of directions to achieve the intent set out in their first direction.

The Government might think that adding this to the Bill is superfluous, but it is necessary in order to give Parliament and the public confidence about the balance of power in this regime. If Parliament felt that the Secretary of State was acting inappropriately, we would have sufficient grounds to intervene. As I said, the Government acknowledged in our discussions the policy substance of these concerns, and as we heard from my noble friend the Minister in introducing this group, there is an understanding on this. For his part, there is perhaps a belief that what they have done goes far enough. I urge him to reconsider Amendments 144 and 145, and I hope that, when he responds to the debate on this group, he can say something about not only Amendments 139 and 140 but the other two amendments that will give me some grounds for comfort.

Lord Moylan (Con)

My Lords, I realise that, on this Bill, I am something of a fish out of water in this House, as I was in Committee. The Bill is fundamentally flawed in a number of respects, including its approach to governance, which we are discussing today. Having said that, I am generally sympathetic to the amendments proposed by my noble friend Lady Stowell of Beeston: if we are to have a flawed approach, her amendments would improve it somewhat.

However, my approach is rather different and is based on the fairly simple but important principle that we live in a free democracy. If we are to introduce a new legislative measure such as this Bill, which has far-reaching powers of censorship taking us back 70 or 80 years in terms of the freedom of expression we have been able to develop since the 1950s and 1960s—to the days of Lady Chatterley’s Lover and the Lord Chamberlain, in equivalent terms, as far as the internet and the online world are concerned—then decisions of such a far-reaching character affecting our lives should be taken by somebody who is democratically accountable.

My approach is utterly different from that which my noble friend on the Front Bench has proposed. He has proposed amendments which limit yet further the Secretary of State’s power to give directions to Ofcom, but the Secretary of State is the only party in that relationship who has a democratic accountability. We are transferring huge powers to a completely unaccountable regulator, and today my noble friend proposes transferring, in effect, even more powers to that unaccountable regulator.

To go back to a point that was discussed in Committee and earlier on Report, if Ofcom takes certain decisions which make it impossible for Wikipedia to operate its current model, such that it has to close down at least its minority language websites—my noble friend said that the Government have no say over that and no idea what Ofcom will do—to whom do members of the public protest? To whom do they offer their objections? There is no point writing to the Secretary of State because, as my noble friend told us, they will not have had any say in the matter and we in this House will have forsworn the opportunity, which I modestly proposed, to take those powers here. There is no point writing to their MP, because all their MP can do is badger the Secretary of State. It is a completely unaccountable structure that is completely indefensible in a modern democratic society. So I object to the amendments proposed by my noble friend, particularly Amendments 136 and 137.

16:15
I rise particularly to speak to the amendments in my name: Amendments 218, 220, 221 and 223. This is surely the structure we want, one in which decisions are made by someone who is accountable and are then properly scrutinised by Parliament. My amendments introduce that particularly in respect of the Secretary of State’s powers to set Ofcom’s strategic objectives. This is the purpose of my Amendment 223, to which Amendments 220 and 226 are consequential. It would require that those instructions and directions should be approved by Parliament through the affirmative process. At the moment, the proposal is that they be approved by the negative process, and I think it should be the affirmative process; that is what my amendments seek to achieve. I do not think it requires much argument.

My Amendment 218 relates not to the setting of Ofcom’s strategic priorities but to the guidance to be given by the Secretary of State to Ofcom on the exercise of any function. There is currently no parliamentary check on this guidance; it simply has to be laid before Parliament, but there is no procedure for this, neither negative nor affirmative. Surely we should say that this, too, should be subject to the affirmative procedure so that your Lordships’ House has an opportunity to debate and comment on those directions and that guidance, so that we have some of the features of a proper, functioning democracy.

Overall, we need accountable decision-makers, not unaccountable regulators, and we need them to be subject to parliamentary scrutiny. That is the burden of my argument and the effect of my amendments. I hope that they will command the support of the House.

Viscount Colville of Culross (CB)

My Lords, the codes of practice are among the most important documents that Ofcom will produce as a result of the Bill—in effect, deciding what content we, the users of the internet, will see. The Government’s right to modify these drafts affects us all, so it is absolutely essential that the codes are trusted.

I, too, welcome the Government’s Amendments 134 to 138, which are a huge improvement on the Clause 39 that was presented in Committee. I am especially grateful that the Government have not proceeded with including economic conditions as a reason for the Secretary of State to modify draft codes, which the noble Baroness, Lady Harding, pointed out in Committee would be very damaging. But I would like the Minister to go further, which is why I put my name to Amendments 139, 140, 144 and 145.

Amendment 139 is so important at the moment. My fear is about the opt-out from publishing these directions from the Secretary of State for Ofcom to modify the draft codes, which will then allow them to be made behind closed doors between the Government and the regulator. This should not be allowed to happen. It would happen at a time when trust in the Government is low and there is a feeling that so many decisions affecting us all are taken without our knowledge. Surely it is right that there should be as much transparency as possible in exposing the pressure that the Minister is placing on the regulator. I hope that, if this amendment is adopted, it will allow Parliament to impose the bright light of transparency on the entire process, which is in danger of becoming opaque.

I am sure that no one wants a repeat of what happened under Section 94 of the Telecommunications Act 1984, which gave the Secretary of State power to give directions of a “general character” to anyone, in the “interests of national security” or international relations, as long as they did not disclose important information to Parliament. The Minister’s power to operate in total secrecy, without any accountability to Parliament, was seen by many as wrong and undemocratic. It was subsequently repealed. Amendments 139 and 140 will prevent the creation of a similar problem.

Likewise, I support Amendment 144, which builds on the previous amendments, as another brake on the control of the Secretary of State over this important area of regulation. Noble Lords in this House know how much the Government dislike legislative ping-pong—which we will see later this evening, I suspect. I ask the Minister to transfer this dislike to limiting ping-pong between the Government and the regulator over the drafting of codes of practice. It would also prevent the Secretary of State or civil servants expanding their control of the draft codes of practice from initial parameters to slightly wider sets of parameters each time that they are returned to the Minister for consideration. It will force the civil servants and the Secretary of State to make a judgment on the limitation of content and ensure that they stick to it. As it is, the Secretary of State has two bites of the cherry. They are involved in the original shaping of the draft codes of practice and then they can respond to Ofcom’s formulation. I hope the Minister would agree that it is sensible to stop this process from carrying on indefinitely. I want the users of the digital world to have full faith that the control of online content they see is above board—and not the result of secretive government overreach.

Baroness Harding of Winscombe (Con)

My Lords, not for the first time I find myself in quite a different place from my noble friend Lord Moylan. Before I go through some detailed comments on the amendments, I want to reflect that at the root of our disagreement is a fundamental view about how serious online safety is. The logical corollary of my noble friend’s argument is that all decisions should be taken by Secretaries of State and scrutinised in Parliament. We do not do that in other technical areas of health and safety in the physical world and we should not do that in the digital world, which is why I take such a different view—

Baroness Harding of Winscombe (Con)

Perhaps the noble Lord will allow me to make my point. I really welcome the government amendments in this group. I thank my noble friend the Minister for bringing them forward and for listening hard to the debates that we had at Second Reading and in Committee. I am very pleased to see the removal of economic policy and the burdens to business as one of the reasons that a Secretary of State could issue directions. I firmly believe that we should not be putting Secretaries of State in the position of having to trade off safety for economic growth. The reality is that big tech has found it impossible to make those trade-offs too. People who work in these companies are human beings. They are looking for growth in their businesses. Secretaries of State are rightly looking for economic growth in our countries. We should not be putting people in the position of trying to make that trade-off. The right answer is to defer to our independent regulator to protect safety. I thank my noble friend and the Government very much for tabling these amendments.

I also support my noble friend Lady Stowell, as a member of the Communications and Digital Committee that she chairs so ably. She has brought forward a characteristically thoughtful and detailed set of amendments in an attempt to look around the corners of these powers. I urge my noble friend the Minister to see whether he can find a way forward on the specific issues of infinite and secretive ping-pong. Taking the secretive ping-pong first, my noble friend Lady Stowell has found a very clever way of making sure that it is not possible for future Governments to obscure completely any direction that they are giving, while at the same time not putting at risk any national secrets. It is a very thoughtful and precise amendment. I very much hope that my noble friend the Minister can support it.

On the infinite nature of ping-pong, which I feel is quite ironic today—I am not sure anyone in this House welcomes the concept of infinite ping-pong right now, whatever our views on business later today—friends of mine in the business world ask me what is different about working in government versus working in the business world; I have worked in both big and small businesses. Mostly it is not different: people come to work wanting to do a good job and to further the objectives of the organisation that they are part of, but one of the biggest differences in government is that doing nothing and continuing to kick the can down the road is a much more viable option in the body politic than it is in the business world. Rarely is that for the good.

One of the things you learn in business is that doing nothing is often the very worst thing you can do. My worry about the infinite nature of the ping-pong is that it relates to a tech business world that moves unbelievably fast. What we do not need is to enshrine a system that enables government essentially to avoid doing anything. That is a particularly businesslike and pragmatic reason to support my noble friend’s amendment. I stress that it is a very mild amendment. My noble friend Lady Stowell has been very careful and precise not to put unreasonable burdens on a future Secretary of State. In “Yes Minister”-speak, the bare minimum could be quite a lot. I urge my noble friend the Minister to look positively on what are extremely constructive amendments, delivered in a very thoughtful way.

Lord Allan of Hallam (LD)

My Lords, I want to congratulate the noble Baroness, Lady Stowell, on her amendments and to raise some concerns, in particular about Amendment 138. I do this as somebody who has had the perhaps unique experience of being leaned on by Governments around the world who sought to give us, as a platform, directions about how to handle content. The risk is real: when there is a huge public outcry and you are an elected politician, you must be seen to be doing something, and the thing that you have been doing to date is to go directly to the platforms and seek to lean on them to make the change that you want.

In future, as the noble Baroness, Lady Stowell, has pointed out quite a few times, we are moving the accountability from the platforms to our independent regulator, Ofcom—and I agree with the noble Baroness, Lady Harding, that that is the right model, as it is an independent regulator. In these amendments we are considering a mechanism whereby that political outrage can still find an outlet, and that outlet will be a direction from the Secretary of State to the regulator asking it to change the guidance that it would otherwise have issued. It is really important that we dig into that and make sure that it does not prevent legitimate political activity but, at the same time, does not replicate the problem that we have had—the lack of transparency about decision-making inside companies, which has been resolved and addressed through leaks and whistleblowers. We do not want to be in a position in which understanding what has been happening in that decision-making process, now inside government, depends on leaks and whistleblowers. Having these directions published seems critical, and I worry a lot about Amendment 138 and how it will potentially mean that the directions are not published.

I have a couple of specific questions around that process to which I hope the Minister can respond. I understand how this will work: Ofcom will send its draft code of practice to the department and, inside the department, if the Secretary of State believes that there is an issue related to national security or there is another more limited set of conditions, they will be able to issue a direction. The direction may or may not have reasons with it; if the Secretary of State trusts Ofcom, they might give their reasons, but if the Secretary of State does not trust Ofcom with the information, they will give it the bare direction with no reasons. Clause 39 gives the Secretary of State the power to either give or withhold reasons, for reasons of national security. Ofcom will then come up with an amended version of the code of practice, reflecting the direction that it has been given.

The bit that I am really interested in is what happens from a Freedom of Information Act point of view. I hope that the Minister can clarify whether it would be possible for an individual exercising their Freedom of Information Act powers to seek the original draft code of practice as it went to the department. The final code of practice will be public, because it will come to us. It may be that we are in a situation in which you can see the original—Ofcom’s draft—and the final draft as it came to Parliament, and the only bit you cannot see under Amendment 138 is the actual direction itself, if the Secretary of State chooses to withhold it. That is quite critical, because we can anticipate that in these circumstances there will be Freedom of Information Act requests and a significant public interest in understanding any direction that was given that affected the speech of people in the United Kingdom. I would expect the ICO, unless there was some compelling reason, to want that original draft from Ofcom to be made public. That is one question around the interaction of the Freedom of Information Act and the process that we are setting out here, assuming that the Secretary of State has withheld their direction.

The other question is whether the Minister can enlighten us as to the circumstances in which he thinks the Secretary of State would be happy to publish the direction. We have said that this is now related only to very narrow national security interests and we have given them that get-out, so I am curious as to whether there are any examples of the kind of direction, in legislating for a power for the Secretary of State, that would meet the narrow criteria of being those exceptional circumstances, yet not be so sensitive—to use the double negative—that the Secretary of State would want to withhold it. If there were some examples of that, it might help assure us that the withholding of publication will be exceptional rather than routine.

My fear is that Amendment 138 says you can withhold in some circumstances. Actually, if we read it all together and say that, by definition, the direction comes from the fact that there is a national security concern, we end up with a situation in which the lack of publication has to be on national security grounds. Those two mirror each other, and therefore the norm may be that directions are never published. The Minister might allay our concerns if he could, at least in general terms, describe the kind of directions that would meet the gateway criteria for being permissible and yet not be so sensitive that the Secretary of State would be uncomfortable with them being published.

16:30
Baroness Fox of Buckley (Non-Afl)

My Lords, a lot of positive and interesting things have been said that I am sympathetic to, but this group of amendments raises concerns about a democratic deficit: if too much of the Bill is either delegated to the Secretary of State or left open to interference in the relationship between the Secretary of State and Ofcom, who decides what the priorities are? I will ask for a couple of points of clarification.

I am glad to see that the term “public policy” has been replaced, because what did that mean? Everything. But I am not convinced that saying that the Secretary of State can decide not just on national security but on public safety and public health is reassuring in the present circumstances. The noble Lord, Lord Allan, has just pointed out what it feels like to be leaned on. We had a very recent example internationally of Governments leaning on big tech companies in relation to Covid policies, lockdowns and so on, and removing material that was seen to contradict official public health advice—often public health advice that turned out not to be accurate at all. There should at least have been a lot more debate about what were political responses to a terrible virus. Noble Lords will know that censorship became a matter of course during that time, and Governments interfering in or leaning on big tech directly was problematic. I am not reassured that the Government hold to themselves the ability to lean on Ofcom around those issues.

It is also worth remembering that the Secretary of State already has a huge amount of power to designate, as we have discussed previously. They can designate what constitute priority illegal offences and priority content harmful to children, and that can all change beyond what we have discussed here. We have already seen that there is a constant expansion of what those harms can be, and having those decisions removed using only secondary legislation, unaccountable to Parliament or to public scrutiny, really worries me. It is likely to give a green light to every identity group and special interest NGO to demand that the list of priority harms and so on should be dealt with. That is likely to make the job of the Secretary of State to respond to “something must be done” moral panics all the more difficult. If that is going to happen, we should have parliamentary scrutiny of it; it cannot just be allowed to happen elsewhere.

It is ironic that the Secretary of State, because they are elected, is more democratic than an unelected regulator. I just feel that there is a danger in so much smoke and mirrors. When the Minister very kindly agreed to see the noble Lord, Lord Moylan, and me, I asked in a rather exasperated way why Ofcom could not make freedom of expression a priority, with codes of practice so that it would have to check on freedom of speech. The Minister said, “It’s not up to me to tell Ofcom what to do”, and I thought, “The whole Bill is telling Ofcom what to do”. That did not seem to make any sense.

I had another exchange with the present Secretary of State—again, noble Lords will not be surprised to hear that it was not a sophisticated intervention on my part—in which I said, “Why can’t the Government force the big tech companies to put freedom of expression in their terms and conditions or terms of service?” The Minister said, “They are private companies; we’re not interfering in what they do”. So you just end up thinking, “The whole Bill is telling companies that they’re going to be compelled to act in relation to harm and safety, but not on freedom of expression”. What that means is that you feel all the time as though the Government are saying that they are outsourcing this to third parties, which means that you cannot hold anyone to account.

Civil liberties campaigner Guy Herbert compared this to what is happening with the banks at the moment; they are being blamed by the Government and held to account for things such as their treatment of politically exposed persons and Ts and Cs that overconcentrate on values such as EDI and ESG, which may be leading to citizens of this country having their bank accounts closed down. The Government say that they will tell the regulator that it has to act and say that the banks cannot behave in this way, but this all came from legislation—it is not as though the regulator was doing it off its own bat. Maybe it overinterpreted the legislation and the banks then overinterpreted it again and overremoved.

The obvious analogy for me is that there is a danger here that we will not be able to hold anyone to account for overremoval of legitimate democratic discussion from the online world, because everyone is pointing the finger at everyone else. At the very least, the amendments are trying to say that any changes beyond what we have discussed so far on this Bill must come before Parliament. That is very important for any kind of democratic credibility to be attached to this legislation.

Baroness Kidron (CB)

My Lords, I too express my admiration to the noble Baroness, Lady Stowell, for her work on this group with the Minister and support the amendments in her name. To pick up on what the noble Baroness, Lady Harding, said about infinite ping-pong, it can be used not only to avoid making a decision but as a form of power and of default decision-making—if you cannot get the information back, you are where you are. That is a particularly important point and I add my voice to those who have supported it.

I have a slight concern that I want to raise in public, so that I have said it once, and get some reassurance from the Minister. New subsection (B1)(d) in Amendment 134 concerns the Secretary of State directing Ofcom to change codes that may affect

“relations with the government of a country outside the United Kingdom”.

Many of the companies that will be regulated sit in America, which has been very forceful about protecting its sector. Without expanding on this too much, when it was suggested that senior managers would face some sort of liability in international fora, various parts of the American Government and state apparatus certainly made their feelings clearly known.

I am sure that the channels between our Government and the US are much more straightforward than any that I have witnessed, but it is absolutely definite that more than one Member of your Lordships’ House was approached about senior management liability and told, “This is a worry to us”. I believe that where we have landed is very good, but I would like the Minister to say what the limits of that power are and to acknowledge that it could get into a bit of a muddle between the economic outcomes that we were talking about, celebrating that they had been taken off the list, and relations with foreign governments. That was the thing that slightly worried me in the government amendments, which, in all other ways, I welcome.

Lord Clement-Jones (LD)

My Lords, this has been a consistent theme ever since the Joint Committee’s report. It was reported on by the Delegated Powers and Regulatory Reform Committee, and the Communications and Digital Committee, chaired by the noble Baroness, Lady Stowell, has rightly taken up the issue. Seeing some movement from the Minister, particularly on Clause 39 and specifically in terms of Amendments 134 to 137, is very welcome and consistent with some of the concerns that have been raised by noble Lords.

There are still questions to answer about Amendment 138, which my noble friend has raised. I have also signed the amendments to Clause 38 because I think the timetabling is extremely welcome. However, like other noble Lords, I believe we need to have Amendments 139, 140, 144 and 145 in place, as proposed by the noble Baroness, Lady Stowell of Beeston. The phrase “infinite ping-pong” makes us all sink into gloom, in current circumstances—it is a very powerful phrase. I think the Minister really does have to come back with something better; I hope he will give us that assurance, and that his discussions with the noble Baroness, Lady Stowell, will bear further fruit.

I may not agree with the noble Lord, Lord Moylan, about the Clause 39 issues, but I am glad he raised issues relating to Clause 159. It is notable that, of all the recommendations by the Delegated Powers and Regulatory Reform Committee, the Government accepted four out of five but did not accept the one related to what is now Clause 159. I have deliberately de-grouped the questions of whether Clauses 158 and 159 should stand part of the Bill, so I am going to pose a few questions now in the hope that, when we get to the second group, which contains my clause stand part proposition, the Minister will be able to tell me effortlessly what he is going to do. This will prevent me from putting down further amendments on those clauses, because it seems to me that the Government are being extraordinarily inconsistent in how they are dealing with Clauses 158 and 159 compared with how they have amended Clause 39.

For instance, Clause 158 allows the Secretary of State, where they have reasonable grounds for believing that there is a threat to public health and safety or national security, to direct Ofcom to set objectives for how it uses its media literacy powers under Section 11 of the Communications Act for a specific period to address the threat, and to make Ofcom issue a public statement notice. That is rather extraordinary. I will not go into great detail at this stage, and I hope the Minister can save me from having to make a long speech further down the track, but the Government should not be in a position to direct a media regulator on a matter of content. For instance, the Secretary of State has no powers over Ofcom on the content of broadcast regulation—indeed, they have limited powers to direct over radio spectrum and wires—and there is no provision for parliamentary involvement, although I accept that the Secretary of State must publish reasons for the direction. There is also the general question of whether the threshold is high enough to justify this kind of interference. So Clause 158 is not good news at all. It raises a number of questions which I hope the Minister will start to answer today, and maybe we can avoid a great debate further down the track.

16:45
Then, of course, we have Clause 159. I think the noble Lord, Lord Moylan, is correct: there is not nearly enough parliamentary input into this, as the DPRRC itself said. It allows the Secretary of State to issue “have regard” guidance to Ofcom about Ofcom’s exercise of functions under the Act, research it might carry out, the use of its powers from the Communications Act, and how Ofcom uses its media literacy powers in the Communications Act across the Bill.

The point of principle involved here is that the Secretary of State should not interfere with the independence of the communications regulator—in particular, not in its day-to-day operation. There are a number of questions there, particularly on why the Government absolutely resisted what the DPRRC had to say. This is unusual, as normally the Government make a better fist of responding than I think they did on this occasion, but I look forward to hearing what the Minister has to say.

Baroness Merron (Lab)

My Lords, first, I have to say that, having read Hansard from last Thursday, I feel I should have drawn attention to my interests in the register that relate to the Jewish community. I apologise for not doing so at the time and am pleased to now put this on the record.

I will be brief, as noble Lords have already raised a number of very pertinent points, to which I know the Minister will want to respond. In this group of amendments, there is a very welcome focus on transparency, accountability and the role of Parliament, all of which are absolutely crucial to the success of the Bill. I am grateful to the Minister for his introduction and explanation of the impact of the proposed changes to the role of the Secretary of State and Ofcom, whose codes of practice will be, as the noble Viscount, Lord Colville, said, vitally important to the Bill. We very much welcome the amendments in the name of the noble Baroness, Lady Stowell, which identify the requirements of the Secretary of State. We also welcome the government amendments, which, along with the amendments by the noble Baroness, have been signed by my noble friend Lord Stevenson.

The amendments tabled in the name of the noble Lord, Lord Moylan, raise interesting points, including about the requirement to use the affirmative procedure. I look forward to the Minister’s response to those and other amendments. It would be helpful to hear from the Minister his thoughts on arrangements for post-legislative scrutiny. It would also help our deliberations to understand whether there have been discussions on this between the usual channels.

Lord Parkinson of Whitley Bay (Con)

My Lords, this is indeed an apposite day to be discussing ongoing ping-pong. I am very happy to speak enthusiastically and more slowly about my noble friend Lady Stowell of Beeston’s Amendments 139 and 140. We are happy to support those, subject to some tidying up at Third Reading. We agree with the points that she has made and are keen to bring something forward which would mean broadly that a statement would be laid before Parliament when the power to direct had been used. My noble friend Lady Harding characterised them as the infinite ping-pong question and the secretive ping-pong question; I hope that deals with the secretive ping-pong point.

My noble friend Lady Stowell’s other amendments focus on the infinite ping-pong question, and the power to direct Ofcom to modify a code. Her Amendments 139, 140, 144 and 145 seek to address those concerns: that the Secretary of State could enter into a private form of ping-pong with Ofcom, making an unlimited number of directions on a code to prevent it from ever coming before Parliament. Let me first be clear that we do not foresee that happening. As the amendments I have spoken to today show, the power can be used only when specific exceptional reasons apply. In that sense, we agree with the intent of the amendments tabled by my noble friend Lady Stowell. However, we cannot accept them as drafted because they rely on concepts—such as the “objective” of a direction—which are not consistent with the procedure for making a direction set out in the Bill.

The amendments I have brought forward mean that private ping-pong between the Secretary of State and Ofcom on a code is very unlikely to happen. Let me set out for my noble friend and other noble Lords why that is. The Secretary of State would need exceptional reasons for making any direction, and the Bill then requires that the code be laid before Parliament as soon as is reasonably practicable once the Secretary of State is satisfied that no further modifications to the draft are required. That does not leave room for the power to be used inappropriately. A code could be delayed in this way and in the way that noble Lords have set out only if the Secretary of State could show that there remained exceptional reasons once a code had been modified. This test, which is a very high bar, would need to be met each time. Under the amendments in my name, Parliament would also be made aware straightaway each time a direction was made, and when the modified code came before Parliament, it would now come under greater scrutiny using the affirmative procedure.

I certainly agree with the points that the noble Lord, Lord Allan, and others made that any directions should be made in as transparent a way as possible, which is why we have tabled these amendments. There may be some circumstances where the Secretary of State has access to information—for example, from the security services—the disclosure of which would have an adverse effect on national security. In our amendments, we have sought to retain the existing provisions in the Bill to make sure that we strike the right balance between transparency and protecting national security.

As the noble Lord mentioned, the Freedom of Information Act provides an additional route to transparency while also containing existing safeguards in relation to national security and other important areas. He asked me to think of an example of something that would be exceptional but not require that level of secrecy. By dropping economic policy and burden to business, I would point him to an example in those areas, but a concrete example evades me this afternoon. Those are the areas to which I would turn his attention.

Lord Allan of Hallam (LD)

Can the Minister confirm that the fact that a direction has been made will always be known to the public, even if the substance of it is not because it is withheld under the secrecy provision? In other words, will the public always have a before and after knowledge of the fact of the direction, even if its substance is absent?

Lord Parkinson of Whitley Bay (Con)

Yes; that is right.

I hope noble Lords will agree that the changes we have made and that I have outlined today as a package mean that we have reached the right balance in this area. I am very grateful to my noble friend Lady Stowell —who I see wants to come in—for the time that she too has given this issue, along with members of her committee.

Baroness Stowell of Beeston (Con)

I am grateful to my noble friend for his constructive response to my Amendments 139 and 140. I am sure he will do me the honour of allowing me to see the Government’s reversioning of my amendments before they are laid so that we can be confident at Third Reading that they are absolutely in line with expectations.

Could I press my noble friend a little further on Amendments 144 and 145? As I understood what he said, the objection from within government is to the language in the amendments I have tabled—although as my noble friend Lady Harding said, they are incredibly modest in their nature.

I was not sure whether my noble friend was saying in his defence against accepting them that issuing a direction would have to be exceptional, and that that led to a need to clarify that this would be ongoing. Would each time there is a ping or a pong be exceptional? Forgive me, because it starts to sound a bit ridiculous when we get into this amount of detail, but it seems to me that the “exceptional” issue kicks in at the point where you issue the direction. Once you engage in a dialogue, “exceptional” is no longer really the issue. It is an odd defence against trying to limit the number of times you allow that dialogue to continue. Bearing in mind that he is willing to look again at Amendments 139 and 140, I wonder whether, between now and Third Reading, he would at least ask parliamentary counsel to look again at the language in my original amendment.

Lord Parkinson of Whitley Bay (Con)

I am certainly happy to commit to showing my noble friend the tidying up we think necessary of the two amendments I said we are happy to accept ahead of Third Reading. On the others, as I said, the code could be delayed repeatedly only if the Secretary of State showed that there remained exceptional reasons once it had been modified, and that high bar would need to be met each time. So we do not agree with her Amendments 144 and 145 because of concerns about the drafting of my noble friend’s current amendment and because the government amendments we have brought forward cater for the scenario about which she is concerned. Her amendments would place a constraint on the Secretary of State not to give more directions than are necessary to achieve the objectives set out in the original direction, but they would not achieve the intent I think my noble friend has. The Bill does not require the direction to have a particular objective. Directions are made because the Secretary of State believes that modifications are necessary for exceptional reasons, and the direction must set out the reasons why the Secretary of State believes that a draft should be modified.

Through the amendments the Government have laid today, the direction would have to be for exceptional reasons relating to a narrower list and Parliament would be made aware each time a direction was made. Parliament would also have increased scrutiny in cases where a direction had been made under Clause 39(1)(a), because of the affirmative procedure. However, I am very happy to keep talking to my noble friend, as we will be on the other amendments, so we can carry on our conversation then if she wishes.

Let me say a bit about the amendments tabled by my noble friend Lord Moylan. His Amendment 218 would require the draft statement of strategic priorities laid before Parliament to be approved by resolution of each House. As we discussed in Committee, the statement of strategic priorities is necessary because future technological changes are likely to shape harms online, and the Government must have an avenue through which to state their strategic priorities in relation to these emerging technologies.

The Bill already requires the Secretary of State to consult Ofcom and other appropriate persons when preparing a statement. This provides an opportunity for consideration and scrutiny of a draft statement, including, for example, by committees of Parliament. This process, combined with the negative procedure, provides an appropriate level of scrutiny and is in line with comparable existing arrangements in the Communications Act in relation to telecommunications, the management of radio spectrum and postal services.

My noble friend’s other amendments would place additional requirements on the Secretary of State’s power to issue non-binding guidance to Ofcom about the exercise of its online safety functions. The guidance document itself does not create any statutory requirements —Ofcom is required only to have regard to the guidance —and on that basis, we do not agree that it is necessary to subject it to parliamentary approval as a piece of secondary legislation. As my noble friend Lady Harding of Winscombe pointed out, we do not require that in numerous other areas of the economy, and we do not think it necessary here.

Let me reassure my noble friend Lord Moylan on the many ways in which Parliament will be able to scrutinise the work of Ofcom. Like most other regulators, it is accountable to Parliament in how it exercises its functions. The Secretary of State is required to present its annual report and accounts before both Houses. Ministers from the devolved Administrations must also lay a copy of the report before their respective Parliament or Assembly. Ofcom’s officers can be required to appear before Select Committees to answer questions about its work; indeed, its chairman and chief executive appeared before your Lordships’ Communications and Digital Committee just yesterday. Parliament will also have a role in approving a number of aspects of the regulatory framework through its scrutiny of both primary and secondary legislation.

17:00
Once the regime that the Bill establishes is in force, a key point will be the Secretary of State’s review of its effectiveness, which will take place between two and five years after it comes into force, resulting in the production of a report that will then be laid before Parliament. This will clearly be an important moment, requiring input and scrutiny from a number of parties. We will ensure that Parliament is central to that process and is able thoroughly to scrutinise the operation of the regulatory framework in a way that deploys the skills and expertise in both Houses.
The noble Baroness, Lady Merron, asked to hear a bit more about this post-legislative scrutiny. In addition to that, we agree that ongoing parliamentary scrutiny of the regime will be crucial to providing reassurance that it is working in the way we all intend it to. The creation of the Department for Science, Innovation and Technology means that there is a new dedicated Select Committee in another place looking at the work of that department, and this provides an enhanced opportunity for cross-party work to scrutinise the online safety regime and digital regulation. More broadly, your Lordships’ Communications and Digital Committee will of course continue to play a vital role in scrutiny, as its work yesterday in talking to Ofcom’s chief executive and chairman demonstrates. We will continue to consider how to support the committee’s work; indeed, we will have an opportunity in a later debate to discuss this issue further in relation to Amendment 239.
The noble Lord, Lord Clement-Jones, asked why it is necessary for the Secretary of State to have powers over Ofcom in certain circumstances. We expect the media literacy powers to be used only in exceptional circumstances where it is right that the Secretary of State should have the power to direct the regulator.
Lord Clement-Jones (LD)

My Lords, the key question is this: why have these powers over social media when the Secretary of State does not have them over broadcast?

Lord Parkinson of Whitley Bay (Con)

If I may, I will write to the noble Lord having reflected on that question further. We are talking here about the provisions set up in the Bill to deal with online harms; clearly, that is the focus here, which is why this Bill deals with that. I will speak to colleagues who look at other areas and respond further to the noble Lord’s question.

Let me reassure the noble Baroness, Lady Fox, that, through this Bill, both Ofcom and providers are being asked to have regard to freedom of expression. Ofcom already has obligations under the Human Rights Act to be bound by the European Convention on Human Rights, including Article 10 rights relating to freedom of expression. Through this Bill, user-to-user and search services will have to consider and implement safeguards for freedom of expression when fulfilling their duties. Those points are uppermost in our minds.

I am grateful for the support expressed by noble Lords for the government amendments in this group. Given the mixed messages of support and the continued work with my noble friend Lady Stowell of Beeston, I urge her not to move her amendments.

Amendment 129 agreed.
Amendment 130 not moved.
Amendments 131 to 133
Moved by
131: Clause 38, page 41, line 4, leave out “This section applies” and insert “Subsections (1) to (6) apply”
Member’s explanatory statement
This amendment is consequential on the amendment inserting new subsections (9) to (13) into this Clause in my name.
132: Clause 38, page 41, line 5, leave out “it applies” and insert “they apply”
Member’s explanatory statement
This amendment is consequential on the amendment inserting new subsections (9) to (13) into this Clause in my name.
133: Clause 38, page 41, line 7, at end insert—
“(9) Subsection (11) applies to—
(a) a draft of the first code of practice prepared under section 36(1) (terrorism code of practice);
(b) a draft of the first code of practice prepared under section 36(2) (CSEA code of practice);
(c) a draft of the first code of practice prepared under section 36(3) relating to a duty set out in section 9 or 23 (illegal content);
(d) a draft of the first code of practice prepared under section 36(3) relating to a duty set out in section 11 or 25 (children’s online safety);
(e) a draft of the first code of practice prepared under section 36(3) relating to a duty set out in section 16 or 26 (content reporting);
(f) a draft of the first code of practice prepared under section 36(3) relating to—
(i) a duty set out in section 17 (complaints procedures) that concerns complaints of a kind mentioned in subsection (4) or (5) of that section, or
(ii) a duty set out in section 27 (complaints procedures).
(10) For the purposes of paragraphs (c) to (f) of subsection (9) a draft of a code of practice is a draft of the first code of practice relating to a duty if—
(a) it describes measures recommended for the purpose of compliance with the duty, and
(b) it is a draft of the first code of practice prepared under section 36(3) that describes measures for that purpose.
(11) OFCOM must submit a draft to which this subsection applies to the Secretary of State under subsection (1) within the period of 18 months beginning with the day on which this Act is passed.
(12) If OFCOM consider that it is necessary to extend the period mentioned in subsection (11) in relation to a draft mentioned in any of paragraphs (a) to (f) of subsection (9), OFCOM may extend the period in relation to that draft by up to 12 months by making and publishing a statement. But this is subject to subsection (15).
(13) A statement under subsection (12) must set out—
(a) the reasons why OFCOM consider that it is necessary to extend the period mentioned in subsection (11) in relation to the draft concerned, and
(b) the period of extension.
(14) A statement under subsection (12) may be published at the same time as (or incorporate) a statement under section (Time for publishing first guidance under certain provisions of this Act)(3) (extension of time to prepare certain guidance).
(15) But a statement under subsection (12) may not be made in relation to a draft mentioned in a particular paragraph of subsection (9) if—
(a) a statement has previously been made under subsection (12) (whether in relation to a draft mentioned in the same or a different paragraph of subsection (9)), or
(b) a statement has previously been made under section (Time for publishing first guidance under certain provisions of this Act)(3).”
Member’s explanatory statement
This amendment provides that OFCOM must prepare the first draft of certain codes of practice within 18 months of Royal Assent, unless they consider a longer period to be necessary in which case OFCOM may (on one occasion only) extend the period and set out why in a published statement.
Amendments 131 to 133 agreed.
Clause 39: Secretary of State’s powers of direction
Amendments 134 to 137
Moved by
134: Clause 39, page 41, line 8, at end insert—
“(A1) The Secretary of State may direct OFCOM to modify a draft of a code of practice submitted under section 38(1) if the Secretary of State believes that modifications are required for the purpose of securing compliance with an international obligation of the United Kingdom.
(B1) The Secretary of State may direct OFCOM to modify a draft of a code of practice, other than a terrorism or CSEA code of practice, submitted under section 38(1) if the Secretary of State believes that modifications are required for exceptional reasons relating to—
(a) national security,
(b) public safety,
(c) public health, or
(d) relations with the government of a country outside the United Kingdom.”
Member’s explanatory statement
This amendment (together with other amendments to this Clause in my name) sets out the circumstances in which the Secretary of State can direct OFCOM to modify a draft of a code of practice.
135: Clause 39, page 41, line 9, after second “a” insert “terrorism or CSEA”
Member’s explanatory statement
This amendment is consequential on the other amendments to this Clause in my name.
136: Clause 39, page 41, line 12, leave out “public policy” and insert “national security or public safety”
Member’s explanatory statement
This amendment removes the ability of the Secretary of State to direct OFCOM to modify a draft of a code of practice for public policy reasons.
137: Clause 39, page 41, line 13, leave out paragraph (b) and insert—
“(b) for exceptional reasons relating to public health or relations with the government of a country outside the United Kingdom.”
Member’s explanatory statement
This amendment (together with other amendments to this Clause in my name) sets out the circumstances in which the Secretary of State can direct OFCOM to modify a draft of a code of practice.
Amendments 134 to 137 agreed.
Amendment 138
Moved by
138: Clause 39, page 41, line 37, at end insert “, and
(c) must be published, except where the Secretary of State considers that doing so would have the effect mentioned in paragraph (b).”
Member’s explanatory statement
This amendment requires a direction given under Clause 39 to be published except in cases where the Secretary of State considers that to do so would be against the interests of national security, public safety or relations with the government of a country outside the United Kingdom.
Amendment 139 (to Amendment 138) not moved.
Amendment 138 agreed.
Amendments 140 and 141 not moved.
Amendments 142 and 143
Moved by
142: Clause 39, page 42, line 2, at end insert—
“(ca) publish the document, and”
Member’s explanatory statement
This amendment requires OFCOM to publish a document submitted to the Secretary of State in response to the Secretary of State giving a direction under this Clause.
143: Clause 39, page 42, line 8, after “subsection” insert “(A1), (B1),”
Member’s explanatory statement
This amendment is consequential on the other amendments to this Clause in my name.
Amendments 142 and 143 agreed.
Amendments 144 and 145 not moved.
Clause 40: Procedure for issuing codes of practice following direction under section 39
Amendments 146 and 147
Moved by
146: Clause 40, page 42, line 34, leave out “(1)(a)” and insert “(A1), (B1) or (1)(b)”
Member’s explanatory statement
This amendment is consequential on the amendments made to Clause 39 in my name.
147: Clause 40, page 42, line 36, leave out “(b)” and insert “(a)”
Member’s explanatory statement
This amendment is consequential on the amendments made to Clause 39 in my name.
Amendments 146 and 147 agreed.
Clause 43: Minor amendments of codes of practice
Amendment 148 not moved.
Clause 47: OFCOM’s guidance about certain duties in Part 3
Amendments 149 and 150
Moved by
149: Clause 47, page 48, line 11, at end insert—
“(A1) OFCOM must produce guidance for providers of Category 1 services to assist them in complying with their duties set out in section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2)).”
Member’s explanatory statement
This amendment requires OFCOM to produce guidance to assist providers of Category 1 services in carrying out their assessments as required by the new Clause proposed after Clause 11 in my name.
150: Clause 47, page 48, line 20, after “subsection” insert “(A1) or”
Member’s explanatory statement
This amendment requires OFCOM to consult the Information Commissioner before producing guidance mentioned in the preceding amendment in my name.
Amendments 149 and 150 agreed.
Clause 48: OFCOM’s guidance: content that is harmful to children and user empowerment
Amendment 151
Moved by
151: Clause 48, page 48, line 33, leave out “12(9)” and insert “(User empowerment duties: interpretation)”
Member’s explanatory statement
This amendment is consequential on the splitting up of Clause 12 into two Clauses.
Amendment 151 agreed.
Amendment 152
Moved by
152: After Clause 48, insert the following new Clause—
“OFCOM’s guidance about protecting women and girls
(1) OFCOM must produce guidance for providers of Part 3 services which focuses on content and activity—
(a) in relation to which such providers have duties set out in this Part or Part 4, and
(b) which disproportionately affects women and girls.
(2) The guidance may, among other things—
(a) contain advice and examples of best practice for assessing risks of harm to women and girls from content and activity mentioned in subsection (1), and for reducing such risks;
(b) refer to provisions contained in a code of practice under section 36 which are particularly relevant to the protection of women and girls from such content and activity.
(3) Before producing the guidance (including revised or replacement guidance), OFCOM must consult—
(a) the Commissioner for Victims and Witnesses,
(b) the Domestic Abuse Commissioner, and
(c) such other persons as OFCOM consider appropriate.
(4) OFCOM must publish the guidance (and any revised or replacement guidance).”
Member’s explanatory statement
This new Clause requires OFCOM to produce and publish a guidance document focusing on online content and activity which disproportionately affects women and girls.
Lord Parkinson of Whitley Bay (Con)

My Lords, as we discussed in Committee, the Bill contains strong protection for women and girls and places duties on services to tackle and limit the kinds of offences and online abuse that we know disproportionately affect them. His Majesty’s Government are committed to ensuring that women and girls are protected online as well as offline. I am particularly grateful to my noble friend Lady Morgan of Cotes for the thoughtful and constructive way in which she has approached ensuring that the provisions in the Bill are as robust as possible.

It is with my noble friend’s support that I am therefore pleased to move government Amendment 152. This will create a new clause requiring Ofcom to produce guidance that summarises, in one clear place, measures that can be taken to tackle the abuse that women and girls disproportionately face online. This guidance will relate to regulated user-to-user and search services and will cover content regulated under the Bill’s framework. Crucially, it will summarise the measures in the Clause 36 codes for Part 3 duties, namely the illegal and child safety duties. It will also include a summary of platforms’ relevant Part 4 duties—for example, relevant terms of service and reporting provisions. This will provide a one-stop shop for providers.

Providers that adhere to the codes of practice will continue to be compliant with the duties. However, this guidance will ensure that it is easy and clear for platforms to implement holistic and effective protections for women and girls across their various duties. Any company that says it is serious about protecting women and girls online will, I am sure, refer to this guidance when implementing protections for its users.

Ofcom will have the flexibility to shape the guidance in a way it deems most effective in protecting women and girls online. However, as outlined in this amendment, we expect that it will include examples of best practice for assessing risks of harm to women and girls from content and activity, and how providers can reduce these risks and emphasise provisions in the codes of practice that are particularly relevant to the protection of women and girls.

To ensure that this guidance is effective and makes a difference, the amendment creates a requirement on Ofcom to consult the Domestic Abuse Commissioner and the Victims’ Commissioner, among other people or organisations it considers appropriate, when it creates this guidance. Much like the codes of practice, this will ensure that the views and voices of experts on the issue, and of women, girls and victims, are reflected. This amendment will also require Ofcom to publish this guidance.

I am grateful to all the organisations that have worked with us and with my noble friend Lady Morgan to get to this point. I hope your Lordships will accept the amendment. I beg to move.

Baroness Morgan of Cotes (Con)

My Lords, I will speak very briefly to this amendment; I know that the House is keen to get on to other business today. I very much welcome the amendment that the Government have tabled. My noble friend the Minister has always said that they want to keep women and girls safe online. As has been referred to elsewhere, the importance of making our digital streets safer cannot be overestimated.

As my noble friend said, women and girls experience a disproportionate level of abuse online. That is now recognised in this amendment, although this is only the start, not the end, of the matter. I thank my noble friend and the Secretary of State for their engagement on this issue. I thank the chief executive and the chair of Ofcom. I also thank the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester, who I know cannot be here today, and the noble Lord, Lord Knight, who signed the original amendment that we discussed in Committee.

My noble friend has already talked about the campaigners outside the Chamber who wanted there to be specific mention of women and girls in the Bill. I thank Refuge, the 100,000 people who signed the End Violence Against Women coalition’s petition, BT, Glitch, Carnegie UK, Professor Lorna Woods, the NSPCC and many others who made the case for this amendment.

As my noble friend said, this is Ofcom guidance. It is not necessarily a code of practice, but it is still very welcome because it is broader than just the specific offences that the Government have legislated on, which I also welcome. As he said, this puts all the things that companies, platforms and search engines should be doing to protect women and girls online in one specific place. My noble friend mentioned holistic protection, which is very important.

There is no offline/online distinction these days. Women and girls should feel safe everywhere. I also want to say, because I know that my noble friend has had a letter, that this is not about saying that men and boys should not be safe online; it is about recognising the disproportionate levels of abuse that women and girls suffer.

I welcome the fact that, in producing this guidance, Ofcom will have to consult with the Domestic Abuse Commissioner and the Victims’ Commissioner and more widely. I look forward, as I am sure do all the organisations I just mentioned, to working with Ofcom on the first set of guidance that it will produce. It gives me great pleasure to have signed the amendment and to support its introduction.

Baroness Fox of Buckley (Non-Afl)

My Lords, I know that we do not have long and I do not want to be churlish. I am not that keen on this amendment, but I want to ask a question in relation to it.

I am concerned that there should be no conflation in the best practice guidance between the actual, practical problems of, for example, victims of domestic abuse being stalked online, which is a threat to their safety, or threatened with physical violence—I understand that—and abuse. Abuse is horrible to be on the receiving end of, but it is important for freedom of thought and freedom of speech that we maintain the distinction between words and action. It is important not to overreact or frighten young women by saying that being shouted at is the same as being physically abused.

17:15
I also want to discuss the elephant in the room. Many of us have experienced a huge increase in misogynistic abuse over the past year or so. It is actually due to beliefs, rather than to our being women, but it specifically relates to women who insist that we believe in biological sex, as distinct from gender identity. While there are those who do not want this issue raised, it has become one of the key issues for women when people—trans activists, very often men—attack you for being a woman: attack your physical being and tell you that you cannot claim the word “mother” or “woman” for yourself, and so on and so forth. We have seen it in high-profile cases such as those involving Rosie Duffield MP and Joanna Cherry MP, who have been harassed and treated incredibly badly online because of their gender-critical views.
I ask that the consultations bear that in mind and that we do not ignore the contemporary situation; and that the consultations are not confined, therefore, to the domestic abuse commissioner or the victims’ commissioner but include, for example, LGB Alliance and Sex Matters, and that the Government bring this into scope. I appreciate that the Minister says that the Government cannot interfere in Ofcom, but the Government are saying in this amendment that Ofcom should set up this code. I therefore urge that Ofcom broaden its consultation to take into account the savage online attack on the rights of women who are gender critical due to our belief, ironically, in women as a distinct biological category.
Baroness Kidron (CB)

My Lords, I rise briefly to support the noble Baroness, Lady Morgan, to welcome the government amendment and to say that this is a moment of delight for many girls—of all varieties. I echo the noble Baroness, Lady Fox, on the issue of having a broad consultation, which is a good idea. While our focus during the passage of this Bill was necessarily on preventing harm, I hope this guidance will be part of the rather more aspirational and exciting part of the digital world that allows young people to participate in social and civic life in ways that do not tolerate abuse and harm on the basis of their gender. In Committee, I said that we have a duty not to allow digital tech to be regressive for girls. I hope that this is a first step.

Baroness Burt of Solihull (LD)

My Lords, on behalf of my party, all the groups mentioned by the noble Baroness, Lady Morgan, and potentially millions of women and girls in this country, I briefly express my appreciation for this government amendment. In Committee, many of us argued that a gender-neutral Bill would not achieve strong enough protection for women and girls as it would fail to recognise the gendered nature of online abuse. The Minister listened, as he has on many occasions during the passage of the Bill. We still have differences on some issues—cyberflashing, for instance—but in this instance I am delighted that he is amending the Bill, and I welcome it.

Why will Ofcom be required to produce guidance and not a code, as in the amendment originally tabled by the noble Baroness, Lady Morgan? Is there a difference, or is it a case of a rose by any other name? Is there a timescale by which Ofcom should produce this guidance? Are there any plans to review Ofcom’s guidance once produced, just to see how well it is working?

We all want the same thing: for women and girls to be free to express themselves online and not to be harassed, abused and threatened as they are today.

Baroness Merron (Lab)

My Lords, this very positive government amendment acknowledges that there is not equality when it comes to online abuse. We know that women are 27 times more likely than men to be harassed online, that two-thirds of women who report abuse to internet companies do not feel heard, and that three out of four women change their behaviour after receiving online abuse.

Like others, I am very glad to have added my name to support this amendment. I thank the Minister for bringing it before your Lordships’ House and for his introduction. It will place a requirement on Ofcom to produce and publish guidance for providers of Part 3 services in order to make online spaces safer for women and girls. As the noble Baroness, Lady Morgan, has said, while this is not a code of practice—and I will be interested in the distinction between the code of practice that was being called for and what we are expecting now—it would be helpful perhaps to know when we might expect to see it. As the noble Baroness, Lady Burt, just asked, what kind of timescale is applicable?

This is very much a significant step for women and girls, who deserve and seek specific protections because of the disproportionate amount of abuse received. It is crucial that the guidance take a holistic approach which focuses on prevention and tech accountability, and that it is as robust as possible. Can the Minister say whether he will be looking to the model of the Violence against Women and Girls Code of Practice, which has been jointly developed by a number of groups and individuals including Glitch, the NSPCC, 5Rights and Refuge? It is important that this be got right, that we see it as soon as possible and that all the benefits can be felt and seen.

Lord Parkinson of Whitley Bay (Con)

I am very grateful to everyone for the support they have expressed for this amendment both in the debate now and by adding their names to it. As I said, I am particularly grateful to my noble friend Lady Morgan, with whom we have worked closely on it. I am also grateful for her recognition that men and boys also face harm online, as she rightly points out. As we discussed in Committee, this Bill seeks to address harms for all users but we recognise that women and girls disproportionately face harm online. As we have discussed with the noble Baroness, Lady Merron, women and girls with other characteristics such as women of colour, disabled women, Jewish women and many others face further disproportionate harm and abuse. I hope that Amendment 152 demonstrates our commitment to giving them the protection they need, making it easy and clear for platforms to implement protections for them across all the wide-ranging duties they have.

The noble Baroness, Lady Burt of Solihull, asked why it was guidance and not a code of practice. Ofcom’s codes of practice will set out how companies can comply with the duties and will cover how companies should tackle the systemic risks facing women and girls online. Stipulating that Ofcom must produce specific codes for multiple different issues could, as we discussed in Committee, create duplication between the codes, causing confusion for companies and for Ofcom.

As Ofcom said in its letter to your Lordships ahead of Report, it has already started the preparatory work on the draft illegal content and child sexual abuse and exploitation codes. If it were required to create a separate code relating to violence against women and girls, this preparatory work would need to be revised, so there would be the unintended—and, I think, across the House, undesired—consequence of slowing down the implementation of these vital protections. I am grateful for the recognition that we and Ofcom have had on that point.

Instead, government Amendment 152 will consolidate all the relevant measures across codes of practice, such as on illegal content, child safety and user empowerment, in one place, assisting platforms to reduce the risk of harm that women and girls disproportionately face.

On timing, at present Ofcom expects that this guidance will be published in phase 3 of the implementation of the Bill, which was set out in Ofcom’s implementation plan of 15 June. This is when the duties in Part 4 of the Bill, relating to terms of service and so on, will be implemented. The guidance covers the duties in Part 4, so for guidance to be comprehensive and have the most impact in protecting women and girls, it is appropriate for it to be published during phase 3 of the Bill’s implementation.

The noble Baroness, Lady Fox, mentioned the rights of trans people and the rights of people to express their views. As she knows, gender reassignment and religious or philosophical belief are both protected characteristics under the Equality Act 2010. Sometimes those are in tension, but they are both protected in the law.

With gratitude to all the noble Lords who have expressed their support for it, I commend the amendment to the House.

Baroness Fox of Buckley (Non-Afl)

The Minister did not quite grasp what I said but I will not keep the House. Would he be prepared to accept recommendations for a broader consultation—or who do I address them to? It is important that groups such as the Women’s Rights Network and others, which suffer abuse because they say “I know what a woman is”, are talked to in a discussion on women and abuse, because that would be appropriate.

Lord Parkinson of Whitley Bay (Con)

I am sorry—yes, the noble Baroness made a further point on consultation. I want to reassure her and other noble Lords that Ofcom has the discretion to consult whatever body it considers appropriate, alongside the Victims’ Commissioner, the Domestic Abuse Commissioner and others who I mentioned. Those consultees may not all agree. It is important that Ofcom takes a range of views but is able to consult whomever. As I mentioned previously, Ofcom and its officers can be scrutinised in Parliament through Select Committees and in other ways. The noble Baroness could take it up directly with them but could avail herself of those routes for parliamentary scrutiny if she felt that her pleas were falling on deaf ears.

Amendment 152 agreed.
Clause 49: “Regulated user-generated content”, “user-generated content”, “news publisher content”
Amendment 152A
Moved by
152A: Clause 49, page 49, line 22, at end insert “including user generated or controlled characters and objects with which user characters interact in visual or audio environments within which users interact”
Member’s explanatory statement
This amendment seeks to probe whether the Bill sufficiently covers certain harmful content users may encounter in services, for example in the metaverse.
Lord Clement-Jones (LD)

My Lords, we had a pretty extensive future-proofing debate in Committee, which I was sadly unable to take part in, but I start this debate with a sinking feeling about the scope of the Bill. This amendment relates to the metaverse in particular.

In metaverse or game-type settings, users interact in a visual or audio environment that is wholly or in part created by the service provider. An analogy might be that the service provider supplies an immersive stage environment for people to act upon, complete with scenery, computer-generated props and characters, some of which could be harmful. The environment created or enabled by the service provider could itself be harmful to children and even adults—for instance, a World War II concentration camp, a sex shop or a Ku Klux Klan rally; at least one online game has allowed people to play the role of an Auschwitz camp guard.

I am particularly influenced by a report from the Center for Countering Digital Hate, Horizon Worlds Exposed, and the research for it, which was carried out by the online CSEA covert intelligence team. This may have been cited earlier but they found that minors are routinely harassed and exposed to adult content on Meta’s flagship virtual reality social network, Horizon Worlds. The research follows Meta’s announcements that Horizon Worlds would be opening up to 13 to 17 year-olds, showing that it is already failing to prevent minors accessing mature content, despite a supposed ban on them accessing its VR applications.

17:30
I am sure that the Minister is familiar with the OCCIT report, which is concerning. There are many more lurid stories. There was a Mail Online story— I hasten to add that I am not a regular reader of the Mail Online—with the great headline, “Chilling chats self-styled ‘assassin’ had with AI bot ‘girlfriend’ who encouraged him to kill the late Queen at Windsor Castle”. We want to avoid such lurid headlines, if possible, but there is a serious point behind this. The Bill acts on user-generated content, but it might not catch the features provided as part of the service, even though it covers user interactions in that environment. Recent government amendments proposed by the Minister try to catch bots, but they do not encompass the static components I have described.
Such issues ought to be caught by the general child safety duty. Possibly, adults should be allowed to use the user empowerment tools in Clause 12 to protect themselves from the risks that Parliament has identified, if they arise from such features—at least, they should be made aware of the potential risks. The illegal content duty might apply there, but there would be difficulties with what is described as “mens rea” or intent. It would be good to get a clear statement from the Government at the Dispatch Box.
When I tried to put down an amendment that attempted to include the provider environment, I discovered from the Public Bill Office that it was out of scope. I think that means that there are big question marks over the Bill in that sense. All I could get down was this amendment to include
“user generated or controlled characters and objects with which user characters interact in visual or audio environments within which users interact”.
That speaks volumes about what is excluded from the Bill, and it makes the point that I am trying to make: provider content, such as anti-Semitic slogans on backdrops in an immersive environment, is not entirely within the scope of the Bill. My amendment, as drafted, will not tackle that issue, because I am not able to put down an amendment which might.
I believe that there is a case to answer by the Government. Things are moving very fast—I entirely understand that it is difficult to keep up, in a sense, with the changes—but the metaverse should not be beyond the scope of the Bill and nor should the environments created by it. If we do not include that kind of provider environment in its scope, we will fail our children and vulnerable adults and we will be falling down on the job. We have waited five years to get the Bill through, so to knowingly pass a Bill without the right provisions and the proper future-proofing would be grossly negligent. I beg to move.
Baroness Finlay of Llandaff (CB)

My Lords, I am most grateful to the noble Lord, Lord Clement-Jones, for tabling the amendment. If I had been quicker, I would have added my name to it, because he may— I use the word “may” advisedly, because I am not sure—have identified quite a serious gap in terms of future-proofing. As far as I understand it, in a somewhat naive way, the amendment probes whether there is a gap between provider-generated content and user-generated content and whether provider-generated content could lead to a whole lot of ghastly stuff on the metaverse without any way of tackling it because it is deemed to have fallen outside the scope of the Bill.

I am grateful to Carnegie UK for having tried to talk me through this—it is pretty complicated. As a specific example, I understand that a “Decentraland” avatar pops up on gaming sites, and it is useful because it warns you about the dangers of gambling and what it can lead to. But then there is the problem about the backdrop to this avatar: at the moment, it seems to be against gambling, but you can see how those who have an interest in gambling would be quite happy to have the avatar look pretty hideous but have a backdrop of a really enticing casino with lots of lights and people streaming in, or whatever. I am not sure where that would fit, because it seems that this type of content would be provider-generated. When it comes to the metaverse and these new ways of interacting with 3D immersion, I am not clear that we have adequately caught within the Bill some of these potentially dangerous applications. So I hope that the Minister will be able to clarify it for us today and, if not, possibly to write between now and the next time that we debate this, because I have an amendment on future-proofing, but it is in a subsequent group.

Baroness Kidron (CB)

My Lords, I am interested to hear what the Minister says, but could he also explain to the House the difference in status of this sort of material in Part 5 versus Part 3? I believe that the Government brought in a lot of amendments that sorted it out and that many of us hoped were for the entire Bill, although we discovered, somewhat to our surprise, that they were only in Part 5. I would be interested if the Minister could expand on that.

Lord Knight of Weymouth (Lab)

My Lords, I am grateful to the noble Lord, Lord Clement-Jones, for raising this; it is important. Clause 49(3)(a)(i) mentions content

“generated directly on the service by a user”,

which, to me, implies that it would include the actions of another user in the metaverse. Sub-paragraph (ii) mentions content

“uploaded to or shared on the service by a user”,

which covers bots or other quasi-autonomous virtual characters in the metaverse. As we heard, a question remains about whether any characters or objects provided by the service itself are covered.

A scenario—in my imagination anyway—would be walking into an empty virtual bar at the start of a metaverse service. This would be unlikely to be engaging: the attractions of indulging in a lonely, morose drink at that virtual bar are limited. The provider may therefore reasonably configure the algorithm to generate characters and objects that are engaging until enough users then populate the service to make it interesting.

Of course, there is the much more straightforward question of gaming platforms. On Monday, I mentioned “Grand Theft Auto”, a game with an advisory age of 17—they are still children at that age—but which is routinely accessed by younger children. Shockingly, an article that I read claimed that it can evolve into a pornographic experience, where the player becomes the character from a first-person angle and receives services from virtual sex workers, as part of the game design. So my question to the Minister is: does the Bill protect the user from these virtual characters interacting with users in virtual worlds?

Lord Parkinson of Whitley Bay (Con)

I will begin with that. The metaverse is in scope of the Bill, which, as noble Lords know, has been designed to be technology neutral and future-proofed to ensure that it keeps pace with emerging technologies—we have indeed come a long way since the noble Lord, Lord Clement-Jones, the noble Lords opposite and many others sat on the pre-legislative scrutiny committee for the Bill. Even as we debate, we envisage future technologies that may come. But the metaverse is in scope.

The Bill will apply to companies that enable users to share content online or to interact with each other, as well as search services. That includes a broad range of services, such as websites, applications, social media services, video games and virtual reality spaces, including the metaverse.

Any service that enables users to interact, as the metaverse does, will need to conduct a child access test and, if it is likely to be accessed by children, will need to comply with the child safety duties. Content is broadly defined in the Bill as,

“anything communicated by means of an internet service”.

Where this is uploaded, shared or directly generated on a service by a user and able to be encountered by other users, it will be classed as user-generated content. In the metaverse, this could therefore include things like objects or avatars created by users. It would also include interactions between users in the metaverse such as chat—both text and audio—as well as images, uploaded or created by a user.

Lord Clement-Jones (LD)

My Lords, I hope I am not interrupting the Minister in full flow. He has talked about users entirely. He has not yet got to talking about what happens where the provider is providing that environment—in exactly the way in which the noble Lord, Lord Knight, illustrated.

Lord Parkinson of Whitley Bay (Con)

We talked about bots controlled by service providers before the noble Lord, Lord Knight, asked questions on this. The Bill is designed to make online service providers responsible for the safety of their users in light of harmful activities that their platforms might facilitate. Providers of a user-to-user service will need to adhere to their duties of care, which apply to all user-generated content present on their service. The Bill does not, however, regulate content published by user-to-user providers themselves. That is because the providers are liable for the content they publish on the service themselves. The one exception to this—as the noble Baroness, Lady Kidron, alluded to in her contribution—is pornography, which poses a particular risk to children and is regulated by Part 5 of the Bill.

I am pleased to reassure the noble Lord, Lord Clement-Jones, that the Bill—

Baroness Kidron (CB)

I thank the noble Lord for giving way. The Minister just said that private providers will be responsible for their content. I would love to understand what mechanism makes a provider responsible for their content?

Lord Parkinson of Whitley Bay (Con)

I will write to noble Lords with further information and will make sure that I have picked up correctly the questions that they have asked.

On Amendment 152A, which the noble Lord, Lord Clement-Jones, has tabled, I am pleased to assure him that the Bill already achieves the intention of the amendment, which seeks to add characters and objects that might interact with users in the virtual world to the Bill’s definition of user-generated content. Let me be clear again: the Bill already captures any service that facilitates online user-to-user interaction, including in the metaverse or other augmented reality or immersive online worlds.

The Bill broadly defines “content” as

“anything communicated by means of an internet service”,

so it already captures the various ways in which users may encounter content. Clause 211 makes clear that “encounter” in relation to content for the purposes of the Bill means to,

“read, view, hear or otherwise experience”

content. That definition extends to the virtual worlds which noble Lords have envisaged in their contributions. It is broad enough to encompass any way of encountering content, whether that be audio-visually or through online avatars or objects.

In addition, under the Bill’s definition of “functionality”,

“any feature that enables interactions of any description between users of the service”

will be captured. That could include interaction between avatars or interaction by means of an object in a virtual world. All in-scope services must therefore consider a range of functionalities as part of their risk assessment and must put in place any necessary measures to mitigate and manage any risks that they identify.

I hope that that provides some assurance to the noble Lord that the concerns that he has raised are covered, but I shall happily write on his further questions before we reach the amendment that the noble Baroness, Lady Finlay, rightly flagged in her contribution.

Lord Clement-Jones (LD)

I thank the Minister. I feel that we have been slightly unfair because we have been asking questions about an amendment that we have not been able to table. The Minister has perfectly well answered the actual amendment itself and has given a very positive reply—and in a sense I expected him to say what he said about the actual amendment. But, of course, the real question is about an amendment that I was unable to table.

17:45
We have those two issues about provider material, and exactly the point that the noble Baroness, Lady Kidron, made about chapter and verse for the provider liability in those circumstances. If we knew the answer to that, and that it was absolutely clear in the Bill, we would be more reassured than we have been so far. In the meantime, I beg leave to withdraw my amendment.
Amendment 152A withdrawn.
Amendments 153 to 157
Moved by
153: Clause 49, page 49, line 27, after “bot” insert “or other automated tool”
Member’s explanatory statement
This amendment, and the next two amendments in my name, make it clear that an automated tool which is not a bot - as well as a bot - may be regarded as a user for the purposes of the definition of “user-generated content”.
154: Clause 49, page 49, line 28, leave out “bot’s functions” and insert “functions of the bot or tool”
Member’s explanatory statement
See the explanatory statement to the preceding amendment in my name.
155: Clause 49, page 49, line 30, after “bot” insert “or tool”
Member’s explanatory statement
See the explanatory statement to the first amendment of this Clause in my name.
156: Clause 49, page 49, line 38, leave out “description” and insert “kind”
Member’s explanatory statement
This amendment ensures consistency of language in referring to kinds of content.
157: Clause 49, page 49, line 45, leave out from beginning to end of line 2 on page 50 and insert “, including where the publication of the content is effected or controlled by means of—
(a) software or an automated tool or algorithm applied by the provider or by a person acting on behalf of the provider, or
(b) an automated tool or algorithm made available on the service by the provider or by a person acting on behalf of the provider.”
Member’s explanatory statement
This amendment is about what counts as “provider content” for the purposes of the exemption in Clause 49(6) of the Bill (which provides that comments/reviews on provider content don’t count as regulated user-generated content). Words are added to expressly cover the case where an automated tool or algorithm is made available on the service by a provider, such as a generative AI bot.
Amendments 153 to 157 agreed.
Amendment 158
Moved by
158: Clause 49, page 50, line 17, leave out sub-paragraphs (ii) and (iii) and insert—
“(ii) is video or audio content that was originally published or broadcast by a recognised news publisher, and is not a clipped or edited form of such content (unless it is the recognised news publisher who has clipped or edited it), or
(iii) is a link to an article or item within sub-paragraph (i) or to content within sub-paragraph (ii).”
Member’s explanatory statement
This amendment revises the definition of “news publisher content” so that, in particular, online content published by a recognised news publisher that has not first been broadcast is covered by the definition.
Lord Parkinson of Whitley Bay (Con)

My Lords, as noble Lords know, His Majesty’s Government are committed to defending the invaluable role of a free media, and our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information online. That is why we have included strong protections for recognised news publishers in the Bill.

Clause 49(9) and (10) set out what is considered “news publisher content” in relation to a regulated user-to-user service, while Clause 52 sets out that news publishers’ content is exempt from search services’ duties. The government amendments clarify minor elements of these exemptions and definitions. Given the evolving consumption habits for news, recognised news publishers might clip or edit content from their published or broadcast versions to cater to different audiences and platforms. We want to ensure that recognised news publisher content is protected in all its forms, as long as that content is created or generated by the news publishers themselves.

First, our amendments clarify that any video or audio content published or broadcast by recognised news publishers will be exempt from the Bill’s safety duties and will benefit from the news publisher appeals process, when shared on platforms in scope of the Bill. These amendments ensure that old terminology works effectively in the internet age. The amendments now also make it clear that any news publisher content that is clipped or edited by the publisher itself will qualify for the Bill’s protections when shared by third parties on social media. However, these protections will not apply when a third-party user modifies that content itself. This will ensure that the protections do not apply to news publisher content that has been edited by a user in a potentially harmful way.

The amendments make it clear that the Bill’s protections apply to links to any article, video or audio content generated by recognised news publishers, clipped or edited, and regardless of the form in which that content was first published or broadcast. Taken together, these amendments ensure that our online safety legislation protects recognised news publishers’ content as intended. I hope noble Lords will support them. I beg to move.

Lord Lipsey (Lab)

My Lords, Amendments 159 and 160 are in my name and those of the noble Lord, Lord McNally, and the noble Baronesses, Lady Hollins and Lady Newlove. First, I apologise for the fact that this is the first time I have spoken on the Bill. That was not the plan: illness intervened. Anyway, I am all better now, thanks.

The purport of the amendments is simple. Content posted on social media by newspapers benefits, under the Bill as it stands, from exemption from any regulatory action by the platforms. Nowhere does the Bill set up a system for the public to complain about such pieces. Newspapers can have any complaints system they want and still benefit from the exemption. Under our amendments, the exemption would apply only to newspapers that have a system for public complaints that meets proper standards—at the very least, the complaints code must be independently set up and not under the control of newspapers, their editors or any puppet regulators they may set up.

Noble Lords will have noticed that the amendments do not say that the system must measure up to the standards required by the Press Recognition Panel and monitored by a body approved by that panel; at the moment, only Impress would qualify. We have omitted that particular way of making sure that the complaints system works not because it would not be perfectly good—it would—but because the very mention of PRP/Impress is a red rag to a bull to those who control the press, so we kept the red rag in our pockets. This, of course, says more about those who control the press than about the admirable PRP/Impress set-up, which has, within its limited practical scope, been doing a very fine job.

What the amendments do mean, however, is that newspapers cannot any more hide behind their fig leaf IPSO, the so-called Independent Press Standards Organisation. I know that some Members of your Lordships’ House are IPSO fans who fought for it tooth and nail; indeed, the noble Lord, Lord Faulks, is its chair. I pay tribute to that organisation: the political skills it has deployed in its attempts to give itself credibility have amazed even me, somebody who has been around politics for nearly 50 years. Two former Northern Ireland officials have been hired to produce whitewash reports on it: appointed by IPSO, terms of reference from IPSO and paid for by IPSO. They did their duty. Only last week—perhaps it knew the Bill was coming up in your Lordships’ House—it actually upheld a complaint: that against Jeremy Clarkson for abusing Meghan. That was an event as rare as bumping into a dodo on the streets one night: only three in 1,000 complaints are upheld by IPSO.

A more objective academic view of IPSO than mine was provided by the Media Standards Trust, a study by the academics Martin Moore and Gordon Ramsay published in 2019. It found that IPSO fell short on 25 of 38 Leveson recommendations. I am sorry—we have not heard the word “Leveson” for a while, and I am not sure we are still allowed to utter it, but I will. IPSO has never in its history established a single standards investigation. It has never fined a publisher. It and its editors set the code to suit themselves.

Ever intrepid, I once tried complaining about a case—a slam dunk case, if I may ask the House to take my word for it—against the Express about its use of something it wrongly described as a poll. It was an intriguing experience. IPSO followed the procedural rules minutely and scrupulously, if slowly. The Express obfuscated. Eventually, IPSO produced a ruling that was so bizarre and incomprehensible that I hesitate to describe it to the House and, of course, turned down my complaint. That experience is very typical. Some 1,500 people give up on their complaints every year, despairing of fighting their way through IPSO and the newspapers’ attritional system. The average complaint takes about six months to resolve.

These amendments, partly for the reasons I have already mentioned, do not attempt to specify what body can rule. It could be a body approved by the PRP or one adhering to another kosher code. What should be clear, however, is that the regulator should not be a pussycat regulator controlled by the press, as IPSO is. It should be a genuinely independent regulator with a genuinely independent code to enforce.

“You’re against free speech, Lipsey; you want state regulation”. But there is no inhibition on free speech in our amendments. They merely provide a way of hearing complaints after pieces have been printed, and the state need have nothing to do with it. Incidentally, I find great curiosity in the way in which this state regulation bogey is played about with in this debate. In fact, Ofcom is already a state regulator of many of the things that would be covered by our amendments. Nevertheless, the cry of “state regulation” is obviously red blood that the proponents of total freedom want.

I too want freedom. I spent a third of my working life as a journalist. I was deputy editor of two national newspapers and Bagehot of the Economist. I believe in press freedom to my very core. If I thought for a moment that these amendments in any way threatened press freedom, I would not be proposing them tonight, but I am perfectly certain that they would not. Instead, they would put some inhibition on newspapers planning to abuse often innocent people on their websites; not stopping them saying it but subjecting them to complaints if they do so, which would be independently adjudicated.

I, my co-signatories and my noble friends on the Front Bench are aware that a media Bill is coming up this Session, next Session or sometime sooner or later— I hope sooner, obviously. That will explicitly end the incentives for newspapers to join an independent regulatory system, such as PRP/Impress, by repealing Section 40 of the Act that gives them the incentives to do so. When we last debated these matters, my noble friend Lord Knight on the Front Bench argued that this Bill was not the right way to tackle the complaints problem, and that it could be done under the media Bill. I am pleased to say that my party, the Labour Party, has specifically pledged that it will not repeal Section 40 in any media Bill introduced if and when it takes power. I respect my noble friend Lord Knight’s argument so, for the avoidance of doubt, we shall not seek the opinion of the House on this amendment. But let the press be in no doubt: Parliament remains on the case—sometimes more intently, sometimes less intently; once agreed on the royal charter, but that has gone down the river; but always ready to act if the newspapers defeat the rights of the public to complain.

We will not finish the job tonight, nor with this Bill, but examples of egregious press behaviour continue to mount up. I know that some of them are in the past, and we were all following the recent High Court case, but they still appear to be around. The question will not go away. The Government continue to attempt to curry favour with the press—the Prime Minister even went to a Rupert Murdoch party rather than attend a climate conference—but, at the end of the day, the power of the press is declining. The force of those who argue for a better complaints system multiplies. Sooner or later, something will have to be done.

18:00
Lord McNally (LD)

My Lords, my name is also to this amendment. I am moved by a phrase used by the noble Lord, Lord Stevenson, on Monday; he said the passage of this Bill has been a “series of conversations”. So it has been. The way the Minister has engaged with the House on many of the concerns that the Bill tries to cover has been greatly to his credit.

It is somewhat unknown how much the new technologies will impact on our democracy, our privacy and the safety of our children, although they have all been discussed with great thoroughness. That is why the opt-out for recognised news publishers is something of a puzzle, unless you assume that the Government have caved in to pressure from that sector. Why should it be given this opt-out? It is partly because if you ask the press to take responsibility in any way, it becomes like Violet Elizabeth Bott in the Just William stories; it “thkweems and thkweems”—usually led by the noble Lord, Lord Black, whom I am glad to see in his place—and talks about press freedom.

My skin in this game is that I was the Minister in the Lords when the Leveson inquiry was under way and when we took action to try to implement its findings. It is interesting that at that point there was cross-party agreement in both Houses on how to implement them. I advise anybody intending to go into coalitions in future not to take the Conservative Party’s assurances on such matters totally at face value, as that cross-party agreement to implement Leveson was reneged on by the Conservative Party under pressure from the main newspaper publishers.

It was a tragedy, because the “series of conversations” that the noble Lord, Lord Stevenson, referred to will be ongoing. We will not let the press off the hook, no matter how much it wields its power. It is just over 90 years since Stanley Baldwin’s famous accusation of

“power without responsibility—the prerogative of the harlot throughout the ages”.

It is just over 30 years since David Mellor warned the press that it was in the “last chance saloon” and just over 10 years since Rupert Murdoch said that appearing before the Leveson inquiry, with a curious choice of language, was

“the most humble day of my life”.

Of course, like water off a duck’s back, once the pressure was off and the deal had been done with the Conservative Party, we could carry on on our own merry way.

It was a tragedy too because the Leveson settlement—as I think the PRP and Impress have proved—works perfectly well. It is neither state controlled nor an imposition on a free press. Like the noble Lord, Lord Lipsey, I greatly resent the idea that this is somehow an attempt to impose on a free press. It is an attempt to get the press to help the whole of our democracy and make things work properly, just as this Bill attempts to do.

Someone mentioned Rupert Murdoch’s recent summer party. The Prime Minister was not the only one who went—so did the leader of the Opposition. I like to think that Mr Attlee would not have gone. I am not sure that my old boss, Jim Callaghan, would have gone. I do not think that either would have flown half way around the world, as Tony Blair did, to treat with him. The truth is that, over the last decade or so, in some ways the situation has got worse. Politicians are more cowed by the press. When I was a Minister and we proposed some reasonably modest piece of radical change, I was told by my Conservative colleague, “We’ll not get that through; the Daily Mail won’t tolerate it”. That pressure on politics means we need politicians with the guts to resist it.

Those who want a genuinely free press would not leave this festering wound. I will not join in the attack on the noble Lord, Lord Faulks, because we worked together very well in coalition. I would prefer to see IPSO reform itself to become Leveson-compliant. That would not bring any of the dangers that we will hear about from the noble Lord, Lord Black, but it would give us a system of press regulation that we could all agree with.

On Section 40, I remember well the discussions about how we would give some incentive to join. A number of my colleagues feel uncomfortable about Section 40 making even the winners pay, but the winner pays only if they are not within a Leveson-compliant system. That was, perhaps innocently, thought of as a carrot to bring the press in, though, of course, it does not read easily. Frankly, if Section 40 were to go but IPSO became Leveson-compliant, that would be a fair deal.

This Bill leaves us with some very dangerous loopholes. Some of the comments underneath in the press and, as the Minister referred to, the newsclips that can be added can be extremely dangerous if children are exposed to them.

There are many other loopholes that this genuflection to press power is going to leave in the Bill and which will lead to problems in the future. Rather than launch another attack—because you can be sure another case will come along or another outrage will happen, and perhaps this time, Parliament will have the guts to deal with it—it would be far better if the media itself saw Leveson for what it was: a masterful, genuine attempt to put a free press within the context of a free society and protect the individuals and institutions in that society in a way that is in all our interests. As the noble Lord, Lord Lipsey, said, we are not pushing this tonight, but we are not going to go away.

Viscount Colville of Culross (CB)

My Lords, I have been a journalist my whole career and I have great respect for the noble Lords who put their names to Amendments 159 and 160. However, I cannot support another attempt to lever Section 42 of the Crime and Courts Act into the Bill. In Committee I put my name to Amendment 51, which aims to protect journalism in the public interest. It is crucial to support our news outlets, in the interests of democracy and openness. We are in a world where only a few newspapers, such as the New York Times, manage to make a profit from their digital subscribers. I welcome the protection provided by Clause 50; it is much needed.

In the past decade, the declining state of local journalism has meant there is little coverage of magistrates’ courts and council proceedings, the result being that local public servants are no longer held to account. At a national level, newspapers are more and more reluctant to put money into investigations unless they are certain of an outcome, which is rarely the case. Meanwhile, the tech platforms are using newspapers’ contents for free or paying them little money, while disaggregating news content on their websites so the readers do not even know its provenance. I fear that the digital era is putting our legacy media, which has long been a proud centrepiece of our democracy, in great danger. The inclusion of these amendments would mean that all national newspapers and most local media would be excluded from the protections of the clause. The Bill, which is about regulating the digital world, should not be about trying to limit the number of newspapers and news websites covered by the protections of Clause 50; it would threaten democracy at a local and national level.

Lord Black of Brentwood (Con)

My Lords, I am very pleased to say a few words, because I do not want to disappoint my good friend the noble Lord, Lord McNally, who has obviously read the text of my speech before I have even delivered it. I declare my interests as deputy chairman of the Telegraph Media Group and a director of the Regulatory Funding Company, and note my other interests as set out in the register.

It will not come as a surprise that I oppose Amendments 159 and 160. I am not going to detain your Lordships for long; there are other more important things to talk about this evening than this seemingly never-ending issue, about which we had a good discussion in Committee. I am sorry that the two noble Lords were indisposed at that time, and I am glad to see they are back on fighting form. I am dispirited that these amendments surfaced in the first place as I do not think they really have anything to do with online safety and the protection of children. This is a Bill about the platforms, not the press. I will not repeat all the points we discussed at earlier stages. Suffice it to say that, in my view, this is not the time and the place to seek to impose what would be statutory controls on the press, for the first time since that great liberal, John Locke, led the charge for press freedom in 1695 when the Licensing Acts were abolished. Let us be clear: despite what the two noble Lords said, that is what these amendments would do, and I will briefly explain why.

These amendments seek to remove the exemption for news publishers from an onerous statutory regime overseen by Ofcom, which is, as the noble Lord, Lord Lipsey, said, a state regulator, unless they are part of an approved regulator. Yet no serious publisher, by which I mean the whole of the national and regional press, as the noble Viscount, Lord Colville, said—including at least 95% of the industry, from the Manchester Evening News to Cosmopolitan magazine—is ever going to join a regulator which is approved by the state. Even that patron saint of press controls, Sir Brian Leveson, conceded that this was a “principled position” for the industry to take. The net effect of these amendments would be, at a stroke, to subject virtually the entire press to state regulation—a momentous act wholly inimical to any definition of press freedom and free speech—and with very little discussion and absolutely no consultation.

18:15
The Bill would then become not the Online Safety Act but the state regulation of the press Act, changed entirely from something in which we should take great pride to something deeply controversial and condemned across the globe. I say to the noble Lord, Lord McNally, that on a basic administrative level, it would no longer be possible to certify that the legislation accords with the Human Rights Act, as statutory press controls of this sort have always been found to be in contravention of the ECHR’s provisions on freedom of expression. That is why this is not the place for so fateful a piece of legislation; nor is it the time. As the noble Lord, Lord Lipsey, said, I hope that, within months, we will have a media Bill which will contain provisions to repeal the odious Section 40 of the Crime and Courts Act, in line with the Government’s manifesto commitment. If noble Lords do wish to discuss press regulation and re-open issues which were settled a decade ago—and, indeed, that relate to events which took place two decades ago—that is the time to do it, not here and now.
I am pleased to hear from the noble Lord that he will not divide the House, because to do so would be to hijack this important legislation, in which we should all take great pride, and turn it into something it was never intended to be.
Lord Faulks (Non-Afl)

My Lords, I declare my interest—although I think it has already been declared for me by the noble Lords, Lord McNally and Lord Lipsey—as the chair of the Independent Press Standards Organisation.

We had this debate in Committee, although not with the same actors; I am glad to see both of them now back in their places and restored to health. However, I cannot welcome all the comments they made, particularly not those of the noble Lord, Lord Lipsey, critical as he was of IPSO. I should tell the House that IPSO is not on the side of the press. It is not on anybody’s side: it is an independent organisation for the regulation of the press that regulates, by circulation, some 95% of both national and regional newspapers.

The noble Lord, Lord Lipsey, spoke of how ineffective we were as an organisation and was rather disparaging about the reviews of IPSO’s governance and operations. I ought, at the very least, to maintain a defence of Sir Bill Jeffrey, a very distinguished civil servant in the Ministry of Defence who recently carried out a report on IPSO. I hope that Members of your Lordships’ House, particularly the noble Lords, Lord Lipsey and Lord McNally, will read the report to see in what ways they consider IPSO is still not showing its independence, but I would very much defend Sir Bill Jeffrey’s independence and the way in which he approached the task. I think it unfortunate that he was attacked in the way he was by the noble Lord. I give way.

Lord Lipsey (Lab)

Does the noble Lord agree that a report which gives as part of its evidence conversations with a sample of precisely 12 complainants cannot be taken seriously?

Lord Faulks (Non-Afl)

The report must be read as a whole. I do not accept at all what the noble Lord has said. It is worth visiting the IPSO website, because he was very disparaging about the number of complaints that were upheld. IPSO is very transparent; its website shows all the decisions that were reached and the way in which they were reached. I invite those who doubt its independence to look at the constituent elements of those who are on the complaints committee and the board, and all the published decisions, in order to decide whether IPSO is indeed in the pockets of the press, which seemed to be the suggestion made by both noble Lords.

Of course, the approved regulator, Impress, has very little work to do. I am sure it does its work highly conscientiously. The code by which it regulates is remarkably similar to the editors’ code, which is produced by the industry, it is true, with contributions from all sorts of people. It varies from year to year. There is very little criticism of the editors’ code. It provides a very sensible and balanced view to make the press accountable, allowing the complaints committee to decide whether there has been a violation of the code.

The noble Lord, Lord Lipsey, said that at last it has found the press to be in breach of that code in the recent complaint. It was interesting that the complaints body which I chair was alleged to not be independent of the press. It was roundly criticised by the press for coming to that decision—by the Times, the Telegraph and the Daily Mail. At the same time, it is said that the organisation which I chair is not independent. It is of course independent and will continue to be so.

As for Section 40, before I had anything to do with press regulation, I did not like it. As a lawyer, the idea of somebody having a free hit against anybody is unattractive. Whatever you think of press regulation, I do not think that Section 40 should commend itself to anybody. As they have promised for some time, the Government are quite right to include it in the media Bill, which is to come before your Lordships’ House in due course. It has been a sword of Damocles hanging over the industry. It is not helpful, and I hope that it is repealed. I understand that the Labour Party and perhaps the Liberal Democrats will bring back something of that sort. I understand they may be opposing it when it comes into the media Bill, but that is a matter for them in due course.

Of course, the press should be accountable. Of course, it should be properly regulated. The idea of an independent regulator is to provide reassurance that it is being regulated, as opposed to, until this Bill becomes law, social media—which is not regulated—which provides a source for news which is considerably less reliable than all those newspapers which are subject to regulation.

This is not the occasion to go into further debates about Leveson, but it is perhaps worth rereading the Leveson report and the conclusions that Sir Brian reached—which I have done recently. It must be seen, as all reports, as very much of its time. It is particularly interesting to see the extent to which he promoted and advanced the cause of arbitration. Alternative dispute resolution is very much at the centre of what the legal profession as a whole, and Sir Brian Leveson and his committee in particular, advance as a much better way to resolve disputes. There is an arbitration scheme provided by IPSO, as noble Lords and the House may know. Of course, that is an option which we would encourage people to use—consistent with what Sir Brian and his committee recommended. It is not a substitute for going to court, and if people want to, they should be allowed to do so. However, I think there is a case for courts considering having directions whereby, at first, somebody seeking relief in the court should show that they have exhausted alternative remedies, including alternative dispute resolution. I am in favour of that.

On the idea of being Leveson-compliant—I do not think Sir Brian Leveson particularly likes that expression. He made various recommendations, many of which are reflected in what IPSO does now. I understand there is a great deal of history in this debate. I remember the debates myself. No doubt, we will return to them in due course, but I think we should fight today’s battles, and not the battles of 10 years ago or longer. I think the press is much more accountable and responsible than it was. Of course, as parliamentarians, we will carefully watch what the press do and consider carefully whether this exemption is merited. However, I do not think that this amendment is justified and I hope that the Government do not support it.

Lord Allan of Hallam (LD)

My Lords, I want to bring the tone of the debate down somewhat to talk about government Amendments 158 and 161 in a rather nerdier fashion. I hope that the House will be patient with me as I do that.

The Minister said that these two amendments introduce some “minor changes” that would make the Bill work as intended. I want to explore whether they are rather more significant than the Minister has given them credit for, and whether they may have unintended consequences. As I understand it, the purpose of the amendments is to ensure that all forms of video and audio content, in long form or short form, whether originally broadcast or made exclusively for social media, will now benefit from the news publisher exemptions.

Particularly thinking about this from a social media point of view—the noble Lord, Lord Faulks, just made the point about news publishers such as newspapers—when we have been looking at the Bill and the news publisher exemption, we have been thinking of the BBC and newspapers. We have been thinking a lot less about people who regard themselves to be news publishers but produce things exclusively for social media—often in a clickbait fashion, using a lot of short-form material. As I read these amendments, they are saying very clearly that this kind of material will benefit from the news publisher exemption. That opens up a whole series of questions we must ask ourselves about whether that will have unintended consequences.

Looking at this in the context of what it takes to be registered as a news publisher in Clause 50, the noble Viscount, Lord Colville, referred to the fact that there is an intention and a view that Clause 50 should be kept broad so that people can register as news publishers. Clearly, that is good for media diversity, but if we look at those tests, they are tests that I think a lot of organisations could pass. We must ask ourselves who might try to establish themselves as a recognised news publisher. They would need to have an office in the United Kingdom. They would also need to apply a standards code, but Clause 50(6)(b) says that the standards code can be their own standards code—it does not have to be anyone else’s.

I am not going to get into a debate about who should be the press regulator; that is for other noble Lords. As I read it, these internet services could pass the Clause 50(2) test by establishing the office and meeting a few basic requirements, then under Clause 50(6)(b) say, “I’ve got a standards code. It’s my standards code. I’ve written it—on the back of an envelope but it’s a standards code”. Then we need to think about who might want to take advantage of that material. My reading of the Bill, thinking about intention, is that services such as Breitbart News—which is not my cup of tea, but is a recognised news publisher—would pass the test and would be able to establish themselves as a news publisher in the UK, benefiting from the exemptions. Whether or not I agree with it, I can see that is a reasonable unintended outcome.

My concern is about other services, such as Infowars, which I am sure everybody is familiar with. It is a service that has caused untold harm and has been sued in the US courts for defamation—which is a pretty high bar. Infowars has clearly caused so much harm that it has found itself on the wrong end of defamation lawsuits in the United States. I do not think it should in any way be our intention that a service such as Infowars should be able to benefit from the special privileges granted to news publishers under the legislation. I know that it is hard to draw lines, and I am not expecting the Minister to say at the Dispatch Box exactly where the line should be drawn. However, I think that without citing examples such as that, we risk not testing the legislation to destruction—which is precisely what we should be doing here—and ending up in a scenario where we have created a news publisher exemption that could be taken advantage of by the wrong organisations. Someone has to draw a line and make a classification.

As we create this news publisher exemption, it is incumbent on us to describe it to people out there in vernacular terms they would understand. My understanding is that the BBC, the Daily Mail, Breitbart News—all those are in. We expect them to be able to pass the Clause 50 test and we have no problem with that. Russia Today, Infowars and a whole host of other services that brand themselves news but are incredibly harmful and destructive to society and individuals—we would want them to fail the Clause 50 test.

I hope the Minister will at least acknowledge that there is going to be a challenge around bad services run by bad people claiming to be news publishers under Clause 50. I hope he will agree that it is not our intention to give publisher privileges to services such as Infowars that cause so much harm to society.

18:30
I hope the Minister will be able to at least suggest to us where there may be some mechanism for disputes, because I do not see that in the Bill at the moment. We are leaving it to providers to make this judgment and then, presumably, to the disgruntled excluded service to make a complaint and go after the provider. When there is a dispute—when social media companies do what I think we want them to do, which is to not grant special privileges and keep content up when it is coming from these awful people who are causing harm to society—what will be the mechanism for resolving that?
At some point, someone has to say, “You’ve got it right: you shouldn’t be able to classify that as a recognised news publisher”, or, “You’ve got it wrong: actually, the British Government, in all their glory, stand behind the fact that Infowars should be recognised and given these special privileges”. Those are really important questions we have to ask about how this clause will work in practice. Amendments 158 and 161, because they allow explicitly for short-form video made especially for social media, will come to be seen as quite instrumental and not at all minor.
Baroness Fox of Buckley (Non-Afl)

My Lords, I am completely opposed to Amendments 159 and 160, but the noble Lords, Lord Faulks and Lord Black, and the noble Viscount, Lord Colville, have explained the issues perfectly. I am fully in agreement with what they said. I spoke at length in Committee on that very topic. This is a debate we will undoubtedly come back to in the media Bill. I, for one, am extremely disappointed that the Labour Party has said that it will not repeal Section 40. I am sure that these issues will get an airing elsewhere. As this is a speech-limiting piece of legislation, as was admitted earlier this week, I do not want any more speech limiting. I certainly do not want it to be a media freedom-limiting piece of legislation on top of that.

I want to talk mainly about the other amendments, Amendments 158 and 161, but approach them from a completely different angle from the noble Lord, Lord Allan of Hallam. What is the thinking behind saying that the only people who can clip content from recognised news publishers are the news publishers? The Minister mentioned in passing that there might be a problem of editing them, but it has become common practice these days for members of the public to clip from recognised news publishers and make comments. Is that not going to be allowed? That was the bit that completely confused me. It is too prescriptive; I can see all sorts of people getting caught by that.

The point that the noble Lord, Lord Allan of Hallam, made about what constitutes a recognised news publisher is where the issue gets quite difficult. The point was made about the “wrong” organisations, but I want to know who decides what is right and wrong. We might all nod along when it comes to Infowars and RT, but there are lots of organisations that would potentially fail that test. My concern is that they would not be able to appeal when they are legitimate news organisations, even if not to everybody’s taste. Because I think that we already have too much speech limiting in the Bill, I do not want any more. This is important.

When it comes to talking about the “wrong” organisations, I noticed that the noble Lord, Lord McNally, referred to people who went to Rupert Murdoch’s parties. I declare my interests here: I have never been invited or been to a Rupert Murdoch party—although do feel free, I say, if he is watching—but I have read about them in newspapers. For some people in this Chamber, the “wrong” kind of news organisation is, for example, the Times or one with the wrong kind of owner. The idea that we will all agree or know which news publishers are the “wrong” kind is not clear, and I do not think that the test is going to sort it out.

Will the Minister explain what organisations can do if they fail the recognised news publisher test to appeal and say, “We are legitimate and should be allowed”? Why is there this idea that a member of the public cannot clip a recognised news publisher’s content without falling foul? Why would they not be given some exemption? I genuinely do not understand that.

Baroness Stowell of Beeston (Con)

My Lords, I shall speak very briefly. I feel a responsibility to speak, having spoken in Committee on a similar group of amendments when the noble Lords, Lord Lipsey and Lord McNally, were not available. I spoke against their amendments then and would do so again. I align myself with the comments of my noble friend Lord Black, the noble Lord, Lord Faulks, and the noble Viscount, Lord Colville. As the noble Baroness, Lady Fox, just said, they gave a comprehensive justification for that position. I have no intention of repeating it, or indeed repeating my arguments in Committee, but I think it is worth stating my position.

Lord Clement-Jones (LD)

My Lords, we have heard some very well-rehearsed lines during the debate today, with the usual protagonists. Nevertheless, the truth of the matter is that the Press Recognition Panel is as frustrated as many of us on these Benches and other Benches at the failure to implement a post-Leveson scheme of press regulation. Despite many efforts, it has never been fully put into effect.

I do not think I need to repeat a great deal of what has been said today. For instance, the record of IPSO, which the noble Lord, Lord Faulks, talked about, has been very well tracked by Hacked Off. This is not a proposal for state regulation—which is so often, if you like, the canard placed on it.

If not this Bill, which Bill? The media Bill is not going to tackle issues such as this, as my noble friend Lord McNally said. As the noble Lord, Lord Stevenson, has pointed out, this Bill has been a series of conversations—extremely fruitful conversations—but in this particular direction it has borne no fruit at all.

I must admit that, throughout my looking at the draft Bill and continuing to look through its various versions, this opt-out for news publishers has remained a puzzle. The below-the-line opt-out for the mainstream news media always strikes me as strange, because there is no qualification that there should be any curation of that below-the-line, user-generated content. That is peculiar, and it is rather like somebody in the last chance saloon being rewarded with a bouquet. It seems a rather extraordinary provision.

My noble friend Lord Allan rightly pointed to some of the dangers in the new provisions, and indeed in the provisions generally, for these services. I hope the Minister has at least some answers to give to the questions he raised. Progress on this and the scheme that the PRP was set up to oversee, which is still not in place, remain a source of great division across the parties and within them. There is still hope; it may be that under a different Government we would see a different result.

Lord Stevenson of Balmacara (Lab)

My Lords, I was unfortunately unable to attend round 1 of this debate—I had to leave. My noble friend Lord Knight has absented himself from hearing what I am going to say about his remarks, so he must fear that he had got his lines wrong. I apologised to him for leaving him a bit exposed, because we had not quite anticipated how the conversation would go, but I think he did as well as he could, and I repeat the lines he said: this is not the right Bill to rerun the arguments about the Leveson report. I still believe that. The noble Lord, Lord Clement-Jones, does not think the media Bill is; maybe it is not, but at least we can make sure that the debate is properly argued.

It is interesting that, although we clearly have well-defined positions and antipathies present in the debate, a number of things have been said today that will be helpful, if we manage to get a little closer, in trying to resolve some of the issues outstanding. If I am still around and involved in it, I will approach this by trying to see what we can do together rather than the rights and wrongs of positions we have adopted before. It has worked for this Bill: we have achieved huge changes to the Bill because we decided from the start that we would try to see what was the best that could come out of it. That is the instinct I have as we go forward to any future debate and discussion, whether or not it is on the media Bill.

The puzzling thing here is why this is such a continuing concern that it needs to be brought into any opportunity we have to discuss these areas. The sense we had in the pre-legislative scrutiny committee, which discussed this to some extent but not in quite the same range as we have tonight, or even in Committee, was that the issues raised in this Bill were really about protecting freedom of expression. At that stage, the Bill still had the legal but harmful clauses in it so perhaps had had less exposure to those issues in the debate we had. I still think it is primarily about that. I still have real concerns about it, as have been raised by one or two people already in our discussion. I do not think the recognised news provider definition is a good one; I do not think the definition of a journalist is a good one. The pre-legislative scrutiny committee wanted an objective test of material based around public interest, but the Government would not accept that, so we are where we are. We must try to ensure that what works is what we have in the Bill in relation to the topics before it.

The primary purpose must be to ensure material that will inform and enhance our knowledge about democracy, current affairs and issues that need to be debated in the public space, so it is clearly right that that which is published by recognised journalists—quality journalists is another phrase that has been used—should be protected, perhaps more than other material, but at the fringes there are still doubts as to whether the Bill does that.

I had taken it that in the amendments I signed up to, government Amendments 158 and 161, the material we were talking about was from recognised news publishers, not material self-generated in social media. I am looking hard at the Minister hoping he will be able to come to my aid when he comes to respond. The issue here is about making sure that material that was not originally broadcast but is still provided by a recognised news publisher is protected from being taken down, and it would not have been if those amendments were not made. I hope that is the right interpretation. That was the basis on which I signed up for them; I do not know quite where it leaves me if that is wrong.

Lord Allan of Hallam (LD)

As I opened up that question, just to be clear, I was saying that it is exactly right that an individual user would not be covered, but I was trying to suggest that a social media-only news service that does not exist as a publication or a broadcaster outside social media, if it meets the Clause 50 test to be a recognised news publisher, should be given extra scope under the amendments.

Lord Stevenson of Balmacara (Lab)

I hope they do not, and I think the Minister has to answer that question quite directly. The issue here is about quality material that would otherwise be taken down being kept in place so that we can all as a society be informed by that. That does not mean it needs to be from particular sources that we know to be egregious or running material which is certainly not in the public interest. Again, I make the point that that would have been a better way of approaching this in the legislation, but I take the point made by the noble Lord, Lord Allan, who knows his stuff—I often think we ought to bottle him and carry it around so we can take a whiff of his expertise and knowledge every time we get stuck on a problem, but I am not quite sure how we manage that.

18:45
That is a long way in to what is a relatively straightforward point. At the end of the day, we need some system under which the material that we want in circulation in our democratic society, supporting the essential characteristics we have just talked about, is protected. I think the Bill moves that way. We will have to see whether it works in practice, but it gives us a basis for that. That is not the same thing as in any sense trying to address the questions raised about whether particular groups of newspapers perform or are categorised in a way that meets a particular set of regulations or laws, which may or may not still be in effect post-Leveson. As I said, that is for another day, and I am sure there are issues there that we can talk about.
At the end of this debate, which I think has been useful and will help in future, the narrow point is whether we believe that freedom of expression is enhanced by the proposals in front of us, and I very much think it is. I look forward to hearing the Government’s response.
Lord Parkinson of Whitley Bay (Con)

I reassure the noble Lord, Lord Stevenson, that he was right to sign the amendments; I am grateful that he did. I do not know whether it is possible to have a sense of déjà vu about debates that took place before one entered your Lordships’ House, but if so, I feel I have had it over the past hour. I am, however, glad to see the noble Lords, Lord Lipsey and Lord McNally, back in their places and that they have had the chance to express their views, which they were unable to do fully in Committee. I am grateful to noble Lords who have joined in that debate again.

At present, Amendment 159 would enable news publishers that are members of Impress, the sole UK regulator which has sought approval by the Press Recognition Panel, to benefit from the Bill’s protections for news publishers, without meeting the criteria set out in Clause 50(2). This would introduce a legislative advantage for Impress members over other news publishers. The amendment would, in effect, create strong incentives for publishers to join a specific press regulator. We do not consider that to be compatible with our commitment to a free press. To that end, as noble Lords know, we will repeal existing legislation that could have that effect, specifically Section 40 of the Crime and Courts Act 2013, through the media Bill, which was published recently.

Not only is creating an incentive for a publisher to join a specific regulator incompatible with protecting press freedom in the United Kingdom but it would undermine the aforementioned criteria. These have been drafted to be as robust as possible, with requirements including that organisations have publication of news as their principal purpose, that they are subject to a standards code and that their content is created by different persons. Membership of Impress, or indeed any other press regulator, does not and should not automatically ensure that these criteria are met.

Amendment 160 goes further by amending one of these criteria—specifically, the requirement for entities to be subject to a standards code. It would add the requirement that these standards codes be drawn up by a regulator such as Impress. This amendment would create further incentives for news publishers to join a press regulator if they are to benefit from the exclusion for recognised news publishers. This is similarly not compatible with our commitment to press freedom.

We believe the criteria set out in Clause 50 of the Bill are already sufficiently strong, and we have taken significant care to ensure that only established news publishers are captured, while limiting the opportunity for bad actors to benefit.

The noble Lord, Lord Allan, asked about protections against that abuse by bad actors. The Bill includes protections for journalism and news publishers, given the importance of a free press in a democratic society. However, it also includes safeguards to prevent the abuse of these protections by bad actors. Platforms will still be able to remove recognised news publisher content that breaches their terms and conditions as long as they notify recognised news publishers and offer a right of appeal first. This means that content will remain online while the appeal is considered, unless it constitutes a relevant offence under the Bill or the platform would incur criminal or civil liability by hosting it. This marks a significant improvement on the status quo whereby social media companies can remove journalistic content with no accountability and little recourse for journalists to appeal.

We are clear that sanctioned news outlets such as RT must not benefit from these protections. We are amending the criteria for determining which entities qualify as recognised news publishers explicitly to exclude entities that are subject to sanctions. The criteria also exclude any entity that is a proscribed organisation under the Terrorism Act 2000 or whose purpose is to support an organisation that is proscribed under that Act. To require Ofcom or another party to assess standards would be to introduce press regulation by the back door.

The noble Baroness, Lady Fox of Buckley, asked about protecting clipped or edited content. Given evolving news consumption habits, recognised news publishers may clip or edit content from their published or broadcast versions to cater to different audiences and to be used on different platforms. We want to ensure recognised news publisher content is protected in all its forms as long as that content is still created or generated by the news publisher. For example, if a broadcaster shares a link to its shorter, online-only version of a long-form TV news programme or documentary on an in-scope platform, this should still benefit from the protections that the Bill affords. The amendment that we have brought forward ensures that this content and those scenarios remain protected but removes the risk of platforms being forced to carry news publisher content that has been edited by a third party potentially to cause harm. I hope that clarifies that.

I am grateful to the noble Lord, Lord Lipsey, for making it clear that he does not intend to press his amendments to a Division, so I look forward to that. I am also grateful for the support for the Government’s amendments in this group.

Amendment 158 agreed.
Clause 50: “Recognised news publisher”
Amendments 159 and 160 not moved.
Clause 51: “Search content”, “search results” etc
Amendment 161
Moved by
161: Clause 51, page 52, line 14, leave out sub-paragraphs (ii) and (iii) and insert—
“(ii) is video or audio content that was originally published or broadcast by a recognised news publisher, and is not a clipped or edited form of such content (unless it is the recognised news publisher who has clipped or edited it), or
(iii) is a link to an article or item within sub-paragraph (i) or to content within sub-paragraph (ii).”
Member’s explanatory statement
This amendment ensures that, in particular, online content published by a recognised news publisher that has not first been broadcast is included in the list of content which does not count as search content for the purposes of the Bill.
Amendment 161 agreed.
Schedule 7: Priority offences
Amendment 162 not moved.
Clause 54: “Content that is harmful to children” etc
Amendments 163 to 170
Moved by
163: Clause 54, page 54, line 44, leave out “applies” and insert “and sections (“Primary priority content that is harmful to children”) and (“Priority content that is harmful to children”) apply”
Member’s explanatory statement
This technical amendment ensures that the new Clauses proposed to be inserted after Clause 54 in my name setting out which kinds of content count as primary priority content and priority content harmful to children apply for the purposes of Part 3 of the Bill.
164: Clause 54, page 55, line 1, leave out subsections (2) and (3)
Member’s explanatory statement
This amendment omits powers to make regulations setting out which kinds of content count as primary priority content and priority content harmful to children. Those kinds of content are now set out on the face of the Bill (see the new Clauses proposed to be inserted after Clause 54 in my name).
165: Clause 54, page 55, line 8, after “children” insert “(see section (“Primary priority content that is harmful to children”))”
Member’s explanatory statement
This amendment inserts a signpost to the new Clause proposed to be inserted after Clause 54 in my name setting out which kinds of content count as primary priority content harmful to children.
166: Clause 54, page 55, line 9, after “children” insert “(see section (“Priority content that is harmful to children”))”
Member’s explanatory statement
This amendment inserts a signpost to the new Clause proposed to be inserted after Clause 54 in my name setting out which kinds of content count as priority content harmful to children.
167: Clause 54, page 55, leave out line 13
Member’s explanatory statement
This is a technical amendment omitting a line which is superfluous as a result of the next amendment in my name.
168: Clause 54, page 55, line 14, leave out paragraph (a)
Member’s explanatory statement
This amendment omits a provision about the relationship between illegal content and content harmful to children.
169: Clause 54, page 55, line 34, leave out “is” and insert “and sections (“Primary priority content that is harmful to children”) and (“Priority content that is harmful to children”) are”
Member’s explanatory statement
This amendment ensures that technical provision about content harmful to children extends to primary priority and priority content harmful to children in the new Clauses proposed to be inserted after Clause 54 in my name.
170: Clause 54, page 55, line 36, leave out subsection (9)
Member’s explanatory statement
This amendment omits a signpost to regulations about primary priority and priority content harmful to children, which is no longer needed as the new Clauses proposed to be inserted after Clause 54 in my name set out those kinds of content on the face of the Bill.
Amendments 163 to 170 agreed.
Amendment 171
Moved by
171: After Clause 54, insert the following new Clause—
““Primary priority content that is harmful to children”
(1) “Primary priority content that is harmful to children” means content of any of the following kinds.
(2) Pornographic content, other than content within subsection (6).
(3) Content which encourages, promotes or provides instructions for suicide.
(4) Content which encourages, promotes or provides instructions for an act of deliberate self-injury.
(5) Content which encourages, promotes or provides instructions for an eating disorder or behaviours associated with an eating disorder.
(6) Content is within this subsection if it—
(a) consists only of text, or
(b) consists only of text accompanied by—
(i) identifying content which consists only of text,
(ii) other identifying content which is not itself pornographic content,
(iii) a GIF which is not itself pornographic content,
(iv) an emoji or other symbol, or
(v) any combination of content mentioned in sub-paragraphs (i) to (iv).
(7) In this section and section (“Priority content that is harmful to children”) “injury” includes poisoning.”
Member’s explanatory statement
This amendment describes which kinds of content count as primary priority content harmful to children for the purposes of Part 3 of the Bill.
Amendment 171 agreed.
Amendment 172
Moved by
172: After Clause 54, insert the following new Clause—
““Priority content that is harmful to children”
(1) “Priority content that is harmful to children” means content of any of the following kinds.
(2) Content which is abusive and which targets any of the following characteristics—
(a) race,
(b) religion,
(c) sex,
(d) sexual orientation,
(e) disability, or
(f) gender reassignment.
(3) Content which incites hatred against people—
(a) of a particular race, religion, sex or sexual orientation,
(b) who have a disability, or
(c) who have the characteristic of gender reassignment.
(4) Content which encourages, promotes or provides instructions for an act of serious violence against a person.
(5) Bullying content.
(6) Content which—
(a) depicts real or realistic serious violence against a person;
(b) depicts the real or realistic serious injury of a person in graphic detail.
(7) Content which—
(a) depicts real or realistic serious violence against an animal;
(b) depicts the real or realistic serious injury of an animal in graphic detail;
(c) realistically depicts serious violence against a fictional creature or the serious injury of a fictional creature in graphic detail.
(8) Content which encourages, promotes or provides instructions for a challenge or stunt highly likely to result in serious injury to the person who does it or to someone else.
(9) Content which encourages a person to ingest, inject, inhale or in any other way self-administer—
(a) a physically harmful substance;
(b) a substance in such a quantity as to be physically harmful.
(10) In subsections (2) and (3)—
(a) “disability” means any physical or mental impairment;
(b) “race” includes colour, nationality, and ethnic or national origins;
(c) references to religion include references to a lack of religion.
(11) For the purposes of subsection (3), a person has the characteristic of gender reassignment if the person is proposing to undergo, is undergoing or has undergone a process (or part of a process) for the purpose of reassigning the person’s sex by changing physiological or other attributes of sex, and the reference to gender reassignment in subsection (2) is to be construed accordingly.
(12) For the purposes of subsection (5) content may, in particular, be “bullying content” if it is content targeted against a person which—
(a) conveys a serious threat;
(b) is humiliating or degrading;
(c) forms part of a campaign of mistreatment.
(13) In subsection (6) “person” is not limited to a real person.
(14) In subsection (7) “animal” is not limited to a real animal.”
Member’s explanatory statement
This amendment describes which kinds of content count as priority content harmful to children for the purposes of Part 3 of the Bill.
Amendment 173 (to Amendment 172) not moved.
Amendment 174 (to Amendment 172) not moved.
Amendment 172 agreed.
Clause 55: Regulations under section 54
Amendment 175
Moved by
175: Clause 55, leave out Clause 55
Member’s explanatory statement
This amendment omits Clause 55 (regulations describing kinds of content harmful to children), as the kinds of content are now set out in the Bill - see the new Clauses proposed to be inserted after Clause 54 in my name.
Amendment 175 agreed.
Clause 56: Regulations under section 54: OFCOM’s review and report
Amendments 176 to 179
Moved by
176: Clause 56, page 56, line 22, leave out subsection (1)
Member’s explanatory statement
This amendment and the next two amendments in my name omit references to regulations which are no longer needed, as primary priority content and priority content harmful to children are now set out in the new Clauses proposed to be inserted after Clause 54 in my name, not in regulations.
177: Clause 56, page 56, line 23, leave out “For so long as regulations are in force,”
Member’s explanatory statement
See the explanatory statement for the first amendment of Clause 56 in my name.
178: Clause 56, page 56, line 32, leave out “the regulations” and insert “sections (“Primary priority content that is harmful to children”) and (“Priority content that is harmful to children”)”
Member’s explanatory statement
See the explanatory statement for the first amendment of Clause 56 in my name.
179: Clause 56, page 56, line 36, leave out “the first statutory instrument containing regulations is made” and insert “this Act is passed”
Member’s explanatory statement
This amendment provides that OFCOM have 3 years from the date this Bill is passed to produce a report reviewing content harmful to children.
Amendments 176 to 179 agreed.
Amendment 180
Moved by
180: After Clause 56, insert the following new Clause—
“Review: offences relating to animal torture content
(1) Within the period of six months beginning with the day on which this Act is passed, the Secretary of State must carry out a review of relevant offences under the—
(a) Communications Act 2003, and
(b) Animal Welfare Act 2006,
to determine whether there is an offence of sending a communication to encourage or assist an act of animal torture, or sharing content related to animal torture, on a regulated service.
(2) If the review under subsection (1) determines that one or more offences contained within the Acts does extend to such communications or content, the Secretary of State must, as soon as practicable, make regulations to designate the offence or offences under Schedule 7 to this Act (see section 198(3)).”
Member’s explanatory statement
Following answers to a recent oral question (27 June), this amendment would require the Secretary of State to undertake a review of existing criminal offences under the listed enactments to determine whether they apply to online posts containing or facilitating animal torture. If they do, the Secretary of State would be compelled to add these offences to the list of priority offences in Schedule 7.
Baroness Merron Portrait Baroness Merron (Lab)
- Hansard - - - Excerpts

My Lords, I am pleased to speak to Amendment 180, and I thank the noble Lord, Lord Clement-Jones, for adding his name to it and tabling Amendment 180A, which follows it. I am grateful to the Badger Trust, Action for Primates, Wildlife and Countryside Link and the many others who have been in contact about the worryingly high volume of animal cruelty and animal torture content that we see online. I thank the Minister for his engagement on this issue. I very much acknowledge the contribution of noble Lords across the House and their interest in this topic, not only when it was raised in Committee but when my noble friend Lady Hayman of Ullock secured a topical Oral Question on it just last month.

The good news is that everybody agrees that there is a problem here—one that was recently brought into sharp focus by a BBC investigation entitled “The Monkey Haters”. The bad news is that we do not seem to be able to agree on how to address these issues, whether under this Bill or through other forms of action. Users of what will become regulated services once this Bill has passed are using these platforms to discuss, order and share photographs and videos of extreme acts of animal cruelty.

The Government’s position appears to be that, while such activities are abhorrent, they do not generate human harm and are therefore outside the scope of this legislation. In my view, that position is undermined by some of the Government’s own amendments to this legislation, which identify content relating to animal cruelty as falling under priority harms to children. Of course, this measure is a welcome addition. However, as a number of noble Lords highlighted during the recent Oral Question, there is a growing body of evidence that those who engage in acts of animal cruelty go on to harm other human beings.

This amendment contains a modest proposal to review whether the offences already cited from the Dispatch Box apply to online animal torture activity and, if so, to designate those offences under Schedule 7 to the Bill. We accept that the Government are already undertaking a review of criminal offences with a view to expanding the list in Schedule 7, but we have not been able to ascertain the timings attached to that review, whether its findings will be made public or whether Parliament will have a role beyond approving statutory instruments.

In our discussions with the Minister, we had a simple ask: that he commit to including animal welfare issues in the ongoing review and to working with Defra’s Secretary of State to publish a Written Ministerial Statement outlining how many prosecutions have been brought under animal welfare laws, the timetable that applies and how those provisions will be kept under review. We do not consider a Written Ministerial Statement from the Secretary of State summarising government policy to be an unreasonable ask—particularly as this Government are happy to claim that they have done more for animal welfare than any other—yet the Government have hitherto been unable to accept our request. I understand that, just a few minutes ago, an offer of a Written Ministerial Statement was made; noble Lords will understand that I have not seen it as I am in the Chamber, but I am advised that it is not from the Defra Secretary of State and does not refer to the number of prosecutions, timescales or any of the other matters that we requested to be included.

The volume of this content has grown exponentially in recent years. This means thousands of animals being harmed and an even higher number of human beings exposed to abhorrent and horrific material. This amendment may not be perfect, but it will, we hope, encourage the Government to take this issue more seriously than they have done to date. The Minister will be aware that, in view of the Government’s response thus far, I am minded to test the opinion of the House on this amendment. I beg to move.

19:00
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I strongly support Amendment 180, tabled by the noble Baroness, Lady Merron. I will also explain why I put forward Amendment 180A. I pay tribute to the noble Baroness, Lady Hayman, who pursued this issue with considerable force through her Question in the House.

There is clearly an omission in the Bill. One of its primary aims is to protect children from harmful online content, and animal cruelty content causes harm to the animals involved and, critically, to the people who view it, especially children. In Committee, in the Question and today, we have referred to the polling commissioned by the RSPCA, which found that 23% of 10- to 18-year-olds had seen animal cruelty on social media sites. I am sure that the numbers have increased since that survey in 2018. A study published in 2017 found—if evidence were needed—that:

“There is emerging evidence that childhood exposure to maltreatment of companion animals is associated with psychopathology in childhood and adulthood.”


The noble Baroness made an extremely good case, and I do not think that I need to add to it. When the Bill went through the Commons, assurances were given by the former Minister, Damian Collins, who acknowledged that the inclusion of animal cruelty content in the Bill deserves further consideration as the Bill progresses through its parliamentary stages. We need to keep up that pressure, and we will be very much supporting the noble Baroness if she asks for the opinion of the House.

Turning to my Amendment 180A, like the noble Baroness, I pay tribute to the Social Media Animal Cruelty Coalition, which is a very large coalition of organisations. We face a global extinction crisis which the UK Government themselves have pledged to reverse. Algorithmic amplification tools and social media recommendation engines have driven an explosive growth in online wildlife trafficking. A National Geographic article from 2020 quoted US wildlife officials describing the dizzying scale of the wildlife trade on social media. The UK’s national wildlife crime units say that cyber-enabled wildlife crime has become their priority focus, since virtually all wildlife cases they now investigate have a cyber component to them, usually involving social media or e-commerce platforms. In a few clicks it is easy to find pages, groups and postings selling wildlife products made from endangered species, such as elephant ivory, rhino horn, pangolin scales and marine turtle shells, as well as big cats, reptiles, birds, primates and insects for the exotic pet trade. This vast, unregulated trade in live animals and their parts is not only illegal but exacerbates the risk of another animal/human spillover event such as the ones that caused Ebola, HIV and the Covid-19 pandemic.

In addition to accepting the animal welfare amendment tabled by the noble Baroness, which I hope they do, the Government should also add offences under the Control of Trade in Endangered Species Regulations 2018 to Schedule 7 to the Bill. This would definitely help limit the role of social media platforms in enabling wildlife trafficking, helping to uphold the UK’s commitments to tackling global wildlife crime.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I rise very briefly to support the noble Baroness, Lady Merron, and to make only one point. As someone who has the misfortune of seeing a great deal of upsetting material of all kinds, I have to admit that it sears an image on your mind. I have had the misfortune to see the interaction of animal and human cruelty in the same sequences, again and again. In making the point that there is a harm to humans in witnessing and normalising this kind of material, I offer my support to the noble Baroness.

Viscount Camrose Portrait The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Viscount Camrose) (Con)
- View Speech - Hansard - - - Excerpts

My Lords, Amendments 180 and 180A seek to require the Secretary of State to conduct a review of existing legislation and how it relates to certain animal welfare offences and, contingent on this review, to make them priority offences under the regulatory framework.

I am grateful for this debate on the important issue of protecting against animal cruelty online, and all of us in this House share the view of the importance of so doing. As the House has discussed previously, this Government are committed to strong animal welfare standards and protections. In this spirit, this Government recognise the psychological harm that animal cruelty content can cause to children online. That is why we tabled an amendment that lists content that depicts real or realistic serious violence or injury against an animal, including by fictional creatures, as priority content that is harmful to children. This was debated on the first day of Report.

In addition, all services will need proactively to tackle illegal animal cruelty content where this amounts to an existing offence such as extreme pornography. User-to-user services will be required swiftly to remove other illegal content that targets an individual victim once made aware of its presence.

The noble Baroness asked about timing. We feel it is important to understand how harm to animals as already captured in the Bill will function before committing to the specific remedy proposed in the amendments.

As discussed in Committee, the Bill’s focus is rightly on ensuring that humans, in particular children, are protected online, which is why we have not listed animal offences in Schedule 7. As many have observed, this Bill cannot fix every problem associated with the internet. While we recognise the psychological harm that can be caused to adults by seeing this type of content, listing animal offences in Schedule 7 is likely to dilute providers’ resources away from protecting humans online, which is the Bill’s main purpose.

However, I understand the importance of taking action on animal mistreatment when committed online, and I am sympathetic to the intention of these amendments. As discussed with the noble Baroness, Defra is confident that the Animal Welfare Act 2006 and its devolved equivalents can successfully bring prosecutions for the commission and action of animal torture when done online in the UK. These Acts do not cover acts of cruelty that take place outside the UK. I know from the discussion we have had in this House that there are real concerns that the Animal Welfare Act 2006 cannot tackle cross-border content, so I wish to make a further commitment today.

The Government have already committed to consider further how the criminal law can best protect individuals from harmful communications, alongside other communications offences, as part of changes made in the other place. To that end, we commit to include the harm caused by animal mistreatment communications as part of this assessment. This will then provide a basis for the Secretary of State to consider whether this offence should be added to Schedule 7 to the OSB via the powers in Clause 198. This work will commence shortly, and I am confident that this, in combination with animal cruelty content listed as priority harms to children, will safeguard users from this type of content online.

For the reasons set out, I hope the noble Baroness and the noble Lord will consider not pressing their amendments.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

The Minister has not dealt with Amendment 180A at all.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I will be happy to write to the noble Lord.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

That really is not good enough, if I may say so. Does the Minister not have any brief of any kind on Amendment 180A?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am sorry if the noble Lord feels that I have not dealt with it at all.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

The words “animal trafficking” have not passed his lips.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am sorry; I will have to write to the noble Lord.

Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am sure the letter will be anticipated.

I am grateful to the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, for their support for Amendment 180. I appreciate the consideration that the Minister has given to the issue. I am in no doubt of his sympathy for the very important matters at stake here. However, he will not be surprised to hear that I am disappointed with the response, not least because, in the Minister’s proposal, a report will go to the Secretary of State and it will then be up to the Secretary of State whether anything happens, which really is not what we seek. As I mentioned at the outset, I would like to test the opinion of the House.

19:12

Division 1

Ayes: 211


Labour: 112
Liberal Democrat: 62
Crossbench: 29
Independent: 5
Green Party: 2
Bishops: 1

Noes: 171


Conservative: 159
Independent: 7
Crossbench: 4
Labour: 1

19:22
Amendment 180A not moved.
Clause 57: User identity verification
Amendments 181 to 183 not moved.
Amendment 184 not moved.
Clause 60: Regulations about reports to the NCA
Amendment 185
Moved by
185: Clause 60, page 59, line 15, at end insert—
“(2A) The regulations may also—
(a) require providers to retain, for a specified period, data of a specified description associated with a report, and
(b) impose restrictions or requirements in relation to the retention of such data (including how the data is to be secured or stored or who may access the data).
(2B) The power to require the retention of data associated with a report includes power to require the retention of—
(a) content generated, uploaded or shared by any user mentioned in the report (or metadata relating to such content), and
(b) user data relating to any such person (or metadata relating to such data).
“User data” here has the meaning given by section 206.”
Member’s explanatory statement
This amendment provides that regulations under this Clause may require a provider to retain data associated with a report sent to the NCA and impose restrictions or requirements in relation to the retention of the data.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My Lords, child sexual exploitation or abuse is an abhorrent crime. Reporting allows victims to be identified and offenders apprehended. It is vital that in-scope companies retain the data included in reports made to the National Crime Agency. This will enable effective prosecutions and ensure that children can be protected.

The amendments in my name in this group will enable the Secretary of State to include in the regulations about the reporting of child sexual exploitation or abuse content a requirement for providers to retain data. This requirement will be triggered only by a provider making a report of suspected child sexual exploitation or abuse to the National Crime Agency. The provider will need to retain the data included in the report, along with any associated account data. This is vital to enabling prosecutions and to ensuring that children can be protected, because data in reports cannot be used as evidence. Law enforcement agencies request this data only when they have determined that the content is in fact illegal and that it is necessary to progress investigations.

Details such as the types of data and the period of time for which providers must retain this data will be specified in regulations. This will ensure that the requirement is future-proofed against new types of data and will prevent companies retaining types of data that may have become obsolete. The amendments will also enable regulations to include any necessary safeguards in relation to data protection. However, providers will be expected to store, process and share this personal data within the UK GDPR framework.

Regulations about child sexual exploitation or abuse reporting will undergo a robust consultation with relevant parties and will be subject to parliamentary scrutiny. This process will ensure that the regulations about retaining data will be well-informed, effective and fit for purpose. These amendments bring the child sexual exploitation and abuse reporting requirements into line with international standards. I beg to move.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, these seem very sensible amendments. I am curious about why they have arrived only at this stage, given this was a known problem and that the Bill has been drafted over a long period. I am genuinely curious as to why this issue has been raised only now.

On the substance of the amendments, it seems entirely sensible that, given that we are now going to have 20,000 to 25,000 regulated entities in scope, some of which will never have encountered child sexual exploitation or abuse material or understood that they have a legal duty in relation to it, it will be helpful for them to have a clear set of regulations that tell them how to treat their material.

Child sexual exploitation or abuse material is toxic in both a moral and a legal sense. It needs to be treated almost literally as toxic material inside a company, and sometimes that is not well understood. People feel that they can forward material to someone else, not understanding that in doing so they will break the law. I have had experiences where well-meaning people acting in a vigilante capacity sent material to me, and at that point you have to report them to police. There are no ifs or buts. They have committed an offence in doing so. As somebody who works inside a company, your computer has to be quarantined and taken off and cleaned, just as it would be for any other toxic material, because we framed the law, quite correctly, to say that we do not want to offer people the defence of saying “I was forwarding this material because I’m a good guy”. Forwarding the material is a strict liability offence, so to have regulations that explain, particularly to organisations that have never dealt with this material, exactly how they have to deal with it in order to be legally compliant will be extremely helpful.

One thing I want to flag is that there are going to be some really fundamental cross-border issues that have to be addressed. In many instances of child sexual exploitation or abuse material, the material has been shared between people in different jurisdictions. The provider may not be in a UK jurisdiction, and we have got to avoid any conflicts of laws. I am sure the Government are thinking about this, but in drafting those regulations, what we cannot do, for example, is order a provider to retain data in a way that would be illegal in the jurisdiction from which it originates or in which it has its headquarters. The same would apply vice versa. We would not expect a foreign Government to order a UK company to act in a way that was against UK law in dealing with child sexual exploitation or abuse material. This all has to be worked out. I hope the Government are conscious of that.

I think the public interest is best served if the United Kingdom, the United States and the European Union, in particular, adopt common standards around this. I do not think there is anything between us in terms of how we would want to approach child sexual exploitation or abuse material, so the extent to which we end up having common legal standards will be extraordinarily helpful.

As a general matter, to have regulations that help companies with their compliance is going to be very helpful. I am curious as to how we have got there with the amendment only at this very late stage.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I rise to make a slightly lesser point, but I also welcome these amendments. I want to ask the Minister where the consultation piece of this will lie and to check that all the people who have been in this space for many years will be consulted.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, as ever, my noble friend Lord Allan and the noble Baroness, Lady Kidron, have made helpful, practical and operational points that I hope the Minister will be able to answer. In fact, the first half of my noble friend’s speech was really a speech that the Minister himself could have given in welcoming the amendment, which we do on these Benches.

19:30
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, from this side we certainly welcome these government amendments. I felt it was probably churlish to ask why it had taken until this late stage to comply with international standards, but that point was made very well by the noble Lord, Lord Allan of Hallam, and I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

I am grateful to noble Lords for their support for these amendments and for their commitment, as expected, to ensuring that we have the strongest protections in the Bill for children.

The noble Lord, Lord Allan of Hallam, asked: why only now? It became apparent, during the regular engagement that, as he would expect, the Government have with the National Crime Agency on issues such as this, that this would be necessary, so we are happy to bring these amendments forward. They are vital amendments to enable law enforcement partners to prosecute offenders and keep children safe.

Reports received by the National Crime Agency are for intelligence only and so cannot be relied on as evidence. As a result, in some cases law enforcement agencies may be required to request that companies provide data in an evidential format. The submitted report will contain a limited amount of information from which law enforcement agencies will have to decide what action to take. Reporting companies may hold wider data that relate to the individuals featured in the report, which could allow law enforcement agencies to understand the full circumstances of the event or attribute identities to the users of the accounts.

The data retention period will provide law enforcement agencies with the necessary time to decide whether it is appropriate to request data in order to continue their investigations. I hope that explains the context of why we are doing this now and why these amendments are important ones to add to the Bill. I am very grateful for noble Lords’ support for them.

Amendment 185 agreed.
Amendment 186
Moved by
186: Clause 60, page 59, line 16, leave out “the regulations” and insert “regulations under this section”
Member’s explanatory statement
This amendment is consequential on the other amendment to Clause 60 in my name.
Amendment 186 agreed.
19:33
Consideration on Report adjourned.

Online Safety Bill

Report (4th Day)
Relevant documents: 28th and 38th Reports from the Delegated Powers Committee, 15th Report from the Constitution Committee. Scottish and Welsh Legislative Consent granted.
15:41
Amendment 186A
Moved by
186A: Before Clause 64, insert the following new Clause—
“Terms of service as a contract
The terms of service under which a Category 1 service is provided to a person who is a consumer for the purposes of the Consumer Rights Act 2015 must be treated as being a contract for a trader to provide a service to a consumer.”
Member’s explanatory statement
The purpose of this amendment is to ensure that providers’ terms of service are treated as consumer contracts, and to give users recourse to the remedies under the Consumer Rights Act 2015 in the event of breach.
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

My Lords, in speaking to my Amendment 186A, I hope that noble Lords will forgive me for not speaking in detail to the many other amendments in this group correctly branded “miscellaneous” by those who compile our lists for us. Many of them are minor and technical, especially the government amendments. However, that is not true of all of them: Amendment 253 in the name of the noble Lord, Lord Clement-Jones, is a substantial amendment relating to regulatory co-operation, while Amendment 275A, in the name of the noble Baroness, Lady Finlay of Llandaff, is also of some interest, relating to the reports that Ofcom is being asked to produce on technological developments.

Nor is Amendment 191A lacking in importance and substance, although—I hope I will be forgiven for saying this, not in a snarky sort of way—for those of us who are worried about the enormous powers being given to Ofcom as a result of the Bill, the idea that it should be required by statute to give guidance to coroners, who are part of the courts system, seems to me strange and worth examining more closely. There might be a more seemly way of achieving the effect that the noble Baroness, Lady Kidron, understandably wants to achieve.

I turn to my own Amendment 186A, which, I hope, ought to be relatively straightforward. It concerns the terms of service of a contract with a category 1 service provider, and it is intended to improve the rights that consumers or users of that service have. It is the case that the Government want users of those services to have the ability to enforce their rights under contract against the service providers, as set out in Clause 65, and this is entirely welcome. However, it is well known that bringing claims in contract can be an expensive and onerous business, as I have pointed out in the past, particularly when the service is provided on the one-sided terms of the service provider—often, of course, drafted under the legal system of a foreign jurisdiction.

15:45
Parliament recognised this difficulty when it enacted the Consumer Rights Act 2015. Its purpose was to make it easier for consumers to deal fairly with traders and get redress when they are treated unfairly. However, category 1 providers such as Twitter and Facebook are not governed by the Consumer Rights Act. In the language of the Act, they do not supply digital content
“for a price paid by the consumer”,
a requirement of the Act. So the purpose of Clause 65 is therefore only partially achieved in the Bill as it stands; a key element in enforcing consumer rights is missing.
The European Union, to its credit, rectified this oversight in its directive about modernising consumer law in 2019. It provides that a consumer who receives content or a service in return for providing their personal data should be in the same position as the consumer who pays a price, and the same consumer protections apply to both. The effect of my amendment is that we would follow suit—if we are serious about protecting and empowering users, that is what we should do. We should do away with the arbitrary fiction that category 1 users are not consumers deserving of protection simply because they do not pay a monetary price.
I have heard it said about bringing up this matter on Report, when it should have been brought up in Committee—as I admit that it should—that it is a little too late perhaps for the Bill team and the Minister to absorb this proposal. However, I do not fully accept that, since the Third Reading of the Bill has been set for 4 September, which gives the Government ample time over the summer to get all their ducks lined up in a row.
I should have thought that this amendment would command support on all sides of the House and from all noble Lords who have participated in this Bill so far. I am hoping, although I have had no indication from my noble friend the Minister whether he is going to accept this amendment, that he would feel that it was a relatively straightforward thing to do, entirely in line with his purpose and something that would strengthen the Bill considerably. I beg to move.
Baroness Finlay of Llandaff Portrait Baroness Finlay of Llandaff (CB)
- Hansard - - - Excerpts

My Lords, I shall speak to my Amendment 275A in this group. It would place a duty on Ofcom to report annually on areas where our legal codes need clarification and revision to remain up to date as new technologies emerge—and that is to cover technologies, some of which we have not even thought of yet.

Government Amendments 206 and 209 revealed the need for an amendment to the Bill and how it would operate, as they clarify that reference to pornographic content in the Bill includes content created by a bot. However, emerging technologies will need constant scrutiny.

As the noble Lord, Lord Clement-Jones, asked, what about provider content, which forms the background to the user interaction and may include many harms? For example, would a game backdrop that includes anti-Semitic slurs, a concentration camp, a sex shop or a Ku Klux Klan rally be caught by the Bill?

The Minister confirmed that “content” refers to anything communicated by means of an internet service and the encounter includes any content that individuals read, view, hear or otherwise experience, making providers liable for the content that they publish. Is this liable under civil, regulatory or criminal law?

As Schedule 1 goes to some lengths to exempt some service-to-provider content, can the Minister, for the record, provide chapter and verse, as requested by the noble Lord, Lord Clement-Jones, on provider liability and, in particular, confirm whether such content would be dealt with by the Part 3 duties under the online safety regime, or whether users would have to rely on similar law for claims at their own expense through the courts, or whether the police would carry the burden of further enforcement?

Last week, the Minister confirmed that “functionality” captures any feature enabling interactions of any description between service users, but are avatars or objects created by the provider of a service, not by an individual user, in scope and therefore subject to risk assessments and their mitigation requirements? If so, will these functionalities also be added to user empowerment tools, enabling users to opt out of exposure to them, or will they be caught only by child safety duties? Are environments provided by a service provider, such as a backdrop to immersive environments, in scope through the definition of “functionality”, “content” or both? When this is provider content and not user-generated content, will this still hold true?

All this points to a deeper issue. Internet services have become more complex and vivid, with extremely realistic avatars and objects indistinguishable from people and objects in the real world. This amendment avoids focusing on negatives associated with AI and new technologies but tries to ensure that the online world is as safe as the offline world should be. It is worth noting that Interpol is already investigating how to deal with criminals in the metaverse and anticipating crimes against children, data theft, money laundering, fraud and counterfeit, ransomware, phishing, sexual assault and harassment, among other things. Many of these behaviours operate in grey areas of the law where it is not clear whether legal definitions extend to the metaverse.

Ofcom has an enormous task ahead, but it is best placed to consider the law’s relationship to new technological developments and to inform Parliament. Updating our laws through the mechanisms proposed in Amendment 275A will provide clarity to the courts, judges, police and prosecution service. I urge the Minister to provide as full an answer as possible to the many questions I have posed. I am grateful to him for all the work he has been doing. If he cannot accept my amendment as worded, will he provide an assurance that he will return to this with a government amendment at Third Reading?

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I will speak to Amendment 191A in my name. I also support Amendment 186A in the name of the noble Lord, Lord Moylan, Amendment 253 in the name of the noble Lord, Lord Clement-Jones, and Amendment 275A in the name of my noble friend Lady Finlay. I hope that my words will provide a certain level of reassurance to the noble Lord, Lord Moylan.

In Committee and on Report, the question was raised as to how to support the coronial system with information, education and professional development to keep pace with the impact of the fast-changing digital world. I very much welcome the Chief Coroner’s commitment to professional development for coroners but, as the Minister said, this is subject to funding. While it is right that the duty falls to the Chief Coroner to honour the independence and expert knowledge associated with his roles, this amendment seeks to support his duties with written guidance from Ofcom, which has no such funding issue since its work will be supported by a levy on regulated companies—a levy that I argue could usefully and desirably contribute to the new duties that benefit coroners and bereaved parents.

The role of a coroner is fundamental. They must know what preliminary questions to ask and how to triage the possibility that a child’s digital life is relevant. They must know that Ofcom is there as a resource and ally and how to activate its powers and support. They must know what to ask Ofcom for, how to analyse information they receive and what follow-up questions might be needed. Importantly, they must feel confident in making a determination and describing the way in which the use of a regulated service has contributed to a child’s death, in the case that that is indeed their finding. They must be able to identify learnings that might prevent similar tragedies happening in the future. Moreover, much of the research and information that Ofcom will gather in the course of its other duties could be usefully directed at coroners. All Amendment 191A would do is add to the list of reports that Ofcom has to produce with these issues in mind. In doing so, it would do the Chief Coroner the service of contributing to his own needs and plans for professional development.

I turn to Amendment 186A in the name of the noble Lord, Lord Moylan, who makes a very significant point in bringing it forward. Enormous effort goes into creating an aura of exceptionality for the tech sector, allowing it to avoid laws and regulations that routinely apply to other sectors. These are businesses that benefit from our laws, such as intellectual copyright or international tax law. However, they have negotiated a privileged position in which they have privatised the benefits of our attention and data while outsourcing most of the costs of their service to the public purse or, indeed, their users.

Terms and conditions are a way in which a company enters into a clear agreement with its users, who then “pay” for access with their attention and their data: two of the most valuable commodities in today’s digital society. I am very sympathetic to the noble Lord’s wish to reframe people, both adults and children, from a series of euphemisms that the sector employs—such as “users”, “community members”, “creators” or “participants”—to acknowledge their status as consumers who have rights and, in particular, the right to expect the product they use to be safe and for providers to be held accountable if it is not. I join the noble Lord in asserting that there are now six weeks before Third Reading. This is a very valuable suggestion that is worthy of government attention.

Amendment 253 in the name of the noble Lord, Lord Clement-Jones, puts forward a very strong recommendation of the pre-legislative committee. We were a bit bewildered and surprised that it was not taken up at the time, so I will be interested to hear what argument the Minister makes to exclude it, if indeed he does so. I say to him that I have already experienced the frustration of being bumped from one regulator to another. Although my time as an individual or the organisational time of a charity is minor in the picture we are discussing, it is costly in time and resources. I point to the time, resources and potential effectiveness of the regulatory regime. However well oiled and well funded the regulatory regime of the Online Safety Bill is, I do not think it will be as well oiled and well funded as those that it seeks to regulate.

I make it clear that I accept the arguments of not wanting to create a super-regulator or slow down or confuse existing regulators which each have their own responsibilities, but I feel that the noble Lord, Lord Clement-Jones, has approached this with more of a belt-and-braces approach rather than a whole realignment of regulators. He simply seeks to make it explicit that regulators can, should and do have a legal basis on which to work singularly or together when it suits them. As I indicated earlier, I cannot quite understand why that would not be desirable.

Finally, in what is truly a miscellaneous group, I will refer to the amendment in the name of my noble friend Lady Finlay. I support the intent of this amendment and sincerely hope that the Minister will be able to reassure us that this is already in the Bill and will be done by Ofcom under one duty or another. I hope that he will be able to point to something that includes this. I thank my noble friend for raising it, as it harks back to an amendment in Committee in my name that sought to establish that content deemed harmful in one format would be deemed harmful in all formats—whether synthetic, such as AI, the metaverse or augmented reality. As my noble friend alluded to, it also speaks to the debate we had last week in relation to the amendment from the noble Lord, Lord Clement-Jones, about provider content in the metaverse.

16:00
I do not think we have time to wait for the report that my noble friend seeks. This is the long-awaited Online Safety Bill. We have been warned by the inventors of neural networks and leaders in AI and alternate realities that we are at a crossroads between human and machine. It is incumbent on the Government to ensure that the Bill is fit not only for the past but for the future. In order to do that, they need to look at the definitions—as they did so admirably in Part 5—but also at some of the exceptions they have carved out so that they can say that the Bill truly ends the era of exceptionality in which harms online are treated differently from those offline. My view is that the amendment in the name of my noble friend Lady Finlay should not be necessary at this stage. But, if the Minister cannot confirm that it is already covered, perhaps he will indicate his willingness to accept the amendment.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I will make some arguments in favour of Amendment 191A, in the name of the noble Baroness, Lady Kidron, and inject some notes of caution around Amendment 186A.

On Amendment 191A, it has been my experience that people who frequently investigate something that has happened on online services do it well, and that well-formed requests are critical to making this work effectively. This was the case with law enforcement: when an individual police officer is investigating something online for the first time, they often ask the wrong questions. They do not understand what they can get and what they cannot get. It is like everything in life: the more you do it, the better you get at it.

Fortunately, in a sense, most coroners will only very occasionally have to deal with these awful circumstances where they need data related to the death of a child. At that point, they are going to be very dependent on Ofcom—which will be dealing with the companies day in and day out across a range of issues—for its expertise. Therefore, it makes absolute sense that Ofcom’s expertise should be distributed widely and that coroners—at the point where they need to access this information—should be able to rely on that. So Amendment 191A is very well intended and, from a practical point of view, very necessary if we are going to make this new system work as I know the noble Baroness, Lady Kidron, and I would like to see it work.

On Amendment 186A around consumer law, I can see the attraction of this, as well as some of the read-across from the United States. A lot of the enforcement against online platforms in the US takes place through the Federal Trade Commission precisely in this area of consumer law and looking at unfair and deceptive practices. I can see the attraction of seeking to align with European Union law, as the noble Lord, Lord Moylan, argued we should be doing with respect to consumer law. However, I think this would be much better dealt with in the context of the digital markets Bill and it would be a mistake to squeeze it in here. My reasons for this are about both process and substance.

In terms of process, we have not done the impact assessment on this. It is quite a major change, for two reasons. First, it could potentially have a huge impact in terms of legal costs and the way businesses will have to deal with that—although I know nobody is going to get too upset if the impact assessment says there will be a significant increase in legal costs for category 1 companies. However, we should at least flesh these things out when we are making regulations and have them in an impact assessment before going ahead and doing something that would have a material impact.

Secondly in process terms, there are some really interesting questions about the way this might affect the market. The consumer law we have does exclude services that are offered for free, because so much of consumer law is about saying, “If the goods are not delivered correctly, you get your money back”. With free services, we are clearly dealing with a different model, so the notion that we have a law that is geared towards making sure you either get the goods or you get the money may not be the best fit. To try to shoehorn in these free-at-the-point-of-use services may not be the best way to do it, even from a markets and consumer point of view. Taking our time to think about how to get this right would make sense.

More fundamentally, in terms of the substance, we need to recognise that, as a result of the Online Safety Bill, Ofcom will be requiring regulated services to rewrite their terms of service in quite a lot of detail. We see this throughout the Bill. We are going to have to do all sorts of things—we will debate other amendments in this area today—to make sure that their terms of service are conformant with what we want from them in this Bill. They are going to have to redo their complaints and redress mechanisms. All of this is going to have to change and Ofcom is going to be the regulator that tells them how to do it; that is what we are asking Ofcom to tell them to do.

My fundamental concern here, if we introduce another element, is that there is a whole different structure under consumer law where you might go to local trading standards or the CMA, or you might launch a private action. In many cases, this may overlap. The overlap is where consumer law states that goods must be provided with reasonable care and skill and in a reasonable time. That sounds great, but it is also what the Online Safety Bill is going to be doing. We do not want consumer law saying, “You need to write your terms of service this way and handle complaints this way”, and then Ofcom coming along and saying, “No, you must write your terms of service that way and handle complaints that way”. We will end up in a mess. So I just think that, from a practical point of view, we should be very focused in this Bill on getting all of this right from an Online Safety Bill point of view, and very cautious about introducing another element.

Perhaps one of the attractions of the consumer law point for those who support the amendment is that it says, “Your terms must be fair”. It is the US model; you cannot have unfair terms. Again, I can imagine a scenario in which somebody goes to court and tries to get the terms struck down because they are unfair but the platform says, “They’re the terms Ofcom told me to write. Sort this out, please, because Ofcom is saying I need to do this but the courts are now saying the thing I did was unfair because somebody feels that they were badly treated”.

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

Does the noble Lord accept that that is already a possibility? You can bring an action in contract law against them on the grounds that it is an unfair contract. This could happen already. It is as if the noble Lord is not aware that the possibility of individual action for breach of contract is already built into Clause 65. This measure simply supplements it.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I am certainly aware that it is there but, again, the noble Lord has just made the point himself: this supplements it. The intent of the amendment is to give consumers more rights under this additional piece of legislation; otherwise, why bother with the amendment at all? The noble Lord may be arguing against himself in saying that this is unnecessary and, at the same time, that we need to make the change. If we make the change, it is, in a sense, a material change to open the door to more claims being made under consumer law that terms are unfair. As I say, we may want this outcome to happen eventually, but I find it potentially conflicting to do it precisely at a time when we are getting Ofcom to intervene much more closely in setting those terms. I am simply arguing, “Let’s let that regime settle down”.

The net result and rational outcome—again, I am speaking to my noble friend’s Amendment 253 here—may be that other regulators end up deferring to Ofcom. If Ofcom is the primary regulator and we have told it, under the terms of the Online Safety Bill, “You must require platforms to operate in this way, handle complaints in this way and have terms that do these things, such as excluding particular forms of language and in effect outlawing them on platforms”, the other regulators will eventually end up deferring to it. All I am arguing is that, at this stage, it is premature to try to introduce a second, parallel route for people to seek changes to terms or different forms of redress, however tempting that may be. So I am suggesting a note of caution. It is not that we are starting from Ground Zero—people have routes to go forward today—but I worry about introducing something that I think people will see as material at this late stage, having not looked at the full impact of it and potentially running in conflict with everything else that we are trying to do in this legislation.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I will speak briefly on a couple of amendments and pick up from where the noble Lord, Lord Allan, just finished on Amendment 186A. I associate myself with all the comments that the noble Baroness, Lady Kidron, made on her Amendment 191A. As ever, she introduced the amendment so brilliantly that there is no need for me to add anything other than my wholehearted support.

I will briefly reference Amendment 253 from the noble Lord, Lord Clement-Jones. Both his amendment and my noble friend Lord Moylan’s point to one of the challenges about regulating the digital world, which is that it touches everything. We oscillate between wanting to compartmentalise the digital and recognising that it is interconnected to everything. That is the same challenge faced by every organisation that is trying to digitise: do you ring-fence or recognise that it touches everything? I am very supportive of the principles behind Amendment 253 precisely because, in the end, it does touch everything. It is hugely important that, even though this Bill and others still to come are creating an extraordinarily powerful single regulator in the form of Ofcom, we also recognise the interconnectivity of the regulatory landscape. The amendment is very well placed, and I hope my noble friend the Minister looks favourably on it and its heritage from the pre-legislative scrutiny committee.

I will briefly add my thoughts on Amendment 186A in this miscellaneous group. It feels very much as if we are having a Committee debate on this amendment, and I thank my noble friend Lord Moylan for introducing it. He raises a hugely important point, and I am incredibly sympathetic to the logic he set out.

In this area the digital world operates differently from the physical world, and we do not have the right balance at all between the powers of the big companies and consumer rights. I am completely with my noble friend in the spirit in which he introduced the amendment but, together with the noble Lord, Lord Allan, I think it would be better tackled in the Digital Markets, Competition and Consumers Bill, precisely because it is much broader than online safety. This fundamentally touches the issue of consumer rights in the digital world and I am worried that, if we are not careful, we will do something with the very best intentions that actually makes things slightly worse.

I worry that the terms and conditions of user-to-user services are incomprehensible to consumers today. Enshrining it as a contract in law might, in some cases, make it worse. Today, when user-to-user services have used our data for something, they are keen to tell us that we agreed to it because it was in their terms of service. My noble friend opens up a really important issue to which we should give proper attention when the Digital Markets, Competition and Consumers Bill arrives in the House. It is genuinely not too late to address that, as it is working its way through the Commons now. I thank my noble friend for introducing the amendment, because we should all have thought of the issue earlier, but it is much broader than online safety.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, even by previous standards, this is the most miscellaneous of miscellaneous groups. We have ranged very broadly. I will speak first to Amendment 191A from the noble Baroness, Lady Kidron, which was so well spoken to by her and by the noble Baroness, Lady Harding. It is common sense, and my noble friend Lord Allan, as ever, put his finger on it: it is not as if coroners are going to come across this every day of the week; they need this kind of guidance. The Minister has introduced his amendments on this, and we need to reduce those to an understandable code for coroners and bereaved parents. I defy anybody, apart from about three Members of this House, to describe in any detail how the information notices will interlock and operate. I could probably name those Members off the top of my head. That demonstrates why we need such a code of practice. It speaks for itself.

I am hugely sympathetic to Amendment 275A in the name of the noble Baroness, Lady Finlay, who asked a series of important questions. The Minister said at col. 1773 that he would follow up with further information on the responsibility of private providers for their content. This is a real, live issue. The noble Baroness, Lady Kidron, put it right: we hope fervently that the Bill covers the issue. I do not know how many debates about future-proofing we have had on the Bill but each time, including in that last debate, we have not quite been reassured enough that we are covering the metaverse and provider content in the way we should be. I hope that this time the Minister can give us definitive chapter and verse that will help to settle the horses, so to speak, because that is exactly what the very good amendment in the name of the noble Baroness, Lady Finlay, was about.

16:15
On the amendment in the name of the noble Lord, Lord Moylan, I think I am rather more in favour of it in principle than is my noble friend, although I do not think he was against it in principle, and I was probably on exactly the same page as far as process is concerned. The noble Baroness, Lady Harding, put her finger on it, because a right of action is highly desirable; in fact, the Joint Committee was extremely keen on having a right of action for those affected by social media, and it made an important recommendation. In a sense, that was not seen through to a sufficient extent. We believe that the Bill is an opportunity to reset the relationship between service providers and users. While we recognise the resource challenges both for individuals in accessing the courts and for the courts themselves, we think that the importance of issues in the Bill requires that users have a right of redress in the court. I am absolutely with the noble Lord but am not entirely sure about shoehorning that into the Bill at this stage, given that other digital services acquire data as well. This should not be covered just by the Online Safety Bill; I believe the DMCC Bill should cover it, and I very much look forward to supporting an amendment from the noble Lord, Lord Moylan, if he chooses to table it when the time comes.
On my Amendment 253, we have heard throughout debates on the Bill that the range of human and business activity covered online presents a complex map of potential harms. Some of them will fall into or be adjacent to the oversight of other regulators with domain-specific expertise. The relationship has to some extent been formalised through the Digital Regulation Cooperation Forum, which comprises Ofcom, the CMA, the ICO and the FCA, and I think we all support the creation of the DRCF. Ofcom already has a working relationship with the ASA, and of course with the Internet Watch Foundation—I was very pleased to hear from it that progress is being made on a memorandum of understanding with Ofcom, and I very much hope that continues to progress, as it has since Committee, because that kind of relationship that Ofcom and the IWF continue to build is really important.
Within that regulatory web, if you like, Ofcom will of course have the most relevant powers and expertise, and many regulators will look to it for help in tackling online safety issues. Effective public protection will be achieved through what might be described as regulatory interlock. To protect the public, Ofcom should be explicitly empowered to co-operate with others and to share information, and the Bill should, as much as it can, enable Ofcom to work with other regulators and share online safety information with them. Ofcom should also be able to bring the immense skills of other regulators into its own work. The Bill gives Ofcom the general ability to co-operate with overseas regulators, but it is largely silent on co-operation with UK ones. The Communications Act 2003 limits the UK regulators with which Ofcom can share information—it excludes the ICO, for instance—yet the Bill has a permissive approach to overseas regulators.
The Bill should extend co-operation and information sharing in respect of online safety to include regulators overseeing the offences in Schedule 7, the primary priority harms for children and the priority harms to adults. Elsewhere in regulation, it is noted that the Financial Conduct Authority has a general duty to co-operate, so there are precedents. As the noble Baroness, Lady Kidron, pointed out, the Joint Committee was extremely keen on this. It said:
“In taking on its responsibilities under the Bill, Ofcom will be working with a network of other regulators and third parties already working in the digital world. We recommend that the Bill provide a framework for how these bodies will work together including when and how they will share powers, take joint action, and conduct joint investigations”.
The logic is absolutely in favour of this. I very much hope that the Minister will be able to take it on board because it has been seen to be logical right from the word go, particularly by the Joint Committee. I think the Communications and Digital Committee also recommended it, so what is not to like?
Lord Stevenson of Balmacara (Lab)

My Lords, as others have said, this has been a very interesting tour d’horizon of some of the points in the Bill that we still need to resolve. I will not go over too much of the detail that has been raised because those points need a response from the Minister when he responds.

I will start with the use of “chairman” in several places throughout the Bill. We do not understand what is going on here. My noble friend Lady Merron wanted to deal with this but she unfortunately is not here, so I have been left holding the issue, and I wish to pursue it vigorously.

It is probably not well known but, in 2007, the Government decided that there ought to be changes in the drafting of our laws to make them as gender-neutral as possible. Since 2007, it has been customary practice to replace words that could be gender-specific with those which are not. The Drafting Guidance, which is issued by and should be followed by the Office of the Parliamentary Counsel, says that gender-neutral drafting requires

“avoiding gender-specific pronouns (such as ‘he’) for a person who is not necessarily of that gender”,

and avoiding gender-specific nouns

“that might appear to assume that a person of a particular gender will do a particular job or perform a particular role (eg ‘chairman’)”.

The guidance provides a further piece of information:

“The gender-specific noun most likely to be encountered is ‘chairman’. ‘Chair’ is now widely used in primary legislation as a substitute”,

and we should expect to see it. Why do we not see it in this Bill?

Lord Deben (Con)

My wife, who is chairman of a number of things, objects to “chair” as “furniturism”. She likes to be referred to as a person and not a thing.

Lord Stevenson of Balmacara (Lab)

I respect the noble Lord’s point. I did not make a specific proposal; I simply asked why the Bill was framed in circumstances that are not those required by the Office of the Parliamentary Counsel.

Moving on, Amendment 288A, which addresses the issue of multiple characteristics, is welcome. I am grateful to the Minister for it. However, it is a rather odd solution to what should be a very straightforward question. We have the amendment—which, as I said, we welcome—because it was pointed out that the new overarching objective for this Bill that has been brought forward by government amendment refers to issues affecting those who have a characteristic. It uses the word “characteristic” without qualification, although I think most of us who have been involved in these debates and discussions realise that this is an oblique reference to the Equality Act 2010 and that, although they are not set out in the Bill, the characteristics concerned are probably those that are protected under the Equality Act. I think the Minister has some problem with the Equality Act, because we have crossed swords on it before, but we will not go back into that.

In referencing “a characteristic”, which is perfectly proper, we did not realise—but it has been pointed out—that under the Interpretation Act it is necessary to recall that in government legislation when the singular is mentioned it includes the plural unless it is specifically excluded. So we can assume that when references are made to “a characteristic”, they do in fact mean “characteristics”. Therefore, by the same logic, when the Bill refers to a person as having “a characteristic”, it can also be assumed that the reference applies to their having more than one characteristic.

However, grateful as I am to the Minister for bringing forward these amendments, which we accept, this is not quite the point that we were trying to get across. I invite the Minister, when he comes to respond, to explain a little more about the logic behind what I will propose. We are fairly convinced—as I think are most people who have been involved in these discussions—that social media companies’ form of operation, providing the materials and service that we want, is gendered. I do not think there is any doubt about that; everybody who has spoken in this debate has at some stage pointed out that, in many cases, those with protected characteristics, and women and girls in particular, are often picked on and singled out. A pile-on—the phrase used to mean the amplification that comes with working on the internet—is a very serious concern. That may change; it may just be a feature of today’s world and one day be something that does not happen. However, at the moment, it is clearly the case that if you are in some way characterised by a protected characteristic, and have more than one of them, you tend to get more attention, aggravation and difficulty in your social media engagement. The evidence is so clear that we do not need to go into it.

The question we asked in Committee, and which we hoped we would get a response to, was whether we should always try to highlight the fact that where we are talking about people with more than one characteristic, it is the combination, not the mere plurality, that matters. Being female and Jewish, which has been discussed several times from the Dispatch Box by my noble friend Lady Merron and others, seems to be the sort of combination of characteristics which causes real difficulties on the internet for the people who have them. I use that only as one example; there are others.

If that is the case then it would have been nice to have seen that specifically picked up, and my original drafting of the amendment did that. However, we have accepted the Government’s amendment to create the new overarching objective, and I do not want to change it at this stage—we are past that debate. But I wonder whether the Minister, when he comes to respond, could perhaps as a grace note explain that he accepts the point that it is the doubling or tripling of the characteristics, not the plurality, that matters.

Moving back to the clauses that have been raised by others speaking in this debate, and who have made points that need to be responded to, I want to pick up on the point made by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, about the need for some form of engagement between domestic regulators if we are going to get the best possible solution to how the internet is regulated as we go forward. We have never said that there had to be a new super-regulator and we never intended that there should be powers taken to change the way in which we do this. However, some form of co-operation, other than informal co-operation, is almost certainly going to be necessary. We do not want to subtract from where we are in relation to how our current regulators operate—they seem to be working well—but we worry that the legal powers and support that might be required in order to do that are not yet in place or, if they are in place, are based on somewhat archaic and certainly not modern-day regulatory practice.

Is this something that the committees of the Houses might consider? Perhaps when we come to other amendments around this, that is something we might pick up, because I think it probably needs further consideration away from the Bill in order to get the best possible solution. That is particularly true given, as the noble Lord, Lord Clement-Jones, says, so many of these regulators will now have experience of working together and might be prepared to share that in evidence or in appearances before such a committee.

16:30
Finally, I refer to the very good discussion we have had about Amendment 186A, which was introduced by the noble Lord, Lord Moylan. Like many people who received his initial circulation of his draft amendment, I wondered why on earth I had not thought of it myself. It is a good and obvious move that we should think a little more about. It probably needs a lot more thought about the unintended consequences that might arise from it before we move forward, and I take the points made by the noble Lord, Lord Allan, about that, but I hope that the Minister will respond positively to it and that it is perhaps something we can pick up in future Bills.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, let me add to this miscellany by speaking to the government amendments that stand in my name as part of this group. The first is Amendment 288A, which we mentioned on the first group of amendments on Report because it relates to the new introductory clause, Clause 1, and responds to the points raised by the noble Lord, Lord Stevenson of Balmacara. I am very happy to say again that the Government recognise that people with multiple and combined characteristics suffer disproportionately online and are often at greater risk of harm. This amendment therefore adds a provision in the new interpretation clause, Clause 1, to put beyond doubt that all the references to people with “a certain characteristic” throughout the Bill include people with a combination of characteristics. We had a good debate about the Interpretation Act 1978, which sets that out, but we are happy to set it out clearly here.

In his Amendment 186A, my noble friend Lord Moylan seeks to clarify a broader issue relating to consumer rights and online platforms. He got some general support—certainly gratitude—for raising this issue, although there was a bit of a Committee-style airing of it and a mixture of views on whether this is the right way or the right place. The amendment seeks to make it clear that certain protections for consumers in the Consumer Rights Act 2015 apply when people use online services and do not pay for them but rather give up their personal data in exchange. The Government are aware that the application of the law in that area is not always clear in relation to free digital services and, like many noble Lords, express our gratitude to my noble friend for highlighting the issue through his amendment.

We do not think that the Bill is the right vehicle for attempting to provide clarification on this point, however. We share some of the cautions that the noble Lord, Lord Allan of Hallam, raised and agree with my noble friend Lady Harding of Winscombe that this is part of a broader question about consumer rights online beyond the services with which the Bill is principally concerned. It could be preferable that the principle that my noble friend Lord Moylan seeks to establish through his amendment should apply more widely than merely to category 1 services regulated under the Bill. I assure him that the Bill will create a number of duties on providers which will benefit users and clarify that they have existing rights of action in the courts. We discussed these new protections in depth in Committee and earlier on Report. He drew attention to Clause 65(1), which puts a requirement on all services, not just category 1 services, to include clear and accessible provisions in their terms of service informing users about their right to bring a claim for breach of contract. Therefore, while we are grateful, we agree with noble Lords who suggested that this is a debate for another day and another Bill.

Amendment 191A from the noble Baroness, Lady Kidron, would require Ofcom to issue guidance for coroners and procurators fiscal to aid them in submitting requests to Ofcom to exercise its power to obtain information from providers about the use of a service by a deceased child. While I am sympathetic to her intention, I do not think that her amendment is the right answer. It would be inappropriate for an agency of the Executive to issue guidance to a branch of the judiciary. As I explained in Committee, it is for the Chief Coroner to provide detailed guidance to coroners. This is written to assist coroners with the law and their legal duties and to provide commentary and advice on policy and practice.

The amendment tabled by the noble Baroness cuts across the role of the Chief Coroner and risks compromising the judicial independence of the coroner, as set out in the Constitutional Reform Act 2005. As she is aware, the Chief Coroner has agreed to consider issuing guidance to coroners on social media and to consider the issues covered in the Bill. He has also agreed to explore whether coroners would benefit from additional training, with the offer of consultation with experts including Ofcom and the Information Commissioner’s Office. I suggest that the better approach would be for Ofcom and the Information Commissioner’s Office to support the Chief Coroner in his consideration of these issues where he would find that helpful.

I agree with the noble Lord, Lord Allan, that coroners must have access to online safety expertise given the technical and fast-moving nature of this sector. As we have discussed previously, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest following a request from a coroner which will provide that expertise. I hope that this reassures the noble Baroness.

Baroness Kidron (CB)

I understand the report on a specific death, which is very welcome and part of the regime as we all see it. The very long list of things that the coroner may not know that they do not know, as I set out in the amendment, is the issue which I and other noble Lords are concerned about. If the Government could find a way to make that possible, I would be very grateful.

Lord Parkinson of Whitley Bay (Con)

We are keen to ensure that coroners have access to the information and expertise that they need, while respecting the independence of the judicial process to decide what they do not know and would like to know more about, and the role of the Chief Coroner in that. It is a point that I have discussed a lot with the noble Baroness and with my noble friend Lady Newlove in her former role as Victims’ Commissioner. I am very happy to continue doing so because it is important that there is access to that.

The noble Lord, Lord Stevenson, spoke to the amendments tabled by the noble Baroness, Lady Merron, about supposedly gendered language in relation to Clauses 141 and 157. As I made clear in Committee, I appreciate the intention—as does Lady Deben—of making clear that a person of either sex can perform the role of chairman, just as they can perform the role of ombudsman. We have discussed in Committee the semantic point there. The Government have used “chairman” here to be consistent with terminology in the Office of Communications Act 2002. I appreciate that this predates the Written Ministerial Statement which the noble Lord cited, but that itself made clear that the Government at the time recognised that in practice, parliamentary counsel would need to adopt a flexible approach to this change—for example, in at least some of the cases where existing legislation originally drafted in the former style is being amended.

The noble Lord may be aware of a further Written Ministerial Statement, made on 23 May last year, following our debates on gendered language on another Bill, when the then Lord President of the Council and Leader of the House of Commons said that the Office of the Parliamentary Counsel would update its drafting guidance in light of that. That guidance is still forthcoming. However, importantly, the term here will have no bearing on Ofcom’s decision-making on who would chair the advisory committees. It must establish that this could indeed be a person of either sex.

Amendment 253 seeks to enable co-operation, particularly via information-sharing, between Ofcom and other regulators within the UK. I reassure noble Lords that Section 393 of the Communications Act 2003 already includes provisions for sharing information between Ofcom and other regulators in the UK.

As has been noted, Ofcom already co-operates effectively with other domestic regulators. That has been strengthened by the establishment of the Digital Regulation Co-operation Forum. By promoting greater coherence, the forum helps to resolve potential tensions, offering clarity for people and the industry. It ensures collaborative work across areas of common interest to address complex problems. Its outputs have already delivered real and wide-ranging impacts, including landmark policy statements clarifying the interactions between digital regulatory regimes, research into cross-cutting issues, and horizon-scanning activities on new regulatory challenges. We will continue to assess how best to support collaboration between digital regulators and to ensure that their approaches are joined up. We therefore do not think that Amendment 253 is necessary.

Lord Clement-Jones (LD)

My Lords, the Minister has not stated that there is a duty to collaborate. Is he saying that that is, in fact, the case in practice?

Lord Parkinson of Whitley Bay (Con)

Yes, there is a duty, and the law should be followed. I am not sure whether the noble Lord is suggesting that it is not—

Lord Clement-Jones (LD)

Is there a duty to collaborate between regulators?

Lord Parkinson of Whitley Bay (Con)

I am not sure that I follow the noble Lord’s question, but perhaps—

Lord Clement-Jones (LD)

My Lords, the Minister is saying that, in practice, there is a kind of collaboration between regulators and that there is a power under the Communications Act, but is he saying that there is any kind of duty on regulators to collaborate?

Lord Parkinson of Whitley Bay (Con)

If I may, I will write to the noble Lord setting that out; he has lost me with his question. We believe, as I think he said, that the forum has added to the collaboration in this important area.

The noble Baroness, Lady Finlay, raised important questions about avatars and virtual characters. The Bill broadly defines “content” as

“anything communicated by means of an internet service”,

meaning that it already captures the various ways through which users may encounter content. In the metaverse, this could therefore include things such as avatars or characters created by users. As part of their risk assessments, providers of user-to-user services will be required to consider more than the risk arising from user-generated content alone, including how the design and operation of their services, their functionality and the ways in which they are used might increase the risk of harm to children and the presence of illegal content. A user-to-user service will need to consider any feature which enables interaction of any description between users of the service when carrying out its risk assessments.

The Bill is focused on user-to-user and search services, as there is significant evidence to support the case for regulation based on the risk of harm to users and the current lack of regulatory and other accountability in this area. Hosting, sharing and the discovery of user-generated content and activity give rise to a range of online harms, which is why we have focused on those services. The Bill does not regulate content published by user-to-user service providers themselves; instead, providers are already liable for the content that they publish on their services themselves, and the criminal law is the most appropriate mechanism for dealing with services which publish illegal provider content.

The noble Baroness’s Amendment 275A seeks to require Ofcom to produce a wide-ranging report of behaviour facilitated by emerging technologies. As we discussed in Committee, the Government of course agree that Ofcom needs continually to assess future risks and the capacity of emerging technologies to cause harm. That is why the Bill already contains provisions which allow it to carry out broad horizon scanning, such as its extensive powers to gather information, to commission skilled persons’ reports and to require providers to produce transparency reports. Ofcom has already indicated that it plans to research emerging technologies, and the Bill will require it to update its risk assessments, risk profiles and codes of practice with the outcomes of this research where relevant.

As we touched on in Committee, Clause 56 requires regular reviews by Ofcom into the incidence of content that is harmful to children, and whether there should be changes to regulations setting out the kinds of content that are harmful to children. In addition, Clause 143 mandates that Ofcom should investigate users’ experience of regulated services, which are likely to cover user interactions in virtual spaces, such as the metaverse and those involving content generated by artificial intelligence.

16:45
I reiterate that platforms on which user-generated interactions take place are in scope of the Bill. That includes the metaverse as well as other extended reality services which have been raised by a number of noble Lords as an area of concern. In Committee, the noble Baroness, Lady Finlay, tabled a similar amendment in relation to real-world physical crimes, such as sexual assault. Although her present amendment pertains to the regulatory framework, I reassure her and all noble Lords that criminal offences are drafted so as to avoid, where possible, specifying any medium or technology through which they might be committed so that they too are future-proofed. Many current criminal offences can therefore be committed and prosecuted regardless of whether the behaviour is conducted online or offline.
The report that the noble Baroness seeks through this amendment would be a broad expansion of Ofcom’s oversight responsibilities to services that are not in scope of the Bill. As a result, I am afraid I cannot commit to taking that forward in relation to this Bill but I am very happy to keep discussing the issue with her more broadly, as is my noble friend Lord Camrose, as a Minister at the Department for Science, Innovation and Technology. I hope that provides her with sufficient reassurance to not press her amendment today.
Baroness Finlay of Llandaff (CB)

I am most grateful to the Minister; perhaps I could just check something he said. There was a great deal of detail and I was trying to capture it. On the question of harms to children, we all understand that the harms to children are viewed more extensively than harms to others, but I wondered: what counts as unregulated services? The Minister was talking about regulated services. What happens if there is machine-generated content which is not generated by any user but by some random codes that are developed and then randomly incite problematic behaviours?

Lord Parkinson of Whitley Bay (Con)

I am happy to provide further detail in writing and to reiterate the points I have made as it is rather technical. Content that is published by providers of user-to-user services themselves is not regulated by the Bill because providers are liable for the content they publish on the services themselves. Of course, that does not apply to pornography, which we know poses a particular risk to children online and is regulated through Part 5 of the Bill. I will set out in writing, I hope more clearly, for the noble Baroness what is in scope to reassure her about the way the Bill addresses the harms that she has rightly raised.

Lord Clement-Jones (LD)

Will the Minister copy other Members in?

Lord Moylan (Con)

My Lords, this has indeed been a wide-ranging and miscellaneous debate. I hope that since we are considering the Bill on Report noble Lords will forgive me if I do not endeavour to summarise all the different speeches and confine myself to one or two points.

The first is to thank the noble Baroness, Lady Kidron, for her support for my amendment but also to say that having heard her argument in favour of her Amendment 191A, I think the difference between us is entirely semantic. Had she worded it so as to say that Ofcom should be under a duty to offer advice to the Chief Coroner, as opposed to guidance to coroners, I would have been very much happier with it. Guidance issued under statute has to carry very considerable weight and, as my noble friend the Minister said, there is a real danger in that case of an arm of the Executive, if you like, or a creature of Parliament—however one wants to regard Ofcom—interfering in the independence of the judiciary. Had she said “advice to the Chief Coroner and whoever is the appropriate officer in Scotland”, that would have been something I could have given wholehearted support to. I hope she will forgive me for raising that quibble at the outset, but I think it is a quibble rather than a substantial disagreement.

On my own amendment, I simply say that I am grateful to my noble friend for the brevity and economy with which he disposed of it. He was of course assisted in that by the remarks and arguments made by many other noble Lords in the House as they expressed their support for it in principle.

I think there is a degree of confusion about what the Bill is doing. There seemed to be a sense that somehow the amendment was giving individuals the right to bring actions in the courts against providers, but of course that already happens because that right exists and is enshrined in Clause 65. All the amendment would do is give some balance so that consumers actually had some protections in what is normally, in essence, an unequal contest, which is trying to ensure that a large company enforces the terms and contracts that it has written.

In particular, my amendment would give, as I think noble Lords know, the right to demand repeat performance—that is, in essence, the right to put things right, not monetary compensation—and it would frustrate any attempts by providers, in drafting their own terms and conditions, to limit their own liability. That is of course what they seek to do but the Consumer Rights Act frustrates them in their ability to do so.

We will say no more about that for now. With that, I beg leave to withdraw my amendment.

Amendment 186A withdrawn.
Clause 65: Further duties about terms of service
Amendment 187
Moved by
187: Clause 65, page 62, line 18, leave out from “service” to “down” in line 20 and insert “indicate (in whatever words) that the presence of a particular kind of regulated user-generated content is prohibited on the service, the provider takes”
Member’s explanatory statement
This amendment makes a change to a provision about what the terms of service of a Category 1 service say. The effect of the change is to cover a wider range of ways in which a term of service might indicate that a certain kind of content is not allowed on the service.
Lord Parkinson of Whitley Bay (Con)

My Lords, transparency and accountability are at the heart of the regulatory framework that the Bill seeks to establish. It is vital that Ofcom has the powers it needs to require companies to publish online safety information and to scrutinise their systems and processes, particularly their algorithms. The Government agree about the importance of improving data sharing with independent researchers while recognising the nascent evidence base and the complexities of this issue, which we explored in Committee. We are pleased to be bringing forward a number of amendments to strengthen platforms’ transparency, which confer on Ofcom new powers to assess how providers’ algorithms work, which accelerate the development of the evidence base regarding researchers’ access to information and which require Ofcom to produce guidance on this issue.

Amendment 187 in my name makes changes to Clause 65 on category 1 providers’ duties to create clear and accessible terms of service and apply them consistently and transparently. The amendment tightens the clause to ensure that all the terms through which a provider might indicate that a certain kind of content is not allowed on its service are captured by these duties.

Amendment 252G is a drafting change, removing a redundant paragraph from the Bill in relation to exceptions to the legislative definition of an enforceable requirement in Schedule 12.

In relation to transparency, government Amendments 195, 196, 198 and 199 expand the types of information that Ofcom can require category 1, 2A and 2B providers to publish in their transparency reports. With thanks to the noble Lord, Lord Stevenson of Balmacara, for his engagement on this issue, we are pleased to table these amendments, which will allow Ofcom to require providers to publish information relating to the formulation, development and scope of user-to-user service providers’ terms of service and search service providers’ public statements of policies and procedures. This is in addition to the existing transparency provision regarding their application.

Amendments 196 and 199 would enable Ofcom to require providers to publish more information in relation to algorithms, specifically information about the design and operation of algorithms that affect the display, promotion, restriction, discovery or recommendation of content subject to the duties in the Bill. These changes will enable greater public scrutiny of providers’ terms of service and their algorithms, providing valuable information to users about the platforms that they are using.

As well as publicly holding platforms to account, the regulator must be able to get under the bonnet and scrutinise the algorithms’ functionalities and the other systems and processes that they use. Empirical tests are a standard method for understanding the performance of an algorithmic system. They involve taking a test data set, running it through an algorithmic system and observing the output. These tests may be relevant for assessing the efficacy and wider impacts of content moderation technology, age-verification systems and recommender systems.

Government Amendments 247A, 250A, 252A, 252B, 252C, 252D, 252E and 252F will ensure that Ofcom has the powers to enable it to direct and observe such tests remotely. This will significantly bolster Ofcom’s ability to assess how a provider’s algorithms work, and therefore to assess its compliance with the duties in the Bill. I understand that certain technology companies have voiced some concerns about these powers, but I reassure your Lordships that they are necessary and proportionate.

The powers will be subject to a number of safeguards. First, they are limited to viewing information. Ofcom will be unable to remotely access or interfere with the service for any other purpose when exercising the power. These tests would be performed offline, meaning that they would not affect the services’ provision or the experience of users. Assessing systems, processes, features and functionalities is the focus of the powers. As such, individual user data and content are unlikely to be the focus of any remote access to view information.

Additionally, the power can be used only where it is proportionate to use in the exercise of Ofcom’s functions—for example, when investigating whether a regulated service has complied with relevant safety duties. A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was unlawful. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.

The Bill contains no restriction on services making the existence and detail of the information notice public. Should a regulated service wish to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. In addition, the amendments place no restriction on the use of this power being made visible to members of the public through a request, such as one under the Freedom of Information Act—noting that under Section 393 of the Communications Act, Ofcom will not be able to disclose information it has obtained through its exercise of these powers without the provider’s consent, unless permitted for specific, defined purposes. These powers are necessary and proportionate and will ensure that Ofcom has the tools to understand features and functionalities and the risks associated with them, and therefore the tools to assess companies’ compliance with the Bill.

Finally, I turn to researchers’ access to data. We recognise the valuable work of researchers in improving our collective understanding of the issues we have debated throughout our scrutiny of the Bill. However, we are also aware that we need to develop the evidence base to ensure that any sharing of sensitive information between companies and researchers can be done safely and securely. To this end, we are pleased to table government Amendments 272B, 272C and 272D.

Government Amendment 272B would require Ofcom to publish its report into researcher access to information within 18 months, rather than two years. This report will provide the evidence base for government Amendments 272C and 272D, which would require Ofcom to publish guidance on this issue. This will provide valuable, evidence-based guidance on how to improve access for researchers safely and securely.

That said, we understand the calls for further action in this area. The Government will explore this issue further and report back to your Lordships’ House on whether further measures to support researchers’ access to data are required—and if so, whether they could be implemented through other legislation, such as the Data Protection and Digital Information Bill. I beg to move.

Lord Allan of Hallam (LD)

My Lords, Amendment 247B in my name was triggered by government Amendment 247A, which the Minister just introduced. I want to explain it, because the government amendment is quite late—it has arrived on Report—so we need to look in some detail at what the Government have proposed. The phrasing that has caused so much concern, which the Minister has acknowledged, is that Ofcom will be able to

“remotely access the service provided by the person”.

It is those words—“remotely access”—which are trigger words for anyone who lived through the Snowden disclosures, where everyone was so concerned about remote access by government agencies to precisely the same services we are talking about today: social media services.

17:00
So I hope the House will forgive me for teasing out why this is important and why we need extensive safeguards. This is non-trivial. If you are out there running a service, the idea of a government agency having remote access to your systems is a big deal, and the fact that this has come so late compounds that fear because it feels like it is being snuck in.
Ofcom, as I am sure it will remind us, is independent, but in the last debate it was referred to as part of the UK Executive. We cannot have it both ways. I see the chairman of Ofcom shaking his head at that, but we have just had a debate at the heart of which was the notion that this part of the UK Executive would give guidance to part of the UK judiciary. What is sauce for the goose is sauce for the gander. Here, the concern is that Ofcom would be seen as part of the UK Executive having remote access to social media services run by independent companies; I think your Lordships can see why that triggers things.
We need to establish in this debate that what the Government have in mind for Ofcom—this independent regulator—is utterly different from what might be the case with, for example, a security service under the terms of investigatory powers legislation. On the face of it, it looks similar, so it is important that, if it is different as the Minister has started, and I think will continue, to argue, we establish exactly why it is different and how we can be confident that is different. Amendment 247B in my name and that of my noble friend was intended to be helpful to the Government as one way of trying to establish why this is different, by placing some limitations on the Bill.
There are two risks inherent in this notion of remote access. The first is that Ofcom is overintrusive. It has an oversight role, but we are not expecting Ofcom to run our social media services; we expect it to oversee social media services that run themselves. Clearly, remote access, if used in an overbearing way, could be excessively intrusive in relation to those services being able to do what they do. In one version of remote access, which I think the Minister has tried to tease out, it is an offline exercise done occasionally to check something in a quasi-academic way; in another form, Ofcom sits there with a dashboard of what is going on in these systems, just as the Government like to do in their own public services with health and other things. Ofcom with a dashboard looking at what is happening in real time is quite different, and I think would be seen as overbearing and excessively intrusive. I hope the Minister will be able to provide further assurances that that is not what they have in mind.
The second risk is that the access is used for purposes other than simply Ofcom’s purposes. I am certainly not a conspiracy theorist and, perhaps unusually in my community of tech people, I quite admire the people who have only first names and live in Cheltenham, because I think what they do does keep us safe. That is what we pay them to do: to be creative and find creative ways to access data under lawful authority, et cetera, fully respecting human rights. I have confidence that those I have met do that and they do a great job; but we pay them to be creative and find access to data, not to put up with barriers. A spy is gonna spy. If they know that there is a form of access to data, of course, their job is to look at whether that would be useful to them.
My understanding—not as a conspiracy theory but as a matter of fact as to how the law works—is that, under investigatory powers rules, they can issue secret warrants, appropriately signed off, to pretty much anyone to access data. The recipients of those warrants have to execute them and, under penalty of prosecution themselves, are not able to tell anyone they received the warrant. Ofcom is not exempt from that. That is a fact, and we should recognise it; so, were Ofcom to receive an appropriate warrant for data, my understanding is that it would not have a way to say no and would not even be able to tell us about it. The best way to protect against that—to protect against temptation for James from Cheltenham, who is doing his job—is to make sure that remote access does not include anything that would be remotely useful to the security services. The way that we will be able to understand that is through transparency.
The Minister began his comments by saying that transparency and accountability were critical, and that maxim also applies here. We also want to protect against Ofcom’s own overreach and against any downstream use of that data. It is essential, therefore, that we understand in quite a lot of detail exactly what this remote access does and does not entail, so we can make our minds up about whether this piece is being used in an appropriate way.
I hope that the Minister can build on assurances which he very helpfully started to give at the beginning of the debate about this information notice process. He said that there was nothing secret about the information notices. Again, I hope that we can reinforce that any platform that is concerned that remote access that it is being asked to provide is inappropriate can tell us all about it and, as the Minister said, challenge that. I hope also that individual complainants and the harshest critics of the Government and of the security services—and a lot of people in this world worry about these things who, when they read this debate or look at the amendment, will assume the worst—can see exactly what remote access has and has not been made available. Then also I hope that, as individual users—because it is all about our privacy, as social media users of one form or another—we will know that, when we hand our data over to the social media service, which correctly under the terms of this legislation is required to give Ofcom access and keep us safe, we will know exactly what that access entails and that it does not go further than we set out in the legislation.
The transparency piece is critical. Can the Minister say that the information notices in relation to remote access will never be withheld? It is an utterly different world from the investigatory powers world, where there are good reasons where things have to be kept secret. If the Minister can say that in that world nothing is secret about the fact of remote access, and if anybody who has concerns can get the information that they need to understand whether those concerns are genuine, or whether something much more benign is happening, that would be extremely helpful. The Minister mentioned judicial review by platforms. I get that but, if a platform feels that Ofcom, the regulator, is behaving in an overbearing way, I remain a little concerned that judicial review is quite a slow and painful process. As I understand it, it is more about whether it was legally correct to give the order than perhaps whether the substance of the order was appropriate.
We still need to know that the checks and balances are in place. If Ofcom, under a future leadership—I am sure not under its current leadership—were to take it upon itself to want to set up a dashboard to look at what was happening in every social media company in real time, and were taken by that spirit of madness at some future date, I hope that the companies would be able to raise concerns about that, because it is not what we intend to happen in this Bill. I hope that they would be able to do that in a more straightforward process than in a lengthy judicial review.
The Minister has a clear idea of the kind of reassurances that we are looking for. He teased out some of them in his opening comments, and I hope that he can make them even more strongly in his closing remarks.
Baroness Fox of Buckley (Non-Afl)

My Lords, the noble Lord, Lord Allan of Hallam, hinted at the fact that there have been a plethora of government amendments on Report and, to be honest, it has been quite hard fully to digest most of them, let alone scrutinise them. I appreciate that the vast majority have been drawn up with opposition Lords, who might have found it a bit easier. But some have snuck in and, in that context, I want to raise some problems with the amendments in this group, which are important. I, too, am especially worried about that government amendment on facilitating remote access to services and equipment used by services. I am really grateful to the noble Lords, Lord Allan of Hallam and Lord Clement-Jones, for tabling Amendment 247B, because I did not know what to do—and they did it.

The biggest problem that I had when I originally read this provision was that facilitating remote access to services, and to as yet undefined equipment used by a service, seems like a very big decision, and potentially disproportionate. It certainly has the potential for regulatory overreach, and it creates real risks around privacy. It feels as though it has not even been flagged up strongly enough by the Government with regard to what it could mean.

I listened to what the Minister said, but I still do not fully understand why this is necessary. Have the Government considered the privacy and security implications that have already been discussed? Through Amendment 252A, the Government now have the power to enter premises for inspection—it rather feels as if there is the potential for raids, but I will put that to one side. They can go in, order an audit and so on. Remote access as a preliminary way to gather information seems heavy-handed. Why not leave it as the very last thing to do in a dialogue between Ofcom and a given platform? We have yet to hear a proper justification of why Ofcom would need this as a first-order thing to do.

The Bill does not define exactly what

“equipment used by the service”

means. Does it extend to employees’ laptops and phones? If it extends to external data centres, have the Government assessed the practicalities and security impact of that and the substantial security implications, as have been explained, for the services, the data centre providers and those of us whose data they hold?

I am also concerned that this will force companies to think very hard about the internal privacy and security controls needed to deal with the possibility of this, and that it will place a disproportionate burden on smaller and mid-sized businesses that do not have the resources available to the biggest technology companies. I keep raising this because in other parts of government there is a constant attempt to say that the UK will be the centre of technological innovation and that we will be a powerhouse in new technologies, yet I am concerned that so much of the Bill could damage that innovation. That is worth considering.

It seems to me that Amendment 252A on the power to observe at the premises ignores decentralised projects and services—the very kind of services that can revolutionise social media in a positive way. Not every service is like Facebook, but this amendment misses that point. For example, you will not be able to walk into the premises of the UK-based Matrix, the provider of the encrypted chat service Element that allows users to host their own service. Similarly, the non-profit Mastodon claims to be the largest decentralised social network on the internet and to be built on open-web standards precisely because it does not want to be bought or owned by a billionaire. So many of these amendments seem not to take those issues into account.

I also have a slight concern on researcher access to data. When we discussed this in Committee, the tone was very much—as it is in these amendments now—that these special researchers need to be able to find out what is going on in these big, bad tech companies that are trying to hide away dangerous information from us. Although we are trying to ensure that there is protection from harms, we do not want to demonise the companies so much that, every time they raise privacy issues or say, “We will provide data but you can’t access it remotely” or “We want to be the ones deciding which researchers are allowed to look at our data”, we assume that they are always up to no good. That sends the wrong message if we are to be a tech-innovative country or if there is to be any working together.

17:15
My final point is to be a bit more positive. I am very keen on the points made by the Minister on the importance of transparency in algorithms, particularly in Amendments 196 and 199. This raises an important point. These amendments are intended to mean that providers of user-to-user services and search services would have to include in their transparency reports details about algorithms, so that we can see how they work, and they particularly relate to illegal content and content that is harmful to children. I should like that to be understood more broadly, because for me there is constant tension where people do not know what the algorithms are doing. When content is removed, deboosted, or whatever, they do not know why. More transparency there would be positive.
The Minister knows this, because I have written to him on the subject, but many women, for example, are regularly being banned from social media for speaking out on sex-based rights, and gender-critical accounts are constantly telling me and are discussing among themselves that they have been shadow banned: that the algorithms are not allowing them to get their points over. This is, it is alleged, because of the legacy of trans activists controlling the algorithms.
Following on from the point of the noble Lord, Lord Allan of Hallam, there is always a danger here of people being conspiratorial, paranoid and thinking it is the algorithms. I made the point in an earlier discussion that sometimes you might just put up a boring post and no one is interested, but you imagine someone behind the scenes. But we know that Facebook continues to delete posts that state that men cannot be women, for example.
I would like this to be demystified, so the more Ofcom can ask the companies to demystify their algorithmic decisions and the more users can be empowered to know about it, the better for all of us. That is the positive bit of the amendments that I like.
Lord Bethell (Con)

My Lords, the business of the internet is data. Whether it is a retail business, a media business or any other kind of business, the internet is all about data. The chiefs of our internet companies know more about noble Lords than anyone else—more than any government agency, your doctor and almost anyone—because the number of data points that big internet companies have on people is absolutely enormous, and they use them to very great effect.

Some of those effects are entirely benign. I completely endorse what the noble Baroness, Lady Fox, said. As a champion of innovation and business, I totally recognise the good that is done by the world’s internet companies to make our lives richer, create jobs and improve the world, but some of what they do is not good. Either inadvertently or by being passive enablers of harm, internet companies have been responsible for huge societal harms. I do not want to go through the full list, but when I think about the mental health of our teenagers, the extremism in our politics, the availability of harmful information to terrorists and what have you, there is a long catalogue of harms to which internet companies have contributed. We would be naive if we did not recognise that.

However, almost uniquely among commercial businesses, internet companies guard access to that data incredibly jealously. They will not let you peek in and share their insights. I know from my experience in the health field that we work very closely with the pharmaceutical industry—there is a whole programme of pharmacovigilance that any pharma company has to participate in in order to explain, measure and justify the benefits and disbenefits of its medicines. We have no programme similar to pharmacovigilance for the tech industry. Instead, we are completely blind. Policy makers, the police and citizens are flying blind when it comes to the data that is held on us on both an individual and a demographic basis. That is extremely unusual.

That is why I really welcome my noble friend’s amendments that give Ofcom what seems to me to be extremely proportionate and thoughtful powers in order to look into this data, because without it, we do not know what is going on in this incredibly important part of our lives.

The role that researchers, including academic, civil society and campaigning researchers, play in helping Ofcom, policymakers and politicians to arrive at sensible, thoughtful and proportionate policy is absolutely critical. I pay enormous tribute to them; I am grateful to those noble Lords who have also done so. I am extremely grateful to my noble friend the Minister for his amendments on this subject, Amendments 272B and 272C, which address the question of giving researchers better access to some of this data. They would reduce the timeline for the review on data from 24 months to 18 months, which would be extremely helpful, and would change “may” to “must”, which represents an emphatic commitment to the outcome of this review.

However, issues remain around the question of granting access to data for researchers. What happens to the insights from the promised review once it is delivered? Where are the powers to deliver the review’s recommendations? That gap is not currently served by the government amendments, which is why I and the noble Lord, Lord Clement-Jones, have tabled Amendments 237ZA, 237DB, 262AA and 272AB. Their purpose is to put in the Bill reasonable, proportionate powers to bring access to data for researchers along the lines that the research review will recommend.

The feelings on this matter are extremely strong because we all recognise the value here. We are concerned that any delay may completely undermine this sector. As we debated in Committee, there is a substantial and valuable UK sector in this research area that is likely to move lock, stock and barrel to other countries where these kinds of powers may be in place; for instance, in EU or US legislation. The absence of these powers will, I think, leave Britain in the dark and competitively behind other countries, which is why I want to push the Minister hard on these amendments. I am grateful for his insight that this matter is something that the Government may look to in future Bills, but those Bills are far off. I would like to hear from him what more he could do to try to smooth the journey from this Bill and this review to any future legislation that comes through this House in order to ensure that this important gap is closed.

Baroness Fraser of Craigmaddie (Con)

My Lords, Amendments 270 and 272 are in my name; I thank the noble Lord, Lord Stevenson of Balmacara, for adding his name to them. They are the least controversial amendments in this group, I think. They are really simple. Amendment 270 would require Ofcom’s research about online interests and users’ experiences of regulated services under Clause 143 to be broken down by nation, while Amendment 272 relates to Clause 147 and would require Ofcom’s transparency reports also to be broken down in a nation-specific way.

These amendments follow on from our debates on devolution in Committee. Both seek to ensure that there is analysis of users’ online experiences in the different nations of the UK, which I continue to believe is essential to ensuring that the Bill works for the whole of the UK and is both future-proofed—a word we have all used lots—and able to adapt to different developments across each of the four nations. I have three reasons why I think these things are important. The first concerns the interplay between reserved and devolved matters. The second concerns the legal differences that already exist across the UK. The third concerns the role of Ofcom.

In his much-appreciated email to me last week, the Minister rightly highlighted that internet services are a reserved matter and I absolutely do not wish to impose different standards of regulation across the UK. Regarding priority offences, I completely support the Government’s stance that service providers must treat any content as priority illegal content where it amounts to a criminal offence anywhere in the UK regardless of where that act may have taken place or where the user is. However, my amendments are not about regulation; they are about research and transparency reporting, enabling us to understand the experience across the UK and to collect data—which we have just heard, so powerfully, will be more important as we continue.

I am afraid that leaving it to Ofcom’s discretion to understand the differences in the online experiences across the four nations over time is not quite good enough. Many of the matters we are dealing with in the online safety space—such as children, justice, police and education—are devolved. Government policy-making in devolved areas will increasingly rely on data about online behaviours, harms and outcomes. These days, I cannot imagine creating any kind of public policy without understanding the online dimension. There are areas where either the community experience and/or the policy approach is markedly different across the nations—take drug abuse, for example. No data means uninformed policy-making or flying blind, as my noble friend Lord Bethell has just said. But how easy will it be for the devolved nations to get this information if we do not specify it in the Bill?

In many of the debates, we have already heard of the legal differences across the four nations, and I am extremely grateful to the noble and learned Lord, Lord Hope of Craighead, who is not in his place, the noble Lord, Lord Stevenson of Balmacara, and the Minister for supporting my amendment last week when I could not be here. I am terribly sorry. I was sitting next to the noble Viscount, Lord Camrose, at the time. The amendment was to ensure that there is a legal definition of “freedom of expression” in the Bill that can be understood by devolved Administrations across the UK.

The more I look at this landscape, the more challenges arise. The creation of legislation around intimate abuse images is a good example. The original English legislation was focused on addressing the abusive sharing of intimate images after a relationship breakdown. It required the sharing to have been committed with the intent to cause harm, which has a very easy defence: “I did not mean to cause any harm”. The Scottish legislation, drafted slightly later, softened this to an intent to cause harm or being reckless as to whether harm was caused, which is a bit better because you do not need to prove intent. Now the English version is going to be updated in the Bill to create an offence simply by sharing, which is even better.

Other differences in legislation have been highlighted, such as on deepfakes and upskirting. On the first day of Report, the noble Baroness, Lady Kennedy of The Shaws, highlighted a difference in the way cyberflashing offences are understood in Northern Ireland. So the issue is nuanced, and the Government’s responses change as we learn about harmful behaviours in practice. Over time, we gradually see these offences refined as we learn more about how technology is used to abuse. The question really is: what will such offences look like online in five years’ time? Will the user experience and government policy across the four nations be the same? I will not pretend to try to answer that, but to answer it we will need the data.

I am concerned that the unintended consequences of the Bill in the devolved Administrations have not been fully appreciated or explored. Therefore, I am proposing a belt and braces approach in the reporting regime. When we come to post-legislative scrutiny, with reports being laid before this Parliament and the devolved Administrations in Edinburgh, Cardiff and Belfast—if there is one—we will want to have the data to understand the online experiences of each nation. That is why my very modest amendments seek to ensure that we capture this experience, and that is why it is so important.

17:30
On Ofcom—my final point—I know the Minister has every confidence in Ofcom and rightly points out that it has a strong track record of producing data that is representative of people across the UK. I agree. Ofcom already does a great deal of research which is broken down into nation-specific reporting, particularly in broadcasting, but most of this is directly in relation to its obligations under the Communications Act and the BBC charter, which contains a specific purpose:
“To reflect, represent and serve the diverse communities of all of the United Kingdom’s nations and regions and, in doing so, support the creative economy across the United Kingdom”.
From my Scottish point of view, I am arguing—and I know that the Scottish advisory committee of Ofcom would agree with me—that Ofcom’s research and reports, such as the annual Media Nations report, are linked to the way the legislation is set up and then implemented by Ofcom.
Having ensured this for broadcasting and communications, why would we not want to do this for online safety? At last Tuesday’s meeting of the Communications and Digital Committee, on which I serve, we took evidence on a huge range of subjects from my noble friend the chairman of Ofcom and Dame Melanie Dawes. The noble Lord, Lord Grade, used all the usual words to describe this Bill—“complex”, “challenging”—and pointed out that it is a new law in a novel area, but he stressed that Ofcom comes to decisions outside the political arena based on research and evidence.
My amendments just remind Ofcom that we need this research and evidence by nation. This Bill is so large, so wide-ranging, that Ofcom’s remit and functions are having to expand hugely to deliver this new regime. The noble Baroness, Lady Fox, reminded us last week that what Ofcom does comes from the legislation. It does not do things off its own bat. Ofcom already has a huge challenge on its hands with this Bill, and experience tells us that it is likely to deliver only what is specified—the “must do” bits, not the “nice to do” extras. There may be no differences in the online experiences across the nations of the UK, but the only way we can be sure is if we have the data for each nation, the transparency and all the research reporting. I urge the Minister to take my amendment seriously.
Lord Stevenson of Balmacara (Lab)

My Lords, I think that was a very good speech from the noble Baroness, partly because I signed her amendment and support it and also because I want to refer back to the points made earlier by the noble Lord, Lord Bethell, about research. I am speaking from the Back Benches here because some of what I say may not have been cleared fully with my colleagues, but I am hoping that they will indulge me slightly. If I speak from this elevated position, perhaps they will not hear me so well.

To deal with noble Lords in the order in which they spoke, I support the amendments tabled by the noble Lord, Lord Bethell, which seek a little more activity in the area where we have seen a very welcome change of government policy on access by researchers to data, and I am very grateful to the Minister for that. The noble Lord, Lord Bethell, made the point that there is perhaps a bigger question and a bigger story than can be addressed simply by bringing forward the time of the report and changing “may” to “must”, although I always think “may” to “must” changes are important because they reflect a complete change of approach, and I hope action will follow. The question of access for those who need data in order to complete their research is crucial to the future success of these regimes. That plays back to what the noble Baroness, Lady Fraser, was saying, which is that we need to have this not just in aggregate form but broken down and stratified, so that we can really interrogate where this information is showing the gaps, the opportunities, the changes that are needed and the successes, if there are any, in the way in which we are working.

I support the amendments tabled by the noble Lord, Lord Bethell, because I think this is not so much a question of regulation or lawmaking in this Bill but of trying to engender a change of culture in the way in which social media companies operate. It will need all of us, not just the Government or the regulatory bodies, to continue to press this, because it is a major sea change in what they have been doing until now. They are quite rightly protective of their business interests and business secrets, but the position is not the same when the currency is data, when our data is being used to create change and opportunity, and when their profits are based on exploiting our resources.

I go back to the points made by the noble Lord, Lord Moylan, in his opening amendment today about why consumer rights do not apply when monetary considerations are not being taken into account. Bartering our data in order to obtain benefits from social media companies is not the same as purchasing over the counter at the local shop—we accept that—but times have changed and we are living in a different world. Everything is being bought and sold electronically. Why is consumer law not being moved forward to take account of that, so that the rights that are important here—because they are the same rights—can be exercised? I leave that for the Minister to come back to if he wishes to do so from the Dispatch Box.

Moving on to the Scottish issues, the amendment, as introduced by the noble Baroness, is about transparency and data, but I think it hides a bigger question, which I am afraid affects much of the legislation that comes through this House: the devolution impact of changes in the law and of new laws that are brought forward is very often the last thing to be thought about and is tacked on at the end in ways that are often very obscure.

I have one particularly obscure question which I want to leave with the Minister, completely unreasonably, but I think it just about follows on from the amendment we are discussing. It is that, towards the end of the Bill, Clause 53(5)(c) refers to the consent of the Secretary of State or other Minister of the Crown to crimes in Scottish or Northern Irish legislation when they enter the Online Safety Bill regime. This is because, as has been made clear, laws are changing and are already different in Scotland, Wales and Northern Ireland from some of the criminal laws in England and Wales. While that is to be welcomed, as the noble Baroness said, the devolved Administrations should have the right to make sure, in the areas of their control, that they have the laws that are appropriate for the time, but if they are different, we are going to have to live with those across the country in a way that is a bit patchwork. There need to be rules about how they will apply. I think the noble Baroness said that it would be right and proper that a crime committed in one territory is treated within the rules that apply in that territory, but if they are significantly different, we ought at least to understand why that is the case and how that has come about.

As I understand it—I have a note provided by Carnegie UK and it is always pretty accurate about these matters—the Secretary of State can consent to a devolved authority which wants to bring forward a devolved offence and include it in the online safety regime. However, it is not quite clear how that happens. What is a consent? Is it an Order in Council, a regulation, affirmative or negative procedure or primary legislation? We are not told that; we are just told that consent arrangements apply and consent can be given. Normally consents involve legislative authority—in its words, one Parliament speaking to another—and we are all becoming quite aware of the fact that the legislative consent required from Scotland, Northern Ireland or Wales is often not given, yet the UK Parliament continues to make legislation and it applies, so the process works, but obviously it would be much better if the devolved structures were involved and agreed to what was being done. This is different from the normal top-down approach. Where we already have a change in the law or the law is about to be changed in one of the devolved Administrations, how does that become part of the Online Safety Bill regime? I look forward to the Minister’s response. I did not warn him that I was giving him a very difficult question, and he can write if he cannot give the detail today, but we would like to see on the record how this happens.

If we are getting Statements to Parliament from the Secretary of State about provisional changes to the way in which the law applies in the devolved Administrations, are they going to be subject to due process? Will there be engagement with committees? What will happen if a new code is required or a variation in the code is required? Does that require secondary legislation and, if so, will that be done with the consent of the devolved Administration or by this Parliament after a process we are yet to see?

There is a lot here that has not been fleshed out. There are few very easy answers, but it would be useful if we could get that going. I will not go into more detail on the noble Baroness’s point that laws change, but I know that the Law Society of Scotland has briefed that at least one major piece of legislation, the Hate Crime and Public Order (Scotland) Act 2021, does not appear in Schedule 7 as expected. Again, I ask the Minister if he would write to us explaining the background to that.

These are very important issues and they do not often get discussed in the full process of our Bills, so I am glad that the noble Baroness raised them. She cloaked them in what sounded like a very general and modest request, but they reveal quite considerable difficulties behind them.

Baroness Kidron (CB)

My Lords, before I talk to the amendments I had intended to address, I will make a very narrow point in support of the noble Baroness, Lady Fraser. About 10 years ago, when I started doing work on children, I approached Ofcom and asked why all its research goes to 24, when childhood finishes at 18 and the UNCRC says that a child needs special consideration. Ofcom said, “Terribly sorry, but this is our inheritance from a marketing background”. The Communications and Digital Committee later wrote formally to Ofcom and asked if it could do its research up to 18 and then from 18 to 24, but it appeared to be absolutely impossible. I regret that I do not know what the current situation is and I hope that, with the noble Lord, Lord Grade, in place it may rapidly change overnight. My point is that the detailed description that the noble Baroness gave the House about why it is important to stipulate this is proven by that tale.

I also associate myself with the remarks of the noble Lord, Lord Allan, who terrified me some 50 minutes ago. I look forward to hearing what will be said.

I in fact rose to speak to government Amendments 196 and 199, and the bunch of amendments on access to data for researchers. I welcome the government amendments to which I added my name. I really am delighted every time the Government inch forward into the area of the transparency of systemic and design matters. The focus of the Bill should always be on the key factor that separates digital media from other forms of media, which is the power to determine, manipulate and orchestrate what a user does next, what they see, how they behave and what they think. That is very different and is unique to the technology we are talking about.

It will not surprise the Minister to hear that I would have liked this amendment to cover the design of systems and processes, and features and functionalities that are not related to content. Rather than labouring this point, on this occasion I will just draw the Minister’s attention to an article published over the weekend by Professor Henrietta Bowden-Jones, the UK’s foremost expert on gambling and gaming addiction. She equates the systems and processes involved in priming behaviours on social media with the more extreme behaviours that she sees in her addiction clinics, with ever younger children. Professor Bowden-Jones is the spokesperson on behavioural addictions for the Royal College of Psychiatrists, and the House ignores her experience of the loops of reward and compulsion that manipulate behaviour, particularly the behaviour of children, at our peril.

I commend the noble Lord, Lord Bethell, for continuing to press the research issue and coming back, even in the light of the government amendment, with a little more. Access to good data about the operation of social media is vital in holding regulated companies to account, tracking the extent of harms, building an understanding of them and, importantly, building knowledge about how they might be sensibly and effectively addressed.

17:45
My concern here is that, when making a concession, the Government most often reach for a review at some time in the future. In the case of research, the future is too late. We are at an inflection point right now, at which digital tech may or may not overwhelm our job market and our understanding of what is real and what is not. It has the potential for societal and technological change that is both beneficial and harmful, but at such a scale that it will certainly transform society as we understand it within 18 months or two years—the point at which the review is triggered and then takes place.
I feel passionately that, in the context of where we are now and the game of catch-up we have been playing for the last couple of decades, it should not be left to the companies to decide what is or is not in the public arena. As a minimum, independent research would allow the regulator to better understand the operation of social media platforms. More broadly, it would keep our universities on a level playing field—as a number of noble Lords have commented—and, maybe most importantly, ensure that the regulator, academia and civil society have a seat at the table of the future of tech.
For that reason, I again ask the Government, as a minimum, to accept the shorter date that was proposed or perhaps to think again before Third Reading.
Baroness Harding of Winscombe (Con)

My Lords, I associate myself with my noble friend Lady Fraser of Craigmaddie’s incredibly well-made points. I learned a long time ago that, when people speak very softly and say they have a very small point to make, they are often about to deliver a zinger. She really did; it was hugely powerful. I will say no more than that I wholeheartedly agree with her; thank you for helping us to understand the issue properly.

I will speak in more detail about access to data for researchers and in support of my noble friend Lord Bethell’s amendments. I too am extremely grateful to the Minister for bringing forward all the government amendments; the direction of travel is encouraging. I am particularly pleased to see the movement from “may” to “must”, but I am worried that it is Ofcom’s rather than the regulated services’ “may” that moves to “must”. There is no backstop for recalcitrant regulated services that refuse to abide by Ofcom’s guidance. As the noble Baroness, Lady Kidron, said, in other areas of the Bill we have quite reasonably resorted to launching a review, requiring Ofcom to publish its results, requiring the Secretary of State to review the recommendations and then giving the Secretary of State backstop powers, if necessary, to implement regulations that would then require regulated companies to change.

I have a simple question for the Minister: why are we not following the same recipe here? Why does this differ from the other issues, on which the House agrees that there is more work to be done? Why are we not putting backstop powers into the Bill for this specific issue, when it is clear to all of us that it is highly likely that there will be said recalcitrant regulated firms that are not willing to grant access to their data for researchers?

Before my noble friend the Minister leaps to the hint he gave in his opening remarks—that this should all be picked up in the Data Protection and Digital Information Bill—unlike the group we have just discussed, this issue was discussed at Second Reading and given a really detailed airing in Committee. This is not new news; it is in the same position as the other issues where we have adopted the same recipe, including a backstop, which are being dealt with in the Bill. I urge my noble friend the Minister to follow the good progress so far and to complete the package, as we have in other areas.

Lord Moylan (Con)

My Lords, it is valuable to be able to speak immediately after my noble friend Lady Harding of Winscombe, because it gives me an opportunity to address some remarks she made last Wednesday when we were considering the Bill on Report. She suggested that there was a fundamental disagreement between us about our view of how serious online safety is—the suggestion being that somehow I did not think it was terribly important. I take this opportunity to rebut that and to add to it by saying that other things are also important. One of those things is privacy. We have not discussed privacy in relation to the Bill quite as much as we have freedom of expression, but it is tremendously important too.

Government Amendment 247A represents the most astonishing level of intrusion. In fact, I find it very hard to see how the Government think they can get away with saying that it is compatible with the provisions of the European Convention on Human Rights, which we incorporated into law some 20 years ago, thus creating a whole law of privacy that is now vindicated in the courts. It is not enough just to go around saying that it is “proportionate and necessary” as a mantra; it has to be true.

This provision says that an agency has the right to go into a private business with no warrant, and with no let or hindrance, and is able to look at its processes, data and equipment at will. I know of no other business that can be subjected to that without a warrant or some legal process in advance pertinent to that instance, that case or that business.

My noble friend Lord Bethell said that the internet has been abused by people who carry out evil things; he mentioned terrorism, for example, and he could have mentioned others. However, take mobile telephones and Royal Mail—these are also abused by people conducting terrorism, but we do not allow those communications to be intruded into without some sort of warrant or process. It does not seem to me that the fact that the systems can be abused is sufficient to justify what is being proposed.

My noble friend the Minister says that this can happen only offline. Frankly, I did not understand what he meant by that. In fact, I was going to say that I disagreed with him, but I am moving to the point of saying that I think it is almost meaningless to say that it is going to happen offline. He might be able to explain that. He also said that Ofcom will not see individual traffic. However, neither the point about being offline nor the point about not seeing individual traffic is on the face of the Bill.

When we ask ourselves what the purpose of this astonishing power is—this was referred to obliquely to some extent by the noble Baroness, Lady Fox of Buckley—we can find it in Clause 91(1), to which proposed new subsection (2A) is being added or squeezed in subordinate to it. Clause 91(1) talks about

“any information that they”—

that is, Ofcom—

“require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions”.

The power could be used entirely as a fishing expedition. It could be entirely for the purpose of educating Ofcom as to what it should be doing. There is nothing here to say that it can have these powers of intrusion only if it suspects that there is criminality, a breach of the codes of conduct or any other offence. It is a fishing expedition, entirely for the purpose of

“exercising, or deciding whether to exercise”.

Those are the intrusions imposed upon companies. In some ways, I am less concerned about the companies than I am about what I am going to come to next: the intrusion on the privacy of individuals and users. If we sat back and listened to ourselves and what we are saying, could we explain to ordinary people—we are going to come to this when we discuss end-to-end encryption—what exactly can happen?

Two very significant breaches of the protections in place for privacy on the internet arise from what is proposed. First, if you allow someone into a system and into equipment, especially from outside, you increase the risk and the possibility that a further, probably more hostile party that is sufficiently well-equipped with resources—we know state actors with evil intent which are so equipped—can get in through that or similar holes. The privacy of the system itself would be structurally weakened as a result of doing this. Secondly, if Ofcom is able to see what is going on, the system becomes leaky in the direction of Ofcom. It can come into possession of information, some of which could be of an individual character. My noble friend says that it will not be allowed to release any data and that all sorts of protections are in place. We know that, and I fully accept the honesty and integrity of Ofcom as an institution and of its staff. However, we also know that things get leaked and escape. As a result of this provision, very large holes are being built into the protections of privacy that exist, yet there has been no reference at all to privacy in the remarks made so far by my noble friend.

I finish by saying that we are racing ahead and not thinking. Good Lord, my modest amendment in the last group to bring a well-established piece of legislation—the Consumer Rights Act—to bear upon this Bill was challenged on the grounds that there had not been an impact assessment. Where is the impact assessment for this? Where is even the smell test for this in relation to explaining it to the public? If my noble friend is able to expatiate at the end on the implications for privacy and attempt to give us some assurance, that would be some consolation. I doubt that he is going to give way and do the right thing and withdraw this amendment.

Lord Clement-Jones (LD)

My Lords, the debate so far has been—in the words of the noble Baroness, Lady Fox—a Committee debate. That is partly because this set of amendments from the Government has come quite late. If they had been tabled in Committee, I think we would have had a more expansive debate on this issue and could have knocked it about a bit and come back to it on Report. The timing is regrettable in all of this.

That said, the Government have tabled some extremely important amendments, particularly Amendments 196 and 198, which deal with things such as algorithms and functionalities. I very much welcome those important amendments, as I know the noble Baroness, Lady Kidron, did.

I also very much support Amendments 270 and 272 in the name of the noble Baroness, Lady Fraser. I hope the Minister, having been pre-primed, has all the answers to them. It is astonishing that, after all these years, we are so unattuned to the issues of the devolved Administrations and that we are still not in the mindset on things such as research. We are not sufficiently granular, as has been explained—let alone all the other questions that the noble Lord, Lord Stevenson, asked. I hope the Minister can unpack some of that as well.

I want to express some gratitude, too, because the Minister and his officials took the trouble to give us a briefing about remote access issues, alongside Ofcom. Ofcom also sent through its note on algorithmic assessment powers, so an effort has been made to explain some of these powers. Indeed, I can see the practical importance, as explained to us. It is partly the lateness, however, that sets off what my noble friend Lord Allan called “trigger words” and concerns about the remote access provisions. Indeed, I think we have a living and breathing demonstration of the impact of triggers on the noble Lord, Lord Moylan, because these are indeed issues that concern those outside the House to quite a large degree.

18:00
I really think the Minister will have to take us through the safeguards again, and there are some he has already mentioned: the limits to the information that can be viewed, and the fact that this is done offline in operation but is limited to functionalities rather than content. He did mention that it was privacy-protecting, that it was justifiable only where proportionate, that the fact that Ofcom is accessing remotely can be made public, that it is challengeable, and that Ofcom cannot disclose information obtained. These are non-trivial powers which require a non-trivial response and a great deal more explanation, particularly, as my noble friend said, on how they differ from those in the Investigatory Powers Act; otherwise, I think concerns will continue. That is the reason for my noble friend’s Amendment 247B, which attempts to place further safeguards.
Amendments 237ZA, 272AB and 262AA, tabled by the noble Lord, Lord Bethell, and spoken to by the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Stevenson, are really important. Of course, we welcome the tweak to Clause 148 that the Minister has introduced today but, as all noble Lords have said, this does not take us far enough. At the risk of boring the House to death, as usual, I will refer to the original Joint Committee report again. We keep returning to this because it does set out a great deal of sense.
There are three areas I want to mention. The Joint Committee said at the time:
“We heard from Dr Amy Orben, College Research Fellow at Emmanuel College, University of Cambridge, that lack of access to data is ‘making it impossible for good quality and independent scientific studies to be completed on topics such as online harms, mental health, or misinformation’”.
In another paragraph, the report says:
“We heard there is evidence that social media usage can cause psychological harm to children, but that platforms prevent research in this area from being conducted or circulated”.
Then we actually heard from Meta. It said:
“One of the things that is a particular challenge in the area of research is how we can provide academics who are doing independent research with access to data really to study these things more deeply”.
On all sides, there is a need for teeth when it comes to access for independent researchers.
Our recommendation was very clear; it was not just a recommendation that this should be reviewed but that there should be these powers. Obviously, the Minister has agreed to a compromise, in terms of a report on access, but he really should go further and have default powers, so that the Government can institute the access that is required for researchers, as a result of that report. So far, with the amendments today, the Minister is not strengthening the Bill, because there is no way for Ofcom to compel compliance with the guidance or any incentive for companies to comply with any guidance that is produced.
I very much hope that the Minister will take on board what the noble Lord, Lord Bethell, had to say, in a very eloquent way. If he cannot do it here and now on Report, I very much hope that he will come back with a proposal at Third Reading. As the noble Baroness, Lady Harding, said, we have done this in virtually every other case where there is a report. As we have seen, the Minister has agreed to have a review or a report, and then the backstop powers are in place. That is not the case with this, and it should be.
Baroness Neville-Jones (Con)

My Lords, I just want to reinforce what my noble friend Lord Bethell said about the amendments to which I have also put my name: Amendments 237ZA, 266AA and 272E. I was originally of the view that it was enough to give Ofcom the powers to enforce its own rulings. I have been persuaded that, pace my noble friend Lord Grade, the powers that have been given to Ofcom represent such a huge expansion that the likelihood of the regulator doing anything other than those things which it is obliged to do is rather remote. So I come to the conclusion that an obligation is the right way to put these things. I also agree with what has been said about the need to ensure that subsequent action is taken, in relation to a regulated service if it does not follow what Ofcom has set out.

I will also say a word about researchers. They are a resource that already exists. Indeed, there has been quite a lot of pushing, not least by me, on using this resource, first to update the powers of the Computer Misuse Act, but also to enlarge our understanding of, and ability to have information about, the operation of online services. So it is a welcome move on the part of the Government that they see the value of researchers in this context.

My noble friend Lord Moylan made a good point that the terms under which this function is exercised have to have regard to privacy as well as to transparency of operations. This is probably one of the reasons why we have not seen movement on this issue in the Computer Misuse Act and its updating, because it is intrinsically quite a difficult issue. But I believe that it has to be tackled, and I hope very much that the Government will not delay in bringing forward the necessary legislation that will ensure both that researchers are protected in the exercise of this function, which has been one of the issues, and that they are enabled to do something worth while. So I believe the Minister when he says that the Government may need to bring forward extra legislation on this; it is almost certainly the case. I hope very much that there will not be a great gap, and that we do not see this part of the proposals failing to come into effect.

Lord Knight of Weymouth (Lab)

My Lords, we have had an important debate on a range of amendments to the Bill. There are some very important and good ones, to which I would say: “Better late than never”. I probably would not say that to Amendment 247A; I would maybe say “better never”, but we will come on to that. It is interesting that some of this has come to light following the debate on and scrutiny of the Digital Markets, Competition and Consumers Bill in another place. That might reinforce the need for post-legislative review of how this Bill, the competition Bill and the data Bill are working together in practice. Maybe we will need another Joint Committee, which will please the noble Lord, Lord Clement-Jones, no end.

There are many government amendments. The terms of service and takedown policy ones have been signed by my noble friend Lord Stevenson, and we support them. There are amendments requiring information on algorithms in transparency reports; requiring search services to put information into transparency reports; requiring information on how policies on illegal content and content that is harmful to children were arrived at; requiring information about search algorithms; and allowing physical access in an audit to view the operation of algorithms and other systems. Like the noble Baroness, Lady Kidron, I very much welcome, in this section anyway, that focus on systems, algorithms and process rather than solely on content.

However, Amendment 247A is problematic in respect of the trigger words, as the noble Lord, Lord Allan, referred to, of remote access and requiring a demonstration gathering real-time data. That raises a number of, as he said, non-trivial questions. I shall relay what some service providers have been saying to me. The Bill already provides Ofcom with equivalent powers under Schedule 12—such as rights of entry and inspection and extensive auditing powers—that could require them to operate any equipment or algorithms to produce information for Ofcom and/or allow Ofcom to observe the functioning of the regulated service. Crucially, safeguards are built into the provisions in Schedule 12 to ensure that Ofcom exercises them only in circumstances where the service provider is thought to be in breach of its duties and/or under a warrant, which has to have judicial approval, yet there appear to be no equivalent safeguards in relation to this power. I wonder whether, as it has come relatively late, that is an oversight that the Minister might want to address at Third Reading.

The policy intent, as I understand it, is to give Ofcom remote access to algorithms to ensure that service providers located out of the jurisdiction are not out of scope of Ofcom’s powers. Could that have been achieved by small drafting amendments to Schedule 12? In that case, the whole set of safeguards that we are concerned about would be in place because, so to speak, they would be in the right place. As drafted, the amendment appears to be an extension of Ofcom’s information-gathering powers that can be exercised as a first step against a service provider or access facility without any evidence that the service is in breach of its obligations or that any kind of enforcement action is necessary, which would be disproportionate and oppressive.

Given the weight of industry concern about the proportionality of these powers and their late addition, I urge the Minister to look at the addition of further safeguards around the use of these powers in the Bill and further clarification on the scope of the amendment as a power of escalation, including that it should be exercised as a measure of last resort, and only in circumstances where a service provider has not complied with its duty under the Bill or where the service provider has refused to comply with a prior information notice.

Amendment 247B is welcome because it gives the Minister the opportunity to tell us now that he wants to reflect on all this before Third Reading, work with us and, if necessary, come back with a tightening of the language and a resolution of these issues. I know his motivation is not to cause a problem late on in the Bill but he has a problem, and if he could reflect on it and come back at Third Reading then that would be helpful.

I welcome the amendments tabled by the noble Lord, Lord Bethell, on researcher access. This is another area where he has gone to great efforts to engage across the House with concerned parties, and we are grateful to him for doing so. Independent research is vital for us to understand how this new regime that we are creating is working. As he says, it is a UK strength, and we should play to that strength and not let it slip away inadvertently. We will not get the regime right first time, and we should not trust the platforms to tell us. We need access to independent researchers, and the amendments strike a good balance.

We look forward to the Minister deploying his listening ear, particularly to what the noble Baroness, Lady Harding, had to say on backstop powers. When he said in his opening speech that he would reflect, is he keeping open the option of reflecting and coming back at Third Reading, or is he reflecting only on the possibility of coming back in other legislation?

The noble Baroness, Lady Fraser, raised an important issue for the UK regulator, ensuring that it is listening to potential differences in public opinion in the four nations of our union and, similarly, analysing transparency reports. As she says, this is not about reserved matters but about respecting the individual nations and listening to their different voices. It may well be written into the work of Ofcom by design but we cannot assume that. We look forward to the Minister’s response, including on the questions from my noble friend on the consent process for the devolved Administrations to add offences to the regime.

18:15
Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful to noble Lords for their contributions in this group. On the point made by the noble Lord, Lord Knight of Weymouth, on why we are bringing in some of these powers now, I say that the power to direct and observe algorithms was previously implicit within Ofcom’s information powers and, where a provider has UK premises, under powers of entry, inspection and audit under Schedule 12. However, the Digital Markets, Competition and Consumers Bill, which is set to confer similar powers on the Competition and Markets Authority and its digital markets unit, makes these powers explicit. We wanted to ensure that there was no ambiguity over whether Ofcom had equivalent powers in the light of that. Furthermore, the changes we are making ensure that Ofcom can direct and observe algorithmic assessments even if a provider does not have relevant premises or equipment in the UK.

I am grateful to the noble Lord, Lord Allan of Hallam, for inviting me to re-emphasise points and allay the concerns that have been triggered, as his noble friend Lord Clement-Jones put it. I am happy to set out again a bit of what I said in opening this debate. The powers will be subject to a number of safeguards. First, they are limited to “viewing information”. They can be used only where they are proportionate in the exercise of Ofcom’s functions, and a provider would have the right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was done unlawfully. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.

These are not secret powers, as the noble Lord rightly noted. The Bill contains no restriction on services making the existence and detail of the information notice public. If a regulated service wished to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. I also mentioned the recourse that people have through existing legislation, such as the Freedom of Information Act, to give them safeguards, noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information that it has obtained through its exercise of these powers without the provider’s consent unless that is permitted for specific, defined purposes.

The noble Lord’s Amendment 247B seeks to place further safeguards on Ofcom’s use of its new power to access providers’ systems remotely to observe tests. While I largely agree with the intention behind it, there are already a number of safeguards in place for the use of that power, including in relation to data protection, legally privileged material and the disclosure of information, as I have outlined. Ofcom will not be able to gain remote access simply for exploratory or fishing purposes, and indeed Ofcom expects to have conversations with services about how to provide the information requested.

Furthermore, before exercising the power, Ofcom will be required to issue an information notice specifying the information to be provided, setting out the parameters of access and why Ofcom requires the information, among other things. Following the receipt of an information notice, a notice requiring an inspection or an audit notice, if a company has identified that there is an obvious security risk in Ofcom exercising the power as set out in the notice, it may not be proportionate to do so. As set out in Ofcom’s duties, Ofcom must have regard to the principles under which regulatory activities should be proportionate and targeted only at cases where action is needed.

In line with current practice, we anticipate Ofcom will issue information notice requests in draft form to identify and address any issues, including in relation to security, before the information notice is issued formally. Ofcom will have a legal duty to exercise its remote access powers in a way that is proportionate, ensuring that undue burdens are not placed on businesses. In assessing proportionality in line with this requirement, Ofcom would need to consider the size and resource capacity of a service when choosing the most appropriate way of gathering information, and whether there was a less onerous method of obtaining the necessary information to ensure that the use of this power is proportionate. As I said, the remote access power is limited to “viewing information”. Under this power, Ofcom will be unable to interfere or access the service for any other purpose.

In practice, Ofcom will work with services during the process. It is required to specify, among other things, the information to be provided, which will set the parameters of its access, and why it requires the information, which will explain the link between the information it seeks and the online safety function that it is exercising or deciding whether to exercise.

As noble Lords know, Ofcom must comply with the UK’s data protection law. As we have discussed in relation to other issues, it is required to act compatibly with the European Convention on Human Rights, including Article 8 privacy rights. In addition, under Clause 91(7), Ofcom is explicitly prohibited from requiring the provision of legally privileged information. It will also be under a legal obligation to ensure that the information gathered from services is protected from disclosure unless clearly defined exemptions apply, such as those under Section 393(2) of the Communications Act 2003—for example, the carrying out of any of Ofcom’s functions. I hope that provides reassurance to the noble Lord, Lord Allan, and the noble Baroness, Lady Fox, who raised these questions.

Lord Allan of Hallam (LD)

I am grateful to the Minister. That was helpful, particularly the description of the process and the fact that drafts have to be issued early on. However, it still leaves open a couple of questions, one of which was very helpfully raised by the noble Lord, Lord Knight. We have in Schedule 12 this other set of protections that could be applied. There is a genuine question as to why this has been put in this place and not there.

The second question is to dig a little more into the question of what happens when there is a dispute. The noble Lord, Lord Moylan, pointed out that if you have created a backdoor then you have created a backdoor, and it is dangerous. If we end up in a situation where a company believes that what it is being asked to do by Ofcom is fundamentally problematic and would create a security risk, it will not be good enough to open up the backdoor and then have a judicial review. It needs to be able to say no at that stage, yet the Bill says that it could be committing a serious criminal offence by failing to comply with an information notice. We want some more assurances, in some form, about what would happen in a scenario where a company genuinely and sincerely believes that what Ofcom is asking for is inappropriate and/or dangerous and it wants not to have to offer it unless and until its challenge has been looked at, rather than having to offer it and then later judicially review a decision. The damage would already have been done by opening up an inappropriate backdoor.

Lord Parkinson of Whitley Bay (Con)

A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the remote access power was unlawful. Given the serious nature of the issues under consideration, I am sure that would be looked at swiftly, but I will write to the noble Lord on the anticipated timelines while that judicial review was pending.

Lord Parkinson of Whitley Bay (Con)

I will write on Schedule 12 as well.

Baroness Fox of Buckley (Non-Afl)

Before the Minister sits down—to echo the way the Minister has operated throughout Report—there is consensus across the House that there are some concerns. The reason why there are concerns outside and inside the House on this particular amendment is that it is not entirely clear that those protections exist, and there are worries. I ask the Minister whether, rather than just writing, it would be possible to take this back to the department, table a late amendment and say, “Look again”. That has been done before. It is certainly not too late: if it was not too late to have this amendment then it is certainly not too late to take it away again and to adopt another amendment that gives some safeguards. Seriously, it is worth looking again.

Lord Parkinson of Whitley Bay (Con)

I had not quite finished; the noble Baroness was quick to catch me before I sat down. I still have some way to go, but I will certainly take on board all the points that have been made on this group.

The noble Lord, Lord Knight, asked about Schedule 12. I will happily write with further information on that, but Schedule 12 is about UK premises, so it is probably not the appropriate place to deal with this, as we need to be able to access services in other countries. If there is a serious security risk then it would not necessarily be proportionate. I will write to him with further details.

Lord Knight of Weymouth (Lab)

I am grateful to the Minister for giving way so quickly. I think the House is asking him to indicate now that he will go away and look at this issue, perhaps with some of us, and that, if necessary, he would be willing to look at coming back with something at Third Reading. From my understanding of the Companion, I think he needs to say words to that effect to allow him to do so, if that is what he subsequently wants to do at Third Reading.

Lord Parkinson of Whitley Bay (Con)

I am very happy to discuss this further with noble Lords, but I will reserve the right, pending that discussion, to decide whether we need to return to this at Third Reading.

Amendments 270 and 272, tabled by my noble friend Lady Fraser of Craigmaddie, to whom I am very grateful for her careful scrutiny of the devolved aspects of the Bill, seek to require Ofcom to include separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in the research about users’ experiences of regulated services and in Ofcom’s transparency reports. While I am sympathetic to her intention—we have corresponded on it, for which I am grateful—it is important that Ofcom has and retains the discretion to prioritise information requests that will best shed light on the experience of users across the UK.

My noble friend and other noble Lords should be reassured that Ofcom has a strong track record of using this discretion to produce data which are representative of people across the whole United Kingdom. Ofcom is committed to reflecting the online experiences of users across the UK and intends, wherever possible, to publish data at a national level. When conducting research, Ofcom seeks to gather views from a representative sample of the United Kingdom and seeks to set quotas that ensure an analysable sample within each of the home nations.

It is also worth noting the provisions in the Communications Act 2003 that require Ofcom to operate offices in each of the nations of the UK, to maintain advisory committees for each, and to ensure their representation on its various boards and panels—and, indeed, on the point raised by the noble Baroness, Lady Kidron, to capture the experiences of children and users of all ages. While we must give Ofcom the discretion it needs to ensure that the framework is flexible and remains future-proofed, I hope that I have reassured my noble friend that her point will indeed be captured, reported on and be able to be scrutinised, not just in this House but across the UK.

Baroness Fraser of Craigmaddie (Con)

I am grateful to the Minister for giving way. My premise is that the reason Ofcom reports in a nation-specific way in broadcasting and in communications is that there is a high-level reference in both the Communications Act 2003 and the BBC charter that requires it to do so, because it feeds down into national quotas and so on. There is currently nothing equivalent in the Online Safety Bill. Therefore, we are relying on Ofcom’s discretion, whereas in the broadcasting and communications area we have a high-level reference insisting that there is a breakdown by nation.

18:30
Lord Parkinson of Whitley Bay (Con)

We think we can rely on Ofcom’s discretion, and point to its current practice. I hope that will reassure my noble friend that it will set out the information she seeks.

Lord Parkinson of Whitley Bay (Con)

I was about to say that I am very happy to write to the noble Lord, Lord Stevenson, about the manner by which consent is given in Clause 53(5)(c), but I think his question is on something else.

Lord Stevenson of Balmacara (Lab)

I would be grateful if the Minister could repeat that immediately afterwards, when I will listen much harder.

Just to echo what the noble Baroness was saying, may we take it as an expectation that approaches that are signalled in legislation for broadcasting and communications should apply pari passu to the work of Ofcom in relation to the devolved Administrations?

Lord Parkinson of Whitley Bay (Con)

Yes, and we can point to the current actions of Ofcom to show that it is indeed doing this already, even without that legislative stick.

I turn to the amendments in the name of my noble friend Lord Bethell and the noble Lord, Lord Clement-Jones, on researchers’ access to data. Amendment 237ZA would confer on the Secretary of State a power to make provisions about access to information by researchers. As my noble friend knows, we are sympathetic to the importance of this issue, which is why we have tabled our own amendments in relation to it. However, as my noble friend also knows, this is such a complex and sensitive area that we think it is premature to endow the Secretary of State with such broad powers to introduce a new framework. As we touched on in Committee, this is a complex and still nascent area, which is why it is different from the other areas to which the noble Lord, Lord Clement-Jones, pointed in his contribution.

Lord Clement-Jones (LD)

The noble Baroness, Lady Harding, made the point that in other areas where the Minister has agreed to reviews or reports, there are backstop powers; for instance, on app stores. Of course, that was a negotiated settlement, so to speak, but why can the Minister not accede to that in the case of access for researchers, as he has with app stores? Indeed, there is one other example that escapes me, which the Minister has also agreed to.

Lord Parkinson of Whitley Bay (Con)

We touched on the complexity of defining who and what is a researcher and making sure that we do not give rise to bad actors exploiting that. This is a complex area, as we touched on in Committee. As I say, the evidence base here is nascent. It is important first to focus on developing our understanding of the issues to ensure that any power or legislation is fit to address those challenges. Ofcom’s report will not only highlight how platforms can share data with researchers safely but will provide the evidence base for considering any future policy approaches, which we have committed to doing but which I think the noble Lord will agree are worthy of further debate and reflection in Parliament.

Lord Clement-Jones (LD)

The benefit of having a period of time between the last day of Report on Wednesday and Third Reading is that that gives the Minister, the Bill team and parliamentary counsel the time to reflect on the kind of power that could be devised. The wording could be devised, and I would have thought that six weeks would be quite adequate for that, perhaps in a general way. After all, this is not a power that is immediately going to be used; it is a general power that could be brought into effect by regulation. Surely it is not beyond the wit to devise something suitable.

Baroness Kidron (CB)

Before the Minister stands up, I also wondered—

Noble Lords

Oh!

Baroness Kidron (CB)

Sit down or stand up—I cannot remember.

I wonder whether the department has looked at the DSA and other situations where this is being worked out. I recognise that it takes a period of time, but it is not without some precedent that a pathway should be described.

Lord Parkinson of Whitley Bay (Con)

We do not think that six weeks is enough time for the evidence base to develop sufficiently, our assessment being that to endow the Secretary of State with that power at this point is premature.

Amendment 262AA would require Ofcom to consider whether it is appropriate to require providers to take steps to comply with Ofcom’s researcher access guidance when including a requirement to take steps in a confirmation decision. This would be inappropriate because the researcher access provisions are not enforceable requirements; as such, compliance with them should not be subject to enforcement by the regulator. Furthermore, enforcement action may relate to a wide variety of very important issues, and the steps needed should be sufficient to address a failure to comply with an enforceable requirement. Singling out compliance with researcher access guidance alone risks implying that this will be adequate to address core failures.

Amendment 272AB would require Ofcom to give consideration to whether greater access to data could be achieved through legal requirements or incentives for regulated services. I reassure noble Lords that the scope of Ofcom’s report will already cover how greater access to data could be achieved, including through enforceable requirements on providers.

Amendment 272E would require Ofcom to take a provider’s compliance with Ofcom’s guidance on researcher access to data into account when assessing risks from regulated services and determining whether to take enforcement action and what enforcement action to take. However, we do not believe that this is a relevant factor for consideration of these issues. I hope noble Lords will agree that whether or not a company has enabled researcher access to its data should not be a mitigating factor against Ofcom requiring companies to deal with terrorism or child sexual exploitation or abuse content, for example.

On my noble friend Lord Bethell’s remaining Amendments 272BA, 273A and 273B, the first of these would require Ofcom to publish its report on researchers’ access to information within six months. While six months would not be deliverable given other priorities and the complexity of this issue, the government amendment to which I have spoken would reduce the timelines from two years to 18 months. That recognises the importance of the issue while ensuring that Ofcom can deliver the key priorities in establishing the core parts of the regulatory framework; for example, the illegal content and child safety duties.

Lord Allan of Hallam (LD)

Just on the timescale, one of the issues that we talked about in Committee was the fact that there needs to be some kind of mechanism created, with a code of practice with reference to data protection law and an approving body to approve researchers as suitable to take information; the noble Baroness, Lady Kidron, referred to the DSA process, which the European Union has been working on. I hope the Minister can confirm that Ofcom might get moving on establishing that. It is not dependent on there being a report in 18 months; in fact, you need to have it in place when you report in 18 months, which means you need to start building it now. I hope the Minister would want Ofcom, within its existing framework, to be encouraging the creation of that researcher approval body and code of practice, not waiting to start that process in 18 months’ time.

Lord Parkinson of Whitley Bay (Con)

I will continue my train of thought on my noble friend’s amendments, which I hope will cover that and more.

My noble friend’s Amendment 273A would allow Ofcom to appoint approved independent researchers to access information. Again, given the nascent evidence base here, it is important to focus on understanding these issues before we commit to a researcher access framework.

Under the skilled persons provisions, Ofcom will already have the powers to appoint a skilled person to assess compliance with the regulatory framework; that includes the ability to leverage the expertise of independent researchers. My noble friend’s Amendment 273B would require Ofcom to produce a code of practice on access to data by researchers. The government amendments I spoke to earlier will require Ofcom to produce guidance on that issue, which will help to promote information sharing in a safe and secure way.

To the question asked by the noble Lord, Lord Allan: yes, Ofcom can start the process and do it quickly. The question here is really about the timeframe in which it does so. As I said in opening, we understand the calls for further action in this area.

I am happy to say to my noble friend Lord Bethell, to whom we are grateful for his work on this and the conversations we have had, that we will explore the issue further and report back on whether further measures to support researchers’ access to data are required and, if so, whether they can be implemented through other legislation, such as the Data Protection and Digital Information (No.2) Bill.

Lord Knight of Weymouth (Lab)

Before the Minister sits down—he has been extremely generous in taking interventions—I want to put on record my understanding of his slightly ambiguous response to Amendment 247A, so that he can correct it if I have got it wrong. My understanding is that he has agreed to go away and reflect on the amendment and that he will have discussions with us about it. Only if he then believes that it is helpful to bring forward an amendment at Third Reading will he do so.

Lord Parkinson of Whitley Bay (Con)

Yes, but I do not want to raise the hopes of the noble Lord or others, with whom I look forward to discussing this matter. I must manage their expectations about whether we will bring anything forward. With that, I beg to move.

Amendment 187 agreed.
Amendment 188 not moved.
Clause 67: Interpretation of this Chapter
Amendment 189
Moved by
189: Clause 67, page 64, line 15, leave out from “65(9),” to “and” in line 16 and insert “indicates (in whatever words) that the presence of content of that kind is prohibited on the service or that users’ access to content of that kind is restricted,”
Member’s explanatory statement
This amendment makes a change to the definition of “relevant content” which applies for the purposes of Chapter 3 of Part 4 of the Bill (transparency of terms of service etc). The effect of the change is to cover a wider range of ways in which a term of service might indicate that a certain kind of content is not allowed on the service.
Amendment 189 agreed.
Amendment 190
Moved by
190: After Clause 67, insert the following new Clause—
“CHAPTER 3A
DECEASED CHILD USERS
Disclosure of information about use of service by deceased child users
(1) A provider of a relevant service must make it clear in the terms of service what their policy is about dealing with requests from parents of a deceased child for information about the child’s use of the service.
(2) A provider of a relevant service must have a dedicated helpline or section of the service, or some similar means, by which parents can easily find out what they need to do to obtain information and updates in those circumstances, and the terms of service must provide details.
(3) A provider of a relevant service must include clear and accessible provisions in the terms of service—
(a) specifying the procedure for parents of a deceased child to request information about the child’s use of the service,
(b) specifying what evidence (if any) the provider will require about the parent’s identity or relationship to the child, and
(c) giving sufficient detail to enable child users and their parents to be reasonably certain about what kinds of information would be disclosed and how information would be disclosed.
(4) A provider of a relevant service must respond in a timely manner to requests from parents of a deceased child for information about the child’s use of the service or for updates about the progress of such information requests.
(5) A provider of a relevant service must operate a complaints procedure in relation to the service that—
(a) allows for complaints to be made by parents of a deceased child who consider that the provider is not complying with a duty set out in any of subsections (1) to (4),
(b) provides for appropriate action to be taken by the provider of the service in response to such complaints, and
(c) is easy to access, easy to use and transparent.
(6) A provider of a relevant service must include in the terms of service provisions which are easily accessible specifying the policies and processes that govern the handling and resolution of such complaints.
(7) If a person is the provider of more than one relevant service, the duties set out in this section apply in relation to each such service.
(8) The duties set out in this section extend only to the design, operation and use of a service in the United Kingdom, and references in this section to children are to children in the United Kingdom.
(9) A “relevant service” means—
(a) a Category 1 service (see section 86(10)(a));
(b) a Category 2A service (see section 86(10)(b));
(c) a Category 2B service (see section 86(10)(c)).
(10) In this section “parent”, in relation to a child, includes any person who is not the child’s parent but who—
(a) has parental responsibility for the child within the meaning of section 3 of the Children Act 1989 or Article 6 of the Children (Northern Ireland) Order 1995 (S.I. 1995/755 (N.I. 2)), or
(b) has parental responsibilities in relation to the child within the meaning of section 1(3) of the Children (Scotland) Act 1995.
(11) In the application of this section to a Category 2A service, references to the terms of service include references to a publicly available statement.”
Member’s explanatory statement
This amendment imposes new duties on providers of Category 1, 2A and 2B services to have a policy about disclosing information to the parents of deceased child users, and providing details about it in the terms of service or a publicly available statement.
Amendment 190 agreed.
Amendment 191
Moved by
191: After Clause 67, insert the following new Clause—
“OFCOM’s guidance about duties set out in section (Disclosure of information about use of service by deceased child users)
(1) OFCOM must produce guidance for providers of relevant services to assist them in complying with their duties set out in section (Disclosure of information about use of service by deceased child users).
(2) OFCOM must publish the guidance (and any revised or replacement guidance).
(3) In this section “relevant service” has the meaning given by section (Disclosure of information about use of service by deceased child users).”
Member’s explanatory statement
This amendment requires OFCOM to give guidance to providers about the new duties imposed by the other Clause proposed after Clause 67 in my name.
Amendment 191A (to Amendment 191) not moved.
Amendment 191 agreed.
Schedule 8: Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services
Amendments 192 to 203
Moved by
192: Schedule 8, page 212, line 26, leave out “and relevant content” and insert “, relevant content and content to which section 12(2) applies”
Member’s explanatory statement
This amendment adds a reference to content to which section 12(2) applies (content to which certain user empowerment duties apply) to paragraph 1 of the transparency reporting Schedule, which allows OFCOM to require providers of user-to-user services to include information in their transparency reports about the incidence of content.
193: Schedule 8, page 212, line 28, leave out “and relevant content” and insert “, relevant content and content to which section 12(2) applies”
Member’s explanatory statement
This amendment adds a reference to content to which section 12(2) applies to paragraph 2 of the transparency reporting Schedule, which allows OFCOM to require providers of user- to-user services to include information in their transparency reports about the dissemination of content.
194: Schedule 8, page 212, line 31, leave out “or relevant content” and insert “, relevant content or content to which section 12(2) applies”
Member’s explanatory statement
This amendment adds a reference to content to which section 12(2) applies to paragraph 3 of the transparency reporting Schedule, which allows OFCOM to require providers of user- to-user services to include information in their transparency reports about the number of users encountering content.
195: Schedule 8, page 212, line 33, after “The” insert “formulation, development, scope and”
Member’s explanatory statement
This amendment allows OFCOM to require providers of user-to-user services to include information in their transparency report about the formulation, development and scope of their terms of service (as well as the application of the terms of service).
196: Schedule 8, page 213, line 5, at end insert—
“8A_ The design and operation of algorithms which affect the display, promotion, restriction or recommendation of illegal content, content that is harmful to children, relevant content or content to which section 12(2) applies.”
Member’s explanatory statement
This amendment makes it clear that OFCOM can require providers of user-to-user services to include information in their transparency report about algorithms, as mentioned in this new paragraph.
197: Schedule 8, page 213, line 16, at end insert—
“12A_ Measures taken or in use by a provider to comply with any duty set out in section (Disclosure of information about use of service by deceased child users) (deceased child users).”
Member’s explanatory statement
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about measures taken to comply with the new duties imposed by the Clause proposed after Clause 67 in my name.
198: Schedule 8, page 214, line 3, after “The” insert “formulation, development, scope and”
Member’s explanatory statement
This amendment allows OFCOM to require providers of search services to include information in their transparency report about the formulation, development and scope of their public statements of policies and procedures (as well as the application of those statements).
199: Schedule 8, page 214, line 15, at end insert—
“24A_ The design and operation of algorithms which affect the display, promotion, restriction or recommendation of illegal search content or search content that is harmful to children.”
Member’s explanatory statement
This amendment means that OFCOM can require providers of search services to include information in their transparency report about algorithms, as mentioned in this new paragraph.
200: Schedule 8, page 214, line 22, at end insert—
“26A_ Measures taken or in use by a provider to comply with any duty set out in section (Disclosure of information about use of service by deceased child users) (deceased child users).”
Member’s explanatory statement
This amendment means that OFCOM can require providers of search services to include information in their transparency report about measures taken to comply with the new duties imposed by the Clause proposed after Clause 67 in my name.
201: Schedule 8, page 215, line 9, leave out “to 3” and insert “to 3A”
Member’s explanatory statement
This amendment requires OFCOM, in considering which information to require from a provider in a transparency report, to consider whether the provider is subject to the duties imposed by Chapter 3A, which is the new Chapter containing the new duties imposed by the Clause proposed after Clause 67 in my name.
202: Schedule 8, page 215, line 25, leave out from “(2),” to “and” in line 26 and insert “indicates (in whatever words) that the presence of content of that kind is prohibited on the service or that users’ access to content of that kind is restricted,”
Member’s explanatory statement
This amendment makes a change to the definition of “relevant content” which applies for the purposes of the transparency reporting Schedule. The effect of the change is to cover a wider range of ways in which a term of service might indicate that a certain kind of content is not allowed on the service.
203: Schedule 8, page 215, line 34, at end insert—
“(4) The reference in sub-paragraph (1) to users’ access to content being restricted is to be construed in accordance with sections 52 and 211(5).”
Member’s explanatory statement
This technical amendment makes it clear that the reference to users’ access to content being restricted in the transparency reporting Schedule has the meaning given to it in Part 3 of the Bill.
Amendments 192 to 203 agreed.
Amendment 204 not moved.
18:45
Clause 70: “Pornographic content”, “provider pornographic content”, “regulated provider pornographic content”
Amendments 205 to 209
Moved by
205: Clause 70, page 66, line 42, leave out subsection (2)
Member’s explanatory statement
This amendment is consequential on the amendment to Clause 211 in my name adding a definition of “pornographic content” to that Clause.
206: Clause 70, page 67, leave out lines 4 to 6 and insert “, including pornographic content published or displayed on the service by means of—
(a) software or an automated tool or algorithm applied by the provider or by a person acting on behalf of the provider, or
(b) an automated tool or algorithm made available on the service by the provider or by a person acting on behalf of the provider.”
Member’s explanatory statement
This amendment is about what counts as “provider pornographic content” for the purposes of Part 5 of the Bill. Words are added to expressly cover the case where an automated tool or algorithm is made available on the service by a provider, such as a generative AI bot.
207: Clause 70, page 67, line 8, leave out from “than” to end of line 10 and insert “content within subsection (4A) or (4B).”
Member’s explanatory statement
This amendment is related to the next amendment in my name which inserts new subsection (4A) into Clause 70. The change is to the scope of what it means for content to consist only of text.
208: Clause 70, page 67, line 10, at end insert—
“(4A) Content is within this subsection if it—
(a) consists only of text, or
(b) consists only of text accompanied by—
(i) a GIF which is not itself pornographic content,
(ii) an emoji or other symbol, or
(iii) a combination of content mentioned in sub-paragraphs (i) and (ii).
(4B) Content is within this subsection if it consists of a paid-for advertisement (see section 211).”
Member’s explanatory statement
This amendment clarifies the scope of the exemption from the Part 5 duties for content which consists only of text. Such content does not count as regulated provider pornographic content.
209: Clause 70, page 67, line 20, at end insert “and
(iii) references to pornographic content that is generated on the service by means of an automated tool or algorithm in response to a prompt by a user and is only visible or audible to that user (no matter for how short a time);”
Member’s explanatory statement
This amendment makes it clear that, for the purposes of Part 5 (provider pornography), content is within scope of the duties if it is AI-generated content.
Amendments 205 to 209 agreed.
Clause 72: Duties about regulated provider pornographic content
Amendments 210 to 214
Moved by
210: Clause 72, page 68, line 18, leave out subsection (2) and insert—
“(2) A duty to ensure, by the use of age verification or age estimation (or both), that children are not normally able to encounter content that is regulated provider pornographic content in relation to the service.
(2A) The age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child.”
Member’s explanatory statement
This amendment requires providers within scope of Part 5 to use highly effective age verification or age estimation (or both) to comply with the duty in Clause 72(2) (preventing children from encountering provider pornographic content).
211: Clause 72, page 68, line 21, leave out “A” and insert “In relation to the duty set out in subsection (2), a”
Member’s explanatory statement
This amendment is a technical change relating to the preceding amendment in my name.
212: Clause 72, page 68, line 23, leave out paragraph (a) and insert—
“(a) the kinds of age verification or age estimation used, and how they are used, and”
Member’s explanatory statement
This amendment requires Part 5 providers to keep a written record about the age verification or age estimation measures they use to comply with the duty in Clause 72(2).
213: Clause 72, page 68, line 25, leave out from “on” to “has” in line 26 and insert “the kinds of age verification or age estimation and how they should be used,”
Member’s explanatory statement
This amendment is consequential on the preceding amendment in my name.
214: Clause 72, page 68, line 31, at end insert—
“(4) A duty to summarise the written record in a publicly available statement, so far as the record concerns compliance with the duty set out in subsection (2), including details about which kinds of age verification or age estimation a provider is using and how they are used.”
Member’s explanatory statement
This amendment requires Part 5 providers to make publicly available a summary of the age verification or age estimation measures used to comply with the duty in Clause 72(2), and how they are used.
Amendments 210 to 214 agreed.
Clause 73: OFCOM’s guidance about duties set out in section 72
Amendment 215
Moved by
215: Clause 73, page 68, line 36, leave out from “of” to end of line 37 and insert “kinds and uses of age verification and age estimation that are, or are not, highly effective at correctly determining whether or not a particular user is a child,”
Member’s explanatory statement
This amendment requires OFCOM’s guidance about the duty in Clause 72(2) to give examples of kinds and uses of age verification and age estimation that are, or are not, highly effective at determining whether or not a user is a child.
Amendment 215 agreed.
Amendment 216
Moved by
216: Clause 73, page 68, line 43, at end insert—
“(2A) The guidance may elaborate on the following principles governing the use of age verification or age estimation for the purpose of compliance with the duty set out in section 72(2)—
(a) the principle that age verification or age estimation should be easy to use;
(b) the principle that age verification or age estimation should work effectively for all users regardless of their characteristics or whether they are members of a certain group;
(c) the principle of interoperability between different kinds of age verification or age estimation.
(2B) The guidance may refer to industry or technical standards for age verification or age estimation (where they exist).”
Member’s explanatory statement
This amendment sets out principles about age verification or age estimation, which are relevant to OFCOM’s guidance to providers about their duty in Clause 72(2).
Amendment 217 (to Amendment 216) not moved.
Amendment 216 agreed.
Clause 156: Consultation and parliamentary procedure
Amendment 218 not moved.
Clause 157: Directions about advisory committees
Amendment 218A not moved.
Clause 158: Directions in special circumstances
Amendment 218B
Moved by
218B: Clause 158, page 139, line 5, leave out “duty” and insert “duties”
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 149 in my name expanding OFCOM’s duties to promote media literacy in relation to regulated user-to-user and search services.
Lord Parkinson of Whitley Bay (Con)

My Lords, the amendments in this group relate to provisions for media literacy in the Bill and Ofcom’s existing duty on media literacy under Section 11 of the Communications Act 2003. I am grateful to noble Lords from across your Lordships’ House for the views they have shared on this matter, which have been invaluable in helping us draft the amendments.

Media literacy remains a key priority in our work to tackle online harms; it is essential not only to keep people safe online but for them to understand how to make informed decisions which enhance their experience of the internet. Extensive work is currently being undertaken in this area. Under Ofcom’s existing duty, the regulator has initiated pilot work to promote media literacy. It is also developing best practice principles for platform-based media literacy measures and has published guidance on how to evaluate media literacy programmes.

While we believe that the Communications Act provides Ofcom with sufficient powers to undertake an ambitious programme of media literacy activity, we have listened to the concerns raised by noble Lords and understand the desire to ensure that Ofcom is given media literacy objectives which are fit for the digital age. We have therefore tabled the following amendments seeking to update Ofcom’s statutory duty to promote media literacy, in so far as it relates to regulated services.

Amendment 274B provides new objectives for Ofcom to meet in discharging its duty. The first objective requires Ofcom to take steps to increase the public’s awareness and understanding of how they can keep themselves and others safe when using regulated services, including building the public’s understanding of the nature and impact of harmful content online, such as disinformation and misinformation. To meet that objective, Ofcom will need to carry out, commission or encourage the delivery of activities and initiatives which enhance users’ media literacy in these ways.

It is important to note that, when fulfilling this new objective, Ofcom will need to increase the public’s awareness of the ways in which they can protect groups that disproportionately face harm online, such as women and girls. The updated duty will also compel Ofcom to encourage the development and use of technologies and systems that support users of regulated services to protect themselves and others. Ofcom will be required to publish a statement recommending ways in which others, including platforms, can take action to support their users’ media literacy.

Amendment 274C places a new requirement on Ofcom to publish a strategy setting out how it will fulfil its media literacy functions under Section 11, including the new objectives. Ofcom will be required to update this strategy every three years and report on progress made against it annually to provide assurance that it is fulfilling its duty appropriately. These reports will be supported by the post-implementation review of the Bill, which covers Ofcom’s media literacy duty in so far as it relates to regulated services. This will provide a reasonable point at which to establish the impact of Ofcom’s work, having given it time to take effect.

I am confident that, through this updated duty, Ofcom will be empowered to ensure that internet users become more engaged with media literacy and, as a result, are safer online. I hope that these amendments will find support from across your Lordships’ House, and I beg to move.

Baroness Bull (CB)

My Lords, I welcome this proposed new clause on media literacy and support the amendments in the names of the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth. I will briefly press the Minister on two points. First, proposed new subsection (1C) sets out how Ofcom must perform its duty under proposed new subsection (1A), but it does not explicitly require Ofcom to work in partnership with existing bodies already engaged in and expert in provision of these kinds of activities. The potential for Ofcom to commission is explicit, but this implies quite a top-down relationship, not a collaboration that builds on best practice, enables scale-up where appropriate and generally avoids reinventing wheels. It seems like a wasted opportunity to fast-track delivery of effective programmes through partnership.

My second concern is that there is no explicit requirement to consider the distinct needs of specific user communities. In particular, I share the concerns of disability campaigners and charities that media literacy activities and initiatives need to take into account the needs of people with learning disabilities, autism and mental capacity issues, both in how activities are shaped and in how they are communicated. This is a group of people who have a great need to go online and engage, but we also know that they are at greater risk online. Thinking about how media literacy can be promoted, particularly among learning disability communities, is really important.

The Minister might respond by saying that Ofcom is already covered by the public sector equality duty and so is already obliged to consider the needs of people with protected characteristics when designing and implementing policies. But the unfortunate truth is that the concerns of the learning disability community are an afterthought in legislation compared with other disabilities, which are already an afterthought. The Petitions Committee in the other place, in its report on online abuse and the experience of disabled people, noted that there are multiple disabled people around the country with the skills and experience to advise government and its bodies but that there is a general unwillingness to engage directly with them. They are often described as hard to reach, which is kind of ironic because in fact most of these people use multiple services and so are very easy to reach, because they are on lots of databases and in contact with government bodies all the time.

The Minister may also point out that Ofcom’s duties in the Communications Act require it to maintain an advisory committee on elderly and disabled persons that includes

“persons who are familiar with the needs of persons with disabilities”.

But referring to an advisory committee is not the same as consulting people with disabilities, both physical and mental, and it is especially important to consult directly with people who may have difficulty understanding what is being proposed. Talking to people directly, rather than through an advisory committee, is very much the goal.

Unlike the draft Bill, which had media literacy as a stand-alone clause, the intention in this iteration is to deal with the issue by amending the Communications Act. It may be that in the web of interactions between those two pieces of legislation, my concerns can be set to rest. But I would find it very helpful if the Minister could confirm today that the intention is that media literacy programmes will be developed in partnership with—and build on best practice of—those organisations already delivering in this space and that the organisations Ofcom collaborates with will be fully inclusive of all communities, including those with disabilities and learning disabilities. Only in this way can we be confident that media literacy programmes will meet their needs effectively, both in content and in how they are communicated.

Finally, can the Minister confirm whether Ofcom considers people with lived experience of disability as subject matter experts on disability for the purpose of fulfilling its consultation duties? I asked this question during one of the helpful briefing sessions during the Bill’s progress earlier this year, but I did not get an adequate answer. Can the Minister clarify that for the House today?

Baroness Fox of Buckley (Non-Afl)

My Lords, I want to look at how, in the Government’s expansion of Ofcom’s duties to prioritise media literacy, that duty has become linked to this group, and at the way in which Amendment 274B does this. It is very much linked with misinformation and disinformation. According to the amendment, there has to be an attempt to establish

“accuracy and authenticity of content”

and to

“understand the nature and impact of disinformation and misinformation, and reduce their and others’ exposure to it”.

I was wondering about reducing users’ exposure to misinformation and disinformation. That gives me pause, because I worry that reducing exposure will obviously mean the removal or censorship of material. I just want to probe some assumptions. Is it the presumption that incorrect or seemingly untrue or erroneous information is the predominant cause of real harm if it is not suppressed? Is there not a risk of harm in suppressing ideas too? Apart from the fact that heretical scientific and political theories were historically seen as misinformation and now are conventional wisdom, is there a danger that suppression in the contemporary period would create mistrust and encourage conspiratorial thinking—people saying, “What have you got to hide?”—and so on?

I want to push this by probing Amendment 269AA in the name of the noble Lord, Lord Clement-Jones, which itself is a probing amendment as to why Ofcom’s misinformation and disinformation committee is not required to consider the provenance of information to help empower users to understand whether content is real or true and so on, rather than the wording at the moment, “accuracy and authenticity”. When I saw the word “provenance”, I stopped for a moment. In all the debates going on in society about misinformation and disinformation, excellent provenance cannot necessarily guarantee truth.

I was shocked to discover that the then Wellcome Trust director, Jeremy Farrar, who is now the chief scientist at the World Health Organization, claimed that the Wuhan lab leak and the manmade theories around Covid were highly improbable. We now know that there were emails from Jeremy Farrar—I was shocked because I am a great fan of the Wellcome Trust and Jeremy Farrar’s work in general—in which there was a conscious bending of the truth that led to the editing of a scientific paper and a letter in the Lancet that proved to have been spun in a way to give wrong information. When issues such as the Wuhan lab leak were raised by Matt Ridley, recently of this parish—I do not know whether his provenance would count—they were dismissed as some kind of racist conspiracy theory. I am just not sure that it is that clear that you can get provenance right. We know from the Twitter files that the Biden Administration leaned on social media companies to suppress the Hunter Biden laptop story that was in the New York Post, which was described as Russian disinformation. We now know that it was true.

Therefore, I am concerned to ensure that this well-meaning amendment, which says we should have better media literacy, does not give in to these lazy labels of disinformation and misinformation, as if we all know what the truth is and all we need is fact-checkers, provenance and authenticity. Disinformation and misinformation have been weaponised, which can cause some serious problems.

Can the Minister clarify whether the clause on media literacy is a genuine, positive attempt at encouraging people to know more, or itself becomes part of an information war that is going on offline and which will not help users at all but only confuse things?

19:00
Baroness Kidron (CB)

My Lords, I will speak to the government Amendments 274B and 274C. I truly welcome a more detailed approach to Ofcom’s duties in relation to media literacy. However, as is my theme today, I raise two frustrations. First, having spent weeks telling us that it is impossible to include harms that go beyond content and opposing amendments on that point, the Government’s media literacy strategy includes a duty to help users to understand the harmful ways in which regulated services may be used. This is in addition to understanding the nature and impact of harmful content. It appears to suggest that it is the users who are guilty of misuse of products and services rather than putting any emphasis on the design or processes that determine how a service is most often used.

I believe that all of us, including children, are participants in creating an online culture and that educating and empowering users of services is essential. However, it should not be a substitute for designing a service that is safe by design and default. To make my point absolutely clear, I recount the findings of researchers who undertook workshops in 28 countries with more than 1,000 children. The researchers were at first surprised to find that, whether in Kigali, São Paulo or Berlin, to an overwhelming extent children identified the same problems online—harmful content, addiction, lack of privacy and so on. The children’s circumstances were so vastly different—country and town, Africa and the global north et cetera—but when the researchers did further analysis, they realised that the reason why they had such similar experiences was that they were using the same products. The products were more determining of the outcome than anything to do with religion, education, status, age, the family or even the country. The only other factor that loomed large, which I admit that the Government have recognised, was gender. Those were the two most crucial findings. It is an abdication of adult responsibility to place the onus on children to keep themselves safe. The amendment and the Bill, as I keep mentioning, should focus on the role of design, not on how a child uses it.

My second point, which is of a similar nature, is that I am very concerned that a lot of digital literacy—for adults as well as children, but my particular concern is in schools—is provided by the tech companies themselves. Therefore, once again their responsibility, their role in the system and process of what children might find from reward loops, algorithms and so on, is very low down on the agenda. Is it possible at this late stage to consider that Ofcom might have a responsibility to consider the system design as part of its literacy review?

Lord Clement-Jones (LD)

My Lords, this has been a very interesting short debate. Like other noble Lords, I am very pleased that the Government have proposed the new clauses in Amendments 274B and 274C. The noble Baroness, Lady Bull, described absolutely the importance of media literacy, particularly for disabled people and for the vulnerable. This is really important for them. It is important also not to fall into the trap described by the noble Baroness, Lady Kidron, of saying, “You are a child or a vulnerable person. You must acquire media literacy—it’s your obligation; it’s not the obligation of the platforms to design their services appropriately”. I take that point, but it does not mean that media literacy is not extraordinarily important.

However, sadly, I do not believe that the breadth of the Government’s new media literacy amendments is as wide as the original draft Bill. If you look back at the draft Bill, that was a completely new and upgraded set of duties right across the board, replacing Section 11 of the Communications Act and, in a sense, fit for the modern age. The Government have made a media literacy duty which is much narrower. It relates only to regulated services. This is not optimum. We need something broader which puts a bigger and broader duty for the future on to Ofcom.

It is also deficient in two respects. The noble Lord, Lord Knight, will speak to his amendments, but it struck me immediately when looking at that proposed new clause that we were missing all the debate about functionalities and so on that the noble Baroness, Lady Kidron, debated the other day, regarding design, and that we must ensure that media literacy encompasses understanding the underlying functionalities and systems of the platforms that we are talking about.

I know that your Lordships will be very excited to hear that I am going to refer again to the Joint Committee. I know that the Minister has read us from cover to cover, but at paragraph 381 on the draft Bill we said, and it is still evergreen:

“If the Government wishes to improve the UK’s media literacy to reduce online harms, there must be provisions in the Bill to ensure media literacy initiatives are of a high standard. The Bill should empower Ofcom to set minimum standards for media literacy initiatives that both guide providers and ensure the information they are disseminating aligns with the goal of reducing online harm”.


I had a very close look at the clause. I could not see that Ofcom is entitled to set minimum standards. The media literacy provisions sadly are deficient in that respect.

Lord McNally (LD)

I am not surprised that my noble friend refers to his experience on the Joint Committee. He will not be surprised that I am about to refer to my experience on the Puttnam committee in 2003, which recommended media literacy as a priority for Ofcom. The sad fact is that media literacy was put on the back burner by Ofcom for almost 20 years. While I listen to this House, I think that my noble friend is quite right to accuse the Government, hard as the Minister has tried, of a paucity of ambition and—more than that—of letting us slip into the same mistake made by Ofcom after 2003 and allowing this to be a narrow, marginal issue. The noble Baroness, Lady Kidron, has reminded us time and again that unless we educate those who are using these technologies, these abuses will proliferate.

Therefore, with what my noble friend is advocating and what we will keep an eye on as the Bill is implemented—and I now literally speak over the Minister’s head, to the Member behind—Ofcom must take media literacy seriously and be a driving force in its implementation, for the very reasons that the noble Baroness, Lady Fox, referred to. We do not want everybody protected by regulations and powers—we want people protected by their own knowledge of what they are dealing with. This is where there is a gap between what has been pressed on the Government and what they are offering.

Lord Clement-Jones (LD)

My Lords, I thank my noble friend very much for that intervention.

Lord Harlech (Con)

My Lords, I remind the House that, as we are on Report, interventions on current speakers should be for direct questions or points of elucidation.

Lord Clement-Jones (LD)

I am sure my noble friend with 30 years’ experience stands duly corrected. He has reminded us that we have 20 years’ experience of something being on the statute book without really cranking up the powers and duties that are on it or giving Ofcom appropriate resources in the media literacy area. If that was about offline—the original 2003 duty—we know that it is even more important online to have these media literacy duties in place. I very much hope that the Minister can give us, in a sense, a token of earnest—that it is not just about putting these duties on the statute book but about giving Ofcom the resources to follow this up. Of course, it is also relevant to other regulators, which was partly the reason for having a duty of co-operation. Perhaps he will also, at the same time, describe how regulators such as Ofsted will have a role in media literacy.

I shall briefly talk about Amendment 269AA to Clause 141, which is the clause in the Bill setting up the advisory committee on misinformation and disinformation. I heard very clearly what the noble Baroness, Lady Fox, had to say, and I absolutely agree—there is no silver bullet in all this. Establishing provenance is but one way in which to get greater transparency and authentication and exercise judgment; it is not the complete answer, but it is one way of getting to grips more with some of the information coming through online. She may have seen that this is an “and” rather than an “or”, which is why the amendment is phrased as it is.

Of course, it is really important that there are initiatives. The one that I want to mention today about provenance is the Content Authenticity Initiative, which I mentioned in Committee. We need to use the power of such initiatives; it is a global coalition working to increase transparency in digital content through open industry standards, and it was founded four years ago and has more than 1,500 members, with some major companies such as Adobe, Microsoft, NVIDIA, Arm, Intel—I could go on. I very much hope that Ofcom will engage with the Content Authenticity Initiative, whatever the content of the Bill. In a sense, I am raising the issue for the Minister to give us assurances that this is within the scope of what the committee will be doing—that it is not just a question of doing what is in the Bill, and this will be included in the scope of the advisory committee’s work.

This industry-led initiative has developed content credentials, which encode important metadata into pieces of content. Those pieces of information reside indefinitely in the content, wherever it is used, published or stored, and, as a result, viewers are able to make more informed decisions about whether or not to trust the content. The advisory committee really should consider the role of provenance tools such as content credentials to enable users to have the relevant information to decide what is real and what is disinformation or misinformation online. That would entirely fit the strategy of this Bill to empower adult users.

19:15
Lord Knight of Weymouth (Lab)

My Lords, the Government have moved on this issue, and I very much welcome that. I am grateful to the Minister for listening and for the fact that we now have Section 11 of the Communications Act being brought into the digital age through the Government’s Amendments 274B and 274C. The public can now expect to be informed and educated about content-related harms, reliability and accuracy; technology companies will have to play their part; and Ofcom will have to regularly report on progress, and will commission and partner with others to fulfil those duties. That is great progress.

The importance of this was underscored at a meeting of the United Nations Human Rights Council just two weeks ago. Nada Al-Nashif, the UN Deputy High Commissioner for Human Rights, said in an opening statement that media and digital literacy empowered individuals and

“should be considered an integral part of education efforts”.

Tawfik Jelassi, the assistant director-general of UNESCO, in a statement attached to that meeting, said that

“media and information literacy was essential for individuals to exercise their right to freedom of opinion and expression”—

I put that in to please the noble Baroness, Lady Fox—and

“enabled access to diverse information, cultivated critical thinking, facilitated active engagement in public discourse, combatted misinformation, and safeguarded privacy and security, while respecting the rights of others”.

If only the noble Lord, Lord Moylan, was in his place to hear me use the word privacy. He continued:

“Together, the international community could ensure that media and information literacy became an integral part of everyone’s lives, empowering all to think critically, promote digital well-being, and foster a more inclusive and responsible global digital community”.


I thought those were great words, summarising why we needed to do this.

I am grateful to Members on all sides of the House for the work that they have done on media literacy. Part of my reason for repeating those remarks is that this is so much more about empowerment than it is about loading safety on to individuals, as the noble Baroness, Lady Kidron, rightly said in her comments.

Nevertheless, we want the Minister to reflect on a couple of tweaks. Amendment 269C in my name is around an advisory committee being set up within six months and in its first report assessing the need for a code on misinformation. I have a concern that, as the regime that we are putting in place with this Bill comes into force and causes some of the harmful content that people find engaging to be suppressed, the algorithms will go to something else that is engaging, and that something else is likely to be misinformation and disinformation. I have a fear that that will become a growing problem that the regulator will need to be able to address, which is why it should be looking at this early.

Incidentally, that is why the regulator should also look at provenance, as in Amendment 269AA from the noble Lord, Lord Clement-Jones. It was tempting in listening to him to see whether there was an AI tool that could trawl across all the comments that he has made during the deliberations on this Bill to see whether he has quoted the whole of the joint report—but that is a distraction.

My Amendment 269D goes to the need for media literacy on systems, processes and business models, not just on content. Time and again, we have emphasised the need for this Bill to be as much about systems as content. There are contexts where individual, relatively benign pieces of content can magnify if part of a torrent that then creates harm. The Mental Health Foundation has written to many of us to make this point. In the same way that the noble Baroness, Lady Bull, asked about ensuring that those with disability have their own authentic voice heard as these media literacy responsibilities are played out, so the Mental Health Foundation wanted the same kind of involvement from young people; I agree with both. Please can we have some reassurance that this will be very much part of the literacy duties on Ofcom and the obligations it places on service providers?

Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful to noble Lords for their comments, and for the recognition from the noble Lord, Lord Knight, of the changes that we have made. I am particularly grateful to him for having raised media literacy throughout our scrutiny of this Bill.

His Amendments 269C and 269D seek to set a date by which the establishment of the advisory committee on misinformation and disinformation must take place and to set requirements for its first report. Ofcom recognises the valuable role that the committee will play in providing advice in relation to its duties on misinformation and disinformation, and has assured us that it will aim to establish the committee as soon as is reasonably possible, in recognition of the threats posed by misinformation and disinformation online.

Given the valuable role of the advisory committee, Ofcom has stressed how crucial it will be to have appropriate time to appoint the best possible committee. Seeking to prescribe a timeframe for its implementation risks impeding Ofcom’s ability to run the thorough and transparent recruitment process that I am sure all noble Lords want and to appoint the most appropriate and expert members. It would also not be appropriate for the Bill to be overly prescriptive on the role of the committee, including with regard to its first report, in order for it to maintain the requisite independence and flexibility to give us the advice that we want.

Amendment 269AA from the noble Lord, Lord Clement-Jones, seeks to add advice on content provenance to the duties of the advisory committee. The new media literacy amendments, which update Ofcom’s media literacy duties, already include a requirement for Ofcom to take steps to help users establish the reliability, accuracy and authenticity of content found on regulated services. Ofcom will have duties and mechanisms to be able to advise platforms on how they can help users to understand whether content is authentic; for example, by promoting tools that assist them to establish the provenance of content, where appropriate. The new media literacy duties will require Ofcom to take tangible steps to prioritise the public’s awareness of and resilience to misinformation and disinformation online. That may include enabling users to establish the reliability, accuracy and authenticity of content, but the new duties will not remove content online; I am happy to reassure the noble Baroness, Lady Fox, on that.

The advisory committee is already required under Clause 141(4)(c) to advise Ofcom on its exercise of its media literacy functions, including its new duties relating to content authenticity. The Bill does not stipulate what tools service providers should use to fulfil their duties, but Ofcom will have the ability to recommend in its codes of practice that companies use tools such as provenance technologies to identify manipulated media which constitute illegal content or content that is harmful to children, where appropriate. Ofcom is also required to take steps to encourage the development and use of technologies that provide users with further context about content that they encounter online. That could include technologies that support users to establish content provenance. I am happy to reassure the noble Lord, Lord Clement-Jones, that the advisory committee will already be required to advise on the issues that he has raised in his amendment.

On media literacy more broadly, Ofcom retains its overall statutory duty to promote media literacy, which remains broad and non-prescriptive. The new duties in this Bill, however, are focused specifically on harm; that is because of the nature of the Bill, which seeks to make the UK the safest place in the world to be online and is necessarily focused on tackling harms. To ensure that Ofcom succeeds in the delivery of these new specific duties with regard to regulated services, it is necessary that the regulator has a clearly defined scope. Broadening the duties would risk overburdening Ofcom by making its priorities less clear.

The noble Baroness, Lady Bull—who has been translated to the Woolsack while we have been debating this group—raised media literacy for more vulnerable users. Under Ofcom’s existing media literacy programme, it is already delivering initiatives to support a range of users, including those who are more vulnerable online, such as people with special educational needs and people with disabilities. I am happy to reassure her that, in delivering this work, Ofcom is already working not just with expert groups including Mencap but with people with direct personal experiences of living with disabilities.

The noble Lord, Lord Clement-Jones, raised Ofsted. Effective regulatory co-ordination is essential for addressing the crosscutting opportunities and challenges posed by digital technologies and services. Ofsted will continue to engage with Ofcom through its existing mechanisms, including engagement led by its independent policy team and those held with Ofcom’s online safety policy director. In addition to that, Ofsted is considering mechanisms through which it can work more closely with Ofcom where appropriate. These include sharing insights from inspections in an anonymised form, which could entail reviews of its inspection bases and focus groups with inspectors, on areas of particular concern to Ofcom. Ofsted is committed to working with Ofcom’s policy teams to work these plans up in more detail.

Lord McNally Portrait Lord McNally (LD)
- Hansard - - - Excerpts

My Lords, could I ask the Minister a question? He has put his finger on one of the most important aspects of this Bill: how it will integrate with the Department for Education and all its responsibilities for schools. Again, talking from long experience, one of the worries is the silo mentality in Whitehall, which is quite often strongest in the Department for Education. Some real effort will be needed to make sure there is a crossover from the powers that Ofcom has to what happens in the classroom.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I hope what I have said about the way that Ofsted and Ofcom are working together gives the noble Lord some reassurance. He is right, and it is not just in relation to the Department for Education. In my own department, we have discussed in previous debates on media literacy the importance of critical thinking, equipping people with the sceptical, quizzical, analytic skills they need—which art, history and English literature do as well. The provisions in this Bill focus on reducing harm because the Bill is focused on making the UK the safest place to be online, but he is right that media literacy work more broadly touches on a number of government departments.

Amendment 274BA would require Ofcom to promote an understanding of how regulated services’ business models operate, how they use personal data and the operation of their algorithmic systems and processes. We believe that Ofcom’s existing duty under the Communications Act already ensures that the regulator can cover these aspects in its media literacy activities. The duty requires Ofcom to build public awareness of the processes by which material on regulated services is selected or made available. This enables Ofcom to address the platform features specified in this amendment.

The Government’s amendments include extensive new objectives for Ofcom, which apply to harmful ways in which a service is used as well as harmful content. We believe it important not to add further to this duty when the outcomes can already be achieved through the existing duty. We do not wish to limit, by implication, Ofcom’s media literacy duties in relation to other, non-regulated services.

We also judge that the noble Lord’s amendment carries a risk of confusing the remits of Ofcom and the Information Commissioner’s Office. UK data protection law already confers a right for people to be informed about how their personal data are being used, making this aspect of the amendment superfluous.

19:30
Amendment 274BB would direct Ofcom to set minimum standards for media literacy activities and initiatives. The only body that has duties in relation to media literacy is Ofcom itself and there would be no obligation for any other organisation to follow these standards. Ofcom could develop a voluntary standard for others but this could be achieved through proposed new subsection (1D), to be inserted by government Amendment 274B. This approach allows a flexibility that a requirement to set minimum standards would not. Rather than imposing a rigid set of standards, we are focusing on improving evaluation practices of media literacy initiatives to identify which measures are most effective and encourage their delivery.
Furthermore, Ofcom will be required to publish a statement recommending ways in which others, including platforms, can take action to support their users’ media literacy. Recommendations may include the development, pursuit and evaluation of activities or initiatives in relation to media literacy. This statement must also be published in a manner that Ofcom considers appropriate for bringing it to the attention of the persons who, in its opinion, are likely to be affected by it. Ofcom has undertaken extensive work to produce a comprehensive toolkit to support practitioners to deliver robust evaluations of their programmes. It was published in February this year and has met with praise from practitioners, including those who have received grant funding through the Government’s non-legislative media literacy work programme.
Having listened to this helpful debate, I remain confident that the provisions we are proposing will tackle the challenges that noble Lords have raised.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

I do not believe that the Minister has dealt with the minimum standards issue.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I do not think that the noble Lord was listening to that point, but I did deal with it.

Amendment 218B agreed.
Amendment 219
Moved by
219: Clause 158, leave out Clause 158
Member’s explanatory statement
This amendment would remove Clause 158 (Directions in special circumstances) from the Bill and is intended to further probe the Secretary of State’s power in this area.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, Clause 158 is one of the more mysterious clauses in the Bill and it would greatly benefit from a clear elucidation by the Minister of how it is intended to work to reduce harm. I thank him for having sent me an email this afternoon as we started on the Bill, for which I am grateful; I had only a short time to consider it but I very much hope that he will put its content on the record.

My amendment is designed to ask how the Minister envisages using the power to direct if, say, there is a new contagious disease or riots, and social media is a major factor in the spread of the problem. I am trying to erect some kind of hypothetical situation through which the Minister can say how the power will be used. Is the intention, for example, to set Ofcom the objective of preventing the spread of information on regulated services injurious to public health or safety on a particular network for six months? The direction then forces the regulator and the social media companies to confront the issue and perhaps publicly shame an individual company into using their tools to slow the spread of disinformation. The direction might give Ofcom powers to gather sufficient information from the company to make directions to the company to tackle the problem.

If that is envisaged, which of Ofcom’s media literacy powers does the Minister envisage being used? Might it be Section 11(1)(e) of the Communications Act 2003, which talks about encouraging

“the development and use of technologies and systems for regulating access to such material, and for facilitating control over what material is received, that are both effective and easy to use”.

By this means, Ofcom might encourage a social media company to regulate access to and control over the material that is a threat.

Perhaps the Minister could set out clearly how he intends all this to work, because on a straight reading of Clause 158, we on these Benches have considerable concerns. The threshold for direction is low—merely having

“reasonable grounds for believing that circumstances exist”—

and there is no sense here of the emergency that the then Minister, Mr Philp, cited in the Commons Public Bill Committee on 26 May 2022, nor even of the exceptional circumstances in Amendment 138 to Clause 39, which the Minister tabled recently. The Minister is not compelled by the clause to consult experts in public health, safety or national security. The Minister can set any objectives for Ofcom, it seems. There is no time limit for the effect of the direction and it seems that the direction can be repeatedly extended with no limit. If the Minister directs because they believe there is a threat to national security, we will have the curious situation of a public process being initiated for reasons the Minister is not obliged to explain.

Against this background, there does not seem to be a case for breaching the international convention of the Government not directing a media regulator. Independence of media regulators is the norm in developed democracies, and the UK has signed many international statements in this vein. As recently as April 2022, the Council of Europe stated:

“Media and communication governance should be independent and impartial to avoid undue influence on policymaking or”


the discriminatory and

“preferential treatment of powerful groups”,

including those with significant political or economic power. The Secretary of State, by contrast, has no powers over Ofcom regarding the content of broadcast regulation and has limited powers to direct over radio spectrum and wireless, but not content. Ofcom’s independence in day-to-day decision-making is paramount to preserving freedom of expression. There are insufficient safeguards in this clause, which is why I argue that it should not stand part of the Bill.

I will be brief about Clause 159 because, by and large, we went through it in our debate on a previous group. Now that we can see the final shape of the Bill, it really does behove us to stand back and see where the balance has settled on Ofcom’s independence and whether this clause needs to stand part of the Bill. The Secretary of State has extensive powers under various other provisions in the Bill. The Minister has tabled welcome amendments to Clause 39, which have been incorporated into the Bill, but Clause 155 still allows the Secretary of State to issue a “statement of strategic priorities”, including specific outcomes, every five years.

Clause 159 is in addition to this comprehensive list, but the approach in the clause is incredibly broad. We have discussed this, and the noble Lord, Lord Moylan, has tabled an amendment that would require parliamentary scrutiny. The Secretary of State can issue guidance to Ofcom on more or less anything encompassed by the exercise of its functions under this Act, with no consultation of the public or Parliament prior to making such guidance. The time limit for producing strategic guidance is three years rather than five. Even if it is merely “have regard” guidance, it represents an unwelcome intervention in Ofcom going about its business. If the Minister responds that the guidance is merely “to have regard”, I will ask him to consider this: why have it at all, then, when there are so many other opportunities for the Government to intervene? For the regulated companies, it represents a regulatory hazard of interference in independent regulation and a lack of stability. As the noble Lord, Lord Bethell, said in Committee, a clear benefit of regulatory independence is that it reduces lobbying of the Minister by powerful corporate interests.

Now that we can see it in context, I very much hope that the Minister will agree that Clause 159 is a set of guidance too many that compromises Ofcom’s independence and should not stand part of the Bill.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I will add to my noble friend’s call for us to consider whether Clause 158 should be struck from the Bill as an unnecessary power for the Secretary of State to take. We have discussed powers for the Secretary of State throughout the Bill, with some helpful improvements led by the noble Baroness, Lady Stowell. This one jars in particular because it is about media literacy; some of the other powers related to whether the Secretary of State could intervene on the codes of practice that Ofcom would issue. The core question is whether we trust Ofcom’s discretion in delivering media literacy and whether we need the Secretary of State to have any kind of power to intervene.

I single out media literacy because the clue is in the name: literacy is a generic skill that you acquire about dealing with the online world; it is not about any specific text. Literacy is a broader set of skills, yet Clause 158 has a suggestion that, in response to specific forms of content or a specific crisis happening in the world, the Secretary of State would want to take this power to direct the media literacy efforts. To take something specific and immediate to direct something that is generic and long-term jars and seems inappropriate.

I have a series of questions for the Minister to elucidate why this power should exist at all. It would be helpful to have an example of what kind of “public statement notice”—to use the language in the clause—the Government might want to issue that Ofcom would not come up with on its own. Part of the argument we have been presented with is that, somehow, the Government might have additional information, but it seems quite a stretch that they could come up with that. In an area such as national security, my experience has been that companies often have a better idea of what is going on than anybody in government.

Thousands of people out there in the industry are familiar with APT 28 and APT 29 which, as I am sure all noble Lords know, are better known by their names Fancy Bear and Cozy Bear. These are agents of the Russian state that put out misinformation. There is nothing that UK agencies or the Secretary of State might know about them that is not already widely known. I remember talking about the famous troll factory run by Prigozhin, the Internet Research Agency, with people in government in the context of Russian interference—they would say “Who?” and have to go off and find out. In dealing with threats such as that, you certainly want a media literacy campaign, developed between the people in the companies and Ofcom, which tells you about these troll agencies and how they operate and gives warnings to the public, but I struggle to see why you need the Secretary of State to intervene as opposed to allowing Ofcom’s experts to work with company experts and come up with a strategy to deal with those kinds of threat.

The other example cited of an area where the Secretary of State might want to intervene is public health and safety. It would be helpful to be specific; had they had it, how would the Government have used this power during the pandemic in 2020 and 2021? Does the Minister have examples of what they were frustrated about and would have done with these powers that Ofcom would not do anyway in working with the companies directly? I do not see that they would have had secret information which would have meant that they had to intervene rather than trusting Ofcom and the companies to do it.

Perhaps there has been an interdepartmental workshop between DHSC, DCMS and others to cook up this provision. I assume that Clause 158 did not come from nowhere. Someone must have thought, “We need these powers in Clause 158 because we were missing them previously”. Are there specific examples of media literacy campaigns that could not be run, where people in government were frustrated and therefore wanted a power to offer it in future? It would be really helpful to hear about them so that we can understand exactly how the Clause 158 powers will be used before we allow this additional power on to the statute book.

In the view of most people in this Chamber, the Bill as a whole quite rightly grants the Government and Ofcom, the independent regulator, a wide range of powers. Here we are looking specifically at where the Government will, in a sense, overrule the independent regulator by giving it orders to do something it had not thought of doing itself. It is incumbent on the Government to flesh that out with some concrete examples so that we can understand why they need this power. At the moment, as noble Lords may be able to tell, these Benches are not convinced that they do.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, I will be very brief. The danger with Clause 158 is that it discredits media literacy as something benign or anodyne; it will become a political plaything. I am already sceptical, but if ever there was anything to add to this debate then it is that.

19:45
I am very anxious about the notion that media literacy would be used in this way for public health or safety, as in the examples, because all my examples of where it has all gone horribly wrong—through government politicisation or politicised interventions in social media companies—have been in the recent lockdowns and over Covid. I am very worried about that and will talk about it later. We have had “nudge units”, about which there have been all sorts of scandals, but I will not go on about them. There will be a real problem if this is offloaded on to Ofcom: if Ofcom is instructed to do something, the Government will effectively be interfering in what social media is allowed to say or do and in what people are to understand to be the truth. It will discredit that.
The noble Lord, Lord Moylan, made a very good point in our last session. When I try to assess this, I understand that the Secretary of State is elected and that Ofcom is an unelected regulator, so in many ways it is more democratic that the Secretary of State should be openly politicised, but I am concerned that in this instance the Secretary of State will force the unelected Ofcom to do something that the Government will not do directly but will do behind the scenes. That is the danger. We will not even be able to see it correctly and it will emerge to the public as “media literacy” or something of that nature. That will obfuscate accountability even further. I have a lot of sympathy for the amendment to leave out this clause.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful for the opportunity to set out the need for Clauses 158 and 159. The amendments in this group consider the role of government in two specific areas: the power for the Secretary of State to direct Ofcom about its media literacy functions in special circumstances and the power for the Secretary of State to issue non-binding guidance to Ofcom. I will take each in turn.

Amendment 219 relates to Clause 158, on the Secretary of State’s power to direct Ofcom in special circumstances. These include where there is a significant threat to public safety, public health or national security. This is a limited power to enable the Secretary of State to set specific objectives for Ofcom’s media literacy activity in such circumstances. It allows the Secretary of State to direct Ofcom to issue public statement notices to regulated service providers, requiring providers to set out the steps they are taking to address the threat. The regulator and online platforms are thereby compelled to take essential and transparent actions to keep the public sufficiently informed during crises. The powers ensure that the regulatory framework is future-proofed and well equipped to respond in such circumstances.

As the noble Lord, Lord Clement-Jones, outlined, I corresponded with him very shortly before today’s debate and am happy to set out a bit more detail for the benefit of the rest of the House. As I said to him by email, we expect the media literacy powers to be used only in exceptional circumstances, where it is right that the Secretary of State should have the power to direct Ofcom. The Government see the need for an agile response to risk in times of acute crisis, such as we saw during the Covid-19 pandemic or in relation to the war in Ukraine. There may be a situation in which the Government have access to information, through the work of the security services or otherwise, which Ofcom does not. This power enables the Secretary of State to make quick decisions when the public are at risk.

Our expectation is that, in exceptional circumstances, Ofcom would already be taking steps to address harm arising from the provision of regulated services through its existing media literacy functions. However, these powers will allow the Secretary of State to step in if necessary to ensure that the regulator is responding effectively to these sudden threats. It is important to note that, for transparency, the Secretary of State will be required to publish the reasons for issuing a direction to Ofcom in these circumstances. This requirement does not apply should the circumstances relate to national security, to protect sensitive information.

The noble Lord asked why we have the powers under Clause 158 when they do not exist in relation to broadcast media. We believe that these powers are needed with respect to social media because, as we have seen during international crises such as the Covid-19 pandemic, social media platforms can sadly serve as hubs for low-quality, user-generated information that is not required to meet journalistic standards, and that can pose a direct threat to public health. By contrast, Ofcom’s Broadcasting Code ensures that broadcast news, in whatever form, is reported with due accuracy and presented with due impartiality. Ofcom can impose fines or, in the most extreme cases, ultimately revoke a licence to broadcast if that code is breached. This means that regulated broadcasters can be trusted to strive to communicate credible, authoritative information to their audiences in a way that social media cannot.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

We established in our last debate that the notion of a recognised news publisher will go much broader than a broadcaster. I put it to the Minister that we could end up in an interesting situation where one bit of the Bill says, “You have to protect content from these people because they are recognised news publishers”. Another bit, however, will be a direction from the Secretary of State saying that, to deal with this crisis, we are going to give a media literacy direction that says, “Please get rid of all the content from this same news publisher”. That is an anomaly that we risk setting up with these different provisions.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

On the previous group, I raised the issue of legal speech that was labelled as misinformation and removed in the extreme situation of a public health panic. This was seemingly because the Government were keen that particular public health information was made available. Subsequently, we discovered that those things were not necessarily untrue and should not have been removed. Is the Minister arguing that this power is necessary for the Government to direct that certain things are removed on the basis that they are misinformation—in which case, that is a direct attempt at censorship? After we have had a public health emergency in which “facts” have been contested and shown to not be as black and white or true as the Government claimed, saying that the power will be used only in extreme circumstances does not fill me with great confidence.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I am happy to make it clear, as I did on the last group, that the power does not allow Ofcom to require platforms to remove content; it allows Ofcom only to require platforms to set out what they are doing in response to misinformation and disinformation—to make a public statement about what they are doing to tackle it. In relation to regulating news providers, we have brought the further amendments forward to ensure that those subject to sanctions cannot avail themselves of the special provisions in the Bill. Of course, the Secretary of State will be mindful of the law when issuing directions in the exceptional circumstances that these clauses set out.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

While the Minister is describing that, can he explain exactly which media literacy power would be invoked by the kind of example I gave when I was introducing the amendment and in the circumstances he has talked about? Would he like to refer to the Communications Act?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

It depends on the circumstances. I do not want to give one example for fear of being unnecessarily restrictive. In relation to the health misinformation and disinformation we saw during the pandemic, an example would be the suggestions of injecting oneself with bleach; that sort of unregulated and unhelpful advice is what we have in mind. I will write to the noble Lord, if he wants, to see what provisions of the Communications Act we would want invoked in those circumstances.

In relation to Clause 159, which is dealt with by Amendment 222, it is worth setting out that the Secretary of State guidance and the statement of strategic priorities have distinct purposes and associated requirements. The purpose of the statement of strategic priorities is to enable the Secretary of State to specifically set out priorities in relation to online safety. For example, in the future, it may be that changes in the online experience mean that the Government of the day wish to set out their high-level overarching priorities. In comparison, the guidance allows for clarification of what Parliament and Government intended in passing this legislation—as I hope we will—by providing guidance on specific elements of the Bill in relation to Ofcom’s functions. There are no plans to issue guidance under this power but, for example, we are required to issue guidance to Ofcom in relation to the fee regime.

On the respective requirements, the statement of strategic priorities requires Ofcom to explain in writing what it proposes to do in consequence of the statement and publish an annual review of what it has done. Whereas Ofcom must “have regard” to the guidance, the guidance itself does not create any statutory requirements.

This is a new regime and is different in its nature from other established areas of regulations, such as broadcasting. The power in Clause 159 provides a mechanism to provide more certainty, if that is considered necessary, about how the Secretary of State expects Ofcom to carry out its statutory functions. Ofcom will be consulted before guidance is issued, and there are checks on how often it can be issued and revised. The guidance document itself, as I said, does not create any statutory requirements, so Ofcom is required only to “have regard” to it.

This will be an open and transparent way to put forward guidance appropriately with safeguards in place. The independence of the regulator is not at stake here. The clause includes significant limitations on the power, and the guidance cannot fetter Ofcom’s operational independence. We feel that both clauses are appropriate for inclusion in the Bill, so I hope that the noble Lord will withdraw his amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I thank the Minister for that more extended reply. It is a more reassuring response on Clause 159 than we have had before. On Clause 158, the impression I get is that the media literacy power is being used as a smokescreen for the Government telling social media what it should do, indirectly via Ofcom. That seems extraordinary. If the Government were telling the mainstream media what to do in circumstances like this, we would all be up in arms. However, it seems to be accepted as a part of the Bill and that we should trust the Government. The Minister used the phrase “special circumstances”. That is not the phraseology in the clause; it is that “circumstances exist”, and then it goes on to talk about national security and public health. The bar is very low.

I am sure everyone is getting hungry at this time of day, so I will not continue. However, we still have grave doubts about this clause. It seems an extraordinary indirect form of censorship which I hope is never invoked. In the meantime, I beg leave to withdraw my amendment.

Amendment 219 withdrawn.
Clause 159: Secretary of State’s guidance
Amendments 220 to 222 not moved.
Clause 161: Review
Amendment 224
Moved by
224: Clause 161, page 140, line 27, leave out “or 3” and insert “, 3 or 3A”
Member’s explanatory statement
Clause 161 is about a review by the Secretary of State of the regulatory framework established by this Bill. This amendment inserts a reference to Chapter 3A, which is the new Chapter containing the new duties imposed by the Clause proposed after Clause 67 in my name.
Amendment 224 agreed.
20:00
Consideration on Report adjourned until 8.40 pm.
20:40
Amendment 225
Moved by
225: After Clause 161, insert the following new Clause—
“Transparency of government representations to regulated service providers
(1) The Secretary of State must produce a report setting out any relevant representations His Majesty’s Government have made to providers of Part 3 services to tackle the presence of misinformation and disinformation on Part 3 services.
(2) In this section “relevant representations” are representations that could reasonably be considered to be intended to persuade or encourage a provider of a Part 3 service to—
(a) modify the terms of service of a regulated service in an effort to address misinformation or disinformation,
(b) restrict or remove a particular user’s access to accounts used by them on a regulated service, or
(c) take down, reduce the visibility of, or restrict access to content that is present or may be encountered on a regulated service.
(3) The first report must be laid before both Houses of Parliament within six months of this Act being passed.
(4) Subsequent reports must be laid before both Houses of Parliament at intervals not exceeding six months.
(5) The Secretary of State is not required by this section to include in the report information that the Secretary of State considers would be against the interests of national security.
(6) If the Secretary of State relies upon subsection (5) they must as soon as reasonably practicable send a report containing that information to the Intelligence and Security Committee of Parliament.”
Member’s explanatory statement
This amendment addresses government influence on content moderation, for example by way of initiatives like the Government’s Counter Disinformation Unit.
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

My Lords, continuing the rather radical approach of debating an amendment that has already been debated in Committee and has not just been introduced, and picking up on the theme of our debate immediately before we adjourned, I move an amendment that seeks to address the question of the Government’s activities in interacting with providers when they seek to influence providers on what is shown on their sites.

It might be a matter of interest that according to the Daily Telegraph, which I implicitly trust, only on Tuesday of last week, a judge in Louisiana in the United States issued an injunction forbidding a lengthy list of White House officials from making contact with social media companies to report misinformation. I say this not because I expect the jurisprudence of the state of Louisiana to have any great influence in your Lordships’ House but simply to show how sensitive and important this issue is. The judge described what he had heard and seen as one of the greatest assaults on free speech in the history of the United States.

We are not necessarily quite in that territory, and nor does my amendment do anything so dramatic as to prevent the Government communicating with providers with a view to influencing their content, but Amendment 225 requires the Secretary of State to produce a report within six months of the passing of the Act, and every six months thereafter, in which he sets out

“any relevant representations His Majesty’s Government have made to providers”

that are

“intended to persuade or encourage a provider”

to do one of three things. One is to

“modify the terms of service of a regulated service in an effort to address misinformation or disinformation”;

one is to

“restrict or remove a particular user’s access to accounts used by them”;

and the third is to

“take down, reduce the visibility of, or restrict access to content that is present or may be encountered on a regulated service”.

None of these things would be prohibited or prevented by this amendment, but it would be required that His Majesty’s Government produce a report saying what they have done every six months.

Very importantly there is an exception, in that there would be no obligation on the Secretary of State to disclose publicly any information that affected national security, but he would be required in that case to make a report to the Intelligence and Security Committee here in Parliament. As I said, this is a very sensitive subject, and remarks made by the noble Baroness, Lady Fox of Buckley, in the previous debate referred in particular to this subject in connection with the pandemic. While that is in the memory, other topics may easily come up and need to be addressed, where the Government feel obliged to move and take action.

We know nothing about those contacts, because they are not instructions or actions taken under law. They are simply nudges, winks and phone conversations with providers that have an effect and, very often, the providers will act on them. Requiring the Government to make a report and say what they have done seems a modest, proportionate and appropriate means to bring transparency to this exercise, so that we all know what is going on.

20:45
I am happy to say that when this amendment was debated in Committee, it found widespread support from around the House. I hope to find that that support is still solid and strong, such that my noble friend, perhaps as a modest postprandial bonus, will be willing, for a change, to accept something proposed by a colleague from his own Benches, so that we can all rejoice as we go into a very long night. I beg to move.
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, I put my name to this very important amendment—all the more important because of the previous discussions we have had about the difficulties around misinformation or potential government interference in decisions about what is online and what is not online. The noble Lord, Lord Moylan, is right to indicate that this is a very modest and moderate amendment; it addresses the problems of government influence or government moderation, or at least allows those of us who are concerned about it to keep our eye on it and make sure that the country and Parliament know what is going on.

The original idea of disinformation came from an absolutely rightful concern about foreign disinformation between states. People were rightly concerned about security; we all should be and nobody wants to be taken in, in that way. But there has been a worry that agencies designed to combat those threats increasingly turn inward against the public, in a wide range of countries. Although that might not be exactly what has happened in the UK, we should note that Meta CEO Mark Zuckerberg recently admitted that the US Government asked Facebook to suppress true information. In a recent interview, he said that the scientific establishment

“asked for a bunch of things to be censored that, in retrospect, ended up being more debatable or true”.

We should all be concerned about this. It is not just a matter for those of us who are worried about free speech or raise the issue. If we are genuinely worried about misinformation or fake news, we have to make sure that we are equally concerned if it comes from other sources, not just from malign players.

The noble Lord, Lord Moylan, mentioned the American court case Missouri v Biden. In his 155-page ruling, Judge Doughty depicted quite a dystopian scene when he said that, during the pandemic, the US Government seem

“to have assumed a role similar to an Orwellian ‘Ministry of Truth’”.

I do not think we want to emulate the worst of what is happening in the US here.

The judge there outlined a huge complex of government agencies and officials connected with big tech and an army of bureaucrats hired to monitor websites and flag and remove problematic posts. It is not like that in the UK, but some of us were quite taken aback to discover that the Government ran a counter-disinformation policy forum during the lockdown, which brought tech giants together to discuss how to deal with Covid misinformation, as it was said. There was a worry about political interference then.

I do not think that this is just paranoia. Since then, Big Brother Watch and its investigative work have shown that the UK Government had a secret unit that worked with social media companies to monitor and prevent speech critical of Covid lockdown policies, in the shape of the Counter Disinformation Unit, which was set up by Ministers to deal with groups and individuals who criticised policies such as lockdowns, school closures, vaccine mandates or what have you.

Like the noble Lord, Lord Moylan, I do not want to get stuck on what happened during lockdown. That was an exceptional, extreme situation. None the less, the Counter Disinformation Unit—which works out of the Minister’s own department, the DCMS—is still operating. It seems to be able to get content fast-tracked for possible moderation by social media firms such as Facebook and Twitter. It used an AI firm to search social media posts—we need to know the details of that.

I think, therefore, that to have the transparency which the Government and the Minister have constantly stressed is hugely important for the credibility of the Bill, it is important that there is transparency about the likes of the Counter Disinformation Unit and any government attempts at interfering in what we are allowed to see, read or have access to online.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, the noble Lord, Lord Moylan, and the noble Baroness, Lady Fox, have a very strong point to make with this amendment. I have tried in our discussions to bring some colour to the debate from my own experience so I will tell your Lordships that in my former professional life I received representations from many Ministers in many countries about the content we should allow or disallow on the Facebook platform that I worked for.

That was a frequent occurrence in the United Kingdom and extended to Governments of all parties. Almost as soon as I moved into the job, we had a Labour Home Secretary come in and suggest that we should deal with particular forms of content. It happened through the coalition years. Indeed, I remember meeting the Minister’s former boss at No. 10 in Davos, of all places, to receive some lobbying about what the UK Government thought should be on or off the platform at that time. In that case it was to do with terrorist content; there was nothing between us in terms of wanting to see that content gone. I recognise that this amendment is about misinformation and disinformation, which is perhaps a more contentious area.

As we have discussed throughout the debate, transparency is good. It keeps everybody on the straight and narrow. I do not see any reason why the Government should not be forthcoming. My experience was that the Government would often want to go to the Daily Telegraph, the Daily Mail or some other upright publication and tell it how they had been leaning on the internet companies—it was part of their communications strategy and they were extremely proud of it—but there will be other circumstances where they are doing it more behind the scenes. Those are the ones we should be worried about.

If those in government have good reason to lean on an internet company, fine—but knowing that they have to be transparent about it, as in this amendment, will instil a certain level of discipline that would be quite healthy.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, clearly, there is a limited number of speakers in this debate. We should thank the noble Lord, Lord Moylan, for tabling this amendment because it raises a very interesting point about the transparency—or not—of the Counter Disinformation Unit. Of course, it is subject to an Oral Question tomorrow as well, which I am sure the noble Viscount will be answering.

There is some concern about the transparency of the activities of the Counter Disinformation Unit. In its report, Ministry of Truth, which deals at some length with the activities of the Counter Disinformation Unit, Big Brother Watch says:

“Giving officials an unaccountable hotline to flag lawful speech for removal from the digital public square is a worrying threat to free speech”.


Its complaint is not only about oversight; it is about the activities. Others such as Full Fact have stressed the fact that there is little or no parliamentary scrutiny. For instance, freedom of information requests have been turned down and Written Questions which try to probe what the activities of the Counter Disinformation Unit are have had very little response. As it says, when the Government

“lobby internet companies about content on their platforms … this is a threat to freedom of expression”.

We need proper oversight, so I am interested to hear the Minister’s response.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, the Government share the view of my noble friend Lord Moylan about the importance of transparency in protecting freedom of expression. I reassure him and other noble Lords that these principles are central to the Government’s operational response to addressing harmful disinformation and attempts artificially to manipulate our information environment.

My noble friend and others made reference to the operational work of the Counter Disinformation Unit, which is not, as the noble Baroness, Lady Fox, said, the responsibility of my department but of the Department for Science, Innovation and Technology. The Government have always been transparent about the work of the unit; for example, recently publishing a factsheet on GOV.UK which sets out, among other things, how the unit works with social media companies.

I reassure my noble friend that there are existing processes governing government engagements with external parties and emphasise to him that the regulatory framework that will be introduced by the Bill serves to increase transparency and accountability in a way that I hope reassures him. Many teams across government regularly meet industry representatives on a variety of issues from farming and food to telecoms and digital infrastructure. These meetings are conducted within well-established transparency processes and frameworks, which apply in exactly the same way to government meetings with social media companies. The Government have been open about the fact that the Counter Disinformation Unit meets social media companies. Indeed, it would be surprising if it did not. For example, at the beginning of the Russian invasion of Ukraine, the Government worked with social media companies in relation to narratives which were being circulated attempting to deny incidents leading to mass casualties, and to encourage the promotion of authoritative sources of information. That work constituted routine meetings and was necessary in confirming the Government’s confidence in the preparedness and ability of platforms to respond to new misinformation and disinformation threats.

To require additional reporting on a sector-by-sector or department-by-department basis beyond the standardised transparency processes, as proposed in my noble friend’s amendment, would be a disproportionate and unnecessary response to what is routine engagement in an area where the Government have no greater powers or influence than in others. They cannot compel companies to alter their terms of service; nor can or do they seek to mandate any action on specific pieces of content.

I reassure the noble Baroness, Lady Fox, that the Counter Disinformation Unit does not monitor individual people, nor has it ever done so; rather, it tracks narratives and trends using publicly available information online to protect public health, public safety and national security. It has never tracked the activity of individuals, and there is a blanket ban on referring any content from journalists or parliamentarians to social media platforms. The Government have always been clear that the Counter Disinformation Unit refers content for consideration only where an assessment has been made that it is likely to breach the platform’s own terms of service. It has no role in deciding what action, if any, to take in response, which is entirely a matter for the platform concerned.

As I said, the Bill will introduce new transparency, accountability and freedom of expression duties for category 1 services which will make the process for any removal or restriction of user-generated content more transparent by requiring category 1 services to set terms of service which are clear, easy for users to understand and consistently enforced. Category 1 services will be prohibited from removing or restricting user-generated content or suspending or banning users where this does not align with those terms of service. Any referrals from government will not, and indeed cannot, supersede these duties in the Bill.

Although I know it will disappoint my noble friend that another of his amendments has not been accepted, I hope I have been able to reassure him about the Government’s role in these processes. As the noble Lord, Lord Clement-Jones, noted, my noble friend Lord Camrose is answering a Question on this in your Lordships’ House tomorrow, further underlining the openness and parliamentary accountability with which we go about this work. I hope my noble friend will, in a similarly post-prandial mood of generosity, suppress his disappointment and feel able to withdraw his amendment.

21:00
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

Before the Minister sits down, I think that it is entirely appropriate for him to say—I have heard it before—“Oh no, nothing was taken down. None of this is believable. No individuals were targeted”. However, that is not the evidence I have seen, and it might well be that I have been shown misinformation. But that is why the Minister has to acknowledge that one of the problems here is that indicated by Full Fact—which, as we know, is often endorsed by government Ministers as fact-checkers. It says that because the Government are avoiding any scrutiny for this unit, it cannot know. It becomes a “he said, she said” situation. I am afraid that, because of the broader context, it would make the Minister’s life easier, and be clearer to the public—who are, after all, worried about this—if he accepted the ideas in the amendment of the noble Lord, Lord Moylan. We would then be clear and it would be out in the open. If the FOIs and so on that have been constantly put forward were answered, would that not clear it up?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I have addressed the points made by the noble Baroness and my noble friend already. She asks the same question again and I can give her the same answer. We are operating openly and transparently here, and the Bill sets out further provisions for transparency and accountability.

Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I see what my noble friend did there, and it was very cunning. He gave us a very worthwhile account of the activities of the Counter Disinformation Unit, a body I had not mentioned at all, as if the Counter Disinformation Unit was the sole locus of this sort of activity. I had not restricted it to that. We know, in fact, that other bodies within government have been involved in undertaking this sort of activity, and on those he has given us no answer at all, because he preferred to answer about one particular unit. He referred also to its standardised transparency processes. I can hardly believe that I am reading out words such as those. The standardised transparency process allows us all to know that encounters take place but still refuses to let us know what actually happens in any particular encounter, even though there is a great public interest in doing so. However, I will not press it any further.

My noble friend, who is genuinely a friend, is in danger of putting himself, at the behest of civil servants and his ministerial colleagues, in some danger. We know what happens in these cases. The Minister stands at the Dispatch Box and says “This has never happened; it never normally happens; it will not happen. Individuals are never spoken of, and actions of this character are never taken”. Then of course, a few weeks or months later, out pour the leaked emails showing that all these things have been happening all the time. The Minister then has to resign in disgrace and it is all very sad. His friends, like myself, rally round and buy him a drink, before we never see him again.

Anyway, I think my noble friend must be very careful that he does not put himself in that position. I think he has come close to doing so this evening, through the assurances he has given your Lordships’ House. Although I do not accept those assurances, I will none the less withdraw the amendment, with the leave of the House.

Amendment 225 withdrawn.
Clause 173: Providers’ judgements about the status of content
Amendment 226 not moved.
Amendment 227
Moved by
227: Clause 173, page 150, line 23, at end insert “or
(c) an assessment required to be carried out by section (Assessment duties: user empowerment),”
Member’s explanatory statement
This amendment ensures that Clause 173, which is about the approach to be taken by providers to judgements about the status of content, applies to assessments under the new Clause proposed after Clause 11 in my name.
Amendment 227 agreed.
Amendment 228
Moved by
228: Clause 173, page 151, leave out lines 1 and 2
Member’s explanatory statement
This amendment removes a requirement on providers which could encourage excessive content removal in borderline cases of illegality.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

My Lords, we are coming to some critical amendments on a very important issue relatively late in the Bill, having had relatively little discussion on it. It is not often that committees of this House sit around and say, “We need more lawyers”, but this is one of those areas where that was true.

Notwithstanding the blushes of my noble friend on the Front Bench here, interestingly we have not had in our debate significant input from people who understand the law of freedom of expression and wish to contribute to our discussions on how online platforms should deal with questions of the legality of content. These questions are crucial to the Bill, which, if it does nothing else, tells online platforms that they have to be really robust in taking action against content that is deemed to be illegal under a broad swathe of law in the United Kingdom that criminalises certain forms of speech.

We are heavy with providers, and we are saying to them, “If you fail at this, you’re in big trouble”. The pressure to deal with illegal content will be huge, yet illegality itself covers a broad spectrum, from child sexual exploitation and abuse material, where in many cases it is obvious from the material that it is illegal and there is strict liability—there is never any excuse for distributing that material—and pretty much everyone everywhere in the world would agree that it should be criminalised and removed from the internet, through to things that we discussed in Committee, such as public order offences, where, under some interpretations of Section 5 of the Public Order Act, swearing at somebody or looking at them in a funny way in the street could be deemed alarming and harassing. There are people who interpret public order offences in this very broad sense, where there would be a lot less agreement about whether a specific action is or is not illegal and whether the law is correctly calibrated or being used oppressively. So we have this broad spectrum of illegality.

The question we need to consider is where we want providers to draw the line. They will be making judgments on a daily basis. I said previously that I had to make those judgments in my job. I would write to lawyers and they would send back an expensive piece of paper that said, “This is likely to be illegal”, or, “This is likely not to be illegal”. It never said that it was definitely illegal or definitely not illegal, apart from the content I have described, such as child sexual abuse. You would not need to send that, but you would send the bulk of the issues that we are dealing with to a lawyer. If you sent it to a second lawyer, you would get another “likely” or “not likely”, and you would have to come to some kind of consensus view as to the level of risk you wished to take on that particular form of speech or piece of content.

This is really challenging in areas such as hate speech, where exactly the same language has a completely different meaning in different contexts, and may or may not be illegal. Again, to give a concrete example, we would often deal with anti-Semitic content being shared by anti-anti-Semitic groups—people trying to raise awareness of anti-Semitic speech. Our reviewers would quite commonly remove the speech: they would see it and it would look like grossly violating anti-Semitic speech. Only later would they realise that the person was sharing it for awareness. The N-word is a gross term of racial abuse, but if you are an online platform you permit it a lot of the time, because if people use it self-referentially they expect to be able to use it. If you start removing it they would naturally get very upset. People expect to use it if it is in song lyrics and they are sharing music. I could give thousands of examples of speech that may or may not be illegal depending entirely on the context in which it is being used.

We will be asking platforms to make those judgments on our behalf. They will have to take it seriously, because if they let something through that is illegal they will be in serious trouble. If they misjudged it and thought the anti-Semitic hate speech was being circulated by Jewish groups to promote awareness but it turned out it was being circulated by a Nazi group to attack people and that fell foul of UK law, they would be in trouble. These judgments are critical.

We have the test in Clause 173, which says that platforms should decide whether they have “reasonable grounds to infer” that something is illegal. In Committee, we debated changing that to a higher bar, and said that we wanted a stronger evidential basis. That did not find favour with the Government. We hoped they might raise the bar themselves unilaterally, but they have not. However, we come back again in a different way to try to be helpful, because I do not think that the Government want excessive censorship. They have said throughout the Bill’s passage that they are not looking for platforms to be overly censorious. We looked at the wording again and thought about how we could ensure that the bar is not operated in a way that I do not think that the Government intend. We certainly would not want that to happen.

We look at the current wording in Clause 173 and see that the test there has two elements. One is: “Do you have reasonable grounds to infer?” and then a clause in brackets after that says, “If you do have reasonable grounds to infer, you must treat the content as illegal”. In this amendment we seek to remove the second part of that phrasing because it seems problematic. If we say to the platform, “Reasonable grounds to infer, not certainty”—and it is weird to put “inference”, which is by definition mushy, with “must”, which is very certain, into the same clause—we are saying, “If you have this mushy inference, you must treat it as illegal”, which seems quite problematic. Certainly, if I were working at a platform, the way I would interpret that is: “If in doubt, take it out”. That is the only way you can interpret that “must”, and that is really problematic. Again, I know that that is not the Government’s intention, and if it were child sexual exploitation material, of course you “must”. However, if it is the kind of abusive content that you have reasonable grounds to infer may be an offence under the Public Order Act, “must” you always treat that as illegal? As I read the rest of the Bill, if you are treating it as illegal, the sense is that you should remove it.
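To make that distinction concrete, here is a minimal sketch of the two behaviours being contrasted. It is purely illustrative: the function names, labels and the shape of the decision are invented for the example and are not the Bill’s drafting, Ofcom guidance or any platform’s real moderation system.

```python
# A minimal sketch, assuming invented labels ("REMOVE", "REVIEW", "ALLOW");
# not the Bill's drafting or any platform's real system.

def decide_as_drafted(reasonable_grounds_to_infer: bool) -> str:
    # Clause 173 as described above: reasonable grounds to infer means the
    # content "must" be treated as illegal, which in practice reads as
    # "if in doubt, take it out".
    return "REMOVE" if reasonable_grounds_to_infer else "ALLOW"

def decide_with_discretion(clearly_illegal: bool,
                           reasonable_grounds_to_infer: bool) -> str:
    # The behaviour the amendment argues for: the hard "must" bites only on
    # clear-cut illegality (for example, CSEA material); marginal cases keep
    # the benefit of the doubt and get a contextual review instead.
    if clearly_illegal:
        return "REMOVE"
    if reasonable_grounds_to_infer:
        return "REVIEW"  # weigh context and freedom of expression before acting
    return "ALLOW"

# The same marginal public-order case under each rule:
print(decide_as_drafted(True))              # REMOVE
print(decide_with_discretion(False, True))  # REVIEW
```

The point of the contrast is that the same borderline judgment produces removal under the clause as drafted, but a contextual review where discretion is allowed.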

That is what we are trying to get at. There is a clear understanding from the Government that their intention is “must” when it comes to that hard end of very bad, very clearly bad content. However, we need something else—a different kind of behaviour when we are dealing with content that is much more marginal. Otherwise, the price we will pay will be in freedom of expression.

People in the United Kingdom post quite robust, sweary language. I sometimes think that some of the rules we apply penalise the vernacular. People who use sweary, robust language may be doing so entirely legally—the United Kingdom does not generally restrict people from using that kind of language. However, we risk heading towards a scenario in which people who post such content in future will find that the platform takes it down. They will complain to the platform, saying, “Why the hell did you take my content down?”—in fact, they will probably use stronger words than that to register their complaint. When they do, the platform will say, “We had reasonable grounds to infer that that was in breach of the Public Order Act, for example, because somebody might feel alarmed, harassed or distressed by it. Oh, and look—in this clause, it says we ‘must’ treat it as illegal. Sorry—there is nothing else we can do. We would have loved to be able to give you the benefit of the doubt and to allow you to carry on using that kind of language, because we think there is some margin where you have not behaved in an illegal way. But unfortunately, because of the way that Clause 173 has been drafted, our lawyers tell us we cannot afford to take the risk”.

In the amendment we are trying to—I think—help the Government to get out of a situation which, as I say, I do not think they want. However, I fear that the totality of the wording of Clause 173, this low bar for the test and the “must treat as” language, will lead to that outcome where platforms will take the attitude: “Safety first; if in doubt, take it out”, and I do not think that that is the regime we want. I beg to move.

Viscount Colville of Culross (CB)

My Lords, I regret I was unable to be present in Committee to deliver my speech about the chilling effect that the present definition of illegality in the Bill will have on free speech on the internet.

I am still concerned about Clause 173, which directs platforms how to come to the judgment on what is illegal. My concern is that the criterion for illegality, “reasonable grounds to infer” that elements of the content are illegal, will encourage the tech companies to take down content which is not necessarily illegal but which they infer could be. Indeed, the noble Lord, Lord Allan, gave us a whole list of examples of where that might happen. Unfortunately, in Committee there was little support for a higher bar when asking the platforms to judge what illegal content is. However, I have added my name to Amendment 228, put forward by the noble Lord, Lord Allan, because, as he has just said, it is a much less radical way of enhancing free speech when platforms are not certain whether to take down content which they infer is illegal.

The deletion of part of Clause 173(5) is a moderate proposal. It still leaves intact the definition for the platforms of how they are to make the judgment on the illegality of content, but it takes out the compulsory element in that judgment. I believe that it will have the biggest impact on the moderation systems. Some of those systems are run by machines, but many moderation processes, such as those at Meta’s Facebook, involve thousands of human beings. The deletion of the second part of Clause 173(5), which demands that they take down content that they infer is illegal, will give them more leeway to err on the side of freedom of speech. I hope that this extra leeway to encourage free speech will also be built into the way that algorithms moderate our content.

21:15
Earlier in the Bill, Clause 18 lays out, for all services, the importance of protecting users’ rights to freedom of expression, and there are various duties of assessment for large companies. However, there is not enough in the Bill that builds freedom of expression into the moderation capacity of the platforms. Alan Rusbridger, a member of the Facebook Oversight Board, gave evidence to the Communications and Digital Committee’s inquiry into freedom of expression online. He said:
“I believe that freedom of speech is a hugely important right … In most judgments, I begin by thinking, ‘Why would we restrict freedom of speech in this particular case?’”.
Evidence was also given that many moderators do not have a background in freedom of expression and are not completely conversant with the Article 10 rights. The amendment will allow moderators to think more about their role in erring on the side of freedom of expression when deciding on the illegality of content.
There has been much discussion, both in Committee and on Report, on protecting freedom of expression, but not much movement by the Government. I hope that the Minister will use this small amendment to push for draft codes of practice which allow the platforms, when they are not sure of the illegality of content, to use their discretion and consider freedom of expression.
Baroness Fox of Buckley (Non-Afl)

My Lords, it is all quite exciting now, is it not? I can say “hear, hear!” a lot; everyone is talking about freedom of expression. I cannot tell noble Lords how relieved and pleased I was both to hear the speeches and to see Amendment 228 from the noble Lord, Lord Allan of Hallam, and the noble Viscount, Lord Colville of Culross, who both explained well why this is so important. I am so glad that, even late in our discussions on Report, it has returned as an important issue.

We have already discussed how in many cases, especially when it comes to what is seen as illegal speech, decisions about illegality are very complicated. They are complicated in the law courts and offline, even with the full power of lawyers, the criminal justice system and so on brought to bear on them. Leaving that decision to people who, through no fault of their own, are not qualified but who work in a social media company—in a climate of quite onerous obligations and with phrases such as “reasonable grounds to infer”—will lead to lawful expression being overmoderated. Ultimately, online platforms will act with an abundance of caution, which will lead to a lot of important speech—perfectly lawful, if not always worthy, speech; the public’s speech and the ability to speak freely—being removed. That is not a trivial side issue; it will discredit the Bill, if it has not done so already.

Whenever noble Lords make contributions about why a wide range of amendments and changes are needed—particularly in relation to protecting children, harm and so on—they constantly tell us that the Bill should send an uncompromising message. The difficulty I have is with the danger that the Bill will send an uncompromising message that freedom of expression is not important. I urge the Minister to look carefully at the amendment, because the message should be that, while the Bill is trying to tackle online harm and to protect children in particular—which I have never argued against—huge swathes of it might inadvertently silence people and deprive them of the right to information that they should be able to have.

My Amendment 229—I am not sure why it is in this group, but that is nothing new in the way that the groupings have worked—is about lawful speech and about what content is filtered by users. I have already argued for the replacement of the old legal but harmful duty, but the new duty of user empowerment is welcome, and at face value it puts users in the driving seat and allows adults to judge for themselves what they want and do not want to see. But—and it is a large but—that will work only if users and providers agree about when content should be filtered and what content is filtered.

As with all decisions on speech, as I have just mentioned—particularly in a heightened climate of confusion and sensitivity regarding identity politics and the cancel culture issues that we are all familiar with—there are some problems with the way that things stand in the Bill. I hope I am using the term “reasonable grounds to infer” in a better way than it is used in relation to illegality. My amendment specifies that companies need reasonable grounds to infer that content is abusive or incites hatred before filtering it out through those user empowerment tools. Where a user chooses to filter out hateful content based on race, on being a woman or whatever, the filter should catch only content that genuinely falls under those headings. There is a risk that, without this amendment, technologies or individuals working for companies could operate in a heavy-handed way and filter out legitimate content.

I shall give a couple of examples. Say that someone chooses to filter out abusive content targeting the protected characteristic of race. I imagine that they would have a reasonable expectation that the filter would target aggressive, unpleasant content that demeans a person because of their race, but does the provider agree with that? Will it interpret my filtering choice as a user in the most restrictive way possible, in a bid to protect my safety, or assume that my sensibilities set a low threshold for what it might consider to be abuse?
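By way of illustration only, the sketch below models the choice just described. The category, the classifier “confidence” score and the thresholds are invented for this example; they are not the Bill’s wording or any provider’s real tooling. A low threshold corresponds to the over-cautious filtering being warned against; a higher one to a standard of objective reasonableness.

```python
# A minimal sketch, assuming an invented 0-1 "abuse confidence" score and
# invented thresholds; not the Bill's wording or any provider's real tooling.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    abuse_confidence: float  # classifier's confidence that the post is abusive, 0..1

def filter_feed(posts: list, user_filters_abuse: bool, threshold: float) -> list:
    """Hide a post only where there are grounds, at the given threshold, to infer abuse."""
    if not user_filters_abuse:
        return posts
    return [p for p in posts if p.abuse_confidence < threshold]

feed = [
    Post("robust discussion of immigration policy", 0.35),
    Post("racial slur aimed at another user", 0.95),
]
# Over-cautious standard: both posts disappear, including the legitimate debate.
print(len(filter_feed(feed, True, threshold=0.3)))  # 0
# Objective-reasonableness standard: only the genuinely abusive post is hidden.
print(len(filter_feed(feed, True, threshold=0.7)))  # 1
```

The design point is that the same classifier output produces very different user experiences depending on where the provider sets the bar, which is why a shared standard of reasonableness matters.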

The race issue illustrates where we get into difficulties. Will the filterers take their cue from the document compiled by the Diocese of St Edmundsbury and Ipswich, recently made public by the anti-racist campaigning group Don’t Divide Us, and being used in 87 schools? Under the heading of racism, it says that “passive racism” includes agreeing that “There are two sides to every story”, denying white privilege or starting a sentence with “Not all white people”. “Veiled racism” in this document—which, as I say, is being used in schools by the Church of England—includes a “Euro-centric curriculum” or “cultural appropriation”. “Racist discrimination” includes “anti-immigration policies”, which, as I pointed out before, would indicate that some people would call the Government’s own Bill tonight racist.

The reason why I mention that is that you might think, “I am going to have racism filtered out”, but if there is too much caution then you will have filtered out very legitimate discussions of immigration and cultural appropriation. You will be protected, but if, for example, the filterer follows certain universities that have deemed the novels of Walter Scott, the plays of William Shakespeare or Enid Blyton’s writing racist, then you can see that we have some real problems. When universities have said that there is misogynistic bullying and sexual abuse in “The Great Gatsby” and Ovid’s “Metamorphoses”, I just want to make sure that we do not end up in a situation where the filterers are oversensitive. Perhaps the filtering will be done by algorithm, machine learning and artificial intelligence, but the EHRC has noted that algorithms simply cannot cope with the context, cultural difference and complexity of language within the billions of items of content produced every day.

Amendment 229 ensures that there is a common standard—a standard of objective reasonableness. It is not perfect at all; I understand that reasonableness itself is open to interpretation. However, it is an attempt to make the Government’s concept of user empowerment feasible by at least aspiring to a basic shared understanding between users and providers as to what will be filtered and what will not, and it is a check against providers’ filter mechanisms removing controversial or unpopular content in the name of protecting users. As I indicated earlier about sending a message, if the Government could indicate to the companies that, rather than taking a risk-averse attitude, they had to bear freedom of expression in mind and not be oversensitive or overcautious, we might begin to get some balance. Otherwise, an awful lot of lawful material that is not even harmful will be removed.

Baroness Kidron (CB)

My Lords, I support Amendment 228. I spoke on this issue on the longer amendment in Committee. To decide whether something is illegal, at high volume and high speed, without the entire apparatus of the justice system—in which a great deal of care is taken to decide whether something is illegal—is very worrying. It strikes me as amusing, because someone commented earlier that they like a “must” instead of a “maybe”. In this case, where the clause says that a provider must treat the content as content of the kind in question accordingly, I caution that something a little softer is needed: not a cliff edge that ends up in horrors around illegality, where someone who has acted in self-defence is accused of a crime of violence, as happens to many women, and so on and so forth. I do not want to labour the point. I just urge a gentle landing rather than, as it is written, a cliff edge.

Lord Clement-Jones (LD)

My Lords, this has been a very interesting debate. Beyond peradventure, my noble friend Lord Allan, the noble Viscount, Lord Colville, and the noble Baroness, Lady Fox, have demonstrated powerfully the perils of this clause. “Lawyers’ caution” is one of my noble friend’s messages to take away, as are the complexities of making these judgments. It was interesting when he mentioned the sharing of certain forms of content for awareness’s sake, and the judgments that must be made by platforms. His phrase “If in doubt, take it out” is pretty chilling in free speech terms—I think that it will come back to haunt us. As the noble Baroness, Lady Fox, said, the wrong message is being delivered by this clause. It is important to have some element of discretion here and not, as the noble Baroness, Lady Kidron, said, a cliff edge. We need a gentler landing. I very much hope that the Minister will land more gently.

Lord Stevenson of Balmacara (Lab)

My Lords, this has been a good debate. It is very hard to see where one would want to take it. If it proves anything, it is that the decision to drop the legal but harmful provisions in the Bill was probably taken for the wrong reasons but was the right decision, since this is where we end up—in an impossible moral quandary which no amount of writing, legalistic or otherwise, will get us out of. This should be a systems Bill, not a content Bill.

Lord Parkinson of Whitley Bay (Con)

My Lords, I start by saying that accurate systems and processes for content moderation are crucial to the workability of this Bill and keeping users safe from harm. Amendment 228 from the noble Lord, Lord Allan of Hallam, seeks to remove the requirement for platforms to treat content as illegal or fraudulent content if reasonable grounds for that inference exist. The noble Lord set out his concerns about platforms over-removing content when assessing illegality.

Under Clause 173(5), platforms will need to have reasonable grounds to determine whether content is illegal or a fraudulent advertisement. Only when a provider has reasonable grounds to infer that said content is illegal or a fraudulent advertisement must it then comply with the relevant requirements set out in the Bill. This would mean removing the content or preventing people from encountering it through risk-based and proportionate systems and processes.

21:30
Clause 173(6) further clarifies what “reasonable grounds to infer” means in relation to judgments about illegal content and fraudulent adverts. It sets out the tests that a provider must apply to the assessment of whether all the elements of an offence—including the mental elements—are present, and whether a defence might be relied on.
The noble Lord’s amendment removes this standard for judging the illegality of content but does not replace it with another standard. That would mean that the Bill provided less detail about when providers are required to treat content as illegal or a fraudulent advert. The result would be that the Bill did not set out a consistent approach to identifying and removing such content, which would enable providers to interpret their duties in a broad range of ways while still complying with the framework. This could result in services both over-removing and under-removing content.
I know that the noble Lord is concerned that this provision could encourage overzealous removal of content, but the Government are clear that the approach that I have just outlined provides the necessary safeguards against platforms over-removing content when complying with their duties under the Bill. The noble Lord asked for a different standard to be associated with different types of criminal offence. That is, in effect, what we have done through the distinction that we have made between priority and non-priority offences.
To assist services further, Ofcom will be required to provide guidance on how it judges the illegality of content. In addition, the Government consider that it would not be right to weaken the test for illegal content by diluting the content moderation provisions in the way that this amendment would. Content moderation is critical to protecting users from illegal content and fraudulent advertisements.
The noble Viscount, Lord Colville, set out the importance of freedom of expression, as other noble Lords—principally the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan, but others too—have throughout our scrutiny of the Bill. Our approach regarding freedom of expression recognises that the European Convention on Human Rights imposes obligations in relation to this on states, not private entities. As we have discussed previously, private actors, including service providers in scope, have their own freedom of expression rights. This means that platforms are free to decide what content should be allowed on their sites within the bounds of the law. As such, it is more appropriate to ask them to have particular regard to these concepts rather than to be compliant or consistent with them.
In-scope companies will have to consider and implement safeguards for freedom of expression when fulfilling their duties. For example, platforms could safeguard freedom of expression by ensuring that human moderators are adequately trained to assess contextual and linguistic nuance—such as the examples that the noble Lord gave—to prevent the over-removal of content. The larger services will also have additional duties to assess their impact on freedom of expression and privacy when adopting safety policies, to keep this assessment up to date and to demonstrate that they have taken positive steps in relation to the impact assessment.
Further, platforms will not be penalised for making the wrong calls on pieces of illegal content. Ofcom will instead make its judgments on the systems and processes that platforms have in place when making these decisions. The focus on transparency through the Bill’s framework and on user reporting and redress mechanisms will enable users to appeal the removal of content more effectively than they can at present.
Amendment 229 in the name of the noble Baroness, Lady Fox, would require providers of category 1 services to apply the user empowerment features required under Clause 12 only to content that they have “reasonable grounds to infer” is user empowerment content. The Bill’s cross-cutting freedom of expression duties already prevent providers overapplying user empowerment features or adopting an inconsistent or capricious approach; Ofcom can take enforcement action if they do this. Clause 173(2) and (3) already specify how providers must make judgments about the status of content, including judgments about whether content is in scope of the user empowerment duties. That includes making this judgment based on
“all relevant information that is reasonably available to a provider”.
It is unclear whether the intention of the noble Baroness’s amendment is to go further. If so, it would be inappropriate to apply the “reasonable grounds to infer” test in Clause 173(5) and (6) to user empowerment content. This is because, as I have just outlined in relation to the amendment in the name of the noble Lord, Lord Allan, the test sets out the approach that providers must take when assessing whether content amounts to a criminal offence. The test cannot sensibly be applied to content covered by the user empowerment duties because such content is not illegal. It is not workable to suggest that providers need to apply criminal law concepts such as intent or defences to non-criminal material. Under Clause 48, Ofcom will be required to produce and publish guidance that sets out examples of the kinds of content that Ofcom considers to be relevant to the user empowerment duties. This will assist providers in determining what content is of relevance to the user empowerment duties.
I hope that this allays the concerns raised by the noble Baroness and the noble Lord, and that the noble Lord will be content to withdraw his amendment.
Lord Allan of Hallam (LD)

My Lords, I remain concerned that people who use more choice words of Anglo-Saxon origin will find their speech more restricted than those who use more Latinate words, such as “inference” and “reasonable”, but the Minister has given some important clarifications.

The first is that no single decision could result in a problem for a platform, so it will know that it is about a pattern of bad decision-making rather than a single decision; that will be helpful in taking a bit of the pressure off. The Minister also gave an important clarification around—I hate this language, but we have to say it—priority versus primary priority. If everything is a priority, nothing is a priority but, in this Bill, some things are more of a priority than others. The public order offences are priority offences; therefore, platforms have a little more leeway over those offences than they do over primary priority offences, which include the really bad stuff that we all agree we want to get rid of.

As I say, I do not think that we are going to get much further in our debates today although those were important clarifications. The Minister is trying to give us reasonable grounds to infer that the guidance from Ofcom will result in a gentle landing rather than a cliff edge, which the noble Baroness, Lady Kidron, rightly suggested is what we want. With that, I beg leave to withdraw the amendment.

Amendment 228 withdrawn.
Amendment 229 not moved.
Amendment 230
Moved by
230: After Clause 174, insert the following new Clause—
“Time for publishing first guidance under certain provisions of this Act
(1) OFCOM must publish guidance to which this section applies within the period of 18 months beginning with the day on which this Act is passed.
(2) This section applies to—
(a) the first guidance under section 47(2)(a) (record-keeping and review);
(b) the first guidance under section 47(2)(b) (children’s access assessments);
(c) the first guidance under section 48(1) (content harmful to children);
(d) the first guidance under section 73 (provider pornographic content);
(e) the first guidance under section 90(1) (illegal content risk assessments under section 8);
(f) the first guidance under section 90(2) (illegal content risk assessments under section 22);
(g) the first guidance under section 90(3) (children’s risk assessments);
(h) the first guidance under section 140 (enforcement);
(i) the first guidance under section 174 relating to illegal content judgements within the meaning of subsection (2)(a) of that section (illegal content and fraudulent advertisements).
(3) If OFCOM consider that it is necessary to extend the period mentioned in subsection (1) in relation to guidance mentioned in any of paragraphs (a) to (i) of subsection (2), OFCOM may extend the period in relation to that guidance by up to 12 months by making and publishing a statement. But this is subject to subsection (6).
(4) A statement under subsection (3) must set out—
(a) the reasons why OFCOM consider that it is necessary to extend the period mentioned in subsection (1) in relation to the guidance concerned, and
(b) the period of extension.
(5) A statement under subsection (3) may be published at the same time as (or incorporate) a statement under section 38(12) (extension of time to prepare certain codes of practice).
(6) But a statement under subsection (3) may not be made in relation to guidance mentioned in a particular paragraph of subsection (2) if—
(a) a statement has previously been made under subsection (3) (whether in relation to guidance mentioned in the same or a different paragraph of subsection (2)), or
(b) a statement has previously been made under section 38(12).”
Member’s explanatory statement
This amendment provides that OFCOM must prepare the first guidance under certain provisions of the Bill within 18 months of Royal Assent, unless they consider a longer period to be necessary in which case OFCOM may (on one occasion only) extend the period and set out why in a published statement.
Amendment 230 agreed.
Clause 176: Individuals providing regulated services: liability
Amendment 231
Moved by
231: Clause 176, page 152, line 33, at end insert—
“(ga) Chapter 3A of Part 4 (deceased child users);”
Member’s explanatory statement
Clause 176 is about liability of providers who are individuals. This amendment inserts a reference to Chapter 3A, which is the new Chapter containing the new duties imposed by the Clause proposed after Clause 67 in my name, so that individuals may be jointly and severally liable for the duties imposed by that clause.
Amendment 231 agreed.
Clause 179: Information offences: supplementary
Amendments 231A and 231B
Moved by
231A: Clause 179, page 154, line 8, leave out “is” and insert “has been”
Member’s explanatory statement
This amendment is a minor change to ensure consistency of tenses.
231B: Clause 179, page 154, line 11, leave out “is” and insert “has been”
Member’s explanatory statement
This amendment is a minor change to ensure consistency of tenses.
Amendments 231A and 231B agreed.
Schedule 17: Video-sharing platform services: transitional provision etc
Amendments 232 to 236
Moved by
Lord Parkinson of Whitley Bay

232: Schedule 17, page 247, line 35, at end insert—


“(ba) section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2)), and”

Member’s explanatory statement


This amendment ensures that, during the transitional period when video-sharing platform services continue to be regulated by Part 4B of the Communications Act 2003, providers of such services are not exempt from the new duty in the new clause proposed after Clause 11 in my name to carry out assessments for the purposes of the user empowerment duties in Clause 12(2).

233: Schedule 17, page 247, line 36, leave out “and (9) (records of risk assessments)” and insert “, (8A) and (9) (records of assessments)”
Member’s explanatory statement
This amendment ensures that, during the transitional period when video-sharing platform services continue to be regulated by Part 4B of the Communications Act 2003, providers of such services are not exempt from the new duty inserted in Clause 19 (see the amendments of that Clause proposed in my name) to keep records of the new assessments.
234: Schedule 17, page 248, line 20, at end insert—
“(ea) the duties set out in section (Disclosure of information about use of service by deceased child users) (deceased child users);”
Member’s explanatory statement
This amendment ensures that services already regulated under Part 4B of the Communications Act 2003 (video-sharing platform services) are not required to comply with the new duties imposed by the clause proposed after Clause 67 in my name during the transitional period.
235: Schedule 17, page 250, line 12, leave out “risk assessments and children’s access” and insert “certain”
Member’s explanatory statement
This amendment makes a technical drafting change related to the new Clause proposed after Clause 11 in my name.
236: Schedule 17, page 250, line 15, leave out “risk assessments and children’s access” and insert “certain”
Member’s explanatory statement
This amendment makes a technical drafting change related to the new Clause proposed after Clause 11 in my name.
Amendments 232 to 236 agreed.
Amendment 236A
Moved by
Lord Parkinson of Whitley Bay

236A: After Clause 194, insert the following new Clause—


“Power to regulate app stores


(1) Subject to the following provisions of this section and section (Power to regulate app stores: supplementary), the Secretary of State may by regulations amend any provision of this Act to make provision for or in connection with the regulation of internet services that are app stores.

(2) Regulations under this section may not be made before OFCOM have published a report under section (OFCOM’s report about use of app stores by children)(report about use of app stores by children).

(3) Regulations under this section may be made only if the Secretary of State, having considered that report, considers that there is a material risk of significant harm to an appreciable number of children presented by either of the following, or by both taken together—

(a) harmful content present on app stores, or

(b) harmful content encountered by means of regulated apps available in app stores.

(4) Before making regulations under this section the Secretary of State must consult—

(a) persons who appear to the Secretary of State to represent providers of app stores,

(b) persons who appear to the Secretary of State to represent the interests of children (generally or with particular reference to online safety matters),

(c) OFCOM,

(d) the Information Commissioner,

(e) the Children’s Commissioner, and

(f) such other persons as the Secretary of State considers appropriate.

(5) In this section and in section (Power to regulate app stores: supplementary)—

“amend” includes repeal and apply (with or without modifications);

“app” includes an app for use on any kind of device, and “app store” is to be read accordingly;

“content that is harmful to children” has the same meaning as in Part 3 (see section 54);

“harmful content” means—

(a) content that is harmful to children,

(b) search content that is harmful to children, and

(c) regulated provider pornographic content;

“regulated app” means an app for a regulated service;

“regulated provider pornographic content” has the same meaning as in Part 5 (see section 70);

“search content” has the same meaning as in Part 3 (see section 51).

(6) In this section and in section (Power to regulate app stores: supplementary) references to children are to children in the United Kingdom.”

Member’s explanatory statement


This amendment provides that the Secretary of State may make regulations amending this Bill so as to bring app stores within its scope. The regulations may not be made until OFCOM have published their report about the use of app stores by children (see the new Clause proposed to be inserted after Clause 147 in my name).

Lord Parkinson of Whitley Bay (Con)

My Lords, we have had some productive discussions on application stores, commonly known as “app stores”, and their role as a gateway for children accessing online services. I am grateful in particular to my noble friend Lady Harding of Winscombe for her detailed scrutiny of this area and the collaborative approach she has taken in relation to it and to her amendments, to which I will turn in a moment. These share the same goals as the amendments tabled in my name in seeking to add evidence-based duties on app stores to protect children.

The amendments in my name will do two things. First, they will establish an evidence base on the use of app stores by children and the role that app stores play in children encountering harmful content online. Secondly, following consideration of this evidence base, the amendments also confer a power on the Secretary of State to bring app stores into scope of the Bill should there be a material risk of significant harm to children on or through them.

On the evidence base, Amendment 272A places a duty on Ofcom to publish a report on the role of app stores in children accessing harmful content on the applications of regulated services. To help build a greater evidence base about the types of harm available on and through different kinds of app stores, the report will consider a broad range of these stores, which could include those available on various devices, such as smartphones, gaming devices and smart televisions. The report will also assess the use and effectiveness of age assurance on app stores and consider whether the greater use of age assurance or other measures could protect children further.

Publication of the report must be two to three years after the child safety duties come into force so as not to interfere with the Bill’s implementation timelines. This timing will also enable the report to take into account the impact of the regulatory framework that the Bill establishes.

Amendment 274A is a consequential amendment to include this report in the Bill’s broader confidentiality provisions, meaning that Ofcom will need to exclude confidential matters—for example, commercially sensitive information—from the report’s publication.

Government Amendments 236A, 236B and 237D provide the Secretary of State with a delegated power to bring app stores into the scope of regulation following consideration of Ofcom’s report. The power will allow the Secretary of State to make regulations putting duties on app stores to reduce the risks of harm presented to children from harmful content on or via app stores. The specific requirements in these regulations will be informed by the outcome of the Ofcom report I have mentioned.

As well as setting out the rules for app stores, the regulations may also make provisions regarding the duties and functions of Ofcom in regulating app stores. This may include information-gathering and enforcement powers, as well as any obligations to produce guidance or codes of practice for app store providers.

By making these amendments, our intention is to build a robust evidence base on the potential risks of app stores for children without affecting the Bill’s implementation more broadly. Should it be found that duties are required, the Secretary of State will have the ability to make robust and comprehensive duties, which will provide further layers of protection for children. I beg to move.

Baroness Harding of Winscombe (Con)

My Lords, before speaking to my Amendment 239A, I thank my noble friend the Minister, the Secretary of State and the teams in both the department and Ofcom for their collaborative approach in working to bring forward this group of amendments. I also thank my cosignatories. My noble friend Lady Stowell cannot be in her place tonight but she has been hugely helpful in guiding me through the procedure, as have been the noble Lords, Lord Stevenson, Lord Clement-Jones and Lord Knight, not to mention the noble Baroness, Lady Kidron. It has been a proper cross-House team effort. Even the noble Lord, Lord Allan, who started out quite sceptical, has been extremely helpful in shaping the discussion.

I also thank the NSPCC and Barnardo’s for their invaluable advice and support, as well as Snap and Match—two companies which have been willing to stick their heads above the parapet and challenge suppliers and providers on which they are completely dependent in the shape of the current app store owners, Apple and Google.

I reassure my noble friend the Minister—and everyone else—that I have no intention of dividing the House on my amendment, in case noble Lords were worried. I am simply seeking some reassurance on a number of points where my amendments differ from those tabled by the Government—but, first, I will highlight the similarities.

As my noble friend the Minister has referred to, I am delighted that we have two packages of amendments that in both cases recognise that this was a really significant gap in the Bill as drafted. Ignoring the elements of the ecosystem that sell access to regulated services, decide age guidelines and have the ability to do age assurance was a substantial gap in the framing of the Bill. But we have also recognised together that it is very important that this is an “and” not an “or”—it is not instead of regulating user-to-user services or search but in addition to. It is an additional layer that we can bring to protect children online, and it is very important that we recognise that—and both packages do.

21:45
Finally, considerable work needs to be done to do this properly—to properly research how we should regulate app stores and other app store-like things, and to do the full package, which I am pleased to say that both groups of amendments do. They instruct Ofcom to do the research but also give the Secretary of State the powers to enact any recommendations that come from that research.
However, there are four differences that I will briefly tease out. I live in hope, but I suspect that two we will continue to disagree on—but on two I hope my noble friend can give us some reassurance that we are in fact aligned. I shall take the two that I fear we will disagree on first. First, on timing, the Government’s amendments require Ofcom to conduct the work two to three years after Sections 11 and 25 come into force. I suspect that that means that we are talking about four to five years away, which is a very long time in the history of the digital world; whereas my amendments require 12 months after the first element of Sections 11 and 25 coming into being, so probably about two years. We are looking for an air gap between the implementation of user-to-user regulation, search regulation and app store regulation—but I would argue that the government amendment is too big an air gap. I ask my noble friend the Minister to consider whether it is possible to reduce the length of time to a digital speed rather than an analogue one.
On my second point, on which I am not so hopeful—but I am certain that we will come back to it again on Wednesday—the Government’s amendments require Ofcom to consider whether app stores cover harmful content. Once again, we have an amendment that focuses on content rather than functionality, systems and processes, and the non-content harms that a number of us are most worried about. It is a real shame that, in this new amendment, the Government have chosen to use the very language that is causing us so much concern in other parts of the Bill; whereas my amendments ask Ofcom to look at the objectives of Part 3 of the Bill, which I think is a much neater way. I genuinely believe that the Government want to capture the non-content harms, and I ask the Minister to consider whether it is possible to tidy up their amendment at Third Reading.
In two areas, I hope that the Minister can give me reassurance. First, on transparency, the government amendments are clear that Ofcom needs to publish the output of its research—its report to the Secretary of State—but they are not clear that the Secretary of State needs to publish if they choose not to implement Ofcom’s recommendations. Possibly I am just a novice in parliamentary procedure and they just would not be able to get away with that, and one of us will remember the point in time to ask the right question and it will come up in a ballot at the right time, but can my noble friend the Minister assure me that, in fact, the Secretary of State will publish, even if they choose not to implement any further regulation of app stores?
Finally, on scope, my amendments talk about app stores and “other access means”, to make sure that this is future-proofed. Today 99% of all user-to-user services are reached through the Apple App Store and Google Play. We all hope that that will change and that there will be competition in the stores that we use and the means and mechanisms that we use; hence the group of amendments that I have tabled refer to app stores and “other access means”.
The government amendments do not really define what an app store is and I notice that my noble friend the Minister did not either. Can he give us some assurance that he is confident that the wording in the government amendments is future-proof and not prone to the rebranding of app stores into something else and/or replatforming among these enormous tech companies so that we access them through some other form of technology, and that they will still be caught by the Ofcom review?
These are my four questions. Fundamentally, I am extremely grateful for the collaborative way in which the Government and all of us in this House have developed this. We have been able to move this forward constructively, and I am very pleased and grateful.
Baroness Kidron (CB)

My Lords, I pay tribute to the noble Baroness, Lady Harding, for her role in bringing this issue forward. I too welcome the government amendments. It is important to underline that adding the potential role of app stores to the Bill is neither an opportunity for other companies to fail to comply and wait for the gatekeepers to do the job nor a one-stop shop in itself. It is worth reminding ourselves that digital journeys rarely start and finish in one place. In spite of the incredible war for our attention, in which products and services attempt to keep us rapt on a single platform, it is quite important for everyone in the ecosystem to play their part.

I have two minor points. First, I was not entirely sure why the government amendment requires the Secretary of State to consult as opposed to Ofcom. Can the Minister reassure me that, whoever undertakes the consultation, it will include children and children’s organisations as well as tech companies? Secondly, like the noble Baroness, Lady Harding, I was a little surprised that the amendment does not define an app store but uses the term “the ordinary meaning of”. That seems like it may have the possibility for change. If there is a good reason for that—I am sure there is—then it must be stated that app stores cannot suddenly rebrand to something else and that that gatekeeper function will be kept absolutely front and centre.

Notwithstanding those comments, and associating myself with the idea that nothing should wait until 2025-26, I am very grateful to the Government for bringing this forward.

Lord Allan of Hallam (LD)

My Lords, I will make a brief contribution because I was the misery guts when this was proposed first time round. I congratulate the noble Baroness, Lady Harding, not just on working with colleagues to come up with a really good solution but on seeking me out. If I heard someone be as miserable as I was, I might try to avoid them. She did not; she came and asked me, “Why are you miserable? What is the problem here?”, and took steps to address it. Through her work with the Government, their amendments address my main concerns.

My first concern, as we discussed in Committee, was that we would be asking large companies to regulate their competitors, because the app stores are run by large tech companies. She certainly understood that concern. The second was that I felt we had not necessarily yet clearly defined the problem. There are lots of problems. Before you can come up with a solution, you need a real consensus on what problem you are trying to address. The government amendment will very much help in saying, “Let’s get really crunchy about the actual problem that we need app stores to address”.

Finally, I am a glass-half-full kind of guy as well as a misery guts—there is a contradiction there—and so I genuinely think that these large tech businesses will start to change their behaviour and address some of the concerns, such as getting age ratings correct, just by virtue of our having this regulatory framework in place. Even if today the app stores are technically outside, the fact that the sector is inside and that this amendment tells them that they are on notice will, I think and hope, have a hugely positive effect and we will get the benefits much more quickly than the timescale envisaged in the Bill. That feels like a true backstop. I sincerely hope that the people in those companies, who I am sure will be glued to our debate, will be thinking that they need to get their act together much more quickly. It is better for them to do it themselves than wait for someone to do it to them.

Lord Clement-Jones (LD)

My Lords, I add my congratulations to the noble Baroness, Lady Harding, on her tenacity, and to the Minister on his flexibility. I believe that where we have reached is pretty much the right balance. There are the questions that the noble Baroness, Lady Harding, and others have asked of the Minister, and I hope he will answer those, but this is a game-changer, quite frankly. Rightly, the noble Baroness has paid tribute to the companies which have put their head above the parapet. That was not that easy for them to do when you consider that those are the platforms they have to depend on for their services to reach the public.

Alongside the research report, there are reserved powers that the Secretary of State can use if the report is positive, which I hope it will be. I believe this could be a turning point. The Digital Markets, Competition and Consumers Bill is coming down the track this autumn, and that is going to give greater powers to make sure that the app stores can be tackled—after all, there are only two of them and they are an oligopoly. They are the essence of big tech, and they need to function in a much more competitive way.

The noble Baroness talked about timing, and it needs to be digital timing, not analogue. Four years does seem a heck of a long time. I hope the Minister will address that.

Then there is the really important aspect of harmful content. In the last group, the Minister reassured us about systems and processes and the illegality threshold. Throughout, he has tried to reassure us that this is all about systems and processes and not so much about content. However, every time we look, we see that content is there almost by default, unless the subject is raised. We do not yet have a Bill that is actually fit for purpose in that sense. I hope the Minister will use his summer break wisely and read through the Bill to make sure that it meets its purpose, and then come back at Third Reading with a whole bunch of amendments that add functionalities. How about that for a suggestion? It is said in the spirit of good will and summer friendship.

The noble Baroness raised a point about transparency when it comes to Ofcom publishing its review. I hope the Minister can give that assurance as well.

The noble Baroness, Lady Kidron, asked about the definition of app store. That is the gatekeeper function, and we need to be sure that that is what we are talking about.

I end by congratulating once again the noble Baroness and the Minister on where we have got to so far.

Lord Knight of Weymouth (Lab)

My Lords, I will start with the final point of the noble Lord, Lord Clement-Jones. I remind him that, beyond the world of the smartphone, there is a small company called Microsoft that also has a store for software—it is not just Google and Apple.

Principally, I say well done to the noble Baroness, Lady Harding, for deploying all of her “winsome” qualities to corral those of us who have been behind her on this and then to persuade the Minister of the merits of her arguments. She also managed to persuade the noble Lord, Lord Allan of Misery Guts, that this was a good idea. The sequence of research, report and then regulation is a good one and, as the noble Lord, Lord Clement-Jones, reminded us, it is being deployed elsewhere in the Bill. I agree with the noble Baroness about the timing: I much prefer two years to four years. I hope that at least Ofcom would have the power to accelerate this if it wanted to do so.

I was reminded of the importance of this in an article I read in the Guardian last week, headed:

“More than 850 people referred to clinic for video game addicts”.


This was in reference to the NHS-funded clinic, the National Centre for Gaming Disorders. A third of gamers receiving treatment there were spending money on loot boxes in games such as “Fortnite”, “FIFA”, “Minecraft”, “Call of Duty” and “Roblox”—all games routinely accessed by children. Over a quarter of those being treated by the centre were children.

22:00
The article reported that Apple’s and Google’s app stores are
“increasingly offering games with gambling-style mechanics”—
systems addictive by design rather than safe by design. Leon Xiao, a loot box expert and PhD fellow at the IT University of Copenhagen, reports that
“there were informal standards for these games—such as a requirement to publish information about the probability of winning on a slot machine spin—but even these were not being adhered to”.
He is quoted as saying:
“Apple says if you want to upload your game to the Apple Store, you need to make disclosures about the probability of randomised features”.
He continued:
“We checked in 2021 and a third of companies were not doing it. Existing regulation is not being enforced”.
I gather that the Minister’s department has a working group to examine loot boxes. An update on that now, or in writing if he would prefer, would be helpful. The main point of raising this is apparent: app stores are an important pinch point in the digital user journey. We need to ensure that Ofcom has a proper look at whether including them helps it deliver the aims of the Bill. We should include the powers for it to be able to do that, in addition to the other safeguards that we are putting in the Bill to protect children. We strongly support these amendments.
Lord Parkinson of Whitley Bay (Con)

My Lords, I am very grateful for the strength of support and echo the tributes that have been paid to my noble friend Lady Harding—the winsome Baroness from Winscombe—for raising this issue and working with us so collaboratively on it. I am particularly glad that we were able to bring these amendments on Report; as she knows, it involved some speedy work by the Bill team and some speedy drafting by the Office of the Parliamentary Counsel, but I am glad that we were able to do it on Report, so that I can take it off my list of things to do over the summer, which was kindly written for me by the noble Lord, Lord Clement-Jones.

My noble friend’s amendments were laid before the Government’s, so she rightly asked a couple of questions on where they slightly differ. Her amendment seeks to ensure that other websites or online marketplaces that allow users to download apps are also caught by these duties. I reassure her that the Government’s amendments would capture these types of services. We have intentionally not provided detail about what constitutes an app store to ensure that the Bill remains future-proof. I will say a bit more about that in a moment. Regulations made by the Secretary of State under this power will be able to specify thresholds for which app stores are in scope, giving clarity to providers and users about the application of the duties.

On questions of definition, we are intentionally choosing not to define app stores in these amendments. The term is generally understood as meaning a service that makes applications available, which means that the Secretary of State will be able to impose duties on any such service. Any platform that enables apps to be downloaded can therefore be considered an app store for the purpose of this duty, regardless of whether or not it calls itself one. Regulations will clearly set out which providers are in scope of the duties. The ability to set threshold conditions will also ensure that any duties capture only those that pose the greatest risk of children accessing harmful content.

We touched on the long-running debate about content and functionality. We have made our position on that clear; it will be caught by references to content. I am conscious that we will return to this on Wednesday, when we will have a chance to debate it further.

On timing, as I said, I am glad that we were able to bring these amendments forward at this stage. The publication date for Ofcom’s report is to ensure that Ofcom can prioritise the implementation of the child safety duties and put in place the Bill’s vital protections for children before turning to its research on app stores.

That timing also allows the Secretary of State to base his or her decision on commencement on the effectiveness of the existing framework and to use the research of Ofcom’s report to set out a more granular approach to issues such as risk assessment and safety duties. It is necessary to await the findings of Ofcom’s report before those duties are commenced.

To the questions posed by the noble Baroness, Lady Kidron, and others about the consultation for that report by Ofcom, we expect Ofcom to consult widely and with all relevant parties when producing its report. We do not believe that there is a need for a specific list of consultees given Ofcom’s experience and expertise in this area as well as the great experience it will have through its existing enforcement and wider consultation requirements. In addition, the Secretary of State, before making regulations, will be required to consult a range of key parties, such as the Children’s Commissioner and the Information Commissioner, and those who represent the interests of children, as well as providers of app stores. That can include children themselves.

On the questions asked by the noble Lord, Lord Knight, on loot boxes, he is right that this piece of work is being led by my department. We want to see the games industry take the lead in strengthening protections for children and adults to mitigate the risk of harms. We are pursuing that through a DCMS-led technical working group, and we will publish an update on progress in the coming months. I again express my gratitude to my noble friend Lady Harding and other noble Lords who have expressed their support.

Amendment 236A agreed.
Amendment 236B
Moved by
236B: After Clause 194, insert the following new Clause—
“Power to regulate app stores: supplementary
(1) In this section (except in subsection (4)(c)) “regulations” means regulations under section (Power to regulate app stores)(1).
(2) Provision may be made by regulations only for or in connection with the purposes of minimising or mitigating the risks of harm to children presented by harmful content as mentioned in section (Power to regulate app stores)(3)(a) and (b).
(3) Regulations may not have the effect that any body other than OFCOM is the regulator in relation to app stores.
(4) Regulations may—
(a) make provision exempting specified descriptions of app stores from regulation under this Act;
(b) make provision amending Part 2, section 49 or Schedule 1 in connection with provision mentioned in paragraph (a);
(c) make provision corresponding or similar to provision which may be made by regulations under paragraph 1 of Schedule 11 (“threshold conditions”), with the effect that only app stores which meet specified conditions are regulated by this Act.
(5) Regulations may make provision having the effect that app stores provided from outside the United Kingdom are regulated by this Act (as well as app stores provided from within the United Kingdom), but, if they do so, must contain provision corresponding or similar to section 3(5) and (6) (UK links).
(6) The provision that may be made by regulations includes provision—
(a) imposing on providers of app stores duties corresponding or similar to duties imposed on providers of Part 3 services by—
(i) section 10 or 11 (children’s online safety: user-to-user services) or any of sections 16 to 19 so far as relating to section 10 or 11;
(ii) section 24 or 25 (children’s online safety: search services) or any of sections 26 to 29 so far as relating to section 24 or 25;
(b) imposing on providers of app stores duties corresponding or similar to duties imposed on providers of internet services within section 71(2) by section 72 (duties about regulated provider pornographic content);
(c) imposing on providers of app stores requirements corresponding or similar to requirements imposed on providers of regulated services by, or by OFCOM under, Part 6 (fees);
(d) imposing on OFCOM duties in relation to app stores corresponding or similar to duties imposed in relation to Part 3 services by Chapter 3 of Part 7 (OFCOM’s register of risks, and risk profiles);
(e) conferring on OFCOM functions in relation to app stores corresponding or similar to the functions that OFCOM have in relation to regulated services under—
(i) Chapter 4 of Part 7 (information), or
(ii) Chapter 6 of Part 7 (enforcement), including provisions of that Chapter conferring power for OFCOM to impose monetary penalties;
(f) about OFCOM’s production of guidance or a code of practice relating to any aspect of the regulation of app stores that is included in the regulations.
(7) The provision that may be made by regulations includes provision having the effect that app stores fall within the definition of “Part 3 service” or “regulated service” for the purposes of specified provisions of this Act (with the effect that specified provisions of this Act which apply in relation to Part 3 services or regulated services, or to providers of Part 3 services or regulated services, also apply in relation to app stores or to providers of app stores).
(8) Regulations may not amend or make provision corresponding or similar to—
(a) Chapter 2 of Part 4 (reporting CSEA content),
(b) Chapter 5 of Part 7 (notices to deal with terrorism content and CSEA content), or
(c) Part 10 (communications offences).
(9) Regulations may make different provision with regard to app stores of different kinds.
(10) In this section “specified” means specified in regulations.”
Member’s explanatory statement
This amendment makes provision about the purpose and contents of regulations to regulate app stores which may be made by the Secretary of State under the preceding new Clause proposed to be inserted in my name.
Amendment 236B agreed.
Lord Harlech (Con)

I beg to move that further consideration on Report be adjourned and that the House be adjourned during pleasure until 10.15 pm.

Lord Harris of Haringey (Lab)

My Lords, has the noble Lord, Lord Harlech, seen paragraph 3.1 of the Companion? In case he has not—I know he is very new to this House—it states:

“It is a firm convention”—


not any old convention, but a firm convention—

“that the House normally rises by about 10pm on Mondays to Wednesdays”.

Can he explain why today is so different?

Lord Harlech (Con)

I take the noble Lord’s points on board. I think that my noble friend the Chief Whip answered those points at the Dispatch Box earlier today.

Lord Harris of Haringey (Lab)

I appreciate that the noble Lord is put in a difficult position, but all the Chief Whip said was that this is usual. When was the last occasion that this had to happen in the way that it is happening tonight?

Lord Harlech (Con)

Perhaps this is something to discuss with my noble friend the Chief Whip while the House adjourns during pleasure.

22:08
Sitting suspended.

Online Safety Bill

Report (5th Day)
16:19
Relevant documents: 28th and 38th Reports from the Delegated Powers Committee, 15th Report from the Constitution Committee. Scottish and Welsh Legislative Consent granted.
Amendment 236C
Moved by
236C: After Clause 194, insert the following new Clause—
“Power to impose duty about alternative dispute resolution procedure
(1) The Secretary of State may by regulations amend this Act for or in connection with the imposition on providers of Category 1 services of an ADR duty.
(2) An “ADR duty”—
(a) is a duty requiring providers of Category 1 services to arrange for and engage in an alternative dispute resolution procedure in specified circumstances for the resolution of disputes about their handling of relevant complaints, and
(b) may include a duty requiring such providers to meet the costs incurred by any other person in using a dispute resolution procedure which is so arranged.
(3) Complaints are “relevant” for the purposes of subsection (2)(a) if they—
(a) relate to a Category 1 service,
(b) are of a specified kind, and
(c) are made by persons of a specified kind.
(4) Regulations under this section may not be made before the publication of a statement by the Secretary of State responding to OFCOM’s report under section (OFCOM’s report about reporting and complaints procedures) (report about reporting and complaints procedures in use by providers of Part 3 services: see subsection (10) of that section).
(5) Before making regulations under this section the Secretary of State must consult—
(a) OFCOM,
(b) the Information Commissioner, and
(c) such other persons as the Secretary of State considers appropriate.
(6) If the power conferred by subsection (1) is exercised, the first regulations made under the power must—
(a) require the use of a dispute resolution procedure which is impartial, and
(b) prohibit the use of a dispute resolution procedure which restricts or excludes the availability of civil proceedings.
(7) Provision made by regulations under this section may have the effect that the duties set out in any or all of sections 17, 18 and 19 which apply in relation to duties imposed by other provisions of Chapter 2 of Part 3 are also to apply in relation to the ADR duty, and accordingly the regulations may amend—
(a) section 17(6),
(b) the definition of “safety measures and policies” in section 18(8), or
(c) the definition of “relevant duties” in section 19(10).
(8) The provisions of this Act that may be amended by the regulations in connection with the imposition of the ADR duty include, but are not limited to, the following provisions (in addition to those mentioned in subsection (7))—
(a) section 6(5),
(b) section 94(12)(a), and
(c) section 120(2).
(9) If the power conferred by subsection (1) is exercised, the first regulations made under the power must require OFCOM to—
(a) produce and publish guidance for providers of Category 1 services to assist them in complying with the ADR duty, and
(b) consult the Secretary of State, the Information Commissioner and such other persons as OFCOM consider appropriate before producing the guidance.
(10) Section 184(1) applies for the purposes of the references to Category 1 services in this section.
(11) In this section “specified” means specified in regulations under this section.
(12) For the meaning of “Category 1 service”, see section 86 (register of categories of services).”
Member’s explanatory statement
This amendment provides that the Secretary of State may make regulations amending this Bill so as to impose a new duty on providers of Category 1 services to arrange for and engage in an out of court, impartial dispute resolution procedure. The regulations may not be made until the Secretary of State has responded to OFCOM’s report about content reporting and complaints procedures under the new clause proposed to be inserted after Clause 147 in my name.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, the government amendments in this group relate to content reporting and complaints procedures. The Bill’s existing duties on each of these topics are a major step forward and will provide users with effective methods of redress. There will now be an enforceable duty on Part 3 services to offer accessible, transparent and easy-to-use complaints procedures. This is an important and significant change from which users and others will benefit directly.

Furthermore, Part 3 services complaints procedures will be required to provide for appropriate action to be taken in response to complaints. The duties here will fundamentally alter how complaints systems are operated by services, and providers will have to make sure that their systems are up to scratch. If services do not comply with their duties, they will face strong enforcement measures.

However, we have listened to concerns raised by your Lordships and others, and share the desire to ensure that complaints are handled effectively. That is why we have tabled Amendments 272AA and 274AA, to ensure that the Bill’s provisions in this area are the subject of a report to be published by Ofcom within two years of commencement.

Amendment 272AA places a requirement on Ofcom to undertake a report about Part 3 services reporting and complaints procedures. The report will assess the measures taken or in use by providers of Part 3 services to enable users and others to report content and make complaints. In assessing the content reporting and complaints measures in place, the report must take into account users’ and others’ experiences of those procedures—including how easy to use and clear they are for reporting content and making complaints, and whether providers are taking appropriate and timely action in response.

In this report, Ofcom must provide advice to the Secretary of State about whether she should use her power set out in Amendment 236C to make regulations imposing an alternative dispute resolution duty on category 1 services. Ofcom may also make wider recommendations about how the complaints and user redress provisions can be strengthened, and how users’ experiences with regard to complaints can be improved more broadly. Amendment 274AA is a consequential amendment ensuring that the usual confidentiality provisions apply to matters contained in that report.

These changes will ensure that the effectiveness of the Bill’s content reporting and complaints provisions can be thoroughly assessed by Ofcom two years after the commencement of the provision, providing time for the relevant reporting and complaints procedures to bed in.

Amendment 236C then provides that the Secretary of State will have a power to make regulations to amend the Act in order to impose an alternative dispute resolution duty on providers of category 1 services. This power can be used after the Secretary of State has published a statement in response to Ofcom’s report. This enables the Secretary of State to impose via regulations a duty on the providers of category 1 services to arrange for and engage in an impartial, out-of-court alternative dispute resolution procedure in respect of complaints. This means that, if the Bill’s existing user redress provisions are found to be insufficient, this requirement can quickly be imposed to strengthen the Bill.

This responds directly to concerns which noble Lords raised about cases where users or parents may feel that they have nowhere to turn if they are dissatisfied with a service’s response to their complaint. We believe that the existing provisions will remedy this, but, if they do not, these new requirements will ensure that there is an impartial, alternative dispute resolution procedure which will work towards the effective resolution of the complaint between the service and the complainant.

At the same time, it will avoid creating a single ombudsman, person or body which may be overwhelmed either through the volume of complaints from multiple services or by the complexity of applying such disparate services’ varying terms of service. Instead, if required, this power will put the onus on the provider to arrange for and engage in an impartial dispute resolution procedure.

Amendment 237D requires that, if regulations are made requiring category 1 services to offer an alternative dispute resolution procedure, such regulation must be subject to the affirmative parliamentary procedure. This ensures that Parliament will continue to have oversight of this process.

I hope that noble Lords are reassured that the Bill not only requires services to provide users and others with effective forms of redress but that these further amendments will ensure that the Bill’s provisions in this area will be thoroughly reviewed and that action can be taken quickly if it is needed. I beg to move.

Lord Stevenson of Balmacara (Lab)

My Lords, I am grateful to hear what the Minister has just announced. The scheme that was originally prefigured in the pre-legislative scrutiny report has now got some chance of being delivered. I think the process and procedures are quite appropriate; it does need review and thought. There needs to be account taken of practice on the ground, how people have found the new system is working, and whether or not there are gaps that can be filled this way. I give my full support to the proposal, and I am very glad to see it.

Having got to the Dispatch Box early, I will just appeal to our small but very important group. We are on the last day on Report. We are reaching a number of issues where lots of debate has taken place in Committee. I think it would be quite a nice surprise for us all if we were to get through this quickly. The only way to do that is by restricting our contributions.

Baroness Morgan of Cotes (Con)

My Lords, I will speak briefly to Amendments 272AA and 274AA, only because at the previous stage of the Bill I tabled amendments related to the reporting of illegal content and fraudulent advertisements, both in reporting, and complaints and transparency. I have not re-tabled them here, but I have had conversations with my noble friend the Minister. It is still unclear to those in the House and outside why the provisions relating to that type of reporting would not apply to fraudulent advertisements, particularly given that the more information that can be filed about those types of scams and fraudulent advertisements, the easier it would be for the platforms to gather information, and help users and others to start to crack down on that. I wonder if, when he sums up, my noble friend could say something about the reporting provisions relating to fraudulent advertisements generally, and in particular around general reporting and reporting relating to complaints by users.

Lord Allan of Hallam (LD)

My Lords, I am mindful of the comments of the noble Lord, Lord Stevenson, to be brief. I add a note of welcome to the mechanism that has been set out.

In this legislation, we are initiating a fundamental change to the way in which category 1 providers will run their reporting systems, in that prior to this they have not had any external oversight. Ofcom’s intervention will be material, given that online service providers will have to explain to Ofcom what they are doing and why.

We should note that we are also asking providers to do some novel prioritisation. The critical thing with all these reporting systems is that they operate at such huge volumes. I will not labour the points, but if noble Lords are interested they can look at the Meta and YouTube transparency reports, where it is explained that they are actioning tens of millions of pieces of content each month, on the basis of hundreds of millions of reports. If you get even 1% of 10 million reports wrong, that is 100,000 errors. We should have in mind the scale we are operating at. Ofcom will not be able to look at each one of those, but I think it will be able to produce a valuable system and make sure that quality control is improved across those systems, working with the providers. Having additional powers to create an alternative dispute resolution mechanism, where one does not exist and where it would prove useful, is helpful. However, the slow and steady approach of seeing what will happen with those systems under Ofcom supervision before jumping into the next stage is right.

I also note that we are asking platforms to do some prioritisation in the rest of the Online Safety Bill. For example, we are saying that we wish journalistic content and content from politicians to be treated differently from ordinary user content. All of those systems need to be bedded in, so it makes sense to do it at a reasonable pace.

I know that the noble Baroness, Lady Newlove, who cannot be here today, was also very interested in this area and wanted to make sure we made the point that the fact there is a reasonable timescale for the review does not mean that we should take our foot off the pedal now for our expectations for category 1 service providers. I think I heard that from the Minister, but it would be helpful for him to repeat it. We will be asking Ofcom to keep the pressure on to get these systems right now, and not just wait until it has done the report and then seek improvements at that stage. With that—having been about as brief as I can be— I will sit down.

16:30
Baroness Harding of Winscombe (Con)

My Lords, I promise I will be brief. I, too, welcome what the Minister has said and the amendments that the Government have proposed. This is the full package which we have been seeking in a number of areas, so I am very pleased to see it. My noble friend Lady Newlove and the noble Baroness, Lady Kidron, are not in their places, but I know I speak for both of them in wanting to register that, although the thoughtful and slow-and-steady approach has some benefits, there are also some real costs to it. The UK Safer Internet Centre estimates that there will be some 340,000 individuals in the UK who will have no recourse for action if a platform’s complaints mechanism does not work for them in the next two years. That is quite a large number of people, so I have one very simple question for the Minister: if I have exhausted the complaints procedure with an existing platform in the next two years, where do I go? I cannot go to Ofcom. My noble friend Lord Grade was very clear in front of the committee I sit on that it is not Ofcom’s job. Where do I go if I have a complaint that I cannot get resolved in the next two years?

Lord Clement-Jones (LD)

My Lords, I declare an interest as chair of Trust Alliance Group, which operates the energy and communications ombudsman schemes, so I have a particular interest in the operation of these ADR schemes. I thank the Minister for the flexibility that he has shown in the provision about the report by Ofcom and in having backstop powers for the Secretary of State to introduce such a scheme.

Of course, I understand that the noble Baroness, Lady Newlove, and the UK Safer Internet Centre are very disappointed that this is not going to come into effect immediately, but there are advantages in not setting out the scheme at this very early point before we know what some of the issues arising are. I believe that Ofcom will definitely want to institute such a scheme, but it may be that, in the initial stages, working out the exact architecture is going to be necessary. Of course, I would have preferred to have a mandated scheme, in the sense that the report will look not at the “whether” but the “how”, but I believe that at the end of the day it will be absolutely obvious that there needs to be such an ADR scheme in order to provide the kind of redress the noble Baroness, Lady Harding, was talking about.

I also agree with the noble Baroness, Lady Morgan, that the kinds of complaints that this would cover should include fraudulent adverts. I very much hope that the Minister will be able to answer the questions that both noble Baronesses asked. As my noble friend said, will he reassure us that the department and Ofcom will not take their foot off the pedal, whatever the Bill may say?

Lord Parkinson of Whitley Bay (Con)

I am grateful to noble Lords for their warm support and for heeding the advice of the noble Lord, Lord Stevenson, on brevity. We must finish our Report today. The noble Lord, Lord Allan, is right to mention my noble friend Lady Newlove, who I have spoken to about this issue, as well as the noble Lord, Lord Russell of Liverpool, who has raised some questions here.

Alongside the strong duties on services to offer content reporting and complaints procedures, our amendments will ensure that the effectiveness of these provisions can be reviewed after they have had sufficient time to bed in. The noble Lord, Lord Allan, asked about timing in more detail. Ofcom must publish the report within the two-year period beginning on the day on which the provision comes into force. That will allow time for the regime to bed in before the report takes place, ensuring that its conclusions are informed by how the procedures work in practice. If necessary, our amendments will allow the Secretary of State to impose via regulations a duty on the providers of category 1 services to arrange for and engage in an impartial, out-of-court alternative dispute resolution procedure, providing the further strengthening which I outlined in opening.

I can reassure my noble friend Lady Morgan of Cotes that reporting mechanisms to facilitate providers’ removal of fraudulent advertisements are exactly the kinds of issues that Ofcom’s codes of practice will cover, subject to consultation and due process. As companies have duties to remove fraudulent advertising once they are alerted to it, we expect platforms will need the necessary systems and processes in place to enable users to report fraudulent adverts so that providers can remove them.

My noble friend Lady Harding asked the question which was posed a lot in Committee about where one goes if all avenues are exhausted. We have added further avenues for people to seek redress if they do not get it but, as I said in Committee, the changes that we are bringing in through this Bill will mark a significant change for people. Rather than focusing on the possibility, even further diminished by the additional amendments we are bringing today, of their not having their complaints adequately addressed, I hope she will see the provisions in the Bill and in these amendments as bringing in the change we all want to see to improve users’ safety online.

Amendment 236C agreed.
Amendment 237
Moved by
237: After Clause 195, insert the following new Clause—
“Powers to amend sections (“Primary priority content that is harmful to children”) and (“Priority content that is harmful to children”)
(1) The Secretary of State may by regulations amend—(a) section (“Primary priority content that is harmful to children”) (primary priority content that is harmful to children);(b) section (“Priority content that is harmful to children”) (priority content that is harmful to children).But the power to add a kind of content is limited by subsections (2) to (4).(2) A kind of content may be added to section (“Primary priority content that is harmful to children”) only if the Secretary of State considers that, in relation to Part 3 services—(a) there is a material risk of significant harm to an appreciable number of children presented by content of that kind that is regulated user- generated content or search content, and(b) it is appropriate for the duties set out in sections 11(3)(a) and 25(3)(a) (duty in relation to children of all ages) to apply in relation to content of that kind.(3) A kind of content may be added to section (“Priority content that is harmful to children”) only if the Secretary of State considers that, in relation to Part 3 services, there is a material risk of significant harm to an appreciable number of children presented by content of that kind that is regulated user-generated content or search content.(4) A kind of content may not be added to section (“Primary priority content that is harmful to children”) or (“Priority content that is harmful to children”) if the risk of harm presented by content of that kind flows from—(a) the content’s potential financial impact, (b) the safety or quality of goods featured in the content, or(c) the way in which a service featured in the content may be performed (for example, in the case of the performance of a service by a person not qualified to perform it).(5) The Secretary of State must consult OFCOM before making regulations under this section.(6) In this section references to children are to children in the United Kingdom.(7) In this section—“regulated user-generated content” has the same meaning as in Part 3 (see section 49);“search content” has the same meaning as in Part 3 (see section 51).”Member’s explanatory statement
This amendment gives power for the Secretary of State to make regulations changing the kinds of content that count as primary priority content and priority content harmful to children, subject to certain constraints set out in the Clause.
Amendment 237 agreed.
Amendment 237ZA not moved.
Clause 200: Regulations: general
Amendment 237A
Moved by
237A: Clause 200, page 168, line 5, after “State” insert “or OFCOM”
Member’s explanatory statement
This amendment has the effect that regulations made by OFCOM under the Bill must be made by statutory instrument.
Lord Parkinson of Whitley Bay (Con)

My Lords, Amendments 238A and 238D seek to change the parliamentary process for laying—oh, I am skipping ahead with final day of Report enthusiasm.

As noble Lords know, companies will fund the costs of Ofcom’s online safety functions through annual fees. This means that the regime which the Bill ushers in will be cost neutral to the taxpayer. Once the fee regime is operational, regulated providers with revenue at or above a set threshold will be required to notify Ofcom and to pay a proportionate fee. Ofcom will calculate fees with reference to the provider’s qualifying worldwide revenue.

The Delegated Powers and Regulatory Reform Committee of your Lordships’ House has made two recommendations relating to the fee regime which we have accepted, and the amendments we are discussing in this group reflect this. We are also making a further change to definitions to ensure that Ofcom can collect proportionate fees.

A number of the amendments in my name relate to qualifying worldwide revenue. Presently, the Bill outlines that this should be defined in a published statement laid before Parliament. Your Lordships’ committee advised that it should be defined through regulations subject to the affirmative procedure. We have agreed with this and are proposing changes to Clause 76 so that Ofcom can make provisions about qualifying worldwide revenue by regulations which, as per the committee’s recommendations, will be subject to the affirmative procedure.

Secondly, the committee recommended that we change the method by which the revenue threshold is defined. Presently, as set out in the Bill, it is set by the Secretary of State in a published statement laid before Parliament. The committee recommended that the threshold be set through regulations subject to the negative procedure and we are amending Clause 77 to make the recommended change.

Other amendments seek to make a further change to enable Ofcom to collect proportionate fees from providers. A provider of a regulated service the qualifying worldwide revenue of which is equal to, or greater than, the financial threshold will be required to notify Ofcom and pay an annual fee, calculated by reference to its qualifying worldwide revenue. Currently, this means that the fee calculation can be based only on the revenue of the regulated provider. The structure of some technology companies, however, means that how they accrue revenue is not always straightforward. The entity which meets the definition of a provider may therefore not be the entity which generates revenue referable to the regulated service.

Regulations to be made by Ofcom about the qualifying worldwide revenue will therefore be able to provide that the revenue accruing to certain entities in the same group as a provider of a regulated service can be taken into account for the purposes of determining qualifying worldwide revenue. This will enable Ofcom, when making such regulations, to make provisions, if necessary, to account for instances where a provider has a complex group structure; for example, where the regulated provider might accrue only a portion of the revenue referable to the regulated service, the rest of which might be accrued by other entities in the group’s structure. These amendments to Clause 76 address these issues by allowing Ofcom to make regulations which provide that the revenue from certain other entities within the provider’s group structure can be taken into account. I beg to move.

Lord Allan of Hallam (LD)

My Lords, we have not talked much about fees in our consideration of the Bill, and I will not talk much about them today, but there are some important questions. We should not skip too lightly over the fact that we will be levying revenues from online providers. That might have a significant impact on the markets. I have some specific questions about this proposed worldwide revenue method but I welcome these amendments and that we will now be getting a better procedure. This will also allow the Minister to say, “All these detailed points can be addressed when these instruments come before Parliament”. That is a good development. However, there are three questions that are worth putting on the record now so that we have time to think about them.

First, what consideration will be given to the impact on services that do not follow a classic revenue model but instead rely on donations and other sorts of support? I know that we will come back to this question in a later group but there are some very large internet service providers that do not follow the classic advertising-funded model, instead relying on foundations and other things. They will have significant questions about what we would judge their qualifying worldwide revenue to be, given that they operate to these very different models.

The second question concerns the impact on services that may have a very large footprint outside the UK, and significant worldwide revenues, but which do very little business within the UK. The amendment that the Minister has tabled about group revenues is also relevant here. You can imagine an entity which may be part of a very large worldwide group making very significant revenues around the world. It has a relatively small subsidiary that is offering a service in the UK, with relatively low revenues. There are some important questions there around the potential impact of the fees on decision-making within that group. We have discussed how we do not want to end up with less choice for consumers of services in the UK. There is an interesting question there as to whether getting the fee level wrong might lead to worldwide entities saying, “If you’re going to ask me to pay a fee based on my qualifying worldwide revenue, the UK market is just not worth it”. That may be particularly true if, for example, the European Union and other markets are also levying a fee. You can see a rational business choice of, “We’re happy to pay the fee to the EU but not to Ofcom if it is levied at a rate that is disproportionate to the business that we do here”.

The third and very topical question is about the Government’s thinking about services with declining revenues but whose safety needs are not reducing and may even be increasing. I hope as I say this that people have Twitter in mind, which has very publicly told us that its revenue is going down significantly. It has also very publicly fired most of its trust and safety staff. You can imagine a model within which, because its revenue is declining, it is paying less to Ofcom precisely when Ofcom needs to do more supervision of it.

I hope that we can get some clarity around the Government’s intentions in these circumstances. I have referenced three areas where the worldwide qualifying revenue calculation may go a little awry. The first is where the revenue is not classic commercial income but comes from other sources. The second is where the footprint in the UK is very small but it is otherwise a large global company which we might worry will withdraw from the market. The third, and perhaps most important, is what the Government’s intention is where a company’s revenue is declining and it is managing its platform less well and its Ofcom needs increase, and what we would expect to happen to the fee level in those circumstances.

16:45
Lord Stevenson of Balmacara (Lab)

My Lords, there is very little to add to that. These are important questions. I simply was struck by the thought that the amount of work, effort and thought that has gone into this should not be kept within this Bill. I wonder whether the noble Lord has thought of offering his services to His Majesty’s Treasury, which has difficulty in raising tax from these companies. It would be nice to see that problem resolved.

Lord Parkinson of Whitley Bay (Con)

I am looking forward to returning to arts and heritage; I will leave that to my noble friend Lady Penn.

The noble Lord, Lord Allan, asked some good questions. He is right: the provisions and the parliamentary scrutiny allow for the flexibility for all these things to be looked at and scrutinised in the way that he set out. I stress that the fee regime is designed to be fair to industry; that is central to the approach we have taken. The Bill stipulates that Ofcom must charge only proportionate and justifiable fees to industry. The provisions that Ofcom can make via regulation about the qualifying worldwide revenue aim to ensure that fees are truly representative of the revenue relating to the regulated service and that they will encourage financial transparency. They also aim to aid companies with complex structures which would otherwise struggle to segregate revenues attributable to the provider and its connected entities.

The revenue of the group undertaking can be considered in scope of a provider’s qualifying worldwide revenue if the entity was a member of the provider’s group during any part of the qualifying period and the entity receives during the qualifying period any amount referable to a regulated service. The regulations provide Ofcom with a degree of flexibility as to whether or not to make such provisions, because Ofcom will aim to keep the qualifying worldwide revenue simple.

I am grateful for noble Lords’ support for the amendments and believe that they will help Ofcom and the Government to structure a fair and transparent fee regime which charges proportionate fees to fund the cost of the regulatory regime that the Bill brings in.

Amendment 237A agreed.
Amendment 237B
Moved by
237B: Clause 200, page 168, line 6, at end insert—
“(3A) The Statutory Instruments Act 1946 applies in relation to OFCOM’s powers to make regulations under this Act as if OFCOM were a Minister of the Crown.
(3B) The Documentary Evidence Act 1868 (proof of orders and regulations etc) has effect as if—
(a) OFCOM were included in the first column of the Schedule to that Act;
(b) OFCOM and persons authorised to act on their behalf were mentioned in the second column of that Schedule.”
Member’s explanatory statement
This amendment makes technical provision in relation to regulations made by OFCOM under the Bill.
Amendment 237B agreed.
Clause 201: Parliamentary procedure for regulations
Amendments 237C to 237DA
Moved by
237C: Clause 201, page 168, line 11, at end insert—
“(aa) regulations under section (“Regulations by OFCOM about qualifying worldwide revenue etc”)(1),”
Member’s explanatory statement
This amendment provides that regulations made by OFCOM under subsection (1) of the new Clause 76 proposed in my name regarding “qualifying worldwide revenue” etc are subject to the affirmative Parliamentary procedure.
237D: Clause 201, page 168, line 14, at end insert—
“(da) regulations under section (Power to regulate app stores)(1),”
Member’s explanatory statement
This amendment provides that regulations made under the new Clause proposed in my name after Clause 194 are subject to the affirmative Parliamentary procedure.
237DA: Clause 201, page 168, line 14, at end insert—
“(da) regulations under section (Power to impose duty about alternative dispute resolution procedure)(1),”
Member’s explanatory statement
This amendment provides that regulations made under the new Clause proposed to be inserted in my name after Clause 194, concerning regulations to impose a duty on providers of Category 1 services about using an alternative dispute resolution procedure, are subject to the affirmative Parliamentary procedure.
Amendments 237C to 237DA agreed.
Amendment 237DB not moved.
Amendments 237E and 238
Moved by
237E: Clause 201, page 168, line 23, at end insert—
“(m) regulations under paragraph 5(9) of Schedule 13,”
Member’s explanatory statement
This amendment provides that regulations made by OFCOM under paragraph 5(9) of Schedule 13 regarding “qualifying worldwide revenue” etc for the purposes of that paragraph are subject to the affirmative Parliamentary procedure.
238: Clause 201, page 168, line 26, leave out “54(2) or (3)” and insert “(Powers to amend sections (“Primary priority content that is harmful to children”) and (“Priority content that is harmful to children”))(1)”
Member’s explanatory statement
This amendment ensures that regulations made under the new Clause proposed to be inserted after Clause 195 in my name are subject to the affirmative procedure, except in cases of urgency.
Amendments 237E and 238 agreed.
Amendment 238A
Moved by
238A: Clause 201, page 169, line 3, at end insert—
“(7A) A statutory instrument containing the first regulations under paragraph 1(1) of Schedule 11 (whether alone or with regulations under paragraph 1(2) or (3) of that Schedule) may not be made unless a draft of the instrument has been laid before, and approved by a resolution of, each House of Parliament.
(7B) Any other statutory instrument containing regulations under paragraph 1(1) of Schedule 11 is subject to annulment in pursuance of a resolution of either House of Parliament.”
Member’s explanatory statement
This amendment provides that the first regulations made under paragraph 1(1) of Schedule 11 (regulations specifying Category 1 threshold conditions) are subject to the affirmative Parliamentary procedure.
Lord Parkinson of Whitley Bay (Con)

My Lords, as I was eagerly anticipating, government Amendments 238A and 238D seek to change the parliamentary process for laying the first regulations specifying the category 1 threshold conditions from the negative to the affirmative procedure. I am pleased to bring forward this change in response to the recommendation of your Lordships’ Delegated Powers and Regulatory Reform Committee.

The change will ensure that there are adequate levels of parliamentary scrutiny of the first regulations specifying the category 1 threshold conditions. This is appropriate given that the categorisation of category 1 services will lead to the most substantial duties on the largest and most influential services. As noble Lords are aware, these include the duties on user empowerment, user identity verification, journalistic and news publisher content, content of democratic importance, and fraudulent advertising.

Category 2A services will have only additional transparency and fraudulent advertising duties, and category 2B services will be subject only to additional transparency reporting duties. The burden of these duties is significantly less than the additional category 1 duties, and we have therefore retained the use of the negative resolution procedure for these regulations, as they require less parliamentary scrutiny.

Future changes to the category 1 threshold conditions will also use the negative procedure. This will ensure that the regime remains agile in responding to change, which I know was of particular concern to noble Lords when we debated the categorisation group in Committee. Keeping the negative procedure for such subsequent uses will avoid the risk of future changes being subject to delays because of parliamentary scheduling. I beg to move.

Baroness Morgan of Cotes (Con)

My Lords, I shall speak to Amendment 245. I would like to thank my noble friend the Minister, and also the Minister on leave, for the conversations that I have had with them about this amendment and related issues. As we have already heard, the platform categorisation is extremely important. So far, much of it is unknown, including which sites are actually going to be in which categories. For example, we have not yet seen any proposed secondary regulations. As my noble friend has just outlined, special duties apply, especially for those sites within category 1—user empowerment in particular, but also other duties relating to content and fraudulent advertisements.

Clause 85 and Schedule 11 set out the thresholds for determining which sites will be in category 1, category 2A or category 2B. I am very mindful of the exhortation of the noble Lord, Lord Stevenson, about being brief, but it is amazing how much you have to say about one word to explain this amendment. This amendment proposes to change an “and” to an “or” in relation to determining which sites would fall within category 1. It would move from a test of size “and” functionality to a test of size “or” functionality. This would give Ofcom more flexibility to decide which platforms really need category 1 designation. Category 1 should not be decided just on size; it should also be possible to determine it on the basis of functionality.

Functionality is defined in the Bill in Clause 208. We will get to those amendments shortly, but there is no doubt from what the Government have already conceded, or agreed with those of us who have been campaigning passionately on the Bill for a number of years, that functionality can make a platform harmful. It is perfectly possible to have small platforms that both carry highly harmful content and themselves become harmful in the way that they are designed. We have heard many examples and I will not detain the House with them, but I draw attention to two particular sites which capture how broad this is. The perpetrators of offline hate crimes are often linked to these small platforms. For example, the perpetrator of the 2018 Tree of Life synagogue mass shooting had an online presence on the right-wing extremist social network Gab. In the UK, Jake Davison, the self-proclaimed incel who killed five people in Plymouth in 2021, frequented smaller incel forums after he was banned from Reddit in the days leading up to the mass shooting.

I also want to share with noble Lords an email that I received just this week from a family who had been to see their Member of Parliament, Matt Rodda MP, and also the noble Baroness, Lady Kidron, who I know is very regretful that she cannot be here today. I thank Victoria and Jean Eustace for sharing the story of their sister and daughter. Victoria wrote: “I am writing to you regarding the Online Safety Bill, as my family and I are concerned it will not sufficiently protect vulnerable adults from harm. My sister, Zoe Lyalle, killed herself on 26 May 2020, having been pointed towards a method using an online forum called Sanctioned Suicide. Zoe was 18 years old at the time of her death and as such technically an adult, but she was autistic, so she was emotionally less mature than many 18 year-olds. She found it difficult to critically analyse written content”. She says that “The forum in question is not large and states on its face that it does not encourage suicide, although its content does just that”. The next part I was even more shocked about: “Since Zoe’s death, we have accessed her email account. The forum continues to email Zoe, providing her with updates on content she may have missed while away from the site, as well as requesting donations. One recent email included a link to a thread on the forum containing tips on how best to use the precise method that Zoe had employed”.

In her note to me, the Minister on leave said that she wanted to catch some of the platforms we are talking about with outsized influence. In my reply, I said that those sites on which people are encouraged to take their own lives or become radicalised and therefore take the harms they are seeing online into the real world undoubtedly exercise influence and should be tackled.

It is also perfectly possible for us to have large but safe platforms. I know that my noble friend Lord Moylan may want to discuss this in relation to sites that he has talked about already on this Bill. The risk of the current drafting is a flight of users from these large platforms, newly categorised as category 1, to the small, non-category 1 platforms. What if a platform becomes extremely harmful very quickly? How will it be recategorised speedily but fairly, and with parliamentary oversight?

The Government have run a variety of arguments as to why the “and” in the Bill should not become an “or”. They say that it creates legal uncertainty. Every Bill creates legal uncertainty; that is why we have an army of extremely highly paid lawyers, not just in this country but around the world. They say that what we are talking about is broader than illegal content or content related to children’s safety, but they have already accepted an earlier amendment on safety by design and, in subsections (10) to (12) of Clause 12, that specific extra protections should be available for content related to

“suicide or an act of deliberate self-injury, or … an eating disorder or behaviours associated with an eating disorder”

or abusive content relating to race, religion, sex, sexual orientation, disability or gender reassignment and that:

“Content is within this subsection if it incites hatred against people”.


The Government have already breached some of their own limits on content that is not just illegal or relates to child safety duties. In fact, they have agreed that that content should have enhanced triple-shield protection.

The Government have also said that they want to avoid burdens on small but low-harm platforms. I agree with that, but with an “or” it would be perfectly possible for Ofcom to decide by looking at size or functionality and to exclude those smaller platforms that do not present the harm we all care about. The Minister may also offer me a review of categorisation; however, it is a review of the tiers of categorisation and not the sites within the categories, which I think many of us will have views on over the years.

I come to what we should do on this final day of Report. I am very thankful to those who have had many conversations on this, but there is a fundamental difference of opinion in this House on these matters. We will talk about functionality shortly and I am mindful of the pre-legislative scrutiny committee’s recommendation that this legislation should adopt

“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.

There should be other factors. Ofcom should have the ability to decide whether it takes one factor or another, and not have a series of all the thresholds to be passed, to give it the maximum flexibility. I will listen very carefully to what my noble friend the Minister and other noble Lords say, but at this moment I intend to test the opinion of the House on this amendment.

Baroness Finlay of Llandaff (CB)

My Lords, I strongly support Amendment 245. The noble Baroness, Lady Morgan of Cotes, has explained the nub of the problem we are facing—that size and functionality are quite separate. You can have large sites that perform a major social function and are extremely useful across society. Counter to that, you can have a small site focused on being very harmful to a small group of people. The problem is that, without providing the flexibility to Ofcom to determine how the risk assessment should be conducted, the Bill would lock it into leaving these small, very harmful platforms able to pursue their potentially ever more harmful activities almost out of sight. It does nothing to make sure that their risk assessments are appropriate.

17:00
We have already discussed the need to future-proof the Bill and I have tried to lay some amendments to that effect which the Government have not accepted. I hope that they will accept this amendment because this one change of wording would allow the flexibility that could provide a degree of future-proofing that is not provided otherwise within the Bill.
The amendment does not remove the sites completely. Those sites promoting suicide, serious self-harm and other activities across society will still continue, but because they will potentially be able to be captured and required to look at their risk assessment, their activities will perhaps at least be curtailed and, to a certain extent, regulated. It seems that the amendment simply provides a level playing field in the core issue of safety, which has been a theme we have addressed right through the Bill. I hope the Minister will accept the amendment as it is; one change of wording could allow Ofcom to do its job so much better. If he does not, I hope the amendment will be strongly supported by all sides of the House.
Lord Allan of Hallam (LD)

My Lords, I am pleased to follow the noble Baroness, Lady Morgan of Cotes, and her amendment, which tries to help parliamentary counsel draft better regulations later on. I am really struggling to see why the Government want to resist something that will make their life easier if they are going to do what we want them to do, which is to catch those high-risk services—as the noble Baroness, Lady Finlay, set out—but also, as we have discussed in Committee and on Report, exclude the low-risk services that have been named, such as Wikipedia and OpenStreetMap.

I asked the Minister on Report how that might happen, and he confirmed that such services are not automatically exempt from the user-to-user services regulations, but he also confirmed that they might be under the subsequent regulations drafted under Schedule 11. That is precisely why we are coming back to this today; we want to make sure that they can be exempt under the regulations drafted under Schedule 11. The test should be: would that be easier under the amended version proposed by the noble Baroness, Lady Morgan, or under the original version? I think it would be easier under the amended version. If the political intent is there to exclude the kind of services that I have talked about—the low-risk services—and I think it should be, because Ofcom should not be wasting time, in effect, supervising services that do not present a risk, nor creating a supervisory model that may end up driving those services out of the UK market because they cannot legally make the kind of commitments Ofcom would expect of them, then having two different thresholds, size and functionality, gives the draftspeople the widest possible choice. By saying “or”, we are not saying they cannot set a condition that is “and” or excludes “and”, but “and” does exclude “or”, if I can put it that way. They can come back with a schedule that says, “You must be of this size and have this kind of functionality”, or they could say “this functionality on its own”—to the point made by the two noble Baronesses about some sites. They might say, “Look, there is functionality which is always so high-risk that we do not care what size you are; if you’ve got this functionality, you’re always going to be in”. Again, the rules as drafted at the moment would not allow them to do that; they would have to say, “You need to have this functionality and be of this size. Oh, whoops, by saying that you have to be of this size, we’ve now accidentally caught somebody else who we did not intend to catch”.

I look forward to the Minister’s response, but it seems entirely sensible that we have the widest possible choice. When we come to consider this categorisation under Schedule 11 later on, the draftspeople should be able to say either “You must be this size and have this functionality” or “If you’ve got this functionality, you’re always in” or “If you’re of this size, you’re always in”, and have the widest possible menu of choices. That will achieve the twin objectives which I think everyone who has taken part in the debate wants: the inclusion of high-risk services, no matter their size, and the exclusion of low-risk services, no matter their size—if they are genuinely low risk. That is particularly in respect of the services we have discussed and which the noble Lord, Lord Moylan, has been a very strong advocate for. In trying to do good, we should not end up inadvertently shutting down important information services that people in this country rely on. Frankly, people would not understand it if we said, “In the name of online safety, we’ve now made it so that you cannot access an online encyclopaedia or a map”.

Under Schedule 11 as it is currently worded, it is going to be much harder for the draftspeople to draft categorisation that has the effect of excluding low-risk services. The risk of their inadvertently including them and causing that problem is that much higher. The noble Baroness is giving us a way out and I hope the Minister will stand up and grab the lifeline. I suspect he will not.

Lord Clement-Jones (LD)

My Lords, I welcome the Minister’s Amendment 238A, which I think was in response to the DPRRC report. The sentiment around the House is absolutely clear about the noble Baroness’s Amendment 245. Indeed, she made the case conclusively for the risk basis of categorisation. She highlighted Zoe’s experience and I struggle to understand why the Secretary of State is resisting the argument. She knocked down the nine pins of legal uncertainty and of the argument that this goes broader than illegal content and children’s safety, by reference to Clause 12. The noble Baroness, Lady Finlay, added to the knocking down of those nine pins.

Smaller social media platforms will, on the current basis of the Bill, fall outside category 1. The Royal College of Psychiatrists made it pretty clear that the smaller platforms might be less well moderated and more permissive of dangerous content. It is particularly concerned about the sharing of information about methods of suicide or dangerous eating disorder content. Those are very good examples that it has put forward.

I return to the scrutiny committee again. It said that

“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”

should be adopted. It seems that many small, high-harm services will be excluded unless we go forward on the basis set out by the noble Baroness, Lady Morgan. Services engaged in the kind of breadcrumbing we have talked about during the passage of the Bill would escape the net, while, on the other hand, sites such as Wikipedia, as mentioned by my noble friend, would be swept into it despite being low risk.

I have read the letter from the Secretary of State which the noble Baroness, Lady Morgan, kindly circulated. I cannot see any argument in it why Amendment 245 should not proceed. If the noble Baroness decides to test the opinion of the House, on these Benches we will support her.

Lord Knight of Weymouth (Lab)

My Lords, I have good news and bad news for the Minister. The good news is that we have no problem with his amendments. The bad news, for him, is that we strongly support Amendment 245 from the noble Baroness, Lady Morgan of Cotes, which, as others have said, we think is a no-brainer.

The beauty of the simple amendment has been demonstrated; it just changes the single word “and” to “or”. It is of course right to give Ofcom leeway—or flexibility, as the noble Baroness, Lady Finlay, described it—in the categorisation and to bring providers into the safety regime. What the noble Baroness, Lady Morgan, said about the smaller platforms, the breadcrumbing relating to the Jake Davison case and the functionality around bombarding Zoe Lyalle with those emails told the story that we needed to hear.

As it stands, the Bill requires Ofcom always to be mindful of size. We need to be more nuanced. Having listened to the noble Lord, Lord Allan of Hallam—with his, as ever, more detailed analysis of how things work in practice—my concern is that in the end, if it is all about size, Ofcom will have to bring a much larger number of services into scope under the size-based categorisation in order to cover all the platforms that it is worried about. If we could give flexibility around size or functionality, that would make the job considerably easier.

We on this side think categorisation should happen with a proportionate, risk-based approach. We think the flexibility should be there, and the Minister is reasonable—come on, what’s not to like?

Lord Parkinson of Whitley Bay (Con)

My Lords, I shall explain why the simple change of one word is not as simple as it may at first seem. My noble friend’s Amendment 245 seeks to amend the rule that a service must meet both a number-of-users threshold and a functionality threshold to be designated as category 1 or 2B. It would instead allow the Secretary of State by regulation to require a service to have to meet only one or other of the two requirements. That would mean that smaller user-to-user services could be so categorised by meeting only a functionality threshold.

In practical terms, that would open up the possibility of a future Secretary of State setting only a threshold condition about the number of users, or alternatively about functionality, in isolation. That would create the risk that services with a high number of users but limited functionality would be caught in scope of category 1. That could be of particular concern to large websites that operate with limited functionality for public interest reasons, and I am sure my noble friend Lord Moylan can think of one that fits that bill. On the other hand, it could capture a vast array of low-risk smaller services merely because they have a specific functionality—for instance, local community fora that have livestreaming capabilities. So we share the concerns of the noble Lord, Lord Allan, but come at it from a different perspective from him.

My noble friend Lady Morgan mentioned the speed of designation. The Bill’s approach to the pace of designation for the category 1 watchlist and register is flexible—deliberately so, to allow Ofcom to act as quickly as is proportionate to each emerging service. Ofcom will have a duty proactively to identify, monitor and evaluate emerging services, which will afford it early visibility when a service is approaching the category 1 threshold. It will therefore be ready to act accordingly to add services to the register should the need arise.

The approach set out in my noble friend’s Amendment 245 would not allow the Secretary of State to designate individual services as category 1 if they met one of the threshold conditions. Services can be designated as category 1 only if they meet all the relevant threshold conditions set out in the regulations made by the Secretary of State. That is the case regardless of whether the regulations set out one condition or a combination of several conditions.

The noble Baroness, Lady Finlay, suggested that the amendment would assist Ofcom in its work. Ofcom itself has raised concerns that amendments such as this—to introduce greater flexibility—could increase the risk of legal challenges to categorisation. My noble friend Lady Morgan was part of the army of lawyers before she came to Parliament, and I am conscious that the noble Lord, Lord Clement-Jones, is one as well. I hope they will heed the words of the regulator; this is not a risk that noble Lords should take lightly.

I will say more clearly that small companies can pose significant harm to users—I have said it before and I am happy to say it again—which is why there is no exemption for small companies. The very sad examples that my noble friend Lady Morgan gave in her speech related to illegal activity. All services, regardless of size, will be required to take action against illegal content, and to protect children if they are likely to be accessed by children. This is a proportionate regime that seeks to protect small but excellent platforms from overbearing regulation. However, I want to be clear that a small platform that is a font of illegal content cannot use its size as an excuse for not dealing with it.

Category 1 services are those services that have a major influence over our public discourse online. Again, I want to be clear that designation as a category 1 service is not based only on size. The thresholds for category 1 services will be based on the functionalities of a service as well as the size of the user base. The thresholds can also incorporate other characteristics that the Secretary of State deems relevant, which could include factors such as a service’s business model or its governance. Crucially, Ofcom has been clear that it will prioritise engagement with high-risk or high-impact services, irrespective of their categorisation, to understand their existing safety systems and how they plan to improve them.

17:15
Requiring all companies to comply with the full range of category 1 duties would divert these companies’ resources away from the vital task of tackling illegal content and protecting children, but we are clear that the popularity and characteristics of services can change. To that end, the Government have placed a new duty on Ofcom to identify and publish a list of companies that are close to the category 1 thresholds. That will ensure that Ofcom proactively identifies emerging category 1 companies and is ready to assess and add them to the category 1 register without delay. This tiered approach will be kept under review by both Ofcom and the Government, both as part of the thresholds and as part of the post-legislative review conducted by the Secretary of State.
I am very grateful to noble Lords and Members of another place, as well as groups including the Antisemitism Policy Trust, the Center for Countering Digital Hate, Samaritans and Kick It Out for their tireless work on this issue. I hope that explains to my noble friend why we cannot support her amendment. I hope that she will not press it, but if she does the rest of these Benches will oppose it and the Government cannot accept adding it to the Bill.
Amendment 238A agreed.
Amendments 238B to 238E
Moved by
238B: Clause 201, page 169, line 6, leave out “74(3)(b)” and insert “(“Regulations by OFCOM about qualifying worldwide revenue etc”)(2)”
Member’s explanatory statement
This amendment provides that regulations made by OFCOM about supporting evidence to be supplied by providers for the purposes of Part 6 of the Bill (fees) are subject to the negative Parliamentary procedure.
238C: Clause 201, page 169, line 6, at end insert—
“(ba) regulations under section 77,”
Member’s explanatory statement
This amendment provides that regulations made by the Secretary of State specifying the threshold figure for the purposes of Part 6 of the Bill are subject to the negative Parliamentary procedure.
238D: Clause 201, page 169, line 11, leave out “(1),”
Member’s explanatory statement
This amendment is consequential on the amendment in my name inserting new subsections (7A) and (7B) into this Clause.
238E: Clause 201, page 169, line 13, at end insert—
“(8A) As soon as a draft of a statutory instrument containing regulations under section (“Regulations by OFCOM about qualifying worldwide revenue etc”)(1) or paragraph 5(9) of Schedule 13 (whether alone or with provision under section (“Regulations by OFCOM about qualifying worldwide revenue etc”)(2)) is ready for laying before Parliament, OFCOM must send the draft to the Secretary of State, and the Secretary of State must lay the draft before Parliament.(8B) Immediately after making a statutory instrument containing only regulations under section (“Regulations by OFCOM about qualifying worldwide revenue etc”)(2), OFCOM must send the instrument to the Secretary of State, and the Secretary of State must lay it before Parliament.”
Member’s explanatory statement
This amendment provides for the Secretary of State’s involvement in the Parliamentary procedure to which regulations made by OFCOM under this Bill are subject.
Amendments 238B to 238E agreed.
Amendment 239
Moved by
239: After Clause 201, insert the following new Clause—
“Regulations: consultation and impact assessments
(1) This section applies if the Secretary of State seeks to exercise powers under—(a) section 55 (regulations under section 54),(b) section 195 (powers to amend section 35),(c) section 196 (powers to amend or repeal provisions relating to exempt content or services),(d) section 197 (powers to amend Part 2 of Schedule 1),(e) section 198 (powers to amend Schedules 5, 6 and 7), or(f) paragraph 1 of Schedule 11 (regulations specifying threshold conditions for categories of Part 3 services),or where the Secretary of State intends to direct OFCOM under section 39.(2) The Secretary of State may not exercise the powers under the provisions in subsection (1) unless any select committee charged by the relevant House of Parliament with scrutinising such regulations has—(a) completed its consideration of the draft regulations and accompanying impact assessment provided by the Secretary of State; and(b) reported on their deliberation to the relevant House; and the report of the committee has been debated in that House, or the period of six weeks beginning on the day on which the committee reported has elapsed.”
Lord Stevenson of Balmacara (Lab)

My Lords, this amendment would require the Secretary of State, when seeking to exercise certain powers in the Bill, to provide the relevant Select Committees of both Houses with draft regulations and impact assessments, among other things. I should admit up front that this is a blatant attempt to secure an Online Safety Bill version of what I have called the “Grimstone rule”, established in the international trade Bill a few years ago. Saving his blushes, if the ideas enshrined in the amendment are acceptable to the Government, I hope that the earlier precedent of the “Grimstone rule” would ensure that any arrangements agreed under this amendment would be known in future as the “Parkinson rule”. Flattery will get you many things.

The Bill places a specific consultation requirement on the Government for the fee regime, which we were just talking about, categorisation thresholds, regulations about reports to the NCA, statements of strategic priorities, regulations for super-complaints, and a review of the Act after three years—so a wide range of issues need to be put out for consultation. My context here, which is all-important, is a growing feeling that Parliament’s resources are not being deployed to the full in scrutinising and reviewing the work of the Executive on the one hand and feeding knowledge and experience into future policy on the other. There is continuing concern about the effectiveness of the secondary legislation approval procedures, which this amendment would bear on.

Noble Lords have only to read the reports of the Select Committees of both Houses to realise what a fantastic resource they represent. One has only to have served on a Select Committee to realise what potential also exists there. In an area of rapid technical and policy development, such as the digital world, the need to be more aware of future trends and potential problems is absolutely crucial.

The pre-legislative scrutiny committee report is often quoted here, and it drew attention to this issue as well, recommending

“a Joint Committee of both Houses to oversee digital regulation with five primary functions: scrutinising digital regulators and overseeing the regulatory landscape … scrutinising the Secretary of State’s work into digital regulation; reviewing the codes of practice laid by Ofcom under any legislation relevant to digital regulation … considering any relevant new developments such as the creation of new technologies and the publication of independent research … and helping to generate solutions to ongoing issues in digital regulation”—

a pretty full quiver of issues to be looked at.

I hope that when he responds to this debate, the Minister will agree that ongoing parliamentary scrutiny would be helpful in providing reassurances that the implementation of the regime under the Bill is going as intended, and that the Government would also welcome a system under which Parliament, perhaps through the Select Committees, can contribute to the ways suggested by the Joint Committee. I say “perhaps”, because I accept that it is not appropriate for primary legislation to dictate how, or in what form, Parliament might offer advice in the manner that I have suggested; hence the suggestion embedded in the amendment—which I will not be pressing to a Division—which I call the “Parkinson rule”. Under this, the Minister would agree at the Dispatch Box a series of commitments which will provide an opportunity for enhanced cross-party scrutiny of the online safety regime and an opportunity to survey and report on future developments of interest.

The establishment of the new Department for Science, Innovation and Technology and its Select Committee means that there is a new dedicated Select Committee in the Commons. The Lords Communications and Digital Committee will continue, I hope, to play a vital role in the scrutiny of the digital world, as it has with the online safety regime to date. While it would be for the respective committees to decide their priorities, I hope the Government would encourage the committees in both Houses to respond to their required consultation processes and to look closely at the draft codes of practice, the uses of regulation-making powers and the powers of direction contained in the Bill ahead of the formal processes in both Houses. Of course, it could be a specialist committee if that is what the Houses decide, but there is an existing arrangement under which this “Parkinson rule” could be embedded. I have discussed the amendment with the Minister and with the Bill team. I look forward to hearing their response to the ideas behind the amendment. I beg to move the “Parkinson rule”.

Lord Clement-Jones (LD)

I support the amendment of the noble Lord, Lord Stevenson. Here is an opportunity for the Minister to build a legislative monument. I hope he will take it. The reason I associate myself with it is that the noble Lord, Lord Stevenson—who has been sparing in his quoting of the Joint Committee’s report, compared with mine—referred to it, and it all made very good sense.

The amendment stumbles only in the opinion of the Government, it seems, on the basis that parliamentary committees need to be decided on by Parliament, rather than the Executive. But this is a very fine distinction, in my view, given that the Government, in a sense, control the legislature and therefore could will the means to do this, even if it was not by legislation. A nod from the Minister would ensure that this would indeed take place. It is very much needed. It was the Communications and Digital Committee, I think, that introduced the idea that we picked up in the Joint Committee, so it has a very good provenance.

Lord Hope of Craighead (CB)

My Lords, I offer my support to the amendment. I spent some time arguing in the retained EU law Bill for increased parliamentary scrutiny. My various amendments did not succeed but at the end of the day—on the final day of ping-pong—the Minister, the noble Lord, Lord Callanan, gave certain assurances based on what is in Schedule 5 to that Act, as it now is, involving scrutiny through committees. So the basic scheme which my noble kinsman has proposed is one which has a certain amount of precedent—although it is not an exact precedent; what might have been the “Callanan rule” is still open to reconstruction as the “Parkinson rule”. I support the amendment in principle.

Baroness Stowell of Beeston (Con)

My Lords, as the noble Lords, Lord Stevenson and Lord Clement-Jones, have already said, the Communications and Digital Select Committee did indeed recommend a new Joint Committee of both Houses to look specifically at the various different aspects of Ofcom’s implementation of what will be the Online Safety Act and ongoing regulation of digital matters. It is something I still have a lot of sympathy for. However, there has not been much appetite for such a Joint Committee at the other end of the Corridor. I do not necessarily think we should give up on that, and I will come back to it in a moment, but in its place I am not keen on what is proposed in Amendment 239, because my fear about how it is laid out is that it introduces something that appears a bit too burdensome and would probably cause too much delay in implementation.

To return to the bigger question, I think that we as parliamentarians need to reflect on our oversight of regulators, to which we are delegating significant new powers and requiring them to adopt a much more principles-based approach to regulation to cope with the fast pace of change in the technological world. We have to reflect on whether our current set-up is adequate for the way in which that is changing. What I have in mind is very much a strategic level of oversight, rather than scrutinising operational decisions, although, notwithstanding what the noble Lord has said, something specific in terms of implementation of the Bill and other new legislation is an area I would certainly wish to explore further.

The other aspect of this is making sure that our regulators keep pace too, not just with technology, and apply the new powers we give them in a way which meets our original intentions, but with the new political dynamics. Earlier today in your Lordships’ Chamber, there was a Question about how banks are dealing with political issues, and that raises questions about how the FCA is regulating the banking community. We must not forget that the Bill is about regulating content, and that makes it ever more sensitive. We need to keep reminding ourselves about this; it is very new and very different.

As has been acknowledged, there will continue to be a role for the Communications and Digital Select Committee, which I have the great privilege of chairing, in overseeing Ofcom. My noble friend Lord Grade and Dame Melanie Dawes appeared before us only a week ago. There is a role for the SIT Committee in the Commons; there is probably also some kind of ongoing role for the DCMS Select Committee in the Commons too, though I am not sure. In a way, the fractured nature of that oversight makes it all the more critical that we join up a bit more. So I will take it upon myself to give this more thought and speak to the respective chairs of those committees in the other place, but I think that at some point we will need to consider, in some other fora, the way in which we are overseeing the work of regulators.

At some point, I think we will need to address the specific recommendations in the pre-legislative committee’s report, which were very much in line with what my own committee thought was right for the future of digital regulatory oversight, but on this occasion, I will not be supporting the specifics of Amendment 239.

Baroness Fox of Buckley (Non-Afl)

My Lords, very briefly, I was pleased to see this, in whatever form it takes, because as we finish off the Bill, one thing that has come up consistently is that some of us have raised problems of potential unintended consequences, such as whether age gating will lead to a huge invasion of the privacy of adults rather than just narrowly protecting children, or whether the powers given to Ofcom will turn it into the most important and powerful regulator in the country, if not in Europe. In a highly complex Bill, is it possible for us to keep our eye on it a bit more than just by whingeing on the sidelines?

The noble Baroness, Lady Stowell, makes a very important point about the issue in relation to the FCA and banking. Nobody intended that to be the outcome of PEPs, for example, and nobody intended when they suggested encouraging banks to have values such as ESG or EDI—equality, diversity and inclusion—that that would lead to ordinary citizens of this country being threatened with having their banking turned off. It is too late to then retrospectively say, “That wasn’t what we ever intended”.

17:30
Straightforwardly, from the point of view of scrutiny, I hope we do not say that it will be left up to arm’s-length regulators and do not look at it again. On my consistent concerns about free speech being threatened by this Bill, you can come back and say to me, “Oh, you were wrong, Lady Fox”, but you can say that only if we have a very clear view that Ofcom is not behaving in a way that is going to damage the freedom of expression rights of people in this country.
Lord Kamall (Con)

I associate myself with the comments of my noble friend Lady Stowell on this whole issue, and I refer to my register of interests. One question we should be asking, which goes wider than this Bill, is: who regulates the regulators? It is a standard problem in political science, often known as principal-agent theory, whereby principals delegate powers to agents for many reasons, and you see agency slack, whereby the agents develop their own powers beyond what was perhaps originally intended. For that reason, I completely associate myself with my noble friend Lady Stowell’s comments—and not because she chairs a committee on which I sit and I hope to be granted the favour of more speaking time on it. It is simply because, on its merits, we should all be asking who regulates the regulators and making sure that they are accountable. We are asking the same question of the Secretary of State, and quite rightly so: the Secretary of State should be accountable for any measures they propose, but we should also be asking it of regulators.

Lord Parkinson of Whitley Bay (Con)

My Lords, I have always felt rather sorry for the first Viscount Addison, because what we refer to as the Salisbury convention is really the Salisbury-Addison convention. So while I am grateful to the noble Lord, Lord Stevenson, for his flattering speech, I shall insist on calling it the “Parkinson-Stevenson rule”, not least in the hope that that mouthful will encourage people to forget its name more swiftly.

I am grateful to the noble Lord for his attention to this matter and the useful discussions that we have had. His Amendment 239 would go beyond the existing legislative process for the delegated powers in the Bill by providing for parliamentary committees to be, in effect, inserted into the secondary legislative process. The delegated powers in the Bill are crucial for implementing the regime effectively and for ensuring that it keeps pace with changes in technology. Regulation-making powers are an established part of our legislative practice, and it would not be appropriate to deviate from existing processes.

However, I agree that ongoing parliamentary scrutiny of the regime will be crucial in helping to provide noble Lords and Members in another place with the reassurance that the implementation of the regime is as we intended. As the noble Lord noted, the establishment of the Science, Innovation and Technology Select Committee in another place means that there is a new dedicated committee looking at this important area of public policy. That provides an opportunity for cross-party scrutiny of the online safety regime and broader issues. While it will be, as he said, for respective committees to decide their priorities, we welcome any focus on online safety, and certainly welcome committees in both Houses co-operating effectively on this matter. I am certain that the Communications and Digital Committee of your Lordships’ House will continue to play a vital role in the scrutiny of the online safety regime.

We would fully expect these committees to look closely at the codes of practice, the uses of regulation-making powers and the powers of direction in a way that allows them to focus on key issues of interest. To support that, I can commit that the Government will do two things. First, where the Bill places a consultation requirement on the Government, we will ensure that the relevant committees have every chance to play a part in that consultation by informing them that the process is open. Secondly, while we do not wish to see the implementation process delayed, we will, where possible, share draft statutory instruments directly with the relevant committees ahead of the formal laying process. These timelines will be on a case-by-case basis, considering what is appropriate and reasonably practical. It will be for the committees to decide how they wish to engage with the information that we provide, but it will not create an additional approval process to avoid delaying implementation. I am grateful to my noble friend Lady Stowell of Beeston for her words of caution and wisdom on that point as both chairman of your Lordships’ committee and a former Leader of your Lordships’ House.

I hope that the noble Lord will be satisfied by what I have set out and will be willing to withdraw his amendment so that our rule might enter into constitutional history more swiftly.

Lord Stevenson of Balmacara (Lab)

I am very grateful to everyone who has contributed to the debate, despite my injunction that no one was to speak other than those key persons—but it was nice to hear views around the House in support of this proposal, albeit with caution. The noble Baroness, Lady Stowell, was right to be clear that we have to be focused on where we are going on this; there is quite a lot at stake here, and it is a much bigger issue than simply this Bill and these particular issues. Her willingness to take this on in a wider context is most welcome, and I look forward to hearing how that goes. I am also very grateful for the unexpected but very welcome support from the noble Baroness, Lady Fox. It was nice that she finally agreed to meet on one piece of territory, if we cannot agree on some of the others. The noble Lord, Lord Kamall, is right to say that we need to pick up the much broader question about who regulates those who regulate us. This is not the answer, but it certainly gets us a step in the right direction.

I was grateful to the Minister for suggesting that the “Parkinson rule” could take flight, but I shall continue to call it by a single name—double-barrelled names are not appropriate here. We will see the results of that in the consultation; the things that already have to be consulted about will be offered to the committees, and it is up to them to respond on that, but it is a very good start. The idea that drafts and issues that are being prepared for future regulation will be shown ahead of the formal process is exactly where I wanted to be on this, so I am very grateful for that. I withdraw the amendment.

Amendment 239 withdrawn.
Amendment 239A not moved.
Clause 74: Duty to notify OFCOM
Amendments 239B to 239E
Moved by
239B: Clause 74, page 70, line 3, leave out from “information” to end of line 5 and insert “as required by regulations made by OFCOM under section (“Regulations by OFCOM about qualifying worldwide revenue etc”).”
Member’s explanatory statement
This amendment omits a reference to regulations made by the Secretary of State. Details about supporting evidence etc to accompany providers’ notifications for the purposes of the fees regime are now to be contained in regulations made by OFCOM (see the new Clause 76 proposed in my name).
239C: Clause 74, page 70, line 6, leave out subsection (4) and insert—
“(4) Section (“Regulations by OFCOM about qualifying worldwide revenue etc”) confers power on OFCOM to make regulations about the determination of a provider’s qualifying worldwide revenue, and the meaning of “qualifying period”, for the purposes of this Part.”
Member’s explanatory statement
This amendment is a signpost to the new Clause 76 proposed in my name, conferring power on OFCOM to make regulations about the meaning of qualifying worldwide revenue and qualifying period for the purposes of the fees regime.
239D: Clause 74, page 70, line 11, leave out “threshold figure under section 77 is published” and insert “regulations under section 77 come into force (first threshold figure)”
Member’s explanatory statement
This amendment is consequential on the first amendment of Clause 77 in my name (threshold figure now to be specified in regulations made by the Secretary of State).
239E: Clause 74, page 70, line 29, leave out subsection (11)
Member’s explanatory statement
This amendment omits a provision about procedure for regulations made by the Secretary of State under subsection (3)(b). That is no longer required because details about supporting evidence etc to accompany providers’ notifications for the purposes of the fees regime are now to be contained in regulations made by OFCOM (see the new Clause 76 proposed in my name).
Amendments 239B to 239E agreed.
Clause 76: OFCOM’s statement about “qualifying worldwide revenue” etc
Amendment 239F
Moved by
239F: Clause 76, leave out Clause 76 and insert the following new Clause—
“Regulations by OFCOM about qualifying worldwide revenue etc
(1) For the purposes of this Part, OFCOM may by regulations make provision—(a) about how the qualifying worldwide revenue of a provider of a regulated service is to be determined, and(b) defining the “qualifying period” in relation to a charging year.(2) OFCOM may by regulations also make provision specifying or describing evidence, documents or other information that providers must supply to OFCOM for the purposes of section 74 (see subsection (3)(b) of that section), including provision about the way in which providers must supply the evidence, documents or information.(3) Regulations under subsection (1)(a) may provide that the qualifying worldwide revenue of a provider of a regulated service (P) who is a member of a group during any part of a qualifying period is to include the qualifying worldwide revenue of any entity that—(a) is a group undertaking in relation to P for all or part of that period, and(b) receives or is due to receive, during that period, any amount referable (to any degree) to a regulated service provided by P.(4) Regulations under subsection (1)(a) may, in particular—(a) make provision about circumstances in which amounts do, or do not, count as being referable (to any degree) to a regulated service for the purposes of the determination of the qualifying worldwide revenue of the provider of the service or of an entity that is a group undertaking in relation to the provider;(b) provide for cases or circumstances in which amounts that—(i) are of a kind specified or described in the regulations, and(ii) are not referable to a regulated service,are to be brought into account in determining the qualifying worldwide revenue of the provider of the service or of an entity that is a group undertaking in relation to the provider.(5) Regulations which make provision of a kind mentioned in subsection (3) may include provision that, in the case of an entity that is a group undertaking in relation to a provider for part (not all) of a qualifying period, only amounts relating to the part of the qualifying period for which the entity was a group undertaking may be brought into account in determining the entity’s qualifying worldwide revenue.(6) Regulations under subsection (1)(a) may make provision corresponding to paragraph 5(8) of Schedule 13.(7) Before making regulations under subsection (1) OFCOM must consult—(a) the Secretary of State,(b) the Treasury, and(c) such other persons as OFCOM consider appropriate.(8) Before making regulations under subsection (2) OFCOM must consult the Secretary of State.(9) Regulations under this section may make provision subject to such exemptions and exceptions as OFCOM consider appropriate.(10) In this section—“group” means a parent undertaking and its subsidiary undertakings, reading those terms in accordance with section 1162 of the Companies Act 2006;“group undertaking” has the meaning given by section 1161(5) of that Act.”
Member’s explanatory statement
This amendment substitutes Clause 76, which is about what is meant by “qualifying worldwide revenue”. The new Clause provides for OFCOM to make regulations about this and related matters for the purposes of the fees regime, and allows the regulations (among other things) to provide that revenue arising to certain entities in the same group as a provider of a regulated service is to be brought into account.
Amendment 239F agreed.
Clause 77: Threshold figure
Amendments 239G to 239M
Moved by
239G: Clause 77, page 72, line 2, leave out from “must” to “the” in line 3 and insert “make regulations specifying”
Member’s explanatory statement
This amendment provides that the Secretary of State must specify the threshold figure in regulations (rather than in a published statement).
239H: Clause 77, page 72, line 4, leave out subsection (3)
Member’s explanatory statement
This amendment is consequential on the first amendment of this Clause in my name.
239J: Clause 77, page 72, line 11, leave out “to (3)” and insert “and (2)”
Member’s explanatory statement
This amendment is consequential on the preceding amendment of this Clause in my name.
239K: Clause 77, page 72, line 12, leave out “A” and insert “Regulations must provide that a”
Member’s explanatory statement
This amendment is consequential on the first amendment of this Clause in my name.
239L: Clause 77, page 72, line 14, leave out from beginning to “at” and insert “Regulations specifying a threshold figure must be in force”
Member’s explanatory statement
This amendment provides that regulations specifying a threshold figure must be in force at least 9 months before the first charging year for which that figure applies.
239M: Clause 77, page 72, line 17, leave out “threshold figure published” and insert “regulations made”
Member’s explanatory statement
This amendment is consequential on the first amendment of this Clause in my name.
Amendments 239G to 239M agreed.
Clause 79: OFCOM’s fees statements
Amendments 239N and 239P
Moved by
239N: Clause 79, page 73, line 18, leave out from “period”” to end of line 19 and insert “for the purposes of this Part, and”
Member’s explanatory statement
This amendment is consequential on the new Clause 76 proposed in my name.
239P: Clause 79, page 73, line 20, leave out “published in accordance with” and insert “contained in regulations under”
Member’s explanatory statement
This amendment is consequential on the first amendment of Clause 77 in my name (threshold figure now to be specified in regulations made by the Secretary of State).
Amendments 239N and 239P agreed.
Clause 82: General duties of OFCOM under section 3 of the Communications Act
Amendment 240
Moved by
240: Clause 82, page 74, line 25, leave out “presented by content”
Member’s explanatory statement
This amendment ensures that Ofcom is empowered to consider harms presented by features, functionalities, behaviours and the design and operation of services, not just by content.
Baroness Harding of Winscombe (Con)

My Lords, if I may, I shall speak very briefly, in the absence of my noble friend Lady Kidron, and because I am one of the signatories of this amendment, alongside the noble Lord, Lord Stevenson, and the right reverend Prelate the Bishop of Oxford. Amendment 240, together with a number of amendments that we will be debating today, turns on a fundamental issue that we have not yet resolved.

I came in this morning being told that we would be voting on this amendment and that other amendments later today would be consequential—I am a novice at this level of parliamentary procedure, so forgive me if I have got myself confused during the day—but I now understand that my noble friend considers this amendment to be consequential but, strangely, the amendments right at the end of the day are not. I just wanted to flag to the House that they all cover the same fundamental issue of whether harms can be unrelated to content, whether the harms of the online world can be to do with functionality—the systems and processes that drive the addiction that causes so much harm to our children.

It is a fundamental disagreement. I pay tribute to the amount of time the department, the Secretary of State and my noble friend have spent on it, but it is not yet resolved and, although I understand that I should now say that I beg leave to move the amendment formally, I just wanted to mark, with apologies, the necessity, most likely, of having to bring the same issue back to vote on later today.

Lord Parkinson of Whitley Bay (Con)

My Lords, His Majesty’s Government indeed agree that this is consequential on the other amendments, including Amendment 35, which the noble Baroness, Lady Kidron, previously moved at Report. We disagreed with them, but we lost that vote; this is consequential, and we will not force a Division on it.

We will have further opportunity to debate the fundamental issues that lie behind it, to which my noble friend Lady Harding just referred. Some of the amendments on which we may divide later, the noble Baroness, Lady Kidron, tabled after defeating the Government the other day, so we cannot treat them as consequential. We look forward to debating them; I will urge noble Lords not to vote for them, but we will have opportunity to discuss them later.

Amendment 240 agreed.
Amendment 241
Moved by
241: Clause 82, page 74, line 31, leave out “or 3” and insert “, 3 or 3A”
Member’s explanatory statement
Clause 82 is about OFCOM’s general duties. This amendment and the next amendment in my name insert a reference to Chapter 3A, which is the new Chapter containing the new duties imposed by the Clause proposed after Clause 67 in my name.
Amendment 241 agreed.
Amendment 242 not moved.
Amendment 243
Moved by
243: Clause 82, page 75, line 2, leave out “or 3” and insert “, 3 or 3A”
Member’s explanatory statement
See the explanatory statement for the preceding amendment in my name.
Amendment 243 agreed.
Amendment 244 not moved.
Schedule 11: Categories of regulated user-to-user services and regulated search services: regulations
Amendment 245
Moved by
245: Schedule 11, page 223, line 32, leave out “and” and insert “or”
Baroness Morgan of Cotes (Con)

My Lords, I wish to test the opinion of the House and I beg to move.

17:42

Division 1

Ayes: 196


Labour: 104
Liberal Democrat: 54
Crossbench: 26
Independent: 4
Conservative: 3
Democratic Unionist Party: 3
Green Party: 2

Noes: 183


Conservative: 177
Crossbench: 4
Independent: 2

17:54
Clause 91: Power to require information
Amendments 246 and 247
Moved by
246: Clause 91, page 83, line 14, leave out “(an “information notice”)”
Member’s explanatory statement
This technical amendment is needed because the new notice requiring information in connection with an investigation into the death of a child (see the new Clause proposed after Clause 91 in my name) is also a form of information notice.
247: Clause 91, page 83, line 19, at end insert—
“(b) provide information about the use of a service by a named individual.”
Member’s explanatory statement
This amendment makes it clear that OFCOM have power by notice to require providers to provide information about a particular person’s use of a service.
Amendments 246 and 247 agreed.
Amendment 247A
Moved by
247A: Clause 91, page 83, line 19, at end insert—
“(2A) The power conferred by subsection (1) also includes power to require a person within any of paragraphs (a) to (d) of subsection (4) to take steps so that OFCOM are able to remotely access the service provided by the person, or remotely access equipment used by the service provided by the person, in order to view, in particular—(a) information demonstrating in real time the operation of systems, processes or features, including functionalities and algorithms, used by the service;(b) information generated in real time by the performance of a test or demonstration of a kind required by a notice under subsection (1).”
Member’s explanatory statement
This amendment makes it clear that OFCOM have the power by notice to require a provider of a regulated service (among others) to take steps to allow OFCOM to remotely access the service so that they can view the operation in real time of systems, processes, functionalities and algorithms, and tests and demonstrations.
Lord Parkinson of Whitley Bay (Con)

I beg to move Amendment 247A.

Amendment 247B (to Amendment 247A) not moved.
Amendment 247A agreed.
Amendments 248 to 248C
Moved by
248: Clause 91, page 84, line 2, at end insert—
“(iva) any duty set out in section (Disclosure of information about use of service by deceased child users) (deceased child users),”
Member’s explanatory statement
This amendment mentions the new duties imposed by the Clause proposed after Clause 67 in my name in the Clause that sets out the purposes for which OFCOM may require people to provide information.
248A: Clause 91, page 84, line 12, leave out “section 75 (duty to pay fees)” and insert “Part 6 (fees)”
Member’s explanatory statement
This amendment makes it clear that OFCOM’s powers to gather information in relation to a provider’s qualifying worldwide revenue apply for the purposes of Part 6.
248B: Clause 91, page 84, line 37, leave out “duty” and insert “duties”
Member’s explanatory statement
This amendment is consequential on the new clause proposed to be inserted after Clause 149 in my name expanding OFCOM’s duties to promote media literacy in relation to regulated user-to-user and search services.
248C: Clause 91, page 84, line 38, leave out “duty to promote”
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 149 in my name expanding OFCOM’s duties to promote media literacy in relation to regulated user-to-user and search services.
Amendments 248 to 248C agreed.
Amendment 249
Moved by
249: After Clause 91, insert the following new Clause—
“Information in connection with an investigation into the death of a child
(1) OFCOM may by notice under this subsection require a relevant person to provide them with information for the purpose of—(a) responding to a notice given by a senior coroner under paragraph 1(2) of Schedule 5 to the Coroners and Justice Act 2009 in connection with an investigation into the death of a child, or preparing a report under section (OFCOM’s report in connection with investigation into a death) in connection with such an investigation;(b) responding to a request for information in connection with the investigation of a procurator fiscal into, or an inquiry held or to be held in relation to, the death of a child, or preparing a report under section (OFCOM’s report in connection with investigation into a death) in connection with such an inquiry;(c) responding to a notice given by a coroner under section 17A(2) of the Coroners Act (Northern Ireland) 1959 (c. 15 (N.I.)) in connection with—(i) an investigation to determine whether an inquest into the death of a child is necessary, or(ii) an inquest in relation to the death of a child,or preparing a report under section (OFCOM’s report in connection with investigation into a death) in connection with such an investigation or inquest. (2) The power conferred by subsection (1) includes power to require a relevant person to provide OFCOM with information about the use of a regulated service by the child whose death is under investigation, including, in particular—(a) content encountered by the child by means of the service,(b) how the content came to be encountered by the child (including the role of algorithms or particular functionalities),(c) how the child interacted with the content (for example, by viewing, sharing or storing it or enlarging or pausing on it), and(d) content generated, uploaded or shared by the child.(3) The power conferred by subsection (1) includes power to require a relevant person to obtain or generate information.(4) The power conferred by subsection (1) must be exercised in a way that is proportionate to the purpose mentioned in that subsection.(5) The power conferred by subsection (1) does not include power to require the provision of information in respect of which a claim to legal professional privilege, or (in Scotland) to confidentiality of communications, could be maintained in legal proceedings. (6) Nothing in this section limits the power conferred on OFCOM by section 91.(7) In this section—“inquiry” means an inquiry held, or to be held, under the Inquiries into Fatal Accidents and Sudden Deaths etc. (Scotland) Act 2016 (asp 2);“information” includes documents, and any reference to providing information includes a reference to producing a document (and see also section 92(9));“relevant person” means a person within any of paragraphs (a) to (e) of section 91(4).”
Member’s explanatory statement
This amendment makes it clear that OFCOM have the power to obtain information for the purposes of responding to a notice given to them by a coroner or, in Scotland, a request from a procurator fiscal, in connection with the death of a child, including a power to obtain information from providers about the use of a service by the deceased child.
Amendment 249 agreed.
Clause 92: Information notices
Amendments 250 and 250A
Moved by
250: Clause 92, page 85, line 3, at end insert—
“(A1) A notice given under section 91(1) or (Information in connection with an investigation into the death of a child)(1) is referred to in this Act as an information notice.”
Member’s explanatory statement
This amendment provides that a notice under the new Clause proposed in my name concerning OFCOM’s power to obtain information in connection with an investigation into the death of a child is called an “information notice” (as well as a notice under Clause 91). This ensures that provisions of the Bill that relate to information notices also apply to a notice given under that Clause.
250A: Clause 92, page 85, line 24, leave out “provide the information” and insert “act”
Member’s explanatory statement
This amendment ensures that the duty to comply with an information notice covers the case where a provider is required to take steps to allow OFCOM to remotely access the service.
Amendments 250 and 250A agreed.
Clause 94: Reports by skilled persons
Amendment 250B
Moved by
250B: Clause 94, page 86, line 26, leave out “any” and insert “either”
Member’s explanatory statement
This amendment is consequential on the next amendment of Clause 94 in my name.
Lord Parkinson of Whitley Bay (Con)

My Lords, these amendments are concerned with Ofcom’s powers under Clause 111 to issue notices to deal with terrorism content and child sexual exploitation and abuse content.

I acknowledge the concerns which have been aired about how these powers work with encrypted services. I want to make it clear that the Bill does not require companies to break or weaken encryption, and we have built in strong safeguards to ensure that users’ privacy is protected. Encryption plays an important role online, and the UK supports its responsible use. I also want to make it clear that we are not introducing a blanket requirement for companies to monitor all content for all harms, at all times. That would not be proportionate.

However, given the serious risk of harm to children from sexual abuse and exploitation online, the regulator must have appropriate, tightly targeted powers to compel companies to take the most effective action to tackle such reprehensible illegal activity which is taking place on their services. We must ask companies to do all that is technically feasible to keep children safe, subject to stringent legal safeguards.

The powers in the Bill are predicated on risk assessments. If companies are managing the risks on their platform appropriately, Ofcom will not need to use its powers. As a last resort, however, where there is clear evidence of child sexual abuse taking place on a platform, Ofcom will be able to direct companies either to use, or to make best efforts to develop or source, accredited and accurate technology to identify and remove this illegal content. To be clear, these powers will not enable Ofcom or our law enforcement agencies to obtain any automatic access to the content detected. It is simply a matter of making private companies take effective action to prevent child sexual abuse on their services.

Ofcom must consider a wide range of matters when deciding whether a notice is necessary and proportionate, including the impacts on privacy and freedom of expression of using a particular technology on a particular service. Ofcom will only be able to require the use of technology accredited as highly accurate in detecting illegal child sexual abuse or terrorism content, vastly minimising the risk that content is wrongly identified.

In addition to these safeguards, as a public body, Ofcom is bound through the Human Rights Act 1998 by the European Convention on Human Rights, including Articles 8 and 10. Ofcom has an obligation not to act in a way which unduly interferes with the right to privacy and freedom of expression when carrying out its duties, for which it is held to account.

If appropriate technology does not exist which meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a solution. It is right that we can require technology companies to use their considerable resources and expertise to develop the best possible protections for children in encrypted environments.

Despite the breadth of the existing safeguards, we recognise that concerns remain about these powers, and we have listened to the points that noble Lords raised in Committee about privacy and technical feasibility. That is why we are introducing additional safeguards. I am grateful for the constructive engagement I have had with noble Lords across your Lordships’ House on this issue, and I hope that the government amendments alleviate their concerns.

I turn first to our Amendments 250B, 250C, 250D, 255A, 256A, 257A, 257B, 257C and 258A, which require that Ofcom obtain a skilled person’s report before issuing a warning notice and exercising its powers under Clause 111. This independent expert scrutiny will supplement Ofcom’s own expertise to ensure that it has a full understanding of relevant technical issues to inform its decision-making. That will include issues specific to the service in question, such as its design and relevant factors relating to privacy.

18:00
We are confident that, in addition to Ofcom’s existing routes of evidence-gathering, Amendment 256A will help to provide the regulator with the necessary information to determine whether to issue a notice and the requirements that may be put in place. That will further help Ofcom to issue a notice which is targeted and proportionate.
Ofcom will need to appoint a skilled person and notify the provider about the appointment and the relevant matters to be explored in the report before issuing its final notice. Ofcom will have discretion over what should be included in the report, as this will depend on the specific circumstances. Under Amendments 257A and 257B, Ofcom must also provide a summary of the report to the relevant provider when issuing a warning notice. That will enable the provider to make representations based on Ofcom’s own analysis and that of the skilled person.
I turn now to Amendments 257D, 257E and 257F. We have heard concerns about the impact that scanning technologies could have on journalistic content and sources. Any technology required by Ofcom must be highly accurate in detecting only terrorism content on public channels or only child sexual exploitation and abuse content on public or private channels. So, the likelihood of journalistic content or sources being compromised will be low—but to reassure your Lordships further, we have expanded the matters that Ofcom must consider in its decision-making.
Amendment 257D requires Ofcom to consider the impact that the use of a particular technology on a particular service would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. It builds on the existing safeguards in Clause 113 regarding freedom of expression and privacy. I am grateful to the noble Lord, Lord Stevenson of Balmacara, for his constructive engagement on this issue. I beg to move.
Lord Moylan (Con)

My Lords, I am conscious of the imprecation earlier from the noble Lord, Lord Stevenson of Balmacara, that we keep our contributions short, but I intend to take no notice of it. That is for the very good reason that I do not think the public would understand why we disposed of such a momentous matter as bringing to an end end-to-end encryption on private messaging services as a mere technicality and a brief debate at the end of Report.

It is my view that end-to-end encryption is assumed nowadays by the vast majority of people using private messaging services such as WhatsApp, iMessage and Signal. They are unaware, I think, of the fact that it is about to be taken from them by Clause 111 of the Bill. My amendment would prevent that. It is fairly plain; it says that

“A notice under subsection (1)”


of Clause 111

“may not impose a requirement relating to a service if the effect of that requirement would be to require the provider of the service to weaken or remove end-to-end encryption applied in relation to the service”.

My noble friend says that there is no threat of ending end-to-end encryption in his proposal, but he achieves that by conflating two things—which I admit my own amendment conflates, but I will come back to that towards the end. They are the encryption of platforms and the encryption of private messaging services. I am much less concerned about the former. I am concerned about private messaging services. If my noble friend was serious in meaning that there was no threat to end-to-end encryption, then I cannot see why he would not embrace my amendment, but the fact that he does not is eloquent proof that it is in fact under threat, as is the fact that the NSPCC and the Internet Watch Foundation are so heavily lobbying against my amendment. They would not be doing that if they did not think it had a serious effect.

I shall not repeat at any length the technical arguments we had in Committee, but the simple fact is that if you open a hole into end-to-end encryption, as would be required by this provision, then other people can get through that hole, and the security of the system is compromised. Those other people may not be very nice; they could be hostile state actors—we know of hostile state actors who are well enough resourced to do this—but they could also be our own security services and others, from whom we expect protection. Normally, we do get a degree of protection from those services, because they are required to have some form of warrant or prior approval, but, as I have explained previously in debate on this, these powers being given to Ofcom require no warrant or prior approval in order to be exercised. So there is a vulnerability, but there is also a major assault on privacy. That is the point on which I intend to start my conclusion.

If we reflect for a moment, the evolution of this Bill in your Lordships’ House has been characterised and shaped, to a large extent, by the offer made by the noble Lord, Lord Stevenson of Balmacara, when he spoke at Second Reading, to take a collaborative approach. But that collaborative approach has barely extended to those noble Lords concerned about privacy and freedom of expression. As a result, in my view, those noble Lords rightly promoting child protection have been reckless to the point of overreaching themselves.

If we stood back and had to explain to outsiders that we were today taking away from them the end-to-end encryption and privacy they expect on their private messaging services, together with the security and protection it gives in relation to scams, frauds and all the other things where it has a public benefit, then I think they would be truly outraged. I do not entirely understand how the Government think they could withstand that outrage, were it expressed publicly. I actually believe that the battle for this Bill—this part of this Bill, certainly—is only just starting. We may be coming to the end here, but I do not think that this Bill is settled, because this issue is such a sensitive one.

Given the manifest and widespread lack of support for my views on this question in your Lordships’ House in Committee, I will not be testing the opinion of the House today. I think I know what the opinion of the House is, but it is wrong, and it will have to be revised. My noble friend simply cannot stand there and claim that what he is proposing is proportionate and necessary, because it blatantly and manifestly is not.

Lord Allan of Hallam (LD)

My Lords, the powers in Clause 111 are perhaps the most controversial outstanding issue in the Bill. I certainly agree with the noble Lord, Lord Moylan, that they deserve some continued scrutiny. I suspect that Members of another place are being lobbied on this extensively right now. Again, it is one of the few issues on which that will happen: they may not have heard of the Online Safety Bill, but they will do in the context of this particular measure.

We debated the rights and wrongs of encryption at some length in Committee, and I will not repeat those points today, not least because the noble Lord, Lord Moylan, has made some of the arguments as to why encryption is important. I will instead today focus on the future process, assuming that the Clause 111 powers will be available to Ofcom as drafted and that we are not going to accept the amendment from the noble Lord, Lord Moylan.

Amendments 258 and 258ZA, in my name and that of my noble friend Lord Clement-Jones, both aim to improve the process of issuing a Clause 111 order by adding in some necessary checks and balances.

As we debate this group, we should remember that the Clause 111 powers are not specific to encrypted services—I think the Minister made this point—and we should have the broader context in mind. I often try to bring some concrete scenarios to our discussions, and it may be helpful to consider three different scenarios in which Ofcom might reach for a Clause 111 notice.

The first is where a provider has no particular objections to using technology to identify and remove child sexual exploitation and abuse material or terrorist material but is just being slow to do this. There are mature systems out there. PhotoDNA is very well known in the industry and effectively has a database with digital signatures of known child sexual exploitation material. All the services we use on a daily basis such as Facebook, Instagram and others will check uploaded photos against that database and, where it is child sexual exploitation material, they will make sure that it does not get shown and that those people are reported to the authorities.

I can imagine scenarios where Ofcom is dealing with a service which has not yet implemented the technology—but does not have a problem doing it—and the material is unencrypted so there is no technical barrier; it is just being a bit slow. In those scenarios, Ofcom will tell the service to get on with it or it will get a Clause 111 notice. In those circumstances, in most cases the service will just get on with it, so Ofcom will be using the threat of the notice as a way to encourage the slowcoaches. That is pretty unexceptional; it will work in a pretty straightforward way. I think the most common use of these notices may be to bring outliers into the pack of those who are following best practice. Ofcom may not even need to issue a full notice at all and will not get past the warning notice stage. Waving a warning notice in front of a provider may be sufficient to get it to move.

The second scenario is one where the provider equally does not object to the use of the technology but would prefer to have a notice before it implements it. Outside the world of tech companies, it may seem a little strange that a provider would want to be ordered to do something rather than doing the right thing voluntarily, but we have to remember that the use of this kind of technology is legally fraught in many jurisdictions. There have been court cases in a number of places, not least the European Union, where there are people who will challenge whether you should use this technology on unencrypted services, never mind encrypted ones. In those cases, you can imagine there will be providers, particularly those established outside the United Kingdom, which may say, “Look, we are fine implementing this technology, but Ofcom please can you give us a notice? Then when someone challenges it in court, we can say that the UK regulator made us do it”. That would be helpful to them. This second group will want a notice and here we will get to the point of the notice being issued. They are not going to contest it; they want to have the notice because it gives them some kind of legal protection.

I think those two groups are relatively straightforward: we are dealing with companies which are being slow or are looking for legal cover but do not fundamentally object. The third scenario, though, is the most challenging and it is where I think the Government could get into real trouble. My amendments seek to help the Government in situations where a provider fundamentally objects to being ordered to deploy a particular technology because it believes that that technology will create real privacy threats and risks to the service that it offers. I do not think the provider is being awkward in these circumstances; it has genuine concerns about the implications of the technology being developed or which it is being instructed to deploy.

In these circumstances, Ofcom may have all the reasons in the world to argue why it thinks that what it is asking for is reasonable. However, the affected provider may not accept those reasons and take quite a strong counterview and have all sorts of other arguments as to why what it is being asked to do is unacceptable and too high-risk. This debate has been swirling around at the moment as we think about current models of end-to-end encryption and client-side scanning technology, but we need to recognise that this Bill is going to be around for a while and there may be all sorts of other technologies being ordered to be deployed that we do not even know about and have not even been developed yet. At any point, we may hit this impasse where Ofcom is saying it thinks it is perfectly reasonable to order a company to do it and the service provider is saying, “No, as we look at this, our experts and our lawyers are telling us that this is fundamentally problematic from a privacy point of view”.

18:15
The amendments I have tabled do not stop Ofcom from issuing any kind of order for any kind of technology. In Amendment 258, we are saying that, where there is a disagreement, there should be a point where the public can join the debate. However, I really want to focus on Amendment 258ZA, where we are saying that in those circumstances the provider should have a statutory right to refer it to the Information Commissioner’s Office. We will try to press this to a vote, so I hope people are listening to the argument because I think it is well intended. The right to refer it to the Information Commissioner’s Office is not only the sensible thing to do but will help the Government if they are trying to get a company to deploy a technology. It will help and strengthen their case if they have made it clear that there will be this important check and balance.
As we have discussed a lot through this debate, these are safety/privacy trade-offs. In some cases, it is a real trade-off—more privacy can sometimes compromise safety because people are able to do things in private that are problematic. At the same time, more privacy can sometimes be a benefit for safety if it protects you from fraudsters. But there certainly are occasions where there are genuine safety/privacy trade-offs. In this Bill, we are charging Ofcom with creating the highest level of safety for the people in the UK when they are online. Ofcom will be our safety regulator and that is its overriding duty. I am sure the Minister will argue it also has to think about privacy and other things, but if you look at the Bill in total, Ofcom’s primary responsibility is clear: it is safety, particularly child safety.
We have created the Information Commissioner’s Office as the guardian of our privacy rights and tasked it with enforcing a body of data protection law, so Ofcom is our primary safety regulator and the ICO is our primary privacy regulator. It seems to me entirely sensible and rational to say that, if our safety regulator is ordering a provider to do something on the grounds of safety and the provider thinks it is problematic, the provider should be able to go to our privacy regulator and say, “You two regulators both have a look at this. Both come to a view and, on the basis of that, we can decide what to do”.
That is really what Amendment 258ZA is intended to do. It does not intend to handcuff Ofcom in any way or stop it doing anything it thinks is important. It does not intend to frustrate child safety; it simply intends to make sure that there is a proper balance in place so that, where a provider has genuine concerns, it can go to the regulator that we have charged with being responsible for privacy regulation.
The reason that this is important and why we are spending a little more time on it today is that there is a genuine risk that services being used by millions of people in the United Kingdom could leave. We often focus on some of the more familiar brands such as WhatsApp and others, but we need to remember things such as iMessage. If you use an Apple phone and use iMessage, that is end-to-end encrypted. That is the kind of service that could find it really problematic. Apple has said, “Privacy is all”, and is going to be thinking about the global market. If it was ordered to deploy a technology which it thought was unsafe, Apple would have to think very carefully about being in the UK market.
To say that Apple has a right to go to the ICO and ask for a review is perfectly reasonable and sensible. I suspect that the Minister may try to argue that the skilled person’s concession they have made, which is helpful and material, could involve a review by data protection officials, but that is not the same as getting an authoritative decision from the privacy regulator, the Information Commissioner’s Office. It is helpful and those amendments are welcome; I would say that the skilled person’s report is necessary but far from sufficient in these circumstances where there is that fundamental disagreement.
If I can try to sell it to the Government, if they accept this amendment and Ofcom says it needs this technology to be deployed for safety reasons and the ICO says it has looked at it as the privacy regulator and has no objections, the onus is on the company. Then it looks like the provider, if it chooses to leave the United Kingdom market, is doing so voluntarily because there are no fundamental objections.
If, on the other hand, the ICO says it is problematic, we know then that we need to carry on discussing and debating whether that technology is appropriate and whether the safety/privacy balance has been got right. So, whether you support more scanning of content or are concerned about it, allowing the providers of the services that we all use every day, where they think there is a fundamental threat, to go to our privacy regulator, which we have set up precisely to guard our privacy rights, and ask it for an opinion is not, I think, an excessive ask. I hope that the Government will either accept the amendment or make a commitment that they will bring in something comparable at Third Reading. Absent that, we feel that this is important enough that we should test the opinion of the House in due course.
Baroness Fox of Buckley (Non-Afl)

My Lords, I have put my name to and support Amendment 255, laid by the noble Lord, Lord Moylan, which straightforwardly means that a notice may not impose a requirement relating to a service that would require that provider to weaken or remove end-to-end encryption. It is very clear. I understand that the other amendments introduce safeguards, which is better than nothing. It is not what I would like, but I will support them if they are pushed to a vote. I think that the Government should really consider seriously not going anywhere near getting rid of encryption in this Bill and should reconsider it by the time we get to Third Reading.

As the noble Lord, Lord Moylan, explained, this is becoming widely known about now, and it is causing some concern. If passed, this Bill, as it is at the moment, gives Ofcom far-reaching powers to force services, such as WhatsApp, to install software that would scan all our private messages to see whether there is evidence of terrorism or child sexual exploitation and abuse content and would automatically send a report to third parties, such as law enforcement, if it suspects wrongdoing—all without the consent or control of the sender or the intended recipient.

I would just like to state that encryption is a wonderful innovation. That is why more than 40 million people in the UK use it every day. It ensures that our private messages cannot be viewed, compromised or altered by anyone else, not even the providers of chat services. It would really require somebody handing their messages over to a journalist and saying, “You can have my WhatsApp messages”, for anyone else to read them; beyond that, you cannot read them.

One of the interesting things that we have discussed throughout the passage of the Bill is technologies, their design and functionality, and making sure they are not harmful. Ironically, it is the design and function of encryption that actually helps to keep us safe online. That is why so many people talk about civil libertarians, journalists and brave dissenters using it. For the rest of us, it is a tool to protect our data and private communications in the digital realm. I just want to pose here the irony that the technologies being proposed, in terms of client-side scanning, are the technologies that are potentially harmful, because client-side scanning is, as people have noted, the equivalent of putting video cameras in our homes to listen in to every conversation and send reports to the police if we discuss illicit topics. As I have said before, while child sexual abuse is horrendous and vile, we know that it happens largely in the home and, as yet, the Government have not advocated that we film in everybody’s home in order to stop child sexual abuse. We should do almost anything and everything that we can, but I think this is the wrong answer.

Focusing on encryption just makes no sense. The Government have made exemptions, recognising the importance, in a democracy, of private correspondence: so exempted in the Bill are text messages, MSN, Zoom, oral commentaries and email. It seems perverse to demonise encryption in this way. I also note that there are exemptions for anything sent on message apps by law enforcement or public sector or emergency responders. I appreciate that some government communications are said to be done over apps such as WhatsApp. It seems then that the target of this part of the Bill is UK private citizens and residents and that the public are seen as the people who must be spied on.

In consequence, I do not think it surprising that more than 80 national or international civil society organisations have said that this would make the UK the first liberal democracy to require the routine scanning of people’s private chat messages. What does the Minister say to the legal opinion from the technology barrister Matthew Ryder KC, commissioned by Index on Censorship precisely on this part of the Bill? He compares this to law enforcement wiretapping without a warrant and says that the Bill will grant Ofcom a wider remit of surveillance powers over the public than GCHQ has.

Even if the Minister is not interested in lawyers or civil libertarians, surely we should be listening to the advice of science and technology experts in relation to complex technological solutions. Which experts have been consulted? I noted that Matthew Hodgson, the boss of the encrypted messaging app Element, has said wryly that

“the Government has not consulted with UK tech firms, only with huge multinational corporations and companies that want to sell software that scans messages, who are unsurprisingly telling lawmakers that it is possible to scan messages without breaking encryption”.

The problem is that it is not possible to scan those messages without breaking encryption. It is actually misinformation to say that it is. That is why whole swathes of leading scientists and technologists from across the globe have recently put out an open letter explaining why and how it is not true. They explained that it creates really dangerous side-effects that can be harmful in the way that the noble Lord, Lord Moylan, explained, in terms of security, and makes the online world less safe for many of us. Existing scanning technologies are flawed and ineffective, and scanning will nullify the purpose of encryption. I refer noble Lords to the work of the Internet Society and the academic paper Bugs in Our Pockets: The Risks of Client-Side Scanning for more details on all the peer-reviewed work.

I understand that, given the horrific nature of child sexual abuse—and, of course, terrorism, but I shall concentrate on child sexual abuse because the Bill is so concerned with it—it can be tempting for the Government to hope that there is a technological silver bullet to eradicate it. But the evidence suggests otherwise. One warning from scientists is that scanning billions of pieces of content could lead to millions of false positives and that could not only frame innocent users but could mean that the police become overwhelmed, diverting valuable resources away from real investigations into child sexual abuse.

A study by the Max Planck Institute for the Study of Crime of a similar German law that lasted from 2008 to 2010 found that the German police having access to huge amounts of data did not have any deterrent effect, did not assist in clearing up crimes or increase convictions, but did waste a lot of police time. So it is important that this draconian invasion of privacy is not presented as necessary for protecting children. I share the exasperation of Signal’s president Meredith Whittaker, who challenged the Secretary of State and pointed out that there were some double standards here: for example, slashing early intervention programmes over the past decade did not help protect children, and chronically underfunding and underresourcing child social care does not help.

My own bugbears are that when I, having talked to social workers and colleagues, raised the dangers to child protection when we closed down schools in lockdown, they were brushed to one side. When I and others raised the horrors of the young girls who had been systematically raped by grooming gangs whom the authorities had ignored for many, many years, I was told to stop talking about it. There are real threats to children that we ignore. I do not want us in this instance to use that very emotive discussion to attack privacy.

I also want to stress that there is no complacency here. Law enforcement agencies in the UK already possess a wide range of powers to seize devices and compel passwords, and even covertly to monitor and hack accounts to identify criminal activity. That is good. Crucially, private messaging services can and do work in a wide range of ways, and I am sure they could do more, to tackle abuse and keep people safe without the need to scan or read people’s messages.

18:30
I did listen to the noble Lord, Lord Stevenson, saying, “Don’t speak too long”. Noble Lords will be delighted to know that I did not speak on any other group so that I could make these points—I spoke very briefly to agree with the noble Lord, but that was for one minute. I cannot stress enough that while I have talked a lot about freedom of speech, it is hugely important that we do not jeopardise the public’s privacy online by falsely claiming that it will protect children. It will also see the end of Rishi Sunak’s dream of the UK becoming a technology superpower, in the way that the noble Lord, Lord Allan of Hallam, explained. It is not good for the growth agenda because those organisations—WhatsApp and so on—will leave the UK, but, largely, it is in defence of privacy that I urge noble Lords to support the amendment from the noble Lord, Lord Moylan, or vote for whichever is moved.
Lord Kamall (Con)

My Lords, I rise to speak in favour of my noble friend Lord Moylan’s amendment. Given that I understand he is not going to press it, and while I see Amendment 255 as the ideal amendment, I thank the noble Lords, Lord Stevenson and Lord Clement-Jones, for their Amendments 256, 257 and 259, and the noble Lords, Lord Clement-Jones and Lord Allan of Hallam, for Amendments 258 and 258ZA.

I will try to be as brief as I can. I think about two principles—unintended consequences and the history of technology transfer. The point about technology transfer is that once a technology is used it becomes available to other people quickly, even bad guys, whether that was intended or not. There is obviously formal technology transfer, where you have agreement or knowledge transfer via foreign investment, but let us think about the Cold War and some of the great technological developments—atomic secrets, Concorde and the space shuttle. In no time at all, the other side had that access, and that was before the advent of the internet.

If we are to open a door for access to encrypted messages, that technology will be available to the bad guys in no time at all, and they will use it against dissidents, many of whom will be in contact with journalists and human rights organisations in this country and elsewhere. Therefore, the unintended consequence may well be that, in seeking to protect children in this country by accessing encrypted or unencrypted messages, we damage the childhoods of children in other countries when their parents, who are dissidents, are suddenly taken away and maybe the whole family is wiped out. Let us be careful about those unintended consequences.

I also welcome my noble friend Lord Parkinson’s amendments about ensuring journalistic integrity, such as Amendment 257D and others. They are important. However, we must remember that once these technologies are available, everyone has a price and that technology will be transferred to the bad guys.

Given that my noble friend Lord Moylan will not press Amendment 255, let us talk about some of the other amendments—I will make some general points rather than go into specifics, as many noble Lords have raised these points. These amendments are sub-optimal, but at least they provide some accountability for Ofcom’s use of this power and for its using it sensibly and proportionately. One of the things that has run throughout this Bill and other Bills is the question of who regulates the regulators, and ensuring that regulators are accountable. The amendments proposed by the noble Lords, Lord Stevenson and Lord Clement-Jones, and by the noble Lords, Lord Clement-Jones and Lord Allan of Hallam, go some way towards ensuring that safeguards are in place. If the Government are not prepared to make an explicit statement that they will not allow access to encrypted messages, I hope that there will be some support for the noble Lords’ amendments.

Baroness Harding of Winscombe (Con)

My Lords, I promise to speak very briefly. I welcome the Government’s amendments. I particularly welcome that they appear to mirror partly some of the safeguards that are embedded in the Investigatory Powers Act 2016.

I have one question for my noble friend the Minister about the wording, “a skilled person”. I am worried that “a skilled person” is a very vague term. I have been taken all through the course of this Bill by the comparison with the Investigatory Powers Act and the need to think carefully about how we balance the importance of privacy with the imperative of protecting our children and being able to track down the most evil and wicked perpetrators online. That is very similar to the debates that we had here several years ago on the Investigatory Powers Act.

The IPA created the Technical Advisory Board. It is not a decision-making body. Its purpose is to advise the Investigatory Powers Commissioner and judicial commissioners on the impact of changing technology and the development of techniques to use investigatory powers while maintaining privacy. It is an expert panel constituted to advise the regulator—in this case, the judicial commissioner—specifically on technology interventions that must balance this really difficult trade-off between privacy and child protection. Why have we not followed the same recipe? Rather than having a skilled person, why would we not have a technology advisory panel of a similar standing, where it is clear to all who the members are? Those members would be required to produce a regular report. It might not need to be as regular as the IPA one, but it would just take what the Government have already laid one step further towards institutionalising the independent check that is really important if these Ofcom powers were ever to be used.

Baroness Stowell of Beeston (Con)

My Lords, I added my name to some amendments on this issue in Committee. I have not done so on Report, not least because I have been so occupied with other things and have not had the time to focus on this. However, I remain concerned about this part of the Bill. I am sympathetic to my noble friend Lord Moylan’s Amendment 255, but listening to this debate and studying all the amendments in this group, I am a little confused and so have some simple questions.

First, I heard my noble friend the Minister say that the Government have no intention to require the platforms to carry out general monitoring, but is that now specific in any of the amendments that he has tabled? Regarding the amendments which would bring further safeguards around the oversight of Ofcom’s use of this power, like my noble friend Lady Harding, I have always been concerned that the oversight approach should be in line with that for the Investigatory Powers Act and could never understand why it was not in the original version of the Bill. Like her, I am pleased that the Government have tabled some amendments, but I am not yet convinced that they go far enough.

That leads me to the amendments that have been tabled by the noble Lords, Lord Stevenson and Lord Clement-Jones, and particularly that in the name of the noble Lord, Lord Allan of Hallam. As his noble friend Lord Clement-Jones has added his name to it, perhaps he could answer my question when he gets up. Would the safeguards that are outlined there—the introduction of the Information Commissioner—meet the concerns of the big tech companies? Do we know whether it would meet their needs and therefore lead them not to feel it necessary to withdraw their services from the UK? I am keen to understand that.

There is another thing that might be of benefit for anyone listening to this debate who is not steeped in the detail of this Bill, and I look to any of those winding up to answer it—including my noble friend the Minister. Is this an end to end-to-end encryption? Is that what is happening in this Bill? Or is this about ensuring that what is already permissible in terms of the authorities being able to use their powers to go after suspected criminals is somehow codified in this Bill to make sure it has proper safeguards around it? That is still not clear. It would be very helpful to get that clarity from my noble friend, or others.

Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow the noble Baroness, Lady Stowell. My noble friend has spoken very cogently to Amendment 258ZA, and I say in answer to the question posed by the noble Baroness that I do not think this is designed to make big tech companies content. What it is designed to do is bring this out into the open and make it contestable; to see whether or not privacy is being invaded in these circumstances. To that extent it airs the issues and goes quite a long way towards allaying the concerns of those 80 organisations that we have heard from.

I am not going to repeat all the arguments of my noble friend, but many noble Lords, not least on the opposite Benches, have taken us through some of the potential security and privacy concerns which were also raised by my noble friends, and other reasons for us on these Benches putting forward these amendments. We recognise those concerns, and indeed we recognise concerns on both sides. We have all received briefs from the NSPCC and the IWF, but I do not believe that what is essentially being proposed here in our amendments, or indeed by the amendments put forward by the noble Lord, Lord Stevenson, is designed in any way to prevent Ofcom doing its duty in relation to child sexual abuse and exploitation material in private messaging. We believe that review by the ICO to ensure that there is no invasion of privacy is a very useful mechanism.

We have all tried to find solutions, and the Minister has put forward his stab at this with the skilled person’s report. The trouble is that it does not go far enough, as the noble Baroness, Lady Stowell, said. Effectively, Ofcom can choose the skilled person and what the skilled person is asked to advise on. It is not necessarily comprehensive, and that is essentially the major flaw.

As regards the amendments put forward by the noble Lord, Lord Stevenson, it is interesting that the Equality and Human Rights Commission itself said:

“We are concerned by the extent and seriousness of CSEA content being shared online. But these proposed measures may be a disproportionate infringement on millions of individuals’ right to privacy where those individuals are not suspected of any wrongdoing”.


It goes on to say:

“We recommend that Ofcom should be required to apply to an independent judicial commissioner—as is the case for mass surveillance under the Investigatory Powers Act”.


I am sure that is the reason why the noble Lord, Lord Stevenson, put forward his amendments; if he put them to a vote, we would follow and support. Otherwise, we will put our own amendments to the House.

Lord Stevenson of Balmacara (Lab)

My Lords, this has been, since we first got sight of the Bill and right the way through, one of the most difficult issues on which to find a balance and a solution. I know that people have ridiculed my attempt to get people to speak less on earlier amendments. Actually, in part it was so we could have a longer debate here—so the noble Lord, Lord Moylan, should not be so cross with me, and I hope that we can continue to be friends, as we are outside the Chamber, on all points, not just this one.

Talk is not getting us to a solution on this, unfortunately. I say to the Minister: I wonder whether there is a case here for pausing a little bit longer on this, because I still do not think we have got to the bottom of where the balance lies. I want to explain why I say that, because, in a way, I follow the noble Baroness, Lady Stowell, in worrying that there are some deeper questions here that we have not quite got the answers to. Nothing in the current amendments gets us to quite the right place.

I started by thinking that, if only because Ofcom was being seen to be placed in a position of both being a part of the regulatory process, but also having the rights to interpose itself into where this issue about encryption came up, Ofcom needed the safety of an external judicial review along the lines of the current RIPA system. That has led us to my Amendments 256, 257 and 259, which try to distil that sensibility into a workable frame for the Bill and these issues. I will not push it to a vote. It is there because I wanted to have in the discussion a proper look at what the RIPA proposal would look like in practice.

18:45
In the intervening period between Committee and today, the Government have also laid their amendments, which the noble Lord, Lord Parkinson, has introduced very well. I like a lot of them; they go a long way down the track to where we want to be on this. I particularly like the protection for journalistic content, which I think was lacking before. The way of introducing the skilled person into it, and the obtaining of an externally sourced report—even if it is internally commissioned—does bring in an outside voice which will be helpful. The addition of privacy to that makes sure that the issues addressed will be the ones that need to be bottomed out before any decisions are taken.
It is helpful to state—although it is repeated and implicit in the amendments, it is nice to see it again—that this is not about Ofcom as the regulator looking at people’s messages. It is clearly about making sure that there is the capacity to investigate criminality where it is clearly suspected and where evidence exists for that, and that the companies themselves have the responsibility for taking the action as a result. That is a good place to be, and I think it is right. The difficulty is that I do not think that that sensibility quite takes the trick about what society as a whole should expect from the regulator in relation to this particular activity. It leaves open the question of how much external supervision there would really be.
If we accept that it is not about Ofcom reading private messages or general monitoring—which I note the Minister has confirmed today; he was a bit reluctant to do so in Committee, but I am grateful to him for repeating it today—where is it in the statute? That is a good question: perhaps we could have an answer to it, because I do not think it does appear. It is important to reassure people that there is nothing in this set of proposals, and nothing in the Bill, that requires Ofcom to generally survey the material which is passing through the system. I still think people are concerned that this is about breaking end-to-end encryption, and that this is an attack on privacy which would be a very bad step. If that remains the situation and the Government have failed to convince, that therefore reinforces my suggestion to the Minister that at the end of this debate he might feel it necessary to spend a little more time and discussion trying to get us to where we want to go.
I am very grateful to those who have suggested that our amendments are the right way to go. As I have said, I will not be pushing them—the reasons being that I think they go a little too far, but a little more of that would not be a bad thing. The Government are almost there with that, but I think a bit more time, effort and concern about some of the suggestions would probably get us to a better place than we are at the moment. I particularly think that about the suggestions from the noble Baroness, Lady Harding, about taking the lessons from what has happened in other places and trying to systematise them, so that it is clear that there are external persons and we know who they are, what their backgrounds are and what their roles will be. I look forward to hearing from the Minister when he comes to respond, but, just for confirmation, I do not think this is the appropriate place to vote, and should a vote be called, we will be abstaining.
Lord Parkinson of Whitley Bay (Con)

I am grateful to noble Lords for their further scrutiny of this important but complex area, and for the engagement that we have had in the days running up to it as well. We know how child sexual exploitation and abuse offenders sadly exploit private channels, and the great danger that this poses, and we know how crucial these channels are for secure communication. That is why, where necessary and proportionate, and where all the safeguards are met, it is right that Ofcom can require companies to take all technically feasible measures to remove this vile and illegal content.

The government amendments in this group will go further to ensure that a notice is well informed and targeted and does not unduly restrict users’ rights. Privacy and safety are not mutually exclusive—we can and must have both. The safety of our children depends on it.

I make it clear again that the Bill does not require companies to break or weaken end-to-end encryption on their services. Ofcom can require the use of technology on an end-to-end encrypted service only when it is technically feasible and has been assessed as meeting minimum standards of accuracy. When deciding whether to issue a notice, Ofcom will engage in continual dialogue with the company and identify reasonable, technically feasible solutions to the issues identified. As I said in opening, it is right that we require technology companies to use their considerable resources and expertise to develop the best possible protections to keep children safe in encrypted environments. They are well placed to innovate to find solutions that protect both the privacy of users and the safety of children.

Baroness Stowell of Beeston (Con)

Just to be clear, am I right to understand my noble friend as saying that there is currently no technology that would be technically acceptable for tech companies to use to do what is being asked of them? Did he say that tech companies should be looking to develop the technology to do what may be required of them, but that it is not currently available to them?

Lord Moylan (Con)

For clarification, if the answer to that is that the technology does not exist—which I believe is correct, although there are various snake oil salespeople out there claiming that it does, as the noble Baroness, Lady Fox of Buckley, said—my noble friend seems to be saying that the providers and services should develop it. This seems rather circular, as the Bill says that they must adopt an approved technology, which suggests a technology that has been imposed on them. What if they cannot and still get such a notice? Is it possible that these powers will never be capable of being used, especially if they do not co-operate?

Lord Parkinson of Whitley Bay (Con)

To answer my noble friend Lady Stowell first, it depends on the type of service. It is difficult to give a short answer that covers the range of services that we want to ensure are covered here, but we are seeking to keep this and all other parts of the Bill technology neutral so that, as services develop, technology changes and criminals, unfortunately, seek to exploit that, technology companies can continue to innovate to keep children safe while protecting the privacy of their users. That is a long-winded answer to my noble friend’s short question, but necessarily so. Ofcom will need to make its assessments on a case-by-case basis and can require a company to use its best endeavours to innovate if no effective and accurate technology is currently available.

While I am directing my remarks towards my noble friend, I will also answer a question she raised earlier on general monitoring. General monitoring is not a legally defined concept in UK law; it is a term in European Union law that refers to the generalised monitoring of user activity online, although its parameters are not clearly defined. The use of automated technologies is already fundamental to how many companies protect their users from the most abhorrent harms, including child sexual abuse. It is therefore important that we empower Ofcom to require the use of such technology where it is necessary and proportionate and ensure that the use of these tools is transparent and properly regulated, with clear and appropriate safeguards in place for users’ rights. The UK’s existing intermediary liability regime remains in place.

Amendment 255 from my noble friend Lord Moylan seeks to prevent Ofcom imposing any requirement in a notice that would weaken or remove end-to-end encryption. He is right that end-to-end encryption should not be weakened or removed. The powers in the Bill will not do that. These powers are underpinned by proportionality and technical feasibility; if it is not proportionate or technically feasible for companies to identify child sexual exploitation and abuse content on their platform while upholding users’ right to privacy, Ofcom cannot require it.

I agree with my noble friend and the noble Baroness, Lady Fox, that encryption is a very important and popular feature today. However, with technology evolving at a rapid rate, we cannot accept amendments that would risk this legislation quickly becoming out of date. Naming encryption in the Bill would risk that happening. We firmly believe that the best approach is to focus on strong safeguards for upholding users’ rights and ensuring that measures are proportionate to the specific situation, rather than on general features such as encryption.

The Bill already requires Ofcom to consider the risk that technology could result in a breach of any statutory provision or rule of law concerning privacy and whether any alternative measures would significantly reduce the amount of illegal content on a service. As I have said in previous debates, Ofcom is also bound by the Human Rights Act not to act inconsistently with users’ rights.

Baroness Fox of Buckley (Non-Afl)

Will the Minister write to noble Lords who have been here in Committee and on Report in response to the fact that it is not just encryption companies saying that the demands of this clause will lead to the breaching of encryption, even though the Minister and the Government keep saying that it will not? As I have indicated, a wide range of scientists and technologists are saying that, whatever is said, demanding that Ofcom insists that technology notices are used in this way will inadvertently lead to the breaking of encryption. It would be useful if the Government at least explained scientifically and technologically why those experts are wrong and they are right.

Lord Parkinson of Whitley Bay (Con)

I am very happy to put in writing what I have said from the Dispatch Box. The noble Baroness may find that it is the same, but I will happily set it out in further detail.

I should make it clear that the Bill does not permit law enforcement agencies to access information held on platforms, including access to private channels. The National Crime Agency will be responsible for receiving reports from in-scope services via secure transmission, processing these reports and, where appropriate, disseminating them to other UK law enforcement bodies and our international counterparts. The National Crime Agency will process only information provided to it by the company; where it determines that the content is child sexual abuse content and meets the threshold for criminality, it can request further information from the company using existing powers.

I am glad to hear that my noble friend Lord Moylan does not intend to divide on his amendment. The restrictions it sets out are not ones we should impose on the Bill.

Amendments 256, 257 and 259 in the name of the noble Lord, Lord Stevenson of Balmacara, require a notice to be approved by a judicial commissioner appointed under the Investigatory Powers Act 2016 and remove Ofcom’s power to require companies to make best endeavours to develop or source new technology to address child sexual exploitation and abuse content.

19:00
The Investigatory Powers Act and this Bill are very different regimes which should not be conflated. The Investigatory Powers Act comprehensively sets out the powers of public bodies, including the UK’s intelligence agencies and police, to access communications and content data, and establishes safeguards and restrictions for that. This Bill, by contrast, is a risk-based regime requiring companies to take responsibility for the harms facilitated by their own services. The powers in Clause 111 can require a company to tackle the huge volume of child sexual exploitation and abuse that is, sadly, manifested on private channels. This does not supplement or overlap with the roles or powers of the security services or the police, which are provided for by the Investigatory Powers Act.
It is right that these two regimes are overseen by two different, independent regulators. It would not be appropriate for the judicial commissioners to oversee Ofcom’s work. More importantly, it is not necessary: the Bill already contains robust safeguards requiring Ofcom to consider large quantities of information to allow for evidence-based decision-making. The government amendments in this group, which I have spoken to, further strengthen those safeguards.
In removing the “best endeavours” power, the noble Lord’s amendments would significantly reduce the capacity for a notice to be flexible and pragmatic. The power allows Ofcom to require companies to innovate and design practical, proportionate solutions that are compatible with their own service. Without this power, Ofcom could be left with the choice of requiring the use of incompatible and ineffective technologies, or doing nothing and allowing child abuse material to continue to proliferate on a service. Removing this power would not be in anyone’s interest.
I turn now to the amendments in the name of the noble Lord, Lord Allan of Hallam, which seek to introduce a public consultation before Ofcom issues a notice, and a review of notices by the Information Commissioner’s Office. I recognise that the aim of Amendment 258 is to provide transparency, but it will not always be appropriate for Ofcom to share the details of a notice with a public audience. There is a high risk that its content and context could be exploited by criminals and used to further illegal activities online. We agree, however, that Ofcom must be as transparent as possible in carrying out its functions. That is why Ofcom must report on its use of Clause 111 powers in an annual report. That will ensure that key facts about Ofcom’s decisions are placed in the public domain. In addition, Ofcom is required by the Communications Act, when carrying out its functions, to have regard to the principles under which regulatory activities should be transparent and accountable.
On Amendment 258ZA, on which the noble Lord said he may test the opinion of your Lordships’ House, while he is right to emphasise the expertise of the Information Commissioner’s Office, I hope he will not seek to divide, because the requirement in his amendment is duplicative. I agree with him that it is important that Ofcom and the ICO work closely together. Ofcom is required to consult the ICO before producing guidance on how it will use its Clause 111 powers, and it may consult the Information Commissioner’s Office before issuing a notice, where necessary—for example, if a company has made representations about a proposed requirement.
Ofcom cannot take any action which breaches data protection and privacy legislation, nor can it require services to do so. That is already set out both in the Bill and in existing regulations. Should services wish a notice to be reviewed, the Bill already provides robust routes of appeal under Clause 151 via the Upper Tribunal, which will consider whether the regulator’s decisions have been made lawfully.
I will touch on the question raised by my noble friend Lady Harding of Winscombe—
Lord Allan of Hallam (LD)

I appreciate the tone of the Minister’s comments very much, but they do not entirely reassure me. There is a debate going on out there: there are people saying, “We’ve got these fabulous technologies that we would like Ofcom to order companies to install” and there are companies saying, “That would be disastrous and break encryption if we had to install them”. That is a dualistic situation where there is a contest going on. My amendment seeks to make sure the conflict can be properly resolved. I do not think Ofcom on its own can ever do that, because Ofcom will always be defending what it is doing and saying “This is fine”. So, there has to be some other mechanism whereby people can say it is not fine and contest that. As I say, in this debate we are ignoring the fact that these disputes are already out there: people saying “We think you should deploy this” and companies saying “It would be disastrous if we did”. We cannot resolve that by just saying “Trust Ofcom”.

Lord Parkinson of Whitley Bay (Con)

To meet the expectation the noble Lord voiced earlier, I will indeed point out that Ofcom can consult the ICO as a skilled person if it wishes to. It is important that we square the circle and look at these issues. The ICO will be able to be involved in the way I have set out as a skilled person.

Before I conclude, I want to address my noble friend Lady Harding’s questions on skilled persons. Given that notices will be issued on a case-by-case basis, and Ofcom will need to look at the specific service design and existing systems of a provider to work out how a particular technology would interact with that design and those systems, a skilled person’s report better fits this process by requiring Ofcom to obtain tailored advice rather than general technical advice from an advisory board. The skilled person’s report will be largely focused on the technical side of Ofcom’s assessment: that is to say, how the technology would interact with the service’s design and existing systems. In this way, it offers something similar to but more tailored than a technical advisory board. Ofcom already has a large and expert technology group, whose role it is to advise policy teams on new and existing technologies, to anticipate the impact of technologies and so on. It already has strong links with academia and with external researchers. A technical advisory board would duplicate that function. I hope that reassures my noble friend that the points she raised have been taken into account.

So I hope the noble Lord, Lord Allan, will not feel the need to divide—

Lord Stevenson of Balmacara (Lab)

Before the Minister finishes, I posed the question about whether, given the debate and issues raised, he felt completely satisfied that we had arrived at the right solution, and whether there was a case for withdrawing the amendment at this stage and bringing it back at Third Reading, having had further discussions and debate where we could all agree. I take it his answer is “no”.

Lord Parkinson of Whitley Bay (Con)

I am afraid it is “no”, and if the noble Lord, Lord Allan, does seek to divide, we will oppose his amendment. I commend the amendments standing in my name in this group to the House.

Amendment 250B agreed.
Amendments 250C to 251
Moved by
250C: Clause 94, page 86, line 34, leave out paragraph (c)
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted in my name after Clause 111. It omits words in Clause 94 (skilled person’s reports) because that new Clause now requires OFCOM to obtain a skilled person’s report before giving a provider a notice under Clause 111.
250D: Clause 94, page 86, line 41, at end insert—
“(2A) Section (Requirement to obtain skilled person’s report) requires OFCOM to exercise the power in subsection (3) for the purpose of assisting OFCOM in connection with a notice under section 111(1).”
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted in my name after Clause 111. It inserts a signpost in Clause 94 (skilled persons’ reports).
251: Clause 94, page 87, line 39, at end insert—
“(iiia) section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2));”
Member’s explanatory statement
This amendment ensures that OFCOM are able to require a skilled person’s report about a failure or possible failure to comply with the new duties to carry out assessments (see the new Clause proposed after Clause 11 in my name).
Amendments 250C to 251 agreed.
Amendment 252
Moved by
252: Clause 94, page 88, line 2, at end insert—
“(xiia) section (Disclosure of information about use of service by deceased child users) (deceased child users);”
Member’s explanatory statement
This amendment has the effect that OFCOM may require a skilled person’s report in relation to compliance with the new duties imposed by the Clause proposed after Clause 67 in my name.
Amendment 252 agreed.
Schedule 12: OFCOM’s powers of entry, inspection and audit
Amendments 252A to 252G
Moved by
252A: Schedule 12, page 228, line 4, at end insert—
“(4A) The power to observe the carrying on of the regulated service at the premises includes the power to view, using equipment or a device on the premises, information generated in real time by the performance of a test or demonstration required by a notice given under paragraph 3.”
Member’s explanatory statement
This amendment ensures that during an inspection of a service, OFCOM have the power to observe a test or demonstration of which notice has been given.
252B: Schedule 12, page 228, line 7, leave out from “paragraph” to “is” in line 9 and insert “only so far as”
Member’s explanatory statement
This is a technical amendment consequential on the preceding amendment in my name.
252C: Schedule 12, page 228, line 15, leave out “or relevant documents to be produced,” and insert “relevant documents to be produced, or a relevant test or demonstration to be performed,”
Member’s explanatory statement
This amendment, and the next two amendments in my name, concern OFCOM giving advance notice to a provider that they will want to observe a test or demonstration during an inspection.
252D: Schedule 12, page 228, line 19, leave out “documents are “relevant” if they are” and insert “a document, test or demonstration is “relevant” if it is”
Member’s explanatory statement
See the explanatory statement to the preceding amendment in my name.
252E: Schedule 12, page 228, line 23, leave out “or the documents to be produced,” and insert “the documents to be produced, or the test or demonstration to be performed,”
Member’s explanatory statement
See the explanatory statement to the preceding amendment in my name.
252F: Schedule 12, page 229, line 3, at end insert—
“(da) to assist an authorised person to view, using equipment or a device on the premises, information demonstrating in real time the operation of systems, processes or features of a specified description, including functionalities or algorithms of a specified description;
(db) to assist an authorised person to view, using equipment or a device on the premises, information generated in real time by the performance of a test or demonstration of a specified description;”
Member’s explanatory statement
This amendment makes it clear that the powers of OFCOM during an audit of a service extend to using equipment on the premises to view real time information showing the operation of the service or the performance of a test or demonstration, if specified in advance in the audit notice.
252G: Schedule 12, page 233, line 38, leave out paragraph (ii)
Member’s explanatory statement
This is a drafting change removing a redundant paragraph from the Bill.
Amendments 252A to 252G agreed.
Amendment 253 not moved.
Clause 105: Disclosure of information
Amendment 254
Moved by
254: Clause 105, page 94, line 33, at end insert—
“(3A) In subsection (3), after paragraph (h) insert—
“(ha) a person appointed under—
(i) paragraph 1 of Schedule 3 to the Coroners and Justice Act 2009, or
(ii) section 2 of the Coroners Act (Northern Ireland) 1959 (c. 15 (N.I.));
(hb) the procurator fiscal, within the meaning of the enactment mentioned in subsection (5)(s);”.
(3B) In subsection (5)—
(a) before paragraph (d) insert—
“(ca) the Coroners Act (Northern Ireland) 1959;”,
(b) after paragraph (na) insert—
“(nb) Part 1 of the Coroners and Justice Act 2009;”, and
(c) after paragraph (r) insert—
“(s) the Inquiries into Fatal Accidents and Sudden Deaths etc. (Scotland) Act 2016 (asp 2).”.”
Member’s explanatory statement
This amendment ensures that it is not necessary for OFCOM to obtain the consent of providers of internet services before disclosing information to a coroner or, in Scotland, procurator fiscal, who is investigating a person’s death.
Amendment 254 agreed.
Clause 107: Provision of information to the Secretary of State
Amendments 254A and 254B
Moved by
254A: Clause 107, page 95, line 20, leave out “(2)” and insert “(3)”
Member’s explanatory statement
This is a technical drafting change needed because section 24B of the Communications Act 2003 has been amended after this Bill was introduced.
254B: Clause 107, page 95, leave out line 21 and insert—
“(4) Subsection (2) does not apply to information—”
Member’s explanatory statement
This is a technical drafting change needed because section 24B of the Communications Act 2003 has been amended after this Bill was introduced.
Amendments 254A and 254B agreed.
Clause 111: Notices to deal with terrorism content or CSEA content (or both)
Amendment 255 not moved.
Amendment 255A
Moved by
255A: Clause 111, page 98, line 8, at end insert—
“(za) section (Requirement to obtain skilled person’s report), which requires OFCOM to obtain a skilled person’s report before giving a notice under subsection (1),”
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted in my name after Clause 111. It inserts a signpost to the requirement in that new Clause to obtain a skilled person’s report before giving a provider a notice under Clause 111.
Amendment 255A agreed.
Amendment 256 not moved.
Amendment 256A
Moved by
256A: After Clause 111, insert the following new Clause—
“Requirement to obtain skilled person’s report
(1) OFCOM may give a notice under section 111(1) to a provider only after obtaining a report from a skilled person appointed by OFCOM under section 94(3).
(2) The purpose of the report is to assist OFCOM in deciding whether to give a notice under section 111(1), and to advise about the requirements that might be imposed by such a notice if it were to be given.”
Member’s explanatory statement
This amendment requires OFCOM to obtain a skilled person’s report under Clause 94 before giving a notice to a provider under Clause 111.
Amendment 256A agreed.
Amendment 257 not moved.
Clause 112: Warning notices
Amendments 257A and 257B
Moved by
257A: Clause 112, page 98, line 24, at end insert—
“(za) contain a summary of the report obtained by OFCOM under section (Requirement to obtain skilled person’s report),”
Member’s explanatory statement
This amendment requires a warning notice given to a provider to contain a summary of the skilled person’s report obtained by OFCOM under the new Clause proposed to be inserted in my name after Clause 111.
257B: Clause 112, page 98, line 37, at end insert—
“(za) contain a summary of the report obtained by OFCOM under section (Requirement to obtain skilled person’s report),”
Member’s explanatory statement
This amendment requires a warning notice given to a provider to contain a summary of the skilled person’s report obtained by OFCOM under the new Clause proposed to be inserted in my name after Clause 111.
Amendments 257A and 257B agreed.
Clause 113: Matters relevant to a decision to give a notice under section 111(1)
Amendments 257C to 257F
Moved by
257C: Clause 113, page 99, line 32, at end insert—
“(ga) the contents of the skilled person’s report obtained as required by section (Requirement to obtain skilled person’s report);”
Member’s explanatory statement
This amendment requires OFCOM to consider the contents of the skilled person’s report obtained as required by the new Clause proposed to be inserted in my name after Clause 111, as part of OFCOM’s decision about whether it is necessary and proportionate to give a notice to a provider under Clause 111.
257D: Clause 113, page 99, line 40, at end insert—
“(ia) in the case of a notice relating to a user-to-user service (or to the user-to-user part of a combined service), the extent to which the use of the specified technology would or might—
(i) have an adverse impact on the availability of journalistic content on the service, or
(ii) result in a breach of the confidentiality of journalistic sources;”
Member’s explanatory statement
This amendment requires OFCOM to consider the impact of the use of technology on the availability of journalistic content and the protection of journalistic sources, as part of OFCOM’s decision about whether it is necessary and proportionate to give a notice to a provider under Clause 111.
257E: Clause 113, page 100, line 4, after “(i)” insert “, (ia)”
Member’s explanatory statement
This amendment is consequential on the preceding amendment of this Clause in my name.
257F: Clause 113, page 100, line 5, at end insert—
““journalistic content” has the meaning given by section 15;”
Member’s explanatory statement
This amendment adds a definition of journalistic content to Clause 113.
Amendments 257C to 257F agreed.
Clause 114: Notices under section 111(1): supplementary
Amendment 258 not moved.
Amendment 258ZA
Moved by
258ZA: After Clause 114, insert the following new Clause—
“Review by the Information Commissioner of notices under Section 111(1)
(1) Where a provider believes that a notice it has been given under section 111(1) will have a material impact on the private communications of its users, it may request a review by the Information Commissioner.
(2) The review must consider the compatibility of the notice with—
(a) the Human Rights Act 1998,
(b) the Data Protection Act 2018,
(c) the Privacy and Electronic Communications (EC Directive) Regulations 2003, and
(d) any other legislation the Information Commissioner considers relevant.
(3) In carrying out the review, the Information Commissioner must consult—
(a) OFCOM,
(b) the provider,
(c) UK users of the provider’s service, and
(d) such other persons as the Information Commissioner considers appropriate.
(4) Following a review under subsection (1) the Information Commissioner must publish a report including—
(a) their determination of the compatibility of the notice with relevant legislation,
(b) their reasons for making such a determination, and
(c) their advice to OFCOM in respect of the drafting and implementation of the notice.”
Member’s explanatory statement
This amendment would give providers a right to request an assessment by the ICO of the compatibility of a section 111 order with UK privacy legislation.
Lord Allan of Hallam (LD)

I wish to test the opinion of the House.

19:10

Division 2

Ayes: 70


Liberal Democrat: 51
Crossbench: 7
Labour: 6
Independent: 5
Green Party: 1

Noes: 178


Conservative: 163
Crossbench: 10
Democratic Unionist Party: 3
Independent: 2

19:20
Clause 115: Review and further notice under section 111(1)
Amendment 258A
Moved by
258A: Clause 115, page 102, line 24, leave out “Section 112 (warning notices) does” and insert “Sections (Requirement to obtain skilled person’s report)(skilled person’s report) and 112 (warning notices) do”
Member’s explanatory statement
This amendment provides that, if OFCOM propose to issue a further notice under Clause 111, it is not necessary to obtain a further skilled person’s report under the new Clause proposed to be inserted in my name after Clause 111.
Amendment 258A agreed.
Clause 118: Interpretation of this Chapter
Amendment 259 not moved.
Clause 120: Requirements enforceable by OFCOM against providers of regulated services
Amendments 260 and 261
Moved by
260: Page 105, line 4, at end insert—

“Section (Assessment duties: user empowerment)

Assessments related to duty in section 12(2)”

Member’s explanatory statement
This amendment ensures that OFCOM are able to use their enforcement powers in Chapter 6 of Part 7 in relation to a breach of any of the new duties imposed by the Clause proposed after Clause 11 in my name.
261: Page 105, line 28, at end insert—

“Section (Disclosure of information about use of service by deceased child users)

Information about use of service by deceased child users”

Member’s explanatory statement
This amendment ensures that OFCOM are able to use their enforcement powers in Chapter 6 of Part 7 in relation to a breach of any of the new duties imposed by the Clause proposed after Clause 67 in my name.
Amendments 260 and 261 agreed.
Clause 122: Confirmation decisions: requirements to take steps
Amendment 262
Moved by
262: Clause 122, page 107, line 7, leave out “for constraints on” and insert “in relation to”
Member’s explanatory statement
This amendment is consequential on the amendments of Clause 125 in my name.
Amendment 262 agreed.
Amendment 262A
Moved by
262A: Clause 122, page 107, line 17, at end insert—
“(ba) specify which of those requirements (if any) have been designated as CSEA requirements (see subsections (5A) and (5B)),”
Member’s explanatory statement
This amendment is consequential on the next amendment to this Clause in my name.
Lord Parkinson of Whitley Bay (Con)

My Lords, in moving Amendment 262A, I will speak also to the other government amendments in the group. These amendments address the Bill’s enforcement powers. Government Amendments 262A, 262B, 262C, 264A and 266A, Amendments 265, 266 and 267, tabled by my noble friend Lord Bethell, and Amendment 268 tabled by the noble Lord, Lord Stevenson of Balmacara, relate to senior management liability. Amendment 268C from the noble Lord, Lord Weir of Ballyholme, addresses interim service restriction orders.

In Committee, we amended the Bill to create an offence of non-compliance with steps set out in confirmation decisions that relate to specific children’s online safety duties, to ensure that providers and individuals can be held to account where their non-compliance risks serious harm to children. Since then, we have listened to concerns raised by noble Lords and others, in particular that the confirmation decision offence would not tackle child sexual exploitation and abuse. That is why the government amendments in this group will create a new offence of a failure to comply with a child sexual exploitation and abuse requirement imposed by a confirmation decision. This will mean that providers and senior managers can be held liable if they fail to comply with requirements to take specific steps as set out in Ofcom’s confirmation decision in relation to child sexual exploitation and abuse on their service.

Ofcom must designate a step in a confirmation decision as a child sexual exploitation and abuse requirement, where that step relates, whether or not exclusively, to a failure to comply with specific safety duties in respect of child sexual exploitation and abuse content. Failure to comply with such a requirement will be an offence. This approach is necessary, given that steps may relate to multiple or specific kinds of illegal content, or systems and process failures more generally. This approach will ensure that services know from the confirmation decision when they risk criminal liability, while providing sufficient legal certainty via the specified steps to ensure that the offence can be prosecuted effectively.

The penalty for this offence is up to two years in prison, a fine or both. Through Clause 182, where an offence is committed with the consent or connivance of a senior manager, or attributable to his or her neglect, the senior manager, as well as the entity, will have committed the offence and can face up to two years in prison, a fine or both.

I thank my noble friend Lord Bethell, as well as our honourable friends Miriam Cates and Sir William Cash in another place, for their important work in raising this issue and their collaborative approach as we have worked to strengthen the Bill in this area. I am glad that we have reached a position that will help to keep children safe online and drive a change in culture in technology companies. I hope this amendment reassures them and noble Lords that the confirmation decision offence will tackle harms to children effectively by ensuring that technology executives take the necessary steps to keep children safe online. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I will briefly comment positively on the Minister’s explanation of how these offences might work, particularly the association of the liability with the failure to enforce a confirmation decision, which seems entirely sensible. In an earlier stage of the debate, there was a sense that we might associate liability with more general failures to enforce a duty of care. That would have been problematic, because the duty of care is very broad and requires a lot of pieces to be put in place. Associating the offences with the confirmation decision makes absolute sense. Having been in that position, if, as an executive in a tech company, I received a confirmation decision that said, “You must do these things”, and I chose wilfully to ignore that decision, it would be entirely reasonable for me to be held potentially criminally liable for that. That association is a good step forward.

Lord Weir of Ballyholme (DUP)

My Lords, I will speak to Amendment 268C, which is in my name and that of the noble Baroness, Lady Benjamin, who has been so proactive in this area. The amendment seeks to clarify the threshold for Ofcom to take immediate enforcement action when children are exposed to suicide, self-harm, eating disorders and pornographic materials. It would require the regulator to either take that action or at least provide an explanation to the Secretary of State within a reasonable timeframe as to why it has chosen not to.

When we pass the Bill, the public will judge it not simply on its contents but on its implementation, its enforcement and the speed of that enforcement. Regulatory regimes as a whole work only if the companies providing the material believe the regulator to be sufficiently muscular in its approach. Therefore, the test is not simply what is there but how long it will take for a notice, whenever it is issued, to lead to direct change.

I will give two scenarios to illustrate the point. Let us take the example of a video encouraging the so-called blackout challenge, or choking challenge, which went viral on social media about two years ago. For those who are unaware, it challenged children to choke themselves to the point at which they lost consciousness and to see how long they could do that. This resulted in the death of about 15 children. If a similar situation arises and a video is not removed because it is not against the terms and conditions of the service, does Ofcom allow the video to circulate for a period of, say, six months while giving a grace period for the platform to introduce age gating? What if the platform fails to implement that highly effective age verification? How long will it take to get through warnings, a provisional notice of contravention, a representation period, a confirmation decision and the implementation of required measures before the site is finally blocked? As I indicated, this is not hypothetical; it draws from a real-life example. We know that this is not simply a matter of direct harm to children; it can lead to a risk of death, and has done in the past.

What about, for example, a pornographic site that simply has a banner where a person can self-declare that they are over 18 in order to access it? I will not rehearse, since they have been gone through a number of times, the dangers for children of early exposure to violent pornography and the impact that will have on respectful relationships, as we know from government reports, and particularly the risk it creates of viewing women as sex objects. It risks additional sexual aggression towards women and perpetuates that aggression. Given that we are aware that large numbers of children have access to this material, surely it would be irresponsible to sacrifice another generation of children to a three-year implementation process.

19:30
I am sure the Minister will seek to give the assurance that Ofcom does indeed have the power under Clause 138 to act immediately by applying the interim service restrictions when the nature and severity of the content demands. We welcome that change, which has been put into the Bill, and the power contained within it.
What we are seeking to probe is the fact that, while the Bill provides that power, by its nature it does not provide a great deal of guidance to Ofcom and the courts on when they should consider that the threshold for interim disruption measures has been reached. Both the scenarios I have mentioned involve what the Bill designates as primary priority harms to children—that is, the most severe harms. The Bill now requires highly effective age verification or age estimation for such content precisely because we cannot allow any such risks.
Any breach risks severe consequences. Will the Minister confirm that any likely failure to comply with the duties in Clauses 11(3)(a) and 72(2) would reach the threshold of severity for Ofcom to apply for an immediate interim service restriction order rather than awaiting the conclusion of the normal processes? If the answer to that question is yes, as I hope it is, Ofcom should consider applying for an order in such circumstances. I therefore ask the Minister whether he would consider making that clear in the Bill or giving other assurances as to how that will be implemented. Furthermore, applications will have to be assessed through the courts applying the same criteria. When these things are brought before the courts, there is always a danger of them being subject to additional review or indeed an inconsistent approach, so it is important that we give the greatest amount of guidance we can through the legislation itself.
There is a clear public expectation that we will tackle these issues. Again, we are not dealing with hypothetical or unprecedented situations. We should look at what has happened in other jurisdictions. In July 2020, France introduced powers for its regulator, Arcom, to apply for blocking orders against relevant ISPs if age verification is not implemented within 15 days of notification. Similar legislation being considered in the Canadian Parliament gives 20 days. By contrast, there is a bit of a gap in the Bill because we have not set a clear timetable on expected compliance.
I stress that nothing in the amendment seeks to undermine the discretion and operational independence of Ofcom. There will be times when the regulator is best placed to understand the situation and act accordingly—perhaps there has been a technical failure—but it is important that Ofcom be held accountable for the decisions it makes on enforcement, so that it needs either to act or to explain why it is not acting. That is what the second paragraph of the amendment seeks to do. It states that, where Ofcom chooses not to apply for interim orders when it is likely that, after 14 days, platforms are still allowing children to access primary priority content or pornographic content, it must provide written justification to the Secretary of State within a further period of seven days. So it does not require Ofcom to act but requires it at least to provide a justification for its decision.
Although we have reason to hope that Ofcom will act more swiftly under the Online Safety Bill, we are trying to judge this on the basis of previous experience. There is disappointment at times across the House at the slow progress in enforcing the video-sharing platform regime. It is nearly three years since that regime was introduced but we have still not seen the outcome of a single investigation against a platform. Greater communication and clarity throughout the process would go a huge way towards rebuilding that trust. I look forward to the Minister’s response, and I seek the assurances that lie at the heart of the amendment. On that basis, I commend the amendment to the House.
Baroness Benjamin (LD)

My Lords, I want to say “Hallelujah”. With this Bill, we have reached a landmark moment after the disappointments and obstacles that we have had over the last six years. It has been a marathon but we are now in the final straight with the finishing line in sight, after the extraordinary efforts by noble Lords on all sides of the House. I thank the Secretary of State for her commitment to this ground-breaking Bill, and the Minister and his officials for the effort they have put into it. The Minister is one of my “Play School” babies, who has done his utmost to make a difference in changing the online world. That makes me very happy.

We know that the eyes of the world are watching us because legislators around the world are looking for ways to extend the rule of law into the online world, which has become the Wild West of the 21st century, so it is critical that in our haste to reach the finishing post we do not neglect the question of enforcement. That is why I have put my name to Amendment 268C in the name of the noble Lord, Lord Weir: without ensuring that Ofcom is given effective powers for this task of unprecedented scale, the Bill we are passing may yet become a paper tiger.

The impact assessment for the Bill estimated that 25,000 websites would be in scope. Only last week, in an encouraging report by the National Audit Office on Ofcom’s readiness, we learned that the regulator’s own research has increased that estimate to 100,000, and the figure could be significantly higher. The report went on to point out that the great majority of those websites will be based overseas and will not have been regulated by Ofcom before.

The noble Lord, Lord Bethell, raised his concerns on the final day of Committee, seeking to amend the Bill to make it clear that Ofcom could take a schedule of a thousand sites to court and get them all blocked in one go. I was reassured when the Minister repeated the undertaking given by his counterpart in Committee in the other place that the Civil Procedure Rules already allow such multiparty claims. Will the Minister clarify once again that such enforcement at scale is possible and would not expose Ofcom to judicial review? That would give me peace of mind.

The question that remains for many is whether Ofcom will act promptly enough when children are at risk. I am being cautious because my experience in this area with regulators has led me not to assume that simply because this Parliament passes a law, it will be implemented. We all know the sorry tale of Part 3 of the Digital Economy Act, when Ministers took it upon themselves not to decide when it should come into force, but to ask whether it should come into force at all. When they announced that that should be never, the High Court took a dim view and allowed judicial review to proceed. Interestingly, the repeal of Part 3 and the clauses that replaced it may not have featured in this Bill were it not for that case—I always say that everything always happens for a reason. The amendment is a reminder to Ofcom that Parliament expects it to act, and to do so from the day when the law comes into force, not after a year’s grace period, six months or more of monitoring or a similar period of supervision before it contemplates any form of enforcement.

Many of the sites we are dealing with will not comply because this is the law; they will do so only when the business case makes compliance cheaper than the consequences of non-compliance, so this amendment is a gentle but necessary provision. If for any reason Ofcom does not think that exposing a significant number of children in this country to suicide, self-harm, eating disorder or pornographic content—which is a universal plague—merits action, it will need to write a letter to the Secretary of State explaining why.

We have come too far to risk the Bill not being implemented in the most robust way, so I hope my noble friends will join me in supporting this belt-and-braces amendment. I look forward to the Minister’s response.

Baroness Merron (Lab)

My Lords, we welcome the government amendments in this group to bring child sexual exploitation and abuse failures into the scope of the senior manager liability and enforcement regime but consider that they do not go far enough. On the government amendments, I have a question for the Minister about whether, through Clause 122, it would be possible to require a company that was subject to action to do some media literacy as part of its harm reduction; in other words, would it be possible for Ofcom to use its media literacy powers as part of the enforcement process? I offer that as a helpful suggestion.

We share the concerns expressed previously by the noble Lord, Lord Bethell, about the scope of the senior manager liability regime, which does not cover all the child safety duties in the Bill. We consider that Amendment 268, in the name of my noble friend Lord Stevenson, would provide greater flexibility, giving the possibility of expanding the list of duties covered in the future. I have a couple of brief questions to add to my first question. Will the Minister comment on how the operation of the senior manager liability regime will be kept under review? This has, of course, been something of a contentious issue in the other place, so could the Minister perhaps tell your Lordships’ House how confident he is that the current position is supported there? I look forward to hearing from the Minister.

Lord Parkinson of Whitley Bay (Con)

I did not quite finish writing down the noble Baroness’s questions. I will do my best to answer them, but I may need to follow up in writing because she asked a number at the end, which is perfectly reasonable. On her question about whether confirmation decision steps could include media literacy, yes, that is a good idea; they could.

Amendment 268, tabled by the noble Lord, Lord Stevenson of Balmacara, seeks to enable the Secretary of State, through regulation, to add to the list of duties which are linked to the confirmation decision offence. We are very concerned at the prospect of allowing an unconstrained expansion of the confirmation decision offence. In particular, as I have already set out, we would be concerned about expansion of those related to search services. There is also concern about unconstrained additions of any other duties related to user-to-user services as well.

We have chosen specific duties which will effectively tackle key issues relating to child safety online and child abuse, while ensuring that the confirmation decision offence remains targeted. Non-compliance with a requirement imposed by a confirmation decision in relation to such duties warrants the prospect of criminal enforcement on top of Ofcom’s extensive civil enforcement powers. Making excessive changes to the offence risks shifting the regime towards a more punitive and disproportionate enforcement model, which would represent a significant change to the framework as a whole. Furthermore, expansion of the confirmation decision offence could lead to services taking an excessively cautious approach to content moderation to avoid the prospect of criminal liability. We are also concerned that such excessive expansion could significantly increase the burden on Ofcom.

I am grateful to the noble Lord, Lord Weir of Ballyholme, and the noble Baroness, Lady Benjamin, for the way they set out their Amendment 268C. We are concerned about this proposal because it is important that Ofcom can respond to issues on a case-by-case basis: it may not always be appropriate or proportionate to use a specific enforcement power in response to a suspected breach. Interim service restriction orders are some of the strongest enforcement powers in the Bill and will have a significant impact on the service in question. Their use may be disproportionate in cases where there is only a minor breach, or where a service is taking steps to deal with a breach following a provisional notice of contravention.

19:45
In contrast to applications for service restriction orders which require that there is a continuing failure to comply with an enforceable duty, interim service restriction orders require only that it is likely that there is a failure. This provision is included so that steps can be taken quickly where the level of risk of harm to people relating to the likely failure, and the nature and severity of that harm, are such that it would not be appropriate to wait to establish the failure before applying for the order. While the duties specified by Amendment 268C are important, putting pressure on Ofcom pre-emptively to take a specific course of enforcement action which is aimed at addressing only particularly urgent, high-risk scenarios would be counter to the intention of the rest of the framework; that is, to enable an efficient, proportionate and targeted approach.
It is important that Ofcom takes proportionate steps. Section 11(3)(a) is a duty to operate a service using proportionate systems and processes designed to prevent children of any age encountering, by means of the service, primary priority content that is harmful to them. As such, the prospective failure in relation to the duty may be to do with a particular aspect of the systems and processes that are in place. While these are important, a particular failure in relation to the systems and processes-focused duty might not necessarily warrant the drastic action of an interim court order requiring ancillary services providers to withdraw their services from a potentially non-compliant regulated service.
We are concerned that removing a proportionality ground for the court application for an interim service restriction order and requiring Ofcom to justify its decision not to apply for an interim disruption order to the Secretary of State would, in effect, pressure Ofcom to take an excessively punitive course of action. Likewise, in some cases it may be highly irregular for there to be an expectation that Ofcom would have to justify its decision not to apply for an interim disruption order in each particular case to the Secretary of State, but I am happy to reassure the noble Baroness, Lady Benjamin, again that business disruption measures will be able to operate effectively at scale and at sufficient speed. I hope that that provides enough reassurance to her and the noble Lord, Lord Weir, that they are willing to not move their amendment, but I am grateful for the support they voiced, as did others, for the government amendments in this group.
Amendment 262A agreed.
Amendment 262AA not moved.
Amendments 262B and 262C
Moved by
262B: Clause 122, page 107, line 35, at end insert—
“(5A) If the condition in subsection (5B) is met in relation to a requirement imposed by a confirmation decision which is of a kind described in subsection (1), OFCOM must designate the requirement as a “CSEA requirement” for the purposes of section 127(2A) (offence of failure to comply with confirmation decision).
(5B) The condition referred to in subsection (5A) is that the requirement is imposed (whether or not exclusively) in relation to either or both of the following—
(a) a failure to comply with section 9(2)(a) or (3)(a) in respect of CSEA content, or in respect of priority illegal content which includes CSEA content;
(b) a failure to comply with section 9(2)(b) in respect of an offence specified in Schedule 6 (CSEA offences), or in respect of priority offences which include such an offence.”
Member’s explanatory statement
This amendment provides that where a confirmation decision imposes a requirement to take steps in relation to a failure to comply with a duty under Clause 9(2)(a), (2)(b) or (3)(a) in respect of CSEA content or an offence under Schedule 6, OFCOM must designate the requirement as a CSEA requirement with the result that failure to comply with it is an offence (see the amendment inserting subsection (2A) into Clause 127 in my name).
262C: Clause 122, page 107, line 44, at end insert—
““CSEA content”, “priority illegal content” and “priority offence” have the same meaning as in Part 3 (see section 53);”
Member’s explanatory statement
This amendment is consequential on the preceding amendment to this Clause in my name.
Amendments 262B and 262C agreed.
Clause 125: Confirmation decisions: proactive technology
Amendments 263 and 264
Moved by
263: Clause 125, page 109, line 27, leave out “constraints on OFCOM’s power” and insert “what powers OFCOM have”
Member’s explanatory statement
This amendment is consequential on the next amendment in my name.
264: Clause 125, page 109, line 30, at end insert—
“(1A) A proactive technology requirement may be imposed in a confirmation decision if—
(a) the decision is given to the provider of an internet service within section 71(2), and
(b) the decision is imposed for the purpose of complying with, or remedying the failure to comply with, the duty set out in section 72(2) (provider pornographic content).
(1B) The following provisions of this section set out constraints on OFCOM’s power to include a proactive technology requirement in a confirmation decision in any case not within subsection (1A).”
Member’s explanatory statement
This amendment has the effect that OFCOM may, in a confirmation decision, require a provider to use proactive technology if the purpose is to deal with non-compliance with Clause 72(2) (preventing children encountering provider pornographic content).
Amendments 263 and 264 agreed.
Clause 127: Confirmation decisions: offence
Amendment 264A
Moved by
264A: Clause 127, page 112, line 22, leave out “relates (whether or not exclusively) to” and insert “is imposed (whether or not exclusively) in relation to a failure to comply with”
Member’s explanatory statement
This is a technical amendment which adjusts the language of this provision.
Amendment 264A agreed.
Amendments 265 and 266 not moved.
Amendment 266A
Moved by
266A: Clause 127, page 112, line 27, at end insert—
“(2A) A person to whom a confirmation decision is given commits an offence if, without reasonable excuse, the person fails to comply with a CSEA requirement imposed by the decision (see section 122 (5A) and (5B)).”
Member’s explanatory statement
This amendment provides that a person commits an offence if the person fails to comply, without reasonable excuse, with a CSEA requirement imposed by a confirmation decision given to the person (see the amendment inserting new subsections (5A) and (5B) into Clause 122 in my name.)
Amendment 266A agreed.
Amendments 267 and 268 not moved.
Schedule 13: Penalties imposed by OFCOM under Chapter 6 of Part 7
Amendments 268A and 268B
Moved by
268A: Schedule 13, page 236, line 12, leave out sub-paragraph (9) and insert—
“(9) Regulations made by OFCOM under section (Regulations by OFCOM about qualifying worldwide revenue etc)(1)(a) (including regulations making provision of a kind mentioned in section (Regulations by OFCOM about qualifying worldwide revenue etc)(3), (4) or (5)) apply for the purpose of determining the qualifying worldwide revenue of a provider of a regulated service for an accounting period as mentioned in this paragraph as they apply for the purpose of determining the qualifying worldwide revenue of a provider of a regulated service for a qualifying period for the purposes of Part 6.”
Member’s explanatory statement
This amendment provides that regulations under the new Clause 76 proposed in my name about “qualifying worldwide revenue” for the purposes of Part 6 of the Bill (fees) also applies for the purposes of financial penalties under paragraph 4 of Schedule 13.
268B: Schedule 13, page 237, line 18, at end insert—
“(9) OFCOM may by regulations make provision about how the qualifying worldwide revenue of a group of entities is to be determined for the purposes of this paragraph.
(10) Before making regulations under sub-paragraph (9) OFCOM must consult—
(a) the Secretary of State,
(b) the Treasury, and
(c) such other persons as OFCOM consider appropriate.
(11) Regulations under sub-paragraph (9) may make provision subject to such exemptions and exceptions as OFCOM consider appropriate.”
Member’s explanatory statement
This amendment provides a power for OFCOM to make regulations setting out what is meant in paragraph 5 of Schedule 13 by references to the qualifying worldwide revenue of a group of entities.
Amendments 268A and 268B agreed.
Clause 134: Interim service restriction orders
Amendment 268C not moved.
Amendment 269 not moved.
Clause 141: Advisory committee on disinformation and misinformation
Amendments 269A and 269AA not moved.
Amendment 269B
Moved by
269B: Clause 141, page 128, line 19, leave out “duty” and insert “duties”
Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 149 in my name expanding OFCOM’s duties to promote media literacy in relation to regulated user-to-user and search services.
Amendment 269B agreed.
Amendments 269C and 269D not moved.
Clause 143: Research about users’ experiences of regulated services
Amendment 270 not moved.
Amendment 270A
Moved by
270A: After Clause 144, insert the following new Clause—
“Establishment of the Advocacy Body for Children
(1) There is to be a body corporate (“the Advocacy Body for Children”) to represent the interests of child users of regulated services.
(2) A “child user”—
(a) means any person aged 17 years or under who uses or is likely to use regulated internet services, and
(b) includes both any existing child user and any future child user.
(3) The functions of the Advocacy Body for Children must include, in relation to regulated services—
(a) representing the interests of child users;
(b) the protection and promotion of those interests;
(c) monitoring implications of this Act’s implementation for those interests;
(d) consideration of children’s rights under the United Nations Convention on the Rights of the Child, including (but not limited to) their participation rights;
(e) any other matter connected with those interests.
(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—
(a) safety duties about illegal content, in particular CSEA content,
(b) safety duties protecting children,
(c) children’s access assessment duties, and
(d) other enforceable requirements relating to children.
(5) The Advocacy Body for Children must—
(a) have due regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010,
(b) assess emerging threats to child users of regulated services and bring information regarding those threats to OFCOM, and
(c) publish an annual report related to the interests of child users.
(6) The Advocacy Body for Children may undertake research on its own account.
(7) The Advocacy Body for Children is to be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.
(8) To establish the Advocacy Body for Children, OFCOM must—
(a) appoint an organisation or organisations known to represent all children in the United Kingdom to be designated with the functions under this section, or
(b) create an organisation to carry out the designated functions.
(9) The governance functions of the Advocacy Body for Children must—
(a) with the exception of the approval of its budget, remain independent of OFCOM, and
(b) include representation of child users by young people under the age of 25 years.
(10) The budget of the Advocacy Body for Children will be subject to annual approval by the board of OFCOM.
(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body for Children, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 75).”
Member’s explanatory statement
This new Clause would require Ofcom to establish a new advocacy body for child users of regulated internet services to represent, protect and promote their interests.
Lord Knight of Weymouth (Lab)

My Lords, I am grateful to the noble Baroness, Lady Newlove, and the noble Lord, Lord Clement-Jones, for adding their names to Amendment 270A, and to the NSPCC for its assistance in tabling this amendment and helping me to think about it.

The Online Safety Bill has the ambition, as we have heard many times, of making the UK the safest place for a child to be online. Yet, as drafted, it could pass into legislation without a system to ensure that children’s voices themselves can be heard. This is a huge gap. Children are experts in their own lives, with a first-hand understanding of the risks that they face online. It is by speaking to, and hearing from, children directly that we can best understand the harms they face online—what needs to change and how the regulation is working in practice.

User advocates are commonplace in most regulated environments and are proven to be effective. Leading children’s charities such as 5Rights, Barnardo’s and YoungMinds, as well as organisations set up by bereaved parents campaigning for child safety online, such as the Molly Rose Foundation and the Breck Foundation, have joined the NSPCC in calling for the introduction of this advocacy body for children, as set out in the amendment.

I do not wish to detain anyone. The Minister’s response when this was raised in Committee was, in essence, that this should go to the Children’s Commissioner for England. I am grateful to her for tracking me down in a Pret A Manger in Russell Square on Monday and having a chat. She reasonably pointed out that much of the amendment reads a bit like her job description, but she also could see that it is desirable to have an organisation such as the NSPCC set up a UK-wide helpline. There are children’s commissioners for Scotland, Wales and Northern Ireland who are supportive of a national advocacy body for children. She was suggesting —if the Minister agrees that this seems like a good solution—that they could commission a national helpline that works across the United Kingdom, and then advises a group that she could convene, including the children’s commissioners from the other nations of the United Kingdom. If that seems a good solution to the Minister, I do not need to press the amendment, we are all happy and we can get on with the next group. I beg to move.

Lord Allan of Hallam (LD)

My Lords, I just want to make some brief comments in support of the principle of what the noble Lord, Lord Knight, is aiming at in this amendment.

The Bill is going to have a profound impact on children in the United Kingdom. We hope that the most profound impact will be that it will significantly advance their interests in terms of safety online. But it will also potentially have a significant impact on what they can access online and the functionality of different services. They are going to experience new forms of age assurance, about which they may have very strong views. For example, the use of their biometric data to estimate their age will be there to protect them, but they may still have strong views about that.

I have said many times that there may be some measures in the Bill that will encourage services to become 18-plus only. That is not adult in the sense of adult content. Ordinary user-to-user social media services may look at the obligations and say, “Frankly, we would much rather restrict ourselves to users from the UK who identify as being 18-plus, rather than have to take on board all the associated liabilities in respect of children”—not because they are irresponsible, but precisely because they are responsible, and they can see that there is a lot of work to do in order to be legally and safely available to those under 18. For all those reasons, it is really important that the child advocacy body looks at things such as the United Nations Convention on the Rights of the Child and the rights of children to access information, and that it is able to take a view on them.

The reason I think that is important—as will any politician who has been out and spoken in schools—is that very often children are surprising in terms of what they see as their priorities. We make assumptions about their priorities, which can often be entirely wrong. There has been some really good work done on this. There was a project called EU Kids Online, back in the days of the EU, which used to look at children right across the European Union and ask them what their experience of being online was like and what was important to them. There are groups such as Childnet International, which for years has been convening groups of children and taking them to places such as the Internet Governance Forum. That always generates a lot of information that we here would not have thought of, about what children feel is really important to them about their online experience.

For all those reasons, it really would be helpful to institutionalise this in the new regime as some kind of body that looks in the round at children’s interests—their interests to stay safe, but also their interests to be able to access a wide variety of online services and to use the internet as they want to use it. I hope that that strengthens the case the noble Lord, Lord Knight, has made for such a body to exist in some kind of coalition-like format.

Baroness Fox of Buckley (Non-Afl)

My Lords, I am afraid that I have some reservations about this amendment. I was trying not to, but I have. The way that the noble Lord, Lord Allan of Hallam, explained the importance of listening to young people is essential—in general, not being dictated to by them, but to understand the particular ways that they live their lives; the lived experience, to use the jargon. Particularly in relation to a Bill that spends its whole time saying it is designed to protect young people from harm, it might be worth having a word with them and seeing what they say. I mean in an ongoing way—I am not being glib. That seems very sensible.

I suppose my concern is that this becomes a quango. We have to ask who is on it, whether it becomes just another NGO of some kind. I am always concerned about these kinds of organisations when they speak “on behalf of”. If you have an advocacy body for children that says, “We speak on behalf of children”, that makes me very anxious. You can see that that can be a politically very powerful role, because it seems to have the authority of representing the young, whereas actually it can be entirely fictitious and certainly not democratic or accountable.

The key thing we discussed in Committee, which the noble Lord, Lord Knight of Weymouth, is very keen on—and I am too—is that we do not inadvertently deny young people important access rights to the internet in our attempt to protect them. That is why some of these points are here. The noble Baroness, Lady Kidron, was very keen on that. She wants to protect them but does not want to end up with them being denied access to important parts of the internet. That is all good, but I just think this body is wrong.

The only other thing to draw noble Lords’ attention to—I am not trying to be controversial, but it is worth noting—is that child advocacy is currently in a very toxic state because of some of the issues around who represents children. As we speak, there is a debate about, for example, whether the NSPCC has been captured by Stonewall. I make no comment because I do not know; I am just noting it. We have had situations where a child advocacy group such as Mermaids is now discredited because it is seen to have been promoting chest binders for young people, to have gone down the gender ideology route, which some people would argue is child abuse of a sort, advocating that young women remove their breasts—have double mastectomies. This is all online, by the way.

I know that some people would say, “Oh, you’re always going on about that”, but I raise it because it is a very real and current discussion. I know a lot of people who work in education, with young people or in children’s rights organisations, and they keep telling me that they are tearing themselves apart. I just wondered whether the noble Lord, Lord Knight, might note that there is a danger of walking into a minefield here—which I know he does not mean to walk into—by setting up an organisation that could end up being the subject of major culture wars rows or, even worse, one of those dreaded quangos that pretends it is representing people but does not.

20:00
Lord Clement-Jones (LD)

My Lords, I think the upshot of this brief debate is that the noble Lord, Lord Knight—how he was tracked down in a Pret A Manger, I have no idea; he is normally too fast-moving for that—in his usual constructive and creative way is asking the Government to constructively engage to find a solution, which he discussed in that Pret A Manger, involving a national helpline, the NSPCC and the Children’s Commissioner, for the very reasons that he and my noble friend Lord Allan have put forward. In no way would this be some kind of quango, in the words of the noble Baroness, Lady Fox.

This is really important stuff. It could be quite a game-changer in the way that the NSPCC and the Children’s Commissioner collaborate on tackling the issues around social media, the impact of the new rights under the Bill and so on. I very much hope that the Government will be able to engage positively on this and help to bring the parties together to, in a sense, deliver something which is not in the Bill but could be of huge importance.

Lord Parkinson of Whitley Bay (Con)

My Lords, first, I reassure noble Lords that the Government are fully committed to making sure that the interests of children are both represented and protected. We believe, however, that this is already achieved through the provisions in the Bill.

Rather than creating a single advocacy body to research harms to children and advocate on their behalf, as the noble Lord’s amendment suggests, the Bill achieves the same effect through a combination of Ofcom’s research functions, the consultation requirements and the super-complaints provisions. Ofcom will be fully resourced with the capacity and technological ability to assess and understand emerging harms and will be required to research children’s experiences online on an ongoing basis.

For the first time, there will be a statutory body in place charged with protecting children from harm online. As well as its enforcement functions, Ofcom’s research will ensure that the framework remains up to date and that Ofcom itself has the latest, in-depth information to aid its decision-making. This will ensure that new harms are not just identified in retrospect when children are already affected by them and complaints are made; instead, the regulator will be looking out for new issues and working proactively to understand concerns as they develop.

Children’s perspectives will play a central role in the development of the framework, as Ofcom will build on its strong track record of qualitative research to ensure that children are directly engaged. For example, Ofcom’s ongoing programme, Children’s Media Lives, involves engaging closely with children and tracking their views and experiences year on year.

Alongside its own research functions, super-complaints will ensure that eligible bodies can make complaints on systemic issues, keeping the regulator up to date with issues as they emerge. This means that if Ofcom does not identify a systemic issue affecting children for any reason, it can be raised and then dealt with appropriately. Ofcom will be required to respond to the super-complaint, ensuring that its subsequent decisions are understood and can be scrutinised. Complaints by users will also play a vital role in Ofcom’s horizon scanning and information gathering, providing a key means by which new issues can be raised.

The extensive requirements for Ofcom to consult on codes of practice and guidance will further ensure that it consistently engages with groups focused on the interests of children as the codes and guidance are developed and revised. Children’s interests are embedded in the implementation and delivery of this framework.

The Children’s Commissioner will play a key and ongoing role. She will be consulted on codes of practice and any further changes to those codes. The Government are confident that she will use her statutory duties and powers effectively to understand children’s experiences of the digital world. Her primary function as Children’s Commissioner for England is to promote and protect the rights of children in England, and to promote and protect the rights of children across the United Kingdom where those rights are or may be affected by reserved matters. As the codes of practice and the wider Bill relate to a reserved area of law—namely, internet services—the Children’s Commissioner for England will be able to represent the interests of children from England, Scotland, Wales and Northern Ireland when she is consulted on the preparation of codes of practice. That will ensure that children’s voices are represented right across the UK. The Children’s Commissioner for England and her office also regularly speak to the other commissioners about ongoing work on devolved and reserved matters. Whether she does that in branches of Pret A Manger, I do not know, but she certainly works with her counterparts across the UK.

I am very happy to take back the idea that the noble Lord has raised and discuss it with the commissioner. There are many means by which she can carry out her duties, so I am very happy to take that forward. I cannot necessarily commit to putting it in legislation, but I shall certainly commit to discussing it with her. On the proposals in the noble Lord’s amendment, we are concerned that a separate child user advocacy body would duplicate the functions that she already has, so I hope with that commitment he will be happy to withdraw.

Lord Knight of Weymouth (Lab)

My Lords, I am grateful to those who have spoken in this quick debate and for the support from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Fox, about children’s voices being heard. I think that we are getting to the point when there will not be a quango or indeed a minefield, so that makes us all happy. The Minister almost derailed me, because so much of his speaking note was about the interests of children, whereas I am more interested in the voice of children being heard directly rather than in people acting on their behalf and representing their interests. However, his final comments about being happy to take the idea forward mean that I am very happy to withdraw my amendment.

Amendment 270A withdrawn.
Amendment 271
Moved by
271: After Clause 145, insert the following new Clause—
“OFCOM’s reports about use of age assurance
(1) OFCOM must produce and publish a report assessing—(a) how providers of regulated services have used age assurance for the purpose of compliance with their duties set out in this Act,(b) how effective the use of age assurance has been for that purpose, and(c) whether there are factors that have prevented or hindered the effective use of age assurance, or a particular kind of age assurance, for that purpose,(and in this section, references to a report are to a report described in this subsection).(2) A report must, in particular, consider whether the following have prevented or hindered the effective use of age assurance—(a) the costs to providers of using it, and(b) the need to protect users from a breach of any statutory provision or rule of law concerning privacy that is relevant to the use or operation of a regulated service (including, but not limited to, any such provision or rule concerning the processing of personal data).(3) Unless the Secretary of State requires the production of a further report (see subsection (6)), the requirement in subsection (1) is met by producing and publishing one report within the period of 18 months beginning with the day on which sections 11 and 72(2) come into force (or if those provisions come into force on different days, the period of 18 months beginning with the later of those days).(4) In preparing a report, OFCOM must consult—(a) the Information Commissioner, and(b) such other persons as OFCOM consider appropriate.(5) OFCOM must send a copy of a report to the Secretary of State, and the Secretary of State must lay it before Parliament.(6) The Secretary of State may require OFCOM to produce and publish a further report in response to—(a) the development of age assurance technology, or(b) evidence of the reduced effectiveness of such technology.(7) But such a requirement may not be imposed—(a) within the period of three years beginning with the date on which the first report is published, or(b) more frequently than once every three years.(8) For further provision about reports under this section, see section 149.(9) In this section “age assurance” means age verification or age estimation.”Member’s explanatory statement
This new Clause requires OFCOM to produce and publish a report about the use of age assurance by providers of regulated services.
Amendment 271 agreed.
Clause 147: OFCOM’s transparency reports
Amendment 272 not moved.
Amendments 272A and 272AA
Moved by
272A: After Clause 147, insert the following new Clause—
“OFCOM’s report about use of app stores by children
(1) OFCOM must produce a report about the use of app stores by children.(2) In particular, the report must—(a) assess what role app stores play in children encountering content that is harmful to children, search content that is harmful to children or regulated provider pornographic content by means of regulated apps which the app stores make available,(b) assess the extent to which age assurance is currently used by providers of app stores, and how effective it is, and(c) explore whether children’s online safety would be better protected by the greater use of age assurance or particular kinds of age assurance by such providers, or by other measures.(3) OFCOM must publish the report during the period beginning two years, and ending three years, after the day on which sections 11 and 25 come into force (or if those sections come into force on different days, the later of those days).(4) For further provision about the report under this section, see section 149.(5) In this section—“app” includes an app for use on any kind of device, and “app store” is to be read accordingly;“content that is harmful to children” has the same meaning as in Part 3 (see section 54);“regulated app” means an app for a regulated service;“regulated provider pornographic content” has the same meaning as in Part 5 (see section 70);“search content” has the same meaning as in Part 3 (see section 51).(6) In this section references to children are to children in the United Kingdom.”Member’s explanatory statement
This amendment requires OFCOM to produce a report about the use of app stores by children, including consideration of whether children would be better protected by greater use of age assurance.
272AA: After Clause 147, insert the following new Clause—
“OFCOM’s report about reporting and complaints procedures
(1) OFCOM must produce a report assessing the measures taken or in use by providers of Part 3 services to enable users and others to—(a) report particular kinds of content present on such services, and(b) make complaints to providers of such services.(2) OFCOM’s report must take into account the experiences of users and others in reporting content and making complaints to providers of Part 3 services, including—(a) how clear the procedures are for reporting content and making complaints,(b) how easy it is to do those things, and(c) whether providers are taking appropriate and timely action in response to reports and complaints that are made. (3) The report must include advice from OFCOM about whether they consider that the Secretary of State should make regulations under section (Power to impose duty about alternative dispute resolution procedure)(duty about alternative dispute resolution procedure).(4) In the report, OFCOM may make recommendations that they consider would improve the experiences of users and others in reporting content or making complaints to providers of Part 3 services, or would deliver better outcomes in relation to reports or complaints that are made.(5) In preparing the report under this section, OFCOM must consult—(a) the Secretary of State,(b) persons who appear to OFCOM to represent the interests of United Kingdom users of Part 3 services,(c) persons who appear to OFCOM to represent the interests of children (generally or with particular reference to online safety matters),(d) the Information Commissioner, and(e) such other persons as OFCOM consider appropriate.(6) The report may draw on OFCOM’s research under section 14 of the Communications Act (see subsection (6B) of that section).(7) The report is not required to address any matters which are the subject of a report by OFCOM under section 146 (report about the availability and treatment of news publisher content and journalistic content).(8) OFCOM must publish the report within the period of two years beginning with the day on which this section comes into force.(9) OFCOM must send a copy of the report to the Secretary of State, and the Secretary of State must lay it before Parliament.(10) The Secretary of State must publish a statement responding to the report within the period of three months beginning with the day on which the report is published, and the statement must include a response to OFCOM’s advice about whether to make regulations under section (Power to impose duty about alternative dispute resolution procedure).(11) The statement must be published in such manner as the Secretary of State considers appropriate for bringing it to the attention of persons who may be affected by it.(12) For further provision about the report under this section, see section 149.(13) References in this section to “users and others” are to United Kingdom users and individuals in the United Kingdom.”Member’s explanatory statement
This amendment requires OFCOM to produce a report about the content reporting and complaints procedures used by providers of Part 3 services, including user experiences of those procedures. OFCOM must specifically advise whether they consider that regulations ought to be made under the new Clause proposed to be inserted in my name after Clause 194 (duty about alternative dispute resolution procedure).
Amendments 272A and 272AA agreed.
Clause 148: OFCOM’s report about researchers’ access to information
Amendment 272AB not moved.
Amendment 272B
Moved by
272B: Clause 148, page 132, line 11, leave out “two years” and insert “18 months”
Member’s explanatory statement
This amendment provides that the report that OFCOM must publish under Clause 148 (report about researchers’ access to information) must be published within 18 months of Clause 148 coming into force (rather than two years).
Amendment 272B agreed.
Amendment 272BA not moved.
Amendments 272C and 272D
Moved by
272C: Clause 148, page 132, line 16, leave out “Following the publication of the report, OFCOM may” and insert “OFCOM must”
Member’s explanatory statement
This amendment provides that OFCOM must (rather than may) produce guidance about matters dealt with by the report published under Clause 148.
272D: Clause 148, page 132, line 19, leave out subsections (8) and (9) and insert—
“(8) Before producing the guidance (including revised guidance) OFCOM must consult the persons mentioned in subsection (3).(9) OFCOM must publish the guidance (and any revised guidance).(10) OFCOM must include in each transparency report under section 147 an assessment of the effectiveness of the guidance.”Member’s explanatory statement
This amendment is consequential on the amendment in my name making the production of guidance under Clause 148(7) mandatory.
Amendments 272C and 272D agreed.
Amendment 272E not moved.
Amendment 273
Moved by
273: After Clause 148, insert the following new Clause—
“OFCOM’s report in connection with investigation into a death
(1) Subsection (2) applies if OFCOM receive—(a) a notice from a senior coroner under paragraph 1(2) of Schedule 5 to the Coroners and Justice Act 2009 in connection with an investigation into the death of a person;(b) a request for information in connection with the investigation of a procurator fiscal into, or an inquiry held or to be held in relation to, the death of a person;(c) a notice from a coroner under section 17A(2) of the Coroners Act (Northern Ireland) 1959 (c. 15 (N.I.)) in connection with—(i) an investigation to determine whether an inquest into the death of a person is necessary, or(ii) an inquest in relation to the death of a person.(2) OFCOM may produce a report for use by the coroner or procurator fiscal, dealing with any matters that they consider may be relevant.(3) In subsection (1)(b) “inquiry” means an inquiry held, or to be held, under the Inquiries into Fatal Accidents and Sudden Deaths etc. (Scotland) Act 2016 (asp 2).” Member’s explanatory statement
This amendment makes it clear that OFCOM may produce a report in connection with a person’s death, if the coroner gives OFCOM a notice or, in Scotland, the procurator fiscal requests information, for that purpose.
Amendment 273 agreed.
Amendments 273A and 273B had been withdrawn from the Marshalled List.
Clause 149: OFCOM’s reports
Amendments 274 to 274AA
Moved by
274: Clause 149, page 132, line 41, at end insert—
“(aa) a report under section (OFCOM’s reports about use of age assurance) (report about use of age assurance),”Member’s explanatory statement
This amendment is consequential on the new Clause to be inserted after Clause 145 in my name. It ensures that the usual confidentiality provisions apply to matters contained in OFCOM’s report about the use of age assurance.
274A: Clause 149, page 133, line 1, at end insert—
“(ca) a report under section (OFCOM’s report about use of app stores by children) (report about use of app stores by children),”Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 147 in my name. It ensures that the usual confidentiality provisions apply to matters contained in OFCOM’s report about the use of app stores by children.
274AA: Clause 149, page 133, line 1, at end insert—
“(ca) a report under section (OFCOM’s report about reporting and complaints procedures) (report about reporting and complaints procedures),”Member’s explanatory statement
This amendment is consequential on the new Clause proposed to be inserted after Clause 147 in my name about OFCOM’s report concerning reporting and complaints procedures used by providers of Part 3 services. The amendment ensures that the usual confidentiality provisions apply to matters contained in that report.
Amendments 274 to 274AA agreed.
Amendment 274B
Moved by
274B: After Clause 149, insert the following new Clause—
“CHAPTER 8
MEDIA LITERACY
Media literacy
(1) Section 11 of the Communications Act is amended in accordance with subsections (2) to (5).(2) Before subsection (1) insert—“(A1) In this section—(a) subsection (1) imposes duties on OFCOM which apply in relation to material published by means of the electronic media (including by means of regulated services), and(b) subsections (1A) to (1E) expand on those duties, and impose further duties on OFCOM, in relation to regulated services only.”(3) After subsection (1) insert— “(1A) OFCOM must take such steps, and enter into such arrangements, as they consider most likely to be effective in heightening the public’s awareness and understanding of ways in which they can protect themselves and others when using regulated services, in particular by helping them to—(a) understand the nature and impact of harmful content and the harmful ways in which regulated services may be used, especially content and activity disproportionately affecting particular groups, including women and girls;(b) reduce their and others’ exposure to harmful content and to the use of regulated services in harmful ways, especially content and activity disproportionately affecting particular groups, including women and girls;(c) use or apply—(i) features included in a regulated service, including features mentioned in section 12(2) of the Online Safety Act 2023, and(ii) tools or apps, including tools such as browser extensions,so as to mitigate the harms mentioned in paragraph (b);(d) establish the reliability, accuracy and authenticity of content;(e) understand the nature and impact of disinformation and misinformation, and reduce their and others’ exposure to it;(f) understand how their personal information may be protected.(1B) OFCOM must take such steps, and enter into such arrangements, as they consider most likely to encourage the development and use of technologies and systems for supporting users of regulated services to protect themselves and others as mentioned in paragraph (a), (b), (c), (d) or (e) of subsection (1A), including technologies and systems which—(a) provide further context to users about content they encounter;(b) help users to identify, and provide further context about, content of democratic importance present on regulated user-to-user services;(c) signpost users to resources, tools or information raising awareness about how to use regulated services so as to mitigate the harms mentioned in subsection (1A)(b).(1C) OFCOM’s duty under subsection (1A) is to be performed in the following ways (among others)—(a) pursuing activities and initiatives,(b) commissioning others to pursue activities and initiatives,(c) taking steps designed to encourage others to pursue activities and initiatives, and(d) making arrangements for the carrying out of research (see section 14(6)(a)).(1D) OFCOM must draw up, and from time to time review and revise, a statement recommending ways in which others, including providers of regulated services, might develop, pursue and evaluate activities or initiatives relevant to media literacy in relation to regulated services.(1E) OFCOM must publish the statement and any revised statement in such manner as they consider appropriate for bringing it to the attention of the persons who, in their opinion, are likely to be affected by it.”(4) After subsection (2) insert— “(3) In this section and in section 11A,“regulated service” means—(a) a regulated user-to-user service, or(b) a regulated search service.“Regulated user-to-user service” and “regulated search service” have the same meaning 
as in the Online Safety Act 2023 (see section 3 of that Act).(4) In this section—(a) “content”, in relation to regulated services, means regulated user-generated content, search content or fraudulent advertisements;(b) the following terms have the same meaning as in the Online Safety Act 2023—“content of democratic importance” (see section 13 of that Act);“fraudulent advertisement” (see sections 33 and 34 of that Act);“harm” (see section 209 of that Act) (and “harmful” is to be interpreted consistently with that section);“provider”(see section 202 of that Act);“regulated user-generated content” (see section 49 of that Act);“search content” (see section 51 of that Act).”(5) In the heading, for “Duty” substitute “Duties”.(6) In section 14 of the Communications Act (consumer research), in subsection (6)(a), after “11(1)” insert “, (1A) and (1B)”.”Member’s explanatory statement
This amendment inserts provisions into section 11 of the Communications Act 2003 (OFCOM’s duties to promote media literacy). The new provisions expand on the existing duties so far as they relate to regulated user-to-user and search services, and impose new duties on OFCOM aimed at enhancing users’ media literacy.
Lord Parkinson of Whitley Bay (Con)

I beg to move Amendment 274B.

Amendments 274BA and 274BB (to Amendment 274B) not moved.
Amendment 274B agreed.
Amendment 274C
Moved by
274C: After Clause 149, insert the following new Clause—
“Media literacy strategy and media literacy statement
After section 11 of the Communications Act insert—“11A Regulated services: media literacy strategy and media literacy statement(1) OFCOM must prepare and publish a media literacy strategy within the period of one year beginning with the day on which the Online Safety Act 2023 is passed.(2) A media literacy strategy is a plan setting out how OFCOM propose to exercise their functions under section 11 in the period covered by the plan, which must be not more than three years.(3) In particular, a media literacy strategy must state OFCOM’s objectives and priorities for the period it covers.(4) Before the end of the period covered by a media literacy strategy, OFCOM must prepare and publish a media literacy strategy for a further period, ensuring that each successive strategy covers a period beginning immediately after the end of the last one. (5) In preparing or revising a media literacy strategy, OFCOM must consult such persons as they consider appropriate.(6) OFCOM’s annual report must contain a media literacy statement.(7) A media literacy statement is a statement by OFCOM—(a) summarising what they have done in the financial year to which the report relates in the exercise of their functions under section 11, and(b) assessing what progress has been made towards achieving the objectives and priorities set out in their media literacy strategy in that year.(8) A media literacy statement must include a summary and an evaluation of the activities and initiatives pursued or commissioned by OFCOM in the exercise of their functions under section 11 in the financial year to which the report relates.(9) The first annual report that is required to contain a media literacy statement is the report for the financial year during which OFCOM’s first media literacy strategy is published, and that first statement is to relate to the period from publication day until the end of that financial year.(10) But if OFCOM’s first media literacy strategy is published during the second half of a financial year—(a) the first annual report that is required to contain a media literacy statement is the report for the next financial year, and(b) that first statement is to relate to the period from publication day until the end of that financial year.(11) References in this section to OFCOM’s functions under section 11 are to those functions so far as they relate to regulated services.(12) In this section—“annual report” means OFCOM’s annual report under paragraph 12 of the Schedule to the Office of Communications Act 2002;“financial year” means a year ending with 31 March.””Member’s explanatory statement
This amendment requires OFCOM to produce a media literacy strategy every three years (or more frequently), and to include, in their annual report, a statement summarising and evaluating their media literacy activities, so far as they relate to regulated services, during the year.
Amendment 274C agreed.
Amendments 275 and 275A not moved.
Clause 202: “Provider” of internet service
Amendment 276
Moved by
276: Clause 202, page 171, line 2, at end insert—
“(15) For the purposes of subsections (8) and (9), a person who makes available on a service an automated tool or algorithm by means of which content is generated is to be regarded as having control over content so generated.”Member’s explanatory statement
This amendment is about who counts as the provider of a service (other than a user-to-user or search service) that hosts provider pornographic content for the purposes of the Bill. The amendment makes it clear that a person who controls a generative tool on the service, such as a generative AI bot, is regarded as controlling the content generated by that tool.
Amendment 276 agreed.
Amendment 277
Moved by
277: After Clause 205, insert the following new Clause—
““Age verification” and “age estimation”
(1) This section applies for the purposes of this Act.(2) “Age verification” means any measure designed to verify the exact age of users of a regulated service.(3) “Age estimation” means any measure designed to estimate the age or age-range of users of a regulated service.(4) A measure which requires a user to self-declare their age (without more) is not to be regarded as age verification or age estimation.”Member’s explanatory statement
This new Clause defines age verification and age estimation, and makes it clear that mere self-declaration of age does not count as either.
Amendment 277 agreed.
Clause 206: “Proactive technology”
Amendments 278 to 280
Moved by
278: Clause 206, page 172, line 34, leave out “assessing or establishing” and insert “verifying or estimating”
Member’s explanatory statement
This amendment is made to ensure consistency of language in the Bill when referring to age verification and age estimation.
279: Clause 206, page 173, line 11, at end insert—
“(c) in relation to an internet service within section 71(2), content that is provider pornographic content in relation to the service.”Member’s explanatory statement
This amendment is about what counts as “relevant content” for the purposes of defining “proactive technology” for the purposes of the Bill. The effect is for provider pornographic content to now be included.
280: Clause 206, page 173, line 15, leave out “Part 3” and insert “regulated”
Member’s explanatory statement
This amendment revises the definition of “user data” for the purposes of defining “proactive technology” for the purposes of the Bill. The effect is for user data to now include data created etc by providers of all services regulated by the Bill (including providers subject to the Part 5 pornography duties).
Amendments 278 to 280 agreed.
Clause 208: “Functionality”
Amendments 281 to 281B not moved.
Amendment 281BA
Moved by
281BA: Clause 208, page 175, line 5, at end insert—
“(3A) In this Act “functionality”, in relation to a regulated service, includes the design of systems and processes that engage or impact on users, particularly algorithms.”Member’s explanatory statement
This amendment clarifies that system design can have an impact on outcomes for users, in light of the requirement for systems to be safe by design.
Baroness Harding of Winscombe (Con)

My Lords, I note that the noble Lord, Lord Stevenson, is no longer in his place, but I promise to still try to live by his admonition to all of us to speak briefly.

I will speak to Amendments 281BA, 281FA, 286A and 281F, which has already been debated but is central to this issue. These amendments aim to fix a problem we repeatedly raised in Committee and on Report. They are also in the name of the noble Baroness, Lady Kidron, and the noble Lords, Lord Stevenson and Lord Clement-Jones, and build on amendments in Committee laid by the noble Lord, Lord Russell, my noble friend Lord Bethell and the right reverend Prelate the Bishop of Oxford. This issue has broad support across the whole House.

The problem these amendments seek to solve is that, while the Government have consistently asserted that this is a systems and processes Bill, the Bill is constructed in a manner that focuses on content. Because this is a rerun of previous debates, I will try to keep my remarks short, but I want to be clear about why this is a real issue.

I am expecting my noble friend the Minister to say, as he has done before, that this is all covered; we are just seeing shadows, we are reading the Bill wrong and the harms that we are most concerned about are genuinely in the Bill. But I really struggle to understand why, if they are in the Bill, stating them clearly on the face of the Bill creates the legal uncertainty that seems to be the Government’s favourite problem with each of the amendments we have been raising today.

My noble friend—sorry, my friend—the noble Baroness, Lady Kidron, commissioned a legal opinion that looked at the statements from the Government and compared them to the text of the Bill. That opinion, like that of the many noble Lords I have just mentioned, is that the current language in the Bill about features and functionalities pertains only in so far as it relates to harmful content. All roads in this game of Mornington Crescent lead back to content.

Harmful content is set out in a schedule to the Bill, and this set of amendments ensures that the design of services, irrespective of content, is required to be safe by design. If the Government were correct in their assertion that this is already covered, then these amendments really should not pose any threat at all, and I have yet to hear the Government enunciate what the real legal uncertainty actually is in stating that harm can come from functionality, not just from content.

20:15
Secondly, I think that, in the process of this Report stage, we may have risked making the Bill worse, not better, by creating real confusion. At the front of the Bill, the new purposive Clause 1 sets out really clearly that regulated companies have to be safe by design. But when you actually work your way through the Bill, unfortunately, at each point it refers back only to content; it does not refer back to the functionality that can in and of itself be harmful. There is no reference to functionality in the risk assessments, the child safety duties or the definitions of harm—no reference to the systems and processes that may in themselves be designed well or badly, irrespective of content.
I note that both the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, on our last day of Report, made reference to the work of Professor Bowden-Jones, which included the spectre of children so addicted to the reward loops of games and other media that they needed to attend a gambling or gaming clinic, and of a child who left their house in the dead of night to access wifi from a random hotspot when their desperate parents had switched off the home wifi.
As we have said several times before, it is not an accident that these platforms do this; it is a direct result of business models that encourage dwell time, which in turn drives addiction, irrespective of content. It is very well established by psychiatrists, many of whom have already been quoted, including, for example, Dr Norman Doidge, a pre-eminent Canadian psychiatrist, that the plasticity of the brain in the developing teenager makes them particularly susceptible to this sort of addiction. Once caught in that loop as a teenager, it stays with them for life. So this is a very real and present risk to today’s teenagers, to—unfortunately—the last decade’s teenagers and to all teenagers as we look ahead. It is why, together with my co-signatories, we felt that we had to keep pressing this.
Thirdly, both in Committee and on Report, we kept asking the Government to be more mindful of the future. This morning, like many other noble Lords I am sure, I woke up to the dulcet tones of my friend, the noble Baroness, Lady Kidron, talking about Meta’s overnight announcement about its large language model. I fear that it would be extraordinary hubris to keep insisting that content on its own is going to be the defining harm that our children face going forward. We simply do not know, and it is so important that we leave open the possibility for functionality that we cannot even imagine today to be launched tomorrow, let alone in five or 10 years.
It is a huge mistake not to make sure that this Bill captures non-content harm and functionality irrespective of any form of content. Because of the lateness of the hour and the urgency of catching trains to get home before the train strike, I will not go through what each of the amendments does, save to say that they introduce specific elements in the back half of the Bill to ensure that non-content harm is captured. We are in a bit of a mess in this Bill, because the front half of the Bill now does include non-content harms, in both Amendment 35 and Amendment 240. So we do need to make sure that in the end we produce a Bill that is internally consistent and genuinely captures the purpose set out in the new Clause 1 all the way through the Bill.
I would like to ask my noble friend a couple of questions. First, what is the legal uncertainty that I am fully expecting him to set out, which he is so worried about and which is the reason why he cannot accept the amendments? There is a charitable interpretation, which is that we are all worried about creating legal uncertainty. My co-signatories and I are worried about the legal uncertainty we are creating by not naming functionality as harm. If I am being charitable, I think the Government are worried—and this is what I do not understand—that by naming these non-content harms, we somehow create a new loophole that would enable a platform to continue to cause harm and Ofcom not to be able to regulate. I hope we are united in trying to stop that and, if so, I really hope that my noble friend can offer an explanation. This is not for want of us having had many conversations about this, and we may need to have many more. I hope that that charitable interpretation is right: that we are all trying to do the same thing but do not really understand how this complex Bill works.
There is, unfortunately, a less charitable interpretation, which would lead one to worry that the Bill is actually just about content. I ask my noble friend to confirm that this is not just a content Bill. One of the things that most scared me was Ofcom’s insistence in front of the Communications and Digital Select Committee last week that, if the amendments were allowed, it would create a huge amount of additional work for it. I note that the Government have been briefing that today: that the amendments would lead to substantial delay because of the extra work Ofcom would need to do. That makes me worried that Ofcom has not properly thought about the consequences of non-content harm—harm generated by functionality—if it really will take it so long. That is the much less charitable interpretation of why I am expecting my noble friend to reject the amendments. I should like to understand those two questions: what is really the legal uncertainty that the Government are worried about; and why, if this is all covered, would it take so long?
I am channelling my friend the noble Baroness, Lady Kidron, here, but this is such an important part of protecting our children that if it really is going to take some extra months to prepare to do it properly, we should be willing to do that. We have a few months ahead of us over the summer holidays, and we know that Ofcom has done a brilliant job in getting ahead of the legislation. If the problem is simply that there might be some extra work—provided that really is the reason, rather than the Government not wanting it to be anything other than a content Bill—we should accept that it will take a bit of time. I should like to understand the answer to that.
It is late, and it has been a long Report stage. I will listen very carefully to what my noble friend the Minister has to say. I really hope that the Bill can continue to progress in this collaborative way. It would be an awful shame if, at the end of a long Report stage, we did not recognise that we are trying to solve the same problem and find a way through. I beg to move.
Baroness Morgan of Cotes (Con)

My Lords, the hour is late and I will not detain the House for long. However, I hope that the fact that we are all still sitting here at the end of a long Report stage, because we care very much about the Bill and what we are trying to achieve, will be noted by my noble friend the Minister, his officials and others who are watching. I thank my noble friend Lady Harding for so ably introducing the amendments, which I absolutely support. I was, perhaps for the first time, going to agree with something the noble Baroness, Lady Fox, said a day or so ago: that one thing we and Ofcom need to do much better is to understand the transparency of the algorithms. It is not just algorithms—this is where my knowledge ends—but other design features that make these sites addictive and harmful, and which are outside content. The Bill will not be capable of addressing even the next five years, let alone beyond that, if we do not reflect the fact that, as my noble friend Lady Harding said, it has already been amended so that one way its objectives are to be achieved is by services being required to focus on safety by design.

I hope very much that my noble friend will take up the invitation, because everybody is tired and has been looking at this Bill for so many hours and months that we are probably all word-blind. We could all do with standing back and thinking, “With the amendments made, how does it all hang together so that ultimately, we keep those we want to keep safe as safe as we possibly can?” On that basis, I support these amendments and look forward to hearing further from the Government about how they hope to keep safe those we all wish to keep safe.

Baroness Benjamin (LD)

My Lords, I rise to support the amendment in the name of the noble Baroness, Lady Kidron. She has been such a forceful voice throughout the passage of this Bill, driven by her passion to protect children, and never more so than with the amendment in her name. That is why I feel compelled to speak up to support her. So far, we have all worked with the Government to see the safe passage of the Online Safety Bill, with strong protections for children. These amendments would be yet another excellent and unique opportunity to protect children. This is what we have been fighting for for years, and it is so uplifting that the Government have listened to us throughout the passage of this Bill—so why stop now? If the Government are saying that the Bill is clear about harms, they should have no objection to making that explicit.

These amendments press for safety by design to be embedded in later clauses of the Bill and go hand in hand with the earlier amendment that the House so clearly supported. It is clear that the design of services and algorithms is responsible for orchestrating and manipulating the behaviour, feelings, emotions and thoughts of children who, because they are at a vulnerable stage in their development, are easily influenced. We have all witnessed the disastrous impact of the new technology which is fast encroaching upon us, and our children will not be spared from it. So it is imperative that Ofcom have the tools with which to consider and interrogate system design separately from content because, as has been said, it is not only content that is harmful: design is too. We therefore need to take a holistic approach and leave nowhere to hide for the tech companies when it comes to harms affecting our children.

As I have said before, these amendments would send a loud and clear message to the industry that it is responsible for the design of its products and has to think of the consequences for our children’s mental health and well-being when considering design. What better way to do that than for the Government to accept these amendments, in order to show that they are on the side of our children, not the global tech companies, when it comes to protecting them from harm? They need to put measures in place to ensure that the way a service is designed is subject to the online safety regime we have all fought for over the years and during the passage of this Bill.

If the Government do not accept the amendment, perhaps the issue of harmful design could be included in the welcome proposed review of pornography. It would be good to hear the Minister’s thoughts on this idea—but I am not giving him a let-off. I hope he will listen to the strength of feeling and that the Government will reconsider their position, support the amendment and complete the one main task they set out to complete with this Bill, which is to protect children from harm no matter where it rears its ugly head online.

Baroness Fraser of Craigmaddie (Con)

My Lords, I rise briefly to support my noble friend Lady Harding and to associate myself with everything she has just said. It strikes me that if we do not acknowledge that there is harm from functionality, not just content, we are not looking to the future, because addressing functionality protects vulnerable people before the harm has happened; addressing content relies on us having to take it down afterwards. I want to stress that algorithms and functionality disproportionately harm not just vulnerable children but vulnerable adults as well. I do not understand why, since we agreed to safety by design at the beginning of the Bill, it is not running throughout it, rather than appearing just in the introduction. I want to lend my support to these amendments this evening.

20:30
Lord Clement-Jones (LD)

My Lords, I can be very brief. My noble friend Lady Benjamin and the noble Baronesses, Lady Harding, Lady Morgan and Lady Fraser, have all very eloquently described why these amendments in this group are needed.

It is ironic that we are still having this debate right at the end of Report. It has been a running theme throughout the passage of the Bill, both in Committee and on Report, and of course it ran right through our Joint Committee work. It is the whole question of safety by design, harm from functionalities and, as the noble Baroness, Lady Morgan, said, understanding the operation of the algorithm. And there is still the question: does the Bill adequately cover what we are trying to achieve?

As the noble Baroness, Lady Harding, said, Clause 1 now does set out the requirement for safety by design. So, in the spirit of amity, I suggested to the Minister that he might run a check on the Bill during his free time over the next few weeks to make sure that it really does cover it. But, in a sense, there is a serious point here. Before Third Reading there is a real opportunity to run a slide rule over the Bill to see whether the present wording really is fit for purpose. So many of us around this House who have lived and breathed this Bill do not believe that it yet is. The exhortation by the ethereal presences of the noble Baronesses, Lady Kidron and Lady Harding, to keep pressing to make sure that the Bill is future-proofed and contains the right ingredients is absolutely right.

I very much hope that once again the Minister will go through the hoops and explain whether this Bill really captures functionality and design and not just content, and whether it adequately covers the points set out in the purpose of the Bill which is now there.

Baroness Merron (Lab)

My Lords, as we have heard, the noble Baroness, Lady Harding, made a very clear case in support of these amendments, tabled in the name of the noble Baroness, Lady Kidron, and supported by noble Lords from across the House. The noble Baroness, Lady Morgan, gave wise counsel to the Minister, as did the noble Lord, Lord Clement-Jones, that it is worth stepping back and seeing where we are in order to ensure that the Bill is in the right place. I urge the Minister to find the time and the energy that I know he has—he certainly has the energy and I am sure he will match it with the time—to speak to noble Lords over the coming Recess to agree a way to incorporate systems and functionality into the Bill, for all the reasons we have heard.

On Monday, my noble friend Lord Knight spoke of the need for a review about loot boxes and video games. When we checked Hansard, we saw that the Minister had promised that such a review would be offered in the coming months. In an unusual turn of events, the Minister beat that timescale. We did not have to hear the words “shortly”, “in the summer” or “spring” or anything like that, because it was announced the very next day that the department would keep legislative options under review.

I make that point simply to thank the Minister for the immediate response to my noble friend Lord Knight. But, if we are to have such a review, does this not point very much to the fact that functionality and systems should be included in the Bill? The Minister has a very nice hook to hang this on and I hope that he will do so.

Lord Parkinson of Whitley Bay (Con)

My Lords, this is not just a content Bill. The Government have always been clear that the way in which a service is designed and operated, including its features and functionalities, can have a significant impact on the risk of harm to a user. That is why the Bill already explicitly requires providers to ensure their services are safe by design and to address the risks that arise from features and functionalities.

The Government have recognised the concerns which noble Lords have voiced throughout our scrutiny of the Bill, and those which predated the scrutiny of it. We have tabled a number of amendments to make it even more explicit that these elements are covered by the Bill. We have tabled the new introductory Clause 1, which makes it clear that duties on providers are aimed at ensuring that services are safe by design. It also highlights that obligations on services extend to the design and operation of the service. These obligations ensure that the consideration of risks associated with the business model of a service is a fundamental aspect of the Bill.

My noble friend Baroness Harding of Winscombe worried that we had made the Bill worse by adding this. The new clause was a collaborative one, which we have inserted while the Bill has been before your Lordships’ House. Let me reassure her and other noble Lords as we conclude Report that we have not made it worse by so doing. The Bill will require services to take a safety by design approach to the design and operation of their services. We have always been clear that this will be crucial to compliance with the legislation. The new introductory Clause 1 makes this explicit as an overarching objective of the Bill. The introductory clause does not introduce any new concepts; it is an accurate summary of the key provisions and objectives of the Bill and, to that end, the framework and introductory statement are entirely compatible.

We also tabled amendments—which we debated last Monday—to Clause 209. These make it clear that functionalities contribute to the risk of harm to users, and that combinations of functionality may cumulatively drive up the level of risk. Amendment 281BA would amend the meaning of “functionality” within the Bill, so that it includes any system or process which affects users. This presents a number of concerns. First, such a broad interpretation would mean that any service in scope of the Bill would need to consider the risk of any feature or functionality, including ones that are positive for users’ online experience. That could include, for example, processes designed for optimising the interface depending on the user’s device and language settings. The amendment would increase the burden on service providers under the existing illegal content and child safety duties and would dilute their focus on genuinely risky functionality and design.

Second, by duplicating the reference to systems, processes and algorithms elsewhere in the Bill, it implies that the existing references in the Bill to the design of a service or to algorithms must be intended to capture matters not covered by the proposed new definition of “functionality”. This would suggest that references to systems and processes, and algorithms, mentioned elsewhere in the Bill, cover only systems, processes or algorithms which do not have an impact on users. That risks undermining the effectiveness of the existing duties and the protections for users, including children.

Amendment 268A introduces a further interpretation of features and functionality in the general interpretation clause. This duplicates the overarching interpretation of functionality in Clause 208 and, in so doing, introduces legal and regulatory uncertainty, which in turn risks weakening the existing duties. I hope that sets out for my noble friend Lady Harding and others our legal concerns here.

Amendment 281FA seeks to add to the interpretation of harm in Clause 209 by clarifying the scenarios in which harm may arise, specifically from services, systems and processes. This has a number of concerning effects. First, it states that harm can arise solely from a system or process, but a design choice does not in isolation harm a user. For example, the decision to use algorithms, or even the algorithm itself, is not what causes harm to a user—it is the fact that harmful content may be pushed to a user, or that content may be pushed in a manner that is harmful, for example repeatedly and in volume. That is already addressed comprehensively in the Bill, including in the child safety risk assessment duties.

Secondly, noble Lords should be aware that the drafting of the amendment has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—

Baroness Harding of Winscombe (Con)

Can I just double-check what my noble friend has just said? I was lulled into a possibly false sense of security until we got to the point where he said “harmful” and then the dreaded word “content”. Does he accept that there can be harm without there needing to be content?

Lord Parkinson of Whitley Bay (Con)

This is the philosophical question on which we still disagree. Features and functionality can be harmful but, to manifest that harm, there must be some content which they, through that feature or functionality, present to the user. We therefore keep talking about content, even when we are talking about features and functionality. A feature on its own which has no content is not what the noble Baroness, Lady Kidron, my noble friend Lady Harding and others are envisaging but, to follow the logic of the point they are making, it requires some content for the feature or functionality to cause its harm.

Lord Knight of Weymouth (Lab)

But the content may not be harmful.

Lord Parkinson of Whitley Bay (Con)

Yes, even if the content is not harmful. We keep saying “content” because it is the way the content is disseminated, as the Bill sets out, but the features and functionalities can increase the risks of harm as well. We have addressed this through looking at the cumulative effects and in other ways.

Lord Allan of Hallam (LD)

This is the key question. For example, let us take a feature that is pushing something at you constantly; if it was pushing poison at you then it would obviously be harmful, but if it was pushing marshmallows then they would be singularly not harmful but cumulatively harmful. Is the Minister saying that the second scenario is still a problem and that the surfeit of marshmallows is problematic and will still be captured, even if each individual marshmallow is not harmful?

Lord Parkinson of Whitley Bay (Con)

Yes, because the cumulative harm—the accumulation of marshmallows in that example—has been addressed.

Noble Lords should also be aware that the drafting of Amendment 281FA has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—for example, from the

“age or characteristics of the likely user group”.

In effect, being a child or possessing a particular characteristic may be harmful. This may not be the intention of the noble Baronesses who tabled the amendment, but it highlights the important distinction between something being a risk factor that influences the risk of harm occurring and something being harmful.

The Government are clear that these aspects should properly be treated as risk factors. Other parts of the Bill already make it clear that the ways in which a service is designed and used may impact on the risk of harm suffered by users. I point again to paragraphs (e) to (h) of Clause 10(6); paragraph (e) talks about the level of risk of functionalities of the service, paragraph (f) talks about the different ways in which the service is used, and so on.

We have addressed these points in the Bill, though clearly not to the satisfaction of my noble friend, the noble Baroness, Lady Kidron, and others. As we conclude Report, I recognise that we have not yet convinced everyone that our approach achieves what we all seek, though I am grateful for my noble friend’s recognition that we all share the same aim in this endeavour. As I explained to the noble Baroness, Lady Kidron, on her Amendment 35, I was asking her not to press it because, if she did, the matter would have been dealt with on Report and we would not be able to return to it at Third Reading.

As the Bill heads towards another place with this philosophical disagreement still bubbling away, I am very happy to commit to continuing to talk to your Lordships—particularly when the Bill is in another place, so that noble Lords can follow the debates there. I am conscious that my right honourable friend Michelle Donelan, who has had a busy maternity leave and has spoken to a number of your Lordships while on leave, returns tomorrow in preparation for the Bill heading to her House. I am sure she will be very happy to speak even more when she is back fully at work, but we will both be happy to continue to do so.

I think it is appropriate, in some ways, that we end on this issue, which remains an area of difference. With that promise to continue these discussions as the Bill moves towards another place, I hope that my noble friend will be content not to press these amendments, recognising particularly that the noble Baroness, Lady Kidron, has already inserted this thinking into the Bill for consideration in the other House.

20:45
Baroness Harding of Winscombe (Con)

My Lords, being the understudy for the noble Baroness, Lady Kidron, is quite a stressful thing. I am, however, reliably informed that she is currently offline in the White House, but I know that she will scrutinise everything I say afterwards and that I will receive a detailed school report tomorrow.

I am extremely grateful to my noble friend the Minister for how he has just summed up, but I would point out two things in response. The first is the circularity of the legal uncertainty. What I think I have heard is that we are trying to insert into the Bill some clarity because we do not think it is clear, but the Government’s concern is that by inserting clarity, we then imply that there was not clarity in the rest of the Bill, which then creates the legal uncertainty—and round we go. I am not convinced that we have really solved that problem, but I may be one step further towards understanding why the Government think that it is a problem. I think we have to keep exploring that and properly bottom it out.

My second point is about what I think will for evermore be known as the marshmallow problem. We have just rehearsed across the House a really heartfelt concern that just because we cannot imagine it today, it does not mean that there will not be functionality that causes enormous harm which does not link back to a marshmallow, multiple marshmallows or any other form of content.

Those two big issues are the ones we need to keep discussing: what is really causing the legal uncertainty and how we can be confident that unimaginable harms from unimaginable functionality are genuinely going to be captured in the Bill. Provided that we can continue, maybe it is entirely fitting at the end of what I think has been an extraordinarily collaborative Report, Committee and whole process of the Bill going through this House—which I have felt incredibly proud and privileged to be a part of—that we end with a commitment to continue said collaborative process. With that, I beg leave to withdraw the amendment.

Amendment 281BA withdrawn.
Clause 209: “Harm” etc
Amendments 281C to 281E
Moved by
281C: Clause 209, page 175, line 17, leave out from “dissemination” to end of line 18
Member’s explanatory statement
This amendment is consequential on the next amendment to this Clause in my name.
281D: Clause 209, page 175, line 18, at end insert—
“(3A) References to harm presented by content, and any other references to harm in relation to content, include references to cumulative harm arising or that may arise in the following circumstances—(a) where content, or content of a particular kind, is repeatedly encountered by an individual (including, but not limited to, where content, or a kind of content, is sent to an individual by one user or by different users or encountered as a result of algorithms used by, or functionalities of, a service);(b) where content of a particular kind is encountered by an individual in combination with content of a different kind (including, but not limited to, where a kind of content is sent to an individual by one user or by different users or encountered as a result of algorithms used by, or functionalities of, a service).”Member’s explanatory statement
This amendment makes clear that references to harm presented by content include cumulative harm that arises or that may arise in the circumstances mentioned and, in particular, covers the case where this occurs as a result of algorithms used by, or functionalities of, a service.
281E: Clause 209, page 175, line 29, at end insert—
“(4A) References to a risk of harm in relation to functionalities, and references to the risk of functionalities facilitating users encountering particular kinds of content (however expressed), include references to risks arising or that may arise due to multiple functionalities which, used in combination, increase the likelihood of harm arising (for example, as mentioned in subsection (3A)).”
Member’s explanatory statement
This amendment makes clear that references to a risk of harm in relation to functionalities and references to the risk of functionalities facilitating users encountering particular kinds of content include references to risks from a combination of those functionalities.
Amendments 281C to 281E agreed.
Amendments 281F and 281FA not moved.
Amendment 281G
Moved by
281G: Clause 209, page 175, line 33, leave out “and (4)” and insert “to (4)”
Member’s explanatory statement
This amendment is consequential on the amendment in my name inserting new subsection (3A) into this Clause.
Amendment 281G agreed.
Clause 210: “Online safety functions” and “online safety matters”
Amendments 281H to 283
Moved by
281H: Clause 210, page 176, line 12, leave out “section 11 (duty” and insert “sections 11 and 11A (duties”
Member’s explanatory statement
This amendment provides that the term “online safety functions” includes OFCOM’s functions under section 11A of the Communications Act 2003 (inserted by the new Clause proposed to be inserted after Clause 149 in my name) regarding OFCOM’s media literacy strategy (as well as OFCOM’s functions under section 11 of that Act).
282: Clause 210, page 176, line 21, at end insert—
“(2A) References to OFCOM’s “online safety functions” also include references to OFCOM’s duty to comply with any of the following, so far as relating to the use of a regulated service by a person who has died—
(a) a notice from a senior coroner under paragraph 1(2) of Schedule 5 to the Coroners and Justice Act 2009 in connection with an investigation into a person’s death;
(b) a request for information in connection with the investigation of a procurator fiscal into, or an inquiry held or to be held in relation to, a person’s death;
(c) a notice from a coroner under section 17A(2) of the Coroners Act (Northern Ireland) 1959 (c. 15 (N.I.)) in connection with—
(i) an investigation to determine whether an inquest into a person’s death is necessary, or
(ii) an inquest in relation to a person’s death.”
Member’s explanatory statement
This amendment makes it clear that OFCOM’s online safety functions include the duty of complying with a coroner’s notice or, in Scotland, a request from the procurator fiscal, in connection with the use of a regulated service by a person who has died.
283: Clause 210, page 176, line 23, at end insert—
“(4) In subsection (2A)(b) “inquiry” means an inquiry held, or to be held, under the Inquiries into Fatal Accidents and Sudden Deaths etc. (Scotland) Act 2016 (asp 2).”
Member’s explanatory statement
This amendment defines a term used in the preceding amendment in my name.
Amendments 281H to 283 agreed.
Clause 211: Interpretation: general
Amendments 284 and 285
Moved by
284: Clause 211, page 176, leave out lines 27 and 28
Member’s explanatory statement
This amendment removes a definition of “age assurance” from Clause 211 as that term is now defined separately where used.
285: Clause 211, page 176, line 29, at end insert—
““automated tool” includes bot;”
Member’s explanatory statement
This amendment makes it clear that references in the Bill to automated tools include bots.
Amendments 284 and 285 agreed.
Amendment 286
Moved by
286: Clause 211, page 177, line 7, at end insert—
““freedom of expression”: any reference to freedom of expression (except in sections 36(6)(f) and 69(2)(d)) is to the freedom to receive and impart ideas, opinions or information (referred to in Article 10(1) of the Convention) by means of speech, writing or images;”
Member’s explanatory statement
This amendment inserts a definition of freedom of expression into the Bill.
Amendment 286 agreed.
Amendment 286A not moved.
Amendments 287 to 290
Moved by
287: Clause 211, page 177, line 10, after “91(1)” insert “or (Information in connection with an investigation into the death of a child)(1)”
Member’s explanatory statement
This amendment revises the definition of “information notice” so that it includes a notice under the new Clause proposed in my name concerning OFCOM’s power to obtain information in connection with an investigation into the death of a child.
288: Clause 211, page 177, line 31, at end insert—
““pornographic content” means content of such a nature that it is reasonable to assume that it was produced solely or principally for the purpose of sexual arousal;”
Member’s explanatory statement
This amendment adds a definition of “pornographic content” to Clause 211 of the Bill.
288A: Clause 211, page 178, line 3, at end insert—
“(2A) References in this Act to an individual with a certain characteristic include references to an individual with a combination of characteristics.”
Member’s explanatory statement
This amendment makes clear that references in the Bill to an individual with a certain characteristic include an individual with a combination of characteristics.
288B: Clause 211, page 178, line 9, leave out “description” and insert “kind”
Member’s explanatory statement
This amendment ensures consistency of language in referring to kinds of content.
288C: Clause 211, page 178, line 11, leave out “description” and insert “kind”
Member’s explanatory statement
This amendment ensures consistency of language in referring to kinds of content.
289: Clause 211, page 178, line 32, leave out from “of” to end of line 34 and insert “—
(a) software or an automated tool or algorithm applied by the provider of the service or by a person acting on behalf of the provider, or
(b) an automated tool or algorithm made available on the service by the provider or by a person acting on behalf of the provider.”
Member’s explanatory statement
This amendment revises an interpretative provision relating to the borderline between provider content and user-generated content. The provision is revised to use consistent wording about automated tools/algorithms made available by a provider (such as a generative AI bot), as used in the amendments of Clauses 49, 70 and paragraph 4 of Schedule 1 in my name.
290: Clause 211, page 178, line 36, leave out “(within the meaning of section 70(2))”
Member’s explanatory statement
This amendment is consequential on the amendment of this Clause in my name adding a definition of “pornographic content” to this Clause.
Amendments 287 to 290 agreed.
Clause 212: Index of defined terms
Amendments 291 to 293
Moved by
291: Clause 212, page 179, leave out line 3
Member’s explanatory statement
This amendment removes the entry for “age assurance” in the index of defined terms as that term is now defined separately where used.
292: Page 179, line 3, at end insert—
“age estimation
Section (“Age verification” and “age estimation”)
age verification
Section (“Age verification” and “age estimation”)”
Member’s explanatory statement
This amendment adds definitions of “age estimation” and “age verification” to the index of defined terms.
293: Page 179, line 4, at end insert—
“automated tool
Section 211”
Member’s explanatory statement
This amendment adds a definition of “automated tool” to the index of defined terms.
Amendments 291 to 293 agreed.
Amendment 294
Moved by
294: Page 179, line 22, at end insert—
“freedom of expression
section 211”
Member’s explanatory statement
This amendment adds a definition of “freedom of expression” to the index of defined terms.
Amendment 294 agreed.
Amendments 295 to 298
Moved by
295: Clause 212, page 180, line 17, leave out “(in Part 5)”
Member’s explanatory statement
This amendment updates the entry for pornographic content consequential on the amendment to Clause 211 which inserts a definition of that term into that Clause which applies for the purposes of the whole Bill.
296: Clause 212, page 180, line 17, leave out “70” and insert “211”
Member’s explanatory statement
This amendment updates the entry for pornographic content consequential on the amendment to Clause 211 inserting a definition of that term into that clause.
297: Clause 212, page 180, line 18, leave out “54” and insert “(“Primary priority content that is harmful to children”)”
Member’s explanatory statement
This amendment updates the entry for primary priority content that is harmful to children in the index of defined terms, consequential on the new Clause proposed to be inserted after Clause 54 in my name.
298: Clause 212, page 180, line 20, leave out “54” and insert “(“Priority content that is harmful to children”)”
Member’s explanatory statement
This amendment updates the entry for priority content that is harmful to children in the index of defined terms, consequential on the new Clause proposed to be inserted after Clause 54 in my name.
Amendments 295 to 298 agreed.
Clause 214: Extent
Amendment 299
Moved by
299: Clause 214, page 182, line 9, at end insert—
“(aa) section (Sharing or threatening to share intimate photograph or film);
(ab) section 171(2);
(ac) section (Repeals in connection with offences under section (Sharing or threatening to share intimate photograph or film));”
Member’s explanatory statement
This amendment revises the extent Clause so that the provisions mentioned extend to England and Wales only.
Amendment 299 agreed.
Clause 215: Commencement and transitional provision
Amendments 300 to 302
Moved by
300: Clause 215, page 182, line 37, leave out subsection (1)
Member’s explanatory statement
Clause 215(1) specifies which provisions of the Bill come into force on Royal Assent. This amendment omits subsection (1), but only because it is being moved further down in the section and replaced (see the amendment in my name below).
301: Clause 215, page 183, line 8, leave out “The other provisions of this Act come” and insert “Except as provided by subsection (4A), this Act comes”
Member’s explanatory statement
This technical amendment is needed because of the additions to the list of provisions which are to be commenced on Royal Assent (see the next amendment in my name).
302: Clause 215, page 183, line 14, at end insert—
“(4A) The following provisions come into force on the day on which this Act is passed—
(a) Parts 1 and 2;
(b) Chapter 1 of Part 3;
(c) section 36, except subsection (4) of that section;
(d) section 37 and Schedule 4;
(e) sections 38 to 43;
(f) section 47(2), (3) and (4);
(g) section 48, except subsection (2) of that section;
(h) Chapter 7 of Part 3 and Schedules 5, 6 and 7;
(i) section 63;
(j) section 67;
(k) section 70;
(l) section 71(4);
(m) section 73;
(n) sections 81 and 82;
(o) section 84;
(p) section 85 and Schedule 11;
(q) Chapter 3 of Part 7;
(r) section 118;
(s) section 140;
(t) section 143 so far as relating to a duty imposed on OFCOM under Schedule 11;
(u) section 174, except subsection (2)(b) of that section;
(v) section (Time for publishing first guidance under certain provisions of this Act);
(w) section 184(1);
(x) section 187;
(y) section 192;
(z) section 194;
(z1) section (Powers to amend sections (“Primary priority content that is harmful to children”) and (“Priority content that is harmful to children”));
(z2) sections 197 to 201;
(z3) this Part.”
Member’s explanatory statement
This amendment specifies the provisions of the Bill that come into force on Royal Assent.
Amendments 300 to 302 agreed.

Online Safety Bill

Third Reading
15:39
Relevant document: 40th Report from the Delegated Powers Committee. Scottish and Welsh Legislative Consent granted.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I will make a brief statement on the devolution status of the Bill. I am pleased to inform your Lordships’ House that both the Scottish Parliament and Senedd Cymru have voted to grant consent for all the relevant provisions. For Scotland, these provisions are the power to amend the list of exempt educational institutions, the power to amend the list of child sexual exploitation and abuse offences and the new offence of encouraging or assisting serious self-harm. For Wales, the provisions are the power to amend the list of exempt educational institutions, the false communications offence, the threatening communications offence, the flashing images offences and the offence of encouraging or assisting serious self-harm.

As noble Lords will be aware, because the Northern Ireland Assembly is adjourned the usual process for seeking legislative consent in relation to Northern Ireland has not been possible. In the absence of legislative consent from the Northern Ireland Assembly, officials from the relevant UK and Northern Ireland departments have worked together to ensure that the Bill considers and reflects the relevant aspects of devolved legislation so that we may extend the following provisions to Northern Ireland: the power to amend the list of exempt educational institutions, the false communications offence, the threatening communications offence and the offence of encouraging or assisting serious self-harm. His Majesty’s Government have received confirmation in writing from the relevant Permanent Secretaries in Northern Ireland that they are content that nothing has been identified which would cause any practical difficulty in terms of the existing policy and legislative landscape. Historically, this area of legislation in Northern Ireland has mirrored that in Great Britain, and we believe that legislating without the consent of the Northern Ireland Assembly is justified in these exceptional circumstances and mitigates the risk of leaving Northern Ireland without the benefit of the Bill’s important reforms and legislative parity.

We remain committed to ensuring sustained engagement on the Bill with all three devolved Administrations as it progresses through Parliament. I beg to move that the Bill be read a third time.

Clause 44: Secretary of State’s powers of direction

Amendment 1

Moved by
1: Clause 44, page 45, line 30, leave out from “must” to end of line 31 and insert “, as soon as reasonably practicable, be published and laid before Parliament.”
Member’s explanatory statement
This amendment provides that, in addition to publishing a direction under this Clause, the Secretary of State must also lay it before Parliament. Additionally the Secretary of State is required to do these things as soon as reasonably practicable. There is an exemption in certain circumstances (as to which see the next amendment to this Clause in my name).
Lord Parkinson of Whitley Bay (Con)

My Lords, His Majesty’s Government have listened carefully to the views expressed in Committee and on Report and have tabled amendments to the Bill to address concerns raised by noble Lords. Let me first again express my gratitude to my noble friend Lady Stowell of Beeston for her constructive engagement on the Secretary of State’s powers of direction. As I said during our previous debate on this topic, I am happy to support her Amendments 139 and 140 from Report. The Government are therefore bringing forward two amendments to that effect today.

Noble Lords will recall that, whenever directing Ofcom about a code, the Secretary of State must publish that direction. Amendment 1 means that, alongside this, in most cases a direction will now need to be laid before Parliament. There may be some cases where it is appropriate for the Secretary of State to withhold information from a laid direction: for example, if she thinks that publishing it would be against the interests of national security. In these cases, Amendment 2 will instead require the Secretary of State to lay a statement before Parliament setting out that a direction has been given, the kind of code to which the direction relates and the reasons for not publishing it. Taken together, these amendments will ensure that your Lordships and Members of another place are always made aware as soon as a direction has been made and, wherever possible, understand the contents of that direction. I hope noble Lords will agree that, after the series of debates we have had, we have reached a sensible and proportionate position on these clauses and one which satisfies your Lordships’ House.

I am also grateful to the noble Baroness, Lady Kennedy of The Shaws, for her determined and collaborative work on the issue of threatening communications. Following the commitment I made to her on Report, I have tabled an amendment to make it explicit that the threatening communications offence captures threats where the recipient fears that someone other than the person sending the message will carry out the threat. I want to make it clear that the threatening communications offence, like other existing offences related to threats, already captures threats that could be carried out by third parties. This amendment does not change the scope of the offence, but the Government understand the desire of the noble Baroness and others to make this explicit in the Bill, and I am grateful to her for her collaboration.

Regarding Ofcom’s power of remote access, I am grateful to noble Lords, Lord Knight of Weymouth and Lord Allan of Hallam, my noble friend Lord Moylan and the noble Baroness, Lady Fox of Buckley, who unavoidably cannot be with us today, for raising their concerns about the perceived breadth of the power and the desire for further safeguards to ensure that it is used appropriately by the regulator.

I am also grateful to technology companies for the constructive engagement they have had with officials over the summer. As I set out on Report, the intention of our policy is to ensure clarity about Ofcom’s ability to observe empirical tests, which are a standard method for understanding algorithms and consequently for assessing companies’ compliance with the duties in the Bill. They involve taking a test data set, running it through an algorithmic system and observing the output.

15:45
Under the Clause 101 information-gathering power before it was amended, Ofcom would clearly have been able to require providers to carry out such tests and then submit the requested information to it. However, it was not explicit that Ofcom could observe tests itself, which in many cases would be significantly more efficient. I am pleased to announce that, to ensure that the drafting meets the Government’s policy intention, and in recognition of these concerns, the Government have tabled amendments to change Ofcom’s power of “remote access” to a power to “view information remotely”. This clarifies that Ofcom cannot use the power to require companies to give access to its systems, addressing concerns which noble Lords raised that the power was too broad and could be used in a way that might create security risks.
Furthermore, we have tabled amendments which would limit the scope of this power so that, rather than being able to use it to view remotely any information necessary to carry out its online safety functions, Ofcom may view remotely only specific types of information in relation to the operation of systems, processes or features, including algorithms, or to observe tests or demonstrations remotely. We have also listened to the calls for additional safeguards and have tabled amendments which would ensure that the power to view information remotely could be exercised only by persons authorised by Ofcom. Moreover, Ofcom will be required to issue a seven-day notice before exercising this power.
These further protections and limitations are in addition to the existing safeguards in the Bill, which include Ofcom’s legal duty to exercise this power in a way that is proportionate, ensuring that undue burdens are not placed on businesses. The proportionality safeguard would extend to issues of security and privacy, as well as the duration of any tests. In observing algorithmic assessments, Ofcom would generally expect to require a service to use a test data set. There may be circumstances where Ofcom asks a service to execute a test using data it holds—for example, in testing how content moderation systems respond to certain types of content on a service as part of an assessment of the systems and processes. In this scenario, Ofcom may need to use a provider’s own test data set containing content which has previously violated its own terms of service. However, Ofcom can process users’ personal data only in a way compatible with UK data protection law and must take into account a platform’s own obligations under relevant data protection legislation. I hope that these amendments address the concerns noble Lords raised during our previous debate, while ensuring that Ofcom has the information-gathering powers it needs to regulate effectively—in particular, to hold providers to account for their use of algorithms.
The Government have also tabled a number of minor and technical amendments to improve the drafting of the Bill. These include an amendment to Clause 52(3), which is about Ofcom’s duties to produce guidance. This amendment updates a cross-reference in this clause. We are also making technical amendments to include the relevant information powers and offences in Clause 121, which is about the admissibility of statements in criminal proceedings, and we are making an amendment to Clause 162 which defines age assurance as
“age verification or age estimation”.
I beg to move.
Lord Rooker (Lab)

I am very surprised that the Minister’s speech did not accede to the recommendations from the Delegated Powers and Regulatory Reform Committee, published last week, in the report we made after we were forced to meet during the Recess because of the Government’s failure with this Bill. From his private office, we want answers to what is set out in paragraphs 6 and 7:

“We urge the Minister to take the opportunity during the remaining stages of the Bill”—


which is today—

“to explain to the House”—

I will not read out the rest because it is quite clear. There are two issues—Henry VIII powers and skeleton legislation—and we require the Minister to accede to this report from a committee of the House.

I think that every member of the committee was present at the meeting on 29 August, the day after the bank holiday. We were forced to do that because the Government published amendments to Clauses 216 and 217 on 5 July, but they did not provide a delegated powers memorandum until 17 July, the date they were debated in this House. That prevented a committee of the House being able to report to the House on the issue of delegated powers. We are not interested in policy; all we are looking at is the delegated powers. We agreed that one of us would be here—as it is not a policy issue—to seek that the Minister responds to the recommendations of this committee of the House. I am very surprised that he has not done that.

Baroness Stowell of Beeston (Con)

My Lords, I am very concerned to hear the contribution from the noble Lord, Lord Rooker. I certainly look forward to hearing what the Minister says in reply. I confess that I was not aware of the Delegated Powers and Regulatory Reform Committee’s report to which he referred, and I wish to make myself familiar with it. I hope that he gets a suitable response from the Minister when he comes to wind up.

I am very grateful to the Minister for the amendments he tabled to Clause 44—Amendments 1 and 2. As he said, they ensure that there is transparency in the way that the Secretary of State exercises her power to issue a direction to Ofcom over its codes of practice. I remind the House—I will not detain your Lordships for very long—that the Communications and Digital Select Committee, which I have the privilege to chair, was concerned with the original Clause 39 for three main reasons: first, as it stood, the Bill handed the Secretary of State unprecedented powers to direct the regulator on pretty much anything; secondly, those directions could be made without Parliament knowing; and, thirdly, the process of direction could involve a form of ping-pong between government and regulator that could go on indefinitely.

However, over the course of the Bill’s passage, and as a result of our debates, I am pleased to say that, taken as a package, the various amendments tabled by the Government—not just today but at earlier stages, including on Report—mean that our concerns have been met. The areas where the Secretary of State can issue a direction now follow the precedent set by the Communications Act 2003, and the test for issuing them is much higher. As of today, via these amendments, the directions must be published and laid before Parliament. That is critical and is what we asked for on Report. Also, via these amendments, if the Secretary of State has good reason not to publish—namely, if it could present a risk to national security—she will still be required to inform Parliament that the direction has been made and of the reasons for not publishing. Once the code is finalised and laid before Parliament for approval, Ofcom must publish what has changed as a result of the directions. I would have liked to have seen a further amendment limiting the number of exchanges, so that there is no danger of infinite ping-pong between government and regulator, but I am satisfied that, taken together, these amendments make the likelihood of that much lower, and the transparency we have achieved means that Parliament can intervene.

Finally, at the moment, the platforms and social media companies have a huge amount of unaccountable power. As I have said many times, for me, the Bill is about ensuring greater accountability to the public, but that cannot be achieved by simply shifting power from the platforms to a regulator. Proper accountability to the public means ensuring a proper balance of power between the corporations, the regulator, government and Parliament. The changes we have made to the Bill ensure the balance is now much better between government and the regulator. Where I still think we have work to do is on parliamentary oversight of the regulator, in which so much power is being invested. Parliamentary oversight is not a matter for legislation, but it is something we will need to return to. In the meantime, I once again thank the Minister and his officials for their engagement and for the amendments that have been made.

Baroness Ritchie of Downpatrick (Lab)

My Lords, I, too, thank the Minister for his engagement and for the amendments he has tabled at various stages throughout the passage of the Bill.

Amendment 15 provides a definition:

““age assurance” means age verification or age estimation”.

When the Minister winds up, could he provide details of the framework or timetable for its implementation? While we all respect that implementation must be delivered quickly, age verification provisions will be worthless unless there is swift enforcement action against those who transgress the Bill’s provisions. Will the Minister comment on enforcement and an implementation framework with direct reference to Amendment 15?

Lord Allan of Hallam (LD)

My Lords, as this is a new stage of the Bill, I need to refer again to my entry in the register of interests. I have no current financial interest in any of the regulated companies for which I used to work, in one of which I held a senior role for a decade.

I welcome Amendment 7 and those following from it which change the remote access provision. The change from “remote access” to “view remotely” is quite significant. I appreciate the Minister’s willingness to consider it and particularly the Bill team’s creativity in coming up with this new phrasing. It is much simpler and clearer than the phrasing we had before. We all understand what “view remotely” means. “Access” could have been argued over endlessly. I congratulate the Minister and the team for simplifying the Bill. It again demonstrates the value of some of the scrutiny we carried out on Report.

It is certainly rational to enable some form of viewing in some circumstances, not least where the operations of the regulated entities are outside the United Kingdom and where Ofcom has a legitimate interest in observing tests that are being carried out. The remote access, or the remote viewing facility as it now is, will mean it can do this without necessarily sending teams overseas. This is more efficient, as the Minister said. As this entire regime is going to be paid for by the regulated entities, they have an interest in finding cheaper and more efficient methods of carrying out the supervision than teams going from London to potentially lots of overseas destinations. Agreement between the provider and Ofcom that this form of remote viewing is the most efficient will be welcomed by everybody. It is certainly better than the other option of taking data off-site. I am glad to see that, through the provisions we have in place, we will minimise the instances where Ofcom feels it needs data from providers to be taken off-site to some other facility, which is where a lot of the privacy risks come from.

Can the Minister give some additional assurances at some stage either in his closing remarks or through any follow-up correspondence? First, the notion of proportionality is implicit, but it would help for it to be made explicit. Whenever Ofcom is using the information notices, it should always use the least intrusive method. Yes, it may need to view some tests remotely, but only where the information could not have been provided in written form, for example, or sent as a document. We should not immediately escalate to remote viewing if we have not tried less intrusive methods. I hope that notion of proportionality and least intrusion is implicit within it.

Secondly, concerns remain around live user data. I heard the Minister say that the intention is to use test data sets. That needs to be really clear. It is natural for people to be concerned that their live user data might be exposed to anyone, be it a regulator or otherwise. Of course, we expect Ofcom staff to behave with propriety, but there have sadly been instances where individuals have taken data that they have observed, whether they were working for the police, the NHS or any other entity, and abused it. The safest safeguard is for there to be no access to live user data. I hope the Minister will go as far as he can in saying that that is not the intention.

16:00
Thirdly, Ofcom should carry out some kind of privacy impact assessment before requiring access. Again, that is standard practice in data protection terms and is a helpful discipline. If somebody at Ofcom is thinking, “Look, I’d really like to view one of these tests remotely”, there should be some kind of internal process where someone says, “I’m just going to look at the privacy impact of that and, if there are concerns, I’m going to work through them”. Doing this before the test is better than finding out after the test that there was an issue; I speak from experience, having worked at a company that did all sorts of things that turned out to be serious mistakes from a privacy point of view. I do not want Ofcom to fall into the same trap.
Fourthly, I would like reassurance that these things will be time-limited. Again, this is not explicit in the Bill, but I hope the Minister will be able to say that the intention is that, when Ofcom asks to view things remotely, those are not going to be open-ended asks but will be a case of saying, “I want to view X remotely for this period of time”—a week, a month, whatever is required—and that there will not be continual viewing, which is where it potentially becomes problematic.
Finally, I want to make a suggestion in this area: that the Government encourage Ofcom, which will be the independent regulator once we have finished with this Bill, to maintain a public register of all the information notices that it issues—without sensitive information, obviously. The fact that Ofcom has sought access to, requested information from and been viewing data at a particular platform is a matter of public interest. It would provide huge reassurance to people in the United Kingdom using these services if they knew that any information requests will be made public and that there will be no secrecy involved in the process. That is my final request, particularly around remote viewing requests. Otherwise, people will create conspiracy theories around what remote viewing entails; the best way to prevent this is simply to have a register saying, “Look, if Ofcom asked company X for this kind of remote viewing, that will never be secret. There will always be an easy way for a citizen to find out that that happened”.
Having said that, we certainly welcome these changes. They are an improvement as a result of our debate and scrutiny on Report.
Baroness Kennedy of The Shaws (Lab)

My Lords, I, too, join noble Lords in thanking the Minister for the way in which he has addressed my concerns about aspects of the Bill and has wanted to enhance particularly the protection of women and girls from the kind of threats that they experience online. I really feel that the Minister has been exemplary in the way in which he has interacted with everyone in this House who has wanted to improve the Bill and has come to him with good will. He has listened and his team have been absolutely outstanding in the work that they have done. I express my gratitude to him.

Viscount Colville of Culross (CB)

My Lords, I, too, thank the Minister for the great improvements that the Government have made to the Secretary of State’s powers in the Bill during its passage through this House. I rise to speak briefly today to praise the Government’s new Amendments 1 and 2 to Clause 44. As a journalist, I was worried by the lack of transparency around these powers in the clause; I am glad that the lessons of Section 94 of the Telecommunications Act 1984, which had to be rescinded, have been learned. In a world of conspiracy theories that can be damaging to public trust and governmental and regulatory process, it has never been more important that Parliament and the public are informed about the actions of government when giving directions to Ofcom about the draft codes of practice. So I am glad that these new amendments resolve those concerns.

Baroness Morgan of Cotes (Con)

My Lords, I welcome Amendments 5 and 6, as well as the amendments that reflect the work done and comments made in earlier stages of this debate by the noble Baroness, Lady Kennedy. Of course, we are not quite there yet with this Bill, but we are well on the way as this is the Bill’s last formal stage in this Chamber before it goes back to the House of Commons.

Amendments 5 and 6 relate to the categorisation of platforms. I do not want to steal my noble friend’s thunder, but I echo the comments made about the engagement both from my noble friend the Minister and from the Secretary of State. I am delighted that the indications I have received are that they will accept the amendment to Schedule 11, which this House voted on just before the Recess; that is a significant and extremely welcome change.

When commentators outside talk about the work of a revising Chamber, I hope that this Bill will be used as a model for cross-party, non-partisan engagement in how we make a Bill as good as it possibly can be—particularly when it is as ground-breaking and novel as this one is. My noble friend the Minister said in a letter to all of us that this Bill had been strengthened in this Chamber, and I think that is absolutely right.

I also want to echo thanks to the Bill team, some of whom I was working with four years ago when we were talking about this Bill. They have stuck with the Bill through thick and thin. Also, I thank noble Lords across the House for their support for the amendments but also all of those outside this House who have committed such time, effort, support and expertise to making sure this Bill is as good as possible. I wish it well with its final stages. I think we all look forward to both Royal Assent and also the next big challenge, which is implementation.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for his introduction today and also for his letter which set out the reasons and the very welcome amendments that he has tabled today. First, I must congratulate the noble Baroness, Lady Stowell, for her persistence in pushing amendments of this kind to Clause 45, which will considerably increase the transparency of the Secretary of State’s directions if they are to take place. They are extremely welcome as amendments to Clause 45.

Of course, there is always a “but”—by the way, I am delighted that the Minister took the advice of the House and clearly spent his summer reading through the Bill in great detail, or we would not have seen these amendments, I am sure—but I am just sorry that he did not take the opportunity also to address Clause 176 in terms of the threshold for powers to direct Ofcom in special circumstances, and of course the rather burdensome powers in relation to the Secretary of State’s guidance on Ofcom’s exercise of its functions under the Bill as a whole. No doubt we will see how that works out in practice and whether they are going to be used on a frequent basis.

My noble friend Lord Allan—and I must congratulate both him and the noble Lord, Lord Knight, for their addressing this very important issue—has set out five assurances that he is seeking from the Minister. I very much hope that the Minister can give those today, if possible.

Congratulations are also due to the noble Baroness, Lady Kennedy, for finding a real loophole in the offence, which has now been amended. We are all delighted to see that the point has been well taken.

Finally, on the point raised by the noble Lord, Lord Rooker, clearly it is up to the Minister to respond to the points made by the committee. All of us would have preferred to see a comprehensive scheme in the primary legislation, but we are where we are. We wanted to see action on apps; they have some circumscribing within the terms of the Bill. The terms of the Bill—as we have discussed—particularly with the taking out of “legal but harmful”, do not give a huge amount of leeway, so this is not perhaps as skeleton a provision as one might otherwise have thought. Those are my reflections on what the committee has said.

Lord Knight of Weymouth (Lab)

My Lords, I do not know how everyone has spent their summer, but this feels a bit like we have been working on a mammoth jigsaw puzzle and we are now putting in the final pieces. At times, through the course of this Bill, it has felt like doing a puzzle in the metaverse, where we have been trying to control an unreliable avatar that is actually assembling the jigsaw—but that would be an unfair description of the Minister. He has done really well in reflecting on what we have said, influencing his ministerial colleagues in a masterclass of managing upwards, and coming up with reasonable resolutions to previously intractable issues.

We are trusting that some of the outcome of that work will be attended to in the Commons, as the noble Baroness, Lady Morgan, has said, particularly the issues that she raised on risk, that the noble Baroness, Lady Kidron, raised on children’s safety by design, and that my noble friend Lady Merron raised on animal cruelty. We are delighted at where we think these issues have got to.

For today, I am pleased that the concerns of the noble Baroness, Lady Stowell, on Secretary of State powers, which we supported, have been addressed. I also associate myself with her comments on parliamentary scrutiny of the work of the regulator. Equally, we are delighted that the Minister has answered the concerns of my noble friend Lady Kennedy and that he has secured the legislative consent orders which he informed us of at the outset today. We would be grateful if the Minister could write to us answering the points of my noble friend Lord Rooker, which were well made by him and by the Delegated Powers Committee.

I am especially pleased to see that the issues which we raised at Report on remote access have been addressed. I feel smug, as I had to press quite hard for the Minister to leave the door open to come back at this stage on this. I am delighted that he is now walking through the door. Like the noble Lord, Lord Allan, I have just a few things that I would like clarification on—the proportional use of the powers, Ofcom taking into account user privacy, especially regarding live user data, and that the duration of the powers be time-limited.

Finally, I thank parliamentarians on all sides for an exemplary team effort. With so much seemingly falling apart around us, it is encouraging that, when we have common purpose, we can achieve a lot, as we have with this Bill.

Lord Parkinson of Whitley Bay (Con)

My Lords, let me first address the points made by the noble Lord, Lord Rooker. I am afraid that, like my noble friend Lady Stowell of Beeston, I was not aware of the report of your Lordships’ committee. Unlike her, I should have been. I have checked with my private office and we have not received a letter from the committee, but I will ask them to contact the clerk to the committee immediately and will respond to this today. I am very sorry that this was not brought to my attention, particularly since the members of the committee met during the Recess to look at this issue. I have corresponded with my noble friend Lord McLoughlin, who chairs the committee, on each of its previous reports. Where we have disagreed, we have done so explicitly and set out our reasons. We have agreed with most of its previous recommendations. I am very sorry that I was not aware of this report and have not had the opportunity to provide answers for your Lordships’ House ahead of the debate.

Lord Rooker (Lab)

The report was published on 31 August. It so happens that the committee has been forced to meet in an emergency session tomorrow morning because of government amendments that have been tabled to the levelling-up Bill, which will be debated next Wednesday, that require a report on the delegated powers, so we will have the opportunity to see what the Minister has said. I am very grateful for his approach.

Lord Parkinson of Whitley Bay (Con)

The committee will have a reply from me before it meets tomorrow. Again, I apologise. It should not be up to the committee to let the Minister know; I ought to have known about it.

I am very grateful to noble Lords for their support of the amendments that we have tabled in this group, which reflect the collaborative nature of the work that we have done and the thought which has been put into this by my ministerial colleagues and me, and by the Bill team, over the summer. I will have a bit more to say on that when I move that the Bill do now pass in a moment, but I am very grateful to those noble Lords who have spoken at this stage for highlighting the model of collaborative working that the Bill has shown.

The noble Baroness, Lady Ritchie of Downpatrick, asked for an update on timetables. Some of the implementation timetables which Ofcom has assessed depend a little on issues which may still change when the Bill moves to another place. If she will permit it, once they have been resolved I will write with the latest assessments from Ofcom, and, if appropriate, from us, on the implementation timelines. They are being recalculated in the light of amendments that have been made to the Bill and which may yet further change. However, everybody shares the desire to implement the Bill as swiftly as possible, and I am grateful that your Lordships’ work has helped us do our scrutiny with that in mind.

The noble Lord, Lord Allan, asked some questions about the remote viewing power. On proportionality, Ofcom will have a legal duty to exercise its power to view information remotely in a way that is proportionate, ensuring, as I said, that undue burdens are not placed on businesses. In assessing proportionality in line with this requirement, Ofcom would need to consider the size and resource capacity of a service when choosing the most appropriate way of gathering information. To comply with this requirement, Ofcom would also need to consider whether there was a less onerous method of obtaining the necessary information.

On the points regarding that and intrusion, Ofcom expects to engage with providers as appropriate about how to obtain the information it needs to carry out its functions. Because of the requirement on Ofcom to exercise its information-gathering powers proportionately, it would need to consider less onerous methods. As I said, that might include an audit or a skilled persons report, but we anticipate that, for smaller services in particular, those options could be more burdensome than Ofcom remotely viewing information.

16:15
On live user data, Ofcom would generally expect to require a service to use a test dataset, as I said in opening this debate. Additionally, Ofcom can process users’ data only in a way that is compatible with UK data protection law, and the extent to which steps would require Ofcom to view personal data is also relevant to its proportionality assessment.
We agree with my noble friend Lady Stowell and the noble Lord, Lord Knight, that ongoing parliamentary scrutiny of the regime will be crucial in helping to reassure everybody that the Bill has done what we hope it will. The creation of the new Department for Science, Innovation and Technology means there is another departmental Select Committee in another place which will provide an enhanced opportunity for cross-party scrutiny of the new regime and digital regulation more broadly. Your Lordships’ Communications and Digital Committee will of course continue to play a vital role in the scrutiny in this House. As I set out at Report, to support this, the Government will ensure that the relevant committees in both Houses have every chance to play a part in government consultations by informing them when they are open. While we do not want the implementation process to be delayed, we will, where possible, share draft statutory instruments directly with the relevant committees before the formal laying process. That will be on a case-by-case basis, considering what is appropriate and reasonably practical. Of course, it will be up to the committees to decide how they wish to engage, but it will not create an additional approval process, to avoid delaying implementation.
A number of noble Lords mentioned press coverage about encryption, which I am aware of. Let me be clear: there is no intention by the Government to weaken the encryption technology used by platforms, and we have built strong safeguards into the Bill to ensure that users’ privacy is protected.
While the safety duties apply regardless of design, the Bill is clear that Ofcom cannot require companies to use proactive technology on private communications in order to comply with these duties. Ofcom can require the use of a technology by a private communication service only by issuing a notice to tackle child sexual exploitation and abuse content under Clause 122. A notice can be issued only where technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content. Ofcom is also required to comply with existing data protection legislation when issuing a notice under Clause 122 and, as a public body, is bound by the Human Rights Act 1998 and the European Convention on Human Rights.
When deciding whether to issue a notice, Ofcom will work closely with the service to help identify reasonable, technically feasible solutions to address child sexual exploitation and abuse risk, including drawing on evidence from a skilled persons report. If appropriate technology which meets these requirements does not exist, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a new solution. It is right that Ofcom should be able to require technology companies to use their considerable resources and expertise to develop the best possible protections for children in encrypted environments. That has been our long-standing policy position.
Our stance on tackling child sexual abuse online remains firm, and we have always been clear that the Bill takes a measured, evidence-based approach to do this. I hope that is useful clarification for those who still had questions on that point.
Lord Moylan (Con)

Will my noble friend draw attention to the part of Clause 122 that says that Ofcom cannot issue a requirement which is not technically feasible, as he has just said? That does not appear in the text of the clause, and it creates a potential conflict. Even if the requirement is not technically feasible—or, at least, if the platform claims that it is not—Ofcom’s power to require it is not mitigated by the clause. It still has the power, which it can exercise, and it can presumably take some form of enforcement action if it decides that the company is not being wholly open or honest. The technical feasibility is not built into the clause, but my noble friend has just added it, as with quite a lot of other stuff in the Bill.

Lord Parkinson of Whitley Bay (Con)

It has to meet minimum standards of accuracy and must have privacy safeguards in place. The clause talks about those in a positive sense, which sets out the expectation. I am happy to make clear, as I have, what that means: if the appropriate technology does not exist that meets these requirements, then Ofcom will not be able to use Clause 122 to require its use. I hope that that satisfies my noble friend.

Amendment 1 agreed.
Amendments 2 and 3
Moved by
2: Clause 44, page 45, line 31, at end insert—
“(7A) If the Secretary of State considers that publishing and laying before Parliament a direction given under this section would be against the interests of national security, public safety or relations with the government of a country outside the United Kingdom—
(a) subsection (7)(c) does not apply in relation to the direction, and
(b) the Secretary of State must, as soon as reasonably practicable, publish and lay before Parliament a document stating—
(i) that a direction has been given,
(ii) the kind of code of practice to which it relates, and
(iii) the reasons for not publishing it.”
Member’s explanatory statement
This amendment provides that in the circumstances mentioned in the amendment the Secretary of State is not required to publish and lay before Parliament a direction given under this Clause but must instead publish and lay before Parliament a document stating that a direction has been given, the code of practice to which it relates and the reasons for not publishing it.
3: Clause 44, page 46, line 2, leave out “and (8)” and insert “to (8)”
Member’s explanatory statement
This amendment is consequential on the preceding amendment to this Clause in my name.
Amendments 2 and 3 agreed.
Clause 52: OFCOM’s guidance about certain duties in Part 3
Amendment 4
Moved by
4: Clause 52, page 52, line 12, leave out “subsection (9) of those sections” and insert “section 23(10) or 34(9)”
Member’s explanatory statement
This is a technical amendment which substitutes the correct cross-references into this provision.
Amendment 4 agreed.
Clause 95: Meaning of threshold conditions etc
Amendment 5
Moved by
5: Clause 95, page 85, line 12, at end insert—
“(za) references to a service meeting the Category 1, Category 2A or Category 2B threshold conditions are to a service meeting those conditions in a way specified in regulations under paragraph 1 of Schedule 11 (see paragraph 1(4) of that Schedule);”
Member’s explanatory statement
This amendment improves the drafting to clarify that a service “meets the Category 1 threshold conditions” (for example) if the service meets them in a way set out in regulations under Schedule 11.
Amendment 5 agreed.
Clause 98: List of emerging Category 1 services
Amendment 6
Moved by
6: Clause 98, page 88, line 19, after “which” insert “does not meet the Category 1 threshold conditions and which”
Member’s explanatory statement
This amendment improves the drafting to clarify that services which are already Category 1 services, or which meet the conditions to be a Category 1 service, do not need to be assessed by OFCOM to see if they should be included in the list which is provided for by Clause 98.
Amendment 6 agreed.
Clause 101: Power to require information
Amendments 7 to 10
Moved by
7: Clause 101, page 91, line 23, leave out from “that” to end of line 26 and insert “a person authorised by OFCOM is able to view remotely—”
Member’s explanatory statement
This amendment changes the wording of one of OFCOM’s information powers. The power now refers to viewing information remotely, rather than remotely accessing a service; the power is exercisable by a person authorised by OFCOM; and the power may only be exercised in relation to information as mentioned in Clause 101(3)(a) and (b).
8: Clause 101, page 91, line 29, leave out “the” and insert “a”
Member’s explanatory statement
This amendment and the next amendment in my name make minor drafting changes in connection with the first amendment of Clause 101 in my name.
9: Clause 101, page 91, line 30, after “generated” insert “by a service”
Member’s explanatory statement
This amendment and the preceding amendment in my name make minor drafting changes in connection with the first amendment of Clause 101 in my name.
10: Clause 101, page 93, line 5, at end insert—
“(7A) The reference in subsection (3) to a person authorised by OFCOM is to a person authorised by OFCOM in writing for the purposes of notices that impose requirements of a kind mentioned in that subsection, and such a person must produce evidence of their identity if requested to do so by a person in receipt of such a notice.”
Member’s explanatory statement
This amendment explains what is meant by the reference in Clause 101(3) to a person authorised by OFCOM.
Amendments 7 to 10 agreed.
Clause 103: Information notices
Amendment 11
Moved by
11: Clause 103, page 94, line 27, at end insert—
“(4A) An information notice requiring a person to take steps of a kind mentioned in section 101(3) must give the person at least seven days’ notice before the steps are required to be taken.”
Member’s explanatory statement
This amendment has the effect that if a person receives a notice from OFCOM requiring them to allow OFCOM to remotely view information, they must be given at least 7 days to comply with the notice.
Amendment 11 agreed.
Clause 121: Admissibility of statements
Amendments 12 to 14
Moved by
12: Clause 121, page 105, line 32, after “101” insert “, 102”
Member’s explanatory statement
Clause 121 is about the admissibility of statements in criminal proceedings. This amendment adds Clause 102 to the list of relevant information powers (information in connection with an investigation into the death of a child).
13: Clause 121, page 105, line 33, after “2(4)(e) or (f),” insert “3(2),”
Member’s explanatory statement
This amendment adds paragraph 3(2) of Schedule 12 to the list of relevant information powers (notices in connection with an inspection by OFCOM).
14: Clause 121, page 106, line 7, after “18” insert “(1)(c)”
Member’s explanatory statement
This amendment pinpoints paragraph 18(1)(c) of Schedule 12 as the offence relevant to this Clause (rather than paragraph 18 as a whole) (provision of false information in connection with an inspection by OFCOM etc).
Amendments 12 to 14 agreed.
Clause 162: OFCOM’s report about use of app stores by children
Amendment 15
Moved by
15: Clause 162, page 144, line 29, at end insert—
““age assurance” means age verification or age estimation;”
Member’s explanatory statement
This amendment adds a definition of “age assurance” into this Clause.
Amendment 15 agreed.
Clause 182: Threatening communications offence
Amendments 16 and 17
Moved by
16: Clause 182, page 159, line 29, after “out” insert “(whether or not by the person sending the message)”
Member’s explanatory statement
This amendment makes it clear that the threatening communications offence in Clause 182 may be committed by a person who sends a threatening message regardless of who might carry out the threat.
17: Clause 182, page 159, line 31, after “out” insert “(whether or not by the person sending the message)”
Member’s explanatory statement
This amendment makes it clear that the threatening communications offence in Clause 182 may be committed by a person who sends a threatening message regardless of who might carry out the threat.
Amendments 16 and 17 agreed.
16:21
Motion
Moved by
Lord Parkinson of Whitley Bay

That the Bill do now pass.

Lord Parkinson of Whitley Bay (Con)

My Lords, in begging to move that the Bill do now pass, I add my words of thanks to all noble Lords who have been involved over many years and many iterations of the Bill, particularly during my time as the Minister and in the diligent scrutiny we have given it in recent months. The Bill will establish a vital legislative framework, making the internet safer for all, particularly for children. We are now closer than ever to achieving that important goal. In a matter of months from Royal Assent, companies will be required to put in place protections to tackle illegal content on their services or face huge fines. I am very grateful to noble Lords for the dedication, attention and time they have given to the Bill while it has been before your Lordships’ House.

The Bill will mark a significant change in children’s safety online. Last month, data from UK police forces showed that 6,350 offences relating to sexual communications with a child were recorded last year alone. These are horrifying statistics which underline the importance of the Bill in building a protective shield for our children online. We cannot let perpetrators of such abhorrent crimes stalk children online and hide behind their screens, nor let companies continue to turn a blind eye to the harm being done to children on their services. We are working closely with Ofcom to make sure that the protections for children established by the Bill are enforced as soon as possible, and we have been clear that companies should not wait for the legislation to come into force before taking action.

The aim of keeping children safe online is woven throughout the Bill, and the changes that we have made throughout its passage in your Lordships’ House have further bolstered it. In order to provide early and clear guidance to companies and Ofcom regarding the content from which children must be protected, rather than addressing these later via secondary legislation, the categories of primary priority and priority content which is harmful to children will now be set out in the Bill.

Following another amendment made during your Lordships’ scrutiny, providers of the largest services will also be required to publish summaries of their risk assessments for illegal content and content which is harmful to children. Further changes to the Bill have also made sure that technology executives must take more responsibility for the safety of those who use their websites. Senior managers will face criminal liability if they fail to comply with steps set by Ofcom following enforcement action to keep children safe on their platforms, with the offence punishable with up to two years in prison.

Noble Lords have rightly raised concerns about what the fast-changing technological landscape will mean for children. The Bill faces the future and is designed to keep pace with emerging technological changes such as AI-generated pornography.

Child sexual exploitation and abuse content generated by AI is illegal, regardless of whether it depicts a real child or not, and the Bill makes it clear that technology companies will be required to identify this content proactively and remove it. Whatever the future holds, the Bill will ensure that guard rails are in place to allow our children to explore it safely online.

I have also had the pleasure of collaborating with noble Lords from across your Lordships’ House who have championed the important cause of strengthening protections for women and girls online, who we know disproportionately bear the brunt of abhorrent behaviour on the internet. Following changes made earlier to the Bill, Ofcom will be required to produce and publish guidance which summarises in one clear place measures that should be taken to reduce the risk of harm to women and girls online. The amendment will also oblige Ofcom to consult when producing the guidance, ensuring that it reflects the voices of women and girls as well as the views of experts on this important issue.

The Bill strikes a careful balance: it tackles criminal activity online and protects our children while enshrining freedom of expression in its legislative framework. A series of changes to the Bill has ensured that adults are provided with greater control over their online experience. All adult users of the largest services will have access to tools which, if they choose to use them, will allow them to filter out content from non-verified users and to reduce the likelihood of encountering abusive content. These amendments, which have undergone careful consideration and consultation, will ensure that the Bill remains proportionate, clear and future-proof.

I am very grateful to noble Lords who have helped us make those improvements and many more. I am conscious that a great number of noble Lords who have taken part in our debates were part of the pre-legislative scrutiny some years ago. They know the Bill very well and they know the issues well, which has helped our debates be well informed and focused. It has helped the scrutiny of His Majesty’s Government, and I hope that we have risen to that.

I am very grateful to all noble Lords who have made representations on behalf of families who have suffered bereavements because of the many terrible experiences online of their children and other loved ones. There are too many for me to name now, and many more who have not campaigned publicly but who I know have been following the progress of the Bill carefully, and we remember them all today.

Again, there are too many noble Lords for me to single out all those who have been so vigilant on this issue. I thank my colleagues on the Front Bench, my noble friends Lord Camrose and Lord Harlech, and on the Front Bench opposite the noble Lords, Lord Knight and Lord Stevenson, and the noble Baroness, Lady Merron. On the Liberal Democrat Benches, I thank the noble Lords, Lord Clement-Jones and Lord Allan of Hallam—who has been partly on the Front Bench and partly behind—who have been working very hard on this.

I also thank the noble Baroness, Lady Kidron, whom I consider a Front-Bencher for the Cross Benches on this issue. She was at the vanguard of many of these issues long before the Bill came to your Lordships’ House and will continue to be long after. We are all hugely impressed by her energy and personal commitment, following the debates not only in our own legislature but in other jurisdictions. I am grateful to her for the collaborative nature of her work with us.

I will not single out other noble Lords, but I am very grateful to them from all corners of the House. They have kicked the tyres of the Bill and asked important questions; they have given lots of time and energy to it and it is a better Bill for that.

I put on record my thanks to the huge team in my department and the Department for Science, Innovation and Technology, who, through years of work, expertise and determination, have brought the Bill to this point. I am grateful to the staff of your Lordships’ House and to colleagues from the Office of the Parliamentary Counsel, in particular Maria White and Neil Shah, and, at the Department for Science, Innovation and Technology, Sarah Connolly, Orla MacRae, Caroline Bowman and Emma Hindley as well as their huge teams, including those who have worked on the Bill over the years but are not currently working on it. They have worked extremely hard and have been generous with their time to noble Lords throughout our work on the Bill.

The Bill will make a vital difference to people’s safety online, especially children’s safety. It has been a privilege to play a part in it. I was working as a special adviser at the Home Office when this area of work was first mooted. I remember that, when this Bill was suggested in the 2017 manifesto, people said that regulating the internet was a crazy idea. The biggest criticism now is that we have not done it sooner. I am very grateful to noble Lords for doing their scrutiny diligently but speedily, and I hope to see the Bill on the statute book very soon. I beg to move that the Bill do now pass.

Lord Stevenson of Balmacara (Lab)

My Lords, I am grateful to the Minister for his very kind words to everybody, particularly my Front Bench and me. I also wish him a speedy recovery from his recent illness, although I was less sympathetic when I discovered how much he has been “managing upwards”—in the words of my noble friend Lord Knight—and achieving for us in the last few days. He has obviously been recovering and I am grateful for that. The noble Lord has steered the Bill through your Lordships’ House with great skill and largely single-handedly. It has been a pleasure to work with him, even when he was turning down our proposals and suggestions for change, which he did in the nicest possible way but absolutely firmly.

16:30
As has been mentioned, the original Green Paper was a response to the consultation on internet safety that the noble Lord mentioned, which started in October 2017. That was almost six years ago. A commitment to legislate has appeared in all the party election manifestos since then, but there have been changes in approach. That is not surprising given the turnover in Secretaries of State and junior Ministers, not forgetting that there has also been a change of department in that period. However, nearly six years on, it is gratifying to see that the bones of the original approach, albeit modified by the White Paper, are still in this version of the Bill.
Government processes can be cumbersome, but on this Bill they have worked very well. The Green and White Papers, and the government response to many of the consultations, all helped to set out thinking, clarify the approach and give early notice to companies likely to be in scope of the Bill. It was a very smart move to select Ofcom as the regulator early on and to fund it to prepare and scale up. That will prove to be a very good investment in future years.
Adding the pre-legislative joint scrutiny committee, which the noble Lord mentioned and which had five Members from this House, was a very important step. Damian Collins MP, who perhaps does not get the credit that he should, was a very good choice as chairman. The noble Lord, Lord Clement-Jones, kept us fully briefed on the report as we went through the various stages—he probably has a copy in his hands as we speak and may well want to quote from it even more. That so many of those recommendations are now in the Bill shows, as the Minister says, what can happen if we pool our efforts and pull together for a common aim.
Given that there was broad political agreement and that the key principles of the Bill were right, at Second Reading in your Lordships’ House I called for us to work together across party lines to ensure that we got the best Bill that we could out of what was before us. I was touched that so many colleagues from across the House agreed with my approach and went out of their way to offer their support. It was really good to see colleagues working together across the House, ignoring party lines, in pursuit of a better Bill. We are all Cross-Benchers at heart, or Bishops—perhaps not.
We got off to a slightly rocky start in Committee, with virtually everything being dismissed with a very superior form of words—usually that we had not foreseen the unforeseen consequences of our amendment being accepted—but it is good to see a lot of those amendments finding their way back into the Bill now. But the debates themselves were useful and built a consensus around several key areas. It was clear that this collaborative approach can be very effective. Indeed, this way of working has shown parliamentary scrutiny at its best. We had debates of high quality, generating real insights on the Floor of the House. To be fair, by the time we got to Report, the Government rose to the challenge and responded with nearly 200 amendments that are going forward to the Commons. If you think about it, this is all the more remarkable given the intense partisanship that has characterised our public life during this time.
While a few significant issues still need to be resolved, there have been big changes and developments in the last few days. Following discussions with my noble friend Lady Merron, Sir Jeremy Wright and the noble Baroness, Lady Morgan, the Government have offered to bring forward amendments at the Commons consideration of Lords amendments stage next week. But, as the noble Baroness, Lady Morgan, said, we need to see those and to be clear that they are going in the direction that we have been told they will. We want to make sure that the Government will deliver what they have offered in these outstanding points. If they do, we can look forward to the strong possibility of completing parliamentary processes on the Bill by the end of this September sitting.
I thank the Bill team for all the work they did throughout. It was particularly good that the Minister mentioned them by name, because they have given a huge amount to us. I do not think that any holidays or time off have been allowed over the last few years, as they have worked through the various changes we have proposed. Their willingness to share their thinking has been absolutely fantastic. Taking us into their confidence on the policy issues that were still not finalised within government was difficult for them, and of course runs counter to all the usual approaches. I have been on Bills when we have had no information at all about the thinking. It was better here when we were talking about these things, having meetings that looked at the options and thinking about the ways in which they might be taken forward. I am sure it gave us the chance to make better decisions about when to settle, and as a result I hope that the Bill team will agree that the Bill is now in much better shape than it was.
Of course, the Opposition are at a considerable disadvantage to the Government in the support we can command when trying to take on legislation and give good scrutiny, as we wish to do. Dan Stevens in our office has done a magnificent job for us, despite having several other policy briefs to deal with. We would have struggled to deal with this Bill without his calm and measured advice and administrative skills. I think we should put it on record that we have also had a lot of support from the Public Bill Office. It is very hard to get amendments that say what you want, in language that will be accepted and allows them to be debated. Its staff often say that they are not parliamentary draftsmen or lawyers, but they make a pretty good job of what they have to do.
I also pay tribute to the All-Party Group on Digital Regulation and Responsibility, chaired by Sir Jeremy Wright, which has tracked the progress of the Bill throughout its many stages, organised meetings and circulated briefings, which has been incredibly useful. I think all of us involved in the Bill have benefited from the expertise and knowledge of the Carnegie UK Trust, led on this occasion by one of its trustees, William Perrin, who, with Professor Lorna Woods, was key to the initial development of the duty of care approach, and who, together with Maeve Walsh and others from the Carnegie team, supplied high-quality briefings and advice as we went through the various stages.
Finally, I thank my noble friends Lady Gillian Merron and Lord Jim Knight, who have supported me throughout this period despite having significant responsibilities in other areas, have taken the strain when needed without complaint, and have indeed won improvements to the Bill that I perhaps would not even have thought of, let alone obtained. It has been a real team effort, a joy and a pleasure, and a most enjoyable experience.
Lord Clement-Jones (LD)

My Lords, I am probably going to echo quite a lot of what the noble Lord, Lord Stevenson, had to say, and I also pay tribute to him. This is an absolutely crucial piece of cross-party-supported legislation that many said was impossible. I believe that it is a landmark, and we should all take huge encouragement from seeing it pass through this House.

We started with the Green Paper, as the noble Lord, Lord Stevenson, said, back in 2017. Many of us have been living with this issue since then, and I hope that therefore the House will not mind if I make a few more extended remarks than usual on the Motion that the Bill do now pass. I will not disappoint the noble Lord, Lord Stevenson, because I will quote from the original Joint Committee report. As we said in the introduction to our Joint Committee report back in 2021:

“The Online Safety Bill is a key step forward for democratic societies to bring accountability and responsibility to the internet”.


We said that the most important thing was to

“hold online services responsible for the risks created by their design and operation”.

Our children and many others will be safer online as a result.

Across the House, this has been a huge joint venture. We made some very good progress, with the Minister and the Secretary of State demonstrating considerable flexibility. I thank them sincerely for that. We have tightened the Bill up, particularly regarding harms and risks, while, I believe, ensuring that we protect freedom of expression. Many Members of this House, including former Members of the Joint Committee, can take some pride in what has been achieved during the passage of the Bill through the House. I will add my thanks to some of them individually shortly.

The Minister mentioned a relatively short list; he was actually rather modest in mentioning some of the concessions that have been given while the Bill has passed through the House. For instance, the tightening up of the age-assurance measures and the adding of a schedule of age-assurance principles are really important additions to the Bill.

Risk assessment of user empowerment tools is very important, and I believe that the provisions about app stores and future regulation are an important aspect of the Bill. The freedom of expression definition has been inserted into the Bill. We have had new offences, such as facilitating self-harm and intimate image abuse, added during the passage of the Bill. I am delighted to say that, as the noble Lord, Lord Stevenson, said, we expect to hear further concessions in the Commons on both the functionality issue raised by the noble Baroness, Lady Kidron, and the category 1 aspects raised by the noble Baroness, Lady Morgan.

We very much welcome the amendments that have been tabled today, including the remote-viewing clarification. We wait to hear what the Government’s position will be—I am sure that discussions are ongoing since the House voted to include a provision to review whether animal cruelty offences online should be brought into scope, and I am delighted to see the noble Baroness, Lady Hayman, here—and whether they will preserve the amendment and perhaps also include wildlife-trafficking offences in order to ensure that we avoid ping-pong on that last issue.

We on these Benches have never been minded to spoil the ship for a halfpenny-worth of tar, but that is not to say that there are not areas where we would have liked to have seen a bit more progress. I do not think the Minister will be surprised to hear me say that there are one or two such areas, such as: risk assessment, where we believe that the terms of service should be subject to a mandatory risk assessment; the threshold of evidence required for illegality; the prosecution threshold as regards the encouragement of non-fatal self-harm; the intent requirement for cyber flashing; and verification status and visibility, and whether Ofcom can actually introduce requirements.

I heard what the Minister had to say about AI-generated pornography but, like the NSPCC, I am not convinced that we have adequately covered the features provided as part of a service in the metasphere with which users interact. Bots in the metaverse are demonstrating an extraordinary level of autonomy that could potentially be harmful and, it seems, may not be covered by the Bill. Time will tell, and we will see whether that is the case.

Then of course there is the lack of legislative teeth for the review of research access and no requirement for guidance afterwards. I very much hope that will happen, despite there being no obligation at the end of the day.

I have mentioned Clauses 176 and 177. We wait to see how those will pan out. Then of course there is the issue on which these Benches have spoken virtually alone: the question of news publisher definition and exemption.

I very much welcome the last piece of assurance that the Minister gave in terms of Ofcom’s powers under Clause 122. Even as late as last night we heard news reports and current affairs programmes discussing the issue, and I genuinely believe that what the Minister said will be reassuring. Certainly I took comfort from what he had to say, and I thank him for agreeing to say it at a pretty late stage in the proceedings.

I think we all recognise that in many ways the Bill is just the beginning. There will be much further work to be done. We need to come back on misinformation when the committee set up under Clause 153 has reported. I hope that in particular it will look at issues such as provenance solutions like those provided by the Content Authenticity Initiative. Fundamental changes will be needed to our electoral law in order to combat misinformation in the course of our elections, because we have had several Select Committees say that, and I believe the misinformation advisory committee will come to the same conclusion.

It is also clear that Parliament itself needs to decide how best to scrutinise the Bill in both its operation and its effectiveness. As we in the Joint Committee sought to suggest, there could be a Joint Committee of both Houses to carry on that scrutiny work, but I very much hope that will not be the case. I hope the SIT Select Committee in the Commons will pick up the cudgel and that the committee of the noble Baroness, Lady Stowell, the Communications and Digital Select Committee, will do likewise in the House of Lords.

16:45
There are going to be many codes. The Minister talked about this, and we very much welcome his statement about the intent to consult and lay the codes in good time. I hope the committees will engage in the scrutiny of those as we go through, because the codes will be absolutely crucial to how this Bill will be implemented. The timing of the implementation of the Bill’s provisions will be crucial. I hope that Ofcom and DSIT will be very clear in their guidance about the timings and how the different parts of the Bill will be brought into operation and the codes of conduct drafted.
I know it is invidious in these proceedings to single out individuals but, as everybody who has spent time here during the course of this Bill will know, this has been a Back-Bench inspired set of amendments. In many ways, it is not really the Front Benches that have made a lot of the running; the passion and expertise of so many Back-Benchers has driven so many of the amendments. I pay tribute to all of them without, sadly, being able to read out all their names. I think they should know that they have the gratitude of everybody who has had anything to do with this Bill.
I do, however, want to single out my friend the Labour Front-Bench spokesman, who has spent so much time on this Bill: the noble Lord, Lord Stevenson. In particular, his dispute-resolution skills have been to the fore. He set the tone at the very beginning of our proceedings in this House, which is highly unusual; I do not think we will be expecting similar behaviour any time soon. His open offer at the very beginning was highly significant and has coloured our proceedings. Of course, we all need to single out the noble Baroness, Lady Kidron. She is a total force of nature, and we all stand in awe of what she has managed to achieve with this Bill.
I thank my noble friend Lord Allan, who identified the marshmallow problem, for his considerable expertise and practical experience, which has been totally invaluable. I thank my noble friends Lord McNally and Lady Burt and, in absentia, my noble friend Lady Featherstone, who has now returned to her place; I am delighted to see that. I thank our extraordinarily hard-working Sarah Pughe, who is ably assisted by Mohamed-Ali Souidi in our Whips’ Office, and my former senior researcher, Zoë Asser, from Queen Mary University of London.
I also—finally, noble Lords will be pleased to hear—pay my own tribute to Carnegie UK, especially Will Perrin, Maeve Walsh and Professor Lorna Woods, for having the vision five years ago as to what was possible around the construction of a duty of care and for being by our side throughout the creation of this Bill. I also thank Reset, which has helped co-ordinate our activities, and the huge number of organisations that have briefed us on issues ranging from children’s safety to freedom of speech throughout our proceedings. I echo our thanks to Sir Jeremy Wright and the all-party group, and to Damian Collins, who has been a tower of strength in helping us. Quite often, the other end ceases to take much interest in what we do as soon as a Bill comes here, but we have gone through this Bill hand-in-hand and that has been of huge usefulness and importance.
We are entering unknown territory in many ways, but with a huge amount of good will to make this Bill work.
Baroness Kidron (CB)

My Lords, I want to thank the Minister and other noble colleagues for such kind words. I really appreciate it.

I want to say very little. It has been an absolute privilege to work with people across both Houses on this. It is not every day that one keeps the faith in the system, but this has been a great pleasure. In these few moments that I am standing, I want to pay tribute to the bereaved parents, the children’s coalition, the NSPCC, my colleagues at 5Rights, Barnardo’s, and the other people out there who listen and care passionately that we get this right. I am not going to go through what we got right and wrong, but I think we got more right than we got wrong, and I invite the Minister to sit with me on Monday in the Gallery to make sure that those last little bits go right—because I will be there. I also remind the House that we have some work in the data Bill vis-à-vis the bereaved parents.

In all the thanks—and I really feel that I have had such tremendous support on my area of this Bill—I pay tribute to the noble Baroness, Lady Benjamin. She was there before many people were and suffered cruelly in the legislative system. Our big job now is to support Ofcom, hold it to account and help it in its task, because that is Herculean. I really thank everyone who has supported me through this.

Lord Moylan (Con)

My Lords, I am sure that your Lordships would not want the Bill to pass without hearing some squeak of protest and dissent from those of us who have spent so many days and weeks arguing for the interests of privacy and free speech, to which the Bill remains a very serious and major threat.

Before I come to those remarks, I associate myself with what other noble Lords have said about what a privilege it has been, for me personally and for many of us, to participate over so many days and weeks in what has been the House of Lords at its deliberative best. I almost wrote down that we have conducted ourselves like an academic seminar, but when you think about what most academic seminars are like—with endless PowerPoint slides and people shuttling around, and no spontaneity whatever—we exceeded that by far. The conversational tone that we had in the discussions, and the way in which people who did not agree were able to engage—indeed, friendships were made—meant that the whole thing was done with a great deal of respect, even for those of us who were in the small minority. At this point, I should perhaps say on behalf of the noble Baroness, Lady Fox of Buckley, who participated fully in all stages of the Bill, that she deeply regrets that she cannot be in her place today.

I am not going to single out anybody except for one person. I made the rather frivolous proposal in Committee that all our debates should begin with the noble Lord, Lord Allan of Hallam; we learned so much from every contribution he made that he really should have kicked them all off. We would all have been a great deal more intelligent about what we were saying, and understood it better, had we heard what he had to say. I certainly have learned a great deal from him, and that was very good.

I will raise two issues only that remain outstanding and are not assuaged by the very odd remarks made by my noble friend as he moved the Third Reading. The first concerns encryption. The fact of the matter is that everybody knows that you cannot do what Ofcom is empowered by the Bill to do without breaching end-to-end encryption. It is as simple as that. My noble friend may say that that is not the Government’s intention and that it cannot be forced to do it if the technology is not there. None of that is in the Bill, by the way. He may say that at the Dispatch Box but it does not address the fact that end-to-end encryption will be breached if Ofcom finds a way of doing what the Bill empowers it to do, so why have we empowered it to do that? How do we envisage that Ofcom will reconcile those circumstances where platforms say that they have given their best endeavours to doing something and Ofcom simply does not believe that they have? Of course, it might end up in the courts, but the crucial point is that that decision, which affects so many people—and so many people nowadays regard it as a right to have privacy in their communications—might be made by Ofcom or by the courts but will not be made in this Parliament. We have given it away to an unaccountable process and democracy has been taken out of it. In my view, that is a great shame.

I come back to my second issue—I will not be very long. I constantly ask about Wikipedia. Is Wikipedia in scope of the Bill? If it is, is it going to have to do prior checking of what is posted? That would destroy its business model and make many minority language sites—I instanced Welsh—totally unviable. My noble friend said at the Dispatch Box that, in his opinion, Wikipedia was not going to be in scope of the Bill. But when I asked why we could not put that in the Bill, he said it was not for him to decide whether it was in scope and that the Government had set up this wonderful structure whereby Ofcom will tell us whether it is—almost without appeal, and again without any real democratic scrutiny. Oh yes, and we might have a Select Committee, which might write a very good, highly regarded report, which might be debated some time within the ensuing 12 months on the Floor of your Lordships’ House. However, we will have no say in that matter; we have given it away.

I said at an earlier stage of the Bill that, for privacy and censorship, this represents the closest thing to a move back to the Lord Chamberlain and Lady Chatterley’s Lover that you could imagine but applied to the internet. That is bad, but what is almost worse is this bizarre governance structure where decisions of crucial political sensitivity are being outsourced to an unaccountable regulator. I am very sad to say that I think that, at first contact with reality, a large part of this is going to collapse, and with it a lot of good will be lost.

Baroness Benjamin (LD)

My Lords, I rise very briefly to thank the Minister for getting us to where we are today—the content of a Bill that I have advocated for over a decade. I thank the noble Baroness, Lady Kidron, for her kind words. She is my heroine.

I am so happy today to discuss the final stages of this Bill. The Minister has shown true commitment, tenacity and resilience, even through the holiday period. He has listened to the voices of noble Lords from across the House and to parents, charities and schools, and he has acted in the best interests of the future of society’s well-being. To him I say thank you. I fully support what he has to say today about measures that he has put down to safeguard children to prevent the worst type of child sexual abuse and exploitation imaginable, which, according to the IWF, has doubled in the last two years.

I am pleased that the Government have not been blown off course by those who feel that privacy is more important than child protection. I hope that Clause 122 of the Bill in relation to the use of technology notices remains unchanged in the final stages of deliberation. It will be good to have that confirmation once again today from the Minister.

On behalf of the IWF, CEASE and Barnardo’s—I declare an interest as a vice-president—we are so grateful to the Minister for the diligence, hard work and dedication to duty that he has shown. I very much look forward to continuing working closely with him, and with noble Lords from all sides of the House, to ensure that the implementation of the amendments we have all worked so hard to secure happens.

I look ahead to the review into pornography, which is often the gateway to other harms. I also look forward to working to make the UK the safest place in the world—the world is looking at us—to go online for everyone in our society, especially our children. As I always say, childhood lasts a lifetime. What a legacy we will leave for them by creating this Bill. I thank the Minister for everything that he has done—my “Play School” baby.

17:00
Baroness Stowell of Beeston (Con)

My Lords, I shall ask my noble friend the Minister a question about encryption but, before I do, I will briefly make a couple of other points. First, I echo all the tributes paid around the House to those involved in this legislation. It is no secret that I would have preferred the Bill to be about only child safety, so I particularly congratulate the Government, and the various Members who focused their efforts in that area, on what has been achieved via the Bill.

That said, the Government should still consider other non-legislative measures, such as banning smartphones in schools and government guidance for parents on things such as the best age at which to allow their children to have their own smartphones. These may not be points for DCMS, but they are worth highlighting at this point, as the Bill leaves us, soon to become legislation.

As I said on Report, I remain concerned about the reintroduction of some protections for adults, in lieu of “legal but harmful”, without any corresponding amendments to reinforce to Ofcom that freedom of expression must be the top priority for adults. We now have to leave it to Ofcom and see what happens. I know that the current leadership is deeply conscious of its responsibilities.

On encryption, I was pleased to hear what my noble friend said when he responded to the debate at Third Reading. If he is saying that the technology not existing means that Clause 122 cannot be deployed, as it were, by Ofcom, does that mean that the oversight measures that currently exist would not be deployed? As my noble friend will recall, one of the areas that we were still concerned about in the context of encryption was that what was in the Bill did not mirror what exists for RIPA. I am not sure whether that means that, because Clause 122 has been parked, our oversight concerns have been parked too. It would be helpful if the Minister could clarify that.

In the meantime, in the absence of Clause 122, it is worth us all reinforcing again that we want the tech firms to co-operate fully with law enforcement, either because a user has alerted them to illegal activity or when law enforcement suspects criminal behaviour and seeks their help. In that latter context, it would be helpful to understand what the Minister has said and to know what oversight that might involve. I congratulate my noble friend on this marathon Bill, and I am sorry to have delayed its passing.

Lord Allan of Hallam (LD)

My Lords, I will make a short contribution so that I do not disappoint the noble Lord, Lord Moylan; I will make a few direct and crunchy comments. First, I thank colleagues who participated in the debate for giving me a hearing, especially when I raised concerns about their proposals. It has been a constructive process, where we have been, as the Minister said, kicking the tyres, which is healthy in a legislature. It is better to do it now than to find faults when something has already become law.

I am in the unusual position of having worked on problems comparable to those we are now placing on Ofcom’s desk. I have enormous empathy for Ofcom, given the hard work we are giving it. I do not think we should underestimate just how difficult this job is.

I want to thank the Minister for the additional clarification of how Ofcom will give orders to services that provide private communications. Following on from what the noble Baroness, Lady Stowell, said, I think this is a challenging area. We want Ofcom to give orders where this is easy—for example, to an unencrypted service hosting child sexual abuse material. The technology can be deployed today and is uncontroversial, so it is important that we do not forget that.

I heard the Minister say that we do not want Ofcom to move so fast that it breaks encryption. It should be moving but it should be careful. Those are the fears that have been expressed outside: on the day that this becomes law, Ofcom will issue orders to services providing encrypted communications that they will not be able to accept and therefore they will leave the UK. I think I heard from the Minister today that this is not what we want Ofcom to do. At the same time, as the noble Baroness, Lady Stowell said, we are not expecting Ofcom to ease off; any online service should be doing everything technically possible and feasible to deal with abhorrent material.

I humbly offer three pieces of advice to Ofcom as we pass the baton to it. This is based on having made a lot of mistakes in the past. If I had been given this advice, I might have done a better job in my previous incarnation. First, you cannot overconsult; Ofcom should engage with all interested parties, including those who have talked to us throughout the process of the Bill. It should engage with them until it is sick of engaging with them and then it should engage some more. In particular, Ofcom should try to bring together diverse groups, so I hope it gets into a room the kind of organisations that would be cheering on the noble Lord, Lord Moylan, as well as those that would be cheering on the noble Baroness, Lady Kidron. If Ofcom can bring them into the room, it has a chance of making some progress with its regulations.

Secondly, be transparent. The more information that Ofcom provides about what it is doing, the less space it will leave for people to make up things about what it is doing. I said this in the previous debate about the access request but it applies across the piece. We are starting to see some of this in the press. We are here saying that it is great that we now have a government regulator—independent but part of the UK state—overseeing online services. As soon as that happens, we will start to see the counterreaction of people being incredibly suspicious that part of the UK state is now overseeing their activity online. The best way to combat that is for Ofcom to be as transparent as possible.

Thirdly, explain the trade-offs you are making. This legislation necessarily involves trade-offs. I heard it again in the Minister’s opening remarks: we have indulged in a certain amount of cakeism. We love freedom of expression but we want the platforms to get rid of all the bad stuff. The rubber is going to hit the road once Ofcom has the powers and, in many cases, it will have to decide between one person’s freedom of expression and another’s harm. My advice is not to pretend that you can make both sides happy; you are going to disappoint someone. Be honest and frank about the trade-offs you have made. The legislation has lots of unresolved trade-offs in it because we are giving lots of conflicting instructions. As politicians, we can ride that out, but when Ofcom gets this and has to make real decisions, my advice would be to explain the trade-offs and be comfortable with the fact that some people will be unhappy. That is the only way it will manage to maintain confidence in the system. With that, I am pleased that the Bill has got to this stage and I have a huge amount of confidence in Ofcom to take this and make a success of it.

Lord Bethell (Con)

I rise briefly to raise the question of access to data by academics and research organisations. Before I do so, I want to express profound thanks to noble Lords who have worked so collaboratively to create a terrific Bill that will completely transform and hold to account those involved in the internet, and make it a safer place. That was our mission and we should be very proud of that. I cannot single out noble Peers, with the exception of the noble Baroness, Lady Kidron, with whom I worked collaboratively both on age assurance and on harms. It was a partnership I valued enormously and hope to take forward. Others from all four corners of the House contributed to the parts of the Bill that I was particularly interested in. As I look around, I see so many friends who stuck their necks out and spoke so movingly, for which I am enormously grateful.

The question of data access is one of the loose ends that did not quite make it into the Bill. I appreciate the efforts of my noble friend the Minister, the Secretary of State and the Bill team in this matter and their efforts to try and wangle it in; I accept that it did not quite make it. I would like to hear reassurance from my noble friend that this is something that the Government are prepared to look at in future legislation. If he could provide any detail on how and in which legislation it could be revisited, I would be enormously grateful.

Lord Parkinson of Whitley Bay (Con)

My Lords, I will be brief and restrict myself to responding to the questions which have been raised. I will hold to my rule of not trying to thank all noble Lords who have played their part in this scrutiny, because the list is indeed very long. I agree with what the noble Lord, Lord Clement-Jones, said about this being a Back-Bench-driven Bill, and there are many noble Lords from all corners of the House and the Back Benches who have played a significant part in it. I add my thanks to the noble Baroness, Lady Benjamin, not just for her kind words, but for her years of campaigning on this, and to my noble friend Lord Bethell who has worked with her—and others—closely on the issues which she holds dear.

I also thank my noble friend Lord Moylan who has often swum against the tide of debate, but very helpfully so, and on important matters. In answer to his question about Wikipedia, I do not have much to add to the words that I have said a few times now about the categorisation, but on his concerns about the parliamentary scrutiny for this I stress that it is the Secretary of State who will set the categorisation thresholds. She is, of course, a Member of Parliament, and accountable to it. Ofcom will designate services based on those thresholds, so the decision-making can be scrutinised in Parliament, even if not in the way he would have wished.

I agree that we should all be grateful to the noble Lord, Lord Allan of Hallam, because he addressed some of the questions raised by my noble friend Lady Stowell of Beeston. In brief, the provision is flexible for where the technological solutions do not currently exist, because Ofcom can require services to develop or source new solutions.

This close to the gracious Speech, I will not point to a particular piece of legislation in which we might revisit the issue of researchers’ access, as raised by my noble friend Lord Bethell, but I am happy to say that we will certainly look at that again, and I know that he will take the opportunity to raise it.

Noble Lords on the Front Benches opposite alluded to the discussions which are continuing—as I committed on Report to ensure that noble Lords are able to be part of discussions as the Bill heads to another place—on functionalities and on the amendment of my noble friend Lady Morgan on category 1 services. She is one of a cavalcade of former Secretaries of State who have been so helpful in scrutinising the Bill. It is for another place to debate them, but I am grateful to noble Lords who have given their time this week to have the discussions which I committed to have and will continue to have as the Bill heads there, so that we can follow those issues hopefully to a happy resolution.

I thank my noble friend Lady Harding of Winscombe for the concessions that she wrought on Report, and for the part that she has played in discussions. She has also given a great deal of time outside the Chamber.

We should all be very grateful to the noble Lord, Lord Grade of Yarmouth, who has sat quietly throughout most of our debates—understandably, in his capacity as chairman of Ofcom—but he has followed them closely and taken those points to the regulator. Dame Melanie Dawes and all the team there stand ready to implement this work and we should be grateful to the noble Lord, Lord Grade of Yarmouth, and to all those at Ofcom who are ready to put it into action.

Bill passed and returned to the Commons with amendments.

Online Safety Bill (Programme) (No. 5)

Programme motion
Tuesday 12th September 2023

Commons Chamber
Amendment Paper: Commons Consideration of Lords Amendments as at 12 September 2023
Motion made, and Question put forthwith (Standing Order No. 83A(7)),
That the following provisions shall apply to the Online Safety Bill for the purpose of supplementing the Order of 19 April 2022 in the last session of Parliament (Online Safety Bill (Programme)) as varied and supplemented by the Orders of 12 July 2022 (Online Safety Bill (Programme) (No. 2)), 5 December 2022 (Online Safety Bill (Programme) (No. 3)) and 5 December 2022 (Online Safety Bill (Programme) (No. 4)):
Consideration of Lords Amendments
(1) Proceedings on consideration of Lords Amendments shall (so far as not previously concluded) be brought to a conclusion three hours after their commencement.
(2) The Lords Amendments shall be considered in the following order: 182, 349, 391, 17, 20, 22, 81, 148, 1 to 16, 18, 19, 21, 23 to 80, 82 to 147, 149 to 181, 183 to 348, 350 to 390, 392 to 424.
Subsequent stages
(3) Any further Message from the Lords may be considered forthwith without any Question being put.
(4) The proceedings on any further Message from the Lords shall (so far as not previously concluded) be brought to a conclusion one hour after their commencement.—(Andrew Stephenson.)
Question agreed to.

Online Safety Bill

Consideration of Lords amendments
Tuesday 12th September 2023

Commons Chamber
Amendment Paper: Commons Consideration of Lords Amendments as at 12 September 2023
Consideration of Lords amendments
Clause 82
General duties of OFCOM under section 3 of the Communications Act
14:11
Mr Deputy Speaker (Sir Roger Gale)

With this it will be convenient to discuss the following:

Lords amendment 349, and Government amendments (a) and (b).

Lords amendment 391, Government amendment (a), and Government consequential amendment (a).

Lords amendment 17, Government motion to disagree, and Government amendments (a) and (b) in lieu.

Amendment (i) to Government amendment (a) in lieu of Lords amendment 17.

Lords amendment 20, and Government motion to disagree.

Lords amendment 22, and Government motion to disagree.

Lords amendment 81, Government motion to disagree, and Government amendments (a) to (c) in lieu.

Lords amendment 148, Government motion to disagree, and Government amendment (a) in lieu.

Lords amendment 1, and amendments (a) and (b).

Lords amendments 2 to 16, 18, 19, 21, 23 to 80, 82 to 147, 149 to 181 and 183 to 188.

Lords amendment 189, and amendment (a) in lieu.

Lords amendments 190 to 216.

Lords amendment 217, and amendment (a).

Lords amendments 218 to 227.

Lords amendment 228, and amendment (a).

Lords amendments 229 and 230.

Lords amendment 231, and amendment (a).

Lords amendments 232 to 319.

Lords amendment 320, and amendment (a).

Lords amendment 321, and amendment (a).

Lords amendments 322 to 348, 350 to 390 and 392 to 424.

Paul Scully

As we know from proceedings in this place, the Online Safety Bill is incredibly important. I am delighted that it is returning to the Commons in great shape, having gone through extensive and thorough scrutiny in the Lords. The Bill is world-leading, and the legislative framework established by it will lead to the creation of a profoundly safer online environment in this country. It will kickstart change where that is sorely needed, and ensure that our children are better protected against pornography and other content that is harmful to them. The Bill will also guard children against perpetrators of abhorrent child sexual exploitation and abuse, and ensure that tech companies take responsibility for tackling such content on their platforms, or be held criminally accountable.

Sir William Cash (Stone) (Con)

As I am sure my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) will agree, may I say how much we appreciate what the Government have done in relation to the matter just referred to? As the Minister knows, we withdrew our amendment in the House of Commons after discussion, and we had amazingly constructive discussions with the Government right the way through, and also in the House of Lords. I shall refer to that if I am called to speak later, but I simply wanted to put on record our thanks, because this will save so many children’s lives.

Paul Scully

I thank my hon. Friend and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) for all their work on this. I hope that this debate will show that we have listened and tried to work with everybody, including on this important part of the Bill. We have not been able to capture absolutely everything that everybody wants, but we are all determined to ensure that the Bill gets on the statute book as quickly as possible, to ensure that we start the important work of implementing it.

We have amended the Bill to bolster its provisions. A number of topics have been of particular interest in the other place. Following engagement with colleagues on those issues, we have bolstered the Bill’s protections for children, including a significant package of changes relating to age assurance. We have also enhanced protections for adult users.

Sajid Javid (Bromsgrove) (Con)

My hon. Friend will know that Ministers and officials in his Department have worked extensively—I thank them for that—with me, Baroness Kidron, and the Bereaved Families for Online Safety group, on the amendment that will make it easier for coroners to have access to data from online companies in the tragic cases where that might be a cause of a child’s death. He will also know that there will still be gaps in legislation, but such gaps could be closed by further measures in the Data Protection and Digital Information Bill. His ministerial colleague in the other place has committed the Government to that, so may I invite my hon. Friend to set out more about the Government’s plans for doing just that?

Paul Scully

I thank my right hon. Friend for his work on this, and Baroness Kidron for her work. I will cover that in more detail in a moment, but we remain committed to exploring measures that would facilitate better access to data for coroners under specific circumstances. We are looking for the best vehicle to do that, which includes those possibilities in the Data Protection and Digital Information Bill. We want to ensure that the protections for adult users afford people greater control over their online experience.

14:15
The Bill will ensure that Ofcom has the powers it needs to ensure that coroners are provided with the information they need following such a tragedy. As well as my right hon. Friend and Baroness Kidron, that provision was also championed by Ian Russell and other bereaved parents with whom we have worked closely, to ensure that we get the right solution. I am grateful for their tireless efforts. We have made sure that we can address the concerns raised by Members about the risks relating to the design and functionality of services, because this is a complicated issue, for a number of reasons that have been well rehearsed. The changes I have outlined will ensure that the Bill contains the strongest possible protections for children, that users’ rights to freedom of expression and privacy are protected, and that services are transparent and accountable.
Let me go into more detail on the Government amendments that were passed during the Bill’s passage through the Lords, and the amendments that I present to the House today. As I have said, child safety is a key priority in the Bill, and during its passage through the Lords we have further strengthened its protections for children. That has included placing the categories of “primary priority” and “priority” content that is harmful to children in the Bill. That will provide companies and Ofcom with explicit and early confirmation on the kind of content that children must be protected from, rather than addressing those issues later via secondary legislation. Providers of the largest services will also be required to publish summaries of their risk assessments for illegal content and content that is harmful to children. That will empower children and their parents or carers to clearly understand the risks to children presented by such services.
The Government listened to the views expressed in both Houses and introduced new offences in Committee that will more effectively hold technology companies to account if they fail to protect children. Ofcom will now be able to hold companies and senior managers, where they are at fault, criminally liable if the provider fails to comply with Ofcom’s enforcement notices in relation to specific child safety duties or to child sexual abuse and exploitation on their service.
Sir John Hayes (South Holland and The Deepings) (Con)

The Minister is setting out a powerful case for how the Government have listened to the overtures in this place and the other place. Further to the interventions from my hon. Friend the Member for Stone (Sir William Cash) and my right hon. Friend the Member for Bromsgrove (Sajid Javid), the former Culture Secretary, will the Minister be clear that the risk here is under-regulation, not over-regulation? Although the internet may be widely used by perfectly good people, the people who run internet companies are anything but daft and more likely to be dastardly.

Paul Scully

This is a difficult path to tread in approaching this issue for the first time. In many ways, these are things that we should have done 10 or 15 years ago, as social media platforms and people’s engagement with them proliferated over that period. Regulation has to be done gently, but it must be done. We must act now and get it right, to ensure that we hold the big technology companies in particular to account, while also understanding the massive benefits that those technology companies and their products provide.

Debbie Abrahams (Oldham East and Saddleworth) (Lab)

I agree with the Minister that this is a groundbreaking Bill, but we must be clear that there are still gaps. Given what he is saying about the requirements for regulation of online social media companies and other platforms, how will he monitor, over a period of time, whether the measures that we have are as dynamic as they need to be to catch up with social media as it develops?

Paul Scully

The hon. Lady asks an important question, and that is the essence of what we are doing. We have tried to make this Bill flexible and proportionate. It is not technology specific, so that it is as future-proofed as possible. We must obviously lean into Ofcom as it seeks to operationalise the Act once the Bill gains Royal Assent. Ofcom will come back with its reporting, so not only will Government and the Department be a check on this, but Parliament will be able to assess the efficacy of the Bill as the system beds in and as technology and the various platforms move on and develop.

I talked about the offences, and I will just finalise my point about criminal liability. Those offences will be punishable with up to two years in prison.

John Penrose (Weston-super-Mare) (Con)

Further to that point about the remaining gaps in the Bill, I appreciate what the Minister says about this area being a moving target. Everybody—not just in this country, but around the world—is having to learn as the internet evolves.

I thank the Minister for Government amendment 241, which deals with provenance and understanding where information posted on the web comes from, and allows people therefore to check whether they want to see it, if it comes from dubious sources. That is an example of a collective harm—of people posting disinformation and misinformation online and attempting to subvert our democratic processes, among other things. I park with him, if I may, the notion that we will have to come back to that area in particular. It is an area where the Bill is particularly weak, notwithstanding all the good stuff it does elsewhere, notably on the areas he has mentioned. I hope that everyone in this House accepts that that area will need to be revisited in due course.

Paul Scully

Undoubtedly we will have to come back to that point. Not everything needs to be in the Bill at this point. We have industry initiatives, such as Adobe’s content security policy, which are good initiatives in themselves, but as we better understand misinformation, disinformation, deepfakes and the proliferation and repetition of fake images, fake text and fake news, we will need to keep ensuring we can stay ahead of the game, as my hon. Friend said. That is why we have made the legislation flexible.

Dame Margaret Hodge (Barking) (Lab)

I have two things to ask. First, will the Minister spell out more clearly how Parliament will be able to monitor the implementation? What mechanisms do we have to do that? Secondly, on director liability, which I warmly welcome—I am pleased that the Government have listened to Back Benchers on this issue—does he not agree that the example we have set in the Bill should be copied in other Bills, such as the Economic Crime and Corporate Transparency Bill, where a similar proposal exists from Back Benchers across the House?

Paul Scully

The right hon. Lady raises some interesting points. We have conversed about harms, so I totally get her point about making sure that we tackle this issue in Parliament and are accountable in Parliament. As I have said, that will be done predominantly by monitoring the Bill through Ofcom’s reporting on what harms it is having to deal with. We have regular engagement with Ofcom, not only here and through the Select Committees, but through the Secretary of State.

On criminal liability, we conversed about that and made sure that we had a liability attached to something specific, rather than the general approach proposed at the beginning. It therefore means that we are not chilling innovation. People can understand, as they set up their approaches and systems, exactly what they are getting into in terms of risk for criminal liability, rather than having the general approach that was suggested at the beginning.

Kirsty Blackman (Aberdeen North) (SNP)

The review mechanism strikes me as one of the places where the Bill falls down and is weakest, because there is not a dedicated review mechanism. We have needed this legislation for more than 30 years, and we have now got to the point of legislating. Does the Minister understand why I have no faith that future legislation will happen in a timely fashion, when it has taken us so long even to get to this point? Can he give us some reassurance that a proper review will take place, rather than just having Ofcom reports that may or may not be read?

Paul Scully

I have talked about the fact that we have to keep this legislation under review, because the landscape is fast-moving. At every stage that I have been dealing with this Bill, I have said that inevitably we will have to come back. We can make the Bill as flexible, proportionate and tech-unspecific as we can, but things are moving quickly. With all our work on AI, for example, such as the AI summit, the work of the Global Partnership on Artificial Intelligence, the international response, the Hiroshima accord and all the other areas that my hon. Friend the Member for Weston-super-Mare (John Penrose) spoke about earlier, we will have to come back, review it and look at whether the legislation remains world-beating. It is not just about the findings of Ofcom as it reports back to us.

I need to make a bit of progress, because I hope to have time to sum up a little bit at the end. We have listened to concerns about ensuring that the Bill provides the most robust protections for children from pornography and on the use of age assurance mechanisms. We are now explicitly requiring relevant providers to use highly effective age verification or age estimation to protect children from pornography and other primary priority content that is harmful to children. The Bill will also ensure a clear privacy-preserving and future-proofed framework governing the use of age assurance, which will be overseen by Ofcom.

There has been coverage in the media about how the Bill relates to encryption, which has often not been accurate. I take the opportunity to set the record straight. Our stance on challenging sexual abuse online remains the same. Last week in the other place, my noble Friend Lord Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, shared recent data from UK police forces that showed that 6,350 offences related to sexual communication with a child were recorded last year alone. Shockingly, 5,500 of those offences took place against primary school-age children. Those appalling statistics illustrate the urgent need for change. The Government are committed to taking action against the perpetrators and stamping out these horrific crimes. The information that social media companies currently give to UK law enforcement contributes to more than 800 arrests or voluntary attendances of suspected child sexual offenders on average every month. That results in an estimated 1,200 children being safeguarded from child sexual abuse.

There is no intention by the Government to weaken the encryption technology used by platforms. As a last resort, on a case-by-case basis, and only when stringent privacy safeguards have been met, Ofcom will have the power to direct companies to make best efforts to develop or source technology to identify and remove illegal child sexual abuse content. We know that this technology can be developed. Before it can be required by Ofcom, such technology must meet minimum standards of accuracy. If appropriate technology does not exist that meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a new solution.

Damian Collins (Folkestone and Hythe) (Con)

Does my hon. Friend agree that the companies already say in their terms of service that they do not allow illegal use of their products, yet they do not say how they will monitor whether there is illegal use and what enforcement they take? What the Bill gives us, for the first time, is the right for Ofcom to know the answers to those questions and to know whether the companies are even enforcing their own terms of service.

Paul Scully

My hon. Friend makes an important point, and I thank him for the amazing work he has done in getting the Bill to this point and for his ongoing help and support in making sure that we get it absolutely right. This is not about bashing technology companies; it is about not only holding them to account, but bringing them closer, to make sure that we can work together on these issues to protect the children I was talking about.

Despite the breadth of existing safeguards, we recognise the concerns expressed about privacy and technical feasibility in relation to Ofcom’s power to issue CSE or terrorism notices. That is why we introduced additional safeguards in the Lords. First, Ofcom will be required to obtain a skilled person’s report before issuing any warning notice and exercising its powers under clause 122. Ofcom must also provide a summary of the report to the relevant provider when issuing a warning notice. We are confident that in addition to Ofcom’s existing routes of evidence gathering, this measure will help to provide the regulator with the necessary information to determine whether to issue a notice and the requirements that may be put in place.

We also brought forth amendments requiring Ofcom to consider the impact that the use of technology would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. That builds on the existing safeguards in clause 133 regarding freedom of expression and privacy.

We recognise the disproportionate levels of harm that women and girls continue to face online, and that is why the Government have made a number of changes to the Bill to strengthen protections for women and girls. First, the Bill will require Ofcom to produce guidance on online harms that disproportionately affect women and girls and to provide examples of best practice to providers, and it will require providers to bring together in one clear place all the measures that they take to tackle online abuse against women and girls on their platforms. The Bill will also require Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner, in addition to the Children’s Commissioner, while preparing codes of practice. That change to the Bill will ensure that the voices of victims of abuse are brought into the consultation period.

14:30
The offence of controlling or coercive behaviour has been added as a priority offence and will require companies to proactively tackle such content and activity that disproportionately affects women and girls. The Bill also introduces new offences relating to intimate image abuse, including criminalising deepfakes for the first time in England and Wales. Those new offences to protect women and girls sit alongside other changes that we have made to the criminal law to ensure that it is fit for purpose in the modern age. For example, we have also introduced a new communications offence of intentionally encouraging or assisting serious self-harm. Our amendments will also require platforms to remove the most harmful self-harm content for all users. The offence has been designed to avoid criminalising or removing recovery and support content.
The Government are committed to empowering adults online and made changes to the Bill to strengthen the user empowerment content duties. First, we have introduced a new content assessment duty in relation to the main user empowerment duties. That will require big tech platforms to carry out comprehensive assessments of the prevalence of content that falls within the scope of the user empowerment duties on their services, such as legal content that encourages suicide or an act of self-harm. They will need to keep a record of that assessment and publish a summary of it for their users in their terms of service. The new duty will underpin the main duties to offer user empowerment tools, ensuring that platforms and users have a comprehensive understanding of the relevant types of content on their services.
Secondly, where category 1 providers offer the user empowerment tools, we have further strengthened the duties on them by requiring them to proactively ask their registered adult users whether they wish to use the user empowerment content features. That will help to make the tools easier for users to opt into or out of. This approach continues to balance the empowerment of users and the protection of freedom of expression by avoiding the “default on” approach.
Baroness Fraser of Craigmaddie made amendments in the other place that aligned the definition of the term “freedom of expression” in the Bill with that in the European convention on human rights. That also reflects the approach of other UK legislation, including the Higher Education (Freedom of Speech) Act 2023. Those amendments will increase clarity about freedom of expression in the Bill.
The Government recognise the difficulties that coroners and bereaved families have when seeking to understand the circumstances surrounding a child’s death and have introduced a number of amendments to address those issues, some of which I have already outlined. First, we expanded Ofcom’s information gathering powers so that the regulator can require information from regulated services about a deceased child’s online activity following a request from a coroner. That is backed up by Ofcom’s existing enforcement powers. We have also given Ofcom the power to produce a bespoke report for the coroner and enabled the regulator to share information with a coroner without the prior consent of the business. That will ensure that Ofcom can collect such information and share it with the coroner where appropriate, so that coroners have access to the expertise and information they need to conduct their investigations.
Finally, we have introduced amendments to ensure that the process for accessing data regarding the online activities of a deceased child is more straightforward and humane. The largest companies must set out policies on disclosure of such data in a clear, accessible and sufficiently detailed format in their terms of service. They must also respond in writing in a timely manner, provide a dedicated means for parents to communicate with the company and put in place a mechanism for parents to complain if they consider that a service is not meeting its obligations.
We recognise the valuable work of researchers in improving our collective understanding of online safety issues, which is why we have made amendments to the Bill that require Ofcom to publish its report into researcher access to information within 18 months rather than two years. Ofcom will then be required to publish guidance on the issue, setting out best practice for platforms to share information in a way that supports their research functions while protecting user privacy and commercially sensitive material. While we will not be making additional changes to the Bill during the remainder of its passage, we understand the call for further actions in this area. That is why we have made a commitment to explore this issue further and report back to the House in due course on whether further measures to support researcher access to data are required and, if so, whether they could also be implemented through other legislation such as the Data Protection and Digital Information Bill.
The Government heard the House’s concerns about the risk of algorithms and their impact on our interactions online. Given the influence they can have, the regulator must be able to scrutinise the algorithms’ functionalities and other systems and processes that providers use. We have therefore made changes to provide Ofcom with the power to authorise a person to view specific types of information remotely: information demonstrating the operation of a provider’s systems, processes or features, including algorithms, and tests or demonstrations. There are substantial safeguards around the use of that power, which include: Ofcom’s legal duty to exercise it proportionately; a seven-day notification period; and the legal requirement to comply with data protection rules and regulations.
The Government are grateful to Baroness Morgan of Cotes and my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), who like many in the House have steadfastly campaigned on the issue of small but risky platforms. We have accepted an amendment to the Bill that changes the rules for establishing the conditions that determine which services will be designated as category 1 or category 2B services and thus have additional duties. In making the regulations used to determine which services are category 1 or category 2B, the Secretary of State will now have the discretion to decide whether to set a threshold based on the number of users or the functionalities offered, or both factors. Previously, the Secretary of State was required to set the threshold based on a combination of both factors.
It is still the expectation that only the most high risk user-to-user services will be designated as category 1 services. However, the change will ensure that the framework is as flexible as possible in responding to the risk landscape. I say to my hon. Friend the Member for Yeovil (Mr Fysh), who I know will speak later, that it is not meant to capture user-to-user systems; it is very much about content but not about stifling innovation in areas such as distributed ledgers and so on.
Dame Margaret Hodge

I am grateful for the amendment, which I think is important. Will the Minister make it clear that he will not accept the amendments tabled by the hon. Member for Yeovil (Mr Fysh)?

Paul Scully

Indeed, we will not be accepting those amendments, but I will cover more of that later on, after I have listened to the comments that I know my hon. Friend wants to make.

As a result of the amendment, we have also made a small change to clause 98—the emerging category 1 services list—to ensure that it makes operational sense. Prior to Baroness Morgan’s amendment, a service had to meet the functionality threshold for content and 75% of the user number threshold to be on the emerging services list. Under the amended clause, there is now a plausible scenario where a service could meet the category 1 threshold without meeting any condition based on user numbers, so we had to make the change to ensure that the clause worked in that scenario.

We have always been clear that the design of a service, its functionalities and its other features are key drivers of risk that impact on the risk of harm to children. Baroness Kidron’s amendments 17, 20, 22 and 81 seek to treat those aspects as sources of harm in and of themselves. Although we agree with the objective, we are concerned that they do not work within the legislative framework and risk legal confusion and delaying the Bill. We have worked closely with Baroness Kidron and other parliamentarians to identify alternative ways to make the role that design and functionalities play more explicit. I am grateful to colleagues in both Houses for being so generous with their time on this issue. In particular, I thank again my right hon. and learned Friend the Member for Kenilworth and Southam for his tireless work, which was crucial in enabling the creation of an alternative and mutually satisfactory package of amendments. We will disagree to Lords amendments 17, 20, 22 and 81 and replace them with amendments that make it explicit that providers are required to assess the impact that service design, functionalities and other features have on the risk of harm to children.

On Report, my hon. Friend the Member for Crawley (Henry Smith) raised animal abuse on the internet and asked how we might address such harmful content. I am pleased that the changes we have since made to the Bill fully demonstrate the Government’s commitment to tackling criminal activity relating to animal torture online. It is a cause that Baroness Merron has championed passionately. Her amendment in the other place sought to require the Secretary of State to review certain offences and, depending on the review’s outcome, to list them as priority offences in schedule 7. To accelerate measures to tackle such content, the Government will remove clause 63—the review clause—and instead immediately list section 4(1) of the Animal Welfare Act 2006 as a priority offence. Officials at the Department for Environment, Food and Rural Affairs have worked closely with the Royal Society for the Prevention of Cruelty to Animals and are confident that the offence of unnecessary suffering will capture a broad swathe of behaviour. I hope the whole House will recognise our efforts and those of Baroness Merron and support the amendment.

You will be pleased to know, Mr Deputy Speaker, that I will conclude my remarks. I express my gratitude to my esteemed colleagues both here and in the other place for their continued and dedicated engagement with this complicated, complex Bill during the course of its parliamentary passage. I strongly believe that the Bill, in this form, strikes the right balance in providing the strongest possible protections for both adults and children online while protecting freedom of expression. The Government have listened carefully to the views of Members on both sides of the House, stakeholders and members of the public. The amendments we have made during the Bill’s progress through the Lords have further enhanced its robust and world-leading legislative framework. It is groundbreaking and will ensure the safety of generations to come. I ask Members of the House gathered here to support the Government’s position on the issues that I have spoken about today.

Mr Deputy Speaker (Sir Roger Gale)

I call the Opposition spokesperson.

Alex Davies-Jones (Pontypridd) (Lab)

Before I address the amendments at hand, let me first put on record my thanks for the incredible efforts of our colleagues in the other place. The Bill has gone on a huge journey. The Government have repeatedly delayed its passage, and even went to great effort to recommit parts of the Bill to Committee in an attempt to remove important provisions on legal but harmful content. For those reasons alone, it is somewhat of a miracle that we have arrived at this moment, with a Bill that I am glad to say is in a much better place than when we last debated it here. That is thanks to the tireless work of so many individuals, charities and organisations, which have come together to coalesce around important provisions that will have a positive impact on people’s lives online.

Today, we have the real privilege of being joined by Ian Russell, Stuart Stephens, Emilia Stevens, Hollie Dance and Lisa Kenevan, who have all been impacted by losing a child at the hands of online harm. I want to take a moment to give my most heartfelt thanks to them all, and to the other families who have shared their stories, insights and experiences with colleagues and me as the Bill progressed. Today, in our thoughts are Archie, Isaac, Olly, Molly and all the other children who were taken due to online harm. Today, their legacy stands before us. We would not be here without you, so thank you.

We also could not have arrived at this point without the determination of colleagues in the other place, notably Baroness Kidron. Colleagues will know that she has been an extremely passionate, determined and effective voice for children throughout, and the Bill is stronger today thanks to her efforts. More broadly, I hope that today’s debate will be a significant and poignant moment for everyone who has been fighting hard for more protections online for many years.

It is good to see the Minister in his place. This is a complex Bill, and has been the responsibility of many of his colleagues since its introduction to Parliament. That being said, it will come as no surprise that Labour is pleased with some of the significant concessions that the Government have made on the Bill. Many stem from amendments the Opposition sought to make early on in the Bill’s passage. Although his Department’s press release may try to claim a united front, let us be clear: the Bill has sadly fallen victim to Tory infighting from day one. The Conservatives truly cannot decide if they are the party of protecting children or of free speech, when they should be the party of both. Sadly, some colleagues on the Government Benches have tried to stop the Bill in its tracks entirely, but Labour has always supported the need for it. We have worked collaboratively with the Government and have long called for these important changes. It is a welcome relief that the Government have finally listened.

Let me also be clear that the Bill goes some way to regulate the online space in the past and present, but it makes no effort to future-proof or anticipate emerging harms. The Labour party has repeatedly warned the Government of our concerns that, thanks to the Bill’s focus on content rather than social media platforms’ business models, it may not go far enough. With that in mind, I echo calls from across the House. Will the Minister commit to a review of the legislation within five years of enactment, to ensure that it has met their objective of making the UK the safest place in the world to be online?

Richard Burgon (Leeds East) (Lab)

My hon. Friend is making an important speech. It is clear that the Government want to tackle harmful suicide and self-harm content. It is also clear that the Bill does not go far enough. Does she agree that we should support Samaritans’ suggested way forward after implementation? We need the Government to engage with people with lived experience of suicide and self-harm, to ensure that the new legislation makes things better. If it is shown—as we fear—not to go far enough, new legislative approaches will be required to supplement and take it further, to ensure that the internet is as safe as possible for vulnerable people of all ages.

Alex Davies-Jones

I thank my hon. Friend for that intervention. He has been a passionate advocate on that point, speaking on behalf of his constituent Joe Nihill and his family for more protections in the Bill. It is clear that we need to know whether the legislation works in practice. Parliamentary oversight of that is essential, so I echo calls around the Chamber for that review. How will it take place? What will it look like? Parliament must have oversight, so that we know whether the legislation is fit for purpose.

14:45
Let me turn to the amendments. Labour is particularly pleased to see the Government follow the excellent lead of Baroness Kidron in the other place by addressing the alarming gaps in children’s risk assessments. Those amendments will go some way to ensuring that social media platforms have to carefully consider harmful content both created and disseminated to children when using their services. This is an incredibly important point. With online content being constantly available and drip-fed to children thanks to autoplay features, it is right that risk assessments relating to harm will have to include widespread provisions.
I wonder, however, if the Minister could explain one particular point. The Government’s own press releases have long lauded the Bill as focused on child safety. Indeed, in the Secretary of State’s open letter in December to all parents, carers and guardians, she notes:
“The strongest protections in this legislation are for children and young people.”
I would therefore be interested to hear from the Minister exactly why the Bill makes no specific reference to children’s rights, or more specifically the UN convention on the rights of the child. It is the most widely ratified international human rights treaty in history, yet it is missing from the Bill. I hope the Minister can clarify that for us all.
It will come as no surprise that Labour is proud that the Government have conceded on an important amendment that will see social media sites required to proactively remove animal torture content online. I first raised that issue in Committee more than a year ago. Vile animal torture content has no place online or in society. I am proud that it was the Labour party that first recognised the alarming gap in the Government’s earlier draft of the Bill. Research from the RSPCA showed that, in 2021, there were 756 reports of animal abuse on social media, compared with 431 in 2020 and 157 in 2019. We can all see that this horrific content is more widespread and common than we might initially have believed. Thanks to Labour party colleagues in the other place, particularly Baroness Merron, it will no longer be tolerated online.
I am particularly proud to see the Government adopt an amendment that represents a move towards a risk-based approach to service categorisation. This is an important point and a significant concession from the Government. Along with many others, I repeatedly warned the Government that, by focusing on a size versus risk approach to categorisation, the online safety regime was doomed to fail. Put simply, we could have been left in a position where some of the most harmful websites and platforms, including 4chan and BitChute, which regularly host and promote far right, antisemitic content, slipped through the cracks of the legislation. None of us wanted that to happen, but in May 2022 the then Minister chose not to accept a cross-party amendment in Committee that could have resolved the issue more than a year ago.
We are pleased to see progress, and thank colleagues in the other place for seeing sense, but that approach highlights the Government’s wider strategy for online safety: one based on delay and indecision. If we needed more proof, we only have to turn to the Government’s approach to allow researcher access to data relating to the online safety regime. Labour welcomes small Government amendments in the other place on this point, but there are real-world consequences if the Government do not consider improving levels of transparency. Other jurisdictions across the globe are looking at strengthening their data transparency provisions because they recognise that regulators such as Ofcom need academics and civil society to have sight of data in the most complex of cases. In Australia and Canada, there is real progress. Our friends across the pond in the USA have recently signed a deal with the EU that will see them committed to working together on researcher access to data.
The Secretary of State talks a good game about our world-leading universities and research environment, and claims that she wants the UK to be a leader, yet inaction is putting our country and our homegrown talent pool at a disadvantage. Let us be clear: access to data goes further than academics. In the last month, Elon Musk has sought to sue the Centre for Countering Digital Hate and the Anti-Defamation League, organisations filled to the brim with brilliant British research excellence. I recognise that the Government have made a loose commitment to go further in future legislation, but that has not been formally outlined. I sincerely hope that the promises made to bereaved parents about further progress in the Data Protection and Digital Information Bill are kept. I would be grateful for some reassurance from the Minister on that point.
Labour has long campaigned for stronger protections to keep people—both children and adults—safe online. The Bill has made remarkable progress, thanks to our colleagues in the other place coming together and genuinely putting people’s priorities over party politics and political gain. Labour has always supported an online safety regime, and has sought to work with the Government while raising valid concerns carefully throughout. We all want to keep people safe and there is broad consensus that social media companies have failed to regulate themselves. Labour is proud of the changes it has developed and pushed on, but this is not the end. We will continue to push the Government to deliver this regime in good time and to ensure that it is reviewed regularly and has appropriate parliamentary oversight. After all, children and adults across the UK deserve, as a first priority, to be kept safe. The Minister knows we will be closely watching.
Several hon. Members rose—

Mr Deputy Speaker (Sir Roger Gale)

I call the Chair of the Select Committee.

Dame Caroline Dinenage (Gosport) (Con)

I welcome the return of the Online Safety Bill from its exhaustive consideration in the other place. As the Minister knows, this vital legislation kicked off several years ago under the leadership of my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), with the ambitious aim of making the UK the safest place in the world to go online. While other countries picked at the edges of that, we were the first place in the world to set ourselves such an ambitious task.

The legislation is mammoth in size and globally significant in scope. Its delivery has been long-winded and I am so pleased that we have got to where we are now. As one of the Ministers who carried the baton for this legislation for around 19 months, I understand the balance to be struck between freedom of speech campaigners, charities and the large pressures from the platforms to get this right.

Sir William Cash

I commend my hon. Friend for her remarks. May I point out that there is a provision in European legislation—I speak as Chairman of the European Scrutiny Committee—called the data services protection arrangements? They have nothing to compare with what we have in the Bill. That demonstrates the fact that when we legislate for ourselves we can get it right. That is something people ought to bear in mind.

Dame Caroline Dinenage

My hon. Friend is absolutely right to point that out. Much of the European legislation on this was taken from our own draft legislation, but has not gone anywhere near as far in the protections it offers.

We know that the internet is magnificent and life changing in so many ways, but that the dark corners present such a serious concern for children and scores of other vulnerable people. I associate myself with the comments of the hon. Member for Pontypridd (Alex Davies-Jones) on the child protection campaigners who have worked so incredibly hard on this issue, particularly those who have experienced the unimaginable tragedy of losing children as a result of what they have seen in the online world. To turn an unspeakable tragedy of that nature into a campaign to save the lives of others is the ultimate thing to do, and they deserve our massive thanks and gratitude.

I am also grateful to so many of our noble colleagues who have shaped the Bill using their unique knowledge and expertise. I would like to mention a few of them and welcome the changes they brought, but also thank the Minister and the Government for accepting so many of the challenges they brought forward and adapting them into the Bill. We all owe a massive debt of gratitude to Baroness Kidron for her tireless campaign for children’s protections. A children’s safety stalwart and pioneer for many years, virtually no one else knows more about this vital issue. It is absolutely right that the cornerstone and priority of the Bill must be to protect children. The Minister mentioned that the statistics are absolutely horrible and disturbing. That is why it is important that the Secretary of State will now be able to require providers to retain data relating to child sexual exploitation and abuse, ensuring that law enforcement does not have one hand tied behind its back when it comes to investigating these terrible crimes.

I also welcome the commitment to the new powers given to Ofcom and the expectations of providers regarding access to content and information in the terrible event of the death of a child. The tragic suicide of Molly Russell, the long and valiant battle of her dad, Ian, to get access to the social media content that played such a key role in it, and the delay that brought to the inquest, is the only example we need of why this is absolutely the right thing to do. I know Baroness Kidron played a big part in that, as did my right hon. Friend the Member for Bromsgrove (Sajid Javid).

I am still concerned that there are not enough protections for vulnerable adults or for when people reach the cliff-edge of the age of 18. People of all ages need protection from extremely harmful content online. I am still not 100% convinced that user empowerment tools will provide that, but I look forward to being proved wrong.

I welcome the news that Ofcom is now required to produce guidance setting out how companies can tackle online violence against women and girls and demonstrate best practice. I am thankful to the former Equalities Minister, Baroness Morgan of Cotes, for her work on that. It is a vital piece of the puzzle that was missing from the original Bill, which did not specifically mention women or girls at all as far as I can remember.

It is important to stay faithful to the original thread of the Bill. To futureproof it, it has to be about systems and processes, rather than specific threats, but the simple fact is that the online world is so much more hostile for women. For black women, it is even worse. Illegal activity such as stalking and harassment is a daily occurrence for so many women and girls online. Over one in 10 women in England have experienced online violence and three in 10 have witnessed it. We also know that women and girls are disproportionately affected by the abuse of intimate images and the sharing of deepfakes, so it is welcome that those will become an offence. I also welcome that controlling and coercive behaviour, which has been made a recognised offence in real life, will now be listed as a priority offence online. That is something else the Government should take pride in.

I thank Baroness Merron for bringing animal welfare into the scope of the Bill. All in-scope platforms will have proactive duties to tackle content amounting to the offence of causing unnecessary suffering of animals. I thank Ministers for taking that on board. Anyone who victimises beings smaller and weaker than themselves, whether children or animals, is the most despicable kind of coward. It shows the level of depravity in parts of the online world that the act of hurting animals for pleasure is even a thing. A recent BBC story uncovered the torture of baby monkeys in Indonesia. The fact that individuals in the UK and the US are profiting from that, and that it was shared on platforms like Facebook, is horrifying.

In the brief time left available to me, I must admit to still being a bit confused over the Government’s stance on end-to-end encryption. It sounds like the Minister has acknowledged that there is no sufficiently accurate and privacy-preserving technology currently in existence, and that the last resort power would only come into effect once the technology was there. Technically, that means the Government have not moved on the requirement of Ofcom to use last resort powers. Many security experts believe it could be many years before any such technology is developed, if ever, and that worries me. I am, of course, very supportive of protecting user privacy, but it is also fundamentally right that terrorism or child sexual exploitation rings should not be able to proliferate unhindered on these channels. The right to privacy must be trumped by the need to stop events that could lead to mass death and the harm of innocent adults and children. As my hon. Friend the Member for Folkestone and Hythe (Damian Collins) said, that is also against their terms of service. I would therefore welcome it if the Minister were to make a couple of comments on that.

I also welcome the changes brought forward by Baroness Morgan of Cotes on the categorisation of harm. I, too, have been one of the long-standing voices over successive stages of the Bill saying that a platform’s size should not be the only measure of harm. Clearly, massive platforms, by definition of their reach, have huge potential to spread harmful content, but we know that online platforms can go viral overnight. We know there are some small but incredibly pernicious platforms out there. Surely the harmful content on a site should be the definer of how harmful it is, not just its size. I welcome the increased flexibility for the Secretary of State to set a threshold based on the number of users, or the functionality offered, or both. I would love to know a little more about how that would work in practice.

We were the first country in the world to set out the ambitious target of comprehensive online safety legislation. Since then, so much time has passed. Other countries and the EU have legislated while we have refined and in the meantime so much harm has been able to proliferate. We now need to get this done. We are so close to getting this legislation over the finish line. Can the Minister assure me that we are sending out a very clear message to providers that they must start their work now? They must not necessarily wait for this legislation to be in place because people are suffering while the delays happen.

I put on record my thanks to Members of this House and the other place who have worked so hard to get the legislation into such a great state, and to Ministers who have listened very carefully to all their suggestions and expertise. Finally, I put on record my thanks to the incredible Government officials. I was responsible for shepherding the Bill for a mere 19 months. It nearly finished me off, but some officials have been involved in it right from the beginning. They deserve our enormous gratitude for everything they have done.

Several hon. Members rose—

Mr Deputy Speaker (Sir Roger Gale)

Order. Thirteen Members wish to participate in the debate. The winding-up speeches will need to start shortly before 5 pm, and the Minister has indicated that he has quite a bit to say. I therefore suggest a self-denying ordinance of between seven and eight minutes following the speech from the Scottish National party spokesman. It is up to colleagues, because we have not imposed a mandatory time limit at this stage, but if Members are sensible and not greedy, everyone should get in with no difficulty.

14:59
Kirsty Blackman (Aberdeen North) (SNP)

It is a pleasure to speak during what I hope are the final stages of the Bill. Given that nearly all the Bills on which I have spoken up to now have been money Bills, this business of “coming back from the Lords” and scrutinising Lords amendments has not been part of my experience, so if I get anything wrong, I apologise.

Like other Members, I want to begin by thanking a number of people and organisations, including the Mental Health Foundation, Carnegie UK, the Internet Watch Foundation, the National Society for the Prevention of Cruelty to Children and two researchers for the SNP, Aaron Lucas and Josh Simmonds-Upton, for all their work, advice, knowledge and wisdom. I also join the hon. Members for Pontypridd (Alex Davies-Jones) and for Gosport (Dame Caroline Dinenage) in thanking the families involved for the huge amount of time and energy—and the huge amount of themselves—that they have had to pour into the process in order to secure these changes. This is the beginning of the culmination of all their hard work. It will make a difference today, and it will make a difference when the Bill is enacted. Members in all parts of the House will do what we can to continue to scrutinise its operation to ensure that it works as intended, to ensure that children are kept as safe as possible online, and to ensure that Ofcom uses these powers to persuade platforms to provide the information that they will be required to provide following the death of a child about that child’s use of social media.

The Bill is about keeping people safe. It is a different Bill from the one that began its parliamentary journey, I think, more than two years ago. I have seen various Ministers leading from the Dispatch Box during that time, but the voices around the Chamber have been consistent, from the Conservative, Labour and SNP Benches. All the Members who have spoken have agreed that we want the internet to be a safer place. I am extremely glad that the Government have made so many concessions that the Opposition parties called for. I congratulate the hon. Member for Pontypridd on the inclusion of violence against women and girls in the Bill. She championed that in Committee, and I am glad that the Government have made the change.

Another change that the Government have made relates to small high-risk platforms. Back in May or June last year I tabled amendments 80, 81 and 82, which called for that categorisation to be changed so that it was not based just on the number of users. I think it was the hon. Member for Gosport who mentioned 4chan, and I have mentioned Kiwi Farms a number of times in the Chamber. Such organisations cannot be allowed to get away with horrific, vile content that encourages violence. They cannot be allowed a lower bar just because they have a smaller number of users.

The National Risk Register produced by the Cabinet Office—great bedtime reading which I thoroughly recommend—states that both the risk and the likelihood of harm and the number of people on whom it will have an impact should be taken into account before a decision is made. It is therefore entirely sensible for the Government to take into account both the number of users, when it is a significant number, and the extremely high risk of harm caused by some of these providers.

Sir John Hayes

The hon. Lady is making an excellent speech, but it is critical to understand that this is not just about wickedness that would have taken place anyway but is now taking place on the internet; it is about the internet catalysing and exaggerating that wickedness, and spawning and encouraging all kinds of malevolence. We have a big responsibility in this place to regulate, control and indeed stop this, and the hon. Lady is right to emphasise that.

Kirsty Blackman

The right hon. Gentleman is entirely correct. Whether it involves a particularly right-wing cause or antisemitism—or, indeed, dieting content that drags people into something more radical in relation to eating disorders—the bubble mentality created by these algorithms massively increases the risk of radicalisation, and we therefore have an increased duty to protect people.

As I have said, I am pleased to see the positive changes that have been made as a result of Opposition pressure and the uncompromising efforts of those in the House of Lords, especially Baroness Kidron, who has been nothing short of tenacious. Throughout the time in which we have been discussing the Bill, I have spoken to Members of both Houses about it, and it has been very unusual to come across anyone who knows what they are talking about, and, in particular, has the incredible depth of knowledge, understanding and wisdom shown by Baroness Kidron. I was able to speak to her as someone who practically grew up on the internet—we had it at home when I was eight—but she knew far more about it than I did. I am extremely pleased that the Government have worked with her to improve the Bill, and have accepted that she has a huge breadth of knowledge. She managed to do what we did not quite manage to do in this House, although hopefully we laid the foundations.

I want to refer to a number of points that were mentioned by the Minister and are also mentioned in the letters that the Government provided relating to the Lords amendments. Algorithmic scrutiny is incredibly important, and I, along with other Members, have raised it a number of times—again, in connection with concern about radicalisation. Some organisations have been doing better things recently. For instance, someone who searches for something may begin to go down a rabbit hole. Some companies are now putting up a flag, for instance a video, suggesting that users are going down a dark hole and should look at something a bit lighter, and directing them away from the autoplaying of the more radical content. If all organisations, or at least a significant number—particularly those with high traffic—can be encouraged to take such action rather than allowing people to be driven to more extreme content, that will be a positive step.

I was pleased to hear about the upcoming researcher access report, and about the report on app stores. I asked a previous Minister about app stores a year or so ago, and the Minister said that they were not included, and that was the end of it. Given the risk that is posed by app stores, the fact that they were not categorised as user-to-user content concerned me greatly. Someone who wants to put something on an Apple app store has to jump through Apple’s hoops. The content is not owned by the app store, and the same applies to some of the material on the PlayStation store. It is owned by the person who created the content, and it is therefore user-to-user content. In some cases, it is created by one individual. There is no ongoing review of that. Age-rating is another issue: app stores choose whatever age they happen to decide is the most important. Some of the dating apps, such as match.com, have been active in that regard, and have made it clear that their platforms are not for under-16s or under-18s, while the app store has rated the content as being for a younger age than the users’ actual age. That is of concern, especially if the companies are trying to improve age-rating.

On the subject of age rating, I am pleased to see more in the Bill about age assurance and the frameworks. I am particularly pleased to see what is going to happen in relation to trying to stop children being able to access pornography. That is incredibly important but it had been missing from the Bill. I understand that Baroness Floella Benjamin has done a huge amount of work on pushing this forward and ensuring that parliamentarians are briefed on it, and I thank her for the work that she has done. Human trafficking has also been included. Again, that was something that we pushed for, and I am glad to see that it has been put on the face of the Bill.

I want to talk briefly about the review mechanisms, then I will go on to talk about end-to-end encryption. I am still concerned that the review mechanisms are not strong enough. We have pushed to have a parliamentary Committee convened, for example, to review this legislation. This is the fastest moving area of life. Things are changing so dramatically. How many people in here had even heard of ChatGPT a year and a half ago? How many people had used a virtual reality headset? How many people had accessed Rec Room or any of the other VR systems? I understand that the Government have genuinely tried their best to make the Bill as future-proof as possible, but we have no parliamentary scrutiny mechanisms written in. I am not trying to undermine the work of the Committee on this—I think it is incredibly important—but Select Committees are busy and they have no legislative power in this regard. If the Government had written in a review, that would have been incredibly helpful.

Mr David Davis (Haltemprice and Howden) (Con)

The hon. Lady is making a very good speech. When I first came to this House, which was rather a long time ago now, there was a Companies Act every year, because company law was changing at the time, as was the nature of post-war capitalism. It seems to me that there is a strong argument for an annual Act on the handling and management of the internet. What she is saying is exactly right, and that is probably where we will end up.

Kirsty Blackman

I completely support the right hon. Member’s point—I would love to see this happening on an annual basis. I am sure that the Ministers who have shepherded the Bill through would be terrified of that, and that the Government team sitting over there are probably quaking in their boots at the suggestion, but given how fast this moves, I think that this would be incredibly important.

The Government’s record on post-implementation reviews of legislation is pretty shoddy. If you ask Government Departments what percentage of legislation they have put through a post-implementation review in the timeline they were supposed to, they will say that it is very small. Some Departments are a bit better than others, but given the number of reshuffles there have been, some do not even know which pieces of legislation they are supposed to be post-implementation reviewing. I am concerned that this legislation will get lost, and that there is no legislative back-up to any of the mechanisms for reviewing it. The Minister has said that it will be kept under review, but can we have some sort of governmental commitment that an actual review will take place, and that legislation will be forthcoming if necessary, to ensure that the implementation of this Bill is carried out as intended? We are not necessarily asking the Government to change it; we are just asking them to cover all the things that they intend it to cover.

On end-to-end encryption, on child sexual exploitation and abuse materials, and on the last resort powers—I have been consistent with every Minister I have spoken to across the Dispatch Box and every time I have spoken to hon. Members about this—when there is any use of child sexual exploitation material or child sexual abuse material, we should be able to require the provider to find it. That absolutely trumps privacy. The largest increase in child sexual abuse material is in self-generated content. That is horrific. We are seeing a massive increase in that number. We need providers to be able to search—using the hash numbers that they can categorise images with, or however they want to do it—for people who are sharing this material in order to allow the authorities to arrest them and put them behind bars so that they cannot cause any more harm to children. That is more important than any privacy concerns. Although Ministers have not put it in the Bill until this point, they have, to their credit, been clear that that is more important than any privacy concerns, and that protecting children trumps those concerns when it comes to abuse materials and exploitation. I am glad to see that that is now written into the Bill; it is important that it was not just stated at the Dispatch Box, even though it was mentioned by a number of Members.

15:15
I have spoken about the huge number of online harms, the huge number of issues with accessing the internet and the huge number of concerns we have for the future, but I also need to say that the internet is a wonderful place. It is absolutely great to be able to go and play online games. It is great to be able to have a community of people that I can speak to online and have a conversation with. It is good that the Government have included in some of the risks the issues about platforms where adults can contact children, for example. That was another thing I addressed during the course of the amendments. It is great that people can find their tribe online in a way that they perhaps cannot do in real life. It is brilliant that people can have a try-out at being someone else online. That is not about trying to confuse or upset people or about catfishing. Sometimes we need to have a wee bit of self-exploration in order to try and work out who we are. There are so many positive aspects of the internet, but we need to ensure that children and the most vulnerable adults in particular are kept safe online.
This is not the perfect Bill. This is not necessarily the Bill that I would have liked to see. It has gone through so many changes and iterations over the time we have been trying to scrutinise it that some of it has gone back to what it previously looked like, except the harmful content in relation to adults. I am pleased that the internet will be a safer place for our children and our children’s children. I am pleased that they will have more protections online. I have an amount of faith and cautious optimism in the work of Ofcom, because of how fast it has been scaling up and because of the incredible people it has employed to work there—they really know what they are talking about. I wish the Government and Ofcom every success in ensuring that the Bill is embedded and ensuring that the internet is as safe as possible. I would just really like a commitment from the Minister on ensuring that this legislation is kept under proper review and that legislative change will be made, should we identify any loopholes.
Damian Collins

The draft Bill was published in April 2021, so it is fantastic that we are now discussing its final stages after it has gone through its processes in the House of Lords. It went through pre-legislative scrutiny, then it was introduced here, committed to the Bill Committee, recommitted, came back to the House, went to the Lords and came back again. I do not think any Bill has had as much scrutiny and debate over such a long period of time as this one has had. Hon. Members have disagreed on it from time to time, but the spirit and motivation at every stage have never been political; it has been about trying to make the Bill the best it can possibly be. We have ended up with a process that has seen it get better through all its stages.

Picking up on the comments of the hon. Member for Aberdeen North (Kirsty Blackman) and others, the question of ongoing scrutiny of the regime is an important one. In the pre-legislative scrutiny Committee—the Joint Committee that I chaired—there was a recommendation that there should be a post-legislative scrutiny Committee or a new Joint Committee, perhaps for a limited period. The pre-legislative scrutiny Committee benefited enormously from being a Committee of both Houses. Baroness Kidron has rightly been mentioned by Members today and she is watching us today from the Gallery. She is keeping her scrutiny of the passage of the Bill going from her position of advantage in the Gallery.

We have discussed a number of new technologies during the Bill’s passage that were not discussed at all on Second Reading because they were not live, including the metaverse and large language models. We are reassured that the Bill is futureproof, but we will not know until we come across such things. Ongoing scrutiny of the regime, the codes of practice and Ofcom’s risk registers is more than any one Select Committee can do. The Government have previously spoken favourably of the idea of post-legislative scrutiny, and it would be good if the Minister could say whether that is still under consideration.

Sir John Hayes

My hon. Friend makes a powerful point, echoing the comments of Members on both sides of the House. He is absolutely right that, as well as the scale and character of internet harms, their dynamism is a feature that Governments must take seriously. The problem, it seems to me, is that the pace of technological change, in this area and in others, does not fit easily with the thoroughness of the democratic legislative process; we tend to want to do things at length, because we want to scrutinise them properly, and that takes time. How does my hon. Friend square that in his own mind, and what would he recommend to the Government?

Damian Collins

The length of the process we have gone through on this Bill is a good thing, because we have ended up with probably the most comprehensive legislation in the world. We have a regulator with more power, and more power to sanction, than anywhere else. It is important to get that right.

A lot of the regulation is principle-based. It is about the regulation of user-to-user services, whereby people share things with each other through an intermediary service. Technology will develop, but those principles will underpin a lot of it. There will be specific cases where we need to think about whether the regulatory oversight works in a metaverse environment in which we are dealing with harms created by speech that has no footprint. How do we monitor and scrutinise that?

One of the hardest challenges could be making sure that companies continue to use appropriate technology to identify and mitigate harms on their platforms. The problem we have had with the regime to date is that we have relied on self-reporting from the technology companies on what is or is not possible. Indeed, the debate about end-to-end encryption is another example. The companies are saying that, if they share too much data, there is a danger that it will break encryption, but they will not say what data they gather or how they use it. For example, they will not say how they identify illegal use of their platform. Can they see the messages that people have sent after they have sent them? They will not publicly acknowledge it, and they will not say what data they gather and what triggers they could use to intervene, but the regulator will now have the right to see them. That principle of accountability and the power of the regulator to scrutinise are the two things that make me confident that this will work, but we may need to make amendments because of new things that we have not yet thought about.

Sir William Cash

In addition to the idea of annual scrutiny raised by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), does my hon. Friend think it would be a reasonably good idea for the Select Committee on Culture, Media and Sport to set up a Sub-Committee under its Standing Orders to keep an eye on this stuff? My hon. Friend was a great Chairman of that Select Committee, and such a Sub-Committee would allow the annual monitoring of all the things that could go wrong, and it could also try to keep up with the pace of change.

Damian Collins

When I chaired the Digital, Culture, Media and Sport Committee, we set up a Sub-Committee to consider these issues and internet regulation. Of course, the Sub-Committee has the same members. It is up to the Select Committee to determine how it structures itself and spends its time, but there is only so much that any one departmental Select Committee can do among its huge range of other responsibilities. It might be worth thinking about a special Committee, drawing on the powers and knowledge of both Houses, but that is not a matter for the Bill. As my hon. Friend knows, it is a matter of amending the Standing Orders of the House, and the House must decide that it wants to create such a Committee. I think it is something we should consider.

We must make sure that encrypted services have proper transparency and accountability, and we must bring in skilled experts. Members have talked about researcher access to the companies’ data and information, and it cannot be a free-for-all; there has to be a process by which a researcher applies to get privileged access to a company’s information. Indeed, as part of responding to Ofcom’s risk registers, a company could say that allowing researchers access is one of the ways it seeks to ensure safe use of its platform, by seeking the help of others to identify harm.

There is nothing to stop Ofcom appointing many researchers. The Bill gives Ofcom the power to delegate its authority and its powers to outside expert researchers to investigate matters on its behalf. In my view, that would be a good thing for Ofcom to do, because it will not have all the expertise in-house. The power to appoint a skilled person to use the powers of Ofcom exists within the Bill, and Ofcom should say that it intends to use that power widely. I would be grateful if the Minister could confirm that Ofcom has that power in the Bill.

Jamie Stone (Caithness, Sutherland and Easter Ross) (LD)

It is very kind of you to call me to speak, Mr Deputy Speaker. I apologise to your good self, to the Minister and to the House for arriving rather tardily.

My daughter and her husband have been staying with me over the past few days. When I get up to make my wife and myself an early-morning cup of tea, I find my two grandchildren sitting in the kitchen with their iPads, which does not half bring home the dangers. I look at them and think, “Gosh, I hope there is security, because they are just little kids.” I worry about that kind of thing. As everyone has said, keeping children safe is ever more important.

The Bill’s progress shows some of the best aspects of this place and the other place working together to improve legislation. The shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), and the hon. Member for Aberdeen North (Kirsty Blackman) both mentioned that, and it has been encouraging to see how the Bill has come together. However, as others have said, it has taken a long time and there have been a lot of delays. Perhaps that was unavoidable, but it is regrettable. It has been difficult for the Government to get the Bill to where it is today, and the trouble is that the delays mean there will probably be more victims before the Bill is enacted. We see before us a much-changed Bill, and I thank the Lords for their 150 amendments. They have put in a lot of hard work, as others have said.

The Secretary of State’s powers worry my party and me, and I wonder whether the Bill still fails to tackle harmful activity effectively. Perhaps better things could be done, but we are where we are. I welcome the addition of new offences, such as encouraging self-harm and intimate image abuse. A future Bill might be needed to set out the thresholds for the prosecution of non-fatal self-harm. We may also need further work on the intent requirement for cyber-flashing, and on whether Ofcom can introduce such requirements. I am encouraged by what we have heard from the Minister.

We would also have liked to see more movement on risk assessment, as terms of service should be subject to a mandatory risk assessment. My party remains unconvinced that we have got to grips with the metaverse—this terrifying new thing that has come at us. I think there is work to be done on that, and we will see what happens in the future.

As others have said, education is crucial. I hope that my grandchildren, sitting there with their iPads, have been told as much as possible by their teachers, my daughter and my son-in-law about what to do and what not to do. That leads me on to the huge importance of the parent being able, where necessary, to intervene rapidly, because this has to be done damned quickly. If it looks like they are going down a black hole, we want to stop that right away. A kid could see something horrid that could damage them for life—it could be that bad.

Kirsty Blackman

Once a child sees something, they cannot unsee it. This is not just about parental controls; we hope that the requirement on the companies to do the risk assessments and on Ofcom to look at those will mean that those issues are stopped before they even get to the point of requiring parental controls. I hope that such an approach will make this safer by design when it begins to operate, rather than relying on having an active parent who is not working three jobs and therefore has time to moderate what their children are doing online.

Jamie Stone

The hon. Lady makes an excellent point. Let me just illustrate it by saying that each of us in our childhood, when we were little—when we were four, five or six—saw something that frightened us. Oddly enough, we never forget that throughout the rest of life, do we? That is what bad dreams are made of. We should remember that point, which is why those are wise words indeed.

15:30
Finally, I shall try your excellent patience, Mr Deputy Speaker, with a few words about encryption, to which reference has been made. I commend the Government for their recognition of the dangers that exist online and the inadequacy of current protections. However, regulation and enforcement must be based on clear evidence of well-defined harm and must respect the rights to privacy and free expression of those who use social media legally and responsibly. On encryption, for the vast majority, privacy means security. We have always to test that theory, but I think that is what most of us believe. If I picked it up right, the right hon. Member for Haltemprice and Howden (Mr Davis) said that this should be revisited on a regular basis. Perhaps the advisers that Ofcom will hire will address this sort of thing, but this is about constant vigilance, is it not? Let me put it on the record that my party would fundamentally oppose any attempts to undermine or weaken encryption.
Once again, I wish to thank all the Members who have put together a good piece of legislation. In the spirit of generosity, let me say that the Government have tried their very best on a tricky issue, and I give credit to those on both sides of the House for this step in the right direction.
Dame Maria Miller (Basingstoke) (Con)

This Bill may well have been with us since April 2021 and been subject to significant change, but it remains a Bill about keeping people safer online and it remains groundbreaking. I welcome it back after scrutiny in the Lords and join others in paying tribute to those who have campaigned for social media platforms to release information following the death of a child. I am pleased that some are able to be with us today to hear this debate and the commitment to that issue.

This will never be a perfect Bill, but we must recognise that it is good enough and that we need to get it on to the statute book. The Minister has helped by saying clearly that this is not the endgame and that scrutiny will be inherent in the future of this legislation. I hope that he will heed the comments of my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who encouraged him to set up a bespoke Committee, which was one of the recommendations from the initial scrutiny of the Bill.

I will confine my remarks to the Government’s Lords amendment 263 and those surrounding it, which inserted the amendments I tabled on Report into the Bill. They relate to the sharing of intimate images online, including deepfakes, without consent. I wish wholeheartedly to say thank you to the Minister, who always listens intently, to the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar), who has recently joined him, and to the Secretary of State for Science, Innovation and Technology. They have all not only listened to the arguments on intimate image abuse, but acted. The changes today are no less a testament to their commitment to this Bill than any other area. Focusing on children’s safety is very important, but the safety of adults online is also important. We started on a journey to address intimate image abuse way back in 2015, with the Criminal Justice and Courts Act 2015, and we have learned to provide that protection much better, mostly through the work of the Law Commission and its report on how we should be tackling intimate image abuse online.

The Bill, as it has been amended, has been changed fundamentally on the treatment of intimate image abuse, in line with the debate on Report in this place. That has created four new offences. The base offence removes the idea of intent to cause distress entirely and relies only on whether there was consent from the person appearing in the image. Two more serious offences do include intent, with one being sending an image with intent to cause alarm and distress. We also now have the offence of threatening to share an image, which will protect people from potential blackmail, particularly from an abusive partner. That will make a huge difference for victims, who are still overwhelmingly women.

In his closing comments, will the Minister address the gaps that still exist, particularly around the issue of the images themselves, which, because of the scope of the Bill, will not become illegal? He and his colleagues have indicated that more legislation might be in the planning stages to address those particular recommendations by the Law Commission. Perhaps he could also comment on something that the Revenge Porn Helpline is increasingly being told by victims, which is that online platforms will not remove an image even though it may have been posted illegally, and that will not change in the future. Perhaps he can give me and those victims who might be listening today some comfort that either there are ways of addressing that matter now or that he will address it in the very near future.

Richard Burgon (Leeds East) (Lab)

As we reflect on the Bill today, it is important to say that it has been improved as it has progressed through Parliament. That is due in no small measure to Members from across the parties—both here and in the other place—who have engaged very collegiately, and to individuals and groups outside this place, particularly the Samaritans and those who have lived experience of the consequences of the dangers of the internet.

People from my constituency have also been involved, including the family of Joe Nihill, whom I have mentioned previously. At the age of 23, Joe took his own life after accessing dangerous suicide-related online content. His mother, Catherine, and sister-in-law, Melanie, have bravely campaigned to use the Online Safety Bill as an opportunity to ensure that what happened to Joe so tragically does not happen to others. I thank the Minister and his team for meeting Joe’s mother, his sister-in-law and me, and for listening to what we had to say. I recognise that, as a result, the Bill has improved, in particular with the Government’s acceptance of Lords amendment 391, which was first tabled by Baroness Morgan of Cotes. It is welcome that the Government have accepted the amendment, which will enable platforms to be placed in category 1 based on their functionality, even if they do not have a large reach. That is important, because some of the worst and most dangerous online suicide and self-harm related material appears on smaller platforms rather than the larger ones.

I also welcome the fact that the Bill creates a new communications offence of encouraging or assisting self-harm and makes such content a further priority for action, which is important. The Bill provides an historic opportunity to ensure that tackling suicide and self-harm related online content does not end with this Bill becoming law. I urge the Government to listen very carefully to what the Samaritans have said. As my hon. Friend the shadow Minister asked, will the Government commit to a review of the legislation to ensure that it has met the objective of making our country the safest place in the world in which to go online? Importantly, can the Government confirm when the consultation on the new offence of encouraging or assisting self-harm will take place?

As I mentioned in an intervention, it is clear that the Government want to tackle harmful suicide and self-harm related content with the Bill, but, as we have heard throughout our discussions, the measures do not go far enough. The Samaritans were correct to say that the Bill represents a welcome advance and that it has improved recently, but it still does not go far enough in relation to dangerous suicide and self-harm online content. How will the Government engage with people who have lived experience—people such as Melanie and Catherine—to ensure that the new laws make things better? Nobody wants the implementation of the Bill to be the end of the matter. We must redouble our efforts to make the internet as safe a place as possible, reflect on the experiences of my constituents, Joe Nihill and his family, and understand that there is a lot of dangerous suicide and self-harm related content out there. We are talking about people who exploit the vulnerable, regardless of their age.

I urge all those who are following the progress of the Bill and who look at this issue not to make the mistake of thinking that when we talk about dangerous online suicide and self-harm related content, it is somehow about freedom of speech. It is about protecting people. When we talk about dangerous online material relating to suicide and self-harm, it is not a freedom of speech issue; it is an issue of protecting people.

Sir William Cash

Has the hon. Gentleman noted, I hope with satisfaction, that the Government yesterday and today have made statements on a strategy for preventing suicide nationally, and that what he is saying—which I agree with—will be implemented? It has just been announced, it is very important and it is related to the Bill.

Richard Burgon

I thank the hon. Gentleman for his intervention. It is important that the Government have announced a strategy: it is part and parcel of the ongoing work that is so necessary when we consider the prevalence of suicide as the leading cause of death among young men and women. It is a scourge across society. People should not make the mistake of thinking that the internet merely showcases awful things. The internet has been used as a tool by exploitative and sometimes disturbed individuals to create more misery and more instances of awful things happening, and to lead others down a dangerous path that sometimes ends, sadly, in them taking their own lives.

I thank the Minister for his engagement with my constituents, and the shadow Minister for what she has done. I also thank Baroness Kidron, Baroness Morgan and hon. Members who have engaged with this issue. I urge the Government to see the Bill not as the end when it comes to tackling dangerous online content related to suicide and self-harm, but as part of an important ongoing journey that we all work on together.

Siobhan Baillie (Stroud) (Con)

I rise to speak to Lords amendment 231 on visible identity verification. I will not press the amendment to a vote. I have had several discussions with Ministers and the Secretary of State, and I am grateful for their time. I will explain a little more.

The dry nature of the amendment masks the fact that the issue of identity verification—or lack of it—affects millions of people around the country. We increasingly live our lives online, so the public being able to know who is or is not a real person online is a key part of the UK being the safest place to be on the internet, which is the Bill’s ambition. Unfortunately, too often it feels as though we have to wade through nutters, bots, fake accounts and other nasties before coming to a real person we want to hear from. The Bill takes huge steps to empower users to change that, but there is more to do.

Hon. Members will recall that I have campaigned for years to tackle anonymous abuse. I thank Stroud constituents, celebrities and parents who have brought to me sad stories that I have conveyed to the House involving abuse about the deaths of babies and children and about disabled children. That is absolutely awful.

Alongside a smart Stroud constituent and Clean Up The Internet—a fantastic organisation—we have fought and argued for social media users to have the option of being verified online; for them to be able to follow and be followed only by verified accounts, if that is what they want; and, crucially, to make it clear who is and is not verified online. People can still be Princess Unicorn if they want, but at the back end, their address and details can be held, and that will give confidence.

Sir John Hayes

My hon. Friend is making a powerful case. Umberto Eco, the Italian philosopher, described the internet as the empire of imbeciles, and much of social media is indeed imbecilic—but it is much worse than that. My hon. Friend is right that the internet provides a hiding place for the kind of malevolence she has described. Does she agree that the critical thing is for the Government to look again at the responsibility of those who publish this material? If it were written material, the publisher would have a legal liability. That is not true of internet companies. Is that a way forward?

15:47
Siobhan Baillie

I am interested in that intervention, but I fear it would lead us into a very long discussion and I want to keep my comments focused on my amendment. However, it would be interesting to hear from the Minister in response to that point, because it is a huge topic for debate.

On the point about whether someone is real or not real online, I believe passionately that not only famous people or those who can afford it should be able to show that they are a real and verified person. I say, “Roll out the blue ticks”—or the equivalents—and not just to make the social media platforms more money; as we have seen, we need it as a safety mechanism and a personal responsibility mechanism.

All the evidence and endless polling show that the public want to know who is and who is not real online, and it does not take rocket science to understand why. Dealing with faceless, anonymous accounts is very scary and anonymous abusers are terrifying. Parents are worried that they do not know who their children are speaking to, and anonymous, unverified accounts cannot be traced if details are not held.

That is before we get to how visible verification can help to tackle fraud. We should empower people to avoid fake accounts. We know that people are less likely to engage with an unverified account, and it would make it easy to catch scammers. Fraud was the most common form of crime in 2022, with 41% of all crimes being fraud, 23% of all reported fraud being initiated on social media and 80% of fraud being cyber-related. We can imagine just how fantastically clever the scams will become through AI.

Since we started this process, tech companies have recognised the value of identity verification to the public, so much so that they now sell it on Twitter as blue ticks, and the Government understand the benefits of identity verification options. The Government have done a huge amount of work on that. I thank them for agreeing to two of the three pillars of my campaign, and I believe we can get there on visibility; I know from discussions with Government that Ofcom will be looking carefully at that.

Making things simple for social media users is incredibly important. For the user verification provisions in this Bill to fulfil their potential and prevent harm, including illegal harm, we believe that users need to be able to see who is and is not verified—that is, who is a real person—and all the evidence says that that is what the public wants.

While Ministers in this place and the other place have resisted putting visible verification on the face of the Bill, I am grateful to the Government for their work on this. After a lot of to-ing and fro-ing, we are reassured that the Bill as now worded gives Ofcom the powers to do what the public wants and what we are suggesting through codes and guidance. We hope that Ofcom will consider the role of anonymous, inauthentic and non-verified accounts as it prepares its register of risks relating to illegal content and in its risk profiles.

Dame Maria Miller

I pay tribute to the way my hon. Friend has focused on this issue through so many months and years. Does she agree that, in light of the assurances that she has had from the Minister, this is just the sort of issue that either a stand-alone committee or some kind of scrutiny group could keep an eye on? If those guidelines do not work as the Minister is hoping, the action she has suggested will need to be taken.

Siobhan Baillie

Absolutely. Given the fast nature of social media and the tech world, and how quickly they adapt—often for their own benefit, sadly—I think that a committee with that focus could work.

To wrap up, I thank MPs from across the House, and you, Madam Deputy Speaker, for your grace today. I have had help from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) in particular, for which I am very grateful. In the other place, Lord Clement-Jones, Lord Stevenson, Baroness Morgan, Baroness Fall and Baroness Wyld have all been absolutely excellent in pushing through these matters. I look forward to hearing what the Minister says, and thank everybody for their time.

Sir Jeremy Wright (Kenilworth and Southam) (Con)

As others have done, I welcome the considerable progress made on the Bill in the other place, both in the detailed scrutiny that it has received from noble Lords, who have taken a consistent and expert interest in it, and in the positive and consensual tone adopted by Opposition Front Benchers and, crucially, by Ministers.

It seems that there are very few Members of this House who have not had ministerial responsibility for the Bill at some point in what has been an extraordinarily extensive relay race as it has moved through its legislative stages. The anchor leg—the hardest bit in such a Bill—has been run with dedication and skill by my right hon. Friend the Secretary of State, who deserves all the praise that she will get for holding the baton as we cross the parliamentary finish line, as I hope we are close to doing.

I have been an advocate of humility in the way in which we all approach this legislation. It is genuinely difficult and novel territory. In general, I think that my right hon. Friend the Secretary of State and her Ministers—the noble Lord Parkinson and, of course, the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Sutton and Cheam (Paul Scully)—have been willing to change their minds when it was right to do so, and the Bill is better for it. Like others who have dealt with them, I also thank the officials, some of whom sit in the Box, some of whom do not. They have dedicated—as I suspect they would see it—most of their lives to the generation of the Bill, and we are grateful to them for their commitment.

Of course, as others have said, none of this means that the Bill is perfect; frankly, it was never going to be. Nor does it mean that when we pass the Bill, the job is done. We will then pass the baton to Ofcom, which will have a large amount of further work to do. However, we now need to finalise the legislative phase of this work after many years of consideration. For that reason, I welcome in particular what I think are sensible compromises on two significant issues that had yet to be resolved: first, the content of children’s risk assessments, and secondly, the categorisation process. I hope that the House will bear with me while I consider those in detail, which we have not yet done, starting with Lords amendments 17, 20 and 22, and Lords amendment 81 in relation to search, as well as the Government amendments in lieu of them.

Those Lords amendments insert harmful “features, functionalities or behaviours” into the list of matters that should be considered in the children’s risk assessment process and in the meeting of the safety duties, to add to the harms arising from the intrinsic nature of content itself—that is an important change. As others have done, I pay great tribute to the noble Baroness Kidron, who has invariably been the driving force behind so many of the positive enhancements to children’s online safety that the Bill will bring. She has promoted this enhancement, too. As she said, it is right to recognise and reflect in the legislation that a child’s online experience can be harmful not just as a result of the harm an individual piece of content can cause, but in the way that content is selected and presented to that child—in other words, the way in which the service is designed to operate. As she knows, however, I part company with the Lords amendments in the breadth of the language used, particularly the word “behaviours”.

Throughout our consideration of the Bill, I have taken the view that we should be less interested in passing legislation that sounds good and more interested in passing legislation that works. We need the regulator to be able to encourage and enforce improvements in online safety effectively. That means asking the online platforms to address the harms that it is within their power to address, and that relate clearly to the design or operation of the systems that they have put in place.

The difficulty with the wording of the Lords amendments is that they bring into the ambit of the legislation behaviours that are not necessarily enabled or created by the design or operation of the service. The language used is

“features, functionalities or behaviours (including those enabled or created by the design or operation of the service) that are harmful to children”—

in other words, not limited to those that are enabled or created by the service. It is a step too far to make platforms accountable for all behaviours that are harmful to children without the clarity of that link to what the platform has itself done. For that reason, I cannot support those Lords amendments.

However, the Government have proposed a sensible alternative approach in their amendments in lieu, particularly in relation to Lords amendment 17 and Lords amendment 81, which relates to search services. The Government amendments in lieu capture the central point that design of a service can lead to harm and require a service to assess that as part of the children’s risk assessment process. That is a significant expansion of a service’s responsibilities in the risk assessment process, which reflects not just ongoing concern about types of harm that were not adequately captured in the Bill so far but the positive moves we have all sought to make towards safety by design as an important preventive concept in online safety.

I also think it is important, given the potential scale of this expanded responsibility, to make clear that the concept of proportionality applies to a service’s approach to this element of assessment and mitigation of risk, as it does throughout the Bill, and I hope the Minister will be able to do that when he winds up the debate.

Sir William Cash

My right hon. and learned Friend has mentioned Ofcom several times. I would like to ask his opinion as to whether there should be, if there is not already, a special provision for a report by Ofcom on its own involvement in these processes during the course of its annual report every year, to be sure that we know that Ofcom is doing its job. In Parliament, we know what Select Committees are doing. The question is, what is Ofcom doing on a continuous basis?

Sir Jeremy Wright

My hon. Friend makes a fair point. One difficult part of our legislative journey with the Bill is to get right, in so far as we can, the balance between what the regulator should take responsibility for, what Ministers should take responsibility for and what the legislature—this Parliament—should take responsibility for. We may not have got that exactly right yet.

On my hon. Friend’s specific point, my understanding is that because Ofcom must report to Parliament in any event, it will certainly be Ofcom’s intention to report back on this. It will be quite a large slice of what Ofcom does from this point onwards, so it would be remarkable if it did not, but I think we will have to return to the points that my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and others have made about the nature of parliamentary scrutiny that is then required to ensure that we are all on top of this progress as it develops.

I was talking about what I would like my hon. Friend the Minister to say when he winds up the debate. I know he will not have a huge amount of time to do so, but he might also confirm that the balancing duties in relation to freedom of speech and privacy, for example, continue to apply to the fulfilment of the safety duties in this context as well. That would be helpful.

The Government amendments in lieu do not replicate the reference to design in the safety duties themselves, but I do not see that as problematic because, as I understand it, the risks identified in the risk assessment process, which will now include design risks, feed through to and give rise to the safety duties, so that if a design risk is identified in the risk assessment, a service is required to mitigate and address it. Again, I would be grateful if the Minister confirmed that.

We should also recognise that Government amendment (b) in lieu of Lords amendment 17 and Government amendments (b) and (c) in lieu of Lords amendment 81 specifically require consideration of

“functionalities or other features of the service that affect how much children use the service”

As far as I can tell, that introduces consideration of design-related addiction—recognisable to many parents; it cannot just be me—into the assessment process. These changes reflect the reality of how online harm to children manifests itself, and the Government are to be congratulated on including them, although, as I say, the Government and, subsequently, Ofcom will need to be clear about what these new expectations mean in practical terms for a platform considering its risk assessment process and seeking to comply with its safety duties.

I now turn to the amendments dealing with the categorisation process, which are Lords amendment 391 and the Government amendments arising from it. Lords amendment 391 would allow Ofcom to designate a service as a category 1 service, with the additional expectations and responsibility that brings, if it is of a certain scale or if it has certain functionalities, rather than both being required as was the case in the original Bill. The effect of the original drafting was, in essence, that only big platforms could be category 1 platforms and that big platforms were bound to be category 1 platforms. That gave rise to two problems that, as my hon. Friend the Minister knows, we have discussed before.

16:00
The first problem was that smaller platforms where highly harmful material was to be found, whether organically or because it was seeking refuge from the greater regulation of larger platforms, could not be made subject to a more restrictive regime. The second was that larger platforms whose operations give rise to very little concern in the context of this Bill—Wikipedia being a common example—would have to be subject to more extensive regulatory requirements than is justified by the risk they really present. Lords amendment 391 in the name of my right hon. Friend the noble Baroness Morgan seeks to resolve those two problems at once. Given that I proposed an identical amendment in this House, I am unsurprisingly in favour of it, and I congratulate Baroness Morgan on doing a better job of persuading the other place of its merits than I managed to do in this place. I am pleased to see the Government effectively accept that amendment today.
Finally, I will say a few words about the amendments tabled by other right hon. and hon. Members. If you will forgive me, Madam Deputy Speaker, in the interests of time, I will not speak to all the amendments proposed by my hon. Friend the Member for Yeovil (Mr Fysh)—I can see that you approve. However, from what I have just said, he will gather that I cannot support his amendment (a) to Lords amendment 1, which would limit application of all the safety duties in the Bill to
“providers of significant size and capacity, and with a substantial involvement in the communication of media content”.
I cannot support my hon. Friend’s amendment for both technical and substantive reasons. The technical reason is that Lords amendment 1 adds an introductory clause to the Bill that is designed to be a guide to its contents and effects, and his amendment to that clause is not followed through in the rest of the Bill. As such, the introductory clause would say that the Bill’s scope is limited to larger platforms only, but the rest of the Bill would not say the same. The more substantive reason is that in my view, my hon. Friend’s amendment is both inappropriate and unnecessary. It is inappropriate because highly harmful content can be found on smaller platforms, and all platforms should surely do what they can to minimise harm to children and the presence of illegal content on their service, which are the focuses of the Bill. It is unnecessary because the concept of proportionality runs through the Bill, so the regulator’s expectations of small platforms can and should be different from its expectations of large ones.
My hon. Friend’s other amendments seek to avoid introducing, by means of the imposition of the safety duties, what he describes as
“systemic weakness and vulnerabilities relating to compliance with the duties”.
He seeks to do so in a number of places in the Bill. However, that concept of systemic weaknesses and vulnerabilities is not defined and could be extraordinarily wide, potentially undermining the whole purpose of those safety duties. I am being slightly unfair to my hon. Friend, because he has not spoken yet, but I think he is primarily concerned with the Bill’s effect on encrypted services. Others have expressed concern, too—my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and the hon. Member for Brighton, Pavilion (Caroline Lucas) have made their concern known through their amendment to Lords amendment 217—which raises an important question about where we are on encryption. Throughout the progress of the Bill, Ministers have been clear that it involves no ban on the use of encryption. However, as others have said, there will need to be some further clarity—not least, by the way, about the interaction of the regime we are creating with the data protection regime and the involvement of the Information Commissioner’s Office.
Encryption clearly cannot be a “get out of jail free” card for safety duty compliance. Surely, people cannot say, “I operate an encrypted service, so I do not have to comply with the safety duties.” Does it therefore follow that if there is no prohibition on the use of encryption and no exemption from safety duties just because a service uses it, each service that is within the scope of the Bill and uses encryption must show Ofcom that it can meet its safety duties proportionately and with due weight given to balancing duties—particularly on privacy—with the use of encryption? If a service cannot do so, does it follow that Ofcom will require that service to not use encryption, to the extent that that is necessary for it to meet its safety duties to Ofcom’s satisfaction? We need clarity on that point.
Finally, as I said at the start, the Bill is not perfect and there is still much work to be done, but if we can agree the final changes we are discussing and, indeed, if their Lordships are prepared to endorse that next week, the very real prize to be won is that Ofcom can begin the work that it needs to do sooner rather than later and we can bring nearer the benefits that this legislation can deliver for the vulnerable online. More than that, we can enhance the reputation of Parliament as we show that we can do difficult legislation in otherwise fractious times with sincerity, seriousness and a willingness to compromise. I think that is a valuable prize and one within our grasp, and it is why I shall support the Government amendments.
Mr Marcus Fysh (Yeovil) (Con)

It is a pleasure to follow my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), who made a characteristically thoughtful speech. At the outset, I want to put on record my entry in the Register of Members’ Financial Interests, and also my chairmanships of the all-party parliamentary groups on digital identity and on central bank and digital currency, which includes stablecoins. I also put on record the fact that I am the father of an eight-year-old girl and a nine-year-old girl, who have both just got iPads, and I am very aware of the need to protect them, as well as all other children in the UK.

I just want to say that I have had good engagement with Ministers during the progress of this Bill through all of its stages, and I want to thank them and their teams for that. I also want to say that I really welcome what is now in the Bill to progress what I talked about in this place at the last stage it was discussed, which was the effect of algorithms and all of those design features that are part of the addiction we have heard several Members talk about as a potential harm. I think it is really good that that is in the Bill, and it is really good that the tech companies are being forced to think hard about these things.

My amendments—they are proposals for amendments rather than ones I realistically thought we would adopt through votes today—were designed to address a couple of potential shortcomings I saw in the Bill. One was the potential chilling effect on innovation and in the use of really important services that are user-to-user services from a technical point of view, but are not involved in the transmission of the content we are trying to deal with as the main objectives of this Bill. So it was very welcome to hear my hon. Friend the Minister speak at the Dispatch Box about the Government’s intention not to cover the sorts of services to do with data exchange and multi-party computation—some of the more modern methods by which the internet of things, artificial intelligence and various other types of platform run—which are not about making content available that could be a risk in the context we are talking about.

The other shortcoming I was trying to address was this idea, coming back to my right hon. and learned Friend the Member for Kenilworth and Southam, of the potential for the introduction of systemic weaknesses and vulnerabilities into the core systems that all our communications, many of our services, Government services and others rely on day by day for their secure operation. I think he made a very interesting point about the need to think through the precise legal impact that the potential uncertainty about some of those issues might have on the operation of those systems.

I am trying to introduce amendments—for example, amendment (a) in lieu of Lords amendment 189—essentially to provide clarification. This is particularly important when we are thinking about the remote access powers or the remote viewing of information powers in Lords amendment 189, which is why I have proposed an amendment in lieu. It is incredibly important that what we do in this Bill does not create the really fundamental weaknesses that could undermine the security that we and all of our systems rely on for their core operations.

I was also trying to address people’s understandable desire for their data not to be potentially accessible by an unauthorised third party. That type of systemic weakness, which could be introduced by doing the access process in the wrong way, is something we need to think carefully about, and I hope the Minister will say something about intent in respect of that at the Dispatch Box.

I do not want to take too much more time because I know that lots of other Members wish to speak, but the place where I got these ideas, particularly around systemic weakness, was Australian law, which contains powers to provide protection from exactly that type of application of the regulations. I know officials think that Lords amendment 189 does not present such a systemic risk, because it is about viewing information remotely rather than having access to the system directly, but I think that needs more clarity. It actually states:

“view remotely—

information…in real time”

which could potentially be interpreted as requiring that type of access.

On proportionality—this is my last point—we must think about the concept of necessity within that. We must try to strike the right balance—I hope we will all try to do this—between wanting to encourage tech firms to divulge how their systems work, and to give people, including the Government, tools to say when something is not working well and they want to opt out of it, while also ensuring that the fundamental operative things that are used in cryptography and computer systems to communicate with each other securely are not inadvertently undermined.

Vicky Ford (Chelmsford) (Con)

Let me start, like others, by saying how extraordinarily pleased I am to see the Bill return to the House today. I put on record my enormous gratitude to the many people who have worked on it, especially the families of those who have lost loved ones, organisations such as the Internet Watch Foundation, of which I have been a champion for over a decade, the Mental Health Foundation, the many Ministers who have worked on this, and especially the Secretary of State, who continued to work on it through her maternity leave, and those in the other place. It was wonderful to be at the Bar of the other place, listening to Baroness Kidron, and others, when they spoke, and I thank her for being here today. I also particularly wish to thank Baroness Morgan and Lord Bethell.

A few months ago at the beginning of the year I went to one of those meetings that all MPs do, when they go and speak to politics students in their own sixth form. They normally throw loads of questions at us, but before I let them throw questions at me, I said, “Listen, I have a question I need to ask.” As a Back Bencher in this place we get asked to work on so many different issues, so I grabbed the white board and scribbled down a list of many issues that I have been asked to work on, both domestically and internationally. I gave the students each three votes and asked them what they wanted my priority to be. The issue of tackling online pornography, and the impact it was having, was way up that list.

I thank the Children’s Commissioner for the work done with young people to identify and understand that risk more. Our research asked 16 to 21-year-olds when they had first seen online pornography, and 10%—one in 10—had seen online pornography by the age of nine, and 27% had seen it by the age of 11, so more than one in four. Fifty per cent.—that is half; that is every other one of those young people—had seen online pornography before they turned 13.

It is also the case that the type of pornography they have been seeing is increasingly more violent in nature, and that is changing young people’s attitude towards sex. Young people aged 16 to 21 who see such pornography are more likely than those who do not to assume that girls expect or enjoy sex involving physical aggression, such as airway restriction—strangling—or slapping. Among the respondents, 47% stated that girls expect sex to involve physical aggression, and 42% said that most girls enjoy acts of sexual aggression. Some 47% of respondents aged 18 to 21 had experienced a violent sexual act. The Children’s Commissioner also asked these young people where they were watching pornography, and the greatest number of young people were watching that pornography on Twitter—now X—not pornography platforms.

This is not just an issue in this country. I took a delegation with the Inter-Parliamentary Union to the UN Commission on the Status of Women, where we held a joint meeting with the Women and Equalities Committee, and it was standing room only, with women from hugely diverse parts of the world, including South Korea, India, Canada, New Zealand and many different European countries. I said that in the UK we were seeing younger and younger children viewing online pornography that is increasingly violent in content, leading to more violence in relationships and more sexual abuse. I asked them which of them were seeing that in their own country, and every single hand in that room—standing room only—went up. They had come to that room because they knew that the UK was going to legislate in this area and they wanted to see what we did.
By passing the amendments, working with the House of Lords, we will ensure that we have age assurance to stop young people being able to see pornography, and not just on pornography sites but on social media sites. We are taking massive steps to safeguard our children and young people, and the rest of the world will follow. Thank you for everything that has been done.
I also want to talk about self-harm and, in particular, eating disorders. Madam Deputy Speaker, you will remember the last time I spoke about this matter, and I speak as a former anorexic. Anorexia is the largest killer of all mental health conditions. Last week, I met mental health experts in my constituency, and they were talking about the increases we have seen recently in acute mental health issues, especially in people considering suicide and in people with eating disorders. They completely agreed, from what they are seeing on the ground, that online content encouraging or glamorising self-harm is part of what is fuelling this rise. That is why the Mental Health Foundation, Beat and other charities have worked so hard, and I thank them for their advice and work. They have long called for better regulation of dangerous suicide and eating disorder forums.
I am absolutely delighted that the Government have accepted and strengthened the amendment from Baroness Morgan of Cotes, because dangerous platforms are not just large platforms. I heard from the group I met last week about a tiny forum that is setting young women their death dates. Two young people had already killed themselves on their death date, as set by this platform, before the mental health experts had found out about it. They have been able to rescue at least two others by knowing about it. Small platforms can be really dangerous. The amendment will enable smaller platforms to be regulated in the same way as major platforms, such as Facebook.
The Mental Health Foundation said:
“We are delighted that the Government has accepted the amendment… This will make it harder for people to stumble upon the worst content and help protect their mental health. The Government is to be congratulated for this important change… We also thank all parliamentarians from both Houses and from all parties who have supported this change.”
I listened to my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), and I absolutely agree with how he set out his support for this measure. It is why I am afraid we cannot support the amendments from my hon. Friend the Member for Yeovil (Mr Fysh) in this area.
In particular, I want to make sure that the new criminal offence of intentionally encouraging people to self-harm covers eating disorders. I was grateful to the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar), for writing to me earlier this year on 17 May, confirming that the definition of serious self-harm, when it comes to this offence, will cover, for example, encouraging someone not to eat, not to drink or not to take prescribed medication. He confirmed that those provisions were included with eating disorders in mind. Actually, when we delve deeper into this, we see that it is sometimes not the individual bit of content but the way in which a vulnerable person gets bombarded with content on those platforms that can be so damaging.
Last December, the Center for Countering Digital Hate did some research into that and found that TikTok was bombarding vulnerable users with harmful content every 39 seconds. That is how the algorithm is affecting the issue. I therefore wrote back to the Minister and asked him whether the offence would cover the algorithm as well as the content. I will try to be quick, Madam Deputy Speaker, but I want to put on the record exactly what he said, because I think it is important. He said that
“the offence cannot apply to algorithms themselves. Algorithms are designed to automatically send people material that may be of interest to them. It seems unlikely that if a person merely creates an algorithm and does not themselves send, transmit or publish the communication (for example), they could be said to be undertaking a ‘relevant act’. However, every case will turn on its specific facts, and if the circumstances are such that a person’s action does constitute ‘a relevant act capable of encouraging or assisting the serious self-harm of another person’ and that act is intended to encourage or assist the serious self-harm of another person, then the creator of the algorithm will be captured.”
I wanted to read that out in this place because it is really important that creators of algorithms are aware that there is a risk that if they continue with this behaviour, which is bombarding our young people with this most dangerous content, they could be caught under that offence. Will the Minister, in his closing remarks, kindly confirm from the Dispatch Box that that is exactly what the Minister of State for Justice put in writing to me?
Several hon. Members rose—

Madam Deputy Speaker (Dame Rosie Winterton)

I have three more speakers. I ask that colleagues bear that in mind so that I can bring in the Minister.

Sir William Cash

I would like to mention a very long journey in relation to the protection of children, because to my mind that is right at the heart of the Bill’s social value. I think it was Disraeli who said:

“The youth of a nation are the trustees of posterity.”

If we get it right in the early stages of their lives and we provide legislation that enables them to be properly protected, we are likely to get things right for the future. The Bill does that in a very good way.

The Bill also reflects some of the things in which I found myself involved in 1977—just over 45 years ago—with the Protection of Children Bill, when Cyril Townsend came top of the private Member’s Bill ballot. I mention that because at that time we received resistance from Government Ministers and others—I am afraid I must say that it was a Labour Minister—but we got the Bill through, as the then Prime Minister, James Callaghan, eventually ensured that it did. His wife insisted on it, as a matter of fact.

I pay tribute to the House of Lords. Others have repeatedly mentioned the work of Baroness Kidron, but I would also like to mention Lord Bethell, Baroness Morgan and others, because it has been a combined effort. It has been Parliament at its best. I have heard others, including my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), make that point. It has been a remarkably lengthy but none the less essential process, and I pay tribute to those people for what they have done.

In retrospect, I would like to mention Baroness Lucy Faithfull, because back in 1977-78 I would not have known what to do if she had not worked relentlessly in the House of Lords to secure the measures necessary to protect children from sexual images and pornographic photography—it was about assault, and I do not need to go into the detail. The bottom line is that it was the first piece of legislation that swung the pendulum towards common sense and proportionality in matters that, 45 years later, have culminated in what has been discussed in the Bill and the amendments today.

I pay tribute to Ian Russell and to the others here whose children have been caught up in this terrible business. I pay specific tribute to the Secretary of State and the Minister, and also the Health Secretary for his statement yesterday about a national suicide strategy, in which he referenced amendments to the Bill. Because I have had a lot to do with him, I would like to pay tribute to Richard Collard of the National Society for the Prevention of Cruelty to Children, who has not been mentioned yet, for working so hard and effectively.

I pay tribute to my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) for her work to help get the amendments through. The written ministerial statement came after some interesting discussions with the Minister, who was a bit surprised by our vehemence and determination. It was not chariots of fire but chariots on fire, and within three weeks, by the time the Bill got to the House of Lords, we had a written ministerial statement that set the tone for the part of the Bill that I discussed just now, to protect children because they need protection at the right time in their lives.

The NSPCC tells us that 86% of UK adults want companies to understand how groomers and child abusers use their sites to harm children, and want action to prevent it by law. I came up with the idea, although the right hon. Member for Barking (Dame Margaret Hodge) gave us a lot of support in a debate in this House at the time, and I am grateful to her for that. The fact that we are able to come forward with this legislation owes a great deal to a lot of people from different parts of the House.

I very much accept that continuing review is necessary. Many ideas have been put forward in this debate, and I am sure that the Minister is taking them all on board and will ensure that the review happens and that Ofcom acts accordingly, which I am sure it will want to. It is important that that is done.

I must mention that the fact we have left the European Union has enabled us to produce legislation to protect children that is very significantly stronger than European Union legislation. The Digital Services Act falls very far short of what we are doing here. I pay tribute to the Government for promoting ideas based on our self-government to protect our voters’ children and our society. That step could only have been taken now that we have left the European Union.

Research by the NSPCC demonstrates that four in five victims of online grooming offences are girls. It is worth mentioning that, because it is a significant piece of research. That means that there has to be clear guidance about the types of design that will be incorporated by virtue of the discussions to be had about how to make all this legislation work properly.

The only other thing I would like to say is that the £10-million suicide prevention grant fund announced yesterday complements the Bill very well. It is important that we have a degree of symmetry between legislation to prevent suicide and to ensure that children are kept safe.

16:30
There is much more I could say but I do not need to say any more, except to say thank you to everybody in this House and in the other place, and to officials for the advice we have received from the Department and for the co-operation we have had. I believe that this will be a groundbreaking Bill when it is applied in practice. It is not enough just to pass pieces of legislation; the question is how we manage to implement them. That, to my mind, is the most important thing. I thank everybody concerned for the work they have done to make sure the Bill will eventually reach the statute book.
Miriam Cates (Penistone and Stocksbridge) (Con)

I will follow on from the remarks made by my right hon. Friend the Member for Chelmsford (Vicky Ford), who talked powerfully about the impact of online pornography, particularly on children who see it.

Sadly, online pornography is increasingly violent. Many videos depict graphic and degrading abuse of women, sickening acts of rape and incest, and many underage participants. I also want to refer to the excellent study by the Children’s Commissioner, which revealed that the average age at which children first encounter pornography online is just 13 years old, and that there are 1.4 million visits to pornography sites by British children each and every month. As my right hon. Friend said, that is rewiring children’s brains in respect of what they think about sex, what they expect during sex and what they think girls want during sex. I think we will all look back on this widespread child exposure to pornography in a similar way to how we look back on children working down mines or being condemned to the poor house. Future generations will wonder how on earth we abandoned our children to online pornography.

Ending the ready availability of pornographic content to children and criminalising those who fail to protect them should surely be the most important goal of the Online Safety Bill. Indeed, that was most of the aim of part 3 of the Digital Economy Act 2017, which was never enacted. Without the Government amendments tabled in the Lords last week, which I strongly support, the Online Safety Bill would have been in danger of missing this opportunity. As my colleagues have done, I want to thank the Secretary of State and Ministers for their engagement in what has been a cross-party campaign both in this place and the other place, with Baroness Kidron and Lord Bethell leading the way, along with charities and the campaigning journalist Charles Hymas at The Daily Telegraph, who did a fantastic job of reporting it all so powerfully. I also thank my hon. Friend the Member for Stone (Sir William Cash), who has taught me all I ever needed to know about how to negotiate with Government.

We now have these brilliantly strengthening amendments, including, significantly, an amendment that will criminalise directors and managers if they do not comply with Ofcom’s enforcement notices in relation to specific child safety duties. That is really important, because we are talking about the wealthiest companies in the world. Just having fines will not be enough to generate the kind of culture change at board level that we need. Only potential jail terms, which have worked in the construction industry and the financial services industry, will do what it takes.

Lords amendments 141 and 142 make pornography a primary priority harm for children. Importantly, user-to-user providers, as well as dedicated adult sites, will now be explicitly required to use highly effective age verification tools to prevent children accessing them. The wording “highly effective” is crucial, because porn is porn wherever it is found, whether on Twitter, which as my right hon. Friend the Member for Chelmsford said is the most likely place for children to find pornography, or on dedicated adult sites. It has the same effect and causes the same harm. It is therefore vital that tech companies will actually have to prevent children from going on their sites, and not just try hard. That is an incredibly important amendment.

Sir William Cash

Does my hon. Friend agree that what has really set their teeth on edge most of all is the idea that they might go to prison?

Miriam Cates

My hon. Friend is completely right. The impact of not taking responsibility for protecting children has to go to the very top.

Lords amendment 105 would compel Ofcom to submit its draft codes of practice within 18 months. That is an improvement on the previously lax timescale, which I welcome—along with the other significant improvements that have been made—and I repeat my gratitude to the Minister and the Secretary of State. Let us not pretend, however, that on Royal Assent our children will suddenly be safe from online pornography or any other online harms. There are serious questions to be asked about Ofcom’s capabilities to enforce against non-compliant porn sites, and I think we should look again at part 3 of the Digital Economy Act 2017, which would have allowed the British Board of Film Classification to act as the regulator.

Ian Paisley (North Antrim) (DUP)

I congratulate the hon. Lady on the excellent efforts she has made over a long period to highlight these matters. Does she agree that this is not the end but only the beginning of the first days of ensuring that we have proper digital access protection for not only children but adults who have access to digital devices?

Miriam Cates

I thank the hon. Gentleman for his support. What he says is entirely correct.

The key to this does, of course, lie in the implementation. One of the capabilities of the BBFC is to disrupt the business model and the payment provision of the adult online industry. I ask the Minister to consider whether he can direct Ofcom to examine the way in which the BBFC deals with offline and streamed pornography, and whether Ofcom could learn some lessons from that. There is still a disparity between the kind of pornography that is allowed offline, on DVD or streamed services, and the kind that appears online. Offline, certain acts are illegal and the BBFC will not classify the content: any act that looks non-consensual, for example, is illegal and the material cannot be distributed, whereas online it proliferates.

The Bill should have been the perfect vehicle to expand those rules to all online services offering pornographic content. Sadly, we have missed that opportunity, but I nevertheless welcome the Government’s recently announced porn review. I hope it can be used to close the online/offline gap, to insert verification checks for people appearing in pornographic videos and to deal with related offences. Many of those people did not consent and do not know that they are in the videos.

We also need to take account of the complete lack of moderation on some of the sites. It was recently revealed in a court case in the United States that 700,000 Pornhub videos had been flagged for illegal content but had not been checked. Pornhub has managed to check just 50 videos a day, and has acknowledged that unless a video has been flagged more than 15 times for potential criminal content, such as child rape, it will not even join the queue to be moderated and potentially taken down. The children and the trafficked women who appear in those videos are seeing their abuse repeated millions of times with no ability to pull it down.

The Bill has been controversial, and many of the arguments have concerned issues of free speech. I am a supporter of free speech, but violent pornography is not free speech. Drawing children into addiction is not free speech. Knowingly allowing children to view horrific sex crimes is not free speech. Publishing and profiting from videos of children being raped is not free speech. It is sickening, it is evil, it is destructive and it is a crime, and it is a crime from which too many profit with impunity. A third of the internet consists of pornography. The global porn industry’s revenue is estimated to be as much as $97 billion. The Bill is an important step forward, but we would be naive to expect this Goliath of an industry to roll over and keep children safe. There is much more to be done which will require international co-operation, co-operation from financial institutions, and Governments who are prepared to stand their ground against the might of these vested interests. I very much hope that this one will.

Anna Firth (Southend West) (Con)

I want to speak briefly about Lords amendments 195 and 153, which would allow Ofcom, coroners and bereaved parents to acquire information and support relating to a child’s use of social media in the event of that child’s tragic death. Specifically, I want to speak about Archie Battersbee, who lived in my constituency but lost his life tragically last year, aged only 12. Archie’s mum, Hollie, was in the Public Gallery at the beginning of the debate, and I hope that she is still present. Hollie found Archie unconscious on the stairs with a ligature around his neck. The brain injury Archie suffered put him into a four-month coma from which, sadly, doctors were unable to save him.

To this day, Hollie believes that Archie may have been taking part in some form of highly dangerous online challenge, but, unable to access Archie’s online data beyond 90 days of his search history, she has been unable to put this devastating question to rest. Like the parents of Molly, Breck, Isaac, Frankie and Sophia, for the last year Hollie has been engaged in a cruel uphill struggle against faceless corporations in her attempt to determine whether her child’s engagement with a digital service contributed to his death. Despite knowing that Archie viewed seven minutes of content and received online messages in the hour and a half prior to his death, she has no way of knowing what may have been said or exactly what he may have viewed, and the question of his online engagement and its potential role in his death remains unresolved.

Lords amendment 195, which will bolster Ofcom’s information-gathering powers, will I hope require a much more humane response from providers in such tragic cases as this. This is vital and much-needed legislation. Had it been in place a year ago, it is highly likely that Hollie could have laid her concerns to rest and perhaps received a pocket of peace in what has been the most traumatic time any parent could possibly imagine.

I also welcome Lords amendment 153, which will mandate the largest providers to put in place a dedicated helpline so that parents who suffer these tragic events will have a direct line and a better way of communicating with social media providers, but the proof of the pudding will obviously be in the eating. I very much hope that social media providers will man that helpline with real people who have the appropriate experience to deal with parents at that tragic time in their lives. I believe that Hollie and the parents of many other children in similar tragic cases will welcome the Government’s amendments that allow Ofcom, coroners and bereaved parents to access their children’s online data via the coroner directing Ofcom.

I pay tribute to the noble Baroness Kidron, to my right hon. Friend the Member for Bromsgrove (Sajid Javid) and to the Bereaved Families for Online Safety group, who have done so much fantastic work in sharing their heartrending stories and opening our eyes to what has been necessary to improve the Online Safety Bill. I also, of course, pay tribute to Ian Russell, to Hollie and to all the other bereaved parents for their dedication to raising awareness of this hugely important issue.

If I could just say one last thing, I have been slipped from the Education Committee to attend this debate today and I would like to give an advert for the Committee’s new inquiry, which was launched on Monday, into the effects of screen time on education and wellbeing. This Bill is not the end of the matter—in many ways it is just the beginning—and I urge all Members please to engage with this incredibly important inquiry by the Education Committee.

Paul Scully

I thank all right hon. and hon. Members for their contribution to the debate today and, indeed, right through the passage of this complex Bill.

First, let me turn to the amendments tabled by my hon. Friend the Member for Yeovil (Mr Fysh). I understand that the intention of his amendments is to restrict the reach of the new online safety regulatory regime in a number of ways. I appreciate his concern to avoid placing unnecessary burdens on business, and I am sympathetic to his point that the Bill should not inhibit sectors such as the life sciences sector. I reassure him that such sectors are not the target of this regime and that the new regulatory framework is proportionate, risk-based and pro-innovation.

The framework has been designed to capture a range of services where there is a risk of significant harm to users, and the built-in exemptions and categorisations will ensure it is properly targeted. The alternative would be a narrow scope, which would be more likely to inadvertently exempt risky services or to displace harm on to services that are out of scope. The extensive discussion on this point in both Houses has made it clear that such a position is unlikely to be acceptable.

The amendments to the overarching statement that would change the services in scope would introduce unclear and subjective terms, causing issues of interpretation. The Bill is designed so that low-risk services will have to put in place only proportionate measures that reflect the risk of harm to their users and the service provider’s size and capacity, ensuring that small providers will not be overly burdened unless the level of risk requires it.

The amendment that would ensure Ofcom cannot require the use of a proactive technology that introduces weaknesses or vulnerabilities into a provider’s systems duplicates existing safeguards. It also introduces vague terms that could restrict Ofcom’s ability to require platforms to use the most effective measures to address abhorrent illegal activity.

Ofcom must act proportionately, and it must consider whether a less intrusive measure could achieve the same effect before requiring the use of proactive technology. Ofcom also has duties to protect both privacy and private property, including algorithms and code, under the Human Rights Act 1998.

16:45
Ian Paisley

I thank the Minister for engaging with us on access to private property and for setting up, with his officials, a consultation on the right to access a person’s phone after they are deceased or incapacitated. I thank him for incorporating some of those thoughts in what he and the Government are doing today. I hope this is the start of something and that these big digital companies will no longer be able to bully people. The boot will be on the other foot, and the public will own what they have on their digital devices.

Paul Scully

The hon. Gentleman is talking about the access of coroners, families and others to information, following the sad death of Molly Russell. Again, I pay tribute to Ian Russell and all the campaigners. I am glad that we have been able to find an answer to a very complex situation, not only because of its international nature but because of data protection, et cetera.

The measures I have outlined will ensure that risks relating to security vulnerabilities are managed. The Bill is also clear that Ofcom cannot require companies to use proactive technology on privately communicated content, in order to comply with their safety duties, which will provide further safeguards for user privacy and data security.

Damian Collins

Will the Minister make it clear that we should expect the companies to use proactive technology, because they already use it to make money by recommending content to people, which is a principal reason for the Bill? If they use proactive technology to make money, they should also use it to keep people safe.

Paul Scully

My hon. Friend absolutely nails it. He said earlier that businesses are already collecting this data. Since I was first involved with the Bill, it has primarily been about getting businesses to adhere to their own terms and conditions. The data they use should be used in that way.

The amendment to the definition of “freedom of expression” in part 12 would have no effect as these concepts are already covered by the existing definition. Changing the definition of “automated tool” would introduce untested terms and would have an unclear and confusing impact on the duties.

My hon. Friend the Member for Yeovil also asked for clarification of how Ofcom’s power to view information remotely will be used, and whether the power is sufficiently safeguarded. I assure the House that this power is subject to strict safeguards that mean it cannot be used to undermine a provider’s systems.

On Third Reading in the other place, the Government introduced amendments that defined the regulator’s power to view information remotely, whereas previously the Bill spoke of access. As such, there are no risks to system security, as the power does not enable Ofcom to access the service. Ofcom also has a duty to act proportionately and must abide by its privacy obligations under the Human Rights Act. Ofcom has a stringent restriction on disclosing businesses’ commercially sensitive and other information without consent.

My hon. Friend also asked for clarification on whether Ofcom will be able to view live user data when using this power. Generally, Ofcom would expect to require a service to use a test dataset. However, there may be circumstances where Ofcom asks a service to execute a test using data that it holds, for example, in testing how content moderation systems respond to certain types of content on a service as part of an assessment of the systems and processes. In that scenario, Ofcom may need to use a provider’s own test dataset containing content that has previously violated its own terms of service. However, that would be subject to Ofcom’s privacy obligations and data protection law.

Lords amendment 17 seeks to explicitly exempt low-risk functionality from aspects of user-to-user services’ children’s risk assessment duties. I am happy to reassure my hon. Friend that the current drafting of the Government’s amendment in lieu of Lords amendment 17 places proportionate requirements on providers. It explicitly excludes low-risk functionality from the more stringent duty to identify and assess the impact that higher-risk functionalities have on the level of risk of harm to children. Proportionality is further baked into this duty through Ofcom’s risk assessment guidance. Ofcom is bound by the principle of proportionality as part of its general duties under the Communications Act 2003, as updated by the Bill. As such, it would not be able to recommend that providers should identify and assess low-risk functionality.

The amendment to Lords amendment 217 tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) would introduce a new safeguard that requires Ofcom to consider whether technology required under a clause 122 notice would circumvent end-to-end encryption. I wish to reassure him and others who have raised the question that the amendment is unnecessary because it is duplicative of existing measures that restrict Ofcom’s use of its powers. Under the Bill’s safeguards, Ofcom cannot require platforms to weaken or remove encryption, and must already consider the risk that specified technology can result in a breach of any statutory provision or the rule of law concerning privacy. We have intentionally designed the Bill so that it is technology neutral and futureproofed, so we cannot accept amendments that risk the legislation quickly becoming out of date. That is why we focused on safeguards that uphold user rights and ensure measures that are proportionate to the specific risks, rather than focusing on specific features such as encryption. For the reasons I have set out, I cannot accept the amendment and hope it will not be pressed to a vote.

The amendment tabled by my hon. Friend the Member for Stroud (Siobhan Baillie) would create an additional reporting requirement on Ofcom to review, as part of its report on the use of age assurance, whether the visibility of a user’s verification status improves the effectiveness of age assurance, but that duplicates existing review requirements in the Bill. The Bill already provides for a review of user verification; under clause 179, the Secretary of State will be required to review the operation of the online safety regulatory framework as a whole. This review must assess how effective the regulatory framework is at minimising the risk of harm that in-scope services pose to users in the UK. That may include a review of the effectiveness of the current user verification and non-verified users duty. I also thank my hon. Friend for raising the issue of user verification and the visibility of verification status. I am pleased to confirm that Ofcom will have the power to set out guidance on user verification status being visible to all users. With regard to online fraud or other illegal activity, mandatory user verification and visibility of verification status is something that Ofcom could recommend and require under legal safety duties.

Let me quickly cover some of the other points raised in the debate. I thank my hon. Friend the Member for Gosport (Dame Caroline Dinenage), a former Minister, for all her work. She talked about young people, and the Bill contains many measures, for example on self-harm and suicide content, that reflect her concerns and will help to protect them. On the comments made by the hon. Member for Aberdeen North (Kirsty Blackman) and indeed the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), whom I am glad to see back in her place, there are a number of review points. Clause 179 requires the Secretary of State to review how the Bill is working in practice, and there will be a report resulting from that, which will be laid before Parliament. We also have the annual Ofcom report that I talked about, and most statutory instruments in the Bill will be subject to the affirmative procedure. The Bill refers to a review after two to five years—Ministers can dictate when it takes place within that period—but that is based on allowing a long enough time for the Bill to bed in and be implemented. It is important that we have the ability to look at that in Parliament. The principles of the UN convention on the rights of the child are already reflected in the Bill: although the Bill does not cite the convention by name, its principles are all covered.

My hon. Friend the Member for Folkestone and Hythe (Damian Collins) did an amazing job in his time in my role, and before and afterwards as Chair of the Joint Committee responsible for the pre-legislative scrutiny of the Online Safety Bill. When he talked about scrutiny, I had the advantage of seeing the wry smile of the officials in the Box behind him. That scrutiny has been going on since 2021. Sarah Connolly, one of our amazing team of officials, has been involved with the Bill since it was just a concept.

Damian Collins

As Carnegie UK Trust observed online, a child born on the day the Government first published their original internet safety strategy would now be in its second year of primary school.

Paul Scully

I do not think I need to respond to that, but it goes to show, does it not?

My hon. Friend talked about post-legislative scrutiny. Now that we have the new Department for Science, Innovation and Technology, we have extra capacity within Committees to look at various aspects, and not just online safety, as important as that is. It also gives us the ability to have sub-Committees. Clearly, we want to make sure that this and all the decisions that we make are scrutinised well. We are always open to looking at what is happening. My hon. Friend talked about Ofcom being able to appoint skilled persons for research—I totally agree and he absolutely made the right point.

My right hon. Friend the Member for Basingstoke (Dame Maria Miller) and the hon. Member for Caithness, Sutherland and Easter Ross (Jamie Stone) talked about cyber-flashing. As I have said, that has come within the scope of the Bill, but we will also be implementing a broader package of offences that will cover the taking of intimate images without consent. To answer my right hon. Friend’s point, yes, we will still look further at that matter.

The hon. Member for Leeds East (Richard Burgon) talked about Joe Nihill. Will he please send my best wishes and thanks to Catherine and Melanie for their ongoing work in this area? It is always difficult, but it is admirable that people can turn a tragedy into such a positive cause. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made two points with which I absolutely agree. They are very much covered in the Bill and in our thinking as well, so I say yes to both.

My right hon. Friend the Member for Chelmsford (Vicky Ford) and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) talked about pornography. Clearly, we must build on the Online Safety Bill. We have the pornography review as well, which explores regulation, legislation and enforcement. We very much want to make sure that this is the first stage, but we will look at pornography and the enforcement around that in a deeper way over the next 12 months.

Sir Jeremy Wright

It has just crossed my mind that the Minister might be saying that he agreed with everything that I said, which cannot be right. Let me be clear about the two points. One was in relation to whether, when we look at design harms, both proportionality and balancing duties are relevant—I think that he is saying yes to both. The other point that I raised with him was around encryption, and whether I put it in the right way in terms of the Government’s position on encryption. If he cannot deal with that now, and I would understand if he cannot, will he write to me and set out whether that is the correct way to see it?

Paul Scully

I thank my right hon. Friend for that intervention. Indeed, end-to-end encrypted services are in the scope of the Bill. Companies must assess the level of risk and meet their duties no matter what their design is.

Vicky Ford

Can the Minister confirm whether the letter I received from the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar) is accurate?

Paul Scully

I was just coming to that. I thank my right hon. Friend for the rest of her speech. She always speaks so powerfully on eating disorders—on anorexia in particular—and I can indeed confirm the intent behind the Minister’s letter about the creation and use of algorithms.

Finally, I shall cover two more points. My hon. Friend the Member for Stone (Sir William Cash) always speaks eloquently about this. He talked about Brexit, but I will not get into the politics of that. Suffice to say, it has allowed us—as in other areas of digital and technology—to be flexible and not prescriptive, as we have seen in measures that the EU has introduced.

I also ask my hon. Friend the Member for Southend West (Anna Firth) to pass on my thanks and best wishes to Hollie, whom I met to talk about Archie Battersbee.

17:00
Alex Davies-Jones

On the small high-harm platforms that are now in the scope of the Bill, will the Minister join me in thanking Hope Not Hate, the Antisemitism Policy Trust and CST, which have campaigned heavily on this point? While we have been having this debate, the CST has exposed BitChute, one of those small high-harm platforms, for geoblocking some of the hate to comply with legislation but then advertising loopholes and ways to get around that on the platform. Can the Minister confirm that the regulator will be able to take action against such proceedings?

Paul Scully

I will certainly look at that. Our intention is that, in all areas that might not fall within the user enforcement duties, especially those relating to children and their protection, we will look to make sure that the work of those organisations is reflected in what we are trying to achieve in the Bill.

We have talked about the various Ministers that have looked after the Bill during its passage, and the Secretary of State was left literally holding the baby in every sense of the word because she continued to work on it while she was on maternity leave. We can see the results of that with the engagement that we have had. I urge all Members on both sides of the House to consider carefully the amendments I have proposed today in lieu of those made in the Lords. I know every Member looks forward eagerly to a future in which parents have surety about the safety of their children online. That future is fast approaching.

I reiterate my thanks to esteemed colleagues who have engaged so passionately with the Bill. It is due to their collaborative spirit that I stand today with amendments that we believe are effective, proportionate and agreeable to all. I hope all Members will feel able to support our position.

Amendment (a) made to Lords amendment 182.

Lords amendment 182, as amended, agreed to.

Amendments (a) and (b) made to Lords amendment 349.

Lords amendment 349, as amended, agreed to.

Amendment (a) made to Lords amendment 391.

Lords amendment 391, as amended, agreed to.

Government consequential amendment (a) made.

Lords amendment 17 disagreed to.

Government amendments (a) and (b) made in lieu of Lords amendment 17.

Lords amendment 20 disagreed to.

Lords amendment 22 disagreed to.

Lords amendment 81 disagreed to.

Government amendments (a) to (c) made in lieu of Lords amendment 81.

Lords amendment 148 disagreed to.

Government amendment (a) made in lieu of Lords amendment 148.

Lords amendments 1 to 16, 18, 19, 21, 23 to 80, 82 to 147, 149 to 181, 183 to 348, 350 to 390, and 392 to 424 agreed to, with Commons financial privileges waived in respect of Lords amendments 171, 180, 181, 317, 390 and 400.

Ordered, That a Committee be appointed to draw up Reasons to be assigned to the Lords for disagreeing to their amendments 20 and 22;

That Paul Scully, Steve Double, Alexander Stafford, Paul Howell, Alex Davies-Jones, Taiwo Owatemi and Kirsty Blackman be members of the Committee;

That Paul Scully be the Chair of the Committee;

That three be the quorum of the Committee.

That the Committee do withdraw immediately.—(Mike Wood.)

Committee to withdraw immediately; reasons to be reported and communicated to the Lords.

Online Safety Bill

Commons Amendments and Reasons
15:32
Motion A
Moved by
Lord Parkinson of Whitley Bay

That this House do not insist on its Amendment 17 and do agree with the Commons in their Amendments 17A and 17B in lieu.

17A: Clause 10, page 9, line 30, leave out paragraph (e) and insert—
“(e) the extent to which the design of the service, in particular its functionalities, affects the level of risk of harm that might be suffered by children, identifying and assessing those functionalities that present higher levels of risk, including functionalities—
(i) enabling adults to search for other users of the service (including children), or
(ii) enabling adults to contact other users (including children) by means of the service;”
17B: Clause 10, page 9, line 38, after “used,” insert “including functionalities or other features of the service that affect how much children use the service (for example a feature that enables content to play automatically),”
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I beg to move Motion A and, with the leave of the House, I shall also speak to Motions B to H.

I am pleased to say that the amendments made in your Lordships’ House to strengthen the Bill’s provisions were accepted in another place. His Majesty’s Government presented a number of amendments in lieu of changes proposed by noble Lords, which are before your Lordships today.

I am grateful to my noble friend Lady Morgan of Cotes for her continued engagement on the issue of small but high-risk platforms. The Government were happy to accept her proposed changes to the rules for determining the conditions that establish which services will be designated as category 1 or 2B services. In making the regulations, the Secretary of State will now have the discretion to decide whether to set a threshold based on either the number of users or the functionalities offered, or on both factors. Previously, the threshold had to be based on a combination of both.

It remains the expectation that services will be designated as category 1 services only where it is appropriate to do so, to ensure that the regime remains proportionate. We do not, for example, expect to apply these duties to large companies with very limited functionalities. This change, however, provides greater flexibility to bring smaller services with particular functionalities into scope of category 1 duties, should it be necessary to do so. As a result of this amendment, we have also made a small change to Clause 98—the emerging services list—to ensure that it makes operational sense. Before my noble friend’s amendment, a service would be placed on the emerging services list if it met the functionality condition and 75% of the user number threshold. Under the clause as amended, a service could be designated as category 1 without meeting both a functionality and a user condition. Without this change, Ofcom would, in such an instance, be required to list only services which meet the 75% condition.

We have heard from both Houses about the importance of ensuring that technology platforms are held to account for the impact of their design choices on children’s safety. We agree and the amendments we proposed in another place make it absolutely clear that providers must assess the impact of their design choices on the risk of harm to children, and that they deliver robust protections for children on all areas of their service. I thank in particular the noble Baroness, Lady Kidron, the noble Lords, Lord Stevenson of Balmacara and Lord Clement-Jones, my noble friend Lady Harding of Winscombe and the right reverend Prelate the Bishop of Oxford for their hard work to find an acceptable way forward. I also thank Sir Jeremy Wright MP for his helpful contributions to this endeavour.

Noble Lords will remember that an amendment from the noble Baroness, Lady Merron, sought to require the Secretary of State to review certain offences relating to animals and, depending on the outcome of that review, to list these as priority offences. To accelerate protections in this important area, the Government have tabled an amendment in lieu listing Section 4(1) of the Animal Welfare Act 2006 as a priority offence. This will mean that users can be protected from animal torture material more swiftly. Officials at the Department for Environment, Food and Rural Affairs have worked closely with the RSPCA and are confident that the Section 4 offence, unnecessary suffering of an animal, will capture a broad swathe of illegal activity. Adding this offence to Schedule 7 will also mean that linked inchoate offences, such as encouraging or assisting this behaviour, are captured by the illegal content duties. I am grateful to the noble Baroness for raising this matter, for her discussions on them with my noble friend Lord Camrose and for her support for the amendment we are making in lieu.

To ensure the speedy implementation of the Bill’s regime, we have added Clauses 116 to 118, which relate to the disclosure of information by Ofcom, and Clauses 170 and 171, which relate to super-complaints, to the provisions to be commenced immediately on Royal Assent. These changes will allow Ofcom and the Government to hold the necessary consultations as quickly as possible after Royal Assent. As noble Lords know, the intention of the Bill is to make the UK the safest place in the world to be online, particularly for children. I firmly believe that the Bill before your Lordships today will do that, strengthened by the changes made in this House and by the collaborative approach that has been shown, not just in all quarters of this Chamber but between both Houses of Parliament. I beg to move.

Lord Clement-Jones (LD)

My Lords, I thank the Minister very warmly for his introduction today. I shall speak in support of Motions A to H inclusive. Yes, I am very glad that we have agreement at this final milestone of the Bill before Royal Assent. I pay tribute to the Minister and his colleagues, to the Secretary of State, to the noble Baronesses, Lady Morgan, Lady Kidron and Lady Merron, who have brought us to this point with their persistence over issues such as functionalities, categorisation and animal cruelty.

This is not the time for rehearsing any reservations about the Bill. The Bill must succeed and implementation must take place swiftly. So, with many thanks to the very many, both inside and outside this House, who have worked so hard on the Bill for such a long period, we on these Benches wish the Bill every possible success. He is in his place, so I can say that it is over to the noble Lord, Lord Grade, and his colleagues at Ofcom, in whom we all have a great deal of confidence.

Baroness Finlay of Llandaff (CB)

My Lords, I shall contribute briefly from these Benches because it is important for us all to be aware of just how much people outside have been watching the progress of the Bill. Indeed, today in the Public Gallery we have some bereaved parents who have suffered at the hands of things that have come up on the internet. We have been very privileged, all the way through the Bill, to be able to hear from people who have been victims and who have genuinely wanted to improve things for others and avoid other problems. The collaborative nature with which everyone has approached the Bill has, we hope, achieved those goals for everyone.

We all need to wish the noble Lord, Lord Grade, good luck and all the best as he takes on an incredibly important scrutiny role. I am sure that in years to come we will be looking at post-legislative scrutiny. In the meantime, I shall not name everybody, apart from putting the Minister in prime position; I thank him and everyone for having worked so hard, because I hear from outside that that work is greatly appreciated.

Lord Stevenson of Balmacara (Lab)

My Lords, I too thank the Minister for his swift and concise introduction, which very carefully covered the ground without raising any issues that we have to respond to directly. I am grateful for that as well.

The noble Lord, Lord Clement-Jones, was his usual self. The only thing that I missed, of course, was the quotation that I was sure he was going to give from the pre-legislative scrutiny report on the Bill, which has been his constant prompt. I also think that the noble Baroness, Lady Finlay, was very right to remind us of those outside the House who we must remember as we reach the end of this stage.

Strangely, although we are at the momentous point of allowing this Bill to go forward for Royal Assent, I find that there is actually very little that needs to be said. In fact, everything has been said by many people over the period; trying to make any additional points would be meretricious persiflage. So I will make two brief points to wind up this debate.

First, is it not odd to reflect on the fact that this historic Parliament, with all our archaic rules and traditions, has the capacity to deal with a Bill that is regulating a technology which most of us have difficulty in comprehending, let alone keeping up with? However, we have done a very good job and, as a result, I echo the words that have already been said; I think the internet will now be a much safer place for children to enjoy and explore, and the public interest will be well served by this Bill, even though we accept that it is likely to only be the first of a number of Bills that will be needed in the years to come.

Secondly, I have been reflecting on the offer I made to the Government at Second Reading, challenging them to work together with the whole House to get the best Bill that we could out of what the Commons had presented to us. That of course could have turned out to be a slightly pointless gesture if nobody had responded positively—but they did. I particularly thank the Minister and the Bill team for rising to the challenge. There were problems initially, but we got there in the end.

More widely, there was, I know, a worry that committing to working together would actually stifle debate and somehow limit our crucial role of scrutiny. But actually I think it had the opposite effect. Some of the debates we had in Committee, from across the House, were of the highest standard, and opened up issues which needed to be resolved. People listened to each other and responded as the debate progressed. The discussion extended to the other place. It is very good to see Sir Jeremy Wright here; he has played a considerable role in resolving the final points.

It will not work for all Bills, but if the politics can be ignored, or at least put aside, it seems to make it easier to get at the issues that need to be debated in the round. In suggesting this approach, I think we may have found a way of getting the best out of our House—something that does not always occur. I hope that lesson can be listened to by all groups and parties.

For myself, participating in this Bill and the pre-legislative scrutiny committee which preceded it has been a terrific experience. Sadly, a lot of people who contributed to our discussions over that period cannot be here today, but I hope they read this speech in Hansard, because I want to end by thanking them, and those here today, for being part of this whole process. We support the amendments before the House today and wish good luck to the noble Lord, Lord Grade.

Lord Parkinson of Whitley Bay (Con)

My Lords, I am very conscious that this is not the end of the road. As noble Lords have rightly pointed out in wishing the Bill well, attention now moves very swiftly to Ofcom, under the able chairmanship of the noble Lord, Lord Grade of Yarmouth, who has participated, albeit silently, in our proceedings before, and to the team of officials who stand ready to implement this swiftly. The Bill benefited from pre-legislative scrutiny. A number of noble Lords who have spoken throughout our deliberations took part in the Joint Committee of both Houses which did that. It will also benefit from post-legislative scrutiny, through the Secretary of State’s review, which will take place between two and five years after Royal Assent. I know that the noble Lords who have worked so hard on this Bill for many years will be watching it closely as it becomes an Act of Parliament, to ensure that it delivers what we all want it to.

The noble Lord, Lord Stevenson, reminded us of the challenge he set us at Second Reading: to minimise the votes in dissent and to deliver this Bill without pushing anything to ping-pong. I think I was not the only one in the Chamber who was sceptical about our ability to do so, but it is thanks to the collaborative approach and the tone that he has set that we have been able to do that. That is a credit to everybody involved.

15:45
I am conscious that the noble Lord is just one of many people in both Houses who have followed the Bill very closely since it was first published in draft in May 2021, and indeed since the White Paper was published in April 2019. No shortage of people in both Houses have devoted many hours to considering and improving it, informed of course by the discussions and correspondence they have had with countless people from beyond your Lordships’ House. The noble Baroness, Lady Finlay, is right to draw our attention to those watching, both here and at home, who have high hopes for the Bill. No shortage of Ministers have played their part in listening to those representations and steering the Bill through Parliament. It is a privilege to be the last one to do so, to have the final word and to say, for one last time, that I beg to move.
Motion A agreed.
Motion B
Moved by
Lord Parkinson of Whitley Bay

That this House do not insist on its Amendment 20, to which the Commons have disagreed for their Reason 20A.

20A: Because the Bill already makes sufficient provision requiring providers of user-to-user services to mitigate the impact of harm to children online.
Motion C
Moved by
Lord Parkinson of Whitley Bay

That this House do not insist on its Amendment 22, to which the Commons have disagreed for their Reason 22A.

22A: Because the Bill already makes sufficient provision requiring providers of user-to-user services to mitigate the impact of harm to children online.
Motion D
Moved by
Lord Parkinson of Whitley Bay

That this House do not insist on its Amendment 81 and do agree with the Commons in their Amendments 81A, 81B and 81C in lieu.

81A: Clause 25, page 26, line 31, leave out paragraph (c) and insert—
“(c) the extent to which the design of the service, in particular its functionalities, affects the level of risk of harm that might be suffered by children, identifying and assessing those functionalities that present higher levels of risk, including a functionality that makes suggestions relating to users’ search requests (predictive search functionality);”
81B: Clause 25, page 26, line 33, at end insert—
“(ca) the different ways in which the service is used, including functionalities or other features of the service that affect how much children use the service, and the impact of such use on the level of risk of harm that might be suffered by children;”
81C: Clause 25, page 26, line 35, leave out “(c)” and insert “(ca)”
Motion E
Moved by
Lord Parkinson of Whitley Bay

That this House do not insist on its Amendment 148 and do agree with the Commons in their Amendment 148A in lieu.

148A: Page 205, line 36, at end insert—
“Animal welfare
32A An offence under section 4(1) of the Animal Welfare Act 2006 (unnecessary suffering of an animal).”
Motion F
Moved by
Lord Parkinson of Whitley Bay

That this House do agree with the Commons in their Amendment 182A.

182A (as an amendment to Amendment 182): Line 1, leave out ““presented by content”” and insert ““content on””
Motion G
Moved by
Lord Parkinson of Whitley Bay

That this House do agree with the Commons in their Amendments 349A and 349B.

349A (as an amendment to Amendment 349): Line 20, at end insert—
“(qa) sections 104 to 106;”
349B (as an amendment to Amendment 349): Line 24, at end insert—
“(ta) sections 150 and 151;”
Motion H
Moved by
Lord Parkinson of Whitley Bay

That this House do agree with the Commons in their Amendments 391A and 391B.

391A (as an amendment to Amendment 391): Line 1, after ““and” insert “at least one specified condition about”
391B: Schedule 11, page 78, line 21, at end insert—
“(3A) If the regulations under paragraph 1(1) of Schedule 11 specify that a service meets the Category 1 threshold conditions if any one condition about number of users or functionality is met (as mentioned in paragraph 1(4)(a) of that Schedule)—
(a) subsection (2) applies as if paragraph (b) were omitted, and
(b) subsections (3) and (7) apply as if the reference to the conditions in subsection (2) were to the condition in subsection (2)(a).”
Motions B to H agreed.

Royal Assent

Thursday 26th October 2023

Lords Chamber
13:33
The following Acts were given Royal Assent:
Online Safety Act,
Worker Protection (Amendment of Equality Act 2010) Act,
Energy Act,
Non-Domestic Rating Act,
Procurement Act,
Levelling-up and Regeneration Act,
Economic Crime and Corporate Transparency Act.